Welcome to the lab.

Curiouser and curiouser

Some of us always follow the rabbit. It’s not because we want to. It’s just that the rabbit…

10 a.m., somewhere in Minnesota

Sit down to write a blog post. It’s basically written already in my head, just need 20 minutes to get it down.

Double check some code I’m including, and realize it doesn’t work right in an edge case that just crossed my mind.

Drop back to Sublime Text for some testing. This edge case is revealing a more substantial failure. Get a little frustrated and head to StackOverflow.

Find a StackOverflow answer that wasn’t exactly what I needed but follow a link to GitHub and start examining some code, out of context, that I think might have the answer. See a function used that I’ve never heard of and have to track it down through the repository. Turns out it’s a native function in the language that I’d never seen before.

Consider the possibilities of this functionality. What’s the use case for this? Why was it considered important enough to be part of the core functionality? Figure out some cases, start to think “so what else could this be used for?” I can think of some code that could be refactored to play with.

Grep through to find an old script and start trying out the new method. Little snag, I need to convert the data when it’s ingested. That will require a separate library. Package manager gives me a missing library error.

Run brew update to prepare for installing the needed package. See the list of changed and new packages. Wait, what does that one with the cool name do? brew home newcoolness. What? I didn’t even know this was possible!

Dig into the new package, consulting docs and trying out various pipelines. Holy cow, I should write a blog post about this. Wasn’t I doing that already? Check timestamp. Look at watch. Tomorrow, I guess, it’s time to make dinner now.

Cognitive Productivity With macOS: 7 Principles for Getting Smarter With Knowledge

The latest book from Luc P. Beaudoin is out: Cognitive Productivity with macOS: 7 Principles for Getting Smarter with Knowledge, available on Leanpub, the iBookstore and Amazon.

Luc is the co-founder of CogSci Apps (creator of mySleepButton), founder of CogZest, and most importantly, a Cognitive Scientist who specializes in Mac and iOS-based productivity. So he’s got credentials. You may even have heard my chat with him on Systematic.

If you’re a fan of this blog, or a listener of shows like Systematic, Mac Power Users, or any of the Mac shows with a productivity focus, this book will be for you. While it certainly has appeal to the academic side of knowledge gathering and consumption, it breaks the processes down in a way that will help anyone working to organize their information, focusing on key principles in the areas of knowledge gathering, organizing, assessment, and mastery. It includes great info and tips for using Mac apps like OmniFocus and OmniOutliner, Anki, Timing, Leap and Yep, working with plain text and Markdown, and great tips for working in Finder and keeping projects and knowledge organized and accessible.

Luc has a 2-second rule that says any information you use frequently should be accessible on your Mac within two seconds. It’s a rule I clearly love, as I’ve dedicated a good part of my adult life to making it happen for myself. I’ve learned a lot from reading through Luc’s tips surrounding the 2-second rule, and I can’t imagine there’s anybody who wouldn’t walk away with at least 5 new ideas.

Cognitive Productivity with macOS is currently listed on the iBookstore and there’s a Kindle version available, but if you purchase directly through Leanpub using coupon brettterpstra2018, you can get a 20% discount on PDF, EPUB, MOBI, and Kindle versions all at once (plus free updates if the book is changed or added to). I highly recommend taking the time!

PodTagger 1.1

I know a few people are using PodTagger, so I thought I should publish the latest updates. It’s mostly fixes, but it also now adds a “metadata” section at the top of the shownotes.md file it writes out, which looks like:

<!-- Metadata
Duration Seconds: 5243
Duration: 01:27:23
Filesize: 63004650
-->

When I post a Systematic episode to Squarespace, this is info I need when setting up the external mp3 link, so I figured I’d just have podtagger figure it out for me. A few more steps saved.
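Incidentally, the Duration line is just the seconds value formatted as HH:MM:SS. A quick sketch in Ruby (illustrative only, not PodTagger’s actual code):

```ruby
# Format a duration in seconds as HH:MM:SS, as in the metadata block above.
# Illustrative sketch only, not PodTagger's actual implementation.
def format_duration(seconds)
  format('%02d:%02d:%02d', seconds / 3600, (seconds % 3600) / 60, seconds % 60)
end

puts format_duration(5243) # => 01:27:23
```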

Download below, or check out the PodTagger project page for full details!

PodTagger v1.1.0

Automated ID3 tagging for podcasts

Updated Sun May 06 2018.

PodTagger.app v1.1.0

An application version of PodTagger

Updated Thu Jun 21 2018.

The XML Data Liberation Front

Despite the grandiose title, this post is pretty specific: converting RegExRX files to Markdown so I can include them in my nvALT snippets collection. Even so, I’m sharing it because you can use it as a base to modify and start “rescuing” your own data out of other applications. I understand why applications of any complexity store their data in structured files, whether XML, JSON, or a database format, but I like to keep my data portable. Since the Data Liberation Army isn’t huge in number, the onus falls on us to find our own ways.

This script specifically works with XML and outputs to Markdown, but you could easily make the idea work with JSON files, binary XML (with a little help from plutil), or SQLite database queries, and output to any format you wanted with a little templating.
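For instance, the JSON flavor of the same idea fits in a few lines. This is a hypothetical sketch (the snippet record below is invented), but it shows the parse-then-template shape:

```ruby
require 'json'
require 'erb'

# Invented snippet record; any structured source parses the same way
json = '{"title": "Phone number", "pattern": "\\\\d{3}-\\\\d{4}"}'
data = JSON.parse(json)

# A little templating turns the parsed data into a Markdown note
template = ERB.new("# <%= data['title'] %>\n\n    <%= data['pattern'] %>\n")
markdown = template.result(binding)
puts markdown
```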

Ok, diatribe over. Back to the script.

Out of all the editors/testers for regular expressions out there, I’ve always come back to RegExRX. It’s not pretty (the Mac App Store icon couldn’t even get shadow transparency right), but it has all the features I could ask for. As I work, I save my successful regular expressions to RegExRX files. These are plain text XML files with the patterns stored as hex. This makes them pretty human-unreadable, and you know me…

I wrote a script to convert a folder full of these .regexrx files to Markdown files I could drop into nvALT or Quiver. I won’t go into a ton of detail on this because I’m pretty sure there aren’t more than 5 people in the world who will ever need this script, but…

In this script, you can specify a few options when you run it:

$ regexrx2md.rb -h
Usage: /Users/ttscoff/scripts/regexrx2md.rb [OPTIONS]
-o, --output-dir=DIRECTORY       Output folder, defaults to "markdown output"
-p, --prefix=PREFIX              Prefix added before output filenames
-t, --template=TEMPLATE          Use alternate ERB template
-h, --help                       Display this screen

Specify an output folder, a note title prefix, and your own template for the output (there’s a default one if you don’t make your own). A template is an ERB file that uses the variables @title, @flags, @search, @replace, and @source. @source is the contents of the “source text” in RegExRX, a string or text block to test the expression against. There are also helpers like @source.indent, which will take every line and indent it 4 spaces (to make a Markdown code block). Also, .to_js simply replaces forward slashes with \/ so you can use /[search]/ in your template. Note that it doesn’t account for already-escaped slashes because I don’t use them in RegExRX (its copy-as feature does it automatically), but that’s something I’ll probably fix sooner rather than later.
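To make the moving parts concrete, here’s a rough sketch of how such a template renders. The template layout and the RegexNote wrapper class are my own invention for illustration; only the variable names and helpers come from the actual script:

```ruby
require 'erb'

# Helpers assumed by templates, mirroring the ones described above:
# indent makes a Markdown code block, to_js escapes forward slashes.
class String
  def indent
    split("\n").map { |line| "    #{line}" }.join("\n")
  end

  def to_js
    gsub('/', '\/')
  end
end

# A hypothetical template using the documented variables
TEMPLATE = <<~ERB
  # <%= @title %>

  Search: /<%= @search.to_js %>/<%= @flags %>

  Replace:

  <%= @replace.indent %>

  Source:

  <%= @source.indent %>
ERB

# Wrapper that exposes the instance variables to ERB via binding
class RegexNote
  def initialize(title:, flags:, search:, replace:, source:)
    @title, @flags, @search, @replace, @source =
      title, flags, search, replace, source
  end

  def render
    ERB.new(TEMPLATE).result(binding)
  end
end
```

Rendering a note produces a Markdown document with the pattern slash-escaped and the replace/source blocks indented as code.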

Here’s an example template that imports nicely into Quiver:

The result in Quiver:

Side note: annoyingly, a lot of other snippet apps (like SnippetsLab) can’t just import Markdown files as notes. I had to import the results of this script into Codebox (which I think is now defunct) and then import that library into SnippetsLab.

And here’s the Ruby script. You need to have Nokogiri installed, which is (usually) just a matter of running gem install nokogiri (though depending on your setup you may need sudo gem install nokogiri and there’s a 50% chance you run into issues with libXML that you’ll have to search the web about).
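The hex decoding is the only clever bit: a hex string unpacks back to text with Array#pack. The XML below is invented for illustration (the real .regexrx element names differ), and I’m using stdlib REXML here just to keep the sketch dependency-free:

```ruby
require 'rexml/document'

# Invented structure for illustration; real .regexrx element names differ
xml = <<~XML
  <regexrx>
    <search>68656C6C6F</search>
  </regexrx>
XML

# RegExRX stores patterns as hex; Array#pack decodes hex back to text
def decode_hex(hex)
  [hex].pack('H*')
end

doc = REXML::Document.new(xml)
pattern = decode_hex(doc.elements['//search'].text.strip)
puts pattern # => hello
```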

Even if you don’t use RegExRX, I hope this inspires some data liberation for some folks.

macOS dialog shortcut keys

You probably know that I’m a big fan of keyboard shortcuts. I try to learn them all and rarely click a button or pull down a menu. But there was one big question I had to pose to Twitter: how do you trigger the “Delete” button in an “Are you sure you want to close…” dialog? I got the answer from Sören Kuklau: ⌘⌫. I figured that, since those are harder to find, I’d write out a few of my favorite dialog box shortcuts.

Full Keyboard Access

First, if you have Full Keyboard Access set to “All Controls” in System Preferences -> Keyboard -> Shortcuts (at the bottom), you can use Tab and Shift-Tab to navigate all the buttons in a dialog. The fully highlighted button (which won’t change while tabbing) responds to Return, and the one with the highlighted outline responds to Space.

Don’t Save

Just like it has been for as long as I can remember, if the “Do you want to save…” dialog’s button reads “Don’t Save,” ⌘D will choose that option directly.


Cancel

In most dialogs you can use Escape to cancel, but not always. More reliably, you can trigger the “Cancel” button with ⌘. (Command-period), which is also old school. It was originally the “interrupt” command for canceling long-running actions in applications (and still is in some, such as Logic Pro X).


Delete

When you close an unsaved document in apps that are set up as document-based apps, instead of “Don’t Save,” the dialog’s ignore button reads “Delete,” indicating that if you choose not to save or cancel the action, the document will never be written to disk. The shortcut for that button is ⌘⌫ (Command-Delete).

Other Shortcuts

In any save or open dialog, you can trigger the field that lets you manually enter a path (with Tab completion) using ⇧⌘G, or by just typing a tilde (~) or slash (/). Typing a tilde opens the field with the tilde already entered, a shortcut to your user’s home folder. A slash starts at the root of the filesystem, handy for getting to /Volumes/other disk.

You can toggle the display of hidden files with ⇧⌘. (Shift-Command-period).

All of the Finder folder shortcuts work, for example:

  • ⇧⌘H for Home
  • ⇧⌘D for Desktop
  • ⌥⌘L for Downloads
  • ⇧⌘O for Documents

You can see all of the shortcuts for folders by opening Finder and pulling down the Go menu in the menu bar.

Web Excursions for June 13, 2018

Web excursions brought to you in partnership with MindMeister, the best collaborative mind mapping software out there.

Tripetto - Full-fledged form kit
The world needs better forms. Tripetto includes a visual form editor, a collector for gathering responses, and an SDK for developing form building blocks. It’s a self-hosted node app, so not for everyone, but really nice for flowing forms.
Dialogue : Screenplay Writer
Despite not being a screenwriter, I’m enamored with the idea of writing screenplay dialogue in text message format. Check out the product page and you’ll see what I mean.
Fathom - simple, trustworthy website analytics
Fathom (now available on GitHub) is a new website analytics platform built on simplicity and trustworthiness (recall the now-unsupported Mint?). Get the analytics you need without giving Google (Analytics) access to any of your visitors’ data. This site should be switching over soon.
Home Assistant
Home Assistant is worth noting in addition to all of my mentions of homebridge. Pointed out to me by Adrian Rudman, it has modules for just about everything you can imagine wanting to pull together (and add Siri/Alexa integration to).
Well that’s just freaking adorable. (a login widget that reacts to text field interactions)

Check out MindMeister and start brainstorming, collaborating, and boosting productivity.

Thoughts on the state of home automation

Forgive me for pontificating and recollecting like an old man for a while. I’ll be 40 next month, so I’m practicing.

Let’s start in the late 90s. My landlord had informed me that he sold the property I was renting, and I had a few weeks to move out. I’m still not sure that was legal. Fortunately, my parents had moved out of state but kept the house where I spent my teens, so I wound up renting their furnished basement while they were gone.

I lived with Aditi in that house for the first few years we were married. She put up with a lot of my early home automation experimentation (grudgingly). All of the light switches became X10 switches. I installed (poorly mounted) speaker systems in the bathroom, kitchen, and living rooms, running wires from the SoundBlaster card in my PC in the utility room. An AMP jukebox and a voice synthesis app let the house provide multi-room audio, and even talk to us.

I added remotes around the house. Alarm clocks that could also turn lights on and off. Slow-wake sunrises with the bedroom lights. The TV remote could control the TV, my homemade DVR, and lighting scenes. I hacked a couple of Audreys with WiFi adapters and LCARS menus (handcrafted in Flash) for touch-screen control of everything, and mounted them in the stairwell and the hall to the bedroom.

Back then, I was always the first one in the bathroom in the morning, so it was easy to have a morning automation routine that was just for me. It would only trigger once during the day, and only between 5 and 6am. The bathroom light shined directly into the bedroom, so a door close sensor would trigger the ramp up of the bathroom and kitchen lights, start the coffee maker, and then proceed to read me the weather and my appointments for the day in a hushed tone.

In a time when most people considered voice control the stuff of sci-fi, I rigged the house’s late-80s intercom system up to my PC running a voice accessibility program and HomeSeer to control all of my X10 switches. It didn’t work terribly well, but I could instruct the computer to “turn on the lights in the living room” as long as I’d left the intercom in “listen” mode (or walked across the room to press the “talk” button that was 3’ from the light switch).

Eventually I got my own house. The electrical system was noisy, and X10 (which communicates mostly over power lines) stopped working as well. Over time I upgraded my system to Insteon, and when I switched to Mac in 2000, I started using Indigo. The hardware interfaces for Insteon also controlled X10 devices, so everything kept working together.

Just like with my HomeSeer setup, I could program complex criteria and sequences for my home in Indigo. I could have multiple lights respond to a single switch, and respond differently based on variables.

I could have motion detectors that were smart enough to keep each other active when only one was seeing motion at a time, or trigger different events based on the order that the hallway motion sensors were triggered (so lights can follow you).

I could have open/close sensors on doors. I could have moisture and light sensors trigger anything I wanted to.

I could connect anything I wanted to.

And I could hack anything I wanted to. I took a beam sensor from a grocery store door, originally intended for counting foot traffic, and turned it into a laser trip wire that controlled software variables in the system. That made directional motion sensing way easier. I hacked a Radio Shack mailbox light sensor to announce when the mail had run, waiting until I got home if the house was empty. I combined a series of motion sensors, open/close sensors, and the beengone utility I wrote to determine if I was in my office, routing notifications (and summons from the wife) to blink(1)s and Powermates if I was there.

Then Siri came along, and what I wanted more than anything was to tell her to do some of this stuff for me. Good voice recognition and response means infinitely more possibilities than any switch available. Being able to turn on the deck lights from my watch was an amazing prospect.

When I started, X10 was it. Since then, protocols have become disparate and proprietary, and sticking with X10, Insteon, and Z-Wave meant no HomeKit compatibility. I’ve since hacked my way around that with homebridge, and it’s been working well. Even though my lights are a combination of various protocols and manufacturers, I can control them from bed with a “Hey Siri,” and turn off the basement lights from the living room by talking to my watch like freaking Dick Tracy.

Over the last year I’ve been adding Hue bulbs, switches, and motion sensors. They’re amazingly responsive and reliable, and I can control them with both Siri and Alexa, though initially not with Indigo. Unless I settled for the more limited capabilities of the Home app on my iPhone, I couldn’t really make them talk to each other or control them via a central, scripted platform.

When I started adding Alexa devices to the mix, I ran into more issues with tying it all together. Then I discovered a homebridge plugin that broadcasts the whole system to Alexa as well. I also discovered that there’s a plugin for Indigo itself that provides 2-way integration with Hue products, and one that integrates Alexa right in (without needing homebridge). So now I can incorporate all of these newer products into my scripting and control system.

In addition to being able to control all of my devices, scenes, and routines via both Siri and Alexa, I can also easily integrate things like Flic buttons, thanks to Indigo’s REST API. This means I can have a button under my desk that toggles lights (and absolutely does not close or lock any doors automatically). The API also means that services like IFTTT can provide some glue that would otherwise be overly complex to engineer.

I can use the latest devices, and still have a system where a wall switch determines whether it’s daytime, and executes a different function than it would at night. I can have an open/close trigger on the bathroom door that triggers different routines based on the time of day. I just need to get RFID or BTLE identification working so that I can control actions based on who is triggering them.

I get some of the best voice control I could reasonably ask for (more with Alexa than with Siri), and the best of scriptable automation. Automation hardware is becoming more affordable, though the protocols are becoming more and more fragmented. I’m quite thankful for HomeKit in this area. Manufacturers can have all the proprietary protocols they want, and as long as everything publishes to HomeKit, it all works together. (Same with other home automation protocols, as long as manufacturers publish to them.) It is annoying to have to buy a different hub for every protocol, but it’s not as though any serious home automation enthusiast expected minimal hardware to be part of the equation.

While I’d still love to see HomeKit itself become more scriptable — allowing full control over all of the devices it recognizes with complex interactions between switches, sensors, and devices — I can’t imagine programming it all on any mobile device. Which, of course, means Apple isn’t going to be interested in developing it. So Indigo gets to fill in the gaps.

I love the idea of the Apple TV or the HomePod as a central brain. For a while there it seemed like the coolest stuff was only going to work when a specific iOS device was around to interface everything. Eventually I might not even have to have an always-on Mac mini processing everything. Honestly, the hardware options (within the Apple ecosphere and outside of it) are getting really good, and the Home app itself isn’t bad. Most of HomeKit’s target audience isn’t going to feel constrained by it. Only the sci-fi-loving nerds who probably have the skills to do it anyway are going to feel the need to build desktop apps and controllers.

I’ve been doing this for a long time, and I’ve never been more excited about the possibilities.

Alexa and Siri and bringing it all together

I have a longer home automation post in the works. It’s actually more philosophical than “how-to,” so I’m taking my time with it. My discovery this week bears mentioning on its own, though.

I’ve become more and more enamored with Amazon’s Alexa, and fascinated with its superiority to Siri. Echo dots are relatively cheap, and the Philips Hue integration with the spying little devices is polished. My biggest issue was that the majority of my home is automated using devices that are neither Alexa nor HomeKit compatible, at least not in a way that works with all of the scripting I’ve done previously.

I’d hacked around the HomeKit issues using homebridge, which I’ve talked about before. It requires an always-on home server, so it’s not a solution for everyone, but it did the trick for me. I don’t have a HomePod, but I imagine it would be a nice addition to that integration. What I do have is 4 Echo Dots, and what I wanted was Alexa control over my Indigo setup.

Then over the weekend I discovered that there’s an Alexa plugin for homebridge. The setup is, relative to the Siri setup, really simple. With the combination of my Indigo plugin and the Alexa plugin, I have complete voice control over all of my Hue devices and my Insteon/Z-Wave devices, as well as access to my custom Indigo Actions and Triggers.

If you have any kind of similar setup, it’s definitely worth looking at the homebridge-alexa plugin. You’ll need an account through cloudwatch to install the Alexa skill (search for homebridge in the Skill section of your Alexa app). You’ll also need to run your homebridge instance in insecure mode (homebridge -I). Altogether it took me about 15 minutes to have full Alexa access to all of my devices, and I can even add Insteon and Z-Wave devices to Alexa’s “rooms,” so that I can just tell the Dot in my office to “turn on the lights” and it knows which lights to toggle.

It has the further benefit of allowing me to ask Siri and Alexa to do the same things, and not have to think as much about which one has which capabilities.

In case it didn’t come through in my writing, I’m very excited about this.

Shell Tricks: Autocompleting system sound names on the command line

I find the bash commands complete and compgen overly mysterious, so I’m often playing with them to try to get a better grasp on all of the poorly-documented options. complete is a shell builtin with no man page; the terse help complete output is all you get. It’s vague.

I’ve detailed some of my other exploits with this in the past, one of my favorites being custom app alias completions. This time I wanted to go a lot simpler.

The afplay command comes default with OS X’s BSD installation. It’s the “Audio File Play” command used to play sound files in compatible formats. I usually use it in scripts to play the system sounds (Glass, Basso, etc.). So I wrote a quick function to make it easier to get to those:

play() {
	command afplay "/System/Library/Sounds/$1.aiff"
}

With that in place, I can just call play basso and it will play the sound. I don’t always remember the names of all the sounds, though, which means I have to ls /System/Library/Sounds to see them. A perfect job for shell completion, right?

So here’s the simple script that I source in ~/.bash_profile to give me tab completion of system sounds, listing them all if I haven’t started typing a word yet.

What it does is create an array from the result of listing the sounds directory and getting the base name of every file minus the .aiff extension. Then, rather than using compgen to do the matching, it uses a custom loop to handle case insensitive matching. This would normally work by default with compgen and shopt -s nocasematch, but for reasons I’m not clear on, it doesn’t when you’re providing a custom list.
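The shape of it, reconstructed from the description above (this is a sketch, not the exact published script; the SOUNDS_DIR override is my own addition):

```shell
# Sketch of the completion function described above. SOUNDS_DIR is an
# added override; it defaults to the standard macOS sounds location.
_complete_system_sounds() {
  local dir="${SOUNDS_DIR:-/System/Library/Sounds}"
  local cur="${COMP_WORDS[COMP_CWORD]}"
  local f word lcur lword
  local -a sounds=()

  # Build the candidate list: basename of each file minus the .aiff extension
  for f in "$dir"/*.aiff; do
    [[ -e "$f" ]] || continue
    f="${f##*/}"
    sounds+=("${f%.aiff}")
  done

  # Manual case-insensitive prefix matching; compgen -W ignores
  # nocasematch when handed a custom word list
  lcur=$(printf '%s' "$cur" | tr '[:upper:]' '[:lower:]')
  COMPREPLY=()
  for word in "${sounds[@]}"; do
    lword=$(printf '%s' "$word" | tr '[:upper:]' '[:lower:]')
    [[ "$lword" == "$lcur"* ]] && COMPREPLY+=("$word")
  done
  return 0
}

# Attach the completer to the play() wrapper only
complete -F _complete_system_sounds play
```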

The _complete_system_sounds function is used to case-insensitively complete the “play” function when I call it, so typing play f[TAB] will offer me “Frog” and “Funk.” afplay completion continues using the Bash default; only my custom function is affected.

Neat trick.