Freedom and Compliance

I'm speaking at Atlassian Summit this year on the topic of Continuous Delivery at Scale. In the lead up to the conference, which is April 9-11 in Las Vegas, I recorded an interview about the topic with our marketing team:

It came out pretty well and I'm now even more excited to speak.

Mar 22nd, 2019

Jekyll to Day One

After pushing my post on How I Use Day One live the other day, I got to thinking about other Day One use cases that I wasn’t actively exploring which might be useful or interesting. The first one that came to mind was something I had considered in the past but rejected for largely refutable reasons: storing a backup of the posts on this site. Since my posts are already markdown-formatted, storing them in Day One should have been something I did ages ago, but I had wanted to keep my first post date to my younger son’s birthday. Since my blog existed before that, storing those posts in Day One would “mess things up” in some sense.

Except it was my own stupid rule so I got over it.

Now, since it’s all Markdown, I could just copy and paste every single entry into Day One, update the date/time, add some tags, and repeat… but that’s a lot of work and absolutely no fun. Instead I spent my Sunday afternoon writing a python script to parse the YAML front-matter for each post, re-format any content which used Liquid tags and Jekyll plugins, and rip out images to a local directory so that all of this could be imported to a Blog journal via the Day One CLI.
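
The heart of the script is small enough to sketch. What follows is a minimal, illustrative version rather than the actual script from the repo: it assumes PyYAML is installed and the Day One CLI (dayone2) is on the path, it strips Liquid tags outright instead of re-formatting them, and it skips the image extraction entirely. The dayone2 flags are from memory, so verify them against dayone2 --help.

# Minimal sketch, not the real script: PyYAML parses the front matter,
# Liquid tags are dropped rather than re-formatted, images are ignored.
import re
import subprocess
from pathlib import Path

import yaml

LIQUID = re.compile(r"{%.*?%}|{{.*?}}", re.DOTALL)

def split_front_matter(text):
    # Jekyll posts start with a YAML block delimited by '---' lines.
    _, front, body = text.split("---", 2)
    return yaml.safe_load(front), body.strip()

def import_post(path):
    meta, body = split_front_matter(Path(path).read_text())
    body = LIQUID.sub("", body)
    cmd = ["dayone2", "--journal", "Blog", "--date", str(meta["date"])]
    tags = meta.get("tags") or []
    if tags:
        cmd += ["--tags", *tags, "--"]  # '--' ends the multi-value tags option
    cmd += ["new", f"# {meta['title']}\n\n{body}"]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    for post in sorted(Path("_posts").glob("*.md")):
        import_post(post)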

This mostly worked, except where it did not. For instance:

  • Any remote images whose links were broken. The code doesn’t handle that situation, so I had to manually fix the whole two posts that were like this.
  • Sometimes an entire post would be in italics or bolded after import. I saw nothing consistent about these, and I suspect a Day One issue as I’ve had random “stuck in italics” problems on the Mac client that I cannot reliably reproduce.
  • Day One no longer technically supports footnotes. They still work for now, but I’m wary and in a few spot-check cases I just removed them and put the content in-line. This isn’t much of a backup but makes the post more portable to a non-Markdown-based system in the future. Maybe.
  • Standard Markdown has no strike-through syntax, so in my blog entries I used s tags. Those translate as code fragments in Day One, which is weird looking. It’s just a visual aberration.
  • Any cross-reference links that make use of Liquid tags just stay as raw Liquid in the entry. I fixed those ~30 posts manually.
  • Any time my Liquid syntax or YAML front-matter wasn’t perfect. Jekyll is more forgiving than me, apparently, so there were a few posts that threw really ugly error messages when their imports failed.
  • Too many markdown reference-style links in a post just break the entry and the links stop working. I only had a couple of posts like this but they required a lot of work to fix.

Of the 232 posts that I imported, about 55 had something about them which needed to be fixed.

Since the script might be useful to someone else, I put it up on Bitbucket. You can find it here. I have a little clean-up to do in the repo, but it’s serviceable.

My “Journal” journal still has the start date that’s important to me, but now my site’s posts are preserved in another format that’s just as, if not more, important to me.

Dec 4th, 2018

How I Use Day One

Or, why I don’t write on this site anymore.

Why I Journal

When my younger son, B, was born, I wanted to record my thoughts on being a dad to two boys. My handwriting is horrific, so a written journal was not going to support this habit. I had an iPhone, and was already snapping photos of our older son all the time, so the fusion of my circumstances led me to buy Day One for iOS on the (really early) morning of B’s birth, and write about sleeping and feeding and how this all compared to when his brother was born.

Even over five years ago, the UI for Day One was really superb. It had been the App of the year in 2012, and it won an Apple Design Award shortly thereafter that was well-deserved. Writing in it was (and still is) a joy. I could use the form of Markdown I already knew and loved, and the stylesheet looked great. I grabbed the Mac app a week or two later and could write from nearly anywhere. At the time, it only supported a single journal (called Journal) and synchronization was done either via iCloud (buggy as all get out) or Dropbox (super-reliable but a little slow). I chose the latter, and upgraded to the “Plus” tier at some point to be able to have multiple photos per entry and multiple Journals. I used both IFTTT and Slogger to automate my social life and other services into my journal.

A couple of years ago, Day One introduced a Paid subscription tier, as many other app/service vendors have. I avoided upgrading for a long time, but as they started talking about their roadmap (end-to-end encryption, audio recordings, web editing, etc.), I realized that this app’s future was actually important to me. I was using it almost daily, and if I wanted it to be there in the future, it behooved me to provide, in some small part, financial support. Since then, those first two features have arrived and then some, and they continue to be worth every penny.

Realistically, though, these are excuses. I don’t journal just because I like writing, I journal because I don’t want to forget. While I can remember convoluted plot lines of almost any fiction I consume, I struggle to remember precisely what I was thinking about 30 minutes ago, let alone yesterday, last month, or last year. On a recent episode of Hardcore History Addendum, Dan Carlin said something to the effect that, to people in the midst of some event, that event is “the most important” thing happening, but as time separates the observer from the event, and it is seen within the grand context of existence, that event is somewhat unimportant. Even historical figures like, say, Aristotle, may have just been some guy yelling in a marketplace, but with a good publisher...

In any case, I journal because I know I’d forget details of my life that have monumental value in the moment and which may have some value to future me. When I look back at these writings through the lens of time I get to re-live some of that weight and re-learn some of the lessons I was learning then. That has supreme value to me.

What I Journal

I use Day One for, at the moment, 8 major categories of things, and these line up with the individual journals that I have:

Journal

This is where I do general journaling about life. It’s for things I’m thinking about, for when my kids do something cute, or for when I take a nice picture of my dog that I want to capture. This is what a “journal” might be if I only used it for personal activity and life commentary.

As I take photos on my phone, any good ones, or any that catalog what I have been up to recently end up in Journal with a paragraph or two about it, why it was momentous, etc. I liberally tag my entries with people’s names, topics, event groupings, etc. I’d love to have richer tagging for people to make a mini/personal social graph of people and places and events, but that sort of scope creeps beyond a journal and I like that Day One focuses on what it does best.

I automate hardly anything into Journal, save podcast notes which are templated by sharing an Overcast link with a Shortcut that pulls down all of the metadata and formats the entry. All of the rest of this content is created by me. I do enjoy the Activity Feed view in the iOS client as it helps me catch up and write entries about places and photos of the last few days.

Faith & Scripture

This journal is for sermon notes, Bible study notes, thoughts on a verse of the day that I want to capture or highlight, things where I see God clearly doing something in my or someone else’s life that I want to remember, etc.

I have a particular format for some entry types and I am really looking forward to in-app templates that I can use for that content. I tag each book of the Bible mentioned in each entry, as well as any themes, study plans, etc. So far Genesis has the most uses followed closely by 1 Corinthians, but that’s mostly because every time I start a chronological study, I am really diligent about journaling my notes and less so over time. I found the F260 method earlier this year and that’s certainly helped.

I don’t automate anything into this Journal. I would love to see an integration between The Bible App - YouVersion and Day One so that my highlights and notes could be brought over. I considered building a Shortcut for it, but copying and pasting isn’t many more steps than sharing to a Shortcut would be.

Daily Book

This is an inconsistent attempt to write a summary of my working day. I earnestly tried to write this for about a month. It was useful at the time, and I should get back to it. It’s another use case for a structured template.

I semi-automate the creation of these entries with a reminder template. When the editor switched out of Markdown to rich text, though, the template got all wonky. My ideal world is one where, every morning, some service or process creates a template with the content I want in it from Google Calendar and Todoist and then places it in my journal to be filled in throughout the day. This journal isn’t encrypted so that might be possible with the CLI. A project for a rainy day, perhaps.
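
A rainy-day sketch of that idea might look like the snippet below. The calendar and task lookups are purely hypothetical placeholders for real Google Calendar and Todoist API calls; only the templating and the dayone2 call are meant literally, and even those flags should be double-checked.

# Hypothetical sketch: fetch_events() and fetch_tasks() stand in for real
# Google Calendar / Todoist queries; the entry lands in the unencrypted
# Daily Book journal via the dayone2 CLI.
import datetime
import subprocess

def fetch_events():
    # Placeholder: today's meetings from Google Calendar.
    return ["09:30 Team stand-up", "14:00 Customer call"]

def fetch_tasks():
    # Placeholder: tasks due today from Todoist.
    return ["Prep Summit talk", "Review support queue"]

def build_template():
    today = datetime.date.today().strftime("%A, %B %d, %Y")
    lines = [f"# Daily Book for {today}", "", "## Meetings"]
    lines += [f"- {event}" for event in fetch_events()]
    lines += ["", "## Tasks"]
    lines += [f"- {task}" for task in fetch_tasks()]
    lines += ["", "## Notes", ""]
    return "\n".join(lines)

if __name__ == "__main__":
    # Run every morning (cron/launchd) on the Mac where Day One lives.
    subprocess.run(["dayone2", "--journal", "Daily Book",
                    "new", build_template()], check=True)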

Instagram

All of my wife’s and my Instagram feeds dump our photos here via IFTTT. I add/edit tags after the fact to include people’s names that we wouldn’t tag on Instagram, like our kids. I’d love for Day One to support multiple Instagram accounts so that I didn’t need to have multiple IFTTT accounts to cover all of these feeds.

Reading

This journal is for any time I am reading a book and want to quote something, or I am reading something on http://instapaper.com/ and I highlight it. The Instapaper entries are created via IFTTT as one per highlight, so I go back later and consolidate.

To capture text from physical books, I use either Scanner Pro or Prizmo Go. Both work pretty well and I haven’t settled on one that I prefer. I don’t use tags much in this journal.

Libations

My whisky (and Scotch and Bourbon) tasting notes go here. I use a 5-star rating system, tag by rating, region, variety, etc., and write my thoughts on the liquor. I have a few beers in here also, but it’s mostly whisky and whisky-like drinks. I’ve used a handful of iOS apps for this purpose before, but their providers all fall into dis-use over time. My notes are really for me, anyway, but I’m always happy to share.

Entries into this journal are 75% automated via a Siri Shortcut that I wrote. In addition to needing to write my own notes for obvious reasons, I also have to have a photo already in my photos library, though I suppose I could modify it to let me take a new one at the time. There is no searchable whisky database I could use to back-end my creation process. That would be nifty.

Notes

This is for general notes which I want to be more permanent. Notes from one or more transient note applications end up here if I want them to live forever. More on this in a bit.

The Plan

My wife and I are starting to look at other properties. So far there are only two entries; we’re not that serious yet.

This journal is almost entirely automated. I wrote a small Python script to scrape relevant details out of a Zillow listing and shove them into a journal entry that I can go back and write about later. I share a listing to a Shortcut and the entry is created in a few seconds. Right now my script is running on PythonAnywhere, but I will probably move it to run only locally via Pythonista.
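
For flavor, the scraping half might look roughly like the sketch below. It leans on requests and BeautifulSoup and assumes the listing page exposes standard Open Graph meta tags, which may not match Zillow’s actual markup; the real script pulls more details than this. In my setup the Shortcut sends the URL over and drops the returned Markdown into a new entry.

# Rough sketch only: assumes og:title / og:description / og:image meta tags
# are present, which real listing pages may or may not provide.
import requests
from bs4 import BeautifulSoup

def listing_to_markdown(url):
    html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    def og(prop, default=""):
        tag = soup.find("meta", attrs={"property": f"og:{prop}"})
        return tag["content"] if tag and tag.has_attr("content") else default

    title = og("title", default=url)
    summary = og("description")
    photo = og("image")

    entry = [f"# {title}", "", summary, "", f"Listing: {url}"]
    if photo:
        entry += ["", f"![Listing photo]({photo})"]
    return "\n".join(entry)

if __name__ == "__main__":
    print(listing_to_markdown("https://example.com/some-listing"))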

As we get more serious I’ll probably start using the audio recording feature to capture our thoughts as we walk around a property or as we leave.

Gratitude

(Wait, that’s nine…)

I’m reading through The Gratitude Diaries by Janice Kaplan right now, at the recommendation of the Day One community. It’s a pretty good read, and I think I’m going to give her method a shot in 2019. So that’s one more journal, but I haven’t started that yet, so it’s still technically 8.

Day One as the End Game

So I use Day One for a few things, eh?

In an article on The Sweet Setup, Josh Ginter wrote about what he uses each of Bear and Day One for and why they have different places in his tool belt. I admit that Bear is useful; I drafted this post in Bear, for instance, but ultimately Bear is transient for me, much the way that Apple Notes was for a long time. What he says rings true to me, too: Day One is the end game; anything that I want to have last for any reasonable period of time finds its way into Day One. Bear is for meeting notes and packing lists and drafts of things I will eventually journal.

That’s where my Notes Journal comes in handy: for those notes (that were originally) in Apple Notes or Bear which have long-standing purpose, there is a specific journal where they end up. If Bear disappeared tomorrow (read: the day after I publish this draft) there isn’t much I’d miss. If Day One were gone, I’d be crushed.

That concept is what underscores the purpose of Day One in my life: it is a commonplace book, as another Sweet Setup writer put it. I automate the process of funneling my other creations into Day One so that I have one place for things. I don’t care that Instapaper has my highlights, for instance—they have been copied into Day One and that’s where I read and review them anyway. I used to be a Pocket user and only switched because Instapaper had highlighting that I could put into Day One. It looks like Pocket can, now, too, so one day I may end up switching back. The initiating platform is unimportant; the result is what I care about.

A note about encryption: I’m a big fan; however, as you might expect, an encrypted journal cannot be written to by external automation systems such as IFTTT. This means my default is that a journal should be encrypted unless I plan to automate entries into it. Automation from Shortcuts still works regardless of the state of the journal, so long as I’m willing to open Day One at the end.

Memories

As I said when I started this post, though, the other, and really more important reason why I do all of this, is so that I remember. My natural memory is shockingly poor at times. I have a very hard time remembering events or remembering what I was thinking when something happened, or my rationale for a decision. Journaling gives me a chance to capture that information and then re-visit it on occasion.

Day One makes this all the easier with the “On this day” feature, which ranks as pretty much every user’s favorite thing about the app. Every single day, I look at things I wrote between one and five years ago on that same day. It’s humbling to see where we all were in life even just one year ago. The most amazing is 3-4 years ago when my kids were at very different stages of life. These are the years my mind has found easiest to lose track of and without this journal I would not regularly have the joy of looking back on them at those ages.

Memory is a funny thing. As prideful humans, we think that our memories are indelible, especially events we consider particularly important. A study was done in the aftermath of 9/11 where people who were in NYC and up close to the event were asked to describe where they were that day, on the one-year anniversary, at two years, at five years, and finally at ten years. They were then shown what they wrote at each of those increments. Most whose depictions differed over the years were adamant that they could not have possibly written what they wrote at the one-year mark, that their memory now was clearer, and that their writings must have been altered (despite agreeing it was, indeed, their handwriting...).

We really do give our brains too much credit. Instead, as we get further removed from an event, our brains fill in gaps with other things people have said, other events we have experienced, or even just crap it made up. When we try to look back on past events, we need to remember that we look through a mirror darkly, not through a magnifying glass.

Unless, that is, you wrote it all down!

In my case, on the morning of 9/11/2002 I wrote down what happened on that day one year prior. I have a journal entry long-ago-exported from LiveJournal which survived imports into my subsequent MovableType- and then Wordpress-based blog(s) and several years of languishing in a mysql db export. I know pretty well what happened in my life that day because I can read about it and refresh the gaps in my fuzzy memory with my own words.

Forgive the writing style… it was 16 years ago. Names changed for privacy:

When Flight 175 barreled into the South Tower, everything stood still. It was the first one I watched happen on CNN. I woke up around 8:55. My friend [Jill] had something in her [AIM] profile about “my heart goes out to all the families..” I got spooked really quick. My energy went, and I turned to the only authoritative news source I could think of, msnbc.com. There I saw pictures of the North Tower, and was impressed that... wow... this happened about 8 minutes ago, and there's an almost full spread on it.

I tuned to CNN as the maintenance dude knocked on my door to fix my rug (it was bunched up in front of my doorway). We both watched CNN as the South Tower was hit. Then it was 9:04, and the panic continued.

It was a long day at work, even though we closed early. Many of the staff watched CNN on the TV that was normally reserved for Printer and Public Site status reports. I kept #news rolling on IRC b/c it was many times more up to date than cnn or msnbc… those sites were smacked pretty bad that day.

I tried calling [Jess in NYC], but the phones were all a mess. Thankfully her building hadn't lost its internet connection, and she was online.. 'scared to death' but still alive.. that's all that mattered to me. It was only a dozen blocks from where she slept.

We kept asking the consultants if they had family in NYC - many do. It took a few days, but everyone's everyone was accounted for. I called [Allie] to make sure her dad was ok, and he had been in Disaster Mode at [redacted place of work] all morning, but he was fine. Everyone I knew was ok, but I'm a lucky one.

God be with us.

That day is mostly indelible for me because I made the effort to make it that way, and I re-read that post on 9/11 every year because I put it in Day One a few years ago, so now I see it every other! There wasn’t much effort to do this. It was worth it for that memory, and is worth it for the others that I capture as my life has changed since B was born. A lot has happened since then, and it is a real blessing to be able to read about it and feel some of those feelings again.

What Next

I do have some regrets that I didn’t start this all sooner. I am down to a single grandparent, for instance, and I have lost a lot of memories from my childhood for not having been a faithful journaler at a younger age. The journals I do have from my adolescence are hilarious, embarrassing, and should be used as fire-starter. My other LiveJournal-era entries are similarly embarrassing for the most part.

That said, there are more things being considered in the app that I am looking forward to:

  1. More and more functionality for audio recording. This has been an amazing addition to the tool belt for me. I record a post every now and again while walking the dog. If it’s short enough, Day One transcribes it for me and I can edit it after the fact. It’s hard to type when walking a dog, as you might expect. I’m planning on buying an Apple Watch Series 4 in the new year to replace my aging Series 0. One of the features that works on the newer watches is that you can natively record audio on-device for up to 90 minutes. This will be a game-changer for me. When I’m having a chat with one of my kids that I want to remember or when my wife and I are wrestling with a decision, or when I cannot take notes on something someone is saying, I can revisit it later and also not worry about its privacy due to the end-to-end encryption of my most sensitive journals.
  2. Video support. Most of the videos I take with my phone are there to chronicle a short event. I don’t want them on Youtube or another hosting service, and for now they are “collecting dust” in a Dropbox folder tree. Adding video support in Day One would be supremely helpful. I have converted some very short videos to GIFs to make an entry more “alive”, though, and that’s nifty.
  3. Shared journals. This is one extension for my “The Plan” journal which would open up a lot of potential. My wife, who doesn’t use Day One much, could open the same journal and add her thoughts.

Again, Day One is the end game. It’s where all of the stuff I want to keep track of ends up.

I would like to automate more things into Day One, but I’ve reached the limits of my digital life for now. There is no shortage of inspiration out there for automation, though.

Why I Don’t Blog Anymore

And now we come to the payoff…

A colleague of mine asked me what my next blog post was going to be about way back in May. I was sure I would write about the treehouse I was going to build, but I never really wanted to share it like that. It was a rushed process that I over-engineered, and I wasn’t sure how to describe it. I sure do have a ton of journal entries about it, though, and that is why I do not post here any more: most of my written thoughts are for later recall and not for public consumption. I don’t have any need to share them, so I just do not.

It’s not healthy, though, to keep everything inside. I firmly believe that the best life is one shared with, and especially in service to, others; however, that doesn’t mean I need to blog more. It means that I need to share life with people 1:1 and 1:several. A customer of mine stumbled upon this blog and mentioned to me that he liked what I posted and would enjoy seeing more. I get it; I consume other people’s writing all the time, so wouldn’t it be fair for some to expect me to share in kind?

Despite being an extrovert, this isn’t the medium I want to contribute my thoughts to right now. I could make an empty promise to the ether about aiming to write more, but who am I trying to impress? I am definitely going to journal more, but I promise nothing about this place.

Dec 1st, 2018

Admins of Atlassian Podcast Appearance

Finally.™

I set a goal a long, long time ago to be on a podcast at some point. I've been an avid consumer of them for over a decade, starting with TWiT and Hardcore History on my long train rides between Everett and Wellesley back in 2004. As much fun as I thought it would be, I struggled to figure out how to get started. I could talk about anything but the podcasts I enjoyed the most had hosts who really knew their topic cold and as much as I was willing to talk about something, I never felt I had a handle on any one thing enough to talk about it on a regular basis.

Fast forward a decade... I met Mark Williams at Summit 2015. I had started listening to his "Admins of Atlassian" podcast a few weeks beforehand, and so I took a shot and asked him if I could collaborate. Mark was kind enough to not say "no" right away, so I figured there was hope... He ended up taking a job at Atlassian shortly after we met, and the podcast went on hiatus for a good long while. After my move over, I reached out to see if I could help restart the podcast, and as well-intentioned as I am, I think Mark likes doing it on his own for the most part. It's his brand and I do not fault him one single bit for this.

But, Mark is an awesome guy and he recognized that there were some topics where having a guest host or two makes for a good show, and so when he asked, I may have been the first person to volunteer. Maybe.

I can neither confirm nor deny.

Turns out, I really did love the experience, and I really don't hate the way my voice sounds.

Take a listen below, or check the episode out here if you're interested, wherein Mark chats with Jennifer Van Leeuwen and me about our best practice thoughts around upgrading Atlassian tools.

Dec 19th, 2017

Enter Sandman on Classroom Instruments

This is so great... and I was wondering the best way to get my kids into Metallica...

Nov 18th, 2016

Publishing This Blog With Bitbucket Pipelines

Earlier this year, Atlassian released Bitbucket Pipelines, its Cloud CI offering, as a beta product coupled with their hosted Git/Hg source control service. I like to know as much as possible about my products, but I'm not much of a software developer1, so finding a use case to try out Pipelines was puzzling.

Then I remembered that I build this very blog. (Oops) It's not much of a job to run an Octopress/Jekyll build, but my current setup was highly dependent on a single machine at my house always being on with Dropbox working. It was a little too fragile. I host this blog on nearlyfreespeech.net, and though I know there are other hosting options out there that might do the whole build and deploy process natively for me2, I (also) try to never shy away from a challenge.

At Summit last month, Atlassian released Pipelines as GA, and set an intro price of FREE for the remainder of the year.3 I was out of excuses, so in my spare minutes over the past month I have tried to get a simple Jekyll build of the site to work in Pipelines.

And, of course, ran into one problem after another.

First, Pipelines runs its build in a Docker container. This isn't a negative, actually, but it added a complexity with which I had relatively little experience. I figured the simplest way to get started was to use an existing ruby image that was the same version as on my Mac and just install dependencies as part of the build. Jekyll, though, requires a JavaScript compiler and no matter what I tried to do in the build to install <s>one</s> all of them, the best option was to get node working since that's how my current build works. At that point, though, my build script was tens of steps each run, meaning my build-minute use was going to be super high each time I wanted to publish something.

Rather than build my own Dockerfile, which was really tempting, I decided to use another image in the Docker Hub that has both ruby and node already set up. It's not far from what I was about to do myself, so no use re-inventing...

...which was good, because second, I really didn't want to have to install Docker. Every time I have previously tried to install Docker and have it work reliably, the VirtualBox piece just dies on me at some point. (More on this in a moment, though, because I'm rarely this lucky.)

My blog is already a private bitbucket repository. I was able to skip a few parts of the setup and just enable Pipelines on my existing repo, though I chose to create a branch for this work which I merged in once I was happy with the results. I also wanted my repo to be as simple as possible, so I spent some time adding items to my .gitignore and expanding the rake task for cleanup to remove the generated site.

An aside: Now that I work for Atlassian, I have a parallel set of accounts to the ones I had before as a customer. This means I have two bitbucket accounts, and as a result my normal method of keeping my bitbucket ssh key in my local keyring failed to choose the right key when working with my blog repo locally. Enter [git aliasing](https://developer.atlassian.com/blog/2016/04/different-ssh-keys-multiple-bitbucket-accounts/), which is super handy.

That fixed, I looked at how my current site generation task is run4 and tried to replicate that via the bitbucket-pipelines.yml:

image: starefossen/ruby-node:latest

pipelines:
  default:
    - step:
        script:
          - bundle install
          - rake generate
          - rake deploy

After that, I knew I'd need to get key-based SSH set up to actually deploy the content:5

image: starefossen/ruby-node:latest

pipelines:
  default:
    - step:
        script:
          - mkdir -p ~/.ssh
          - cat my_known_hosts >> ~/.ssh/known_hosts
          - (umask 077 ; echo $SSH_KEY_VAR | base64 --decode > ~/.ssh/id_rsa)
          - bundle install
          - rake generate
          - rake deploy

And while I expected that to work, I quickly found out that rsync wasn't part of the ruby-node docker image:

image: starefossen/ruby-node:latest

pipelines:
  default:
    - step:
        script:
          - apt-get update
          - apt-get install -y rsync   # rake deploy pushes the site over rsync/SSH
          - mkdir -p ~/.ssh
          - cat my_known_hosts >> ~/.ssh/known_hosts
          - (umask 077 ; echo $SSH_KEY_VAR | base64 --decode > ~/.ssh/id_rsa)   # deploy key from a protected env var
          - bundle install
          - rake generate
          - rake deploy

This worked, nearly flawlessly, with one major problem: the results of rake generate did not match what I had at home, and for a little while I had a very ugly, completely empty, web site. (Oops)

After a respite and far too many (frustrating) re-runs, I decided I had no choice but to bite the bullet and get Docker installed locally. In the last 8-ish months, though, Docker has fixed the problem that had been bugging me for ever and ever, and removed the need to have a separate VM application running on the machine. Replicating my pipeline environment was therefore trivial6 and I ran through the build steps without any issues.

Which was infuriating, because it worked perfectly fine.

So, you know, it has to be the environment.

Turns out I was over-zealous in my cleanup efforts and had removed the Gemfile.lock from the repo, meaning that it was grabbing all of the latest dependency versions from rubygems. Somewhere in there was an update that broke generation altogether. The lock file was still there locally (because I had run bundle install outside of Docker at some point in the troubleshooting process), so once it was pushed back to the remote, the pipeline build completed flawlessly. And deployed. And I was happy.

Each time my site builds, I will be consuming about 2.5 build-minutes, so the 50 free minutes each month work out to 20ish posts per month (HA!) for free for the foreseeable future.

This was a great learning experience. Prior to this a majority of my CI experience had been using Bamboo to kick off mvn commands. I'd highly recommend using pipelines for simple build tasks, even if it's for things like unit tests, content generation, publishing to a remote, whatever.


  1. My weak-arse perl skills don’t count, nor do I write unit tests for my hack-jobs, relieving me of any sort of CI benefits…

  2. I had been using the free aerobatic.io hosting in bitbucket for a bit, and I could move to GitHub and host there with a CNAME, but I’m happy with my setup.

  3. Starting in January the cost goes up to a whopping $0 for 50 build-minutes, or $10/mo for 1000 if for some reason I need that much.

  4. Via Hazel, as a simple shell script.

  5. I appreciate that Atlassian made it trivial to store a local key securely via a protected environment variable.

  6. No, seriously. Maybe 5 minutes, while the kids ran around screaming and hitting each other with light sabers.

Nov 10th, 2016

Cubs Win! Cubs Win! Holy Cow!

Nov 3rd, 2016

Wintergatan

Breathtakingly awesome. The videos explaining how it works (Part 1 and Part 2) are fascinating. He plays the song on the keyboard at the end of part 2 to explain how he modifies a repeated set of melodies. It's almost as good on just a piano.

Sep 4th, 2016

Leaving on a Rocket Ship

In mid-2005, I was working as the Residential Network Support Manager for Wellesley College. It was my job to coordinate all of the technical support for the 2,000+ students, train two dozen students with almost no tech support background to do field and email support, and plan out the welcome program for getting new students acclimated to their network and software and such. Though I had been relatively successful there, I really didn't enjoy my job. There was a vast cultural difference between myself and the other staff, and despite my best efforts to push for material changes, they were met with an attitude of "that's not the way we do things", which was prevalent in EdTech, especially at smaller schools.

So I started job-hunting, and through a roller-coaster of life events, finally took a job doing application support for a small-ish healthcare tech startup called eScription. They made a suite of tools for Computer Aided Medical Transcription using a mixture of proprietary and open-source speech recognition tools and home-grown applications. The solution was sold as a SaaS platform with some on-premise workflow tools and a user-facing client embedded in Microsoft Word, and the entire stack was supported by a (then) 7-person team. We had about four dozen customers at the time, mostly individual hospitals around the country. The job gave me all sorts of opportunities to flex my unix muscles, and I grew with the organization, eventually managing one of the support teams.

About two years later, I moved out of Support and into R&D, owning one of those workflow products. Shortly after that, eScription was acquired by Nuance, and we were quickly assimilated into the vast acquisitive machine, joining Dictaphone and Commissure among the Healthcare division's more recent acquisitions. eScription was the go-forward platform for background-speech transcription1, and though we bled original staff members like a stuck pig, we generally flourished.

In late 2010, my team was spun off of eScription along with some other staff in the division to start a skunkworks project to deliver our first NLP solution to the healthcare market. Though we stumbled a bit those first couple of years and had a failed partnership or two, we learned from our mistakes and put some narrowly-focused products in the market. After a couple of additional acquisitions in 2014, we developed more of a footing, and now deliver the top-ranked Clinical Documentation Improvement platform in the industry, along with the under-pinnings for intelligent Radiology assistance, structured documentation generation from narrative medical text, and Computer Assisted Coding for medical billing. It's a platform I am proud to have helped build from the ground up.

...and now I'm done. Though it has been a great ride, my days at Nuance have come to an end. I am going to miss my team so very much, but after just shy of 11 years, I am ready for my next adventure: In mid-September, I am going to be joining Atlassian as a Technical Account Manager working remotely with East Coast enterprise customers. Over the last three years, I've become intimately familiar with their tools and services, and have gotten to know many of their staff. Atlassian looks like a great place to work, and I am looking forward to finding out for myself in just a few short weeks.

When the TAM program was introduced2, I pushed to have Nuance purchase this service as we were growing our footprint of users from a few dozen to a few hundred (which then turned in to a couple thousand). It was one of the best decisions we made. Our TAM has been instrumental in supporting our division's adoption of every tool that Atlassian makes, from JIRA to Bamboo, and ensuring we follow best practices along the way. I'm such an unabashed fan that I'm quoted on the TAM web site (scroll to the bottom):

Our TAM gave us product advice that was able to save a department of 300 people roughly four hours a night. —Matt Shelton, Engineering Manager, Nuance Communications

Admittedly I'm just as much of a fan of the Premier Support team -- these two services are worth every penny for an enterprise customer.3

This is going to be a very different job from the <s>one</s> many that I do now. For starters, I haven't been an individual contributor since 2006! Having only my own work products to focus on will be a change. Not working directly for an R&D group will also be a big shift, but I haven't been able to do much that is customer-facing in a while, and I am excited to get out there again. I'm going to be adding some DevOps experience and CI/CD exposure to the team, and given Atlassian's trajectory there is so much room to grow. I can barely wait.

Here's to getting on the rocket ship!


  1. If ever there were to be a niche…

  2. Late 2013/Early 2014, if memory serves.

  3. This wasn’t supposed to be an ad read.

Aug 27th, 2016

Maven Extension for Feature Branch Isolation

Last year I had the privilege of speaking at the Atlassian Summit on the topic of selecting a branching model when migrating a team to Git. I had a blast, but I also came away with a small amount of regret: During the presentation, I mentioned that my team had used a custom maven extension to automate the process of isolating our build artifacts in Artifactory. We had been, and at the time still were, working with our legal department to obtain clearance to publish the extension as an open source project.

That process, unfortunately, took a lot longer than I had expected. Thankfully, the wait is over. I am incredibly pleased to finally release our Maven Feature Branch Extension to the general public. The extension is published under the Apache Software License, using the same version as Maven 3.x. Take it, use it, fork it, etc. We'll track issues in our bitbucket project and try to get to them as quickly as we can. We'll also happily accept pull requests.

I am deeply embarrassed that there is a very real possibility that someone's work might have been stalled waiting on me for nine months. I have tried to reach out to everyone that asked for this extension at Summit, posted comments on this blog, and emailed me directly. I may have missed some and if you fall into that category, please accept my most sincere apologies. I'll be at Summit this year; I'd be happy to buy you a beer.

Jul 29th, 2016