jb… a weblog by Jonathan Buys

Adopting BBEdit Scripts for Vim

January 4, 2015

In addition to my experiments with the design of this site, I was also testing out BBEdit as my main writing and programming tool. BBEdit didn’t stick, but I did like some of the scripting the good Dr. Drang has done, and wanted to adopt a few for MacVim. I started with three of his scripts today, one to paste and select text in one command, one to convert a tab-separated table to Markdown, and another to even up the Markdown table so it’s easier to read in plain text.

Since Dr. Drang’s scripts read from stdin and output to stdout[1], converting them to Vim was very easy, once I found the right syntax for my vimrc file[2]. My first thought was that I could copy the same syntax I use for calling the outstanding formd by Seth Brown, but formd is meant to parse the entire text of the file, not just the selection. Eventually, I found my answer on StackOverflow.

My vimrc file now has the following lines:

" Even up a markdown table
vmap <leader>mn <esc>:'<,'>!~/Unix/bin/Normalize-Tables.py<CR>

" Convert a tab separated table to a markdown table
vmap <leader>mt <esc>:'<,'>!~/Unix/bin/Tabs-to-Markdown-Tables.pl<CR>

The first word, vmap, makes the mapping active in visual mode in Vim. Next, <leader>mn defines the shortcut ,mn for Normalize-Tables.py, and <leader>mt defines ,mt for Tabs-to-Markdown-Tables.pl (my leader key is mapped to the comma).

The next part, <esc>:'<,'>, leaves visual mode and addresses the range of the last selection, which is then passed through the command. The command starts with an exclamation point[3], and the mapping ends with <CR>, which stands for “Carriage Return”.
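Since the filter simply pipes the selected lines through any program that reads stdin and writes stdout, writing a new filter is easy. Here’s a minimal sketch of the pattern (my own illustration, not one of Dr. Drang’s scripts) that strips trailing whitespace from the selection:

```python
import sys

def strip_trailing(lines):
    """Remove trailing whitespace from each line of the selection."""
    return [line.rstrip() for line in lines]

if __name__ == "__main__":
    # Vim pipes the visual selection in on stdin and replaces it
    # with whatever the script writes to stdout.
    for line in strip_trailing(sys.stdin.read().splitlines()):
        print(line)
```

Save it somewhere on your path, make it executable, and a vmap just like the ones above will run it over the selection.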

I need to spend some time in my vimrc file to sort out the naming convention for all the key maps, but for now, I’m thinking “,mn” for “markdown normalize”, and “,mt” for “markdown from tabs”.

For the third part I’m borrowing from Dr. Drang, I wanted to paste and select text at the same time. Once again I had to turn to StackOverflow, and now have this mapped in vimrc:

nnoremap <leader>sp :set paste<CR>:put  *<CR>:set nopaste<CR> <Bar> `[v`]

The first part sets the mapping, ,sp, which I’m thinking of as “select paste”, and then pastes the text from the OS X system clipboard. Next, the <Bar> entry strings two mappings together in Vim. Finally, `[v`] visually selects the last change to the text.

So, now I can take text from Excel like this:

Left align	Center align	Right align
This	This	This
column	column	column
will	will	will
be	be	be
left	center	right
aligned	aligned	aligned

paste and select it with ,sp, followed by ,mt to convert the table to Markdown.
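Conceptually, the tab-to-Markdown step is a small transformation. This is my own rough Python sketch of the idea, not Dr. Drang’s Perl script, which handles more cases:

```python
def tabs_to_markdown(text):
    """Convert a tab-separated table into a bare Markdown table,
    treating the first row as the header."""
    rows = [line.split("\t") for line in text.strip().splitlines()]
    table = ["|" + "|".join(row) + "|" for row in rows]
    # Add the header separator row after the first line.
    table.insert(1, "|" + "|".join("--" for _ in rows[0]) + "|")
    return "\n".join(table)
```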

|Left align|Center align|Right align|
|--|--|--|
|This|This|This|
|column|column|column|
|will|will|will|
|be|be|be|
|left|center|right|
|aligned|aligned|aligned|

and finally ,mn to even the table up nicely:

| Left align | Center align | Right align |
|:-----------|:-------------|:------------|
| This       | This         | This        |
| column     | column       | column      |
| will       | will         | will        |
| be         | be           | be          |
| left       | center       | right       |
| aligned    | aligned      | aligned     |
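The evening-up itself boils down to finding the widest cell in each column and padding everything else to match. This sketch is my own approximation of what a script like Normalize-Tables.py has to do, not Dr. Drang’s code; it assumes every row has the same number of columns:

```python
def normalize_table(text):
    """Pad the cells of a Markdown table so the pipes line up."""
    rows = [[cell.strip() for cell in line.strip().strip("|").split("|")]
            for line in text.strip().splitlines()]
    widths = [max(len(row[i]) for row in rows) for i in range(len(rows[0]))]
    out = []
    for row in rows:
        if all(cell and set(cell) <= set(":-") for cell in row):
            # Separator row: stretch the dashes, keep the alignment colons.
            out.append("|" + "|".join(
                (":" if c.startswith(":") else "-") + "-" * w +
                (":" if c.endswith(":") else "-")
                for c, w in zip(row, widths)) + "|")
        else:
            out.append("| " + " | ".join(
                c.ljust(w) for c, w in zip(row, widths)) + " |")
    return "\n".join(out)
```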

As always, my thanks to the good Doctor for scripts and inspiration.

  1. As God intended. 

  2. Someday, someone will create the perfect app for managing your vimrc file, but today is not that day. 

  3. Or a bang, if you’re an old Unix guy. 


All Mine

January 3, 2015

I’ve been experimenting with the design of this site for the past couple weeks. First, I used a default Jekyll template, slightly modified to my liking. Next, I tried out a very nice theme that made good use of hero images and included nice typography. I changed the name of the site to “INTERACT”, and briefly considered leaving it at that. Unfortunately, the more I looked at the site the more it looked like it belonged to someone else.

Something reminded me of Merlin Mann and John Gruber’s South By Southwest panel where John said he wanted to own every pixel of his site, from the top left to the bottom right. I agree. This little part of the Internet has my name on it, it’s jonathanbuys.com, and I want to be able to point at this site and say “I built this”.

It’s not much, I’m not a web designer. I’ve tried to optimize the site to make it easy to read, and easy to port between hosting companies. I’m back to using my own Python script for generating the site, so there might be a few inconsistencies or characters that are missing or not rendered correctly. I’m going to work on that.

In the meantime, once again, I own every pixel of this site. From the top left, to the bottom right.


Cellular Options

January 2, 2015

I pulled into the gas station on my way home after a long day, picked up my phone in my left hand, intending to put it in my pocket, and opened the door of my pickup. While pushing the door of the truck open, the phone slipped out of my hand and fell face down on the pavement, shattering the screen.

I’m not the type who can carry around a broken phone for months, so the next day I called US Cellular to see what my options were. I was one year into a two-year contract, so the prospects of actually getting anything out of my carrier were grim. I was right. The support person on the phone said that since I didn’t opt into the US Cellular insurance program, I would either have to look into having the phone fixed, or buy a new phone at full price. Turns out, there are several options.

Buying a new phone at full price is the most expensive up front, and the option where I received conflicting information from the support people. Since the only iPhone 5C carried by US Cellular is the 8GB model (which should be an embarrassment for them and Apple), I was looking to buy a 5S, or possibly a 6. The 5S retails for $549, and the 6 for $650, but when considering the benefits of paying for the phone up front I became confused. The phone is not of much use without the cellular contract to go with it, and US Cellular charges a $40 “connection fee” for each line attached to my $70 per month plan. My question to US Cellular was: what is the benefit of buying my phone up front if I still have to pay the connection fee?

Faced with a $650 charge for a new phone, I pulled the ace from my sleeve and said the magic words: “I think I might cancel my account.” The cancellation fee is $350, which some carriers will pay on your behalf, which would bring my out-of-pocket cost down to $200, or maybe $100, to get into a new phone, albeit with a new carrier and a complicated situation with the rest of my family still on the old plan. I didn’t think it was a serious option, but mentioning the cancellation was enough to get me pushed through to another level of support, who was happy to move my upgrade eligibility up an entire year to keep me with the company.

So, now my options were a little better. I could buy a new phone at full price, buy a new phone at the discounted rate, I could finance the full price of the phone over two years (which is a new option I wasn’t aware of before), or I could keep my phone and fix the screen.

The question remained though, why would I ever buy a new phone at full price? The source of my confusion centered around the $40 connection fee. One representative told me that the fee was there so the company could recoup the cost of the subsidized phone. If so, they are making a healthy profit off of each phone sold; $40 per month over two years comes out to $960. Another representative told me that no matter what, if I owned my phone or if I bought it at a discounted price, I had to pay the connection fee. The only way not to pay the fee was to finance the phone.

If I financed the phone for two years they would waive the fee, but in its place I would be paying $32 per month on an interest-free twenty-month loan. Also, financing the phone would make me eligible to upgrade to a new phone every eighteen months, if I was willing to pay off the remainder of the loan, or trade in the phone for a new model. This seemed interesting, but it also seemed like a way for me to remain in debt of one kind or another continuously. Even though the total paid out seemed to make more sense over time, financing did not appeal to me.

So, I decided, grudgingly, to buy a new phone at the discounted rate. I found the nearest US Cellular store, and went in to buy an iPhone 5S. While I was talking to the representative I asked her again about the connection fee. This was at least the third time I’d talked to someone at US Cellular about the fee, and this time the answer was different. If you own your phone, and you are not under contract, there is no connection fee. This changed things for me. If there was no fee for my phone that I had now, the best course of action I could take would be to fix the phone. I told her about my change in plans, and the conflicting information I’d gathered. To make her point she removed the fee from my account while I was sitting there.

I left the store without a new phone, went back to my desk, and ordered a $35 replacement screen and toolkit from Amazon. In two days the screen arrived, and I spent an hour and a half replacing it. Now my white iPhone 5C has a white face on it, and a black button, giving the phone some character. Also, I’m not under contract with US Cellular. I think for my next phone, whenever that happens to be, I’ll be paying full price for it, and staying out of contract lock-in from here on out.


The Million Monkeys

January 1, 2015

Computers, the bicycles for the mind, the idea engines; when we work at a computer we open the door to limitless avenues of creativity. Cracking open the lid of a laptop can be the first step to writing a novel, starting a new career, or getting in touch with long lost friends. But, when the machines misbehave, when they don’t perform as expected or present their interface in ways that are difficult or impossible to decipher, even the most mundane of tasks become a chore. The possibilities for the future melt away under the perception that computers are difficult and unreliable, our untrustworthy opponent to getting things done.

I use a Mac because every time I need it it works, and it has a consistent interaction language. That is, the main tools all recognize a similar set of common keyboard and mouse commands. Cut, copy, paste, click & drag, swipe to go back, etc… in a well designed environment each application operates nearly indistinguishably as part of the whole. The Mac is everything I wanted from a Linux or BSD desktop: all the power of Unix under the hood, and a well designed GUI on top of it. As great as I’ve always thought open source is, after waiting for fifteen years I think it’s safe to say that the year of the Linux desktop is never going to happen.

I remember when I first learned about OS X and the FreeBSD underpinnings. I was overseas, stationed in England in the Navy. I spent my days building firewalls and web servers with OpenBSD, and my nights loading every version of Linux I could get my hands on into this beige IBM PC. I resolved that as soon as I got back to the states I’d get a Mac. At the time, desktop Linux was rough, I mean really rough. Most of the time the modem wouldn’t work, there was no broadband, some of the devices (like the sound card) would work, some wouldn’t. You might be able to get native resolution on your monitor, you might not. You could download and compile a new driver for your CD player to listen to music, but that might drop you into dependency hell.

Things have cleaned up quite a bit in the open source world since then. Package managers like yum and apt make installing applications a breeze, while automatically handling dependency issues. However, the overall quality of the systems still can’t compare to OS X in either application fit and finish, or in system reliability. I challenge anyone to come up with a desktop application that debuted first on the Linux desktop; I mean something truly original happening first on Linux. That’s simply not where the innovation in the field is happening.

I should stop for a moment to say that I believe open source takes two forms: consumer and server. Server-side open source has been advancing at breakneck speed, pushed by a major influx of talent and money from big companies like Google and Amazon, as well as entrepreneurs building on Linux to deploy their applications. The best tools for the server are being built in open source; the best software for the consumer is being built by independent Mac developers.

Years ago some thought that open source would eventually take over everything, we were always just one release away from Linux being great. After all, the open source desktop had thousands of developers all over the world, all donating their time and talent for free, because they believed in what they were doing. What has actually happened is that the Linux desktop has become just good enough for some limited day to day use in specific scenarios. Why haven’t the thousands of programmers been able to surpass OS X after all these years? Why isn’t the Mac playing catch up to PCs running Ubuntu, instead of the other way around?

Part of it has to do with how Apple controls both the hardware and the software of their systems. Part of it has to do with how Apple has been able to pay good wages to talented developers for a very long time. Part of it has to do with the hierarchy of designers within the company, all working towards a common goal and an overarching theme. Linux has thousands of developers, but it doesn’t have a single unified vision for what the system should be. Instead, they have massively distributed groups, each working individually on their own component, each defining how their part should work.

Linux has talented developers, and pockets of designers, but until they are all under one roof they won’t be able to compete with Apple. Apple has nearly 40 years of experience developing personal computers; that’s tough for anyone to compete with.

As for Linux? A million monkeys typing endlessly may someday create the entire works of Shakespeare, but in the meantime, they are going to write a lot of junk.


Merry Christmas

December 24, 2014

It’s nearly midnight on Christmas Eve. I’m the only one awake, perhaps with the exception of my dog, Oliver, although as the minutes tick by I’m less sure of him. Tomorrow morning the kids will wake us up earlier than we’d like, and we will tear into the presents, eat a wonderful breakfast, and have a fantastic day enjoying each other’s company.

Merry Christmas everyone, and have a happy new year.


Green Beasts

December 23, 2014

My commute takes me past the sanitary landfill every morning, a daily reminder to be careful about what I throw away, and what I can recycle. Driving by the dump doesn’t bother me per se, but the maniacs who drive the dump trucks do. I’ve learned to watch the turn into the dump, watching for the massive metal beasts, trying to anticipate when they’ll pull out, and if I’ll have to swerve out of their way.

They take left turns towards Des Moines, swinging out into the fast lane of 163 and hitting the gas. So far they haven’t hit me, but any time someone pulls into the road in the adjacent lane I get nervous. They appear too quickly, not yet with the flow of the traffic, a towering green structure of steel on wheels tilting slightly towards the top from the force of the turn. The dump trucks accelerate quickly, first matching my speed, then passing it. They’re in a hurry, they’ve got a job to do.

I get out of their way.

This morning one passed me on the bypass; I was going seventy.

It’s hard to be critical of someone doing their job, doubly so when their job is driving a dump truck. It’s not exactly the most desirable job in the world, although I understand they are well compensated. The trucks take me out of my own little world and remind me of our human bravery. Every day I spend hours hurtling down asphalt in my own steel beast, one mistake away from certain death. It happens every day; the sign on 235 says over three hundred people have died on the roads in Des Moines this year. We never think it could happen to us, until it does. I never think it could happen to me, until a green dump truck swings into the road unexpectedly and scares me half to death.


Things That Don’t Belong in Browsers

December 2, 2014

I still have a soft spot in my heart for Firefox, but it’s not my primary browser. I use Safari for just about everything, except on the rare occasion when I need Flash, when I use Chrome. Firefox is reserved for the even rarer occasion that I need a site that Safari doesn’t support properly. Since I use all three browsers, I keep an eye on new features and development, wondering if something new from Firefox will draw me away from Safari.

I don’t think it’s going to happen.

Firefox released version 34 yesterday, which included a handful of fixes and improvements, but also included these two new features:

  • New - Firefox Hello real-time communication client
  • Developer - WebIDE: Create, edit, and test a new Web application from your browser

This isn’t the first time a Mozilla browser has included a “real-time communications client”. Feature bloat that leads to bundling features better suited as separate applications is one of the reasons Firefox was split from the Mozilla browser to begin with, as this quote from the release notes of Phoenix 0.1 highlights:

Third, “Mozilla” is not the name of an application; it is the name of a monolithic suite containing a browser, a mail client, an irc client, and an indoor skating rink (we hear that’s coming, anyways.) Even if we did decide to call this browser Mozilla, we’d still have to call the standalone mail client (see below) something else. We also believe Mozilla, in general, is going in the wrong direction in terms of bloat and UI, and see no reason for our releases to carry those connotations.

To come full circle, Firefox now includes a browser, a video chat client, and an integrated development environment, in addition to the indoor skating rink. Firefox was amazing because it was super fast, and it attributed much of that speed to being very focused on the browsing experience. Mozilla has seen fit to morph Firefox over the years into a platform, one that is ill suited to most users’ everyday needs.

The tools and features that Mozilla builds into Firefox are high quality, especially the developer tools, but they would benefit far more from being separate applications.


New Mac Essentials - 2014 Edition

November 24, 2014

It appears I’ll be getting a new Mac soon, which means it’s time to take inventory of what I need. I’ve written about this a couple of times before, and it’s interesting to look back and see what apps stick, and which have gone by the wayside.

Obviously, I’m still using OS X, and I’m still using most of the built-in apps like Safari, Mail, Calendar, Contacts, and so on. I still back up with Time Machine, although I’ve added a secondary, off-site backup with Backblaze.

Apps from 2008 I stopped using:

  • Adium - Work forced me to move to Lync, which is terrible, slow, and bloated, but at least it crashes every now and then.
  • Shimo - The built in VPN client in OS X is so good I don’t need Shimo anymore.
  • Google Calendar - I don’t use Google anything anymore.
  • MarsEdit - I’ve got a lot of respect and admiration for Daniel Jalkut over at Red Sweater, but I manage this blog with Jekyll, and do all of my writing in MacVim. I think at one point I just didn’t upgrade my license.
  • TextMate - I mentioned in the old post that I couldn’t get my configuration working for vi. Well, that’s no longer an issue, and my MacVim setup works just fine.
  • Yojimbo - I wish I could say that I still used Yojimbo, and that Bare Bones came out with an iPhone client that synced seamlessly using iCloud. But, that didn’t happen, and Evernote ate Yojimbo’s lunch. Even when I obsess about file storage and personal information organization, I eventually realize that Evernote is the best choice for what I want, which is to not think about what I’m storing where.

Apps from 2012 I stopped using:

  • NetNewsWire - I really, really wanted to believe that Black Pixel was going to do great things with NetNewsWire. I decided I didn’t want to wait anymore, and I’ve moved on.
  • DEVONthink - Call me when DEVONthink To Go 2.0 is out, and has sync that works.
  • Chrome - Used only when I absolutely, positively, have to use Flash.
  • Read Later - Beautiful app, but I moved on.
  • The Hit List - I finally drank the MacSparky Kool-Aid.

Now, what I am using.

Essential Utilities

  • Quicksilver - Still the best. I’ve tried switching over to Spotlight full time, but there are too many features from Quicksilver that I miss.
  • Dropbox - Until iCloud Drive makes sense.
  • Caffeine - So my Mac won’t fall asleep when I don’t want it to.
  • Moom - Keyboard window management, this is a fantastic tool.
  • TextExpander - For all your text expanding needs. I don’t use this nearly as much as I think I could, but I get enough out of it that I notice when it’s missing right away. Normally as soon as I go to name a file.

Library Applications

  • 1Password - I can’t imagine trying to keep over 400 strong, unique passwords without 1Password.
  • Evernote - Sometimes I go a little nuts on information management. Evernote sits and waits patiently for me to get my wits about me and come back.
  • Day One - I’m an infrequent journaler, but the more I use Day One, the more I value this app. Looking back at the past year of entries is a great way to put a smile on my face.
  • Oh, and, um, iTunes, I guess.

Tools for Creation

  • MacVim - Still the best tool for writing words and manipulating text.
  • Marked - Recently upgraded to v2, Marked continues to earn its spot in my Dock, right next to MacVim.
  • OmniGraffle - The best diagramming tool.
  • OmniFocus - For keeping all the big and little things I need to do, and helping me know when to actually do them.
  • Photoshop - I bought a copy of CS5 while in grad school, Photoshop is the only part that I’m probably going to keep using. Possibly Illustrator, depends on what the next year or so brings.

Spare Time

  • Hibari - Still my favorite Twitter app, although I’m not sure how much longer I’ll be able to keep using it.
  • ReadKit - ReadKit combined both Read Later and NetNewsWire, and although the reading experience is first rate, I miss the smooth keyboard navigation of NNW. However, ReadKit integrates with all the latest RSS services, and NetNewsWire does not. Like DEVONthink though, call me when you’ve got good sync to iOS.

In addition to the apps, I use a handful of command line tools from time to time, installed with Homebrew. I’m getting too old not to.

If my transition of apps that I use had a theme, it would be moving to systems where mobile is a priority. I’m no longer using Yojimbo or NetNewsWire, and to be honest, I’m a little disappointed in that fact.


AWS reInvent Conference

November 21, 2014

The re:Invent conference was fascinating.

Culture

First, the overall experience of the conference was fantastic. There were, of course, a few hiccups along the way; that’s going to happen anytime you put 13,000+ people in the same conference. But overall everyone I talked to was enthusiastic and positive about what Amazon is doing. The feel was energetic, and, importantly, diverse.

I met people from many different races and nationalities at the conference, and saw quite a few women attending as well. The men outnumbered the women, probably by 10:1 at least, but they were there, and that’s important. I’d like to see that number getting closer to 1:1 as the years go by.

The conference had a Japanese track, where popular sessions were given again, this time delivered in slightly slower English, with live translation provided for a primarily Japanese-speaking audience. The sessions were never very crowded, and I found that I could easily get a seat at the front. Of course, if I ever saw any Japanese attendees waiting for a seat I would have given mine up, but there were lots of empty seats.

One of the most poorly designed aspects of the conference was that the sessions weren’t staggered, so at the same time each hour, 13,000 people crowded the hallways and escalators, making it difficult to get from one session to the next.

Scale

The most interesting session I attended was held by James Hamilton, AWS VP and Distinguished Engineer (quite the title), titled “Innovation at Scale”. James Hamilton has been part of the tech industry for quite a while. He was formerly at Microsoft, and IBM before that, and was the lead architect on DB2 when it was ported to Unix. This guy knew what he was talking about.

This is the next decade in our industry.

I knew Amazon was big, but I didn’t realize just how big until I sat in on this session. Amazon S3 has grown by 132% in the past year, and EC2 by over 99%. AWS has over five times the capacity in use as the aggregate total of the other fourteen providers in the industry. That includes Azure, Rackspace, IBM, Joyent, and all the rest.

Every day, AWS adds enough new server capacity to support all of Amazon’s global infrastructure when it was a $7B annual revenue enterprise, back in 2004. They are big, and growing fast.

AWS is split into eleven regions world-wide, and private AWS fiber interconnects all major regions. Inside each region are very high levels of redundancy. For example, in the US East region, there are 82,864 fiber strands connecting their availability zones and data centers. In each region, there are at least two Availability Zones (AZ). Latency between the AZs is less than 2ms, and normally less than 1ms.

Peak network traffic between AZs reaches 25Tbps, and yes, that’s terabits per second. All AZs are in different data centers. Capacity in an AZ is added by adding new data centers. Failover between data centers within an AZ is transparent. Each AWS data center holds between 50,000 and 80,000 servers. Inbound bandwidth to a single data center is up to 102Tbps.

Another interesting fact I learned is that Intel builds custom chips just for Amazon’s custom-built servers. Because of the scale Amazon operates at, this partnership gets Amazon processors that are faster at a given core count than anything Intel sells to the public.

Amazon also builds all of their own networking equipment. They found that it was a lot cheaper to build their own networking gear. Not only did the overall cost of the networking gear go down, but the availability went up. It turns out that building their own gear allowed them to specialize in solving only their problems, which is a much smaller set of problems than the entire world’s that commercial networking vendors have to solve for. Amazon spun up 8,000 servers to test their networking gear before it went into production.

Amazon runs their own version of Linux, which originally started off life as a Red Hat clone, something like CentOS, but has subsequently been heavily modified for Amazon’s particular needs. For example, Amazon has built their own networking stack tuned for the volume of traffic they need to process.

A lot of the work on their hypervisor has been to eliminate the virtualization tax. In the latest systems they are rolling out, the NIC in each server supports SR-IOV (Single-Root I/O Virtualization), and each VM gets its own hardware-virtualized NIC. This results in much lower latency, and less latency jitter, for the instances.

Building the Future

I’m not sure how long Amazon is going to own this space, but they are certainly not slowing down and waiting for their competitors to catch up. I’m more in favor of a distributed Internet than one modeled after the old mainframe, client-server approach, but the capabilities that Amazon gives other businesses can’t be ignored.

My favorite analogy is the electric company. Any business that wanted to could build their own power generators and distribution infrastructure, but it would be crazy for them to do it. The economics just aren’t there. It’s far, far more affordable, and reliable, to let the specialists do it for you, and just pay for what you use. That’s what AWS is building, computing on tap, just bring your code.

The times are changing; either keep up, or get out of the way.


Open Source News Design

October 28, 2014

Finding good design in open source can be hard, but it’s almost impossible to find in open source news sites. These sites take “reader hostile” to a new level. Take example “A”, Phoronix:

The advertisement completely obstructs the text. Once the ad is closed, which I’m assuming counts as a “click”, the site is not too terrible to read. Of course, it’s no Daring Fireball.

The problem is compounded when using an aggregator site like Linux Today. Initially, it looks like a series of links:

But, as soon as you click on a link, another full screen, text obstructing ad appears.

OK, fine, close the ad, and see that you still have another link to click on to get to the article you want to read.

Now I’m wondering just how much I care about the Debian Civil War (spoiler: not much), but by this time, I’m invested, let’s read that article. Click the Complete Story link.

Nope! Close the ad, and, finally, find the text I was looking for.

Has it really come to this? Apparently.