New Mac Essentials - 2014 Edition

November 24, 2014

It appears I’ll be getting a new Mac soon, which means it’s time to take inventory of what I need. I’ve written about this a couple of times before, and it’s interesting to look back and see what apps stick, and which have gone by the wayside.

Obviously, I’m still using OS X, and I’m still using most of the built-in apps like Safari, Mail, Calendar, Contacts, and so on. I still back up with Time Machine, although I’ve added a secondary, off-site backup with Backblaze.

Apps from 2008 I stopped using:

  • Adium - Work forced me to move to Lync, which is terrible, slow, and bloated, but at least it crashes every now and then.
  • Shimo - The built-in VPN client in OS X is so good I don’t need Shimo anymore.
  • Google Calendar - I don’t use Google anything anymore.
  • MarsEdit - I’ve got a lot of respect and admiration for Daniel Jalkut over at Red Sweater, but I manage this blog with Jekyll, and do all of my writing in MacVim. I think at one point I just didn’t upgrade my license.
  • TextMate - I mentioned in the old post that I couldn’t get my configuration working for vi. Well, that’s no longer an issue, and my MacVim setup works just fine.
  • Yojimbo - I wish I could say that I still used Yojimbo, and that Bare Bones came out with an iPhone client that synced seamlessly using iCloud. But, that didn’t happen, and Evernote ate Yojimbo’s lunch. Even when I obsess about file storage and personal information organization, I eventually realize that Evernote is the best choice for what I want, which is to not think about what I’m storing where.

Apps from 2012 I stopped using:

  • NetNewsWire - I really, really wanted to believe that Black Pixel was going to do great things with NetNewsWire. I decided I didn’t want to wait anymore, and I’ve moved on.
  • DEVONthink - Call me when DEVONthink To Go 2.0 is out, and has sync that works.
  • Chrome - Used only when I absolutely, positively, have to use Flash.
  • Read Later - Beautiful app, but I moved on.
  • The Hit List - I finally drank the MacSparky Kool-Aid.

Now, what I am using.

Essential Utilities

  • Quicksilver - Still the best. I’ve tried switching over to Spotlight full time, but there are too many features from Quicksilver that I miss.
  • Dropbox - Until iCloud Drive makes sense.
  • Caffeine - So my Mac won’t fall asleep when I don’t want it to.
  • Moom - Keyboard window management; this is a fantastic tool.
  • TextExpander - For all your text expanding needs. I don’t use this nearly as much as I think I could, but I get enough out of it that I notice right away when it’s missing, normally as soon as I go to name a file.

Library Applications

  • 1Password - I can’t imagine trying to keep over 400 strong, unique passwords without 1Password.
  • Evernote - Sometimes I go a little nuts on information management. Evernote sits and waits patiently for me to get my wits about me and come back.
  • Day One - I’m an infrequent journaler, but the more I use Day One, the more I value this app. Looking back at the past year of entries is a great way to put a smile on my face.
  • Oh, and, um, iTunes, I guess.

Tools for Creation

  • MacVim - Still the best tool for writing words and manipulating text.
  • Marked - Recently upgraded to v2, Marked continues to earn its spot in my Dock, right next to MacVim.
  • OmniGraffle - The best diagramming tool.
  • OmniFocus - For keeping all the big and little things I need to do, and helping me know when to actually do them.
  • Photoshop - I bought a copy of CS5 while in grad school; Photoshop is the only part I’m likely to keep using. Possibly Illustrator too, depending on what the next year or so brings.

Spare Time

  • Hibari - Still my favorite Twitter app, although I’m not sure how much longer I’ll be able to keep using it.
  • ReadKit - ReadKit replaced both Read Later and NetNewsWire for me, and although the reading experience is first rate, I miss the smooth keyboard navigation of NNW. However, ReadKit integrates with all the latest RSS services, and NetNewsWire does not. Like DEVONthink though, call me when you’ve got good sync to iOS.

In addition to the apps, I use a handful of command line tools from time to time, installed with Homebrew. I’m getting too old not to.
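For anyone setting up a similar toolchain, the command line side amounts to a couple of brew invocations. A minimal sketch, with example formulae standing in for whatever tools you actually want:

    # Refresh Homebrew's formula list, then pull in a few command line tools.
    brew update
    brew install wget tree    # example formulae; substitute your own favorites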

If my transition between apps had a theme, it would be moving to systems where mobile is a priority. I’m no longer using Yojimbo or NetNewsWire, and to be honest, I’m a little disappointed by that fact.

AWS re:Invent Conference

November 21, 2014

The re:Invent conference was fascinating.

Culture

First, the overall experience of the conference was fantastic. There were, of course, a few hiccups along the way; that’s going to happen anytime you put 13,000+ people in the same conference. But overall, everyone I talked to was enthusiastic and positive about what Amazon is doing. The feel was energetic, and, importantly, diverse.

I met people of many different races and nationalities at the conference, and saw quite a few women attending as well. The men outnumbered the women, probably by at least 10:1, but the women were there, and that’s important. I’d like to see that ratio get closer to 1:1 as the years go by.

The conference had a Japanese track, where popular sessions were given again, this time spoken in English a bit more slowly, with live translation provided for a primarily Japanese-speaking audience. The sessions were never very crowded, and I found I could easily get a seat at the front. Of course, if I had ever seen any Japanese attendees waiting for a seat I would have given mine up, but there were lots of empty seats.

One of the most poorly designed aspects of the conference was that the sessions weren’t staggered, so at the same time each hour, 13,000 people crowded the hallways and escalators, making it difficult to get from one session to the next.

Scale

The most interesting session I attended was held by James Hamilton, AWS VP and Distinguished Engineer (quite the title), titled “Innovation at Scale”. James Hamilton has been part of the tech industry for quite a while. He was formerly at Microsoft, and IBM before that, and was the lead architect on DB2 when it was ported to Unix. This guy knew what he was talking about.

This is the next decade in our industry.

I knew Amazon was big, but I didn’t realize just how big until I sat in on this session. Amazon S3 has grown by 132% in the past year, and EC2 by over 99%. AWS has more than five times the capacity in use of the other fourteen providers in the industry combined. That includes Azure, Rackspace, IBM, Joyent, and all the rest.

Every day, AWS adds enough new server capacity to support all of Amazon’s global infrastructure when it was a $7B annual revenue enterprise, back in 2004. They are big, and growing fast.

AWS is split into eleven regions worldwide, and private AWS fiber interconnects all major regions. Inside each region there are very high levels of redundancy. For example, in the US East region, there are 82,864 fiber strands connecting their availability zones and data centers. Each region has at least two Availability Zones (AZs). Latency between the AZs is less than 2ms, and normally less than 1ms.

Peak network traffic between AZs reaches 25Tbps, and yes, that’s terabits per second. All AZs are in different data centers. Capacity in an AZ is added by adding new data centers. Failover between data centers within an AZ is transparent. Each AWS data center holds between 50,000 and 80,000 servers. Inbound bandwidth to a single data center is up to 102Tbps.

Another interesting fact I learned is that Intel builds custom chips just for Amazon’s custom-built servers. Because of the scale Amazon operates at, this partnership gets them processors that are faster at a given core count than anything available to the public.

Amazon also builds all of their own networking equipment. They found that it was a lot cheaper to build their own networking gear; not only did the overall cost of the gear go down, but the availability went up. It turns out that building their own gear allowed them to specialize in solving only their own problems, which is a much smaller set of problems than the entire world’s that commercial networking vendors have to solve for. Amazon spun up 8,000 servers to test their networking gear before it went into production.

Amazon runs their own version of Linux, which originally started off life as a Red Hat clone, something like CentOS, but has subsequently been heavily modified for Amazon’s particular needs. For example, Amazon has built their own networking stack tuned for the volume of traffic they need to process.

A lot of the work on their hypervisor has been to eliminate the virtualization tax. In the latest systems they are rolling out, the NIC in each server supports SR-IOV (Single-Root I/O Virtualization), and each VM gets its own hardware-virtualized NIC. This results in much lower latency, and less latency jitter, from the instances.

Building the Future

I’m not sure how long Amazon is going to own this space, but they are certainly not slowing down and waiting for their competitors to catch up. I’m more in favor of a distributed Internet than one modeled after the old mainframe, client-server approach, but the capabilities that Amazon gives other businesses can’t be ignored.

My favorite analogy is the electric company. Any business that wanted to could build their own power generators and distribution infrastructure, but it would be crazy for them to do it. The economics just aren’t there. It’s far, far more affordable, and reliable, to let the specialists do it for you, and just pay for what you use. That’s what AWS is building, computing on tap, just bring your code.

The times are changing; either keep up, or get out of the way.

Open Source News Design

October 28, 2014

Finding good design in open source can be hard, but it’s almost impossible to find in open source news sites. These sites take “reader hostile” to a new level. Take example “A”, Phoronix:

The advertisement completely obstructs the text. Once the ad is closed, which I’m assuming counts as a “click”, the site is not too terrible to read. Of course, it’s no Daring Fireball.

The problem is compounded when using an aggregator site like Linux Today. Initially, it looks like a series of links:

But, as soon as you click on a link, another full-screen, text-obstructing ad appears.

OK, fine, close the ad, and see that you still have another link to click on to get to the article you want to read.

Now I’m wondering just how much I care about the Debian Civil War (spoiler: not much), but by this time I’m invested, so let’s read that article. Click the Complete Story link.

Nope! Close the ad, and, finally, find the text I was looking for.

Has it really come to this? Apparently.

Sensible Information Organization

October 27, 2014

There is no one application or system that is right for managing all of your information. If there were, we wouldn’t need apps like Contacts or Calendar; those things would just be merged into the Finder, or whatever mythical computing system I found myself wishing for over the past couple of weeks. This is a good thing, even if not having a single view into all my data drives me a bit nuts sometimes. Specialization allows applications to provide a better experience for the specific type of data they were designed to handle.

I lamented on Twitter the other day that personal information management is still not a solved problem. It’s a hard problem, because everyone’s needs for what they wish to keep and access on a computing device, how much they care about format and privacy, and the interface they prefer for accessing their data are different. What I thought I wanted, and what I spent far too long looking for, was a single application that I could actually dump everything into. It doesn’t exist, and it shouldn’t.

The system I’ve come up with, the system that works for me, and probably only for me, is to corral the different types of data that I wish to keep in different applications, based on what they are, and if I need access to them while mobile.

Types of Data

The kind of data I frequently find myself wanting to keep falls into a few categories.

  • Notes - Small pieces of text, normally nothing more than a few lines. Like meeting minutes, lists of books to read, random ideas, etc…

  • Writing - Longer articles or blog posts. May include links, images, or other media.

  • Technical or Academic Reference - PDFs or Web Archives containing detailed technical information, gathered for later reference in writing and professional development.

  • Archived Reference - PDF scans of bills and statements. Needed at times for trend analysis (is our water bill going up?).

  • Disk Images - Install files for applications like Microsoft Office, or downloaded disk images for operating system installs. Rarely needed.

  • Screenshots or other images - Sometimes needed to explain or convey ideas, also collected for inspiration or to indulge a hobby (someday I’m going to find the perfect 1967 VW Beetle).

  • Scripts or Automator workflows - Home-built tools for automating reproducible Mac workflows.

  • Recipes - Everything from scanned PDFs of my wife’s great-grandmother’s notecards, to saved web pages.

  • Receipts - I need to be able to grab scans of these on the run quickly and easily. Good to have for later analysis of spending habits, and for tracking business expenses while traveling.

Requirements

Deciding on the right place for this data depends on defining the requirements up front.

  • Data must be stored in, or easily exported to, an open format.
  • Mobile data must be available and editable on all devices.
  • It should be fast and easy to get to and add to my data. The less friction in the workflow the better, but not at the expense of the previous two points.

Mapping Purpose to Application

Accessible Only on Local Device

  • Financial or Medical Data - Filesystem
  • Disk Images - Filesystem
  • Archived Reference Files - Filesystem
  • Technical or Academic Research - DEVONthink

Accessible On Mobile Devices

For things like the directions to my kids’ soccer games, dragging and dropping the PDF onto nvAlt will extract the text and create a new note. If need be, I can open the note in MacVim and clean up the formatting, then drop the original PDF in the filesystem under archived reference files.

Some data types can benefit from the organization of a database application. For that type of data, I’m leaning on the additional capabilities of DEVONthink to help me process the files and clippings I collect into new knowledge. DEVONthink’s AI engine helps me find connections to other entries that I might not have realized myself, and helps me to build a more solid understanding of the topic.

I think the same basic concept applies to recipes. I’m working on building a system that can take the basic ingredients as search terms and return a collection of recipes, as well as tags for things like “lunch”, “dinner”, and “family favorite”. For now, I’ll keep the recipes in Dropbox and index them in DEVONthink. Hopefully, I’ll soon be able to import them and sync them over to the mythical DEVONthink To Go 2.0.

This system is new to me, but I’ve been using each of the components on and off for years. While I do like to simplify my computing environment, doing so at the expense of a sensible system is foolish. My attempts to combine unlike files and data into the same system failed, but allowing each type of data to be handled by an application specifically designed for it looks promising. I’ll be sure to post updates as the system evolves.

Shellshocked Security Specialists

September 30, 2014

Between 2000 and 2003 I was part of a small group responsible for the security of the network on a remote military base. The work we did there was foundational for the rest of my career, at least so far. Once a week our team shut down for the afternoon to do training, where one of us was responsible for researching a topic in depth and then presenting it to the rest of the team. We built web servers, firewalls, and proxies with OpenBSD, managed an intrusion detection system that we designed and installed ourselves, and even built a honeypot to watch malicious traffic. We spent a lot of long nights and did a lot of hard work, but it paid off.

Part of the reason this was such a unique experience was the time. The Internet was small and slow, and the tools we had available were rudimentary and low level. Part of it was the place: we were part of the Navy, but we were remote and isolated, not under the constant watchful eye of senior brass. We were able to get away with more because there was no one around to tell us we couldn’t. But more than either of these, the main reason was Senior. Senior was the Senior Chief in charge of our team, and he was obsessed with security. He taught college-level courses for the University of Maryland, spoke at the annual Black Hat conference, and did security consulting on the side. I recently learned he’s also the author of several books. His obsession with security was matched only by his obsession with education, and because we were his team, we were the beneficiaries of both.

So, for years, night and day, we built servers, ran scans, performed penetration testing, and researched, researched, researched. We were all in front of the firehose.

It was fantastic.

I left the security field shortly after transferring out of the team. I simply couldn’t stand the level of incompetence among so-called “security specialists” in the general IT population. The field outside of our team was composed of people concerned with paperwork and constant one-upmanship. This kind of security specialist seemed more interested in telling you what you couldn’t do than in building things that let you do what you want. I’m a builder.

The recent hubbub over the so-called “shellshock” vulnerability in bash reminded me again of why I left the security field. I was invited to a live screencast with a chat room, put on by a big name in the security industry, so I logged on to have a few things clarified. I was unimpressed.

The vulnerability in bash is only remotely exploitable if your web application is calling out to bash, and specifically bash. The presenters came up with a few ideas when I asked them for clarification on the specifics, but they seemed contrived to me. I asked about Perl CGIs, and they said the application could be vulnerable if Perl called out to bash for any exec() or system() calls, but when I asked for specifics they didn’t have any. I doubt that would work, since Perl would need to pass along the malicious bash environment variables, but I’d be willing to be proven wrong.
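For reference, the check for the original CVE-2014-6271 bug that was circulating at the time relies on the same mechanism the remote attacks do: smuggling a function definition, plus trailing commands, into an environment variable and letting bash execute the extra commands as it imports the environment. A rough sketch:

    # Classic CVE-2014-6271 check: a vulnerable bash runs the code that
    # trails the function definition while importing the environment.
    env x='() { :;}; echo vulnerable' bash -c "echo this is a test"
    # Patched bash prints only "this is a test"; a vulnerable one prints
    # "vulnerable" first.

The CGI angle matters because the web server hands attacker-controlled request headers to the script as environment variables (HTTP_USER_AGENT and friends), so the whole question hinges on whether bash ever gets invoked downstream with that environment intact.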

Secondly, they said that, “similar to heartbleed”, some applications could precompile bash for inclusion instead of depending on the system bash. I asked if they could provide an example of an application that did that, but they could not. I have never heard of an application doing that, and if I found one, I would consider it incredibly poor development practice. That would be an application to avoid at all costs. I asked on Twitter if anyone else had heard of something like that, but received no response.

These kinds of statements are a problem because they are inaccurate and over-general in a field that depends on precise correctness. None of this changes the fact that the bug still needs to be patched, if only to cover the areas we don’t know about, but having a correct understanding of what the exposure of the bug is, how it can be exploited, and where there is an actual emergency speaks volumes about the reliability and trustworthiness of the vendor.

Someone can only cry wolf so many times. The heartbleed bug was worth staying up all night to fix. This, as far as I can tell, is not. That doesn’t stop the security firms and media outlets from getting excited about it though.

This kind of fear mongering doesn’t help anyone. What does help people is having the right information, so they can make intelligent, informed choices that are right for them. Senior would never have let the kind of flippant comments I heard, presented as fact by a trusted source, leave the meeting without proof. Making claims like that without proof not only makes you look incompetent, it reflects poorly on the entire industry. Are you in this to help protect people? Or just to make a buck?

Finally, there is this garbage of an article from ZDNet, “Apple issues incomplete OS X patch for Shellshock,” which claims that the patch issued by Apple did not address all of the listed CVE vulnerabilities, when, according to my testing, it does.

“Testing by ZDNet showed that while the patch fixed the issues outlined in the original CVE-2014-6271 report and CVE-2014-7169, OS X remains vulnerable to CVE-2014-7186.”

I ran the tests on my machine for both CVE-2014-7186 and CVE-2014-7187, and both came back as not vulnerable.
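If memory serves, the checks circulating at the time looked roughly like the following; the exact incantations varied between write-ups, so treat these as a sketch rather than the canonical tests:

    # CVE-2014-7186: a vulnerable bash crashes on a long run of here-documents.
    bash -c 'true <<EOF <<EOF <<EOF <<EOF <<EOF <<EOF <<EOF <<EOF <<EOF <<EOF <<EOF <<EOF <<EOF <<EOF' \
        || echo "CVE-2014-7186 vulnerable (redir_stack)"

    # CVE-2014-7187: a vulnerable bash trips an off-by-one parsing deeply nested loops.
    (for x in {1..200}; do echo "for x$x in ; do :"; done; for x in {1..200}; do echo done; done) \
        | bash || echo "CVE-2014-7187 vulnerable (word_lineno)"

No output from either command means the patched bash handled both cases cleanly.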

I don’t mean to diminish the importance of fixing the bug; I simply want to make sure that it is correctly and accurately understood. Tech journalists and security specialists alike benefit from sensational claims, which is a shame when there are so many more pressing problems in the world they could be working on.