Select A Column of Text in MacVim
I often need to work with columns of text: output from commands, text grabbed from a web page, what have you. Since I have a somewhat odd aversion to using a spreadsheet like a normal person, I discovered, nearly by accident, that I could easily select a column of text in MacVim.
To do so, simply position the cursor where you would like to start, then hold down Option while dragging over the text you'd like to select. Once the text is selected, you can delete it, yank it, or insert new text for every row selected.
For example, today I needed to comment out a few lines of text in a config file. Just for kicks I selected the first two characters of every row, pressed Shift-I (a capital "I"), typed the hash symbol, and when I pressed Escape every row of text I had selected was commented out.
That example is a bit contrived; what I mainly use this for is deleting columns to pare down the text I'm working with. Give it a try for yourself. I'm sure there's a way to do this without using the trackpad, but this is quick and easy enough for me to remember.
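As an aside, the trackpad-free route is Vim's built-in blockwise Visual mode, which ships with stock Vim and MacVim. A quick sketch of the same comment-out trick using only the keyboard:

```vim
" Blockwise (column) selection without the trackpad:
"   Ctrl-V     start a blockwise Visual selection at the cursor
"   j / k      extend the selected column down or up
"   d          delete the selected column, or y to yank it
"   I#<Esc>    insert '#' at the start of every selected row;
"              the insertion repeats on each line when you press Esc
```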
Goodbye to The Annual Review
John Siracusa hung up his cape today, announcing on his blog that he would no longer be reviewing OS X.
Nearly 15 years ago, I wrote my first review of Mac OS X for a nascent “PC enthusiast’s” website called Ars Technica. Nearly 15 years later, I wrote my last. Though Apple will presumably announce the next major version of OS X at WWDC this coming June, I won’t be reviewing it for Ars Technica or any other publication, including the website you’re reading now.
It’s a bittersweet moment for those of us who have been following John for over a decade, but it’s well deserved, and the volume of work that he’s left is a wonderful gift to the community.
John’s explanation of Spotlight in the OS X 10.4 review was fundamental in my understanding of OS X as not just another Unix system. OS X is something different, something more. I remember this part in particular blowing my mind:
Any file I/O that goes through the Tiger kernel will trigger the appropriate metadata importer. This kernel-level integration ensures that the Spotlight indexes are always up to date.
Read the whole thing. Actually, start at the beginning and read every review, from DP2 to Yosemite. His unapologetically deep dives into the details of OS X were something I, and a lot of other geeks on the Internet, looked forward to with each release.
Siracusa’s reviews are required reading for anyone wanting a better understanding of how and why their Mac works the way that it does. I’m looking forward to the published, hardcover book, if it ever comes.
ResearchKit and the GPL
Apple released ResearchKit as an open source project on GitHub today. The project is complete with pull requests, a wiki, and a few sample projects to get started. While the project is great in its own right, it was the context of this tweet by Daniel Jalkut that caught my eye:
ResearchKit will probably save and improve more lives thanks to being unencumbered by the GPL.
— Daniel Jalkut (@danielpunkass) Tue Apr 14 2015 12:36 PM CDT
Apple could have kept this to themselves, simply added ResearchKit along with AppKit and UIKit as another capability easily programmed into one of their platforms, but they didn’t. That’s not to say that Apple doesn’t have selfish motivations with ResearchKit. Putting the iPhone at the center of medical research at a time when the healthcare industry is just starting to feel the weight of the baby boomers is clearly a strategic move for Apple, but I’m not sure that’s entirely why they did it. Call me an idealist, but I think Apple created and released ResearchKit for the greater good. Which, finally, brings me to the license.
The license that they chose, the BSD license, permits the most “free” use of the code. If someone wants to port the code to Android, close it off, and sell it, they are perfectly free to do so, and Apple might even be ok with that. I think the spirit of what Daniel was saying is that, despite the ethical arguments on the side of the GPL, Apple is going to actually do more good for society with the combination of their devices and this liberally licensed open source code.
I’ve made similar arguments before. The GNU community far too often overlooks actual utility in pursuit of a utopian dream. But, trust is a complicated subject, and trust and control are at the heart of what the GNU project is fighting for. They believe that if you do not have access to the source code of an application or device, that opens you up to being manipulated, spied on, or otherwise harassed. The license is designed to prevent anyone from taking a GPL licensed code base and closing it to public access. They are correct, to a point. However, the same people who make that argument are more often than not consumers of free online services like Gmail and Google Docs.
The proponents of the GPL like to define different levels and explanations of “freedom”. Free as in beer, free as in speech, and so on. Apple released code today that is designed to make the world a better place, and left it on the table for anyone to do whatever they wish with it. Study it, change it, use it as the base for your next hit app, whatever you like. That is real freedom, all around.
The Invisible MacBook
A thread of minimalism weaves through Apple’s products, starting with the Bondi blue iMac and flowing to the Apple Watch. One could argue that the minimal thread weaves back to the original Macintosh, a single, all-in-one device that made computing accessible, but I think the theme is most visible when looking at the modern age of Apple. Jony Ive’s designs have consistently focused on aesthetically pleasing, usable design. A concept that simultaneously puts the device at the center of our day, and almost makes it disappear. Technology is best when it is nearly invisible. The Apple Watch may be the culmination of this invisible tech, but it’s the new MacBook that I believe embodies the design philosophy of Apple best. When the lid is closed on the new 12” Retina MacBook, it’s so small and light that you hardly know it’s there.
I spent a half hour or so with the new MacBook at our local Apple Store today. I walked away from the device with two conflicting feelings: 1. the space gray one is probably the most beautiful piece of computing hardware I’ve ever seen, and 2. I’ll most likely not buy one. Not yet anyway.
The Case
I have an allergic reaction to cables. At work I'm lucky enough to have a Thunderbolt Display that handles and hides most of the cables from me, so that I only ever need to plug two things into my MacBook Pro: power and the display. Thankfully, both are part of the same cable that comes from the display, but even this I'd rather not have. The new MacBook seems to really get me; it's simplified everything down to the absolute minimum of what's possible.
The retina screen is beautiful, and the case is just big enough to hold the keyboard. It’s astonishing how small this computer really is. It’s comparable to the 11” MacBook Air, but feels to be only a fraction of the bulk, and yet the MacBook retains the solid and sturdy construction we’ve come to expect in an Apple product.
The Trackpad
I was impressed with my first experience with the new Force Touch trackpad on the 13” MacBook Pro. At first I had a hard time believing that the glass was not actually moving, and like David Sparks I wondered whether I was actually using the new trackpad or not. The trackpad on the MacBook felt almost the same, but something about the first click felt off to me. That's not to say there was anything wrong with how it functioned, and the second force click worked perfectly, just as before, but the MacBook's trackpad lost some of the magic of the one in the MacBook Pro.
It could be that I was so surprised by how good the trackpad was in the Pro that I expected the MacBook's to be equally surprising. It could also be that the particular models I tested were suffering from "demo version" syndrome, but I've rarely come across anything other than the best on display in an Apple Store.
The Keyboard
Apple made a big production out of their new keyboard design. The new butterfly key mechanism provides a smoother, more consistent key press than the older scissor keys. As someone who types for a living, the keyboard is very important to me, and the feel of the keys makes a big difference. I've tried mechanical keyboards, ergonomic keyboards, and horrible PC keyboards, but I've always liked Apple's aluminum Bluetooth keyboards the best. They have the same feel as the notebook keyboards, with just the right balance of resistance and feedback. I know when I'm pressing the right key on a standard Apple keyboard.
I didn’t have the same experience with the MacBook keyboard. The keys are so close to the case that there is almost no travel when pressing down. The lack of travel makes it hard to tell if you’ve actually pressed the key, particularly when pressing one of the special keys like shift, option, or command. I typed out a few paragraphs in a few different apps on two display models, and repeatedly had issues not capitalizing a new sentence, or stopping to make sure I was hitting the right key.
It could be that the new keyboard will just take some getting used to. After all, the current line of keyboards was a change from the previous versions, and I think the current line is the best. If the MacBook keyboard stayed the same size but offered more tactile feedback when pressing a key, Apple would have a winner. I didn't have any issues with the arrow keys (I never use them), but I did notice that the esc key was too close to the top left edge. Vim users might find that annoying; most everyone else probably won't.
Compromise
The MacBook is certainly a compromised machine, but the deals it makes are for the better. Smaller, lighter, more invisible. Stashed away when you don't need it, there for you immediately when you do, taking up so little space that you barely notice it. The MacBook might not be right for me now, but I'm going to give version two a very close look.
The New MacBook
The tech world is once again losing its grip after Apple has gone, as they see it, too far, too fast with the new MacBook. They can think of a thousand reasons why the Mac's single USB-C port is a deal-breaker for any sane person. The single port is too restrictive. What if I want to hook up a USB mouse while I'm charging? Why isn't there a removable battery? Why can't I expand the storage? Less space than a Nomad. Lame.
The thing to keep in mind about the new Mac is the lack of a “Pro” moniker at the end of the name. The distinction between the MacBook and the MacBook Pro is the difference between a person who needs to hook up their Mac to two or three displays and one who needs a computer to write emails and read Buzzfeed. It’s the difference between someone who takes the time to research and understand exactly what “USB-C” means, and someone who chooses to see the machine as a tool that allows them to do things that have nothing to do with computers.
It would be easy to assume that this is not a computer for geeks.
But it can be. Perhaps the more accurate description is that this computer is not for gadget geeks. Leonardo da Vinci is quoted as saying "Simplicity is the ultimate sophistication." Apple pursues both simplicity and sophistication in all their products, pushing the public and the industry to abandon awkward technologies before they are comfortable. The floppy disk, the CD-ROM drive, and now all ports but one. Apple understands that technology is at its best when it is virtually invisible, when it can integrate deeply and easily into our everyday lives without asking us to make accommodations for it. I'm a Unix geek, and I can tell you that as long as this Mac runs OS X, it can absolutely be a computer for geeks.
It can also be a computer for postage stamp collectors, and bird watchers, and students, and people into following celebrities, and sports fans, home brewers, dog lovers, cat lovers, athletes… in general, people. This may not be the right machine for you if you want to connect it to your USB keyboard, hard drive, printer, and scanner all at once. Not that that’s wrong in any way, it’s just that this particular Mac is not the right tool for the job. Simply because the MacBook isn’t the right tool for this job though is not going to stop it from selling in the millions.
There are a few reasons you might not want to buy this generation of MacBook though. It is less powerful and more expensive than a MacBook Air, although that is offset by the Retina display. The new trackpad has no moving parts, and I'm a bit skeptical of how it will feel and perform over long periods of use, at least in the first generation. Subsequent generations of this machine, in this form factor, are going to get better, and most likely less expensive. I expect that in a few years this Mac will start at $999.
Apple is moving in new directions, so I’m glad they are still doing interesting things with the Mac. The new MacBook is a beautiful piece of hardware, and an impressive technical accomplishment. On those merits, I expect it will do well.
The Long View
Computers as tools for creation are unique in that they change and evolve over time as software is updated. A hammer that you buy today can reasonably be expected to perform the same in twenty years, assuming that the tool is taken care of properly. Similarly, the bench that you build with the hammer will still be good to sit on, no matter what happens to the hammer that built it. Not so with computers and software. Not only do the tools used to create change over time and perform in sometimes unexpected ways, the artifacts of our creation are often subject to artificial limitation on use. How ridiculous would it be for a bench to only be able to be sat in if you were holding the hammer you used to build it? And yet, this is the arrangement we agree to with our software more often than not.
Creating documents with Word or Pages, storing precious family photographs in iPhoto, or locking your research away in OneNote or Evernote are all examples of a short-sighted view of technology. They work well for the near term, but taking the long view of technology requires a consideration of the nature of the format your data is stored in.
How much one should care about the format their data is stored in, and their access to that data, is in direct proportion to how much one cares about the data itself. Proprietary file formats come with a built-in expiration date; unlike a gallon of milk though, you don't get to know when your software's expiration date is. Sooner or later, whoever controls that format will update it, and eventually leave your important data irretrievable. Are you sure you'll be able to open that Pages document in twenty years? You may not be able to open a Pages document from three years ago.
Some of this argument may sound familiar to the open source community. Control and longevity of personal computing systems is at the heart of much of what open source stands for. What open source enthusiasts often miss is that the method of manipulating the data matters far less in the long run than ensuring that your data remains in a format that makes the best effort to be accessible in twenty, thirty, or a hundred years from now. That means not locking your important data in proprietary formats that may go by the wayside, but it also means using the best tools for the job at hand.
Using commercial software is perfectly acceptable as long as the tools either offer export of your data to an open format, or work directly with the open format[1]. For writing, I prefer plain text, HTML, and PDF. Occasionally, I may revert to LaTeX if necessary to create a complicated document intended to be printed, but the need for that is less and less as time goes on. My files are organized in a filesystem, using logical file names and a simple folder hierarchy.
No company is guaranteed to be around forever. Nor are they guaranteed to always keep your best interests at heart. However, as long as there is a way to easily export your data to an open format, it makes sense to use the best tools available. In my opinion, the best computer on the market is a MacBook Pro, and the best operating system is OS X. OS X offers the best combination of usability, aesthetics, and power of any system currently available.
Much of my opinion on this matter comes from my own experience. Much of it was influenced by the writings of Dr. Drang and his series on text files, as well as Seth Brown and David Sparks. If you'd like to read more from people who have been bitten by proprietary formats and/or poor organizational methods, here's a weekend's reading list.
Dr. Drang
- Text files and me - Part 1
- Text files and me - Part 2
- Text files and me - Part 3
- Text files and me - Part 3.5
- Text files and me IV
- Text files and me V
Seth Brown
David Sparks
The new and shiny always looks wonderful when new, and shiny, but you don’t really get your value from investment until the shine wears off.
[1] BBEdit is the best example of an outstanding commercial product that works directly with an open format: plain text.
The Best of What's Around
Marco struck a nerve with his latest post lamenting the declining quality of Apple software. The post was picked up by “analysts” and debated on television by a panel of “experts”. While I understand the frustrations of those affected by more serious bugs than I’ve seen, I can’t help but wonder if they really understand what the alternatives are like.
This whole débâcle reminded me of 2006, when Mark Pilgrim, Cory Doctorow, and Tim Bray left the Mac for Linux. Some of the same reasons were cited then, although for these three I think the openness of Linux meant more than software quality. But, as Daniel Jalkut pointed out, Apple's software has always had bugs, and people have always been upset about it. It's easy to look at the past through rose-colored glasses, but the truth is that there were some pretty terrible releases in the past.
We do seem to be at a low point in the ebb and flow of Cupertino software. I use Mail, Safari, Calendar, and iTunes daily, and from time to time I see bugs here and there. My personal pet peeve is that the Dock no longer automatically minimizes when I move a window to full screen mode. Instead, it just obstructs the bottom part of the window. An annoyance, to be sure, but it’s not the end of the world.
Overall, Yosemite has been a fantastic release for me. I enjoy the new aesthetic, although I think the transparency could be toned down a bit, and the apps I use daily work great. I haven't had the kind of problem that forces a reboot once a week, nor have I seen iTunes crash once a day. I'm not saying it doesn't happen, clearly it does, and I'm just one data point, but I just don't see it.
Apple’s apps aren’t what make the platform great anyway. What makes the Apple ecosystem a great place to work and play is the abundance of very high quality third party apps. OmniGraffle, DEVONthink, Day One, Soulver, Quicksilver, TextExpander, Hazel, the list goes on. It was Apple’s platform and, more importantly, their taste, which encouraged the developers to build their absolute best. No other platform has this. I know, I’ve looked.
Linux is a mess from top to bottom. Windows is a hollow corporate shell trying to be relevant again. FreeBSD? Not a chance. Yes, it's important to point out the flaws in the Mac and iOS platforms, and yes, it's good to remind Apple to stay on the path of light, but before we all decide to download the latest Ubuntu ISO, let's also take a moment to appreciate how fantastic these machines and their software really are. In comparison, nothing else even comes close.
PS. Apple, fix your bugs. Seriously, some of this is just embarrassing.
Adopting BBEdit Scripts for Vim
In addition to my experiments with the design of this site, I was also testing out BBEdit as my main writing and programming tool. BBEdit didn’t stick, but I did like some of the scripting the good Dr. Drang has done, and wanted to adopt a few for MacVim. I started with three of his scripts today, one to paste and select text in one command, one to convert a tab-separated table to Markdown, and another to even up the Markdown table so it’s easier to read in plain text.
Since Dr. Drang's scripts read from stdin and write to stdout[1], converting them to Vim was very easy once I found the right syntax for my vimrc file[2]. My first thought was that I would be able to copy the same syntax I use for calling the outstanding formd by Seth Brown, but formd is meant to parse the entire text of the file, not just the selection. Eventually, I found my answer on Stack Overflow.
My vimrc file now has the following lines:
" Even up a markdown table
vmap <leader>mn <esc>:'<,'>!~/Unix/bin/Normalize-Tables.py<CR>

" Convert a tab-separated table to a markdown table
vmap <leader>mt <esc>:'<,'>!~/Unix/bin/Tabs-to-Markdown-Tables.pl<CR>
The first word, vmap, maps the shortcut to visual selections in Vim. Next, <leader>mn creates the shortcuts ,mn for Normalize-Tables.py and ,mt for Tabs-to-Markdown-Tables.pl. The next part, <esc>:'<,'>, grabs the selection and passes it to the command, which starts with an exclamation point[3] and ends with <CR>, which stands for "Carriage Return".
I need to spend some time in my vimrc file to sort out the naming convention for all the key maps, but for now, I’m thinking “,mn” for “markdown normalize”, and “,mt” for “markdown from tabs”.
For the third part I'm borrowing from Dr. Drang: I wanted to paste and select text at the same time. Once again I had to turn to Stack Overflow, and now have this mapped in vimrc:
nnoremap <leader>sp :set paste<CR>:put *<CR>:set nopaste<CR> <Bar> `[v`]
The first part sets the mapping, ,sp, which I'm thinking of as "select paste", and then pastes the text from the OS X system clipboard. Next, the <Bar> entry strings two mappings together in Vim. Finally, `[v`] performs the selection on the last change to the text.
So, now I can take text from Excel like this:
Left align    Center align    Right align
This          This            This
column        column          column
will          will            will
be            be              be
left          center          right
aligned       aligned         aligned
paste and select it with ,sp, followed by ,mt to convert the table to Markdown,
|Left align|Center align|Right align|
|--|--|--|
|This|This|This|
|column|column|column|
|will|will|will|
|be|be|be|
|left|center|right|
|aligned|aligned|aligned|
and finally ,mn to even the table up nicely:
| Left align | Center align | Right align |
|:-----------|:-------------|:------------|
| This       | This         | This        |
| column     | column       | column      |
| will       | will         | will        |
| be         | be           | be          |
| left       | center       | right       |
| aligned    | aligned      | aligned     |
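Dr. Drang's actual scripts are linked from his posts; purely as an illustration of what a normalizer like Normalize-Tables.py has to do, here is a simplified sketch. The function name is mine, and unlike the real script it left-aligns every column rather than honoring the alignment colons in the separator row:

```python
def normalize_table(text):
    """Pad each cell so the pipes in a Markdown table line up."""
    # Split into rows, strip the outer pipes, and trim each cell.
    rows = [[cell.strip() for cell in line.strip().strip('|').split('|')]
            for line in text.strip().splitlines()]
    # The widest cell in each column sets that column's width.
    widths = [max(len(row[i]) for row in rows) for i in range(len(rows[0]))]
    out = []
    for row in rows:
        if set(row[0]) <= set(':-'):
            # Separator row: rebuild it as left-aligned rules.
            out.append('|' + '|'.join(':' + '-' * (w + 1) for w in widths) + '|')
        else:
            # Content row: left-justify each cell to its column width.
            out.append('|' + '|'.join(' %-*s ' % (w, c)
                                      for w, c in zip(widths, row)) + '|')
    return '\n'.join(out)
```

Feeding it the cramped table above produces output padded like the evened-up version.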
As always, my thanks to the good Doctor for scripts and inspiration.
All Mine
I've been experimenting with the design of this site for the past couple of weeks. First, I used a default Jekyll template, slightly modified to my liking. Next, I tried out a very nice theme that made good use of hero images and included nice typography. I changed the name of the site to "INTERACT", and briefly considered leaving it at that. Unfortunately, the more I looked at the site the more it looked like it belonged to someone else.
Something reminded me of Merlin Mann and John Gruber’s South By Southwest panel where John said he wanted to own every pixel of his site, from the top left to the bottom right. I agree. This little part of the Internet has my name on it, it’s jonathanbuys.com, and I want to be able to point at this site and say “I built this”.
It’s not much, I’m not a web designer. I’ve tried to optimize the site to make it easy to read, and easy to port between hosting companies. I’m back to using my own Python script for generating the site, so there might be a few inconsistencies or characters that are missing or not rendered correctly. I’m going to work on that.
In the meantime, once again, I own every pixel of this site. From the top left, to the bottom right.
Cellular Options
I pulled into the gas station on my way home after a long day, picked up my phone in my left hand, intending to put it in my pocket, and opened the door of my pickup. While pushing the door of the truck open, the phone slipped out of my hand and fell face down on the pavement, shattering the screen.
I’m not the type who can carry around a broken phone for months, so the next day I called US Cellular to see what my options were. I was one year into a two year contract, so the prospects of actually getting anything out of my carrier were grim. I was right. The support person on the phone said that since I didn’t opt into the US Cellular insurance program, I either have to look into having the phone fixed, or buy a new phone at full price. Turns out, there are several options.
Buying a new phone at full price is the most expensive option up front, and the one where I received conflicting information from the support people. Since the only iPhone 5C carried by US Cellular is the 8GB model (which should be an embarrassment for them and Apple), I was looking to buy a 5S, or possibly a 6. The 5S retails for $549, and the 6 for $650, but when considering the benefits of paying for the phone up front I became confused. The phone is not of much use without the cellular contract to go with it, and US Cellular charges a $40 "connection fee" for each line attached to my $70 per month plan. My question to US Cellular was: what is the benefit of buying my phone up front if I still have to pay the connection fee?
Faced with a $650 charge for a new phone, I pulled the ace from my sleeve and said the magic words: “I think I might cancel my account.” The cancellation fee is $350, which some carriers might pay for you, which would bring my out of pocket costs down to $200, or maybe $100 to get into a new phone. Albeit with a new carrier and a complicated situation with the rest of my family still on the old plan. I didn’t think it was a serious option, but mentioning the cancellation was enough to get me pushed through to another level of support, who was happy to move my upgrade eligibility up an entire year to keep me with the company.
So, now my options were a little better. I could buy a new phone at full price, buy a new phone at the discounted rate, I could finance the full price of the phone over two years (which is a new option I wasn’t aware of before), or I could keep my phone and fix the screen.
The question remained though, why would I ever buy a new phone at full price? The source of my confusion centered around the $40 connection fee. One representative told me that the fee was there so the company could recoup the cost of the subsidized phone. If so, they are making a healthy profit off of each phone sold; $40 per month over two years comes out to $960. Another representative told me that no matter what, if I owned my phone or if I bought it at a discounted price, I had to pay the connection fee. The only way not to pay the fee was to finance the phone.
If I financed the phone for two years they would waive the fee, but in its place I would be paying $32 per month on an interest-free twenty-month loan. Also, financing the phone would make me eligible to upgrade to a new phone every eighteen months, if I was willing to pay off the remainder of the loan, or trade in the phone for a new model. This seemed interesting, but it also seemed like a way for me to remain in debt of one kind or another continuously. Even though the total paid out seemed to make more sense over time, financing did not appeal to me.
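For what it's worth, here is the arithmetic behind those two options as the representatives described them. The variable names are mine, and note the two totals cover different things: the fee route also requires a discounted up-front payment that the representatives never quoted me.

```python
# Contract route: the $40/month connection fee runs for the full
# two-year agreement, on top of the discounted up-front phone price.
fee_route_monthly_total = 40 * 24      # $960 over the contract

# Financing route: the fee is waived and replaced by $32/month on the
# interest-free twenty-month loan, which pays off the phone itself.
financing_route_total = 32 * 20        # $640 over the loan

print(fee_route_monthly_total, financing_route_total)
```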
So, I decided, grudgingly, to buy a new phone at the discounted rate. I found the nearest US Cellular store, and went in to buy an iPhone 5S. While I was talking to the representative I asked her again about the connection fee. This was at least the third time I’d talked to someone at US Cellular about the fee, and this time the answer was different. If you own your phone, and you are not under contract, there is no connection fee. This changed things for me. If there was no fee for my phone that I had now, the best course of action I could take would be to fix the phone. I told her about my change in plans, and the conflicting information I’d gathered. To make her point she removed the fee from my account while I was sitting there.
I left the store without a new phone, went back to my desk, and ordered a $35 replacement screen and toolkit from Amazon. In two days the screen arrived, and I spent an hour and a half replacing it. Now my white iPhone 5C has a white face on it, and a black button, giving the phone some character. Also, I’m not under contract with US Cellular. I think for my next phone, whenever that happens to be, I’ll be paying full price for it, and staying out of contract lock-in from here on out.
AWS re:Invent Conference
The re:Invent conference was fascinating.
Culture
First, the overall experience of the conference was fantastic. There were, of course, a few hiccups along the way; that's going to happen anytime you put 13,000+ people in the same venue. But overall everyone I talked to was enthusiastic and positive about what Amazon is doing. The feel was energetic, and, importantly, diverse.
I met people from many different races and nationalities at the conference, and saw quite a few women attending as well. The men outnumbered the women, probably by 10:1 at least, but they were there, and that’s important. I’d like to see that number getting closer to 1:1 as the years go by.
The conference had a Japanese track, where popular sessions were given again, this time spoken in English a bit more slowly, with live translation provided for a primarily Japanese-speaking audience. These sessions were never very crowded, and I found that I could easily get a seat at the front. Of course, if I had ever seen any Japanese attendees waiting for a seat I would have given mine up, but there were lots of empty seats.
One of the most poorly designed aspects of the conference was that the sessions weren't staggered, so at the same time each hour 13,000 people crowded the hallways and escalators, making it difficult to get from one session to the next.
Scale
The most interesting session I attended was given by James Hamilton, AWS VP and Distinguished Engineer (quite the title), titled "Innovation at Scale". James Hamilton has been part of the tech industry for quite a while; he was formerly at Microsoft, and IBM before that, and was the lead architect on DB2 when it was ported to Unix. This guy knows what he's talking about.
This is the next decade in our industry.
I knew Amazon was big, but I didn't realize just how big until I sat in on this session. Amazon S3 has grown by 132% in the past year, and EC2 by over 99%. AWS has over five times the capacity in use as the aggregate total of the other fourteen providers in the industry. That includes Azure, Rackspace, IBM, Joyent, and all the rest.
Every day, AWS adds enough new server capacity to support all of Amazon’s global infrastructure when it was a $7B annual revenue enterprise, back in 2004. They are big, and growing fast.
AWS is split into eleven regions world-wide, and private AWS fiber interconnects all major regions. Inside each region are very high levels of redundancy. For example, in the US East region, there are 82,864 fiber strands connecting their availability zones and data centers. In each region, there are at least two Availability Zones (AZ). Latency between the AZs is less than 2ms, and normally less than 1ms.
Peak network traffic between AZs reaches 25Tbps, and yes, that’s terabits per second. All AZs are in different data centers. Capacity in an AZ is added by adding new data centers. Failover between data centers within an AZ is transparent. Each AWS data center holds between 50,000 and 80,000 servers. Inbound bandwidth to a single data center is up to 102Tbps.
Another interesting fact I learned is that Intel builds custom chips just for Amazon’s custom-built servers. Because of the scale Amazon operates at, this partnership gets them processors that are faster at a given core count than anything available to the public.
Amazon also builds all of their own networking equipment. They found that it was a lot cheaper to build their own networking gear. Not only did the overall cost of the networking gear go down, but the availability went up. It turns out that building their own gear allowed them to specialize in solving only their own problems, a much smaller set than the entire world’s problems that commercial networking vendors have to solve for. Amazon spun up 8,000 servers to test their networking gear before it went into production.
Amazon runs their own version of Linux, which originally started off life as a Red Hat clone, something like CentOS, but has subsequently been heavily modified for Amazon’s particular needs. For example, Amazon has built their own networking stack tuned for the volume of traffic they need to process.
A lot of the work on their hypervisor has been to eliminate the virtualization tax. In the latest system they are rolling out, the NIC in each server supports SR-IOV (Single-Root I/O Virtualization), and each VM gets its own hardware-virtualized NIC. This results in much lower latency, and less latency jitter, from the instances.
Building the Future
I’m not sure how long Amazon is going to own this space, but they are certainly not slowing down and waiting for their competitors to catch up. I’m more in favor of a distributed Internet than one modeled after the old mainframe, client-server approach, but the capabilities that Amazon gives other businesses can’t be ignored.
My favorite analogy is the electric company. Any business that wanted to could build their own power generators and distribution infrastructure, but it would be crazy for them to do it. The economics just aren’t there. It’s far, far more affordable, and reliable, to let the specialists do it for you, and just pay for what you use. That’s what AWS is building, computing on tap, just bring your code.
The times are changing; either keep up, or get out of the way.
Open Source News Design
Finding good design in open source can be hard, but it’s almost impossible to find in open source news sites. These sites take “reader hostile” to a new level. Take example “A”, Phoronix:
The advertisement completely obstructs the text. Once the ad is closed, which I’m assuming counts as a “click”, the site is not too terrible to read. Of course, it’s no Daring Fireball.
The problem is compounded when using an aggregator site like Linux Today. Initially, it looks like a series of links:
But, as soon as you click on a link, another full screen, text obstructing ad appears.
OK, fine, close the ad, and see that you still have another link to click on to get to the article you want to read.
Now I’m wondering just how much I care about the Debian Civil War (spoiler: not much), but by this time, I’m invested, let’s read that article. Click the Complete Story link.
Nope! Close the ad, and, finally, find the text I was looking for.
Has it really come to this? Apparently.
Sensible Information Organization
There is no one application or system that is right for managing all of your information. If there were, we wouldn’t need apps like Contacts or Calendar, those things would just be merged into the Finder, or whatever mythical computing system I found myself wishing for the past couple of weeks. This is a good thing, even if not having a single view into all my data drives me a bit nuts sometimes. Specialization allows applications to provide a better experience for the specific type of data they were designed to handle.
I lamented on Twitter the other day that personal information management is still not a solved problem. It’s a hard problem, because everyone’s needs for what they wish to keep and access on a computing device, how much they care about format and privacy, and the interface they prefer for accessing their data are different. What I thought I wanted, and what I spent far too long looking for, was a single application that I could actually dump everything into. It doesn’t exist, and it shouldn’t.
The system I’ve come up with, the system that works for me, and probably only for me, is to corral the different types of data that I wish to keep in different applications, based on what they are and whether I need access to them while mobile.
Types of Data
The kind of data I frequently find myself wanting to keep falls into a few categories.
- Notes - Small pieces of text, normally nothing more than a few lines. Like meeting minutes, lists of books to read, random ideas, etc…
- Writing - Longer articles or blog posts. May include links, images, or other media.
- Technical or Academic Reference - PDFs or web archives containing detailed technical information gathered for later reference when writing and for professional development.
- Archived Reference - PDF scans of bills and statements. Needed at times for trend analysis (is our water bill going up?).
- Disk Images - Install files for applications like Microsoft Office, or downloaded disk images for operating system installs. Rarely needed.
- Screenshots or other images - Sometimes needed to explain or convey ideas. Also collected for inspiration or to indulge a hobby (someday I’m going to find the perfect 1967 VW Beetle).
- Scripts or Automator workflows - Home-built tools for automating reproducible Mac workflows.
- Recipes - Everything from scanned PDFs of my wife’s great-grandmother’s notecards to saved web pages.
- Receipts - I need to be able to grab scans of these quickly and easily while on the run. Good to have for later analysis of spending habits, and for tracking business expenses while traveling.
Requirements
Deciding on the right place for this data depends on defining the requirements up front.
- Data must be stored in, or easily exported to, an open format.
- Mobile data must be available and editable on all devices.
- It should be fast and easy to get to and add to my data. The less friction in the workflow the better, but not at the expense of the previous two points.
Mapping Purpose to Application
Accessible Only on Local Device
- Financial or Medical Data - Filesystem
- Disk Images - Filesystem
- Archived Reference Files - Filesystem
- Technical or Academic Research - DEVONthink
Accessible On Mobile Devices
- Notes - nvALT + Simplenote
- Recipes - Dropbox - Until DTTG 2.0, then we will see.
- Receipts - Dropbox + PDFpen Scan Plus + Hazel
- Screenshots and other images - Ember
- Writing - Dropbox + MacVim + Nebulous Notes
For things like the directions to my kids’ soccer games, dragging and dropping the PDF onto nvALT will extract the text and create a new note. If need be, I can open the note in MacVim and clean up the formatting, then drop the original PDF into the filesystem under archived reference files.
Some data types can benefit from the organization of a database application. For that type of data, I’m leaning on the additional capabilities of DEVONthink to help me process the files and clippings I collect into new knowledge. DEVONthink’s AI engine helps me find connections to other entries that I might not have realized myself, and helps me to build a more solid understanding of the topic.
I think the same basic concept applies to recipes. I’m working on building a system that can take the basic ingredients as search terms and return a collection of recipes, as well as tags for things like “lunch”, “dinner”, and “family favorite”. For now, I’ll keep the recipes in Dropbox and index them in DEVONthink. Hopefully, I’ll soon be able to import them and sync them over to the mythical DEVONthink To Go 2.0.
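Until that system exists, a crude version of the ingredient search is easy to sketch in Python. This is my own illustration, not anything DEVONthink does; the file layout (plain-text recipes in one folder, with a `Tags:` line) is invented for the example:

```python
from pathlib import Path

def find_recipes(recipe_dir: Path, *ingredients: str) -> list[str]:
    """Return the names of recipe text files that mention every ingredient.

    Tags like "lunch" or "family favorite" work as search terms too,
    since the match is just a case-insensitive substring check.
    """
    matches = []
    for recipe in sorted(recipe_dir.glob('*.txt')):
        text = recipe.read_text().lower()
        if all(ing.lower() in text for ing in ingredients):
            matches.append(recipe.stem)
    return matches
```

A real version would want proper ingredient parsing rather than substring matching ("onion" matches "onions", but also "onion powder"), which is exactly the kind of connection-finding I'm hoping DEVONthink's engine handles for me.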
This system is new to me, but each of the components I’ve been using on and off for years. While I do like to simplify my computing environment, doing so at the expense of a sensible system is foolish. My attempts to combine unlike files and data into the same system failed, but allowing each type of data to be manipulated by an application specifically designed to handle it looks promising. I’ll be sure to post updates as the system evolves.
A Technical Education - The Operating System
It’s good to think of a computer as something like a cake with several layers. If the hardware is the first, and foundational layer, then the operating system is the second, and applications are the third. Today, we are going to look at that second layer, and leave with a basic understanding of what an operating system is, what it does, and what the differences are between the major operating systems available today.
Chances are, you’ve already heard of a few operating systems, even if you didn’t know what the differences meant. Microsoft’s Windows is the most well known and popular operating system, powering the majority of desktop and laptop computers sold for many years now. Apple has two operating systems, OS X for desktop and notebook computers, and iOS for iPhones and iPads. Likewise, Google has two operating systems, Android and Chrome OS, both based on the open source (more on that term at a later date) Linux kernel.
An operating system (OS) has two major functions. The first is to provide control of, and access to, the hardware of the computer. Without an operating system, when a computer powers on it will simply sit there with nothing to do. Computers have a small built-in OS, conveniently called the BIOS, which provides access to some very basic settings, but its main job is to find the disk that contains the real OS and start it, or “boot it up”.
When the BIOS finds the operating system, the first thing it does is load the kernel into memory. The kernel is the core of the operating system, the control point that manages access to all other resources of the machine. If there is a problem in the kernel, the entire computer crashes, which, thankfully, doesn’t happen all that often anymore. Next, a series of other programs are started that manage various functions of the computer. Eventually, the OS finishes starting all the programs that it needs to provide all the background services we rely on, and it is ready for the user.
The second function of an operating system is to provide a framework for third-party applications to build on. The OS provides hooks and libraries, reusable blocks of code made available to other applications, to give the OS a unified and cohesive look and feel no matter which application is in use. For example, if an application wants to draw a window on the screen, the developer doesn’t have to write all the code required to draw the window herself; she simply has to make the right calls to the OS and have the system draw the window for her. There are thousands of these programming interfaces available on each OS, but each OS does things differently. Windows not only looks different from OS X, it feels different too; that’s because the way that it functions, and the programming interfaces that it presents to third-party developers, are very, very different internally from OS X.
This is why applications made for one operating system are often not available for another, and if they are, sometimes they look and feel a bit out of place. This is also why you can’t simply move an application from one OS to another and expect it to work. There are other, more technical explanations that have to do with compilers and runtime environments and interrupts, but for now, the important thing to understand is that applications are tied fairly deeply into the OS they were built to support.
One last item that is important to understand is the concept of device drivers. Let’s say you have a printer, and that printer comes with a USB cable that you plug into your computer. Your computer needs to know how to talk to that printer in a way it understands, so when the paper is printed, it looks the way it should. Operating systems use a special program known as a device driver that is normally provided by the manufacturer of the printer (or whatever other device you want to connect to) and developed specifically for your operating system. Device drivers have special privileges that allow them to talk to the hardware, and allow other programs or applications to talk to the driver.
Modern operating systems include drivers for thousands of different devices, and a mechanism that allows them to automatically download new drivers when they come across a device they don’t currently have. However, there may still be times when the operating system doesn’t know what to do with the device that’s plugged into it, and you may have to load the driver yourself. Normally, the manufacturer includes an installer along with the device, but unless the device is brand new, the bundled driver may be out of date. Third-party manufacturers don’t move at the same speed as operating system development, so the drivers are often behind. Since device drivers operate at a low level in the operating system, one that is out of date or faulty in some way can be a source of system instability. If the device you are loading a driver for is more than a year old, or if the supported operating system stated on the box is older than the one you are using, the best thing to do is visit the manufacturer’s web site and download an up-to-date driver. Remember, though, that this is only necessary if the operating system doesn’t do it for you automatically.
So, the basics of an operating system: it manages access to the hardware, provides a framework for third-party development, and manages the drivers needed for extending the computer with third-party devices. This just barely skims the surface of what an operating system is and can do; if you’d like to know more, the Wikipedia article is a great next step.
A Technical Education - Base Level Hardware
The first and most important thing to remember when considering a computer is that computers are machines. Incredible, wondrous, bordering on magical machines, but machines nonetheless. They were built by people who are no smarter than you, and designed by people every bit as fallible as you. There are no magic incantations, no special spells, and no generational gap that make one group of people better able to understand computers than another. Computers are machines, machines that you can understand.
So let’s get started.
Viewed from the outside, a computer has four main components. There are normally two ways to interact with the computer, a mouse and a keyboard. There is a way to view the results of your interaction, the monitor, the screen that is the most visible part of the machine. And there is the case that encloses the actual guts of the machine, the engine if you will.
These main external parts are present in computers of all sizes, even a smart phone, although the input mechanism and the screen have been combined.
Inside the case, all computers share the same basic components: CPU, RAM, storage, networking, and input and output ports. Let’s go over each component.
CPU
The central processing unit, or CPU, can be considered the brain of the computer. At a high level, you can think of it as making decisions, very simple decisions, very, very quickly. The speed at which it makes these decisions is measured in hertz. A CPU that operated at one hertz would make one decision per second, which would be an incredibly slow computer. Most CPUs today are measured in gigahertz. For example, the computer I’m typing this on has a CPU speed of 2.66 GHz, which means it can make 2,660,000,000 decisions per second.
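The gigahertz arithmetic is simple enough to check with a couple of lines of Python, one giga being a billion:

```python
def cycles_per_second(ghz: float) -> int:
    """Convert a clock speed in gigahertz to cycles ("decisions") per second."""
    return round(ghz * 1_000_000_000)

print(cycles_per_second(2.66))  # 2660000000
```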
Additionally, most CPUs come with multiple cores. A single CPU with two or more cores can be thought of as having the ability to follow more than one decision, or thread of decisions, at a time. Each core runs at the same speed, but can operate on different information.
Generally, the faster the CPU speed, the faster the computer can process information, which should make it feel faster to the user. However, the other parts of the computer contribute to the overall feel of responsiveness too.
RAM
While the CPU is working on making all those decisions, it needs a place to put things. RAM, which stands for Random Access Memory and is also known as primary storage, is a fast, convenient place for the CPU to put things that it’s currently working on, or that it has worked on recently and might work on again. Here’s an example: if you open up a document in Word, everything you type into the document is being stored in RAM, up until the moment you choose to save the document. Unfortunately, RAM doesn’t persist between reboots, so everything that is saved only in RAM is lost if the computer reboots. If we want to save things on the computer for good, we have to save them to secondary storage, also known as the hard drive.
Storage
There are two options for storage in computers today: traditional spinning-disk hard drives, and the newer and much, much faster solid state drives, called SSDs. Hard drives are slow because inside there is a metal platter spinning at thousands of revolutions per minute, and a mechanical arm holding a head that moves across the disk to read and write the data. While advances in hard drive technology have made access faster over time, the physical limitations of the mechanics make this the slowest part of the computer.
On the other hand, an SSD has no moving parts, and reading and writing data is done entirely electronically. An SSD is an order of magnitude faster than a hard drive, but for the moment it holds less data and costs more. That’s the trade-off today: you either pay more for a faster computer with less storage, or less for a slower computer with more storage.
In my opinion, the price of solid state drives has come down enough that unless you know you need lots of internal storage, go for the SSD. It will make the entire computer feel faster.
Networking
Networking is a complex topic, and to fully understand it is beyond the scope of this introductory article. In a future ATE article we will cover the basics of how the Internet works, but not today. Suffice to say that there are two forms of networking to be concerned with: wired and wireless. Wireless, or WiFi, uses a small radio inside of the computer to connect to a network. The stronger the signal, and the less interference there is with the signal, the better your connection will be. Generally, if you are connecting at home or at school, your computer will share a network with other devices, and will connect to one or more wireless routers or access points. The router will most likely connect to a cable modem (assuming a home network), and the cable modem provides access to the Internet.
Wired networking uses a standard called Ethernet. Basically, there is a cable that connects your computer to a router, and the router connects to the modem for Internet access. Again, there are trade-offs. Wireless access is simple to set up, and lets you move your computer all around your house or school without losing access to the Internet. The cost of this convenience is speed. Although WiFi is getting faster, it is still nowhere near as fast or as reliable as a directly cabled connection.
For me, the convenience of wireless outweighs the speed and reliability benefits of a wired connection. If I used a desktop computer that was always in the same spot, then a wired connection would make far more sense.
Input/Output Ports
Along the side or back of your computer will be a series of ports: different-shaped sockets for different types of cables. These ports allow you to connect additional devices to your computer to extend the usefulness of the machine. For example, you may wish to add an external display to a notebook computer, and would plug it into the Mini DisplayPort. Or, you may wish to plug in an external hard drive to back up your computer, and would plug the drive into a Universal Serial Bus (USB) port. There are normally ports for headphones and microphones, and possibly a place to plug in storage cards from cameras.
Most modern computers will allow you to plug a device into one of these ports and have the computer automatically configure the device for use. Don’t be afraid to plug something in, and if it doesn’t work, don’t be afraid to pull it back out again. That’s what the ports are there for. Some devices, like hard drives, want to be “ejected” before pulling the cable out of the computer, but even that may soon be a thing of the past.
Motherboard
One last part I forgot to mention above: the motherboard. Also known as the “system board”, the motherboard is the glue that connects all the other components together. Everything connects to the motherboard, and thin strips of conductive material printed onto the board called the bus connect the different parts together.
Conclusion
You look at the screen, type on the keyboard, and interact with the mouse or touchpad. Internally, the computer uses the CPU to make decisions, stores things temporarily in RAM, writes data permanently to a storage drive, connects to the Internet over a wired or wireless network connection, and is extensible via the input/output ports. And, all the different parts connect through the motherboard.
Now that we’ve covered the basics of hardware, next week we will talk about some of the software that makes the machine come to life.
A Technical Education
I didn’t grow up with computers. They just weren’t a common thing in Montana in the 80’s. When my family moved to Texas for two years during my sixth and seventh grades, one of my friends had one in her room that we would play Oregon Trail on, but otherwise it was unremarkable. With the exception of video games and VHS tapes, my childhood was very much like the childhoods of the generations before me. If I wanted to see a friend, I’d have to walk over to his house. If I wanted to send someone a letter, I had to sit down and write it out on paper, scratching out misspellings along the way, then fold it up, stuff it in an envelope, lick a stamp and stick it on, and drop it in the mailbox. And then, I’d wait. Sometimes for weeks, sometimes for months. In the past twenty years, however, our world has changed dramatically.
If my daughter wants to talk to someone, she pulls out her phone and sends a text. If she wants to send a longer message, she might, if pressed, sit down at her Mac and send an email. Then she waits five or ten minutes, tops, for a reply. More likely, during those ten minutes she’s sent a Facebook message and posted to Twitter. Computers and the Internet have changed how we interact with each other, and technology has improved faster than our culture and education system have been able to adapt to it.
What are these magic boxes that have intruded on our lives? How do they function? How can we best use them? How can we ensure that we become their master, and not the other way around? There are websites, games, and apps that have become very good at exploiting basic human psychology to extract our personal information, time, and money.
Education is the first and best defense against those who would use our ignorance against us. In the past twenty years, computers have barged their way into the spotlight of nearly every facet of our personal and professional lives, but they are not magic.
I’m starting a series of posts here where we are going to pull back the curtain and see that the wizard is, after all, just a man. We will examine the inner workings of the machine, the components that make up the whole. By the time we finish, you will be able to identify the basic hardware components of a computer and their function, explain what an operating system is and how the main options differ, have a basic understanding of what the Internet is and how it works, and make educated and informed choices about online services.
Reading this series won’t make you an expert on computers, but it is my goal to give you the basic knowledge required to operate computers confidently, and discuss the available options intelligently.
Look for weekly updates to A Technical Education right here.
Home Built Software and Systems
GigaOm is running an article written by Ralph Dangelmaier, the CEO of BlueSnap, claiming “We’ve reached the end of ‘build it yourself’ software.” It’s a nice thought, along the same lines as “We’ve reached the end of ‘host it yourself’ hardware,” and “We’ve reached the end of you needing anything other than what someone else has already developed.” In the fourteen years I’ve been in the industry, though, the systems I’ve seen run best are the ones hosted on our own hardware, running our own code. Off-the-shelf software can be great for certain situations, but if you are outsourcing a core function of your business, what kind of value are you really providing?
Admittedly, building your own software from scratch is too much for most. However, if you use the building blocks of open source correctly, you gain the best of both worlds: functionality and flexibility.
Dangelmaier’s claims center around an odd story of a company that, nearly sixty years ago, started building entire houses using an assembly-line technique. The company could spit out up to thirty homes per day; thirty identical homes. I’m sure they were affordable at the time; what I wonder is how many of those homes are still standing today. When applying that same thought process to software systems, the concept of being able to slightly customize assembly-line software starts to break down as soon as the needs of the business start bumping up against the upper limits of the purchased software.
If you never need to run that Windows only application on anything other than a single server, you might be fine. As soon as you need to expand that system to provide high availability, failover, or disaster recovery, things start to fall apart, and costs go through the roof. The initial pain of developing the software yourself is made up for later by having the flexibility to modernize and adapt your system to changing times.
I’ve recently started looking at building out my own system based on FreeBSD jails. I’ve had a fascination with what I call the beautiful system for years, I think it’s high time I stopped making prototypes and built something worthwhile.
The Apple I Knew
As usual, John Gruber has the best take on the Apple Watch that I’ve read, and one sentence in particular stood out.
Rather, I think Apple Watch is the first product from an Apple that has outgrown the computer industry.
The Apple that is releasing that watch is not the same scrappy underdog from decades past. This is the new Apple, a massive powerhouse making the best products in the industries they enter. Computers, phones, tablets, and now, watches. This isn’t the same Apple that advertised their new operating system to Unix geeks.
Or, is it?
I don’t think the Apple Watch is a product designed for me, and that’s fine. I’m happy to see Apple grow and mature, as long as we keep seeing hints that they are still the same company with the same values, simply expressed in different ways. The Apple in the Unix ad above valued simplicity, beauty, power, and obsessive attention to detail. When I look at that watch I see the expression of those values in a new product.
The Unix ad above drew me to the Mac, and I’ve stayed because of the community. The community came together because we all shared the values we saw expressed in the products Apple made, and in their own statements. There are always going to be a few missteps along the way, some ham-handed attempts, and inelegant solutions. There will be times when Apple does things that are embarrassing, or just flat out wrong, but they’ve been doing that all along.
Sometimes they don’t pay quite close enough attention in their betas, which worries us:
Why am I worried about iOS 8? I keep seeing things like this: twitter.com/bradleychamber…
— Dr. Drang (@drdrang) Sep 16 2014 8:50 AM
Sometimes we see trends with their software quality that worry us:
The sad truth is that EVERYONE is rushing software out the door because of Apple product releases.
Not a sustainable activity for ANYONE…
— Craig Hockenberry (@chockenberry) Sep 15 2014 2:16 PM
But, really, these are things we’ve been seeing all along. In fact, it used to be common knowledge that a new OS X release would not be stable till at least the 10.x.3 release.
One of the endearing qualities of Apple is that their reach almost always exceeds their grasp. They are daring greatly, aspiring to do things that the tech industry simply doesn’t understand, and that they may or may not be able to pull off. Stretching a little further, a little wider, straining at times to accomplish their goal.
I think Gruber is right, he normally is, and that the Apple Watch will sell well. How the Mac, OS X, and the rest of the ecosystem evolve along with Apple will be exciting to watch.
Marked Down
If you really, really care about Markdown, Jeff Atwood of Coding Horror and Stack Exchange fame has a new project for you. Apparently, Jeff didn’t think Markdown’s original creator’s care of the code was quite up to snuff, and decided to build a new project to more accurately codify the syntax and implementation details. All good things, if, again, you really care about such details. If, however, you are using Markdown like the majority of us, to make writing on the web a bit easier, well, this all might go by unnoticed. At least, it probably would have if Jeff had named his project anything other than “Standard Markdown”.
Markdown has two parts. First, a very bare syntax that defines things like links, italics, and headings. Second, a small but very clever perl script that parses the Markdown text and converts it into HTML markup. Over the years several other people have written their own parsers for Markdown text, which has led to a fantastic array of available editors and parsers for all platforms, allowing writers to concentrate on writing and not get bogged down in the details of actually putting our text on the web. Jeff’s heartache seems to be that each of these parsers rendered HTML a bit differently. Gruber has no problem with that, and, for what it’s worth, neither do I, but it seems to bother Atwood quite a bit.
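To make the two parts concrete, here is a toy converter handling only the three constructs mentioned above (headings, links, italics). It is a sketch for illustration, not Gruber’s perl script or any real parser, and it shows exactly where the renderers can diverge: each regex embodies one interpretation of the syntax.

```python
import re

def tiny_markdown(text: str) -> str:
    """Convert a tiny subset of Markdown to HTML: headings, links, italics."""
    # ATX headings: "# Title" -> <h1>Title</h1> (the number of #s sets the level)
    text = re.sub(r'^(#{1,6})\s+(.+)$',
                  lambda m: f'<h{len(m.group(1))}>{m.group(2)}</h{len(m.group(1))}>',
                  text, flags=re.MULTILINE)
    # Inline links: [text](url) -> <a href="url">text</a>
    text = re.sub(r'\[([^\]]+)\]\(([^)]+)\)', r'<a href="\2">\1</a>', text)
    # Italics: *word* -> <em>word</em>
    text = re.sub(r'\*([^*]+)\*', r'<em>\1</em>', text)
    return text

print(tiny_markdown('# Hello\nSee *this* [link](http://example.com).'))
```

Even in this toy, judgment calls pile up: does `*` inside a URL count as emphasis, are headings allowed without a space after the `#`? Multiply those calls across a full syntax and you get exactly the divergence between parsers that Atwood objects to.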
There is only one “standard” markdown, and it’s a perl script written in 2004, hosted at Daring Fireball. Everything else is a derivative work, and for Atwood to claim the name Standard Markdown is wrong. He did not create the syntax or the original parser, and that he is unsatisfied with the handling of the pair is immaterial. It doesn’t matter how he feels about it, he should name his project something else.
I actually don’t care all that much about whether there is a spec for Markdown. I use various aspects of the language all day, every day, on every computer I touch, and despite Jeff Atwood’s stated motivation, I’ve never once cared about the project’s stewardship. I care that it is uncomplicated and easy to read.
Gruber created something that he wanted to use, then put it out there for the world to use, and in the ten years since he last updated it, Markdown has become extremely popular. However, just because the idea became popular does not mean that anyone is entitled to demand anything more from the original creator. Markdown works for me every day, and I imagine it will continue to do so as long as perl works, no matter what the spec is.
Small Site Update
I’ve been publishing this site with Jekyll for several years. I’m not sure exactly when I switched over from Wordpress, but it’s long enough ago that I’ve forgotten when I started.1 Over the past few weeks I’ve run into a few issues with Jekyll that have caused me to reevaluate if it was still the right choice for me. The short answer is no, the long answer is that this site is now published with my own Python script.
List of Grievances
Jekyll is popular enough with the geek crowd that there are probably reasonable solutions to everything listed below. However, that would assume that I’m reasonable, which I think we’ve established is not always the case. And besides, something Dr. Drang said the other day has been stuck in my head:
the great advantage of making your own software is that you can customize it to match your own idiosyncrasies.
Thus, 370 lines of Python. On to the motivation to move.
- Dependencies
Strictly speaking, there are not that many Ruby dependencies for Jekyll, and they are all automatically installed when running gem install jekyll. To be able to compile the gems, though, you need either the full Xcode IDE installed or, at a minimum, the Xcode command line tools. Not much, but still more than I thought necessary to parse text and move files.
- Lost Pages
One of the ways I used Jekyll was to build an internal site where I work. I use the site to keep coworkers updated on what I’m working on, but more importantly I use it to publish reports. The reports are kept in a separate “/reports” directory under the site root, and Jekyll used to automatically compile the Markdown to HTML in that directory along with the rest of the site. I’m not sure what happened, but at some point that stopped working, and when I rsync’d my site using the “--delete” flag, all my reports were gone. Luckily, I had a backup, so I was able to quickly restore the reports, but once I realized what had happened I had to rethink my “modern living document”.2 A process I was in the middle of when I encountered the next grievance.
- Failure to Build Site
Jekyll failed to build my site last week because of a UTF-8 error, and that was all I needed to start looking for something else. Apparently there was a special character in the title of one of my posts. Again, this wasn’t anything new; that post had built successfully before, and I wasn’t adding anything new at the time. Something changed, I don’t know what, and troubleshooting the error led me down a rabbit hole of Ruby bugs I didn’t want to go down.
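In hindsight, the lost-reports mishap could have been blunted by telling rsync to leave the reports directory alone, since files matched by --exclude are also protected from --delete. A hedged sketch of building such an invocation; the helper, paths, and directory names here are my own illustration, not the actual publishing script:

```python
def rsync_command(src, dest, protected=('reports',)):
    """Build an rsync invocation that mirrors src to dest but refuses
    to delete anything under the protected directories."""
    cmd = ['rsync', '-a', '--delete']
    for name in protected:
        # --exclude shields these paths from both transfer and deletion
        cmd.append('--exclude=/%s/' % name)
    cmd += [src.rstrip('/') + '/', dest]
    return cmd

print(' '.join(rsync_command('_site', 'server:/var/www/blog')))
```

With the exclude in place, rsync will mirror everything else faithfully while treating the protected directory as invisible, so a generator that silently stops producing those files can’t take the published copies down with it.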
Options
I evaluated, and discarded, several options.
- WordPress.com
- Self-Hosted WordPress
- Squarespace
- Ghost
- Hakyll
- Hyde
- Hugo
I briefly looked at a few others, but these were the ones that received the most thought.