
Shellshocked Security Specialists

Posted on September 30, 2014

Between 2000 and 2003 I was part of a small group that was responsible for the security of the network in a remote military base. The work we did there was foundational for the rest of my career, at least so far. Once a week our team shut down for the afternoon to do training, and in the training one of us was responsible for researching a topic in depth and then presenting it to the rest of the team. We built web servers, firewalls, and proxies with OpenBSD, managed our intrusion detection system that we designed and installed ourselves, we even built a honeypot to watch malicious traffic. We spent a lot of long nights, and did a lot of hard work, but it paid off.

Part of the reason this was such a unique experience was the time. The Internet was small, and slow, and the tools we had available were rudimentary and low level. Part of it was the place: we were part of the Navy, but we were remote and isolated, and not under the constant watchful eye of senior brass. We were able to get away with more because there was no one around to tell us we couldn’t. But more than either of these, the main reason was Senior. Senior was the Senior Chief in charge of our team, and he was obsessed with security. He taught college level courses for the University of Maryland, would speak at the annual Black Hat conference, and did security consulting on the side. I recently learned he’s also an author of several books. His obsession with security was only matched by his obsession with education, and because we were his team, we were the beneficiaries of both.

So, for years, night and day, we built servers, ran scans, performed penetration testing, and researched, researched, researched. We were all in front of the firehose.

It was fantastic.

I left the security field shortly after transferring out of the team. I simply couldn’t stand the level of incompetence in the general IT population among so-called “security specialists”. The field outside of our team was composed of people concerned with paperwork and constant one-upmanship. This kind of security specialist seemed more interested in telling you what you couldn’t do than in building things that let you do the things you want. I’m a builder.

The recent hubbub over the so-called “shellshock” vulnerability in bash reminded me again of why I left the security field. I was invited to a live screencast with a chat room put on by a big name in the security industry, so I logged on to have a few things clarified. I was unimpressed.

The vulnerability in bash is only remotely exploitable if your web application is calling out to bash, and specifically bash. The presenters came up with a few ideas when I asked them for clarification on the specifics, but they seemed contrived to me. I asked about Perl CGIs, and they said the application could be vulnerable if Perl called out to bash for any exec() or system() calls, but when I asked for specifics they didn’t have any. I doubt that would work, since Perl would need to pass the malformed bash environment variables along to the child process, but I’d be willing to be proven wrong.
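The environment-propagation step, at least, is easy to demonstrate. Here is a minimal Python sketch (standing in for a Perl CGI, since the inheritance mechanism is the same): a parent process sets a crafted-looking environment variable, the way a web server copies request headers into the environment, and a spawned child inherits it by default. Whether bash would then misparse the value is a separate question that depends on the bash version; all this shows is the inheritance that would be required. The variable name EVIL is invented for the example.

```python
import os
import subprocess
import sys

# A web server (or CGI script) typically copies request headers into the
# environment of the processes it spawns. Simulate that with a
# Shellshock-style value (harmless here; nothing parses it as a function).
env = dict(os.environ)
env["EVIL"] = "() { :; }; echo injected"  # hypothetical header value

# Any child process started with this environment inherits the variable.
# The child is just another Python process that prints what it received.
child = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ.get('EVIL', ''))"],
    env=env,
    capture_output=True,
    text=True,
)
print(child.stdout.strip())  # the crafted value arrived intact
```

The point of the sketch is the author's caveat above: the exploit only matters if the intermediate program actually hands its environment to a vulnerable bash, which is exactly what a default `subprocess`/`exec` call does.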

Second, they said that, “similar to heartbleed”, some applications could bundle a precompiled copy of bash instead of depending on the system bash. I asked if they could provide an example of an application that did that, but they could not. I have never heard of an application doing that, and if I found one, I would consider it incredibly poor development practice. That would be an application to avoid at all costs. I asked on Twitter if anyone else had heard of something like that, but received no response.

These kinds of statements are a problem because they are inaccurate and generalizing in a field that depends on exact and precise correctness. It doesn’t change the fact that the bug still needs to be patched, if only to cover the areas we don’t know about, but having a correct understanding of what the exposure of the bug is, how it can be exploited, and where there is an actual emergency speaks volumes about the reliability and trustworthiness of the vendor.

Someone can only cry wolf so many times. The heartbleed bug was worth staying up all night to fix. This, as far as I can tell, is not. That doesn’t stop the security firms and media outlets from getting excited about it though.

This kind of fear mongering doesn’t help anyone. What does help people is having the right information so they can make intelligent, informed choices that are right for them. Senior would never have let the kind of flippant comments I heard, being presented as fact from a trusted source, leave the meeting without providing proof. To do so not only makes you look incompetent, but reflects poorly on the entire industry. Are you in this to help protect people? Or just to make a buck?

Finally, there is this garbage of an article by ZDNet: Apple issues incomplete OS X patch for Shellshock, which claims that the patch issued by Apple did not cover all listed CVE vulnerabilities, when, according to my testing, it does.

Testing by ZDNet showed that while the patch fixed the issues outlined in the original CVE-2014-6271 report and CVE-2014-7169, OS X remains vulnerable to CVE-2014-7186.

I ran the tests on my machine for both CVE-2014-7186 and CVE-2014-7187, and both return as not being vulnerable.

I don’t mean to downplay the importance of fixing the bug; I simply want to make sure it is correctly and accurately understood. Tech journalists and security specialists alike benefit from sensational claims, which is a shame when there are so many more pressing problems in the world they could be working on.

A Technical Education - The Operating System

Posted on September 29, 2014

It’s good to think of a computer as something like a cake with several layers. If the hardware is the first, and foundational layer, then the operating system is the second, and applications are the third. Today, we are going to look at that second layer, and leave with a basic understanding of what an operating system is, what it does, and what the differences are between the major operating systems available today.

Chances are, you’ve already heard of a few operating systems, even if you didn’t know what the differences meant. Microsoft’s Windows is the most well known and popular operating system, powering the majority of desktop and laptop computers sold for many years now. Apple has two operating systems, OS X for desktop and notebook computers, and iOS for iPhones and iPads. Likewise, Google has two operating systems, Android and Chrome OS, both based on the open source (more on that term at a later date) Linux kernel.

An operating system (OS) has two major functions. The first is to provide control of, and access to, the hardware of the computer. Without an operating system, when a computer powers on it will simply sit there with nothing to do. Computers have a small built-in OS, conveniently called the BIOS, which provides access to some very basic settings, but its main job is to find the disk that contains the real OS and start it, or “boot it up”.

When the BIOS finds the operating system, the first thing it does is load the kernel into memory. The kernel is the core of the operating system, the control point that manages access to all other resources of the machine. If there is a problem in the kernel, the entire computer crashes, which, thankfully, doesn’t happen all that often anymore. Next, a series of other programs are started that manage various functions of the computer. Eventually, the OS finishes starting all the programs that it needs to provide all the background services we rely on, and it is ready for the user.

The second function of an operating system is to provide a framework for third party applications to build on. The OS provides hooks and libraries, reusable blocks of code made available to other applications to give the OS a unified and cohesive look and feel, no matter which application is in use. For example, if an application wants to draw a window on the screen, the developer doesn’t have to write all the code required to draw the window herself, she simply has to make the right calls to the OS and have the system draw the window for her. There are thousands of these programming interfaces available on each OS, but each OS does things differently. Windows not only looks different from OS X, it feels different too. That’s because the way it functions, and the programming interfaces it presents to third party developers, are very, very different internally from OS X.

This is why applications made for one operating system are often not available for another, and if they are, sometimes they look and feel a bit out of place. This is also why you can’t simply move an application from one OS to another and expect it to work. There are other, more technical explanations that have to do with compilers and runtime environments and interrupts, but for now, the important thing to understand is that applications are tied fairly deeply into the OS they were built to support.

One last item that is important to understand is the concept of device drivers. Let’s say you have a printer, and that printer comes with a USB cable that you plug into your computer. Your computer needs to know how to talk to that printer in a way it understands, so when the paper is printed, it looks the way it should. Operating systems use a special program known as a device driver that is normally provided by the manufacturer of the printer (or whatever other device you want to connect to) and developed specifically for your operating system. Device drivers have special privileges that allow them to talk to the hardware, and allow other programs or applications to talk to the driver.

Modern operating systems include drivers for thousands of different devices, and a mechanism that allows them to automatically download new drivers when they come across a device they don’t currently have. However, there may still be times when the operating system doesn’t know what to do with the device that’s plugged into it, and you may have to load the driver yourself. Normally, the manufacturer includes an installer along with the device, but unless the device is brand new, installing that bundled driver can be a problem. Third party manufacturers don’t move at the same speed as operating system development, so the included drivers are often behind. Since device drivers operate at a low level in the operating system, having one that is out of date or faulty in some way can be a source of system instability. If the device you are loading a driver for is more than a year old, or if the stated supported operating system on the box is older than the one you are using, the best thing to do is visit the manufacturer’s web site and download an up-to-date driver. Remember, though, that this is only necessary if the operating system doesn’t do it for you automatically.

So, the basics of an operating system are that it manages access to the hardware, provides a framework for third party development, and manages the drivers needed for extending the computer with third-party devices. This just barely skims the top of what an operating system is and can do, if you’d like to know more, the Wikipedia article is a great next step.

A Technical Education - Base Level Hardware

Posted on September 20, 2014

The first and most important thing to remember when considering a computer is that computers are machines. Incredible, wondrous, bordering on magical machines, but machines nonetheless. They were built by people who are no smarter than you, and designed by people every bit as fallible as you. There are no magic incantations, no special spells, and no generational gap that make one group of people better able to understand computers than another. Computers are machines, machines that you can understand.

So let’s get started.

Viewing a computer from the outside, there are four main components. There are normally two ways to interact with the computer: a mouse and a keyboard. There will be a way to view the results of your interaction, the monitor, the screen that is the most visible part of the machine. And there will be the case that encloses the actual guts of the machine, the engine if you will.

These main external parts are present in computers of all sizes, even a smart phone, although the input mechanism and the screen have been combined.

Inside the case, all computers share the same basic components: CPU, RAM, storage, networking, and input and output ports. Let’s go over each component.

CPU

The central processing unit, or CPU, can be considered the brain of the computer. At a high level, you can think of it as making decisions, very simple decisions, very, very quickly. The speed at which it makes these decisions is measured in hertz. A CPU that operated at one hertz would make one decision per second, but that would be an incredibly slow computer. Most computers now work with CPUs measured in gigahertz. For example, the computer I’m using to type this on has a CPU speed of 2.66 GHz, which means it can make 2,660,000,000 decisions per second.

Secondly, most CPUs come with multiple cores. A single CPU with two or more cores can be thought of as having the ability to make more than one decision, or threads of decisions, at a time. Each core runs at the same speed, but can operate on different information.

Generally, the faster the CPU speed, the faster the computer can process information, which should make it feel faster to the user. However, the other parts of the computer contribute to the overall feel of responsiveness too.

RAM

While the CPU is working on making all those decisions, it needs a place to put things. RAM, which stands for Random Access Memory and is also known as primary storage, is a fast, convenient place for the CPU to put things that it’s currently working on, or that it has worked on recently and might work on again. Here’s an example: if you open up a document in Word, everything you are typing into the document is being stored in RAM, up until the moment you choose to save the document. Unfortunately, RAM doesn’t persist between reboots, so everything that is saved only in RAM is lost if the computer reboots. If we want to save things on the computer for good, we have to save them to secondary storage, also known as the hard drive.

Storage

There are two options for storage in computers today: traditional spinning disk hard drives, and the newer and much, much faster solid state drives, called SSDs. Hard drives are slow because on the inside there is a metal platter spinning at thousands of revolutions per minute, and a mechanical arm holding a head that moves across the disk to read and write the data. While advances in hard drive technology have made access faster over time, the physical limitations of the mechanics make this the slowest part of the computer.

On the other hand, an SSD has no moving parts, and reading and writing the data is done entirely electronically. An SSD is an order of magnitude faster than a hard drive, but for the moment it holds less data and is more expensive. That’s the trade-off today: you either pay more for a faster computer with less storage, or less for a slower computer with more storage.

In my opinion, the price of solid state drives has come down enough that unless you know you need lots of internal storage, go for the SSD. It will make the entire computer feel faster.

Networking

Networking is a complex topic, and fully understanding it is beyond the scope of this introductory article. In a future ATE article we will cover the basics of how the Internet works, but not today. Suffice it to say that there are two forms of networking to be concerned with: wired and wireless. Wireless, or WiFi, uses a small radio inside of the computer to connect to a network. The stronger the signal, and the less interference there is with the signal, the better your connection will be. Generally, if you are connecting at home or at school, your computer will share a network with other devices, and will connect to one or more wireless routers or access points. The router will most likely connect to a cable modem (assuming a home network), and the cable modem provides access to the Internet.

Wired networking uses a standard called Ethernet. Basically, there is a cable that connects your computer to a router, and the router connects to the modem for Internet access. Again, there are trade-offs. Wireless access is simple to set up, and lets you move your computer all around your house or school without losing access to the Internet. The cost of this convenience is speed. Although WiFi is getting faster, it is still nowhere near as fast or as reliable as a directly cabled connection.

For me, the convenience of wireless outweighs the speed and reliability benefits of a wired connection. If I used a desktop computer that was always in the same spot, then a wired connection would make far more sense.

Input/Output Ports

Along the side or back of your computer will be a series of ports; different shaped plugs for different types of cables. These ports allow you to connect additional devices to your computer to extend the usefulness of the machine. For example, you may wish to add an external display to a notebook computer, and would plug into the mini-display port. Or, you may wish to plug in an external hard drive to backup your computer, and would plug the drive into the Universal Serial Bus (USB) port. There are normally ports for headphones and microphones, and possibly a place to plug in storage cards from cameras.

Most modern computers will allow you to plug a device into one of these ports and have the computer automatically configure the device for use. Don’t be afraid to plug something in, and if it doesn’t work, don’t be afraid to pull it back out again. That’s what the ports are there for. Some devices, like hard drives, want to be “ejected” before pulling the cable out of the computer, but even that may soon be a thing of the past.

Motherboard

One last part I forgot to mention above: the motherboard. Also known as the “system board”, the motherboard is the glue that connects all the other components together. Everything connects to the motherboard, and thin strips of conductive material printed onto the board called the bus connect the different parts together.

Conclusion

You look at the screen, type on the keyboard, and interact with the mouse or touchpad. Internally, the computer uses the CPU to make decisions, stores things temporarily in RAM, writes data permanently to a storage drive, connects to the Internet over a wired or wireless network connection, and is extensible via the input/output ports. And, all the different parts connect through the motherboard.

Now that we’ve covered the basics of hardware, next week we will talk about some of the software that makes the machine come to life.

The Apple I Knew

Posted on September 16, 2014

As usual, John Gruber has the best take on the Apple Watch that I’ve read, and one sentence in particular stood out.

Rather, I think Apple Watch is the first product from an Apple that has outgrown the computer industry.

The Apple that is releasing that watch is not the same scrappy underdog from decades past. This is the new Apple, a massive powerhouse making the best products in the industries they enter. Computers, phones, tablets, and now, watches. This isn’t the same Apple that advertised their new operating system to Unix geeks.

Or, is it?

I don’t think the Apple Watch is a product designed for me, and that’s fine. I’m happy to see Apple grow and mature, as long as we keep seeing hints that they are still the same company with the same values, simply expressed in different ways. The Apple in the Unix ad above valued simplicity, beauty, power, and obsessive attention to detail. When I look at that watch I see the expression of those values in a new product.

The Unix ad above drew me to the Mac, and I’ve stayed because of the community. The community came together because we all shared the values we saw expressed in the products Apple made, and in their own statements. There are always going to be a few missteps along the way, some ham-handed attempts, and inelegant solutions. There will be times when Apple does things that are embarrassing, or just flat out wrong, but they’ve been doing that all along.

Sometimes they don’t pay quite close enough attention in their betas, which worries us:

Why am I worried about iOS 8? I keep seeing things like this: twitter.com/bradleychamber…
Dr. Drang (@drdrang) Sep 16 2014 8:50 AM

Sometimes we see trends with their software quality that worry us:

The sad truth is that EVERYONE is rushing software out the door because of Apple product releases.

Not a sustainable activity for ANYONE…

Craig Hockenberry (@chockenberry) Sep 15 2014 2:16 PM

But, really, these are things we’ve been seeing all along. In fact, it used to be common knowledge that a new OS X release would not be stable till at least the 10.x.3 release.

One of the endearing qualities of Apple is that their reach almost always exceeds their grasp. They are daring greatly, aspiring to do things that the tech industry simply doesn’t understand, and that they may or may not be able to pull off. Stretching a little further, a little wider, straining at times to accomplish their goal.

I think Gruber is right, he normally is, and that the Apple Watch will sell well. How the Mac, OS X, and the rest of the ecosystem evolve along with Apple will be exciting to watch.

Home Built Software and Systems

Posted on September 12, 2014

GigaOm is running an article written by Ralph Dangelmaier, the CEO of BlueSnap, claiming “We’ve reached the end of ‘build it yourself’ software.” It’s a nice thought, along the same lines as “We’ve reached the end of ‘host it yourself’ hardware,” and “We’ve reached the end of you needing anything other than what someone else has already developed.” In the fourteen years I’ve been in the industry, though, the systems I’ve seen run the best are the ones hosted on our own hardware running our own code. Off-the-shelf software can be great for certain situations, but if you are outsourcing a core function of your business, what kind of value are you really providing?

Admittedly, building your own software from scratch is too much for most. However, if you use the building blocks of open source correctly, you gain the best of both worlds: functionality and flexibility.

Dangelmaier’s claims center on an odd story of a company nearly sixty years ago that started building entire houses using an assembly-line technique. The company could spit out up to thirty homes per day; thirty identical homes. I’m sure they were affordable at the time; what I wonder is how many of those homes are still standing today. When you apply that same thought process to software systems, the concept of slightly customizing assembly-line software starts to break down as soon as the needs of the business bump up against the upper limits of the purchased software.

If you never need to run that Windows only application on anything other than a single server, you might be fine. As soon as you need to expand that system to provide high availability, failover, or disaster recovery, things start to fall apart, and costs go through the roof. The initial pain of developing the software yourself is made up for later by having the flexibility to modernize and adapt your system to changing times.

I’ve recently started looking at building out my own system based on FreeBSD jails. I’ve had a fascination with what I call the beautiful system for years, I think it’s high time I stopped making prototypes and built something worthwhile.

A Technical Education

Posted on September 12, 2014

I didn’t grow up with computers. They just weren’t a common thing in Montana in the 80’s. When my family moved to Texas for two years during my sixth and seventh grades, one of my friends had one in her room that we would play Oregon Trail on, but otherwise it was unremarkable. With the exception of video games and VHS tapes, my childhood was very much like the childhoods of the generations before me. If I wanted to see a friend, I’d have to walk over to his house. If I wanted to send someone a letter, I had to sit down and write it out on paper, scratching out misspellings along the way, then folding it up, stuffing it in an envelope, licking a stamp, and dropping it in the mailbox. And then, I’d wait. Sometimes for weeks, sometimes for months. In the past twenty years however, our world has changed dramatically.

If my daughter wants to talk to someone, she pulls out her phone and sends a text. If she wants to send a longer message, she might, if pressed, sit down at her Mac and send an email. Then she waits five or ten minutes, tops, for a reply. More likely, during those ten minutes she’s sent a Facebook message and posted to Twitter. Computers and the Internet have changed how we interact with each other, and technology has improved faster than our culture and education system has been able to adapt to it.

What are these magic boxes that have intruded on our lives? How do they function? How can we best use them? How can we ensure that we become their master, and not the other way around? There are websites, games, and apps that have become very good at exploiting basic human psychology to extract our personal information, time, and money.

Education is the first and best defense against those who would use our ignorance against us. In the past twenty years, computers have barged their way into the spotlight of nearly every facet of our personal and professional lives, but they are not magic.

I’m starting a series of posts here where we are going to pull back the curtain and see that the wizard is, after all, just a man. We will examine the inner workings of the machine, the components that make up the whole. By the time we finish, you will be able to identify the basic hardware components of a computer and their functions, explain what an operating system is and how the main options differ, have a basic understanding of what the Internet is and how it works, and make educated and informed choices about online services.

Reading this series won’t make you an expert on computers, but it is my goal to give you the basic knowledge required to operate computers confidently, and discuss the available options intelligently.

Look for weekly updates to A Technical Education right here.

Marked Down

Posted on September 04, 2014

If you really, really care about Markdown, Jeff Atwood of Coding Horror and Stack Exchange fame has a new project for you. Apparently, Jeff didn’t think Markdown’s original creator’s care of the code was quite up to snuff, and decided to build a new project to more accurately codify the syntax and implementation details. All good things, if, again, you really care about such details. If, however, you are using Markdown like the majority of us, to make writing on the web a bit easier, well, this all might go by unnoticed. At least, it probably would have if Jeff had named his project anything other than “Standard Markdown”.

Markdown has two parts. First, a very bare syntax that defines things like links, italics, and headings. Second, a small but very clever perl script that parses Markdown text and converts it into HTML markup. Over the years several other people have written their own parsers for Markdown text, which has led to a fantastic array of available editors and parsers for all platforms. This lets writers concentrate on writing, and not get bogged down in the details of actually putting our text on the web. Jeff’s heartache seems to be that each of these parsers renders HTML a bit differently. Gruber has no problem with that, and, for what it’s worth, neither do I, but it seems to bother Atwood quite a bit.
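To make the “two parts” concrete, here is a toy sketch of the second part, written in Python rather than perl: a converter that handles just three pieces of the syntax (headings, emphasis, and links). Real parsers handle far more, and they differ precisely in the edge cases a toy like this ignores, which is the whole source of the rendering disagreements above. The function name and the regexes are invented for the example, not taken from any real implementation.

```python
import re

def tiny_markdown(text: str) -> str:
    """Convert a tiny subset of Markdown to HTML: # headings,
    *emphasis*, and [text](url) links. A toy, not a real parser."""
    html_lines = []
    for line in text.splitlines():
        # [text](url) -> <a href="url">text</a>
        line = re.sub(r"\[([^\]]+)\]\(([^)]+)\)", r'<a href="\2">\1</a>', line)
        # *emphasis* -> <em>emphasis</em>
        line = re.sub(r"\*([^*]+)\*", r"<em>\1</em>", line)
        # a leading run of # marks a heading of that level
        m = re.match(r"(#{1,6})\s+(.*)", line)
        if m:
            level = len(m.group(1))
            line = f"<h{level}>{m.group(2)}</h{level}>"
        elif line:
            line = f"<p>{line}</p>"
        html_lines.append(line)
    return "\n".join(html_lines)

print(tiny_markdown("# Hello\nVisit [my site](http://example.com) *today*."))
```

Even at this size you can see where implementations diverge: should `*a * b*` be emphasis? Is a heading allowed without a space after the `#`? Every parser answers such questions slightly differently.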

There is only one “standard” Markdown, and it’s a perl script written in 2004, hosted at Daring Fireball. Everything else is a derivative work, and for Atwood to claim the name Standard Markdown is wrong. He did not create the syntax or the original parser, and that he is unsatisfied with the stewardship of the pair is immaterial. It doesn’t matter how he feels about it; he should name his project something else.

Gabe Weatherhead said it best on MacDrifter:

I actually don’t care all that much about whether there is a spec for Markdown. I use various aspects of the language all day every day. I use it on every computer I touch. That’s a statement against Jeff Atwood’s express motivation. I’ve never once cared about the project’s stewardship. I care that it is not complicated and it’s easy to read.

Gruber created something that he wanted to use, then put it out there for the world to use, and in the ten years since he last updated it, Markdown has become extremely popular. However, just because the idea became popular does not mean that anyone is entitled to demand anything more from the original creator. Markdown works for me every day, and I imagine it will continue to do so as long as perl works, no matter what the spec is.

Small Site Update

Posted on July 11, 2014

I’ve been publishing this site with Jekyll for several years. I’m not sure exactly when I switched over from Wordpress, but it’s long enough ago that I’ve forgotten when I started.[1] Over the past few weeks I’ve run into a few issues with Jekyll that have caused me to reevaluate if it was still the right choice for me. The short answer is no, the long answer is that this site is now published with my own Python script.

List of Grievances

Jekyll is popular enough with the geek crowd that there are probably reasonable solutions to everything listed below. However, that would assume that I’m reasonable, which I think we’ve established is not always the case. And besides, something Dr. Drang said the other day has been stuck in my head:

the great advantage of making your own software is that you can customize it to match your own idiosyncrasies.

Thus, 370 lines of Python. On to the motivation to move.

  • Dependencies

Strictly speaking, there are not that many Ruby dependencies for Jekyll, and they are all automatically installed when running gem install jekyll. But to be able to compile the gems, you need to have either the full Xcode IDE installed, or at a minimum the Xcode command line tools. Not much, but still more than I thought necessary to parse text and move files.

  • Lost Pages

One of the ways I used Jekyll was to build an internal site where I work. I use the site to keep coworkers updated with what I’m working on, but more importantly I use the site to publish reports. The reports are kept in a separate “/reports” directory under the site root, and Jekyll used to automatically compile the Markdown to HTML in that directory along with the rest of the site. I’m not sure what happened, but at some point that stopped working, and when I rsync’d my site using the “--delete” flag, all my reports were gone. Luckily, I had a backup, so I was able to quickly restore the reports, but once I realized what had happened I had to rethink my “modern living document”. [2] A process I was in the middle of when I encountered the next grievance.

  • Failure to Build Site

Jekyll failed to build my site last week because of a UTF-8 error; it was all I needed to start looking for something else. Apparently there was a special character in the title of one of my posts. Again, this wasn’t anything new; that post must have been built successfully before, because I wasn’t writing anything new at the time. Something changed, I don’t know what, and troubleshooting this error led me down a rabbit hole of Ruby bugs I didn’t want to go down.

Options

I evaluated, and discarded, several options.

  • Wordpress.com
  • Self-Hosted Wordpress
  • Squarespace
  • Ghost
  • Hakyll
  • Hyde
  • Hugo

I briefly looked at a few others, but these were the ones that received the most thought.
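For the curious, the core of a script like the one I ended up with fits in a few lines. This is a hedged sketch, not my actual 370 lines: it assumes posts are plain-text files in a source directory with the title on the first line, wraps each one in a minimal HTML template, and writes the results to an output directory. The names build_site and TEMPLATE are invented for the example.

```python
from pathlib import Path

TEMPLATE = """<!DOCTYPE html>
<html><head><title>{title}</title></head>
<body>{body}</body></html>"""

def build_site(src: Path, dest: Path) -> int:
    """Render every .txt post under src into an .html file under dest.
    Returns the number of pages written."""
    dest.mkdir(parents=True, exist_ok=True)
    count = 0
    for post in sorted(src.glob("*.txt")):
        text = post.read_text(encoding="utf-8")
        # first line is the title, the rest is the body
        title, _, body = text.partition("\n")
        # blank-line-separated chunks become paragraphs
        paragraphs = "".join(
            f"<p>{p.strip()}</p>" for p in body.split("\n\n") if p.strip()
        )
        page = TEMPLATE.format(title=title.strip(), body=paragraphs)
        (dest / (post.stem + ".html")).write_text(page, encoding="utf-8")
        count += 1
    return count
```

A real generator adds Markdown conversion, an index page, and an rsync step, but the shape, read, transform, write, stays this simple, which is exactly why the dependency weight of the alternatives above felt unnecessary.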


  1. There was, of course, Paragraphs, but I’m content to let that go. Making peace with your past, learning from your mistakes, and moving on older and wiser is the only way to live in peace.  ↩

  2. My term for an internal, corporate blog. I maintain it as a way to avoid emailing Word documents and PowerPoint presentations to each other. When someone wants something like that from me, they get a link to an HTML document.  ↩