jb… a weblog by Jonathan Buys

Reinventing The Wheel

October 12, 2012

Or, Redesigning the Tablet Computer

This is an old paper I wrote in college in the Spring of 2006. I wrote this before the iPad, iPhone, or iOS was released, so it looks a little silly and dated now, but I still want to keep it around.

Evolution

Somewhere around 2000 BC, man discovered that mathematics was not exactly his strong suit. It became apparent that by using a tool external to his mind he could perform far more advanced calculations than he could in his head. Thus the abacus was born: the first computer.

Evolution of Computers

Computing technology stagnated for thousands of years before taking a giant leap forward in the 1940s. Advances in other fields of science led to the creation of vacuum tubes and punch cards that were used in the first fully electronic, programmable computers. The first computers were big enough to fill a large room, and had (relatively) little computing power. Over time, computers’ internal components have grown smaller and smaller, to the point that you can now carry a computer with several orders of magnitude more computing power than the first machines in the palm of your hand.

While computer architecture has undergone several changes, the basic way people interface with the computer has only undergone two major transitions. The first transition was the change from purely mechanical forms of data input like punched cards, to a digital interface via a command line. The command line is a simple text interface where a person types in data on a keyboard, and the computer returns information based on what the person typed. The second transition in user-computer interface was the move away from the command line to a Graphical User Interface, or GUI for short. According to Wikipedia:

“A GUI is a method of interacting with a computer through a metaphor of direct manipulation of graphical images and widgets in addition to text.”

The most recognizable computer GUI is the interface to Windows XP. XP uses a “desktop” and “filing cabinet” metaphor to symbolize the location of files within the system. While this works great for simple filing systems, the amount of data users need to keep track of quickly outgrows the now-outdated metaphor.

Necessity

The computer has grown smaller and larger at the same time. The physical size of computers has decreased, while the computing power and storage space have grown exponentially. Many computer users have now been using their machines for several years, and have accumulated a large amount of data in the form of documents, email, photographs, music, movies, and games. Recently, the computer industry released what they are calling the Ultra-Mobile PC, or UMPC for short. The UMPC is the culmination of the shrinking of the computer (my, how times have changed!): a full computer that you can hold in the palm of your hand. UMPCs run a full version of Windows XP Tablet PC edition, and, while functional, they suffer from the complete lack of style that often accompanies the PC marketplace. To remedy the outdated desktop-and-filing-cabinet metaphor and create a new computing utility, we need to think outside the start menu.

The computer operating system is the interface between the user and the computer hardware. It is the software that makes the cold silicon come alive and react to external stimuli. For personal computers, there are several operating systems available, but only Windows XP is widely known, mainly because of Microsoft’s 95% market share. Every PC available for purchase comes preloaded with Windows, and most users do not care to bother with another OS. While this is seemingly convenient, users are being forced to learn a system that does not think like they do.

As stated earlier, there are several different PC operating systems. My personal favorite is the Mac OS from Apple Computer. Mac OS is based on the FreeBSD operating system, a freely available OS. Another type of freely available OS is known as Linux. Linux is “open source”, meaning that the source code is available and can be modified and redistributed by anyone who so desires. Given the “free” and “open” nature of Linux, there are literally hundreds of different “flavors” of Linux. Red Hat is one of the better known flavors, along with Suse, Slackware, and Debian. There are even flavors of Linux that are based on other flavors. In a market dominated by a multi-billion dollar company, Linux is truly an odd bird. It is unfortunate that the two major interfaces to Linux, known as Gnome and KDE (the K Desktop Environment), both borrow heavily from Windows XP, including a start menu, nested menus, and nested folders. Linux, while free, suffers from a major drawback in being both unfamiliar to regular users (not being Windows XP) and unintuitive to use (by trying to be Windows XP). Mac OS does a better job of presenting a more refined interface, but it also suffers from nested folders and other small interface drawbacks.

Gnome

KDE

Using a Mac can be made much more productive by installing the free application named Quicksilver from Blacktree. Quicksilver runs in the background and waits until the user presses a predefined key combination. Once the main Quicksilver window is available, the user types in the first few letters of what he is looking for, followed by a tab, and then the first few letters of what he wants to do with the item selected. For example, to launch the Safari web browser, the user could type S <tab> O <enter> and the application would launch. Quicksilver is a major leap forward in human-computer interaction; however, it currently has a very steep learning curve and takes some getting used to.

Quicksilver

Another great advancement on the Mac is called Spotlight. Spotlight indexes all of the data on your computer, and all of the data about the data (or metadata), in real time. Meaning, if I have an Adobe PDF file that has something about Mezzo in it, Spotlight will find it. Not only files, but emails, music, pictures, and most other forms of data stored on the computer. The Spotlight search functions almost instantly, building its results in real time as you type.

Searching instead of browsing with Spotlight
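
The search-first idea is easy to try from a command line. On macOS, mdfind queries the Spotlight index directly; the sketch below uses a recursive grep as a crude, portable stand-in (the paths and search term are only illustrative):

```shell
# Bury a file in nested folders, then find it by its content alone,
# without knowing or caring where it was filed away.
mkdir -p /tmp/search_demo/deeply/nested/folders
echo "a draft with some notes about Mezzo" > /tmp/search_demo/deeply/nested/folders/paper.txt

# On a Mac, Spotlight answers this from its live index: mdfind Mezzo
# The slow, portable stand-in is a recursive content search:
grep -rl "Mezzo" /tmp/search_demo
```

The difference is that Spotlight maintains its index ahead of time, which is why its results appear as you type instead of after a crawl.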

A creative Mac user named Jason Spisak recognized the shortcomings of the current computer interface and wrote a paper now known as the “Mezzo Greypaper” detailing his idea of a new design for a GUI named Mezzo. In Jason’s paper, he details how major functions should be grouped in the corners of the screen, where they cannot be missed with a mouse. This idea was taken up by a group of Linux developers who are creating a new operating system named SymphonyOS. The SymphonyOS is still in development, but it looks very promising. Using the ideals and standards described in the Mezzo Greypaper, they have designed a GUI with no file manager, hot corners, integrated search (via a Linux application named “Beagle”, not as powerful as Spotlight, but not bad either), and tight web integration.

The Mezzo Interface

Parts is Parts

The perfect user interface would incorporate all of the components listed above; Mezzo desktop, integrated search, and a Quicksilver like manager to bring it all together. I would like to propose a new UMPC, or Tablet PC, or Navi, or whatever you would like to call it. I’ve become fond of calling it the Farmdog.

The Original Farmdog Idea

Farmdog is a type of hybrid, addressing the shortcomings of the major computer systems, and designed for the user, not the computer programmer. Using a touch screen and a stylus (or tablet pen), the user can interface with the SymphonyOS using the Mezzo desktop’s hot corners to bring up different functions or access different parts of the computer. For example, moving the stylus into the lower left-hand corner of the screen would bring up a full screen menu of the installed programs. Bringing the stylus to the upper left-hand corner would give the user a menu of available documents and an integrated search box. Likewise, if something is no longer needed on the system, the user could drag the item to the lower right-hand corner and be rid of it. No aiming, no hassle, no problem. Accessing Quicksilver would be as easy as a tap on the screen with the stylus. Using handwriting recognition technology, the user could simply tap on the screen to bring up Quicksilver, write an S, then a line to the right, then an O, followed by another tap to launch the Safari web browser. Likewise, the user could tap on the screen, write an R and a P, then tap again to open the file named “Research Paper”. All files, emails, applications, bookmarks… everything available from one simple tap of the pen.

The Art of Tech

One other technology that I would borrow from Apple’s Mac OS is called Exposé. Exposé is a window management system that allows the user to see all open windows as thumbnails and then choose which one is needed. The following two pictures illustrate Exposé.

Exposé

Utilizing this technology on a small hand-held computer would make the headache of managing multiple application windows on a small screen a thing of the past. Even better, Exposé can be activated via a hot corner, say the upper right-hand corner. So, the user has ten applications open, and needs to copy some text out of a document for inclusion in an email. Simply drag the stylus to the upper right-hand corner to activate Exposé, select the document, copy the text, go back to the upper right-hand corner, select the email application, paste, and you’re done. It may not be as fast as other methods, but it is far more natural.

Solutions

Software is only half the story. It is the yin to hardware’s yang. Farmdog’s hardware is designed to deal with the current shortcomings of computer design by using everyday technology in radically new ways.

Today, operating systems are distributed on a CD-ROM or DVD-ROM and loaded (copied) onto the user’s computer hard drive. This system is almost as old as computers themselves, with only the delivery method changing (floppy disks to CDs to DVDs). The problem is that during an OS upgrade, you run a very real risk of losing the data on the hard drive. And what if there is a problem with the new OS? Unless you have copied your data over to another hard drive somewhere else, there is no way to go back a revision. Farmdog is different.

Farmdog would have two drives. One removable flash based drive for the OS, and another standard hard drive for the user’s data. This system would have a very distinct advantage over CD-ROM based delivery methods. For one, with the entire OS on the flash drive, the user could switch between operating systems by simply shutting down the Farmdog and replacing the flash drive. In this way, the user would not have to worry about lost or damaged data; the data would be on a completely separate drive. The hard drive would also be easily removable via a slot on the right hand side of the computer. I can imagine distributing a simple hard drive replicator to backup data even further.

Nothing New Under The Sun

I really enjoy computers. I’ve enjoyed them even more since switching to Mac. I enjoy them so much that I’ve made a career out of knowing as much as possible about how they work, and what makes them tick. Farmdog is about putting a little piece of soul back in the box, finding the ghost in the machine and setting it free. Computing without thinking, finding without doing. Farmdog wants to be your best friend.


iOS 6 Headphones

October 8, 2012

I have been pleasantly surprised by one small enhancement in iOS 6 that probably affected a very small number of people. I drive a 2006 Saturn Ion that has an auxiliary port in the car stereo for plugging in things like iPhones. I have about a half-hour drive to work in the mornings, and I listen to podcasts downloaded with Instacast. Since I want to control the audible volume with my car stereo knobs, and I want the best possible signal from my iPhone, I turn the volume up to maximum for the drive.

This works great, but I also have a ten to fifteen minute walk from the parking garage in Des Moines to the office where I work, so I unplug from my car and plug in Apple headphones to listen to the podcast a little more while I walk. However, I would frequently forget how the volume buttons work on the iPhone. When I unplug from the car stereo, iOS tells Instacast to stop playing. So, when I unplug from the car and then turn down the volume, I’m not actually turning down the volume for Instacast, only the volume for the phone’s ringer. I can’t quite get my head around that; I’m not sure why we would not want one volume to rule all audio. After turning down the volume, putting my headphones in my ears, and hitting play on Instacast again, I would get my ears nearly blown out by iOS turning the volume up to full blast again.

It makes sense, in a way, but it does not agree with my mental model of how I think it should work. I think that the phone should have only one volume control, for everything. So, after running into this a few times, and occasionally forgetting about it, I trained myself to keep my headphones out of my ears until after I had started Instacast again and turned down the real volume. Until I updated to iOS 6, and I noticed something a little different.

It took a few times to notice the change, and when I did notice it I watched it to make sure it was doing what I thought it was doing. I would unplug from the car, plug in my headphones, start Instacast, and reach for the volume buttons, only to find I could not hear anything: the volume was already down at around half. Then I noticed that when I got back to my car and swapped the other way, I did not have to turn the volume back up again. iOS 6 remembers the volume setting for different headphones.

It may only remember the setting between Apple iPhone headphones with the microphone and control on the right ear bud wire and a regular 3.5mm TSR to TSR plug, but it is still such a nice addition. Once I realized the change, it made me smile. Apple has removed one small point of friction from iOS, and it is the kind of change that shows their focus on everyday use, and not just technical excellence.


ArcDown - My First Open Source Project

September 19, 2012

Part of a Farmdog project I’m working on needs nice syntax highlighting for Markdown. After searching around for a bit I found Ali Rantakari’s PEG Markdown Highlight project, and knew that it would be a perfect fit. Unfortunately, the code was not written for ARC, or Automatic Reference Counting, and my project was. Rantakari’s code worked fantastically outside of ARC, but inside it needed a few days’ worth of love and attention.

I tried to dig through and fix the errors in my project, but after a while it seemed like a better idea to rip it out, start a blank Xcode project and do all of the fixing there. Thus, ArcDown was born.

ArcDown is a reference project, intended to be an example of how to use PEG Markdown Highlight in your project. It’s far from perfect, not even close to finished, and is not going to replace MacVim for me any time soon, but it is a fun project to work on at night. If you are interested, ArcDown is released under the MIT license, so, fork away.


The Computer User I Want To Be

August 11, 2012

Learning about computers can be a dangerous thing. Breaking through the veneer of graphical interfaces reveals inefficiencies and inaccurate metaphors. For example, rsync copies files faster and uses fewer resources than the Finder. Copying lots of files is what rsync does best, but because it is a command line power tool there are a few subtleties to using it that are not readily apparent. As your skill grows, so too does the tendency to eschew modern tools in favor of “power tools”. You begin to see the inefficiencies of graphical tools as problems, problems that you need to fix. I’ve been down that road.

Power tools are impressive, but they also lend themselves to fiddling, spending more time configuring the tool than actually using it to get work done. Or worse, you have to stop what you are doing to think about how to accomplish the next task. I was reminded of this as I was writing the first two New Mac Essentials posts, and realized that I needed to reread my own Principle of Least Software.

Use only the software that you need. No more, no less.

I was recently in Chicago attending training for Hadoop. I used DEVONthink to take notes and sync them over to my phone. I also used it to collect PDFs and web archives of documentation. It worked fairly well, but after thinking the experience through I found very little that I did with DEVONthink that I could not do with TextEdit and the Finder. I’ve obsessed over the ability to sync data to my phone, but the truth is that there have been very few times that I actually used that data.

DEVONthink was on my list of Mac Essentials, but I’m removing it. I have spent far too long thinking about how files are stored and the most efficient way to get to them. I’ve spent too long thinking about how to “manage a project”, instead of moving on to the next task in the project. When I consider my own tendency to overcomplicate my computer use, I realize that being the fiddly geek who spends his time tweaking his .muttrc file is not the guy I want to be. I believe in learning your tools, and learning them well, but I also believe in using as little software as possible. Most of all, I believe in using your computer as the tool it was intended to be. A bicycle for the mind.

When I bring my computer to life in the morning I want as little friction as possible between me and the tasks I need to accomplish. I can’t afford to think about the most efficient way to store a task, or file a PDF, or title an email. I simply need to do the task, read the document, and write the email. So, I’ve spent a good portion of today reorganizing my files, removing unnecessary applications, and streamlining my process. Time spent sharpening the saw is time well invested.

The kind of computer user I want to be is the kind who uses the simplest tool available, and does so with speed, accuracy, and finesse.


Make it Matter

July 30, 2012

You do what you do because it matters. At HP we don’t just believe in the power of technology, we believe in the power of people, when technology works for you, to do the things that matter; to dream, to learn, to create, to work. If you are going to do something, make it matter.

Well done, HP. Certainly much better than the recent Apple commercials.


New Mac Essentials - MacVim

July 29, 2012

Investing time learning a text editor is a serious commitment. Over time, you find yourself reaching for the editor’s built-in shortcut keys everywhere you type. In my case, I do almost all of my writing in MacVim. Unfortunately, MacVim comes with a fairly steep learning curve that many are unwilling to tackle. Part of the complexity of Vim, from which MacVim is derived, is the configuration. Over the years I’ve come up with a setup that works for me.

Downloads

Included Plugins

Configuration

My configuration is kept in Dropbox in a folder named Vim. I create three symlinks in my home directory.

ln -s ~/Dropbox/Vim ~/.vim
ln -s ~/Dropbox/Vim/gvimrc ~/.gvimrc
ln -s ~/Dropbox/Vim/vimrc ~/.vimrc

MacVim is customized through plugins. The history and legacy code behind the plugins make them unwieldy to maintain, so the first plugin I install is one to manage other plugins. Pathogen by Tim Pope allows you to use Git to install other plugins, and keep them nicely organized in ~/.vim/bundle. The GitHub page for Pathogen includes simple installation instructions. From here, to install other plugins, clone that plugin’s Git repository into the bundle directory.
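
For example, installing a plugin under Pathogen is a single clone into the bundle directory (vim-commentary is just an illustrative choice; any plugin repository works the same way):

```shell
# Pathogen adds each subdirectory of ~/.vim/bundle to Vim's runtime path,
# so installing a plugin is just cloning it into place.
mkdir -p ~/.vim/bundle
cd ~/.vim/bundle
git clone https://github.com/tpope/vim-commentary.git
```

Removing a plugin is just as clean: delete its directory under ~/.vim/bundle and restart MacVim.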

I have found that the list of plugins I use changes every so often according to what language I’m working in or what task I’m working on. Vim is much like Linux… infinitely tweakable, and, if you are not careful, it can turn into a vast time sink. However, if you can curb your nerd impulse to optimize endlessly, and find a configuration that works for you, Vim will be your constant companion, always there for you when needed.


Living In The Technical Past

July 27, 2012

Bruce Lawson has a few interesting things to say about computers.

You may think it a badge of honour that you can do “sudo dpkg -i --force-all cupswrapperHL2270DW-2.0.4-2a.i386.deb” from memory. I think you’re burying your turds with a trowel in a thunderstorm.

It’s a good article, well worth a read. I’m a systems administrator with a degree in Human Computer Interaction, so what Bruce has to say about command line and graphical applications is right up my alley. The field of systems administration has not changed much in the forty-odd years since Unix was released upon the world. While I have not been using a computer for that long, thanks to the Navy I have seen the advancement of technology firsthand, from vacuum tubes and punched paper tape to the MacBook Air I’m typing this on. I still need to open a terminal emulator and type arcane spells in strange fonts, whispering curses against whoever decided to design the config file that way.

Spending time in the terminal over several years gives me a feeling of familiarity with the environment. I know my spells well, and can generally accomplish what I’m setting out to do with little hassle. However, simply because I am familiar with a setup does not mean that it is the best, just the one I know how to use. I have railed against GUIs as a sysadmin, and with good cause. Most GUIs designed for back-end systems are horrible. Configuring a switch through a gigantic Java application that uses no native controls, is slow, and is unstable enough to crash regularly is a terrible way to work. Little to no thought is put into the usability of these systems, so of course I would rather use a command line. Building a well designed application is hard.

I’ve been reading the Steve Jobs biography, and what struck me in the first portion of the book was how hard the original Macintosh team worked to make a computer easy to use. Jobs was famously demanding to work for, and would become furious when complexity was exposed to the person using the computer. The sysadmin field needs a Steve Jobs and an Apple, because currently the best we can come up with is increasingly complicated scripts to automate the tasks we need to do on the servers. If a task can be scripted, it can also be designed, animated, and brought to life in the modern age of computing.

The command line fetish does not end in the data center though. Sparrow was recently the subject of a kerfuffle, and the state of desktop email clients was discussed on Build & Analyze. The discussion prompted this quick exchange between myself and Seth Brown:

Sparrow => Mutt
  — Seth Brown (@DrBunsen) Sat Jul 21 2012 5:41 PM CDT
@DrBunsen assuming the Mutt learning curve, I suppose.
  — Jonathan Buys (@ibuys) Sat Jul 21 2012 6:20 PM CDT
@ibuys I’ve used Mutt before, so I’m comfortable with it, but now I’m going to try and use it full time; hooks to Address Book will be key.
  — Seth Brown (@DrBunsen) Sat Jul 21 2012 6:41 PM CDT

Seth is a scientist, and no slouch when it comes to thinking through computer interaction. I value his opinion, so it will be interesting to see what he comes up with. I too have used Mutt as my email client in the past, and became frustrated with basic things like dealing with attachments. My frustrations are not unique, and they are the reason graphical clients like Apple’s Mail exist: to make the task of reading and sending email easier.

Mutt is powerful, and when properly configured can ease the burden of dealing with large amounts of email, but at a cost. To me, using Mutt always felt like I was tossing out twenty years of HCI research and design. Unlike data center applications, consumer facing applications have advanced greatly in usability.

Being a “power user” does not mean you need to disregard graphical applications. It means you learn whatever application you decide on inside and out.


Mountain Lion Reviews

July 25, 2012

OS X 10.8 Mountain Lion was released today, accompanied by a handful of reviews by the best tech sites.

  1. OS X 10.8 Mountain Lion: the Ars Technica review, John Siracusa
  2. Mountain Lion, John Gruber
  3. Mountain Lion and the Simplification of OS X, Shawn Blanc
  4. Apple releases OS X Mountain Lion, Jim Dalrymple
  5. OS X Mountain Lion, MacWorld

Just for kicks, don’t miss Marco Arment’s Review of John Siracusa’s Review of OS X 10.8 Mountain Lion, as well as John Siracusa writing about writing his Mountain Lion review.

As for me, this is the first major release of OS X for which I do not see a compelling reason to upgrade right away. I plan to give it a day or two first.


The end of do-it-yourself - TechHive Beta Blog

July 24, 2012

Link

It makes sense that you’d need a special tool or kit to replace a cracked screen, but why should I have to send away my laptop in order to upgrade the hard drive? Why should I have to be without my phone or tablet for a week while the battery is replaced because it will no longer hold a charge?

A computer becomes more useful the smaller and faster it is. No one, other than geeks, ever cared how computers were put together, or how they worked. They only cared how they could use them to design a building, or research brain injuries, or plan a trip to Africa to drill a well. Real work.

The argument for repairable hardware is similar to the argument for open source; it misses the point of computers.


Should All Software Be Free

July 22, 2012

Introduction

We live in the information age. Digital devices and Internet connected, handheld computers are the prevalent way we communicate. The price of computers and of access to the Internet has dropped, and the availability of publicly accessible Internet connected computers has risen. Schools across the country are providing computers to their students, some as early as sixth grade, and public libraries have been equipped with computers and, in some cases, free wireless Internet access. With the prevalence of computers of all shapes and sizes across nearly all parts of our society, questions about their ethical use and about the purpose and place of computers in our lives have arisen. One such question that has been debated since the early 1980s is “Should all software be free?”

“Free” in the English language is a fairly relative term. The New Oxford American Dictionary contains eight definitions of the word “free”, as well as an additional two adverb uses of the word. In the context of the question above, “Should all software be free”, the obvious meaning of the word is the fifth definition, which reads “given or available without charge”. However, the more academically interesting, accurate, and perhaps even subversive meaning of the word free is the first definition, which reads “not under the control or in the power of another; able to act or be done as one wishes”. Most computers shipped today come pre-installed with software that does not fall under either of these definitions of free, but should it? From a purely ethical standpoint, should the user of software be able to copy, modify, and redistribute software as he sees fit? What are the social implications of an en masse migration to free software?

There are many answers to this question, depending on who you ask. On one end of the spectrum are large proprietary software companies like Microsoft, Adobe, and Apple. These companies view software the same as they would a physical product, like a toaster. They design, engineer, and test the software, then package it and sell it to consumers to run on their computers.

On the other end of the spectrum is the Free Software Foundation, founded by Richard Stallman, who evangelizes the philosophy that all software, independent of the original author, should be free of restrictions.

History

In 1983 Richard Stallman was working as a programmer in the artificial intelligence lab at the Massachusetts Institute of Technology (MIT). By this time in his career, he had already garnered a certain amount of recognition in the small but burgeoning hacker community as a talented developer, largely due to his creation of the EMACS text editor, and his academic papers on artificial intelligence. Stallman embraced the openness and sharing of the hacker community, and found an ethos that would shape his career in the years to come. Towards the end of his work at MIT, Stallman found an increasing amount of proprietary software in use where he worked. One example in particular was a new printer installed on the network, whose source code he was unable to gain access to. With a previous printer, he had been able to expand its functionality to send messages when a printing job completed. Stallman’s inability to enhance the new printer, due to the company’s unwillingness to share source code with him, was instrumental in convincing him that proprietary software was ethically wrong. Stallman recalled the beginnings of the GNU project at a talk he gave at Google:

“So I found myself in a situation where the only way you could get a modern computer and start to use it was to sign a non-disclosure agreement for some proprietary operating system. Because all the operating systems for modern computers in 1983 were proprietary, and there was no lawful way to get a copy of those operating systems without signing a non-disclosure agreement, which was unethical.” (Stallman, 2004)

Shortly thereafter, he started the GNU project.

GNU is a recursive acronym for “GNU’s Not Unix”, a play on words to indicate the purpose of the project: to create a Unix-like operating system that is freely available to anyone. The project was announced in late 1983, and officially started in early 1984. Stallman created a debugger (gdb) and a C compiler (gcc), and ported his popular text editor EMACS to the project as GNU EMACS. Launching the GNU project officially started the Free Software Movement, and Richard Stallman created a non-profit corporation named the Free Software Foundation to support the objectives of the new movement. (Stallman, 2010)

The GNU project worked for the next several years to develop the operating system, but was unable to successfully develop a reliable kernel, the core of the system. In 1991, an unexpected answer to this problem came in the form of a Finnish college student named Linus Torvalds, who developed a clone of an educational version of the Unix kernel and named it Linux. Linus licensed his new kernel under the GNU GPL, and combined his new kernel with the GNU userland tools to create a fully functional operating system, properly named GNU/Linux.

The Free Software Foundation defines four essential “freedoms” that all people using software should have the right to enjoy. In typical hacker fashion, the freedoms are numbered starting at zero, a common programming practice. The four software freedoms are:

  • Freedom 0: The freedom to run the program, for any purpose
  • Freedom 1: The freedom to study how the program works, and change it to make it do what you wish. Access to the source code is a precondition for this.
  • Freedom 2: The freedom to redistribute copies so you can help your neighbor
  • Freedom 3: The freedom to distribute copies of your modified versions to others. By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this.

In order to meet the Free Software Foundation’s definition of free software, an application’s licensing must meet all of these requirements. The FSF maintains a list of licenses that they find meet the definition of free software on their web site. The four freedoms are devised to give the user of the software complete control over their computing environment. For example, in an office environment where there are several computers, free software would enable the users to modify the application to suit their needs, and install the application on as many computers as they wished, without having to worry about additional software licensing or the possibility of breaking a contract with the developers of the software.

While the GNU project was founded to recreate a Unix-like operating system from scratch, another project derived its source code directly from the original Bell Labs Unix. During the late 1970’s, the University of California, Berkeley worked closely with Bell Labs on developing the Unix operating system, sharing source code and fixes between the two. Berkeley’s version of Unix became known as the Berkeley Software Distribution, or BSD, and was distributed to colleges along with a license. When AT&T began marketing Unix commercially to clients, the focus of development shifted to a stable, proprietary model, and AT&T changed the terms of the source code license to charge universities a substantial fee for access. Around this same time, Berkeley independently developed a TCP/IP networking stack for Unix, combined it with their BSD version, and made the source code available for a substantially lower fee. Encouraged by other universities and people interested in BSD, Berkeley continued rewriting the utilities developed by AT&T for inclusion in BSD.

Through several iterations, splits, and rewrites of source code utilities and kernel files, there eventually emerged three versions of BSD: NetBSD, FreeBSD, and OpenBSD. There was also a fourth, commercial version from BSDi, based on the earlier work of Berkeley and its own rewritten kernel. (Bretthauer, 2002) Although BSD can clearly trace the ancestry of its code back to the original Unix of Bell Labs, due to a legal complication no version of BSD can officially be called “Unix.” The Unix trademark is held by The Open Group, while the Unix copyrights belonged to Novell, which was recently purchased by Attachmate.

The commercial version of Unix was adopted by several vendors and is still actively sold and supported: by IBM as AIX, by HP as HP-UX, and by Oracle as Solaris. Before being acquired by Oracle, Sun Microsystems released the source code of Solaris as OpenSolaris. Since the acquisition, the OpenSolaris project has been rumored to face disbandment in the near future, and in response the community has spawned the Illumos project to continue development of the released code.

BSD, along with the Mach kernel, also provides the core of both Apple’s Mac OS X and iOS operating systems. 4.4BSD was incorporated into NEXTSTEP, the operating system developed by NeXT. Apple acquired NeXT in 1996 and began incorporating NEXTSTEP into the Mac OS. Mac OS X Developer Preview 1, based on NEXTSTEP, and through it on BSD, was released in May 1999. (Singh, 2003) In 2007, Apple released the iPhone, running a stripped-down version of OS X that was later renamed iOS. iOS and Mac OS X share a common ancestry that maps back to BSD, and from there to Bell Labs and the original Unix.

Software Licenses

Although Mac OS X shares its history with the BSD variants, only a limited subset of its source code is available outside of Apple. Apple made significant changes to the core source files of BSD and released its version in a limited fashion as Darwin. Until 2007, corresponding with the release of Mac OS X 10.5 Leopard, Apple offered Darwin as a downloadable disc image (ISO) that could be burned to a CD-R. After that point, Apple released only the source code required by the licenses of the open source tools included in Mac OS X and iOS. Apple took the BSD operating system, wrote its own tools and layers on top of it, repackaged it, and sold the new operating system as its own. It was able to do this because of the liberal BSD license, which states:

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

  • Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
  • Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. (“Open source initiative,” n.d.)

The BSD license places no restrictions on how the source code or binary programs are used or distributed, requiring only that redistributions retain the copyright notice, the license conditions, and the disclaimer. This license reflects the academic roots and philosophy of the developers of BSD, who wished to make the system as open to contribution as possible. It is very different from the license adopted by the GNU project, which based its license not only on its philosophy that software should be free of restrictions, but also on its own moral code.

To enforce the four essential freedoms, the GNU project created the GNU General Public License, or GPL for short. The GPL is a legally enforceable copyright license that protects the rights of users to create, modify, and distribute free software. The GPL also restricts what developers can do with code that is already covered by the GPL: it explicitly prevents developers from adding to, or deriving from, GPL code to create a new product without also covering the new code under the GPL. This restriction gives the GPL a viral aspect, as it can be seen to infect all other code it touches. If Apple had based Mac OS X on a core of GNU/Linux instead of BSD, it might well have become entangled in a costly legal battle over the right to distribute its code in binary form.

To comply with Freedom 0 as defined by the Free Software Foundation, an application must place no restrictions on when, where, or for what purpose it may be run. (Stallman, 2004) This means that any form of license key enforcement or digital rights management would be prohibited. In contrast, the iPhone contains software that can be used for only one purpose, as section 2a of the iPhone Software License Agreement states:

Subject to the terms and conditions of this License, you are granted a limited non-exclusive license to use the iPhone Software on a single Apple-branded iPhone.

The definitions of freedom offered by the Free Software Foundation rest on the assumption that computers are central to a person’s well-being, and that the user of any computer therefore has a natural right to full and complete access to it. However, it is my position that computers, or any other form of technology, increase the personal freedom of the user only in proportion to the increase in the user’s overall quality of life. If the user possesses no knowledge of programming languages, then access to the source code does him little good. However, the user can hire a programmer to modify the source code for him, or band together with other users to raise a larger sum for the programmer, depending on the difficulty of the change requested.

Intellectual Property

In chapter 4, section 2 of the textbook Ethics for the Information Age, intellectual property is defined as “any unique product of the human intellect that has commercial value.” (Quinn, 2009)

This concept of property is derived from John Locke’s Second Treatise of Government, where Locke states that people have a right to their own labor, and to things they have removed from nature through their own labor. The text then gives an example of how this right can be misconstrued when applied to intellectual property, using William Shakespeare writing Hamlet. If Shakespeare writes Hamlet in a pub one night after listening to rumors of royal intrigue, then it is agreed that the play is the result of his labor, and therefore he should have a right of ownership to it. However, if Ben Jonson listens to the same rumors in a separate pub across town and then, simultaneously with but independently of Shakespeare, writes Hamlet, the text claims that ownership of the intellectual work is in question. There were two authors but only one work, which creates a paradox when viewed in light of Locke’s reasoning.

The text is using flawed reasoning in this example. According to the text, “even though Jonson and Shakespeare worked independently, there is only one Hamlet,” but the premise that two creative people working independently could produce the exact same work is implausible. The actual outcome of the Shakespeare example would be two plays: Hamlet, and a very similar, but different, play written by Jonson. The text appears to be begging the question, since it assumes the possibility of identical creative works from the start.

If we accept that software is a creative work, similar to writing, art, or music, then it logically follows that the original author of the software is entitled to some form of ownership.

Copyright law dates back to the original printing press and the first ability to create copies of creative works quickly and efficiently. The first copyright statute, the Statute of Anne, was passed by the English Parliament in 1710, recognizing an original author’s right to his creative work. (Ballon, & Westermann, 2006)

The GNU project takes a contrary stand on the ownership of software. Richard Stallman, in his essay “Why Software Should Not Have Owners,” claims that authors of software can claim no natural right to their work, citing the difference between physical products and software and rejecting the notion of a tradition of copyright. Stallman uses the example of cooking a plate of spaghetti to explain the difference between software and physical products:

“When I cook spaghetti, I do object if someone else eats it, because then I cannot eat it. His action hurts me exactly as much as it benefits him; only one of us can eat the spaghetti, so the question is, which one? The smallest distinction between us is enough to tip the ethical balance. But whether you run or change a program I wrote affects you directly and me only indirectly. Whether you give a copy to your friend affects you and your friend much more than it affects me. I shouldn’t have the power to tell you not to do these things. No one should.” (Stallman, 2010)

However, Stallman does not address what gives the second person who receives the software the right to benefit from the author’s work without giving something in return.

Ethical Frameworks

Before the industrial revolution, most people learned a skill and worked for themselves in small communities. A single village would contain all of the skill sets necessary to sustain itself, and each member of the community would apprentice into a particular trade to contribute and earn a living. The industrial revolution pushed skilled workers into factories and assembly lines, work that was both distasteful and demeaning to an artisan of the craft. However, corporations were able to reduce costs and increase profits, and the model has persisted into current work environments.

In the information age, the assembly-line mindset has created oceans of cubicles filled with programmers who apply their skills to small parts of large software projects, sometimes to great success, but far too often ending in failure. The Internet and the popularity of lower-priced computers have created a market for high-quality third-party software, the kind created by someone with a passion for what they are doing. This passion comes from learning a craft and using that skill to earn a living, just like the workers from before the industrial revolution. Instead of living physically in small villages, these new-age artisans live online and create communities built around social networking. (van Meeteren, 2008)

In many ways, this is a return to a more natural way of life and a simpler form of commerce. One person can create an application and sell it, and another person can buy it from him. The seller benefits by being able to purchase shelter, food, and clothing for his family, and the buyer benefits from the use of the software. It is a very simple transaction, and a model that is not adequately explained in the GNU essays. If all proprietary software is wrong, then an independent developer who sells software as his only job is also wrong.

GNU supporters could argue that there is nothing stopping the programmer from selling his software, provided that once it is sold he gives away the source code under a license that permits redistribution. At that point, selling the original program is no longer a viable business model. A programmer cannot continue to sell his software when the user can, and is encouraged to, download it from somewhere else for free.

While purchasing software you intend to use may be the ethically right thing to do, ethics alone are often insufficient motivation for people to spend their money. If supporting the development of the software is entirely optional, then purchasing it becomes a choice the user can make on a whim, with no real weight on the user’s conscience either way. GNU and the GPL place this decision squarely on the user, and encourage users not to feel obligated to pay.

The ethics of open source come into question when adhering to the free software philosophy results in an independent developer being unable to support a moderate, middle-class lifestyle by developing a relatively popular application. Kant’s first formulation of the categorical imperative asks what would happen if all developers gave away the source code of their applications for free. With the developer as the agent, his maxim would be giving away the source code of the application he developed to earn money. In an imaginary world where all developers did this, the quality of software would sink to the lowest common denominator of acceptability. Each developer’s motivation would be to develop for himself, and, since he would need to find a source of income elsewhere, only in whatever free time was left to him. This would result in a wide variety of available software with very little integration, testing, or source control, mirroring the current state of GNU/Linux-based desktop operating systems.

Current software companies would move to a business model built around providing support to customers of their software. Competition, and therefore innovation, based purely on software features would decrease, since the source code of any feature one group developed could easily be copied and integrated into competitors’ products.

From a utilitarian point of view, proprietary software has clearly produced more pleasure for more people than open source has up to this point. Open source software is often more complicated, more difficult to learn and maintain, and harder for the average computer user to use. Apple produces proprietary software and hardware, and states its mission as making “the best stuff.” Using its position as a leading software company, and leveraging its control over its computing environment, including iPads, iPods, iPhones, and Mac computers, Apple has been able to successfully negotiate deals with entertainment companies. These deals allow the consumer to download music, television shows, and movies from the Internet, watch them on any Apple-branded device, and output the media to a television or home stereo system. Because of the restrictions imposed by Digital Rights Management, open source or free systems have not been able to provide this level of entertainment.

Conclusion

To answer the question of whether all software should be free, this article has examined a brief history of open source software, the concept of intellectual property, and finally an ethical analysis of the concepts of open source. I have found that there is a strong connection between an original author and their work, but no evidence or compelling argument that all software, or all forms of any genre of creative work, should be free of restrictions. I have found that open source and free software benefit the public, as cited numerous times in the essays of Richard Stallman and the GNU project. Free software enables the user to learn the intricacies of how the software works and to modify the software to suit his needs. Free software also provides a legal and ethical alternative to expensive proprietary software in developing nations, or in areas where the cost of obtaining a license for legal use of the software is prohibitive. Public institutions, like schools and government offices, where the focus of the organization is the public good, have the option of using software that is freely available and not controlled by any one company. Free software also gives the user the option to “help their neighbor” by giving them a copy of the software, without having to worry about payment or licensing restrictions.

I have also found compelling evidence that proprietary software is beneficial to the public, as well as respectful of the original author’s rights regarding their creative work. Software is the result of a person’s labor; no matter how easy it is to copy that work, the author still retains a natural right of ownership, according to John Locke’s Second Treatise of Government. Proprietary software enables products like the iPad, which is being used to help elderly people, nearly blind with cataracts, create creative works of their own. (Newell, 2010) The iPad is also being used by caretakers of severely disabled children to help them communicate and express themselves. (Hager, 2010) It is possible that the iPad would still have been created if the software powering it had been free, but that is unknown. What is known is that the net result of the device is to better people’s lives, which is the true purpose of technology. Any technology is merely an enabler that lets us get more satisfaction and enjoyment out of life.

The free software movement exaggerates the importance of a specific type of freedom, without addressing the proper place of technology in our lives.

The existence of free and open source software alongside proprietary software creates a mutually beneficial loop, wherein consumers and developers reap the rewards of constant innovation and competition. I have found that there is a place for both proprietary and free software, and that the author’s natural right to their creative work gives them the freedom to choose how and why their software will be distributed.

References

Ballon, H, & Westermann, M. (2006, December 1). Copyright ownership in works of art and images. Retrieved from http://cnx.org/content/m13912/1.2/#footnote1

Bertot, J, McClure, C, & Jaeger, P. (2008). The impact of free public internet access of public library patrons and communities. Library Quarterly, 78, 285-302.

Bretthauer, D. (2002). Open source software: a history. Information Technology and Libraries, 21(1), 3-11.

Crawford, M. (2009). Shop class as soulcraft; an inquiry into the value of work. New York, NY: Penguin Group.

Hager, E. (2010, October 29). iPad opens world to a disabled boy. Retrieved from http://www.nytimes.com/2010/10/31/nyregion/31owen.html

iPhone software license agreement. (n.d.). Retrieved from http://www.apple.com/legal/iphone/us/terms/sla.html

Kain, R., & Bruce, I. (2010, November 22). Novell agrees to be acquired by Attachmate Corporation. Retrieved from http://www.novell.com/news/press/novell-agrees-to-be-acquired-by-attachmate-corporation/

Newell, C. (2010, November 4). Life is a limerick for centenarian Virginia Campbell. Retrieved from http://www.portlandtribune.com/features/story.php?story_id=128882605915653000

Open Source Initiative OSI - the BSD license. (n.d.). Retrieved from http://www.opensource.org/licenses/bsd-license.php

Quinn, M. (2009). Ethics for the information age. Boston, MA: Pearson Education Inc.

Singh, A. (2003, December 1). A brief history of Mac OS X. Retrieved from http://osxbook.com/book/bonus/ancient/whatismacosx//history.html

Stallman, R. (1977). Forward reasoning and dependency-directed backtracking in a system for computer-aided circuit analysis. Artificial Intelligence, 9(2), 135-196.

Stallman, R. (2004, June 11). Gnu and the free software foundation engineering tech talk at google. Retrieved from http://www.gnu.org/philosophy/google-engineering-talk.html

Stallman, R. (2010, November 12). The GNU project. Retrieved from http://www.gnu.org/gnu/thegnuproject.html

Stallman, R. (2010, November 14). Why software should not have owners. Retrieved from http://www.gnu.org/philosophy/why-free.html

Teli, M. (2010). Collective ownership in free/libre and open source software: the opensolaris case. Conference Proceedings of JITP 2010: The Politics of Open Source, 138-159.

van Meeteren, M. (2008). Indie fever; the genesis, culture and economy of a community of independent software developers on the macintosh os x platform. Informally published manuscript, Human Geography, University of Amsterdam, Amsterdam, Holland. Retrieved from http://indie-research.blogspot.com/

Williams, J. (2009, December 6). Free computers given to students. Retrieved from http://articles.baltimoresun.com/2009-12-06/news/bal-ho.computers06dec06_1_bright-minds-foundation-computer-refurbishing-organization-freshmen-laptops