jb… a weblog by Jonathan Buys

NKOTB

April 9, 2009

When I was younger, 20 years ago or so I suppose, the New Kids On The Block (NKOTB) were synonymous with the cheesy pop music that was taking over our airwaves. They weren’t just a pop boy band, they were The pop boy band. I remember that if you were a guy and an NKOTB fan, it instantly meant you were gay, and for an adolescent boy just shy of his teens, that meant there was absolutely no way this band’s music would ever willingly land on his ears. Rumors circulated regularly about the NKOTB being caught, red-handed no less, in one disgusting sexual tryst or another. One in particular that I remember was that one of the members had to have his stomach pumped because he was sick, and they found that it was full of sperm. That was the stigma attached to this band.

So, I grew up, and I found that I much preferred classic rock and grunge to anything else. As I’ve grown I’ve learned to appreciate many different types of music, everything from classical to the Beastie Boys. My iPod will shuffle from the Grateful Dead to an old Prodigy mix. However, I’ve never been a fan of pop music. Boy bands went away for a while, and for the most part I forgot all about them.

I was kind of surprised when my wife told me this blast from the past was coming to Des Moines. When she later told me that she wasn’t going because she couldn’t find anyone to go with her, I felt bad for her and told her that I’d take her. What I found at the concert surprised me.

The NKOTB are five guys from Boston who work really hard to put on a good show. They are talented singers and dancers, and they know how to keep the crowd entertained. They really are a great band, and when I look back at what they had to go through, I think they were probably unfairly ridiculed. We were lucky enough to get seats on the (nearly empty) stadium floor, and were able to get within six feet of the band as they performed a few of their songs in the center on a rising platform. When you get that close to a person, close enough to look them in the eye and have them look back at you, it’s hard to be critical. Much harder than listening to something on the radio or seeing something on MTV. I had an important revelation as I watched my wife go nuts over these guys.

I can’t do what they do.

I can neither sing nor dance. I’m fairly certain that if I ripped off my shirt in front of a crowd I’d be politely asked to put it back on again. How can I be so prejudiced against a group of guys who can do things that I cannot? I can’t, really. The only thing I can say is that I’m not a fan of their sugary pop style of music. I can’t say that any of them is any less of a man than I am, and to be honest, in some ways they are more, for having the guts to do what they do in the first place. The public eye can be vicious.

So, what did I do that night? I looked a legend of the past in the eye, and gave them my respect. I don’t know if they are going to have a lot of success with this comeback, but I hope that things work out for them.

I also take it as a sign that I have grown to overcome the adolescent prejudices of my past. I can appreciate the work and talent that goes into making the music and the shows, even if I don’t care for the style of music itself. I’m not about to trade my collection of bootlegged Dead music for pop, but I’m glad that I’ve grown to the point where I feel I’ve come into my own. My own preferences, for my own reasons.

Hey, life’s too short for hate. And, just as proof, here I am with the beautiful woman who brought me, who loved the show so much she started screaming like she was twelve years old again:

Happy Birthday, babe.


Spit and Polish

March 31, 2009

After spending a week with Linux as my sole computer, I find it very refreshing to come back home to my Mac. FedEx says my wife’s PC should be here tomorrow, so she can go back to Word 2007 and I can have Mactimus Prime back. It’s not that I didn’t enjoy working with Linux (I did), but I’ve found that once the geeky pleasure of discovering something new wears off, there are problems.

I gave up on using Lotus Notes as any type of information manager a few months ago, so anything that comes in as an email is treated either as a to-do or as reference material. To-dos are deleted when they are finished, and reference material is printed to PDF for indexing by the operating system. Or, it would be indexed if Linux had a decent indexing system. Tracker, the default indexer installed with Ubuntu, seems incapable of properly indexing the large PDF files that I accumulate. I’ve done quite a bit of research lately concerning high availability, especially where it deals with DB2, but a search for the term “HADR” in Tracker finds nothing. Maybe there’s a setting that needs tweaking, maybe I need to read the man page a little deeper, but all I know is that it doesn’t work quite right yet.
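
Just to rule out the PDFs themselves, a quick check from the terminal shows the term really is in there. This is only a manual sketch using pdftotext (from poppler-utils) and grep, with an assumed directory path, not a fix for Tracker:

# List any PDFs under an assumed reference directory that mention the term
for f in ~/Documents/reference/*.pdf; do
    pdftotext "$f" - | grep -qi "hadr" && echo "$f"
done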

Google Desktop Search finds everything, but then there’s its strange need to look and act like Gnome-Do. That bothers me to the point that I wind up uninstalling Google Desktop, the only tool that actually works. I just can’t have two applications performing the same function. If Gnome-Do would actually index the contents of the files, it’d be perfect. Google Desktop just looks and acts out of place. I like Tracker because it looks like it belongs in Gnome, but looking like it belongs and actually performing its intended tasks are two different things. Google Desktop finds what I’m looking for, but it opens Firefox, and I feel like I’m browsing the web, not my personal files. The conceptual model just doesn’t work… not for me.

Contrast this with my Mac. Every file that is written to the hard drive is automatically indexed by Spotlight, and is immediately available from search. Spotlight is deeply ingrained in the OS X operating system, hooked in at the filesystem level. Linux (and Windows, I believe?) search and indexing systems are add-on applications that look for new and updated files to add to their index on a schedule. There’s no integration, no unification of the application and the operating environment.
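
As a quick illustration, the same search from the Tracker example works straight from the Terminal on the Mac, since mdfind queries the Spotlight index directly (the directory here is just an assumption):

# Ask Spotlight for any indexed documents mentioning the term
mdfind -onlyin ~/Documents "HADR"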

Desktop search is only one area that needs attention. Another is the location and organization of the menus. In Gnome, there are two task bars, top and bottom. The top bar contains the three main menus: one for finding and launching installed applications, one for bookmarks and file system actions, and one for personal preferences and system administration tasks. In OS X, the nested menus have been replaced with applications. The Finder is an application for locating and launching applications and manipulating the filesystem. On the surface, it seems like Finder is on par with Nautilus, the Gnome file manager, but this is not the case. The Finder, for all its faults, is a one-stop shop for finding and launching applications, documents, music, pictures, and anything else you can think of. Gnome uses a menu to launch applications, and Nautilus for finding and launching documents. Gnome also uses a pair of menus to change system and personal preferences. OS X has the System Preferences application, which is also integrated into Spotlight with a clever highlighting feature that “spotlights” the preference you are looking for.

Finder and System Preferences are two applications that highlight the focus on simplicity inherent in the design of OS X. It comes down to limiting choices. If I want to change a preference in OS X, I launch System Preferences. In Gnome, I launch… well, it depends on what I want to change. I need to spend some time reading through the menu options to find the appropriate application that will change the preference I’m thinking of.

Then there’s the rest of the top Gnome task bar, which absolutely drives me nuts. For one, Gnome allows you to add application shortcuts to the task bar, and ships with Firefox, Evolution, and the Gnome help application there by default. So now you have both drop-down menus and application launchers side by side. Gnome allows you to “lock” a menu item in place, presumably so that when other items are added, they must be added around the item that is locked. This has the unfortunate effect of leaving some task bar items completely in the wrong order when moving from the 13” laptop screen to the docked 22” monitor. The items that are locked in place stay where they were relative to the smaller screen size, while the items that are not locked expand with the larger screen and end up on the opposite side. So, to remedy this, I have to go to each item, unlock it, and then drag it to where it is supposed to be.

Of course, this raises another interesting question… just where are they supposed to be?

I normally remove the item with my name on it, since it duplicates functionality available under the System menu. I leave the calendar where it is, and love the weather available there. The volume control is next to the weather, as is the network manager item, and both seem perfectly at home. Then there’s the notification area, and I’m never sure what will show up in it. Pidgin lives there, as does the Rhythmbox music player, and occasionally Gnome-Do pops up there long enough to let me know about some new tweets on Twitter. The new unified notification system reminds me of Growl, and I love that it is included in the operating system by default now. However, when I launch Pidgin now, I’m given yet another icon on the task bar, and I have no idea what it does. It looks like a small envelope, and seems to be there simply to annoy me.

Most of the things in the top right hand side of the task bar seem perfectly at home there, but the notification area doesn’t. I might move it down to the bottom right task bar, in the place now occupied by the trash and the desktop switcher, but I’m afraid it might clutter up that area too. In the end, the notification area is more clutter than anything else.

The bottom task bar includes the trash, desktop manager, and an icon to click if you want to minimize all of the open windows to see the desktop. That button must come from Windows. The rest of the bottom task bar is used for window management, with an icon for each open window much like the Windows task bar. This bothers me because I have a notification area where applications can live, and I have the window management task bar where applications can also live. So, when I want an application, at times I have to stop long enough to check both the top and the bottom of the screen. Or, Alt-Tab, or Super-Tab, or, if it’s not in there maybe it’s a daemon, or maybe it crashed.

Personally, I’d like to see a more interactive bottom task bar, one that incorporates both notification capabilities and window management. Eliminate choices, so I don’t have to make decisions about my computer when I should be thinking about what I’m doing. As it stands, I have to stop using the tool to interact with the tool, instead of simply using it to accomplish my intended task.

Gnome has some things that I’ve really enjoyed. I love Tomboy. I love how Gnome has the built-in ability to mount nearly any connection and browse it as a filesystem. I love, love, love Gnome-Do, and I wish it hadn’t tried to be something it’s not with that ridiculous dock theme. Gnome-Do, you excel at being a keyboard application launcher for Linux, where I spend all my time in the terminal. Be what you are!

I’m looking forward to getting my Mac back in the next couple of days, and I’m really looking forward to firing up Xcode and getting to work on my app again. I may talk to my boss about using my Mac at work as well; that would solve so many problems for me. For now, I’ll keep using Linux. The major problems, stability and capability, have been overcome. Now it’s time to concentrate on spit, polish, and deep integration for a cleaner, less cluttered user experience.


Tough to Turn Down

March 22, 2009

I’ve been using Linux for nearly a decade now. I first had a guy I met in a Navy school install it on my old IBM desktop. Back then, it was very hard to find the right drivers, and just about impossible to get on the Internet, seeing how almost every modem that shipped with a PC was a WinModem. X11 configuration was error-prone to say the least, drivers for the sound card were hard to come by, and out in the rural English countryside where we lived, broadband was almost unheard of. Installing software could be a nightmare. Say I wanted to install a music player to listen to my CDs. The music player would have dependencies, certain libraries that needed to be installed, so I’d go and download the dependencies and try to install them, only to find out that the dependencies had dependencies! So I’d download further dependencies, and eventually I’d be able to listen to my music. And then I’d try to launch something else, only to find out that in fulfilling the dependencies of the music player, I’d broken the dependencies of some other application that used to be installed and working.

However, on the rare occasions when it did work, Linux was fascinating. It’s been amazing to watch Linux grow over the years, and even more amazing to see the leaps and bounds in the user interface and usefulness. To say that right now, I’m writing this on my laptop, wirelessly connected to the Internet on my WPA2 secured network, listening to my Grateful Dead collection shuffle, while keeping four other virtual desktops open with several other applications running is nothing short of amazing. I’ve been critical of Linux in the past, but within the past six months, Linux on the desktop has really brought it all together.

For example, I’ve got an Airport Express on my home wireless network, with an HP PSC 1500 attached to its USB port. Ubuntu automatically detected the printer, downloaded and installed the drivers, and allowed me to print a test page just now with no frustration at all. I can’t even do that with OS X, not unless I’ve installed several gigs of printer drivers I don’t need beforehand. Scanning works much the same way… plug it in and use it, that’s it.

Ubuntu has finally fixed the dual-monitor problem that bugged me for so long, and Mint (an Ubuntu derivative) sleeps and wakes up almost as seamlessly as my Mac. Looking at my desktop now, I’ve got a ton of free software in a system that works the way I want it to. Given that all of the software I’m using now (with the exception of the IBM apps I need for work) is free, going back to my Mac when Rhonda’s Dell comes back may be harder than I initially thought. Time will tell, but the free Linux desktop is becoming harder and harder to turn down.


Adamo - Apple Pushes the Industry Forward

March 18, 2009

I almost feel sorry for the other hardware manufacturers. No matter what they do, no matter what they come out with, they seem to be forever following in Apple’s footsteps. Such is the case with Adamo from Dell, a clear shot at the MacBook Air.

Adamo uses the same machined-aluminum manufacturing process introduced by Apple with the Air, which has since spread very successfully to the rest of the Mac laptop line. Adamo is marketed as very thin and very light, with an artistic feel to the advertising that seems out of place for “cookie cutter” Dell. In fact, the marketing is almost too artistic, almost as if they are trying too hard to shed their old image.

The specifications between the two lines are very similar.

         Adamo                MacBook Air
CPU      1.2GHz Core 2 Duo    1.6GHz Core 2 Duo
RAM      2GB                  2GB
Display  13.4”                13.3”
GPU      Intel GMX 4500       NVIDIA GeForce 9400M
Storage  128GB SSD            128GB SSD
Weight   4 Pounds             3 Pounds
Price    $1999                $2299

As you can tell, this is not truly an apples to apples (pardon the pun) comparison. At this price point, the major difference between the Air and the Adamo is the $500 optional SSD. Configured with the 120GB SATA drive, the Air comes in at $1799. The Air is faster and lighter than the Adamo, and includes a dedicated NVIDIA graphics card.

A more accurate comparison may be with the MacBook, also configured with a 128GB SSD.

         Adamo                MacBook
CPU      1.2GHz Core 2 Duo    2.0GHz Core 2 Duo
RAM      2GB                  2GB
Display  13.4”                13.3”
GPU      Intel GMX 4500       NVIDIA GeForce 9400M
Storage  128GB SSD            128GB SSD
Weight   4 Pounds             4.5 Pounds
Price    $1999                $1749

The MacBook’s case is larger and heavier, but the half-pound difference in weight is small next to the jump to a 2.0GHz CPU. Fortunately for Dell, the Adamo is not about internal hardware specs. It’s about trying to catch up with Apple in industrial design.

The screen looks gorgeous; I love the edge-to-edge glass design. The Adamo screen has a slightly different resolution from the standard 1280x800, coming in at 1366x768. Dell has done a great job with the Adamo. Unfortunately, it’s still not a MacBook killer, simply because it’s still not a Mac. Great industrial design is only one part of the puzzle of what makes a Mac a Mac. The other vital piece is OS X. With OS X and the Mac, Apple has created a machine that drifts into the background, gets out of your way, and lets you do what you set out to do. Adamo ships with Windows Vista Home Premium 64-bit. No matter how great the hardware is, if the software is not intimately tied to it the way only a company that creates both the hardware and the software can manage, it’s still just another PC.

People may initially buy their first Mac because of the design, or the halo effect of the iPod. They buy their second Mac because of the experience.


Communications

March 5, 2009

Ask any mechanic, machinist, or carpenter what the single most important thing contributing to their success is, and they are bound to tell you it is “having the right tool for the job”. Humans excel at creating physical tools to accomplish a certain task. From hammers to drive a nail to the Jaws of Life to pry open a car, the right tool will save time, money, and frustration. It’s interesting, then, that people have such a hard time with conceptual tools. Software designed to accomplish a specific task, or several tasks, is often forced into a role that it may not fit, making the experience kludgy, like walking through knee-deep mud.

I’ve found this problem to be especially prevalent in business environments, where the drive to “collaborate” brings many varied and sundry applications into the mix. In the hurry to adopt the next great collaboration tool, people forget to ask the single most important question: What do we need this software to do?

To communicate effectively in a business environment, it is imperative to use the right tool for the job. To determine what the right tool is, you first have to ask yourself exactly what the job is. Do you need to ask a co-worker a quick question and get an immediate answer? Is there a more detailed project question that you need to ask, perhaps gathering the opinions of a few others too? Do you have something that you need to let a large group of people know, maybe even the entire company? Each of these tasks is best suited to a different tool. Unfortunately, I’ve most often seen each of them shoe-horned into email.

Email is a personal communications medium, best suited for projects or questions that do not need immediate responses. There have been many times when I’ve gone through my inbox and found something that didn’t grab my attention when it needed to, and by the time I looked at it, it was past due. Email is asynchronous: ask a question, and wait for a response whenever the receiving party has time.

If you do need an immediate response, the best tool is instant messaging. IM pops up and demands the receiving person’s attention. When requesting something via IM, the person on the other end has to make a decision, either to ignore you or to answer your question. Long explanations are not a good fit for IM, but short, two- or three-sentence conversations are perfect for it.

When considering sending something out to the entire company, keep in mind that email is a personal communications medium. Company-wide email blasts are impersonal, and for the most part require no action on the part of the receiver other than to eventually read them. A better tool for this job is an internal company blog, accompanied by an RSS feed. RSS is built into all major browsers now, and could be built into the base operating system image for PCs. Every single time I get something from our company green team or an announcement from the CEO, I can’t help but think that a blog would be the best place for such formal, impersonal communications. A blog can be archived for searching, new announcements broadcast through RSS, and best of all, the content is accessible when the intended receiver has the time and attention to best appreciate it. To me, company-wide email is the same thing as spam, and for the most part, it is treated as such.

One other form of internal communication that is far too often mangled beyond recognition is shared documentation. Technical documentation especially suffers from document syndrome: a separate file (usually Word) for each piece of documentation, spread across several different directories on Windows shares, or worse, on the local hard drive of whoever wrote it. Such documentation should be converted to a wiki as soon as possible. If you are writing a book, I hear Word is a fairly good tool. If you are writing a business letter, again, great tool. If you are documenting the configuration of a server, a word processor should not be launched. Word (or OpenOffice) documents have a tendency to drift, and are difficult to access unless you are on the internal network. Trying to access a Windows share from my Mac at home through the VPN is something I’ve never even considered trying.

A wiki is perfectly suited for internal documentation. It is a single, central place to organize all documents in a way that makes sense, accessible from a web browser, easy to keep up to date, and most importantly, searchable. Need to know how to set up Postfix? Search the internal wiki. Need to know why this script creates a certain file? Search the internal wiki. Everything should be instantly searchable. Then again, perhaps search is not the most important feature of a wiki; perhaps the most important is the ability to link to other topics according to context. I can get lost in Wikipedia at times, jumping from one link to another as I explore a topic. The same thing can happen with internal documentation. This script creates this document for this project, which is linked to an explanation of the project, which contains links to other scripts and further explanation. Creating the documentation can seem time consuming, until you are there at 2AM trying to figure out why a script stopped working. Then the hour spent writing the explanation at 2PM doesn’t seem so bad.

One of the first things I did when starting my job was to set up both a personal, internal blog and a shared wiki for documentation. I used Wordpress for the blog and MediaWiki for the wiki. Both are excellent tools, and very well suited for their purpose. If I were a manager, I’d encourage my team to spend 15 minutes or so at the end of the day posting what they did that day to their personal blog. Can you imagine the gold mine you’d have at the end of a year or so? Or the resource you’d have for justifying raises? Solid, documented experience with products and procedures, what works and what doesn’t. The employee’s brain, laid out.
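
For what it’s worth, both can be pulled straight from the repositories on an Ubuntu server. A rough sketch, assuming the stock package names and leaving the web server and database configuration for afterward:

# Base stack, then the blog and the wiki themselves
sudo apt-get update
sudo apt-get install apache2 mysql-server php5
sudo apt-get install wordpress mediawiki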

Internal moderated forums are something I haven’t tried yet, but I can see benefits there as well. They’ve been used on the Internet for years, and I can imagine great possibilities, especially for problem resolution. How about a forum dedicated to the standard OS build for PCs? Or one for discussing corporate policies?

Blogs, wikis, forums, and IM are tools whose tasks are far too often wedged into email. If an entire organization began using the right tool for the job, the benefits would soon become apparent. Then you’d wonder why you ever did it the other way at all.


Shell Script Style

February 25, 2009

My co-worker and I spent the better part of yesterday afternoon going through a former employee’s shell scripts to try to determine what they were and what he was trying to do. The scripts worked, for a while, but there were several mistakes. The mistakes were not strictly in syntax; they were in style. Here are a few simple rules to follow to write great scripts:

  1. Always, always, always start off each and every script with a shebang line: #!/bin/sh. Starting your script with this line tells the shell where to find the interpreter for the commands in the script. Without this line, the script runs in whatever shell you happen to be typing in at the moment. That is bad, because the script then shares environment variables with your session, and may even change environment variables outside of itself, instead of staying self-contained and portable.

  2. Keep your script self-contained: If at all possible, try to avoid writing files in different directories. Or, even better, try to avoid writing files at all. Use variables when you can, and write files only when you have to.

  3. Avoid sourcing other scripts or files containing functions: I read about this method in Wicked Cool Shell Scripts, but I disagree that it is as useful as they say. Writing a custom function to send an email is a great idea. Separating it out of the script you are working on at the time is not. Again, keep the script self-contained. There are obvious exceptions to this rule: if your function is over 50 lines of code and is reused in multiple other scripts, then by all means, source it. If your function is 10 lines, create a vi shortcut for it and add it to the top of the script.

  4. Comment tasks: Each block of code in your shell script is meant for a specific task. Add a comment for this block. Make it easy to read, and simple to understand. Assume that you will not work there forever, and someone else will need to read your code and make sense out of it. Also assume that in a year, you will forget everything you did and why you did it and need a reminder.

  5. Keep it simple: Scripts should flow logically from top to bottom. If you are creating functions, make it obvious using a comment. Reading a script should be as easy as reading a book, if it’s not, then you are intentionally making things overly complicated and difficult to read.

  6. End each script with #EOF: This is purely a matter of taste, but I find it adds a nice closure to the script.

The easiest thing to do is to create another script whose purpose in life is to create new scripts. Couple this script with a vi shortcut (mine is ,t) to create the skeleton of the script, and you can quickly create powerful, well-formatted, easy-to-read scripts. Here’s an example of mine:

#!/bin/sh
# 
# scripty.sh: This script creates other scripts
# Created: 25 Feb. 2009 -- inbound@jonathanbuys.com 
#
##############################################################

# A place for variables
VAR1="Set any variables at the top"
DATE=`date`
ANDTHEN="Whatever"

# A place for functions
some_func(){
    echo `date`
    echo "Whatever's Clever!"
}

# Get down to writing the script

echo $VAR1
echo $DATE
echo $ANDTHEN
some_func
# etc...

##############################################################
#EOF
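
The skeleton above is the end result; the generator itself is not shown. As a rough illustration only, a minimal generator along these lines might look something like this (the file name, usage, and layout are my own assumptions, not the original script):

#!/bin/sh
#
# newscript.sh: A minimal sketch of a script that writes new script skeletons
#
##############################################################

# Require the name of the script to create
if [ -z "$1" ]; then
    echo "Usage: $0 newscript.sh"
    exit 1
fi

TARGET="$1"
TODAY=`date +"%d %b. %Y"`

# Refuse to clobber an existing file
if [ -e "$TARGET" ]; then
    echo "$TARGET already exists"
    exit 1
fi

# Write out the skeleton, filling in the name and date
cat > "$TARGET" <<EOF
#!/bin/sh
#
# $TARGET: Describe what this script does
# Created: $TODAY -- inbound@jonathanbuys.com
#
##############################################################

# A place for variables

# A place for functions

# Get down to writing the script

##############################################################
#EOF
EOF

chmod +x "$TARGET"
echo "Created $TARGET"
#EOF

Run it as ./newscript.sh backup.sh, then open the new file and the skeleton is already in place.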

This article doesn’t talk about syntax, only style. There’s plenty of help with syntax available on the Intertubes. Also, this is my style; as you progress as a sysadmin or scripter of one sort or another, you are bound to come up with your own style that suits you. Mine is based on the documentation at grox.net, and it has evolved over time, as will yours, but this is a good place to start.


Systems Administrator

February 24, 2009

From time to time I’m asked by members of my family or friends outside the tech industry what it is that I do for a living. When I respond that I’m a sysadmin, or systems administrator for Linux and UNIX servers, more often than not I get the “deer in the headlights” look that says I may as well be speaking Greek. So, for a while, I’ve taken to saying “I work in IT”, or “I work with computers, really big computers”, or even “I’m a computer programmer”, which isn’t exactly accurate. Although I do write scripts, and even some modest Perl, I’m still not officially a programmer. I’m a systems administrator, so let me try to explain, my dear friends and family, what it is I do in my little box all day.

First, some basics; let’s start at square one. Computers are made up of two parts, hardware and software. Sort of like the body and soul of a person. Without hardware, software is useless, and vice versa. The most basic parts of the hardware are the CPU, which is the brain; the RAM, which is the memory; the disk, which is a place to put things; and the network card, which lets you talk to other computers. For each of these pieces of hardware there needs to be some way to tell it how to do what it is intended to do. Software tells the hardware what to do. I forgot two important pieces of hardware: the screen and the keyboard/mouse. They let us interact with the computer, at least until I can just tell it what to do Star Trek style.

Getting all of these pieces of hardware doing the right thing at the right time is complicated, and requires a structured system, along with rules that govern how people can interact with the computer. This system is the Operating System (OS). There are many popular operating systems: Windows, OS X, and Linux are the big three right now. The OS tells the hardware what to do, and allows the user to add other applications (programs) to the computer.

Smaller computers, like your home desktop or laptop have network cards to get on the Internet. The network card will be either wired or wireless, that doesn’t really matter. When you get on the Internet, you can send and receive information to and from other computers. This information could be an email, a web page, music, or lots of other media. Most of the time, you are getting this information from a large computer, or large group of computers that give out information to lots of home computers just like yours. Since these computers “serve” information, they are referred to as Servers.

Large servers are much like your home computer. They have CPU, RAM, disk, etc… They just have more of it. The basics still apply though. Servers have their own operating system, normally either Windows, Linux or UNIX. Some web sites or web services (like email) can live on lots of different servers, each server having its own job to do to make sure that you can load a web page in your browser. To manage, or “administer” these servers is my job. I administer the system that ensures the servers are doing what they are supposed to do. I am a systems administrator. It is my responsibility to make sure that the servers are physically where they are supposed to be (a data center, in a rack), that they have power and networking, that the OS is installed and up to date, and that the OS is properly configured to do its job, whatever that job may be.

I am specifically a UNIX sysadmin, which means that I’ve spent time learning the UNIX interface, which is mostly text typed into a terminal, and it looks a lot like code. This differs from Windows sysadmins, who spend most of their time in an interface that looks similar to a Windows desktop computer. UNIX has since given rise to Linux, which is more user friendly and flexible, and is also where I spend most of my time.

Being a sysadmin is a good job in a tech driven economy. I’ve got my reservations about its future, but I may be wrong. Even if I’m not, the IT field changes so rapidly that I’m sure what I’m doing now is not what I’ll be doing 5-10 years from now. One of these days, maybe I’ll open a coffee shop or a restaurant, or I’ll finally write a book.


A Work in Progress

February 19, 2009

A few days ago I decided that I was not going to use anyone else’s theme on my site. It happened after I stumbled across another site using the exact same theme as mine. Unavoidable really, as long as you are using someone else’s theme. So, the decision was to either stop using Wordpress, or to design my own theme. I love Wordpress, so I decided to go with the latter.

Designing a web site is a strange mix of code and graphic design. In my case, I’ve had to go back to PHP, a language I left a long time ago, and start learning CSS. Since I’ve been fooling around in Cocoa for quite a while, going back to PHP is just painful. Objective-C is a beautiful programming language. Mixing PHP and HTML… well, that’s just plain ugly. However, that being said, it’s familiar territory, so I almost feel like I’m coming home. One concept that I’ve learned with Cocoa is the Model-View-Controller pattern, basically separating the presentation code from the application code (yes, I know there is a lot, lot, lot more to it than that… no, I’m not going to get into it here). Using CSS kind of reminds me of MVC: in your PHP/XHTML code you define what objects are going to be displayed, and in CSS you define where and how to display them. I like the separation… it keeps things clean.

At any rate, I’ve been busy coming up with the overall look and feel of the site. One thing I believe about software is that simplicity always wins. At least where I’m concerned it does; that’s why I use a lot of the apps that I use, because they are simple. Think about the Google home page. Simple, and it wins.

I’d appreciate any comments on the design, and please keep in mind this is only a very early mockup. I’m also going to be using it as my avatar everywhere that I’ve got an account online. A friend of mine, who actually is a designer, laughed when I told him about the tools I’ve been using to do the design so far. First, the initial concept was created in OmniGraffle. From OmniGraffle, I exported it as a Photoshop file and opened it in Pixelmator to add the leaves and other touch-ups. Right now, that’s as far as I’ve got. I’ll finish the design in the next couple of days, and then move into chopping the file up and getting deep into some code. Hopefully, I’ll have this finished in two or three weeks.


JeOS

February 6, 2009

For better or worse, we are starting to put Ubuntu JeOS images into production in our network. Starting off, we will only put these systems in for our non-IBM services, no WebSphere or DB2, as IBM doesn’t officially support this configuration yet, but for everything else, JeOS looks like a perfect fit.

Most of our production systems currently use SuSE Linux Enterprise Server (SLES), which is an excellent OS, but one that requires you to purchase a license in order to download patches and updates. This is fine for the majority of our production systems, but as the number of our virtual systems grows, so does the management overhead that goes along with it. We can quickly put a new SLES box into production if needed, but will it be licensed? Will it be able to download the latest patches? SLES 8 and 9 were fairly lenient about this; SLES 10 is not. With Ubuntu, we never have to worry about whether we will be able to download the latest security patches, they are always freely available, which brings me to my next point:

There is very little installed with the default JeOS image; there truly is “just enough” to get up and running. It reminds me of the bare-bones OpenBSD installs I used to do 7 or 8 (9?) years ago. Follow up with “sudo apt-get install ufw clamav” and you’ve got a very secure, firewalled, virus-scanned system. Yes, clam and ufw both need more configuration, but even that is dead easy (a quick sketch follows). Also, I really like the concept of not having a root user, and having to use sudo to accomplish anything.
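
For the record, “dead easy” looks something like this. A minimal sketch only; the exact commands can vary a little between releases:

# Firewall: deny inbound by default, allow SSH, then turn it on
sudo ufw default deny
sudo ufw allow ssh
sudo ufw enable

# Anti-virus: pull down the latest definitions, then scan the home directories
sudo freshclam
clamscan -r /home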

The final thing I enjoy about Ubuntu is that it’s just easy to use. Today, I set up Request Tracker (better known as RT) from Best Practical. It was so easy: the 3.6 version is in the repositories, so installing it was a quick apt-get away. Comparatively, on SLES I had to run RT’s “make testdeps” and “make fixdeps” scripts to try to download and compile all of the Perl prerequisites, not to mention Apache, MySQL, and mod_perl.
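
The whole Ubuntu install boils down to a couple of commands. A sketch, with the package names assumed from what the repositories carried at the time rather than taken from my notes:

sudo apt-get update
sudo apt-get install request-tracker3.6 rt3.6-apache2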

RT is the first app that I’m moving over to an Ubuntu JeOS. It will be quickly followed by Nagios, Webmin, and then Postfix and an Apache web server hosting our company web site. The idea is to make each of these services a virtual appliance, so we could move them around easily, and work on one without affecting the others.


Rota

February 1, 2009

It was June of 1996 when I arrived in Rota. The Spanish sun was bright as I stepped off the creaky military aircraft, and I realized that this day would hold a lot of firsts for me. Today, I was going to meet my ship.

This was the day for which I had been preparing for the last year: U.S. Navy boot camp in Chicago, followed by engineering common core (a school designed to teach us young recruits the basics of engineering, like how to turn a valve), then a class A school in San Diego where the curriculum taught how to create machinery parts out of metal stock using a lathe and mill, and where I learned how to stick a twenty in my sock in Tijuana to pay off the crooked cops. That whirlwind of confused order was designed to prepare me for this experience, this day, when I would begin to earn the title that the United States government had given me. Today, when I set foot on the ship, I would earn the name Sailor.

After twelve hours cramped up in the suffocatingly small cabin of the plane, suffering through the tiny, dry in-flight meals and two movies that I could neither see clearly through the array of heads in my line of sight nor hear through the headphones that did not fit properly on my head, I was relieved to be out in the fresh air again. But my reprieve was short-lived, as the airport staff soon herded us into the airport to collect our baggage. Only one bag for me, the Navy seabag, packed tight with everything that I owned in the world, except what was boxed up at my parents’ house in Montana. I reached down, grabbed my seabag, and hefted it up to my shoulders. As I turned around, a stocky Hispanic man with thick glasses and tight black hair, wearing Navy dungarees, approached me. “You Fireman Buys?” he asked. “Yes,” I replied, “Fireman Recruit Buys.” A broad grin stretched across his face as he laughed and said, “Ok, Fireman Recruit Buys, I’m Petty Officer Garcia. Welcome aboard, and follow me.”

Why was he welcoming me aboard, I wondered; I wasn’t on the ship yet. I brushed my trivial concerns aside and followed him to his plain white government-issue van. We had only a short drive before we stopped in a gravel parking lot in front of a long concrete pier. I hauled out my seabag and followed Petty Officer Garcia up the pier. It smelled of salt water and fuel, the breeze light on my skin. It was then that I got my first good look at the ship I would be calling home for the next few years. Haze gray paint covered the hull of the oiler named USS Platte. The ship was smaller and less impressive than I thought it would be. Still, it was a good-sized ship, approximately two football fields long. The first two thirds of the ship were dominated by seven kingposts, metal towers bearing large black hoses. The last third was split; half was the house, at the top of which was the bridge. At the aft end of the ship was a small flight deck, large enough for one helicopter.

As Petty Officer Garcia led me up the metal brow, I pulled out my military ID. At the top of the brow I reached the quarterdeck. Before setting foot on the metal deck, I faced the American flag and stood at attention for a moment to pay my respects. Then I turned to the officer of the deck, stood at attention, and presented my ID. “Permission to come aboard,” I requested. “Very well,” came the reply. I then set foot on the deck of the ship, a sailor at last. Yes, it was very well indeed.