jb… a weblog by Jonathan Buys

The Master Craftsman

May 14, 2008

The Master Craftsman works methodically, not slowly, not hurriedly. He has mastered the basics, and knows the essence of his craft. He has moved to a point where he can define his own methods, and doesn’t need to explain them to anyone, unless someone is wise enough to ask. The Master Craftsman enjoys the hardest, most complicated problems, and enjoys unravelling them piece by piece. He enjoys the challenge to his skill, and proves his worth again and again as he overcomes each obstacle.

The Master Craftsman has an intimate relationship with his tools. He knows not only what they do, but much more importantly, how they do it. His choice of tools for a certain task is based on years of experience, research, and hands on use. He can debate intelligently on the merits and problems with his tools, as compared with other tools that he has tried and discarded over time. His choice of tools explains something about him.

The Master Craftsman enjoys his work, and is comfortable with his place in the world. He is an expert, and he knows it.

I’ve had an abstract idea for this post for a couple of months now. The Master Craftsman is the embodiment of my professional goals. I’m not one to speak of gurus or wizards, since I come from very down-to-earth country in Montana, but a craftsman, or a woodworker, is an image that I can grab on to. I’m not a Master yet, but I’m working towards it. My tools are not jigsaws and planers but vi and zsh; the basic principles still apply. I think the idea of a craftsman is less about impressing others and more about perfecting a skill.


Creative Uses for Wordpress

May 7, 2008

Where I spend my days ($WORK), we have multiple monitoring systems for just about every service on every server that we have. Many of these are Nagios, some are built in, and others are SiteScope. All of the systems generate email alerts that go to our pagers, our email, or both. From time to time, management would ask a question like “How many pages do you get in a week, on average?”, and up until a couple of months ago our answer was always “It just depends.”

Not a great answer, so I decided to start tracking the email alerts with a centralized database. Now, at this point, I could have whipped up my own home-brew Frankenstein creation, but since everything I wanted was already built into Wordpress, I really didn’t need to. Wordpress has the option of posting blogs via email. So, all I needed to do was set up a special email account on our mail server and make sure the POP3 server was running. Then, add the server and login information into Wordpress, set up a cron job to trigger the mail check every five minutes, and there it is: instant logging of all pages that are sent out in a searchable, easy-to-read web format. Now, when management gets it in their mind to start asking questions, we can easily say “Let me reference my report.” They really like hearing things like that.
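The cron job itself is a one-liner. A sketch of the crontab entry might look like this (the URL here is a made-up example, not our actual server):

```
# Poll wp-mail.php every five minutes so Wordpress checks the POP3 account
*/5 * * * * wget -q -O /dev/null http://intranet.example.com/blog/wp-mail.php
```

Hitting wp-mail.php is what triggers Wordpress to log in to the mail account and turn any waiting messages into posts.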

Building on the success of the alert log, I thought it might be good to also log all of our changes to the system. This idea is completely different from traditional “Change Management” systems which require you to log ahead of time what you want to accomplish in some ridiculous form or application. Instead, I find it much more useful and relevant to build in the change logging where we spend most of our time, the command line.

I’ve added an alias for “exit” in the shell, like so: alias exit="exec /scripts/ch_log". Here is the ch_log script:

#!/bin/sh
# ch_log - Prompt the user to log system changes on
# exit from the root shell.
#
# jonbuys@os-zen.com - Wed Apr  2 15:32:43 CDT 2008
#
############################################################

HOST=`hostname`
DATE=`date +%m-%d-%y`
echo "$DATE"
echo "Did you make any changes to the system? (y/n)"
read answer

if [ "$answer" = "n" ] ; then
    echo "OK, thanks!"
    exit 0
else
    echo "Cool, please enter your name, and then describe the changes in the form."
    echo "Name:"
    read NAME

    # Fill in the template placeholders and open the form in vi.
    sed -e "s/NNN/$NAME/g" -e "s/DDD/$DATE/g" -e "s/SSS/$HOST/g" \
        /scripts/log_template > /tmp/$$.answer
    vi /tmp/$$.answer

    # mail(1) takes the subject before the recipient.
    mail -s "Change Notification for $HOST" change_log@mail.mydomain.com < /tmp/$$.answer
    rm -f /tmp/$$.answer
    echo "OK, thanks!"
fi
exit 0

############################################################
# EOF: ch_log
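The /scripts/log_template file isn’t shown above; all the script requires is that it contain the NNN, DDD, and SSS placeholders that the sed calls replace. A minimal template might look something like this:

```
Change Notification
===================
Server: SSS
Date:   DDD
Name:   NNN

Describe the change below this line:
------------------------------------
```

Anything else you put in the template comes along for the ride and ends up in the blog post.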

Basically, when we exit our shell we now have to make a choice: do we log what we did with this quick and easy script, or do we ignore it and risk the consequences? I’ve found that, for the most part, I choose to log my work. The email that is sent off to the change_log@mail.mydomain.com address is picked up by a second Wordpress install and posted to the blog. Now we have a historical record of what we’ve done in case something breaks, or (more importantly) when annual review time comes around and we are asked “What have you been up to?”

There is one other change that I had to make to get this to work right. By default, Wordpress holds all posts it receives via email for approval before posting them to the main page. This is good security, but it’s not really needed on an internal LAN, and it breaks the system I’ve laid out above. So, to fix it, I’ve made a slight change to the wp-mail.php file:

// Set $post_status based on $author_found and on author's publish_posts capability
if ($author_found) {
    $user = new WP_User($post_author);
    if ($user->has_cap('publish_posts'))
        $post_status = 'publish';
    else
        $post_status = 'publish';
} else {
    // Author not found in DB, set status to pending. Author already set to admin.
    $post_status = 'publish';
}

Above, I’ve changed the “pending” post_status to “publish” for unidentified users, which is everything that it receives via email. This is a very bad idea to do outside of the LAN, but I don’t see any harm in it internally. Undoubtedly there are those who would disagree, but this works well for us.

This is how we are using Wordpress internally on our corporate LAN right now. I’d be interested to hear how others are using Wordpress or other blogging software.


My Optimized Windows Workflow

May 6, 2008

I love Linux, I really do. Compared to the older UNIX systems like AIX, HP-UX, and Solaris (which is trying really hard to catch up), Linux is head and shoulders above the rest. The main reason for this is that a lot of really smart people also love Linux, and they try their best to make it the best server OS on the planet. For the most part, I’d agree that we are succeeding on that front. On the other hand, to date, I simply can’t run Linux on my desktop. If there are servers down, or an application fault somewhere, I need to be able to rely on my tools to be there for me. That’s why I run XP on my laptop.

Now, just because I’m running XP doesn’t mean that it has to suck. I spend most of my time either in the command line or in Firefox. Oh… uhhh… and, most unfortunately, in Lotus Notes (holding back gag reflex). I’m not sure if there has ever been a worse email client created than Lotus Notes, but that’s a post for another day. So, to make the most of what I have, there are three tools that I’ve come to rely on:

  1. PuTTY - Outstanding SSH client. Always there for me, never craps out, and an awesome Alt-Tab to full screen command line goodness.
  2. Launchy - Now on my permanent list of apps I can’t live without.
  3. Emerge Desktop - I’ve got a small screen, and that damn “Start” menu always bothered me. With Emerge, I’ve replaced the Explorer shell with a very small, very minimal, no task bars or anything else desktop. If I need the Start menu, I just right click on the desktop and there it is.

With PuTTY, I’ve exchanged SSH keys with my management server, where I run everything else from. I set up a saved session in PuTTY for the management server and made sure that I can log in without a password. Next, I created a “Launchy Indexed” folder in my home directory on my laptop, and created a shortcut with the following as the “Target:”

"C:\Program Files\PuTTY\putty.exe" -load my.management.server

Next, reload the launchy index, and we are in business. Now I’m just a couple of keystrokes away from my management server. Since my management server is secure and on the same network as most of the rest of my servers, I’ve set up some custom scripts there as well. My most used script is named “go”:

#!/bin/sh
# go - ssh into the specified server
#
# jonbuys@os-zen.com - Wed Jul 25 09:52:09 CDT 2007
#
############################################################
# Make sure the user actually entered something here
if [ -z "$1" ]; then
    echo "Usage: go [servername]"
    exit 1
else
    # Set the variables
    SERVERNAME="$1"
    USERNAME=`whoami`
    # run the command
    ssh "$USERNAME@$SERVERNAME"
fi
############################################################
# EOF: go

Very simple, but still, fewer keystrokes than actually typing ssh myusername@whatever.server.com. I’ve also been known to abbreviate server names in /etc/hosts for folks who insist on ridiculous names that make sense only to them. Fewer keystrokes, quick access, and less to think about when I absolutely need to get to that server ASAP. Luckily, there are very few times when the need is that great. It is, however, very satisfying to see the looks on the faces of my old-school co-workers when they realize that “He hasn’t touched his mouse yet…”
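For what it’s worth, the key exchange that makes the password-less login work is a two-step process. Here is a rough sketch; the file path and hostname are examples only, and in real use a passphrase plus an agent is the safer choice:

```shell
# Start clean, then generate a key pair; -N "" makes it passwordless
# for unattended logins. /tmp is used here only for demonstration --
# ~/.ssh is the real home for keys.
rm -f /tmp/id_rsa_demo /tmp/id_rsa_demo.pub
ssh-keygen -q -t rsa -N "" -f /tmp/id_rsa_demo

# Then append the public key to authorized_keys on the management server:
#   cat /tmp/id_rsa_demo.pub | ssh user@my.management.server \
#       'cat >> ~/.ssh/authorized_keys'
ls -l /tmp/id_rsa_demo /tmp/id_rsa_demo.pub
```

Once the public key is on the remote side, PuTTY (pointed at the matching private key) or plain ssh can log in without prompting.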


Contextual Search

May 2, 2008

My personal browser of choice has almost always been Omniweb. Omniweb and I went through a tough time for a while when it (she?) was crashing frequently and generally having a tough time of it. The Omni Group has once again straightened things out, and she (yeah, I’ve decided Omniweb is a she) is once again fast, sleek, and powerful. There is one small item about the browser that bothers me, though, and that is the lack of a search function in the browser’s contextual menu that pops up when you select a word and right click on it.

Exhibit A: Omniweb Contextual Menu

Omniweb has some interesting options available, including the somewhat dubious value of cascading the “Page” menu that is available when right clicking without selecting a word. I’ve tried both the “Start Speaking” and “Inspect Element” options, neither of which do me any good in my normal browsing flow.

When Omniweb and I were not getting along I tried out both Firefox and Safari for a while. While both browsers offer a search function from the contextual menu, Firefox has made the best choice from a usability standpoint.

Exhibit B: Firefox Contextual Menu

When I select “Search Answers.com For ‘perfect’”, Firefox opens a tab in the background searching for the word that I had selected. To me, this is clearly the best option.

Safari’s search function is complemented by a “Search in Spotlight” option that I’ve never used. I suppose it’s nice that it’s there, but still, never used. The ability to search the built-in dictionary is nice, and I think I may have used that function once.

Exhibit C: Safari Contextual Menu

The problem I have with Safari’s Search Google option is that it replaces the tab that I’m reading instead of opening another tab. Clearly this defeats the purpose: I’m interested in the word, product, or service that I’m reading about, but I don’t want to stop reading to search Google. I want Google to be there in the background, waiting for me when I’m good and ready. Firefox has this right, and I think both Safari and Omniweb could stand to adopt this feature.


Starting Over

April 22, 2008

My wife wanted me to read something that she was writing the other day, so I sat down at her laptop on our table and read through it. While I was there, I happened to glance at her email, an old Hotmail account, and noticed that she has emails going all the way back to ’01. A quick glance at my Gmail tells me that there is no way I can tell how far back my email goes, but I’m pretty sure that I’ve lost everything prior to ’05 or so. I’ve been accused of having email ADD in the past, and I’m fairly certain there is a bit of truth to it. It seems to hold true for a lot of the technology in my life: I’m just never satisfied with it, and I wind up tweaking, fiddling, and otherwise screwing around with my tools until they are either just right, or completely screwed up and I throw the entire thing in the trash and start over.

This certainly holds true for my email; I have been through @aol.com, @hotmail.com, @yahoo.com, @mac.com, @inbox.com, @live.com, and finally, @gmail.com. It also holds true for my web sites. I’ve started 10 or 15 web sites throughout the years, starting with a GeoCities site back in… what, ’99 or so? If I’d stayed on top of it, I’d have managed to compile a decent amount of writing in one place over 9 years. But I have some form of technology ADD, and cannot seem to be happy with any single system. When I discovered Linux, my curiosity really got the best of me. I must have downloaded and tested 100-125 distros. I installed so many that I started recording them on my old (now defunct) blog, jonstechblog.com, which evolved into the also now defunct osvids.com. This went on until I “switched” to Mac, and I’ve been fairly happy with my operating system since. At least I know that there is nothing else out there that’s any better than what I have now.

I’ve learned a lot about what I want out of my technology over the years, and I’ve found that when I find a good system, even if it’s not perfect, it’s best to stick with it until there is a significant reason to change. My curiosity has unfortunately led to my losing data. Somewhere along the line I lost a lot of email, and a lot of writing, and there is no way to get that back. So now I’ve come to a point where I’m content with the systems that I have in place. My email works great, my OS works great, and I have an excellent blogging platform on a reliable host. I’ve started over far, far too many times, and it’s time to settle down and put down some roots. It’s time to stop worrying about the method of creation, and focus on the creative process itself.


Linux is not for MacBooks

April 10, 2008

I recently gave Linux my second, and final, serious shot at running full time as my primary operating system on my MacBook. This time, it lasted all of three days before I dug out my Leopard install disk and began the long migration back to OS X. To preempt any questions on the subject: no, I didn’t dual boot, and no, I didn’t have a good Time Machine backup. I was going to force myself to learn to do things the Linux way on my laptop.

Right then I should have realized that this was going to be a problem. Using a computer should not be difficult, especially if it’s a Mac. Apple goes to great lengths to keep the operating system out of the user’s way, to make it nearly invisible so you can use the applications. With Linux, it’s more like coaxing the operating system along, trying to trick it into doing what you want. But, seeing as I’m absolutely obsessed with how my computer works… I just had to try.

The graphical desktop capabilities of beryl/compiz/desktop effects are simply not there yet. OS X has limited effects when compared with beryl, but the effects that are there work seamlessly, every time. There is no noticeable slowdown, and no window stuttering when moving windows around. I understand that if I had a more powerful graphics card the graphics would look better in beryl, but they look fine in OS X with the card that I have. The other beef I have with beryl is that it doesn’t always work the way I want it to. I could never be sure if the hot corners were going to activate the Exposé rip-off 1, trigger the virtual desktops 2, or do nothing at all. Wobbly windows and transparent desktop cubes are neat, but when the effects that I would like to have, the ones that actually increase the usability of the system, are not reliably available, you might as well not have any of them. Most of the effects in beryl are a complete waste, and are simply there because the developers discovered that they could, not because they increase the usability of the system. Just because you can do something doesn’t mean you should.

I realize that the developers do their best to support all the hardware that they possibly can, but when it comes down to it, I just want all of my devices to work, and work well. Hardware support is the biggest YMMV in Linux, because you never know what chipset any given device uses until you find out that it’s not supported. Wireless support is getting better, but it’s still a long way from working as well as it does under either OS X or Windows. The Network Manager applet was a big step forward, but if your card is not supported out of the box it doesn’t do you any good. Ubuntu includes a “Restricted Drivers Manager,” which is very nice; it allows you to download non-free drivers to enable your hardware to work, if it supports them. In the case of my 802.11n Atheros card, I had to spend quite a bit of time on the command line to coax it to work. Since there were no native Linux drivers, I had to use ndiswrapper to load a Windows driver for the card that I downloaded from Toshiba 3. I never did get my built-in iSight web cam to work. Also, my battery drained faster, and my laptop ran hotter. There are additional tweaks that you can do to help with this, but I think it’s something that the installer should do once it’s aware that it’s on a laptop. I’m sure if Apple released the specs for their hardware, all of it would work seamlessly. But they didn’t. So it doesn’t. Life just isn’t fair.

Another thing that bugs me about quite a lot of the desktop software is that it always seems to be in a continual state of “beta.” Even the apps that are released as final always seem to have that beta feel to them. There are a few notable exceptions: I love Amarok, and I really like F-Spot’s tag-based organization feature. However, going back to my beryl comments, beryl is included and activated by default in Ubuntu and several other distributions, and according to the main site, they are only at version 0.2.1. Or is it 0.5.2, or 0.6.2? Compiz and beryl merged… so… right.

Also, the fonts look terrible. I’m not sure why this is exactly, but I agree with Scalzi on this one. To get any decent fonts you have to install the Microsoft fonts, but even then, they don’t seem to be rendered quite properly.

Linux desktops (and laptops) have a lot of potential; it’s too bad that they never seem to fully realize it. I’ve been waiting for the Linux release that will change my opinion for 8 years now, and I’ve come to the conclusion that Linux is best kept where it absolutely excels: on the server. As a Systems Administrator, I’ve run several operating systems ranging from Windows (NT, 2000, 2003) to AIX and everything in between, and Linux is by far my favorite OS to have on a server. Unfortunately, for now, that is where it should stay.


  1. It would be very hard to argue that the Linux version is not just a copy. Well, imitation and flattery I suppose.
  2. UNIX has had virtual desktops for years, it was about time they made their way over to the Mac.
  3. I don’t remember the exact model or link that I found that worked, I was just happy to have something.

Writing and Word Processing

April 7, 2008

A friend of mine is having a heck of a time with his new MacBook. He’s a recent convert to Macs, and as a philosophy student he spends a lot of time in Word. When he first bought his shiny new MacBook, he was surprised to find that there was no word processor on it. I pointed out TextEdit, which he quickly dismissed as not nearly powerful enough for what he needed to do. So, back to the store he went to pick up a copy of iWork ’08, and he started working with Pages.

Unfortunately, it seems that the import/export feature of MS Word documents was not as seamless as he felt it should be, and as everyone he deals with works with Word, back to the store he went to pick up a copy of Office 2008. Now, every time he starts typing “Guten Tag!” (he’s also a German student) he gets the error:

This command is not available in this version of Microsoft Word

I did a quick lookup on Google and found an article that pointed towards embedded Visual Basic macros that may be tied to whatever he is trying to type; the macros are simply not there in the Mac version. This was driving him nuts. I can imagine why: for intelligent people who simply are not tech savvy, having to stop their creative process to troubleshoot some ridiculous computer problem is an extreme annoyance. However, I can only sympathize with him for so long.

A few months ago I began writing in LaTeX, and I’ve never looked back. I tried to tell my friend with the serious writing needs about it, but he was uninterested in trying to learn a “programming” language. Also, before he started telling me about this Word error, I pointed him at this article, which explains in detail why Word does not scale well to very large writing projects. Plain text does, and since LaTeX is plain text, it too can scale to large projects.
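To illustrate the scaling point: a large LaTeX project is usually just a small master file pulling in one plain text file per chapter, something like this (file names are made up for the example):

```latex
\documentclass{book}
\begin{document}
% Each chapter lives in its own plain text file; comment one out to
% compile only part of the project.
\include{chapter01}
\include{chapter02}
\include{chapter03}
\end{document}
```

Each chapter file is independently editable, diffable, and small, which is exactly what Word can’t give you on a book-length project.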

That only leaves the problem of interoperability, which was the main reason my friend went down and bought Office in the first place. Another plus for writing in LaTeX is that, being plain text, it quickly converts to PDF, which is the other document format that everyone can read (but not edit… I know). Unfortunately, it seems that the only solution he has come up with so far is to uninstall and reinstall Office, which has not worked. I’m hoping that my friend gives some serious thought to how he is writing, and why he needs a “word processor,” when it seems that what he really needs is a tool that allows him to write.


Agility

March 7, 2008

To create the perfect datacenter, what would you recommend? For me, the perfect datacenter would be built around agility. We would be able to add new capacity when needed, and reallocate resources whenever needed, quickly and easily. We would be able to back up everything, securely and easily, off-site. We would use, whenever possible, open source software, so we would not be constricted by licensing schemes. Would we have a SAN? Yes, most likely something very simple to administer, like a NetApp. We would boot from the SAN and have no moving parts in the servers themselves, so we would have very few hardware failures. Whenever possible we would keep to one style of hardware, e.g. all blades, or all 1U rack mounts, etc.

We would purchase the servers as needed. Purchasing the equipment instead of leasing it gives us more flexibility and decreases overall cost. We would still abide by the server life cycle, but instead of returning older servers to the vendor, we would re-purpose them by migrating them over to test and development, or management duty. Then, when the servers have truly reached the end of their serviceable life, drop them on eBay to recoup a bit of the cost. 1 We would purchase racks with built-in cooling, fans at top and bottom. We would have an ambient temperature sensor hooked up to Nagios to keep an eye on the environment. Nagios, of course, would keep an eye on everything else as well.
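A Nagios service definition for that temperature sensor might look something like this (the host name, check command, and thresholds are all invented for the sake of the example; the real check depends on the sensor hardware):

```
define service{
        use                     generic-service
        host_name               env-sensor-01
        service_description     Ambient Temperature
        check_command           check_temp!70!80
        }
```

The warning and critical thresholds go in the check command arguments, so Nagios pages us before the room gets dangerous rather than after.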

We would run our cabling under the floor in Snake Tray: gigabit Ethernet to the servers, and maybe a 10 Gig fiber backbone between the switches and routers. It may be expensive to implement, but it would last, and it would provide more than enough bandwidth. I would build a pair of OpenBSD firewalls with automatic failover and load balancing, one pair for each of two Internet connections. I suppose there would have to be two sets of firewalls on each Internet uplink to provide a DMZ, which is a good place for a Snort system.
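On OpenBSD, the automatic failover piece is CARP: the pair shares a virtual IP, and the backup takes over if the master dies. A sketch of the interface config on the master might be (the addresses, vhid, and password are examples):

```
# /etc/hostname.carp0 -- shared virtual IP for the firewall pair
inet 192.0.2.1 255.255.255.0 192.0.2.255 vhid 1 pass sekrit advskew 0
```

The backup box gets the same file with a higher advskew so it loses the election while the master is healthy, and pfsync keeps the pf state tables in sync so established connections survive a failover.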

We would deploy a single operating system, possibly Ubuntu. Something with commercial support if needed, but enough freedom to keep things moving the way we want them to… no restrictions. Ubuntu Server is not bad, and with Canonical providing support, it’s reliable enough to build a business on. We would keep everything at the same patch and kernel level.

Yes, this is a pipe dream. In reality, datacenters are heterogeneous, organically grown, and often stuck together with duct tape and bubble gum. What would we build, though, if we had hundreds of thousands of dollars and a blank slate to work from? If the task were given to me, this is what I would build.


  1. We may not be able to recoup much from eBay, but it’s better to sell them there than junk them altogether. The servers may end up in a hobbyist’s garage, building the next version of Linux!

The Little Things

March 4, 2008

Today I was out in the data center and decided to boot into Linux to get some work done on my Dell laptop. I was busy populating our internal wiki with hardware and OS data from our servers (how many DIMMs, what size, kernel level, etc.), which is a lot of work: lots of copy and paste, formatting, grepping, going back and forth between the terminal and Firefox. Lots of moving around, but not a lot of CPU- or memory-intensive tasks, just basic office work. I’m using Ubuntu, with Gnome and the desktop effects turned on, and I got so frustrated that I booted back into Windows. I hate to say it, but I was able to get more done in Windows today than I could in Linux.

Some of that is my fault, and the choice of applications, but much of it is not. I constantly get the feeling that I don’t have X configured just right; the screen just doesn’t seem sharp enough. Also, I’m quite certain that the Gnome interface is slower than Windows Explorer. I don’t have any hard data to support this, just my gut feeling and a collection of separate annoyances that I’ve noticed. For instance, at times when dragging a window, it seems to stutter across the screen instead of flowing smoothly the way it should. I understand that Ubuntu comes with “Bulletproof X,” which is a big improvement in a lot of ways, but there is still no way that it can compare with Quartz.

With Windows, I use the excellent PuTTY SSH client; coupled with Launchy and exchanged SSH keys, it gives me superfast access to my servers. Also, with PuTTY, all I need to do is select text in the shell with the mouse and it is copied to the clipboard 1. Then it’s a quick Alt-Tab to Firefox and CTRL-V to paste into the wiki, and I’m back. Maybe it’s because I’m more familiar with the interface, maybe it’s because I’ve spent the time to set up my Windows environment, maybe it’s just because the interface is snappier. I was reading through one of my old Linux Magazine 2 issues when I came across a letter to the editor that said

Linux’s benefits are in its license, not in its interface.

I couldn’t agree more.

I want to use Linux on my laptop; it’s just that I want it to work the way I want it to work. I want it to be fast, smooth, and, more than anything, intuitive. I know I may be asking too much with that last one, but things have been moving steadily towards a more usable interface for a long time. Each year since 2003 I’ve been waiting for the Year of the Linux Desktop, and I think it’s high time it arrived. I’ve been a fan of Gnome for a long time; I like the interface more than KDE (which I’ve always thought was a bad Windows clone), and Gnome is more fully featured than XFCE (or any of the other minimalist window managers out there). However, since Gnome seems to run so sluggishly on my laptop, it looks as though I’ll either be switching to XFCE or seriously tweaking my X configuration. Ahh, and I thought those days were behind us.

Yes, Linux, you are getting there, but you’ve still got a ways to go. It’s not the big things anymore (winmodems, anyone?); it’s the little things that get us now. I’m seriously thinking that what we really need is an entirely new window manager, perhaps something not based on the decades-old code base of X. Maybe it’s time we threw out everything we think we know about user interfaces and started from scratch. I wonder what awesomeness we could come up with.


  1. I realize that there is probably a configuration file somewhere I could edit (or a checkbox to check?) that would change this behavior. I just don’t know where it is.
  2. I remember when Linux Magazine was a little more edgy, and a lot more geeky, back when its tagline was “Chronicling the Revolution.”

License Restrictions

February 17, 2008

Software licensing is one of the biggest expenses of high-end server systems. The vendors charge you not only to use the software, but also for how efficiently you want to use it. IBM, for example, charges a different license fee for AIX based on how many CPUs are in the system. So, to scale in response to load, whether it’s up or out, you have to pay for additional hardware, and then you have to pay for the ability to use that hardware. We are not talking small numbers here, either; we are talking upwards of six figures 1, in addition to the cost of the hardware. On top of that, if you are using proprietary applications on the OS, you have to pay additional licensing fees for those as well. WebSphere in particular is licensed on a per-CPU basis.

This is where open source solutions really shine. Companies are already going to pay for the superior performance of the hardware. Now, with Linux and other open source solutions available to run on that hardware, and vendors like RedHat supporting them, companies have a whole new world of capability available to them. In a rather strange turn of events, IBM is even supporting Linux on its POWER architecture. The ability to break the chains of restrictive and unnecessary software licensing is well worth the (supposed) trade-off in features. Being able to walk away from the software licensing and keep the hardware is worth its weight in gold. You still pay for the software, just in a more roundabout, and, in the end, a much more profitable way.

When a company chooses not to pay software licensing and instead chooses open source software for its needs, it is choosing to invest in itself instead of investing in another company. So, instead of paying for support from IBM or HP, a company can pay its employees to train and increase their knowledge, thereby increasing the capability of the company. When you pay an external source to be your technical knowledge base, you are limited to what they are willing to give you. When you invest in yourself, you are limited only by your imagination.

For many older companies, ones that have been in the tech sector for the past thirty years or so, the idea of being self-supported is incredibly frightening. They have been used to simply purchasing the software and hardware from a vendor and going back to the vendor when something doesn’t work. For these companies, migrating to a self-supported open source infrastructure would take more than training; it would take a change in corporate culture. This will come more easily as the newer companies that are proving the capability of open source become successful. If the culture is unable to change, there is a very good chance that the old companies will be left behind by more agile competitors that can respond to the ever-increasing rate of change in IT.