jb… a weblog by Jonathan Buys

Personal Rules for Cocoa Happiness

June 29, 2009

  1. Understand every line of code written in my apps.
  2. Use frameworks sparingly (see rule 1).
  3. Do not copy and paste code (see rule 1).
  4. Add lots of comments so I remember what I was doing at any given time.
  5. Manage memory as I go, don’t wait until I’m nearly finished and then go and “clean it up”.
  6. Draw out the application on paper before writing a single line of code.
  7. Search Apple’s documentation, CocoaDev mailing list archives, Google, and anything else I can think of before asking for help.
  8. When programming, quit Mail, Twitter, NetNewsWire, and IM. Turn on some non-distracting music in iTunes.
  9. Get a cup of coffee.
  10. Remember, “This is hard, you are not stupid.”

Ping from Cocoa

June 15, 2009

One of the features of Servers.app is that it pings the host and shows a little indicator for its status: green is good, red is dead. Getting that little feature in here was not quite as easy as I thought it would be. After searching the Apple documentation for what seemed like years, I stumbled across SimplePing. It was perfect, exactly what I was looking for. I dropped the two source code files into my project, read through the documentation in the header file comments, and added the code to call SimplePing. It worked, and everything seemed fine. Except that it kept working, even when I knew for certain that the host did not exist.

So, I started digging through the source code for SimplePing.c, and found that instead of returning a standard error code, it called perror, which prints a message to standard error and, according to the documentation, doesn’t return a value. This is fine if you want to log the ping results, but I wanted to ping the server and make a decision in code based on the result. So, I changed a couple of lines of code from this:

if (error == 0) // was able to wait for reply successfully (whether we got a reply or not)
{
    if (gotResponse == 1)
    {
        numberPacketsReceived = numberPacketsReceived + 1;
    }
}

to this:

if (error == 0) // was able to wait for reply successfully (whether we got a reply or not)
{
    if (gotResponse == 1)
    {
        numberPacketsReceived = numberPacketsReceived + 1;
    }
    else
    {
        error = 1;
    }
}

Simply adding an else branch in there to set an error code gave my application something to work with. Now, from my application controller I can call SimplePing like this:

int systemUp;
systemUp = SimplePing(cString, 2, 2, 1);

After that, I can make decisions based on the value of the systemUp int. Now, it’s entirely possible that I’m doing this wrong, all I can say is that it works for me now, and it didn’t before.


Macs, Netbooks, and Education

May 19, 2009

What does it mean when an entire community springs up around hacking together a product that is not otherwise available? Apple has been adamant that it is not interested in the netbook market, but according to the many users who are breaking Apple’s EULA and installing OS X on Dell Minis, the market is there, and waiting.

Rumors have been circulating for some time now that Apple is working on a netbook. I wish I had some special insight or access to Apple’s inner workings so I could confirm or deny those rumors, but all I can say is I hope so. To get everyone on the same page here, let me explain exactly what I mean when I say an Apple netbook. I’m talking about a small device with a ten inch (or smaller) screen, small physical keyboard, reduced hardware specs, running a legal, full copy of OS X. I’m not talking about an iPhone, or some kind of iPhone/MacBook hybrid. While it’s not clear if Apple would consider such a device or not, what is clear is that people would buy a $500 MacBook Nano (or NetMac?) in droves.

Part of the problem here is that this was thought up by someone other than Apple. Apple likes being the innovator in whatever it does, and it seems to me that they don’t like following where someone else has already gone. Fair enough, but by not offering what people are asking for, Apple is, in a roundabout way, promoting the download of illegitimate hacked versions of OS X. They know that people want it, but they don’t know how to give it to them.

One of the arguments I’ve read against a Mac netbook is that offering such a device would cannibalize sales of their other notebook offerings. However, what that argument misses is that netbook sales are in a different sector than regular notebook sales. All of the traditional arguments against netbooks still apply, and are still just as valid: cramped keyboards, small screens, underpowered hardware. The people who are buying netbooks are not trying to decide between a notebook and a netbook; they are not going to buy a notebook at all. The decision would not be whether to buy a Mac netbook or a MacBook, it would be whether to buy a Mac netbook or some other brand of netbook.

Traditionally, Apple has been strong in the education market. Netbooks are perfectly suited for use by students. The smaller keyboards fit their hands, and the small size and light weight of the netbooks make them easy to bring between classes, sit on a desk with room for writing, and tote back and forth from home. Without an Apple offering in this market, I would not be surprised at all to hear of increasing numbers of schools purchasing nine inch Acer netbooks.

The bottom line in the Mac netbook debate is that the market is ready for the product to appear, and willing to purchase it when it does arrive. What Apple needs to figure out is how to make a $500 computer that’s not junk. Dell, Sony, and HP seem to have figured it out. When, and if, Apple does release a netbook, one thing is clear: they will make sure that it redefines the entire netbook market. Then I’d imagine the little “NetMacs” would sell by the truckload.


Jim Sanford

May 13, 2009

My Mama said if I’d be good, she’d send me to the store.
She said she’d bake some gingerbread if I would sweep the floor.
She said if I would make the beds and watch the telephone,
She would send me out to buy a chocolate ice cream cone.

And so I did, the things she said
And she made me some gingerbread.
Then I went out, just me alone
And I bought me a chocolate ice cream cone.

Now on the way home, I stubbed my toe upon a big ol’ stone.
Need I tell you that I dropped my chocolate ice cream cone.
A little doggy came along and took a great big lick.
So I hit that little doggy with a great big stick.
And he bit me, where I sit down.
And he chased me all over town.
And now I’m lost…can’t find my home.
All because of a chocolate ice cream cone!

Good night Papa, we love you.

Jan. 26, 1929 - May 9, 2009

DiggBar This

April 10, 2009

I’ve put a lot of work into this site. I’ve put thought into how I want the URLs to look, how the layout should feel, and since the site has my name on it, I wanted it to be good. I created a custom theme for the site which will purposefully only work here, and I’ve even done a bit of personal “branding” I suppose with the jb{ logo.

So, when I saw that Digg had decided to take a page out of 1999’s web playbook and start framing sites inside of their DiggBar, I was more than a little annoyed. There may be only six people who read this site on a regular basis (Thanks guys!), but this site is entirely mine. The DiggBar removes the URL from the site you are reading and adds its own custom shortened URL, making it hard to bookmark, or really even remember where you are.

John Gruber, of Daring Fireball fame, whose work I admire, evidently shares my feelings about the DiggBar, and posted some PHP code to block it. I have taken his example code and created a tiny WordPress plugin that places his code in the correct spot in the site. The goal is to make it super easy for the thousands of WordPress installs out there to remove this particular menace.

Please let me know in the comments if you have any problems.

UPDATE 04/14/2011 - Since I no longer use PHP for this blog, I’ve dropped support for the DiggBar block. However, there are several great alternatives out there. Also, is the DiggBar even a thing anymore? I’m not sure.


NKOTB

April 9, 2009

When I was younger, 20 years ago or so I suppose, the New Kids On The Block (NKOTB) were synonymous with the cheesy pop music that was taking over our airwaves. They weren’t just a pop boy band, they were The pop boy band. I remember the thing was that if you were a guy and an NKOTB fan, it instantly meant you were gay. For an adolescent boy just shy of his teens, that meant there was absolutely no way this band’s music would ever willingly land on his ears. Rumors circulated regularly about the NKOTB being caught, red-handed no less, in one disgusting sexual tryst or another. One in particular that I remember claimed one of the members had to have his stomach pumped because he was sick, and they found that it was full of sperm. That was the stigma attached to this band.

So, I grew, and I found that I enjoyed classic rock and grunge much more than anything else. As I’ve grown I’ve learned to appreciate many different types of music, everything from classical to the Beastie Boys. My iPod will shuffle from the Grateful Dead to an old Prodigy mix. However, I’ve never been a fan of pop music. Boy bands went away for a while, and for the most part I forgot all about them.

I was kind of surprised when my wife told me this blast from the past was coming to Des Moines. When she later told me that she wasn’t going because she couldn’t find anyone to go with her, I felt bad for her, and told her that I’d take her. What I found at the concert surprised me.

The NKOTB are five guys from Boston who work really hard to put on a good show. They are talented singers and dancers, and they know how to keep the crowd entertained. They really are a great band, and when I look back at what they had to go through, I think that they were probably unfairly ridiculed. We were lucky enough to get seats on the (nearly empty) stadium floor, and were able to get within six feet of the band as they performed a few of their songs in the center on a rising platform. Getting that close to a person, being able to look in their eyes and have them look back at you, makes it hard to be critical. Much harder than listening to something on the radio or seeing something on MTV. I had an important revelation as I watched my wife go nuts over these guys.

I can’t do what they do.

I can neither sing nor dance. I’m fairly certain that if I ripped off my shirt in front of a crowd I’d be politely asked to put it back on again. How can I be so prejudiced against a group of guys who can do things that I cannot? I really can’t. The only thing that I can say is that I’m not a fan of their pop-sugary style of music. I can’t say that any of them is any less of a man than I am; to be honest, in some ways they are much more, for having the guts to do what they do in the first place. The public eye can be vicious.

So, what did I do that night? I looked a legend of the past in the eye, and gave them my respect. I don’t know if they are going to have a lot of success with this comeback, but I hope that things work out for them.

I also take it as a sign that I have grown to overcome the adolescent prejudices of my past. I can appreciate the work and talent that goes into making the music and the shows, even if I don’t care for the style of music itself. I’m not about to replace my collection of bootlegged Dead music with pop, but I’m glad that I’ve grown to the point where I feel that I’ve come into my own. My own preferences, for my own reasons.

Hey, life’s too short for hate. And, just as proof, here I am with the beautiful woman who brought me, who loved the show so much she started screaming like she was twelve years old again:

Happy Birthday, babe.


Spit and Polish

March 31, 2009

After spending a week with Linux as my sole computer, I find it very refreshing to come back home to my Mac. FedEx says my wife’s PC should be here tomorrow, so she can go back to Word 2007 and I can have Mactimus Prime back. It’s not that I didn’t enjoy working with Linux, I did, but I’ve found that once the geeky pleasure of discovering something new wears off, there are problems.

I gave up on using Lotus Notes as any type of information manager a few months ago, so anything that comes in as an email is treated either as a to-do or as reference material. To-dos are deleted when they are finished, and reference material is printed to PDF for indexing by the operating system. Or, it would be indexed if Linux had a decent indexing system. Tracker, the default indexer installed with Ubuntu, seems incapable of properly indexing the large PDF files that I accumulate. I’ve done quite a bit of research lately concerning high availability, especially where it deals with DB2, but a search for the term “HADR” in Tracker finds nothing. Maybe there’s a setting that needs tweaking, maybe I need to read the man page a little deeper, but all I know is that it doesn’t work quite right yet.

Google Desktop finds everything, but then there’s its strange need to look and act like Gnome-Do. That bothers me to the point that I wind up uninstalling Google Desktop, the only tool that actually works; I just can’t have two applications performing the same function. If Gnome-Do would actually index the contents of the files, it’d be perfect. Google Desktop just looks and acts out of place. I like Tracker because it looks like it belongs in Gnome, but looking like it belongs and actually performing its intended tasks are two different things. Google Desktop finds what I’m looking for, but it opens Firefox, and I feel like I’m browsing the web, not my personal files. The conceptual model just doesn’t work… not for me.

Contrast this with my Mac. Every file that is written to the hard drive is automatically indexed by Spotlight, and is immediately available from search. Spotlight is written into the kernel, deeply ingrained in the OS X operating system. Linux (and Windows, I believe?) search and indexing systems are add-on applications that search for new and updated files to add to their index on a schedule. There’s no integration, no unification of the application and the operating environment.

Desktop search is only one area that needs attention. Another is the location and organization of the menus. In Gnome, there are two task bars, top and bottom. The top menu bar contains the three main menus for finding and launching installed applications, bookmarks and file system manipulation actions, and settings for personal preferences and system administration tasks. In OS X, the nested menus have been replaced with applications. The Finder is an application for locating and launching applications and manipulating the filesystem. On the surface, it seems like Finder is on par with Nautilus, the Gnome file manager, but this is not the case. The Finder, for all its faults, is a one stop shop for finding and launching applications, documents, music, pictures, and anything else that you can think of. Gnome uses a menu to launch applications, and Nautilus for finding and launching documents. Gnome also uses a pair of menus to change system and personal preferences. OS X has the System Preferences application, which is also integrated into Spotlight with a clever highlighting feature that “spotlights” the preference you are looking for.

Finder and System Preferences are two applications that highlight the focus on simplicity inherent in the design of OS X. It comes down to limiting choices. If I want to change a preference in OS X, I launch System Preferences. In Gnome, I launch… well, it depends on what I want to change. I need to spend some time reading through the menu options to find the appropriate application that will change the preference I’m thinking of.

Then there’s the rest of the top Gnome task bar, which absolutely drives me nuts. For one, Gnome allows you to add application shortcuts to the task bar, and ships with Firefox, Evolution, and the Gnome help application there by default. So now you have both drop-down menus and application launchers side by side. Gnome allows you to “lock” a menu item in place, presumably so that when other options are added, they must be added around the item that is locked. This has the unfortunate effect of leaving some menu bar items completely out of order when moving from the 13” laptop screen to the docked 22” monitor. The items that are locked in place stay where they were relative to the smaller screen size, while the items that are not locked expand with the larger screen and end up on the opposite side. So, to remedy this, I have to go to each item, unlock it, and then drag it to where it is supposed to be.

Of course, this raises another interesting question… just where are they supposed to be?

I normally remove the item with my name on it, since it duplicates functionality available under the System menu. I leave the calendar where it is, and love the weather available there. The volume is next to the weather, as is the network manager item, and both seem perfectly at home. Then there’s the notification area, and I’m never sure what will show up in it. Pidgin lives there, as does the Rhythmbox music player, and occasionally Gnome-Do pops up there long enough to let me know about some new tweets on Twitter. The new unified notification system reminds me of Growl, and I love that it is included in the operating system by default now. However, when I launch Pidgin, I’m given yet another icon on the task bar, and I have no idea what it does. It looks like a small envelope, and seems to be there simply to annoy me.

Most of the things in the top right hand side of the task bar seem perfectly at home there, but the notification area doesn’t. I might move it down to the bottom right task bar, in the place now occupied by the trash and the desktop switcher, but I’m afraid it might clutter up that area too. In the end, the notification area is more clutter than anything else.

The bottom task bar includes the trash, desktop manager, and an icon to click if you want to minimize all of the open windows to see the desktop. That button must come from Windows. The rest of the bottom task bar is used for window management, with an icon for each open window much like the Windows task bar. This bothers me because I have a notification area where applications can live, and I have the window management task bar where applications can also live. So, when I want an application, at times I have to stop long enough to check both the top and the bottom of the screen. Or, Alt-Tab, or Super-Tab, or, if it’s not in there maybe it’s a daemon, or maybe it crashed.

Personally, I’d like to see a more interactive bottom task bar, one that incorporates both notification capabilities and window management. Eliminate choices so I don’t have to make choices and decisions about my computer when I should be thinking about what I’m doing. I have to stop using my tool at times to interact with the tool, instead of simply using the tool to accomplish my intended task.

Gnome has some things that I’ve really enjoyed. I love TomBoy. I love how Gnome has the built-in ability to mount nearly any connection and browse it as a filesystem. I love, love, love Gnome-Do, and I wish it wouldn’t have tried to be something it’s not with that ridiculous dock theme. Gnome-Do, you excel at being a keyboard application launcher for Linux, where I spend all my time in the terminal. Be what you are!

I’m looking forward to getting my Mac back in the next couple of days, and I’m really looking forward to firing up Xcode and getting to work on my app again. I may talk to my boss about using my Mac at work as well; that would solve so many problems for me. For now, I’ll keep using Linux. The major problems, stability and capability, have been overcome. Now it’s time to concentrate on spit, polish, and deep integration for a cleaner, less cluttered user experience.


Tough to Turn Down

March 22, 2009

I’ve been using Linux for nearly a decade now. I first had a guy I met in a Navy school install it on my old IBM desktop. Back then, it was very hard to find the right drivers, and just about impossible to get on the Internet, seeing how almost every modem that shipped with a PC was a WinModem. X11 configuration was error prone to say the least, drivers for the sound card were hard to come by, and out in the rural English countryside where we lived, broadband was almost unheard of. Installing software could be a nightmare. Say I wanted to install a music player to listen to my CDs. The music player would have dependencies, certain libraries that needed to be installed, so I’d go and download the dependencies and try to install them, only to find out that the dependencies had dependencies! So, I’d download further dependencies, and eventually I’d be able to listen to my music. And then I’d try to launch something else, only to find out that in fulfilling the dependencies of the music player, I’d broken the dependencies of the other application that used to be installed and working.

However, on the rare occasions when it did work, Linux was fascinating. It’s been amazing to watch Linux grow over the years, and even more amazing to see the leaps and bounds in the user interface and usefulness. The fact that right now I’m writing this on my laptop, wirelessly connected to the Internet on my WPA2-secured network, listening to my Grateful Dead collection shuffle, while keeping four other virtual desktops open with several other applications running, is nothing short of amazing. I’ve been critical of Linux in the past, but within the past six months, Linux on the desktop has really brought it all together.

For example, I’ve got an Airport Express on my home wireless network, with an HP PSC 1500 attached to its USB port. Ubuntu automatically detected the printer, downloaded and installed the drivers, and allowed me to print a test page just now with no frustration at all. I can’t even do that with OS X, not unless I’ve installed several gigs of printer drivers I don’t need beforehand. Scanning works much the same way… plug it in and use it, that’s it.

Ubuntu has finally fixed the dual-monitor problem that bugged me for so long, and Mint (an Ubuntu derivative) sleeps and wakes up almost as seamlessly as my Mac. Looking at my desktop now, I’ve got a ton of free software in a system that works like I want it to. Given that all of the software I’m using (with the exception of the IBM apps I need for work) is free, going back to my Mac when Rhonda’s Dell comes back may be harder than I initially thought. Time will tell, but the free Linux desktop is becoming harder and harder to turn down.


Adamo - Apple Pushes the Industry Forward

March 18, 2009

I almost feel sorry for the other hardware manufacturers. No matter what they do, no matter what they come out with, they seem to be forever following in Apple’s footsteps. Such is the case with Adamo from Dell, a clear shot at the MacBook Air.

Adamo uses the same machined-aluminum manufacturing process introduced by Apple with the Air, which has since spread very successfully to the rest of the Mac laptop line. Adamo is marketed as very thin and very light, with an artistic feel to the advertising that seems out of place for “cookie cutter” Dell. In fact, the marketing is almost too artistic, as if they are trying too hard to shed their old image.

The specifications between the two lines are very similar.

             Adamo               MacBook Air
CPU          1.2GHz Core 2 Duo   1.6GHz Core 2 Duo
RAM          2GB                 2GB
Display      13.4”               13.3”
GPU          Intel GMA X4500     NVIDIA GeForce 9400M
Storage      128GB SSD           128GB SSD
Weight       4 pounds            3 pounds
Price        $1999               $2299

As you can tell, this is not truly an apples-to-apples (pardon the pun) comparison. At this price point, the major difference between the Air and the Adamo is the $500 optional SSD. Configured with the 120GB SATA drive, the Air comes in at $1799. The Air is faster and lighter than the Adamo, and includes NVIDIA GeForce graphics.

A more accurate comparison may be with the MacBook, also configured with a 128GB SSD.

             Adamo               MacBook
CPU          1.2GHz Core 2 Duo   2.0GHz Core 2 Duo
RAM          2GB                 2GB
Display      13.4”               13.3”
GPU          Intel GMA X4500     NVIDIA GeForce 9400M
Storage      128GB SSD           128GB SSD
Weight       4 pounds            4.5 pounds
Price        $1999               $1749

The MacBook’s case is larger and heavier, but the half-pound difference in weight is offset by a big difference in CPU speed: 2.0GHz versus 1.2GHz. Fortunately for Dell, the Adamo is not about internal hardware specs. It’s about trying to catch up with Apple in industrial design.

The screen looks gorgeous; I love the edge-to-edge glass design. The Adamo screen has a slightly different resolution from the standard 1280x800, coming in at 1366x768. Dell has done a great job with the Adamo. Unfortunately, it’s still not a MacBook killer, simply because it’s still not a Mac. Great industrial design is only one part of the puzzle of what makes a Mac a Mac. The other vital piece is OS X. With OS X and the Mac, Apple has created a machine that drifts into the background, gets out of your way, and lets you do what you set out to do. Adamo ships with Windows Vista Home Premium 64-bit. No matter how great the hardware is, if the software is not intimately tied to it the way only a company that creates both can manage, it’s still just another PC.

People may initially buy their first Mac because of the design, or the halo effect of the iPod. They buy their second Mac because of the experience.


Communications

March 5, 2009

Ask any mechanic, machinist, or carpenter what the single most important thing that contributes to their success is, and they are bound to tell you “having the right tool for the job”. Humans excel at creating physical tools to accomplish a certain task. From a hammer to drive a nail to the Jaws of Life to pry open a car, the right tool saves time, money, and frustration. It’s an interesting contrast, then, that people have such a hard time with conceptual tools. Software designed to accomplish a specific task, or several tasks, is often forced into a role that it does not fit, making the experience kludgy, like walking through knee-deep mud.

I’ve found this problem to be especially prevalent in business environments, where the drive to “collaborate” brings many varied and sundry applications into the mix. In the hurry to adopt the next great collaboration tool, nobody stops to ask the single most important question: what do we need this software to do?

To communicate effectively in a business environment, it is imperative to use the right tool for the job. To determine what the right tool is, you first have to ask yourself exactly what the job is. Do you need to ask a co-worker a quick, immediate question? Is there a more detailed project question that you need to ask, where maybe you want the opinions of a few others too? Do you have something that you need to let a large group of people know, maybe even the entire company? Each of these tasks is best suited to a different tool. Unfortunately, I’ve most often seen each of these tasks shoe-horned into email.

Email is a personal communications medium, best suited for projects or questions that do not need an immediate response. There have been many times when I’ve gone through my inbox and found something that didn’t grab my attention when it was needed, and by the time I looked at it, it was past due. Email is asynchronous: ask a question, then wait for a response whenever the receiving party has time.

If you do need an immediate response, the best tool is instant messaging. IM pops up and demands the receiving person’s attention. When requesting something via IM, the person on the other end has to make a decision either to ignore you or to answer your question. Long explanations are not a good fit for IM, but short, two or three sentence conversations are perfect for it.

When considering sending something out to the entire company, keep in mind that email is a personal communications medium. Company-wide email blasts are impersonal, and for the most part require no action on the part of the receiver other than to eventually read them. A better tool for this job is an internal company blog, accompanied by an RSS feed. RSS is built into all major browsers now, and could be included in the standard operating system build for PCs. Every single time I get something from our company green team or an announcement from the CEO, I can’t help but think that a blog would be the best place for such formal, impersonal communications. A blog can be archived for searching, new announcements broadcast through RSS, and best of all, read when the intended receiver has the time and attention to best appreciate the content. To me, company-wide email is the same thing as spam, and for the most part, is treated as such.

One other form of internal communication that is far too often mangled beyond recognition is shared documentation. Technical documentation especially suffers from document syndrome: a separate (often Word) file for each piece of documentation, spread across several different directories on Windows shares, or worse, on the local hard drive of whoever wrote it. Such documentation should be converted to a wiki as soon as possible. If you are writing a book, I hear Word is a fairly good tool to use. If you are writing a business letter, again, great tool. If you are documenting the configuration of a server, a word processor should not be launched. Word (or OpenOffice) documents have a tendency to drift, and are difficult to access unless you are on the internal network. Trying to access a Windows share from my Mac at home through the VPN is something that I’ve never even considered.

A wiki is perfectly suited for internal documentation. It is a single central place to organize all documents in a way that makes sense, is accessible from a web browser, easy to keep up to date, and most importantly, searchable. Need to know how to set up Postfix? Search the internal wiki. Need to know why this script creates a certain file? Search the internal wiki. Everything should be instantly searchable. Then again, perhaps search is not the most important feature of a wiki; perhaps the most important is the ability to link to other topics according to context. I can get lost in Wikipedia at times, jumping from one link to another as I explore a topic. The same thing can happen with internal documentation. This script creates this document for this project, which is linked to an explanation of the project, which contains links to other scripts and further explanation. Creating the documentation can seem time consuming, until you are there at 2AM trying to figure out why a script stopped working. Then the hour spent writing the explanation at 2PM doesn’t seem so bad.

One of the first things I did when starting my job was to set up both a personal, internal blog and a shared wiki for documentation. I used Wordpress for the blog, and MediaWiki for the wiki. Both are excellent tools, and very well suited for their purpose. If I were a manager, I’d encourage my team to spend 15 minutes or so at the end of the day posting what they did for the day in their personal blog. Could you imagine the gold mine you’d have at the end of a year or so? Or the resources for justifying raises. Solid documented experience with products and procedures, what works and what doesn’t. The employee’s brain laid out.

Internal moderated forums are something that I haven’t tried yet, but I can see benefits there as well. They’ve been used on the Internet for years, and I can imagine great possibilities, especially for problem resolution. How about a forum dedicated to the standard OS build for PCs? Or one for discussing corporate policies?

Blogs, wikis, forums, and IM are tools whose tasks are far too often wedged into email. If an entire organization begins to use the right tool for the job, the benefits will soon become apparent. Then you’d wonder why you ever did it the other way at all.