Tuesday, March 20, 2007

Stay Hungry, Stay Foolish

The full video of Steve Jobs' now famous commencement address to students at Stanford.

This has been a truly inspirational speech for me, sort of a go-getter message that I refer to whenever the chips are down. Some ideas expressed by Steve in this speech are simply outstanding and definitely practical, albeit in raw form. The crux of this awe-inspiring speech:

  • Listen to your heart
  • Chase your dreams
  • Don't live someone else's life (because your time is running out)
  • Define your own success
  • Stay hungry for more
  • Stay foolish to fallacy and a heavy head

I wish I had the guts that Steve Jobs had, to just let go and follow my dreams. I hope I can chuck off run-of-the-mill jobs and go join a startup with a fresh perspective, work my @$$ off, get noticed and satisfy my karma (duty). I hope that day is not far off.

Myopic eyes (Microupdate 1)

Myopia, or nearsightedness (also known as short-sightedness), is a refractive defect of the eye in which collimated light focuses in front of the retina when accommodation is relaxed. Those with myopia can typically see nearby objects clearly, but distant objects appear blurred.

Savvy? I have had myopia since the age of 9 and have been wearing corrective lenses (eyeglasses) ever since. As it stands, my degree of myopia is -6.5 (when you get to -16 or lower, you are close to stone blind without corrective lenses). Because I depend so much on my glasses to lead a normal life, I generally take good care of them. It so happened that, out of sheer bad luck, they slipped off my nose and one of the lenses shattered into innumerable pieces. And there I was, gaping, with no glasses and no spare set lying around. I could see things only half as clearly as your average Joe.

I rushed to the nearest optician to get my glasses fixed, and he promised to get back to me in 2 days, since the day after was a national holiday. The immediate implication: no computers, TV, reading or travel for the next 2 days. It was hell on earth, believe me, guys. I was able to put up with these 2 days largely thanks to my iPod (thanks to the friend who gifted it) and another patient friend who kept me company. I did attempt to watch TV, with little success. I tried squinting and using my laptop, with slightly better success, but I couldn't hold out too long.

I even had to take a day off from work, and I promised myself that I would have at least one spare set of glasses made so that such exigencies are appropriately handled. I finally got my glasses repaired today (oh, what a relief!) and decided to blog about it. I am considering corrective surgery sometime in the near future so that I don't have to worry about the EVIL effects of myopia for times to come.

P.S. I call this Microupdate 1 because this is the first of my posts that doesn't resemble a post (duh!) and doesn't have a fully developed theme; it's more of a "quick write and publish" variety. I hope to post more such micro updates as and when something bloggable (?!) happens.

Monday, March 19, 2007

Dilbertization of IT !

Information Technology (IT) has, over the years, been a sought-after industry where creative juices kept flowing. Every bright high school student wanted to get into IT because it paid well, allowed innovation and creativity, and made the workplace fun. IT began in garages: small units that targeted a specific area and allowed their employees to turn out new and innovative inventions and work on interesting projects.

Talk to someone who has worked in IT for decades and more often than not, they'll regale you with stories of the "good old days," when the workplace was lively and creative juices flowed.

Nowadays it's a different story, they usually say, and they place the blame along any number of lines: outsourcing, off-shoring, cost-cutting, IT commoditization, reactivity where there was once proactivity, not to mention the shoddy desks in a dusty room at the end of the hall, with pre-1990 IBM desktops that stay there because the company has a contract with the supplier.

In the simplest terms: too many IT workplaces have become Dilbertized: micromanaged, bureaucratic and creatively stifled. It has become an environment where busywork is praised and morale is low.

IT isn't fun anymore, and while a lack of fun at work may not seem worth stopping the presses over, the long-term effects of depriving a field of appealing work may very likely look like this: Students are turning away from computer science at an alarming rate. There's a huge talent shortage across the entire field, and, in confidence, enterprise IT workers say they'd probably choose a different career path if they could go back and start over again.

Yet not as many people are speaking about how to fight this, to adapt to the market and bring value back to IT that simply can't be sent elsewhere when cost-saving impulses set in.

To blame outsourcing, off-shoring and the dot com bust is to miss the point. If you know the value chain is changing, and you continue to teach people to do things that are quickly becoming commoditized, you are doing them a disservice. It's better to figure out where the value is being added for employers, and focus on this.

When IT stops interacting with the rest of the company, stops exerting influence by no longer offering innovative technology solutions for business problems, it packages itself in an easily shipped box.

"Outsourcing is a symptom, not the problem. Outsourcing has become such an important factor because when you turn IT into a commodity, it becomes about where you can get it at the lowest cost. It's what we've done to IT that is the problem, which is taking away its chance to influence business."

Let IT be more than firefighters

IT professionals who have worked in the field for a long time often speak of a shift in their work: they have gone from tossing ideas back and forth to build better technology solutions, to fighting fires all day.

There's less emphasis on creativity, and more on maintenance. Tweak this, work on that… In being reactive rather than proactive, everything becomes a crisis. Something always has to be done right now; putting out fire after fire goes a long way toward making IT a less pleasant environment.

Beyond making for an unpleasant work environment for the techies already in-house, this firefighting serves as a warning to potential recruits: "You will not like this job."

The best minds are not making it into the field. To some CIOs, it is a concern; to others it is not. They're losing out on the bright young people coming into the pipeline because people have an impression that if they work in IT, they'll just be fixing passwords all day.

IT needs to get back to showing people how work can be made better through technology, and how technology can be more effective, IT professionals said.

"IT is behind the wall, and business is outside the wall, and trying to exchange ideas across the wall is nearly impossible. We've stopped asking what computers can do for us."

Better managers get better results

Although establishing guidelines and defining expectations is straight out of Management 101, many enterprise IT professionals don't know what is expected of them. Often, they only find out after they've missed a deadline or made a mistake.

People perform best when they have a clear understanding of what is expected of them and how their performance will be measured. Many high performance organizations provide their team members with written expectations and accountabilities. It takes time to write and discuss expectations and accountabilities, but it is time well spent.

People want to work somewhere they can be creative and have opinions, where they can make their ideas work, and where the rules aren't so rigid that they get nailed to the wall every time there is a mistake. People will perform better if they're given some flexibility and the opportunity to make an impact.

An essential part of making people feel better about their work is giving continuous feedback, not just waiting for a scheduled salary or performance review. You have to make people feel good about what they're doing. You don't feel good about going to the dark room at the end of the hall, chained to the help desk phone until it's time to go home.

This is not about becoming more civilized; it's about how the IT career is shifting from a creative, motivating path to a bureaucratic one. Of course there is still room for creativity: new ideas and concepts are popping up on a daily basis, and they are making things better and easier for everyone.

AJAX, BitTorrent: none of these existed a few years ago, and today they are part of the lives of millions. The problem is that most companies nowadays see programmers as commodities, creativity as risk, and planning and careful deployment of systems as expenses.

They have managers who don't know anything about technology, deadlines that are impossible to meet, and no recognition for merit and talent. The consequence is that systems crash all the time, "workarounds" are the rule, and the good professionals are overloaded with work to make up for all the people working with them who don't have a clue.

With such a prospect ahead, it will be no wonder if, in the near future, the best brains go into finance, law or any other profession that offers what IT used to.

The problem is that IT has been taken over by Business School products. They have no grasp of science and no feel for aesthetics; they only have a feel for next quarter's numbers and covering their @$$. This is what Business School teaches: one needn't know anything about an industry in order to manage it; Business Schools build this into their Product. They will never learn a new skill unless it is useful for climbing the corporate ladder. The best thing IT can do is weed Business School product out. Dilbert's boss is hiding in every last one of them.

What can be done to make IT a better proposition

IT needs to be separated into distinct engineering and operations groups.

IT engineering is what the OP obviously favors. Designing new technologies, building better solutions to existing problems, and increasing productivity through these incredible meta-tools we call computers. IT Operations is about taking these technologies, cataloging their shortcomings, and doing what is necessary to implement them and keep them implemented. Engineering is about the introduction of new ideas; Operations is about the constant war to keep those ideas safe from entropy.

Often, these goals are in direct conflict. It is only natural for a solution developer to recognize the shortcomings in their product and want to fix it. It is in the best interests of operations that a stable server not be changed unless absolutely necessary, and then only when the changes have been thoroughly tested, put through miles of red tape and human business process, and signed off on by people whose jobs are on the line if the application goes down.

The idea that you can write a program and be the person who runs it most effectively is a false one in any mission critical application. When there's money on the line, red tape and paperwork is the only way to make sure that it keeps flowing.

So to be successful in IT, we on the one hand need developers who are free to try radical new ideas in an environment that rewards creative solutions to entrenched problems, and on the other hand we need a static environment ruled by business process and red tape, which stifles unproven concepts and chokes creativity. The only solution to this is to separate these groups completely, and have development treat operations as a very stodgy customer.

Too many companies don't realize that this split is necessary to maintain their financial longevity, and have the same people who develop their applications responsible for their day-to-day operations. This situation not only leads to frustrated development staff who feel creatively stifled; it is also, in the long term, project suicide. In-house developers should not only be relieved of the responsibility for running their code, they should in fact not even have logins to the servers on which their code is running.

Professional standards of code release need to apply, too. It's not enough to release code to production via CVS checkout; you need to write an installer with an uninstaller and an upgrade path, just like you would for commercial software. It's not enough to run an ant build on your server via an NFS mount back to the repository; you need to compile a .war or .ear file just like you would for any other customer. As a developer, operations should be your only customer, and your relationship with them should be the same as the one you would maintain with your most valuable and critical customer.

But one person wearing both the development and operations hats? That leads to nothing but frustration, burnout, entropy, and failure.

Cynical view of IT

However, there is a cynical view of how IT works. Some people are convinced that IT is beyond repair and is becoming commoditized like manufacturing, where employees are treated as “resources”, head count for a project is the only criterion for setting deadlines, and firefighting is the only recognizable form of good work. Bad applications that require firefighting get created; the people responsible for the bad design that started the fire then fight it out (because they screwed it up in the first place) and end up being appreciated and rewarded because they were swift in fighting the fire.

Dilbertization is INEVITABLE in any hierarchical organization. There's nothing whatsoever you can do about it.

Its causes are ultimately all within human nature. Starting with the technologists themselves, they're all in competition with one another. Each wants to be recognized as the alpha geek. Furthermore, some are lazy and some are energetic. The lazy ones hate the energetic ones because they make them look bad. The energetic ones hate the lazy ones because they're not carrying their weight. Finally, the TALENTED technologists are repulsed by the thought of being promoted into management, but the inept ones love the idea, as do the closet fascists.

The professional managers, middle-managers, "project managers" (ha!) and other undead minions of all standard IT organizations are just as dysfunctional. Secretaries are sullen, convinced that everyone thinks they're stupid (in some cases, this is astute on their part). Project managers, like the fawning little lap-dogs they are, tell management whatever they want to hear, often totally screwing over their staff by agreeing to ridiculous deadlines that cannot be met.

"Middle managers often know nothing whatsoever about technology and run their areas according to whatever management theory is currently in vogue". Worse, they often rate employees by how well they schmooze, not how well they code. Nepotism is rampant. Other minions, like managers selected to represent users in design meetings, often are in way over their heads and only want to cover their asses and contribute enough to meetings to LOOK as though they've got things under control.

The whole system is completely, hopelessly, irreparably FUBAR. It's a clusterfuck of legendary proportions. The only way to survive within it is to make yourself invisible and get your work done as efficiently as you can, while not getting drawn into any politics, never suggesting anything, and never volunteering for anything.

Personally, the biggest gripe I have with IT is this "Dysfunctional middle management".

My biggest headache at work is the nontechnical people who are mid level managers in the IT department. Some of them come from Finance, others from other non-technical departments in the company.

So what do they do? Instead of running a team like most normal managers they have to meddle to prove their worth and validate their existence. So they do dumb stuff like randomly reassign staff, change priorities every two months, and other PHB-style behaviors. They have no technical competency so they can't help out in the work, so they overcompensate and do dumb stuff.

We work exclusively on CRT monitors at the workplace (ugh! I know, it's from the previous millennium) and I wanted to convince a middle manager who takes IT decisions to provide flat screens for all developers. He starts off with "Hmm, why do you guys need LCDs? What is wrong with CRTs?" I try to give a detailed "technical" explanation of the advantages an LCD display has, only to hear the usual FUBAR about lack of "funds". The biggest irony: this guy works exclusively on a nice 19" LCD display.

I would have hoped that these types of people would have been filtered out of the IT department by natural attrition (new companies, etc.), but they haven't been, and it bothers me endlessly. Most companies have a fairly well-equipped top management consisting of people who have been in the trenches, who have coded stuff somewhere in the past. The biggest problem is that when these guys become C*O's (CEO, CIO, COO etc.), they tend to develop a disconnect from the lower echelons of the organization, where most of the useful work gets done.

Until top management starts understanding the problems faced by developers who sweat it out day after day, and puts in a sincere effort to improve the situation, things are going to remain "status quo ante". A well-equipped, techie top management makes the workplace so much better, stellar examples being Larry and Sergey and the way they run Google. I am sure Larry is not going to question the need for flat displays or any other tools of the trade that help developers turn out better work.

P.S. Google employees generally have dual 21" displays and two workstations to play with; budget is the least limiting factor for procuring and making available any developer tool or toy; and developers are pampered with a host of on-site facilities and sops that make them stick around longer and turn out better work, because they generally love what they are doing. Any Google employee can rise to the position of VP purely on technical accomplishments, without getting into the willy-nillies of managing a team. No wonder Google is a sought-after destination for every IT guy worth his salt.

Tuesday, March 13, 2007

A gentle intro to JSON and its concepts

Out of all the technologies that have emerged in the Web 2.0 world of Ajax powered apps, JSON is perhaps one of the most important. Its implications might not be apparent but JSON is a powerful technology that allows data interchange between traditional applications in an open and human readable format without the quirks of XML.

JSON (JavaScript Object Notation) is a lightweight computer data interchange format. It is a text-based, human-readable format for representing objects and other data structures and is mainly used to transmit such structured data over a network connection using serialization.

JSON finds its main application in Ajax web application programming, as a simple alternative to using XML for asynchronously transmitting structured information between client and server.

JSON is a subset of the object literal notation of JavaScript and is commonly used with that language. However the basic types and data structures of most other programming languages can also be represented in JSON, and the format can therefore be used to exchange structured data between programs written in different languages.

Code for parsing and generating JSON (the latter also known as "stringifying") is available for a whole bunch of languages including (but not limited to) C, C++, C#, Java, JavaScript, Objective-C, the 3 P's (Perl, Python, PHP), Ruby, ColdFusion, Common Lisp, E, Erlang, Limbo, Lua, ML, Smalltalk, Tcl and ActionScript.

JSON is built on two structures:

  • A collection of name/value pairs. In various languages, this is realized as an object, record, struct, dictionary, hash table, keyed list, or associative array.
  • An ordered list of values. In most languages, this is realized as an array, vector, list, or sequence.

These are universal data structures. Virtually all modern programming languages support them in one form or another. It makes sense that a data format that is interchangeable with programming languages also be based on these structures.
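As a quick sketch of these two structures (the record and field names below are invented for illustration), here is how they look as a JavaScript object literal, round-tripped through JSON text. Note that JSON.parse/JSON.stringify came from Crockford's json2.js in 2007-era browsers and are built into JavaScript engines today:

```javascript
// A hypothetical record combining JSON's two structures:
// an object (name/value pairs) and an array (ordered list of values).
var person = {
  "name": "Jane Doe",                 // name/value pair
  "languages": ["English", "French"]  // ordered list of values
};

// Serializing to JSON text and parsing it back recovers the same structures.
var text = JSON.stringify(person);
var copy = JSON.parse(text);
console.log(text);              // {"name":"Jane Doe","languages":["English","French"]}
console.log(copy.languages[1]); // "French"
```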

Before we head into how JSON is useful in the Web 2.0 scenario, we have to elaborate a little bit on how Ajax works and where JSON fits in.

Ajax is a web development technique that makes server responses faster by enabling client-side scripts to retrieve only the required data from the server, rather than a complete web page on each request, which minimizes the data transferred. These requests usually retrieve an XML-formatted response; the XML is then parsed in JavaScript code to render the results (which complicates the JavaScript code).

Need for JSON

Ajax allows Web-enabled applications to perform out-of-band client-server calls, establishing a separate channel on which to send and receive information from remote Web services. In layman's terms, updates and navigation sequences in Ajax applications are done outside the classical client-server context, which entails a complete screen refresh, with the information being received in the background (a.k.a out-of-band).

These application updates that are typically obtained from RESTful Web services, once received in a user's browser, need to be incorporated in the overall HTML page layout, which is exactly where XML proves to be more than a handful. Though the capabilities of most mainstream browsers have increased over the years with the support of scripting languages and plug-in support, many programming tasks still remain difficult or unnatural to perform, one of them being manipulating and processing text, which is typically done using DOM.

The complexity in using DOM lies in its function-based roots, with a simple modification or access to a data tree requiring numerous method calls. In addition, DOM is known for differing implementation details among various browsers; this process takes us to a very elaborate programming scheme with ample possibilities for a breakdown in cross-browser compatibility. So the outstanding question now becomes: How can a markup language be easily integrated into an HTML layout page to accommodate Ajax requirements?

The answer comes in the form of leveraging a common component in all mainstream browsers: The JavaScript engine. Instead of delivering Ajax updates in a format such as XML, which would require the use of a mechanism like DOM to access and incorporate data into a layout, a more natural and intuitive approach would be using a format that fits natively to the aforementioned engine, namely JSON.
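To make this concrete, here is a minimal sketch (the payload and field names are invented) of turning a JSON response body into a live JavaScript object. Early Ajax code often fed the text to eval(); JSON.parse, originally supplied by json2.js and now built in, is the safe equivalent:

```javascript
// Text as it might arrive in an Ajax response body (hypothetical payload).
var responseText = '{"user": "mary", "unread": 3, "folders": ["inbox", "spam"]}';

// The risky old way was eval("(" + responseText + ")"), which executes
// arbitrary code; JSON.parse accepts only well-formed JSON data.
var data = JSON.parse(responseText);

console.log(data.user);       // "mary"
console.log(data.folders[0]); // "inbox"
```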

Now that we have addressed the place JSON has with respect to XML and Ajax applications, let's take a closer look at the technical details behind JSON.

The first thing you should realize about JSON is that it remains a simple text format, just like XML, which is relatively easy to read and inspect with the naked eye. At the syntax level, what starts to set JSON apart from other formats is the small set of characters used to separate data: double quotes ", brackets [ ], braces { }, colons :, and commas.

This listing illustrates what a JSON payload looks like:

{"addressbook": {"name": "Mary Lebow",
  "address": {
    "street": "5 Main Street",
    "city": "San Diego, CA",
    "zip": 91912,
    "phoneNumbers": [
      "619 332-3452",
      "664 223-4667"
    ]
  }
}}
Human readable? Yes. Easily parsed by machines? Yes. Similar to XML? Largely.

How does JSON manage to work?

The answer is that JSON works because most people don’t really need all that overhead, and because it’s often possible to do really interesting things with really simple formats. The World Wide Web has been churning along for over a decade with a markup language that originally had no standardized specification; these days it has specs, but they’re almost never enforced and are, in fact, usually thrown down to the ground and trampled upon. And it still works.

So HTML is a fast and loose format and it doesn’t have any concept of data types that the average programmer would recognize (though it does, in its own special way, have data types), and what rules it has with regards to what you can stick where are routinely ignored. And yet it works. It works really, really well. It works because most people who are using it don’t really need to do complex things with it. Most people who need markup languages for use on the web just want to do simple things like display some text and pictures. You don’t need a 500-page language specification to do that.

JSON is stricter than HTML in some ways; it expects you to obey the rules, but in exchange it gives you fewer rules to follow. And JSON works really, really well. It works because most people who are using it don’t really need to do complex things with it; most people who need data formats for use on the web just want to do simple things like fetch some data from over there and drop it into this web page here. You don’t need the massive overhead of XML-the-protocol-stack to do that.

There are people who genuinely have more complex needs, and I’m not going to try to say whether one thing or another will suit what they’re doing. But for the majority of us who are lounging around in the big belly of the web, JSON is just fine.


Making use of JSON's data separators may not seem an obvious win at first glance, but there is a fundamental reason behind them: easier data access. As it turns out, the literal notation JavaScript engines use for data structures like strings, arrays, and objects is built from precisely these same characters.

Where this leads us is to a more straightforward approach for accessing data than the alternate DOM technique, but let's take a look at a few JavaScript code snippets to illustrate this process. These snippets access the information in the previous JSON snippet:

  • Name access from JSON: addressbook.name
  • Street address access from JSON: addressbook.address.street
  • First phone number access from JSON: addressbook.address.phoneNumbers[0]

If you have done some type of DOM programming, you will probably notice the contrast immediately, but just in case you haven't, here is an external resource to the Document Object Model, which contains a small example for navigating across data.
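The three accesses above can be checked with a runnable sketch. The payload is the addressbook listing from earlier; the DOM comparison in the comment is indicative only:

```javascript
// The addressbook payload from the listing above, as received over the wire.
var payload =
  '{"addressbook": {"name": "Mary Lebow",' +
  ' "address": {"street": "5 Main Street", "city": "San Diego, CA",' +
  ' "zip": 91912, "phoneNumbers": ["619 332-3452", "664 223-4667"]}}}';

var addressbook = JSON.parse(payload).addressbook;

// Dot notation walks the tree directly; a DOM approach would need a chain
// of calls along the lines of getElementsByTagName(...)[0].firstChild.nodeValue.
console.log(addressbook.name);                    // "Mary Lebow"
console.log(addressbook.address.street);          // "5 Main Street"
console.log(addressbook.address.phoneNumbers[0]); // "619 332-3452"
```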

An added benefit of JSON is its less verbose nature. In XML, opening and closing tags are a necessity just for markup compliance, whereas in JSON all that's required is a simple bracket for closure. In data exchanges comprising a hundred or more fields, this additional XML markup adds to transit times.
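A rough way to see the size difference (the two-field record below is a made-up example) is simply to compare the character counts of equivalent payloads:

```javascript
// The same two-field record in both formats (invented example).
var asJson = '{"name":"Mary","zip":91912}';
var asXml  = '<contact><name>Mary</name><zip>91912</zip></contact>';

console.log(asJson.length); // 27
console.log(asXml.length);  // 52
```

The gap widens as the number of fields grows, since every XML field pays for both an opening and a closing tag.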

Additionally, JSON has garnered the attention of many developers specializing in different programming languages, giving way to libraries capable of producing this format from environments as diverse as Haskell and Lisp to more mainstream options like C# and PHP.


Like many benefits, JSON's less verbose format can cut both ways, and in this sense it lacks a few of XML's properties. Namespaces, which allow the mixing of identical pieces of information in different contexts, are clearly missing from JSON. Another differing feature is attributes: since every JSON assignment is done with colons (:), when transforming XML to JSON it can be difficult to distinguish between what would be considered text between tags (XML CDATA) and the actual values of attributes.

You may also find the actual creation and validation of JSON a little more complex than an average XML fragment. In this sense, it may be that XML has had a head start in terms of developing tools for its processing. Nevertheless, and to ease any worries you may have in this area, we will be exploring some of the most mature JSON developments in our next section.

Is JSON for You?

Like all design choices in software, your requirements will provide the answer to this question. If your Web services consumers will be created in a classical, full-fledged programming environment like Java, .NET, PHP or Ruby, then you can probably make do without JSON. Given the open-ended capabilities of most programming-language environments, which provide you total configuration control, not to mention access to custom libraries, parsers, or helper classes, the difference between consuming JSON, XML, or any other type of Web services payload should be negligible.

If on the other hand, your Web services consumers will be confined to the environment of a browser, JSON deserves serious consideration. Consuming Web services in a browser is not done for the sheer fun of it but rather as a real business requirement. If your manager comes to you and requires one of those "slick Web 2.0 interfaces" that appear to load data without a delay/refresh, then you would be entering the technical realm of Ajax and Web services consumption in browsers.

In these last circumstances, you would be limited not only to the processing environment of a machine located across a network, but to one under the tight control of a random user, restricting even the most resourceful developer to the lowest common denominator for processing text in browsers: the DOM, which, as outlined earlier, is a difficult undertaking compared to accessing a JSON tree.

When you get started with JSON the comparisons with XML are inevitable. Hard core XML fans tend to remark that JSON is redundant and everything it achieves is already implemented in XML. JSON is a better data exchange format. XML is a better document exchange format. Use the right tool for the right job.

Though the x in Ajax stands for XML, and Web services have come to the forefront by the steady use of this same format, it doesn't necessarily mean such an approach is cast in stone. As we have seen, XML has a few drawbacks when applied to Ajax-enabled applications, due to the text-processing capabilities in browsers. In this sense, JSON has emerged to provide a compelling alternative in this same context.



Things at workplace that make a geek happy

There are many reasons to let geeks work the way they want to work. Today they work in every industry. They are the knowledge base, blood and sweat equity of many businesses. They work harder than most. They work longer than most. Their job isn’t a separate “thing they do” while they look forward to going home and relaxing. They eat, sleep and breathe it. They are your systems administrators, your IT team, your programmers, your web developers, your designers.

Anyone who understands how to leverage today's technology to increase intelligence, productivity and efficiency; anyone who stays up nights working to get better at what they do; anyone whose job is their life is a geek.

They are the most important asset your company has. For this reason, it's important to give geeks what they want. The best part is, if you do, they most likely will not leave your company easily, or will think twice, nay thrice, before leaving.

1. Let them work when they want

Flexi-timings are an absolute must for geeks; they are seldom 9-to-5 people. Geeks work almost every moment they are awake. They are online before they go to the office. They are home working after the office closes. They work weekends. They are sometimes even working in their dreams. Employers should understand this and, more importantly, appreciate it. Don't force geeks to work 9-5 if there is no real need other than "company morale." Meetings are one thing, as is socializing with coworkers, but a relaxed office schedule will do wonders for the contentment levels of your employed geeks.

2. Let them work where they want

Allow your geeks to telecommute or work from home. This is better than any perk you and your creative human resources team can come up with. Geeks prefer to have a couch around to nap on if they are tired. Some like no windows; others want to stare out into a city or landscape. At home, geeks’ offices are usually more lived-in, more comfortable and enjoyable than anywhere else in the world. This is because they love what they do, and they do it so much of the time that they need to be comfortable where they do it. Most geeks are fine working from offices, but giving them the option to work from the comfortable confines of their couch always helps.

3. Let them control their lighting

There is nothing more annoying than working in bright crappy fluorescent lighting if you prefer to work in the dark, or vice versa. Geeks usually have sensitive eyes from staring at LCD/CRT monitors for too long. The last thing you want is your geeks to have headaches. Most geeks aren’t very pleasant to work with when they have headaches.

4. Let them wear headphones

Geeks are experts in the art of “focus.” Focusing means removing all unnecessary distractions from the environment and creating a state where nothing else is going on but the problem at hand. The harder the problem they are trying to solve, or the more creative they have to be, the more they need to focus. Headphones, or simply a lack of ringing phones and talking people around, allow geeks to focus much more easily.

5. Do not expect them to wear a suit

Geeks consider arbitrary activities that lack real and meaningful purpose a waste of time and energy. This includes attire. Most companies today are aware of this and even practice casual dress to make everyone more comfortable, but geeks are a special case. Geeks wear stuff they are comfortable in, including but not limited to their old Slashdot tees and faded jeans. Unless you have a compelling reason to dictate otherwise, let them wear whatever they want.

6. Do not make them participate in company events (unless you are sure it is geek-friendly)

Most geeks will not be jumping up and down with joy to attend a company party to celebrate the local football team, unless of course there is beer, and they can hang around and talk to each other about geeky things. Keep this in mind when planning company events. Geeks like to have fun, just not the same kind of fun as your typical non-geek.

7. Do not hold a lot of arbitrary meetings that could have otherwise been handled through email or IM

This one is important. Like I said, geeks need to focus to be happy and productive. Nothing is more of an interruption than someone walking into their space unexpectedly and asking, “Hey, do you have a minute?” The answer is usually going to be a disgruntled “Sure.” The truth is geeks are fine with attending planned meetings (and will happily be there if the meeting is really a necessary one for them to attend in person), but they are usually happiest communicating through email and IM. These forms of communication appeal to geeks because they do not interrupt; polite geeks will even respond with a quick “Hold on a sec, I’m in the middle of something.” Email and IM are recorded, searchable records of conversations. They are efficient and to the point. This also makes geeks happy. Geeks can discuss anything through email and IM and will usually be more willing and thorough in their responses. Face-to-face meetings are important, geeks know that, but I would guess that 90% of conversations and meetings held face to face would be more efficient, and end with happier people, if they were held in a recordable, written, virtual space.

8. Do not make them do anything other than work

This one isn’t completely accurate all the time. Geeks are team players, but they are also easily insulted by being given a task below their level of expertise or outside the scope of their position. They’ll do it, but they won’t be totally happy. This includes: answering phones, taking out trash, shopping for company supplies, “filling in” for a salesperson, or documenting a project they didn’t work on.

I hope this summary helps employers further understand the world of geeks and how to keep them happy. I also hope this helps other geeks out there approach their employers with a list of what they need to work happy.

Disclaimer:: This article was written almost entirely by Nomadishere and got Dugg. I made a few minor modifications and posted it here.