Friday, June 13, 2014

On-Board Engineering - Part 1






I have worked at a few different companies in my time, and one spectrum along which to characterize a company is how well it strives to enable its employees to become productive.

sink-or-swim ⟷ welcome-on-board
In the sink-or-swim organizations, after you are hired you are generally thrown into the deep end and expected to figure everything out on your own. Sure, there are generally people who will help you if you ask, but their patience runs out quickly if you ask for too much help, and woe to you if you ask for help on the same subject more than once. They expect everyone to take fastidious notes while they ramble on with verbal diarrhea.

What also tends to be common is that there are few, if any, written documents on process, methodology, policies, procedures, etc. You often hear excuses like "rules stifle creativity" or "to stay competitive, things change too quickly to document." That last quote was from a company that did not actually have any real competitors.

When these organizations recruit new employees they generally consider "trained workers" a disposable commodity, and their job descriptions and job interviews emphasize a volume of short-term skills rather than depth of experience and potential.

In this style of management, workers are a resource to be exploited.

In the welcome-on-board organizations, after you are hired you are typically assigned a guide or buddy, or pointed to some key documentation that explains everything in a logical manner. When you ask people questions, rather than just answering you verbally, they take the time to show you, or, even better, they send you an e-mail message with lots of links to the kind of knowledge you are looking for. It's not that they spend more time hand-holding you; rather, they spend less time and just give you higher quality answers, in more productive ways. There is no need to take notes, because every step along the knowledge path has the notes built in.

Not surprisingly these organizations tend to have well defined and documented process, methodology, policies, procedures, etc. While in older days it was common to produce much of this knowledge in-house, newer welcome-on-board organizations take an open source attitude towards knowledge, and shamelessly use/refer to existing industry knowledge in the form of web links, YouTube videos, etc.

When these organizations recruit new employees they generally consider "professional colleagues" as an investment in long-term success and emphasize long term education, resourcefulness, and adaptability in their job descriptions and job interviews.

In this style of management, employees are colleagues to be valued.

Beyond Organizations

These characteristics are not applicable to just organizations, but entire communities. In the computer and communications world you may work for one company, but you are likely a member of many different organizations, communities and cultures. Without any hard scientific data to cite, it is my perception that often times the most successful communities are those that practice a welcome-on-board spirit of collaboration. To cite some anecdotal evidence:

Git & GitHub

I cannot speak directly to what it is like working inside GitHub, but from the various articles and YouTube videos I have seen from its employees it seems like a really great, productive, and exciting place to work. However, there is the much wider body of the Git and GitHub community and culture, and that really embodies a welcome-on-board spirit.

Maven

Maven is a fiercely powerful build tool, but it is not only a tool; it is an entire methodology and attitude toward how to manage and build the artifacts of software projects. At the same time, Maven can be a source of enormous stress and anxiety if you do not understand how to use it properly. In a nutshell, I would say that Maven is about 10 to 100 times more complex to master than Git and GitHub. It is not that the technology is bad, so much as that the technology does so many more things around running a successful software project.

Maven has a really great community of people eager to welcome others on-board, and while there is a great deal of documentation out there, it is very challenging to document something so complex well. More recently, one of the great Maven experts, Russel Gold, created a set of videos that is by far the best introduction to Maven I have ever seen. My point is that while there was already an abundance of documentation for Maven, the biggest advance in getting people on-board was a great set of videos created by a clear-thinking expert on Maven.


On-Board Engineering

If you have ever worked as a contractor you will appreciate how important the welcome-on-board process is. In particular, if you have ever contracted with a sink-or-swim organization you will know how frustrating it can be to figure out what is going on, become productive, and produce quality results you can be proud of.

If you are an organization who hires a lot of contractors for short term projects, hopefully you will realize your contractors are of less value to you if it takes them too much time to get on-board with everything they need in the project. If you are an organization that experiences a lot of staff turnover, as is common in many developing nations, then it is pretty much the same as hiring contractors.

If you are a growing community or culture, and you want to keep growing or grow faster, then it is pretty much the same as constantly hiring new employees, which is pretty much the same as hiring contractors. If your community has a sink-or-swim smell to it, it is not likely to attract many people, and those you do attract are going to be looking for greener fields soon.

A welcome-on-board culture is not just an attitude of being friendly and helpful; it takes conscious effort to intentionally architect a welcome-on-board infrastructure.

Scala

While Java was enormously successful, in large part due to its on-board engineering, it was also backed by an extremely successful tech company at the time, Sun Microsystems. A decade later, Martin Odersky brought Scala to the table in the Java ecosystem. Scala is an ambitious endeavor, backed by far fewer resources than Sun had at the time, but it continues to become more successful, not only because of its great science and technology, but also because of excellent on-board engineering.

Java was a relatively easy language and platform to learn. Scala, on the other hand, is a multi-paradigm language incorporating functional programming, something quite alien to many software developers. In a nutshell, getting on-board with Scala is often more of a challenge than with something like Java or .NET.

While the Scala initiative followed many of the Java on-board engineering practices (javadoc/scaladoc, books, tutorials, conferences, etc.), one of the most novel and successful tactics has been the Coursera course Functional Programming Principles in Scala. Other features of Scala that make it easier to get on-board include the emphasis on the REPL (Read-Eval-Print Loop) and IDE worksheets.
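For example, here is the sort of first contact a newcomer can have with Scala in the REPL, with no project setup at all (a minimal sketch; the exact prompt and output text vary slightly between Scala versions):

scala> val xs = List(1, 2, 3, 4, 5)
xs: List[Int] = List(1, 2, 3, 4, 5)

scala> xs.map(x => x * x).sum
res0: Int = 55

Being able to poke at the language one expression at a time, and see the types come back immediately, is a big part of what makes the on-boarding gentler than a compile-and-run cycle.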

Part 2

Part 2 will go into more detail about on-board engineering, with some tangible examples of how to engineer continuous on-boarding of people to technology, methodology, and best practices.

Monday, June 24, 2013

My First QCON

Recently I attended QCON 2013 in New York City and was quite impressed with the conference. I highly recommend it to software architects, developers, managers and even CTOs.


Billed as an International Software Development Conference, I can certainly say it was focused on software development and had an international flavor to it, given that I met so many people from Europe and elsewhere outside of North America. While I have attended many other technical conferences, and JavaOne a half dozen times, what I really liked about this conference was the breadth of topics and the intellectual intimacy available - that is, it was really easy to talk one-on-one with people, experts, and geniuses on a wide range of issues. I especially enjoyed the 'unconference' or open sessions where topics just evolved on the spot.

I opted for 5 full days, two days of tutorials and the three main conference days. I will report on specific topics later, but some of the interesting themes I noted included:
  • Privacy & Security. Given the recent controversies in the news regarding the NSA collecting information, and the revelations by Edward Snowden (and others), there was a lot of chat on that subject, and some people made ad hoc revisions to their keynote addresses (e.g. Bruce Schneier). In a world where 'data science' is progressing rapidly, and databases are getting bigger and bigger, a lot of people were asking questions about the role of scientists with respect to ethics and morals.
  • Functional Programming. I have been working hard to become more proficient with Scala and Functional Programming for about 5 years now, and recently completed Functional Programming Principles in Scala. I am really glad I had that background before attending, because in addition to an entire track called 'Post Functional', the theme of Functional Programming seemed to be all around, and I was able to absorb and appreciate the presentations much better.
  • Polyglot Architectures. There was an entire track devoted to this, and it is a really interesting and controversial subject. What I see is a spectrum from full polyglot solutions integrating dozens of programming languages and technologies, to efforts to minimize this down to a single language and platform for all devices. For example, Paul Snively dreams of a small team of developers building a universal application in OCaml that runs on iOS, OS X, Android, Chrome, Linux, Windows, etc.
This was also my first time in New York City, so I stayed for an extra week of vacation. What an awesome place; if you have never been, I strongly recommend visiting. I even attended a block party hosted by the Bridgerunners of Brooklyn where there must have been 10 or 20 other motorcycle clubs attending. The music and other entertainment were great, if not unique.



Friday, March 15, 2013

Sanity Architect

The Inmates Are Running the Asylum


Alan Cooper takes a stab at trying to explain why so much of our technology seems 'insane' by offering anecdotes of technology failures we all instantly empathize with as frustrating.
Why are VCRs impossible to program? Why do car alarms make us all crazy at all the wrong times? Why do our computers reprimand us when they screw up? All of these computerized devices are wildly sophisticated and powerful, and they have proliferated our desks, our cars, our homes and our offices. So why are they still so dauntingly complicated to use?
Ironically, building computerized products isn't difficult, they only seem so because our process for making them is out of date. To compound the problem, the costs of badly designed software are incalculable, robbing us of time, customer loyalty, competitive advantage and opportunity.  
"He believes that in part, the problem lies in the fact that business executives in the high-tech industry have relinquished their control to the engineers and techies. In the rush to accept the many benefits of the silicon chip, responsibility has been abandoned, and "the inmates have been allowed to run the asylum." The solution, Cooper says, is to harness those talents to create products that will both thrill their users and grow the bottom line"

While I believe Cooper has provided us with some really valuable insight, I have to disagree with him somewhat on this last point. It is a myth to believe that somehow "business executives in the high-tech industry have relinquished their control to the engineers and techies"; while this may be correct in some cases, sometimes exactly the opposite is the problem - sometimes the engineers and techies have relinquished their control to team leads, managers, and business executives.

It is not hard to realize that often CEOs and techies have different perspectives, and consequently often have trouble communicating or making good decisions, either alone or in cooperation.

Software Sanity Architect

CEOs and techies need a mediator, and while generally that mediator would be the Software Architect, we have had Software Architects for decades and there is still an incredible amount of insanity in the software development enterprise, usually resulting in driving the end users of software technology insane.

Polymath in the Job Description

To some degree all Architects need to be polymaths, but the Sanity Architect more so, with special emphasis on
  • Psychology
  • Political Science
  • Anthropology
Basically the Sanity Architect fulfills two important roles, among other things
  1. Analyst - in the sense of psychology, political science and anthropology, in addition to all the other technical stuff.
  2. Therapist - in the sense of psychology, political science and anthropology, but also in the sense of software development methodology, processes and best practices.

Pathological Examples


Muda

In the automobile industry, the Japanese realized long ago that if your factory floor is covered in litter, grease and oil; that if tools are not put away when not used any more; that if your inventory is full of stuff you are never going to use or sell; then productivity and safety are going to be compromised. What some companies do is they have a "muda day" a few times a year where production stops and everyone just cleans up the mess, so production can start again on a clean safe foundation.

In the software industry there is no visible factory floor; most of the production goes on in the minds of software developers and in bits flowing through computer systems and networks. When the CEO walks into a techie's office, the techie may have a clean or a messy office, but the CEO has no real way to appreciate at a glance how much muda there is under the surface of it all, and in almost all cases the CEO has never considered that muda applies to software development too.

One duty of the Sanity Architect is to realize muda is a problem, sit down with the CEO, the techies, and all the people who are stakeholders in the software enterprise, and explain the problem in terms they can all understand - that muda is undermining productivity and safety, decreasing Return on Investment, and increasing Total Cost of Ownership.

Refactoring

Code refactoring can be a controversial subject. While some team leads, managers and executives support the idea, there are startling numbers who do not. Often if a lowly code monkey wants to spend some time refactoring code they cannot get permission, or if they do so without permission they get reprimanded.



Sadly the result of insufficient code refactoring is more muda. The productivity of the code monkeys drops, and management responds by complaining about the software development team, demanding they work harder, or by off-shoring software development to a cheaper labor force.

Another duty of the Sanity Architect is to find specific examples of muda, such as messy unmaintainable code, show how much it is costing the business, and show how everyone can profit by applying the correct amount of code refactoring. In addition to showing everyone that there are best practices around this that can be supported with methodology and process improvement, the Sanity Architect also has to be a:
  • Psychologist who can diagnose the organizational and human dynamics that led to the insanity, and then prescribe and administer the necessary therapy.
  • Political Scientist who can analyze if there is too much politics that led to the insanity, and then act as an ambassador to prescribe and administer the necessary diplomacy.
  • Anthropologist who can research the deeper issues and identify latent causes to the insanity, and then act as philosopher to prescribe and administer the needed insight and opportunities for innovation.

Influence



In his presentation, Dan Pink makes the case that only 10% of the population are recognized as being in sales, but that the other 90% are also involved in sales every day, every time we have to sell our ideas or positions to another person. He offers three key qualities for being successful in selling:
  1. Attunement - being attuned to the perspectives of others
  2. Buoyancy - staying afloat in a sea of rejection
  3. Clarity - articulating your message as clearly as possible attuned to the perspective of who you are selling to; defining problems your listener never knew they had.
Dan cites numerous studies that show a strong correlation between how much power a person has, or believes they have, and how attuned they are to other people's perspectives. Largely, the more power someone has, or believes they have, the less attuned they are to other people's perspectives - they simply don't care. The less power someone has, or believes they have, the more attuned they are to other people's perspectives. Ultimately those without power are the most attuned of all - because they have to be.

When I worked at Motorola they spent a large amount of time and money giving us 'Influence Training' - they cited that autocratic decision making does not lead to quality innovation. Basically, influence relied on consensus decisions where everyone was equal in power and contributed by selling their knowledge and ideas to the decision-making process. In particular, I was a member of the Process Improvement Process Committee :-)

More than anyone, the Sanity Architect has a duty to define problems the enterprise never knew they had. If they are successful then everyone will realize they all have the same duty, and the enterprise can run a successful process improvement process. Not having a successful process improvement process is a guarantee of insanity.

Finding Software Sanity Architects

OK, I just invented the term 'Software Sanity Architect' so it is not a recognized discipline yet. But as in Field of Dreams "if you build it, they will come" so in the software culture 'if you define it, they will come.'

Monday, February 4, 2013

Productivity

Productivity is being able to do things that you were never able to do before.
- Franz Kafka (Brainy Quote)

Even though worker capacity and motivation are destroyed when leaders choose power over productivity, it appears that bosses would rather be in control than have the organization work well.
- Margaret J. Wheatley (Brainy Quote)


The economy has become seriously unbalanced. Its growth has not been driven by investment or by overcoming Britain's long-standing weaknesses in investment and productivity, particularly skills. Instead, there has been a binge of debt-financed consumer spending.
- Vince Cable (Brainy Quote)

Productivity is never an accident. It is always the result of a commitment to excellence, intelligent planning, and focused effort.
- Paul J. Meyer (Brainy Quote)

A wonderful emotion to get things moving when one is stuck is anger. It was anger more than anything else that had set me off, roused me into productivity and creativity.
- Mary Garden (Brainy Quote)

America's growth historically has been fueled mostly by investment, education, productivity, innovation and immigration. The one thing that doesn't seem to have anything to do with America's growth rate is a brutal work schedule.
- Fareed Zakaria (Brainy Quote)

1970 Programming Productivity

When I first started to learn to program computers in 1970, I was 12 years old, taking the Vancouver School Board's first mathematics class with computer programming added to the curriculum, and my productivity was very, very low. This had nothing to do with the fact I was 12, or the fact that I was learning to program computers for the first time - it had everything to do with the methodology, technology, and science I was using.
 

Methodology


In those days the methodology was that before writing any code, you had to first create a Flow Chart of the problem you were going to solve. This was fun at first because it meant drawing boxes on paper with lines and arrows - especially fun when you are 12 years old and like to draw. After a few months I had my first real argument or debate with my favorite mathematics teacher. I stopped creating Flow Charts and simply started writing the code. I argued that Flow Charts were unnecessary and he argued they were necessary. The argument was settled when I realized that I was already a better programmer than my teacher, that the Flow Chart was for his sake not mine, and that without the Flow Chart he could not understand my code well enough to grade me. It was like writing in the answer on a math test without showing how you got the answer. The original debate would not have been so protracted if he had simply stated up front that I was already a better programmer than he was. One other thing: our school was very good at forcing students to come up with their own life realizations - and I thank them for it.
 
What I want people to appreciate is that writing Flow Charts in no way increased my personal productivity; in fact it lessened my productivity by taking time away from just writing code. A key point here is that if you have to keep stopping to explain things to your teacher or boss, you become less productive. Imagine if your boss has 10 people reporting to them, and they each have to spend 10% of their time explaining what they are doing for the sake of their stupid boss. You have now lost a whole person's worth of productivity. Can you appreciate now why Scott Adams' Dilbert cartoons are so enormously popular? The solution to this problem is one of the following:
  1. Get a smarter boss who knows what you are doing.
  2. Get a more trusting boss who trusts you know what you are doing.
  3. Get a more effective boss who knows when you do not know what you are doing, and gets you the training you need so you are both confident you know what you are doing.
The bottom line is that methodology is a key pillar of productivity, and that everyone needs to be on the same page and trust each other to be on the same page.

Please don't take away from this that Flow Charts are not important, because with very large complex projects they do become important, and notations like the Unified Modeling Language are used to articulate that complexity.

Technology 


In those days the technology was that you would need a really good pencil, and you would use it to write your program on Optical Mark Cards - each card was one line of the program. This was very tedious, mistake prone, and time consuming. When you assembled a deck of cards with your program you would wrap it with a piece of paper (that had your name on it) and a rubber band. You would put that in a box in the corner of the classroom. At the end of the day, the School Board delivery truck would pick the decks up and take them to one of the three schools that had a computer. They would run the deck of cards through the card reader, then re-wrap it with the output from the printer and your name, and send it back to my school. If I was lucky I would get the results back the next day, but more often it was two days later, sometimes more. Given that 12 year old children want instant gratification, I was intensely patient to put up with this for years.

Once a week I took the bus in the evening to spend time at one of the high schools that physically had a computer. Given that I could now put the cards in the card reader myself and get the printout instantly, and assuming it still took me 10 minutes to write a program on cards, I was then 288 times as productive as before, since on average it took the school board 2,880 minutes (about two days) to turn around one programming experiment with the delivery truck.

In reality, I would never send just one programming experiment via the delivery truck, but I want you to appreciate how the right tools and technology affects productivity.

On quiet nights, sometimes I was the only person and I could use the teletype connected to the computer, instead of using optical mark cards. Unless you have used optical mark cards, you would not believe how much more productive it is using a keyboard and getting instant feedback from the computer when you make a mistake. I would have to say, in this mode, my productivity was 10 to 20 times more than marking cards and putting them through the card reader. It was also way less wasteful of paper.
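To make the arithmetic concrete, here is a minimal worksheet-style sketch in Scala (the language discussed later in this blog) using the rough numbers above, which are estimates rather than measurements:

// Rough numbers from this post, all in minutes.
val writeMinutes    = 10.0    // marking one program onto optical mark cards
val truckTurnaround = 2880.0  // average delivery-truck round trip (about 2 days)

// With the truck in the loop, turnaround time dominates everything else,
// so removing it gives roughly:
val speedup = truckTurnaround / writeMinutes
println(s"About ${speedup.toInt}x more programming experiments per unit of time")  // 288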

The key concept here is how productivity is related to the technology workers have available to them. If organizations want increased productivity, one way to do that is by investing in better tools and technology.

The bottom line is that technology is the other key pillar of productivity.

Science

10 PRINT "Enter a number to see the factorial"
20 INPUT N
30 LET F = 1
40 FOR L = 1 TO N
50 LET F = F * L
60 NEXT L
70 PRINT N; "! = "; F
80 PRINT "Would you like to see another factorial? 1=YES ANY(other)=NO"
90 INPUT RESPONSE
100 IF RESPONSE = 1 THEN GOTO 10
110 PRINT "Thanks! Goodbye."
120 END

In those days my first programming language was BASIC, or Beginner's All-purpose Symbolic Instruction Code. Pretty much all of the Computer Scientists and Professionals I respect these days agree that BASIC is one of the most horrible programming languages ever devised. In the early days of computing a great many mistakes were made and many experiments failed, and BASIC is just one of those failed experiments.
Some people might say that BASIC is a technology and not a science, but my point is that really good computer languages are designed by Computer Scientists, and the best languages are designed by the best scientists.
Now if all you are doing is writing a 10 or even 100 line computer program, then BASIC is not all that horrible. For example, if you are only building a one floor building, then mud and straw is not all that horrible either, and is really inexpensive and expedient. The question is, would you feel safe on the 100th floor of a building made of mud and straw?
There is an old joke that if Engineers built buildings the way Programmers built programs, then the first woodpecker to come along would destroy an entire city. Quite literally, most computer software these days is very much like a city built of mud and straw. The failure of software developers is not always that they do not know how to write software; it is that they have not yet learned to educate their bosses that software developers need more than mud and straw to build with.
To be sure, when my card decks started getting over 100 cards, understanding and changing the program became exponentially more difficult. I am not feeding you hyperbole either; managing 100 cards is not 10 times harder than managing 10 cards, it is more like 100 times harder.
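For contrast, here is a hedged sketch of the same factorial dialogue in a modern language (Scala, which comes up again below); the point is not this particular language, just how far the science of language design has come since 1970:

import scala.io.StdIn

// Same behaviour as the BASIC listing above: keep printing factorials
// until the user answers something other than 1.
object Factorial extends App {
  def factorial(n: Int): BigInt = (BigInt(1) to BigInt(n)).product

  var again = true
  while (again) {
    println("Enter a number to see the factorial")
    val n = StdIn.readInt()
    println(s"$n! = ${factorial(n)}")
    println("Would you like to see another factorial? 1=YES ANY(other)=NO")
    again = StdIn.readLine().trim == "1"
  }
  println("Thanks! Goodbye.")
}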
 
Since then, Science and Engineering have given us programming languages far better than BASIC. What is really sad is that Bill Gates, who was not a scientist, promoted BASIC and ultimately created Visual Basic - a curse upon the world the way Typhoid Mary was a curse upon the world. Mary was scared, selfish and ignorant, and did not care to listen to anyone else about how dangerous her affliction was.

The key concept here is that the science of program organization and management allows us to remain productive when managing a million or ten million lines of code. If you are not using the best science, then you cannot be as productive when trying to comprehend and manage those really large, sophisticated programming projects. You cannot be as productive as possible if you do not understand or use the latest science and engineering.
 
The bottom line is that science is the third key pillar of productivity, and without three legs to stand on, productivity will fall over. In particular, science and research is critical to process improvement, and productivity is inherently about process.

2013 Programmer Productivity

Methodology

When I was a snivelly-nosed teenager there was not a lot of methodology; when I started there was (1) draw the Flow Chart, (2) write the code, (3) test the code. Even when I was in university there was not a lot of methodology taught, primarily because I studied in Computing Science schools.

Finally people started applying a lot of engineering methodology to software development and we got Software Engineering. Over time, software development methodologies evolved to the point where huge teams of people can create software systems of stunning scale and complexity with increasing productivity, reliability and safety.

It is still true that if Detroit built cars the way software companies built software, there would be 100 times as many car related deaths, and 1000 times as many car related injuries as there are today. It is very telling that almost every software product on the market includes a disclaimer along the lines of: We make no claims that this software does anything correctly or useful in any way whatsoever; please enjoy it.

To be very clear, there does not yet exist any legal concept of "software consumer protection." On the other hand, as methodology improves, so does the ability to make safer and more reliable software. What is abundantly clear is that evolving and improving methodologies are increasing productivity.

Technology

Technology is the main reason that one factory worker can produce a million times more widgets in the same amount of time as a single artisan hand crafting said widget. These days I use technology like Integrated Development Environments on high performance workstations with 30 inch computer displays; Virtual Machines that let me simulate dozens of different operating systems in dozens of different configurations; automated tools to manage the methodology and processes of developing software.

These days I work on systems that are the equivalent of a deck of optical mark computer cards that have a million cards in the deck. Technology is quite literally the ability to do things you were never able to do before.

Science

Over 40 years I have generally been able to quickly pick up new programming languages. Some of them were horrible and hard to learn, like COBOL, and resulted in poor programming productivity. Others, like Pascal, were simple, elegant and easy to learn, and productivity for a student learning to program was quite high. Students can also learn to program quickly in BASIC, but they learn so many bad habits, afterwards their overall productivity can suffer for years or decades to come.

By 1995 Java had come on the scene. It was influenced by languages such as C and C++, and platforms like UCSD Pascal; it applied better science and design, and made programmers dramatically more productive than C and C++. One way to demonstrate how effective Java was at increasing productivity is that Microsoft was forced to shift away from C++, ATL and COM (the three pillars of programming hell) and create C# and .NET, which is pretty much an exact design copy of Java.

After 13 years of using Java, my overall productivity increases are slowing because the Java language is not keeping up with Computing Science. These days I am struggling to learn Scala, but already I can see the path to being more productive. I say struggling because Scala is based on so much new science that I either did not get in university or have not kept up with in my professional career, and I am now working to catch up with that science. What is important to realize is that almost all productivity improvements require some kind of investment, and there are many dimensions to the kinds of investments.

Astronaut's View of Productivity

We have come a very long way in over 40 years. Everyone is keenly aware of the advances in electronics that took us from vacuum tubes, to transistors, to integrated circuits. Fewer people are aware that we have pragmatically reached the limits of how fast electronic circuits can operate. Computers are not getting any faster. Computers are still getting more powerful, though, because they can do more work in parallel with more circuits. Unfortunately the science of writing parallel software has not kept up with the pace of building parallel hardware, but it does continue to progress and get better. In essence, while computers will not get any faster, they will seem to get faster by making computers multitask more.

As an aside, there is a class of problems that cannot be reduced to parallel computations, and consequently computers will never be able to solve these problems faster. For example, most people will realize that 9 women cannot make a baby in 1 month, but they can make 9 babies in 9 months.
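The post does not name it, but the usual way to quantify this limit is Amdahl's law: if only a fraction p of a job can be done in parallel, then no matter how many workers n you add, the speedup is capped at 1 / ((1 - p) + p / n). A minimal worksheet-style sketch:

// Amdahl's law: best-case speedup when a fraction p of the work
// can be spread across n parallel workers.
def amdahl(p: Double, n: Int): Double = 1.0 / ((1.0 - p) + p / n)

println(amdahl(0.90, 8))        // ~4.7x on 8 workers
println(amdahl(0.90, 1000000))  // approaches 10x, no matter how many workers
println(amdahl(0.0, 1000000))   // 1.0 - the one-baby case: no speedup at all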

Often I have seen on job postings "Good multitasking skills required." This is a sure sign that management expects their employees to function like computers or robots. To be sure, people can multitask on some things quite well, like walking and chewing gum, but we are horrible at many other things, like making a single baby, and our overall productivity drops significantly when we try to multitask on many things.

In particular, when multitasking, productivity should not be measured in how many widgets per hour can be produced, but in how many widgets with zero defects per hour. Or if you are only producing one cool widget that people can just download to their phone or tablet, how much technical support staff do you need to satisfy your customers before they get pissed off and go try some other company's widget? Isn't it just more productive to build a widget that is so simple and effective that it does not require a huge dedicated support team?

The bottom line, when discussing productivity, is: how are you measuring productivity? When increasing productivity, are you measuring it before and after you increase it, and weighing the difference against your investment in increasing productivity?

All I am saying is that the world of productivity looks a hell of a lot different from orbit, than it does from the parking lot behind some building.


Sunday, February 19, 2012

Code Monkey


  

   code monkey:n
1. A person only capable of grinding out code, but unable to perform the higher-primate tasks of software architecture, analysis, and design. Mildly insulting. Often applied to the most junior people on a programming team.

2. Anyone who writes code for a living; a programmer.

3. A self-deprecating way of denying responsibility for a management decision, or of complaining about having to live with such decisions. As in “Don't ask me why we need to write a compiler in COBOL, I'm just a code monkey.”

Some claim the etymology of the phrase derives from the famous Infinite Monkey Theorem, where a monkey hitting keys at random on a typewriter keyboard for an infinite amount of time will almost surely type a given text, such as the complete works of William Shakespeare. To be sure some of the WTF code I have had to try to fix or maintain brings that imagery to mind.

There are quite a few references to the term Code Monkey: c2,  etc.

I started out computer life as a code monkey, 12 years old, learning to automate basic mathematical curiosities such as Fibonacci series and prime numbers in BASIC (circa 1970) as part of an experimental computer mathematics class in grade 8. When I graduated from University my first job was as a "Programmer/Analyst" which is effectively a Code Monkey. These days I pretend to be a Software Architect (whatever that means) but there are still days when I feel like a Code Monkey and other days when I want to go out and kill me some Code Monkeys.

Writing code can be a daunting task - you are not just writing code, you are building and/or fixing a machine. People often forget that programs, software, and applications are machines, not prose that some artificial intelligence reads and deals with in any reasonable way - except the logic of Garbage In, Garbage Out.

When I am in full Code Monkey mode I am usually struggling with concepts or mechanisms that are just beyond my reach of grokking, and in frustration I behave very much like a monkey, trying random things and praying to the Robot God that the next run will be the one that compiles and executes correctly. However, at this point people's behavior diverges:
  • Myself, I believe that if I struggled with something hard then I need to make sure the code is clean and readable, and that there are sufficient comments describing what is going on, so that the next person to come this way does not hurt their brain - because it is usually me who forgets the context when I have to come back and look at my own code again.
  • Other people seem to have the cherished attitude "Well, if it was hard for me to figure out, it should be hard for the next person to figure out too." In reality, sometimes people are just so burned out from solving the problem that they need to distance themselves from the pain as soon as possible and move on to trying to meet their deadline.
One of the most common Code Monkey patterns I see is the copy-paste pattern where
  1. Code Monkey needs to implement a feature but does not know where to start.
  2. Code Monkey searches the code base for a similar feature to use as an example.
  3. Code Monkey copies and pastes the other code and repurposes it to make the new Code Monkey feature.
  4. Code Monkey makes the boss proud by completing the feature in half the time estimated.
The basic flaw with this pattern is that it tends to replicate really bad, stinky, Code Monkey feces-style code everywhere in the code base (a sketch after the Code Master list below makes the difference concrete). By contrast, the Code Master pattern might be:
  1. Code Master needs to implement a feature but does not know where to start.
  2. Code Master searches the code base for a similar feature to use as an example.
  3. Code Master copies and pastes the other code and repurposes it to make the new Code Master feature.
  4. Code Master refactors new feature code to conform to more sane and up-to-date coding practices and conventions, so as to leave behind better examples for other Code Monkeys and Code Masters to follow.
  5. Code Master does not seek acknowledgement or praise from the boss, as the boss can never understand the ROI the Code Master provides. (This last part is humor, mostly.)
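To make step 4 concrete, here is a small hedged sketch (all names hypothetical) of the kind of refactoring a Code Master might do after copy-pasting: the duplicated validation check gets pulled into one shared method, so the next person who goes looking for "a similar feature for example" copies the cleaner version.

object OrderFeatures {
  // The check that used to be copy-pasted into every new feature now lives
  // in exactly one place, leaving a better example behind for the next person.
  private def requireNonBlank(value: String, name: String): String = {
    require(value != null && value.trim.nonEmpty, s"$name must not be blank")
    value.trim
  }

  def createOrder(customer: String): Unit =
    println(s"order created for ${requireNonBlank(customer, "customer")}")

  def createInvoice(customer: String): Unit =
    println(s"invoice created for ${requireNonBlank(customer, "customer")}")
}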

Wednesday, December 28, 2011

Space Junk - The 80% Rule

As we continue to make advances in space exploration, we also continue to leave behind more space junk that makes continued advances more perilous.

Computer technology continues to advance at a rapid pace, and in particular the productivity advances in creating and maintaining new software continue as well. However, computer technology and software development also incur their own forms of 'space junk' that act as impediments to further advances and productivity.

The 80% Rule

In any software project the first 80% of progress is relatively easy, while the last 20% becomes exponentially harder to 'finish properly' - in short, no software project ever gets finished properly. But what does 'finished properly' mean? In this context it means that there is nothing extra for the end user to learn or master when using the software application because there is nothing left for the software developers to do to improve the software any further.

The basic principle at work here is a trade-off between time the software developers have to invest in making the product better or more finished, and the time the end users have to spend making up for the time the software developers did not spend.

For example: let's say a product is 70% done, but that the development team have not created any user documentation. For the most part the user interface is well designed and most users in most circumstances have no trouble with the product. However, for those users who have to do something a little bit different there is no guidance on how they can do it, let alone if they can do it. Consequently they have to resort to online searches with Google, typically leading to customer support forums where they can ask questions.

Let's try to put this in more mathematical terms. Say a developer saves 40 hours by not documenting how to do some task. Also, 20% of users cannot figure out how to do the task that was not documented, and have to spend 1 hour of their own time researching how to do it. If there are 200 users of the product then this is a fair trade-off, as the developer saved 40 hours and collectively the user base spent 40 hours. However, if the user base is 2,000, then they would collectively spend 400 hours. If you chart this relationship it looks something like this:

Users           Hours of user time
200             40
2,000           400
20,000          4,000
200,000         40,000
2,000,000       400,000
20,000,000      4,000,000
200,000,000     40,000,000

Now let the simple economics of this sink in. If there are 200 million users, then they will collectively spend 40 million hours of their time to make up for the 40 hours the developer saved. Now let's say that the developer gets paid $100/hour and the end users get paid $10/hour; this means that while the developer saves four thousand dollars, collectively the end users will have to spend four hundred million dollars extra that they would not have had to spend had the developer invested that four thousand dollars better.
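Here is a worksheet-style sketch of that trade-off in Scala, using only the assumptions above (40 developer hours saved, 20% of users losing 1 hour each, $100/hour for the developer, $10/hour for the users):

// Assumptions from this post.
val devHoursSaved = 40.0
val devRate       = 100.0   // dollars per hour
val userRate      = 10.0    // dollars per hour
val fractionStuck = 0.20    // share of users who hit the undocumented task
val hoursPerUser  = 1.0

def tradeOff(users: Long): (Double, Double) = {
  val devSavings = devHoursSaved * devRate
  val userCost   = users * fractionStuck * hoursPerUser * userRate
  (devSavings, userCost)
}

println(tradeOff(200L))         // (4000.0, 400.0): dev saves $4,000, users spend $400
println(tradeOff(200000000L))   // (4000.0, 4.0E8): dev saves $4,000, users spend $400,000,000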


Wednesday, August 3, 2011

Why Microsoft Can't Write Good Error Messages

One of my pet peeves is that there is so much software out there with bad error messages, and much of it seems to come from Microsoft. To be sure, there are many other companies just as guilty of writing poor diagnostics, but Microsoft is such a big part of all our lives, and it is fun to pick on them. Here's a 20-year-old joke I still love to tell...

There is this helicopter flying towards Seattle Airport, but it is very foggy. Eventually the pilot sees a tall building projecting above the clouds and flies over for a look. He spots someone on the roof and hovers near, opens the window and shouts "WHERE AM I?" The person on the roof shouts back "YOU ARE IN A HELICOPTER!" The pilot immediately takes off west and in a few minutes lands safely at the airport. The passenger looks at the pilot and says "How did you know where to go?" The pilot says "Well, his answer was 100% correct, and 100% useless, so I figured he must work for Microsoft. From there I knew which way Seattle was."

I really hate doing software development in Microsoft-land, Visual Studio, .NET, COM, Microsoft C++, and all that crap. I find I am far more productive using Eclipse, Java, and open source artifacts. What I really hate is when something is not working and the diagnostic messages are incredibly poor or even nonexistent. One day I was working on a hard problem and could make no headway, so I asked a teammate with more Visual Studio experience for some help. He said to just step through the program with the debugger. I took his advice, and eventually I found the problem because I reached a point in the debugger where an error result appeared that I had never seen emitted before. The point was, the only way to solve the problem was with the debugger; the error result was not logged or emitted anywhere outside of the debugger.

My teammate told me that when developing Microsoft applications you have to spend a lot of time in the debugger, everyone does, it's just what you have to do.

This was quite alien to me. I have used debuggers before, but I only used them as a last resort. I prefer to rely on logging messages, because when you are troubleshooting you do not always have a debugger - for example at a customer site.

It finally occurred to me there are two camps of thought on this: one camp, the one I am in, only uses debuggers as a last resort, while the other camp always uses the debugger as a first resort. Here is what happens:
  • When you avoid using the debugger you tend to write a lot of logging messages for diagnostic purposes. When troubleshooting you write even more messages to zero in on the problem, until it becomes clear what the code is actually doing. What you have done is to codify your diagnostic process into the software itself. The more you do this, the more experienced you get at writing better and better messages. When you are really experienced, your messages not only tell you clearly what the problem is, but often how to fix the problem as well. For example, a message that says "can't find configuration file foo" is like that guy standing on the roof of the Microsoft building. On the other hand, a message that says "MyApp.Configurator cannot find the file C:\Program Files\My Application\web\data\foo.xml" is a lot more meaningful (see the sketch after this list). When it comes to writing user-facing error messages, I also find that the people in this camp are much better at producing those too, because the more diagnostic messages you write, the more logs you read, the more crappy messages you find, the better you get at writing clear and meaningful messages.
  • When the debugger is your first resort at solving a problem, you step through the code, you think about the problem, you reason stuff out, and eventually you find the solution and move on. All that reasoning and problem-solving wisdom from that moment does not get written down anywhere for anyone else to see or learn from. Even if you have to revisit the same problem months later you have likely paged-out how you figured out the problem in the first place, and have to reinvent the reasoning from scratch. Also, because you are never writing any diagnostic messages, you never get any good at writing diagnostic messages. When you are forced to write some user-facing diagnostic messages because the requirements mandate it - well, you are still a neophyte moron when it comes to writing diagnostic messages - you are just that guy standing on the roof of the Microsoft building.
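Here is a small sketch of the difference in practice; the class and logger names and the suggested remedy are hypothetical, while the file path is the one from the example above:

import java.util.logging.Logger

object Configurator {
  private val log = Logger.getLogger("MyApp.Configurator")

  def load(path: java.io.File): Unit = {
    if (!path.exists()) {
      // The "guy on the roof" version: technically true, practically useless.
      //   log.severe("can't find configuration file foo")

      // The version written by someone who reads their own logs: it says which
      // component was looking, for exactly what, where, and hints at a remedy.
      log.severe(s"MyApp.Configurator cannot find the file ${path.getAbsolutePath}; " +
        "copy foo.xml into that directory or point the configuration path somewhere else")
      return
    }
    // ... parse the configuration ...
  }
}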
To be fair, I reiterate that Microsoft is not the only one guilty of this practice; I have seen it time and time again over the years in Unix and Mac OS, in open source software, etc. Also to be fair, when I am working in the Java culture, I do notice the diagnostic messages are generally better than what I see in other cultures.