Sunday, June 15, 2008

Who's to blame?

What happened in software development in the last 50 years? Can we say our business has improved or worsened over time? 

I would say that we are at a stalemate.

In years gone by, computers were used heavily for scientific and mathematical problem solving. As time moved on, business domain users became the focal point, and with them came a large shift in one of the most important attributes of a developer: the ability to communicate effectively with average, non-computer-literate humans. Looking back over the years, it might seem that more is achieved now through more advanced human user interaction models, but I don't really buy this. Great ideas in HCI have been around for decades - yet core computer science problems are just the same as they ever were.

Many put far too much emphasis on very specific language or framework skills, or on understanding a particular API. This subject brings me back to my earlier post on 'prefer design skills'. General skills in the key areas - design, business analysis, working with customers, understanding what it takes to build quality into a product, and a good sense of architecture - these are the only things we should be focusing on. Specific skills come naturally and can be picked up by the right type of people.

We have fragmented roles for BAs, QA, architects and so on, and I for one find it difficult to reconcile development - the creative production of code - with some of these other roles. I don't feel that great results can be achieved when these are viewed from a separated, isolationist standpoint. Most people in these roles have never had to deliver software and don't understand what it takes to do so. Developers, though, need to understand these various perspectives intimately in order to deliver a great product. The best results are generally found when developers have sufficient experience and knowledge to integrate all these perspectives into their everyday activities.

It would not be acceptable for law, medical or business leaders to enter their chosen fields without passing an applicable exam - typically at undergrad or postgrad level. Why, then, is it commonplace for graduates of other disciplines to become computer scientists? Don't misunderstand me, I am definitely not prejudiced here; I know one or two very good people who don't have any formal computer science background, but they are incredibly motivated individuals - exceptions rather than the rule. However, I have met many more from a physics, maths or other degree discipline who just don't get it, and haven't been motivated to get it. So is this a problem with education, or a general misunderstanding that it doesn't really matter how well someone understands computer science in order to do the job?

Sometimes, education itself can contribute to the problem, but it largely depends on the curriculum and teaching staff. Generally speaking I don't feel educators are guilty of any bad intent - their main purpose should always be to encourage open-mindedness - and my own experience with my college was a good one.

The big players in the business of selling software, hardware and services influence us more than we would care to admit, but I can't say I blame them; after all, it's just business. More fool us for paying through the nose for a product that isn't a good fit - more often than not, it's because the wrong people are involved in the decision-making process: marketing rather than development. Smart marketing is at the heart of big business's approach - playing to the fears of senior managers is not hard when development track records are highlighted. Many desperately want to believe in a silver bullet, a giant slayer that solves all our problems, but there simply isn't one - will there ever be? I doubt it.

This is pure speculation, but my feeling is that software development is an incredibly complex mix of social, creative and technical abilities, and it is very hard to find great people with this combination. There is a massive, widespread misunderstanding and underestimation of the complexity involved in creating a software product. Great products are built by people who do have this understanding, and such people generally spend most of their time trying to make it simpler and easier to produce something - often by changing the rules of the game and simplifying the inputs.

There was a stigma associated with the discipline - the real techie geek type locked in a big server room with thick plastic glasses taped together - that persisted right through to the 1990s. Then it became more socially acceptable, HTML hit the mainstream, and all of a sudden even little Johnny could put together a web site in his bedroom in five minutes - so this IT stuff must be easy, right?

Software is undoubtedly viewed by most as pretty much a blue-collar, pass-it-along-the-production-line type of work. Many managers still look for ways to reduce costs and replace more expensive, valuable staff with cheaper ones who have a painting-by-numbers mindset. The best people in the business are not a commodity; they think creatively and are not generally constrained by the ideas of the masses. One of these people is often worth ten cheaper staff, yet is often paid only 20% more.

Enough of all this waffle. Is there a hard and fast answer to the question? No. How can things be changed? I have no idea. All I know is that good people are out there who can identify with many of the things I have mentioned in this post, and they know the path to tread through the minefield to achieve a good degree of success.


Paul said...

Hi Andy,

I'm tempted to print this one out and share it with my current management/development teams.

The problem is people and organisations. Put more than 5 people in a room and you've got politics. Software development is complex, with many interdependent parts. The separation of these parts into the roles you speak of ensures that no single individual has the "big picture" in mind, and the cost of communication (and skills) needed to ensure that everyone is intimately aware of a "big picture" which is continuously changing is prohibitive.

So what is the solution? Get rid of the organisation and all the excess baggage. Sack everyone who doesn't write code and start again.

Sounds drastic - which is probably why it doesn't happen. The type of organisations that need this change the most are the ones that are least likely to do it. The remedy was pointed out by Fred Brooks, but big organisations don't like the medicine. Empowering programmers will never do for them :)

My personal view is that this is a good thing. There are lots of opportunities in the Software industry precisely because failure is endemic. If you can deliver then the world is your oyster.

This is happening right now! Web 2.0 belongs to small startups - companies no bigger than 7 ± 2 people, where everyone (and I mean everyone) codes. Venture capitalists are throwing money at them, and bit by bit Microsoft is becoming an irrelevance as the internet takes centre stage.

The future is bright. You just need to make sure that you are on the right side of change :)

Take a look at the essays of Paul Graham:

Paul and his cohorts represent the future. What you describe here is our immature past :)


Brad Wiederholt said...


I've had this bad feeling for the last 18-24 months about 1) chip manufacturers coming to the end of the single-processor Moore's law cycle, and 2) organizations such as the Association for Computing Machinery bemoaning the drop in enrollment in technical fields, especially computing.

With the new advances in multicore processors, there is going to be *tremendous* pressure on the software industry to shift from sequential to parallel programming models. We've gone 30 years with remarkable advances in speed that have masked some of the screw-ups and inefficiencies in the software industry. Those days are numbered.
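To make that shift concrete, here is a minimal sketch (mine, not from the comment) of the same computation written in the familiar sequential model and then in a simple fork-join parallel model; the work function and chunking scheme are illustrative assumptions. Threads are used purely for brevity - real CPU-bound work in Python would use processes to sidestep the GIL:

```python
# Illustrative only: one CPU-bound computation expressed in the
# sequential model and in a fork-join parallel model.
from concurrent.futures import ThreadPoolExecutor

def sum_of_squares(chunk):
    # The unit of work each worker performs on its slice of the data.
    return sum(n * n for n in chunk)

def sequential_total(data):
    # The traditional model: one thread of control walks the whole data set.
    return sum_of_squares(data)

def parallel_total(data, workers=4):
    # The parallel model: split the data, farm chunks out to a pool of
    # workers, then combine the partial results.
    chunks = [data[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_of_squares, chunks))

data = list(range(10_000))
print(sequential_total(data) == parallel_total(data))  # True
```

The second function is strictly harder to reason about - partitioning, combining, scheduling, shared state - which is exactly the skills gap the comment is pointing at.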

We've got a shrinking base of folks who know the fundamentals of what they are doing, we still have these people and communication issues, and we are about to undergo a radical and surprising shift in the fundamentals of software development. I can count on fewer than two hands the number of developers I know who know what Erlang is. A lot has been taken for granted these last 20 years, and we are entering the days of the perfect storm.

That's all good news! I've got to agree with Paul here: for the folks who can master the fundamental shifts going on and deliver, the world *is* about to be their oyster.

If you thought sequential programming was tough to manage or outsource, just wait until you try to find someone with experience in parallel programming models...