What happened in software development in the last 50 years? Can we say our business has improved or worsened over time?
I would say that we are at a stalemate.
In years gone by, computers were used heavily for scientific and mathematical problem solving. As time moved on, business domain users became the focal point, and with that came a large shift in one of the most important attributes of a developer - the ability to communicate effectively with average, non-computer-literate humans. Looking back over the years, it might seem that more is achieved now through more advanced human-interaction models, but I don't really buy this. Great ideas in HCI have been around for decades - yet core computer science problems are just the same as they ever were.
Many put far too much emphasis on very specific language or framework skills, or on understanding a particular API. This brings me back to my earlier post, 'prefer design skills'. General skills in the key areas - design, business analysis, working with customers, understanding what it takes to build quality into a product, and a good sense of architecture - are the only things we should be focusing on. Specific skills come naturally and can be picked up by the right type of people.
We have fragmented roles for BAs, QA, architects and so on, and I for one find it difficult to reconcile development - the creative production of code - with some of these other roles. I don't feel that great results can be achieved when these are viewed from a separated, isolationist standpoint. Most people in these roles have never had to deliver software and don't understand what it takes to do so. Developers, though, need to understand these various perspectives intimately in order to deliver a great product. The best results generally come when experienced developers have the knowledge to integrate all these perspectives into their everyday activities.
It would not be acceptable for law, medical or business leaders to enter their chosen fields without passing an applicable exam - typically at undergrad or postgrad level. So why is it commonplace for graduates of other disciplines to become computer scientists? Don't misunderstand me - I am definitely not prejudiced here. I know one or two very good people who don't have any formal computer science background, but they are incredibly motivated individuals - exceptions rather than the rule. I have met many more from physics, maths or other degree disciplines who just don't get it, and haven't been motivated to get it. So is this a problem with education, or a general misunderstanding that it doesn't really matter how well someone understands computer science in order to do the job?
Sometimes education itself can contribute to the problem, though it largely depends on the curriculum and teaching staff. Generally speaking, I don't feel educators are guilty of any bad intent - their main purpose should always be to encourage open-mindedness - and my own experience at college was a good one.
The big players in the business of selling software, hardware and services influence us more than we would admit, but I can't say I blame them - after all, it's just business. More fool us for paying through the nose for a product that isn't a good fit; more often than not, it's because the wrong people are involved in the decision-making process - marketing rather than development. Smart marketing is at the heart of big business's approach: playing to the fears of senior managers is not hard when development track records are highlighted. Many desperately want to believe in a silver bullet, a giant slayer that solves all our problems, but there simply isn't one - and will there ever be? I doubt it.
This is pure speculation, but my feeling is that software development is an incredibly complex mix of social, creative and technical abilities, and it is very hard to find great people with this combination. There is massive, widespread misunderstanding and underestimation of the complexity involved in creating a software product. Great products are built by people who have this understanding, and generally such people spend most of their time trying to make things simpler and easier to produce - often by changing the rules of the game and simplifying the inputs.
There was a stigma associated with the discipline - the real techie geek type locked in a big server room with thick plastic glasses taped together - which persisted right through to the 1990s. Then it became more socially acceptable, HTML hit the mainstream, and all of a sudden even little Johnny could put together a web site in his bedroom in 5 minutes, so this IT stuff must be easy - right?
Software development is undoubtedly viewed by most as pretty much blue-collar, pass-it-along-the-production-line work. Many managers still look for ways to reduce costs and replace more expensive, valuable staff with cheaper ones who have a painting-by-numbers mindset. The best people in the business are not a commodity; they think creatively and are not generally constrained by the ideas of the masses. One of these people is often worth 10 cheaper staff, yet is often paid only 20% more.
Enough of all this waffle. Is there a hard and fast answer to the question? No. How can things be changed? I have no idea. All I know is that good people are out there who can identify with many of the things I have mentioned in this post, and they know the path to tread through the minefield to achieve a good degree of success.