Monday, June 23, 2008

Silo Driven Development (SDD - an antipattern)

In Johanna Rothman's latest post, 'handoffs don't work', she discusses requirements being handed off to development teams, and I think she makes a great point. I believe this behavior is part of a larger issue of silos in development organizations. Unless organized along the lines of product development, silos are anti-patterns, whether it's BAs working on requirements, DBAs working as part of a database team, testers working for a QA team or architects working in an architecture/strategy group.

So, let's take a minute to look at the requirements example, because this one is probably the most common problem. Now, I tend to think that business analysis is part of a developer's everyday job, so the role itself is important; but why do we use a separate business analyst?

- Do they have better domain knowledge? Often not; they are usually skilled in business analysis, so, like a developer, their knowledge of the domain is secondary. Additionally, developers tend to ask many more questions than their BA counterparts.

- Are they better communicators? In my experience, I don't think there is any difference.

- Do they save developers time? Undoubtedly, if the developer starts work without talking to the customer at all - but is this what we want? I think this is a serious problem, and it leads to the developer missing a key opportunity to pick up on the domain. The more developers can talk directly to the customer the better; having a proxy in between is just a recipe for miscommunication and misunderstanding.

- Don't they get it right first time? No, unfortunately not. Johanna's post describes this much better than I can. Often subconsciously, customers use empirical feedback to steer their requirements along the journey. Expecting them to sign their name in blood on a requirements document that will never change has long since been dismissed as naive at best.

So, why do I seem to be committed to the development perspective and not feel for the poor BA? Quite simply, it's a matter of responsibility. A business analyst has to deliver a paper document, and judging how good or otherwise that document is proves to be quite subjective. A developer, on the other hand, takes responsibility for delivering working code. Did you ever hear of a BA getting in trouble for not delivering a document on time? Do documents fail testing or cause null pointer exceptions during use? Many customers read requirements documents and think to themselves, 'I have no idea what this all means - but I am confident everything will come out exactly as I expect it to.'

Even if you work with a BA, cutting out the silo and having the BA work within the development team - with the whole team working with the customer - should, all other things being equal, result in a better product.

Saturday, June 21, 2008

Distributed Development Disappointment

In this post from last year, Mark Levison discusses some of the issues that arise from having a distributed development team. This really hit home for me, since I completed a project where we had an unusual case of working out of two distributed locations.

I knew it was going to be a challenge and I did not like what I was getting myself into, but there was some history behind it, so before going further I should provide a little background to explain why a distributed strategy was chosen. On the surface it can sound like common sense, unless you have a really deep understanding of software development, so I can see how this situation arose.

The team at location 'A' was strong in specific skills and had built up a product over a number of years. Our strategy was to change the direction of the product by introducing a new technology, which was familiar to the developers at office 'B'.

The project was extremely tough; we did manage to deliver a working product, but the teams at the different locations never really gelled. It's not surprising really -

From team 'A's perspective -
1. Felt threatened
2. Left out of important decision making
3. Couldn't hear much of the meetings on the phone - the effort was led from team 'B's location (which had many more staff and office space available)
4. Didn't understand why some decisions were made when they had much better background knowledge


But there were also some reasons from the other team's perspective.

From team 'B's perspective -
1. Team left to figure out integration issues on their own
2. Little help was forthcoming
3. Felt like the only ones who wanted the project to succeed
4. Couldn't understand the resistance

Now, you're saying, so why not ship the folk at office 'A' to office 'B'? Well, family life and other commitments. Why didn't we do something about the communications problem? It was raised, a new phone was ordered, and the order was then promptly turned down on cost grounds.

In my opinion, there is no simple one-size-fits-all remedy for this kind of thing. It simply doesn't work. Yes, we managed to succeed against the odds, but it was like pulling teeth and I wouldn't wish it on anyone. I don't blame anyone or take sides in this at all; it was just circumstances.

Of course there are consultancies out there that claim they can make it work for you, and maybe they truly can make the experience less painful, but I'm guessing the real reason is that they want to bill you for their time.

In retrospect, I would have ensured that all the information was in the hands of one location or the other, but I was not involved in the earlier stages of the project, and there were a few uncomfortable hurdles to overcome before that suggestion could become a reality. As usual, everything comes down to people and how they communicate. Excessive use of email, IM and even the phone as poor second cousins to face-to-face communication has done more damage to human relations than anything else - especially in the corporate world. Having said that, given the way my daughter miscommunicates over the web, maybe it's just as lousy in the social networking world as well. Guess I'm just from a different generation.

Tuesday, June 17, 2008

What No Getters?

I was intrigued reading an article by Michael Feathers recently. Although the main topic of conversation was flawed thinking in the TDD world, I was struck by the point about writing OO code with no getters. This sounded like an interesting idea to me, as I have long thought that setters and getters are very much an OO anti-pattern, unnecessarily exposing the details of an object much of the time.
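To make the idea concrete, here is a minimal sketch of the style being described - a hypothetical Account class of my own (not taken from Michael's article), written 'tell, don't ask' fashion in Java so that behavior lives with the data instead of leaking out through getters -

public class Account {
    private long balanceInCents;

    public Account(long openingBalanceInCents) {
        this.balanceInCents = openingBalanceInCents;
    }

    // Behavior lives with the data; there is no getBalance().
    public void withdraw(long amountInCents) {
        if (amountInCents > balanceInCents) {
            throw new IllegalArgumentException("Insufficient funds");
        }
        balanceInCents -= amountInCents;
    }

    // If another object needs to know about the balance, we tell it,
    // rather than handing the raw value back to the caller.
    public void printStatementTo(StatementPrinter printer) {
        printer.printBalance(balanceInCents);
    }
}

interface StatementPrinter {
    void printBalance(long balanceInCents);
}

The caller asks the account to do the work (account.withdraw(500)) rather than pulling the balance out and making the decision itself - which is exactly the kind of unnecessary exposure the no-getters argument is about.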

Many blogs are covering the subject right now, and I am still reading through them, trying to remain open-minded. Here are a few examples -

http://peripateticaxiom.blogspot.com/2008/06/tdd-mocks-and-design.html
http://moffdub.wordpress.com/2008/06/16/the-getter-setter-debate/

Martin Fowler has a slightly different perspective -

http://martinfowler.com/bliki/GetterEradicator.html

Sunday, June 15, 2008

Who's to blame?

What has happened in software development over the last 50 years? Can we say our business has improved or worsened over time?

I would say that we are at a stalemate.

PERCEPTION
In years gone by, computers were used heavily for scientific and mathematical problem solving. As time moved on, business domain users became more the focal point, and with that came a large shift in one of the most important attributes of a developer - the ability to communicate effectively with average, non-computer-literate humans. We could look back over the years and conclude that more is achieved now through more advanced human user interaction models, but I don't really buy this. Great ideas in HCI have been around for decades - yet core computer science problems are just the same as they ever were.

SKILL
Many put far too much emphasis on very specific language or framework skills, or on understanding a particular API. This subject brings me back to my earlier post referencing 'prefer design skills'. General skills in the key areas - design, business analysis, working with customers, an understanding of what it takes to build quality into a product, and a good sense of architecture - are the only things we should be focusing on. Specific skills come naturally and can be picked up by the right type of people.

ROLES
We have fragmented roles for BAs, QA, architects and so on, and I for one find it difficult to reconcile development, the creative production of code, with some of these other roles. I don't feel that great results can be achieved when these are viewed from a separated, isolationist standpoint. Most people in these roles have never had to deliver software and don't understand what it takes to do so. Developers, though, need to intimately understand these various perspectives in order to deliver a great product. The best scenarios are generally found when developers have sufficient experience and knowledge to integrate all these perspectives into their everyday activities.

EDUCATION 
It would not be acceptable for law, medical or business leaders to enter their chosen fields without passing an applicable exam - typically at undergraduate or postgraduate level. So why is it commonplace for graduates of other disciplines to become computer scientists? Don't misunderstand me, I am definitely not bigoted here: I know one or two very good people who don't have any formal computer science background, but they are incredibly motivated individuals - exceptions rather than the rule. However, I have met many more from physics, math or other degree disciplines who just don't get it, and haven't been motivated to get it. So is this a problem with education, or a more general misunderstanding - that it doesn't really matter how well someone understands computer science in order to do the job?

Sometimes education itself can contribute to the problem, but it largely depends on the curriculum and teaching staff. Generally speaking, I don't feel they are guilty of any bad intent - their main purpose should always be to encourage open-mindedness, and my own experience at college was a good one.

BUSINESS
The big players in the business of selling software, hardware and services influence us more than we would admit, but I can't say I blame them; after all, it's just business. More fool us for paying through the nose for a product that isn't a good fit - more often than not, it's because the wrong people are involved in the decision-making process: marketing rather than development. Smart marketing is at the heart of big business's approach - playing to the fears of senior managers is not hard when development track records are highlighted. Many desperately want to believe in a silver bullet, a giant slayer that solves all our problems, but there simply isn't one - will there ever be? I doubt it.

ROOT CAUSE
This is pure speculation, but my feeling is that software development is an incredibly complex mix of social, creative and technical abilities and it is very hard to find great people with this combination. It's just a case of massive widespread misunderstanding and underestimation of the complexity involved in creating a software product. Great products are built by people who have this understanding and generally such people spend most of their time trying to make it simpler and easier to produce something - often by changing the rules of the game by trying to simplify the inputs.

There was a stigma associated with the discipline - the real techie geek type locked away in a big server room, thick plastic glasses taped together - that lasted right through to the 1990s. Then it became more socially acceptable, HTML hit the mainstream, and all of a sudden even little Johnny could put together a web site in his bedroom in five minutes, so this IT stuff must be easy - right?

Software is undoubtedly viewed by most as pretty much a blue-collar, pass-it-along-the-production-line type of activity. Many managers still look for ways to reduce costs and replace more expensive, valuable staff with cheaper ones who have a painting-by-numbers mindset. The best people in the business are not a commodity; they think creatively and are not generally constrained by the ideas of the masses. One of these people is often worth ten cheaper staff, yet is typically paid only 20% more.

IN CONCLUSION
Enough of all this waffle. Is there a hard and fast answer to the question? No. How can things be changed? I have no idea. All I know is that there are good people out there who can identify with many of the things I have mentioned in this post, and they know the path to tread through the minefield to achieve a good degree of success.

Tuesday, June 10, 2008

Estimation

I had an interesting discussion some time back about the relative merits of attempting to give a SWAG (silly wild-ass guess) on the level of effort to build some features for a product. We were in the very early stages of product definition, and a rough level of effort was required from some participants.

My first reaction was to suggest that we put more meaning around the two- or three-word features that were listed, because I did not even know what the words meant, let alone how long it would take a team to build them. If I do not know what these things actually mean, how can I provide any form of estimate? There was simply not enough information. Yet, despite my stance, there was still a general insistence that the estimates be provided.

This got me thinking more deeply about estimates and estimation in general. There are many estimation techniques out there, such as COCOMO, Wideband Delphi and function point analysis - some of which I have tried, some I have not. But I ask myself now, is there any value in pursuing any of these? Are they any more accurate than a 'gut feeling' (I guess that is synonymous with a SWAG)?

So, can estimation techniques provide any kind of reasonable output? I would say the answer is a guarded 'it depends'. There are many factors that govern predictability.

- PEOPLE - team size, mix, skills, talent, how well they bond, business domain knowledge
- PROCESSES - how the team works, rigidity, willingness to change, working environment
- TECHNOLOGY - equipment and tools, choice of libraries, languages, frameworks

Sure, there are many more than I have listed here; the point is, the more that is known or understood, the more likely it is that those involved will be able to provide meaningful estimates.

So, for example, if a team is asked to provide estimates to build something in a business domain they understand very well, with a technology stack they have prior knowledge of, with a good team mix and the right input, environment, tools and so on, they can provide something meaningful. However, I would still treat their output with a healthy dose of skepticism, because users/customers/product owners are prone to changing their minds. Even developers change their minds during the course of a project.

Where to now? The problem with the above estimation ideas is that they are based on the assumption that things remain static during a project. But projects are a creative activity, not a production line. It's like asking an artist to estimate how long it will take him to paint a picture - he may get halfway, then clear his canvas and start again from scratch if he doesn't like the way it's taking shape. He has that prerogative - it's an artistic process.

Now, can teams at least commit to a specific capacity? To a degree I think they can - a team that has worked together before may know that, all things being equal, it can deliver 20 story points per iteration. Does this mean anything in terms of estimates? I would say a qualified yes - if everything remains unchanged. If circumstances change (and they will), then the team should be able to use that information to advise customers/users/product owners that the project parameters have changed and that scope or time-to-market adjustments should be made. The more work a team delivers on a project, the more they learn about themselves, the tools and the domain, and the more accurate their estimates become over time. In my opinion, this places an even higher emphasis on open and honest communication channels with executives and stakeholders, to allow them to make sensible funding decisions.
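To make the arithmetic concrete, here is a back-of-the-envelope sketch in Java. The class and the numbers are purely illustrative (not from any real project); it simply projects the iterations remaining from a backlog size and an observed velocity, and shows how quickly the projection moves when either input changes -

// Purely illustrative: project iterations remaining from backlog size
// and observed velocity, and see how sensitive the answer is to change.
public class Forecast {

    static int iterationsRemaining(int remainingStoryPoints, int velocityPerIteration) {
        // Round up: a partially filled iteration is still an iteration.
        return (remainingStoryPoints + velocityPerIteration - 1) / velocityPerIteration;
    }

    public static void main(String[] args) {
        // A team that historically delivers 20 points per iteration,
        // facing a 200-point backlog, projects 10 iterations.
        System.out.println("Plan A: " + iterationsRemaining(200, 20) + " iterations");

        // Circumstances change: scope grows to 220 points and velocity
        // drops to 15 - the same sum now says 15 iterations.
        System.out.println("Plan B: " + iterationsRemaining(220, 15) + " iterations");
    }
}

The sum itself is trivial; the value is in re-running it every iteration with the latest velocity and backlog, and sharing the result openly with stakeholders.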

Monday, June 9, 2008

Generic Agile

In this post on Generic Agile, Rachel Davis talks about the idea of mixing up different types of agile methods to arrive at something useful that fits the organization's style and timeframe. Rachel presents some great points, and for the most part I agree - it is in fact what I am trying to do myself.

The interviewer asked what advice and recommendations Rachel would give teams that are looking to change their processes. In response, Rachel said that they should read around many different flavors and not get too hung up on following practices exactly. While I agree that reading around various flavors is a good thing, I can't help but think that during early adoption it would be better to call in a coach who has worked successfully with lightweight processes to help the seed grow. If a team experiments with pieces of various processes without first having a working knowledge of them, it risks pulling apart harmonious practices that are not so effective individually.

In my judgement I am possibly being over-harsh about the ability of others to pick this up - but I doubt I would have grasped it so easily myself had I not been fortunate enough to work with a great coach.

Sunday, June 8, 2008

The Essence of Agility

I am trying hard to avoid the 'A' word these days, and I am not alone in this endeavor. There are so many examples of misinterpretation that it's really quite sad. Of course, this is no different from almost any other major phenomenon in the tech world: as soon as a label or buzzword reaches critical mass, everyone wants to get on the bandwagon.

Yet agility is one of the more interesting examples - it isn't a technology that can simply be learned, and I don't think it is a case of following a set of steps from a book on your favorite flavor - XP, Scrum, DSDM, Crystal and so on. It is extremely hard to understand the essence without first witnessing it by working in such a team.

I will say this only once - the essence is really down to mindset and attitude.

One of the easier ways to understand it is to have experienced what development should not be, then move to a team with agile values and instantly see the difference - it can be a real 'road to Damascus' experience. Unfortunately, there are many in the business who have never had direct responsibility for delivering working software, so it is much harder for them to understand the mind shift required to adopt an agile way of working and thinking.

A simple mind adjustment, then, is all that is required; a good reading list and practice are all that remain to complete the transition. But note the difficult part: it's incredibly hard to change attitudes, especially those entrenched in ideas taught, read and practiced over years. Couple this with a little ego and some discomfort with change, and the odds are really stacked against you.

There were many out there who were agile before the 'A' word, and there are many now who practice it without giving it a label. Smart people who want to deliver maximum business value have always been around, but the good writing published under the 'A' label adds some great principles, values and practices to the smart team's arsenal. For me, I just prefer to drop the label to avoid any preconceived baggage.