Wednesday, March 17, 2010

Number Lust

I am an amateur photographer when I am not hacking on project teams and building custom software at ThoughtWorks. Photographers are known to have bouts of lens lust* from time to time, especially at the beginning. I have realized that some managers have similar urges when it comes to numbers and metrics: they suffer from acute number lust.

At ThoughtWorks, we believe in and encourage self-organizing teams. BAs gather requirements, developers write code, QAs test and automate, and the customer signs stories off in a flow. The Project Manager's role is therefore reduced to making sure that nothing obstructs this flow.

The way to do this is to gather and analyze the right data and take action based on that analysis. For example:
  • Finding bottlenecks from the wall or a CFD (cumulative flow diagram) and taking action to fix the problems (a rough sketch of this follows the list).
  • Looking critically at retrospective action items and seeing that all those issues are fixed (especially if they are issues outside the team's control, like infrastructure).
  • Making sure that the right capabilities exist in the team at the right time. If not, get people from outside, arrange training, etc.
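To make the first point concrete, here is a minimal sketch of my own (hypothetical story data, not from any real project) of the kind of analysis behind "finding bottlenecks": count work in progress per stage and see where stories pile up.

    # A rough, hypothetical sketch: count work in progress per stage from a
    # snapshot of the card wall and flag the stage where stories are piling up.
    from collections import Counter

    # Hypothetical snapshot of the wall: story id -> current stage.
    wall = {
        "S-101": "In Dev",   "S-102": "In Dev",  "S-103": "In QA",
        "S-104": "In QA",    "S-105": "In QA",   "S-106": "In QA",
        "S-107": "Sign-off", "S-108": "Analysis",
    }

    wip_per_stage = Counter(wall.values())
    for stage, wip in wip_per_stage.most_common():
        print(f"{stage:10s} {wip} stories")

    bottleneck, wip = wip_per_stage.most_common(1)[0]
    print(f"Likely bottleneck: {bottleneck} ({wip} stories waiting)")

The point is not the script; it is that the team gathers only the data it already has on the wall and acts on what it shows.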
Some PMs are content with gathering just enough data to spot problems. But then there are some who can never get enough data.

They want to organise the whole world into neat boxes, label them, and track anything and everything possible. The walls soon fill up with useless charts, graphs and numbers that the team can't use in any way. The worst part is that no analysis is done around the data, even by the people who gather it. Having people in a 15-member team draw emotional seismographs every iteration and stick them on the wall is of no use if you are not going to make any decisions based on them.

Now I agree that gathering all possible information can enlighten us to some extent. But it's a matter of marginal utility. If I have to do 5 extra things on top of my job just because we want to gather data, I am not doing it.

So guys, here's a humble request. Figure out with your client what kind of data they would like to see. Figure out as a team what you would like to improve and what data needs to be gathered to help you get there. Don't burden your team with things that obstruct or slow down the flow. Keep it simple and keep it lean.

Our job is to deliver quality software, not elaborate reports.

* lens lust - the dangerous urge to keep buying new, expensive lenses, not realizing that it's practice that will improve your photography, not lenses!

Onsite Business Analyst

This has come up in various discussions recently and I want to put down my thoughts about the role and responsibilities.

For a ThoughtWorks team in India (or China), most of the work is offshore agile development. Clients are usually in the UK or the USA. The team is structured as follows:
  1. Offshore PM
  2. Offshore Devs
  3. Offshore QA(s)
  4. Offshore BA(s)
  5. Onsite BA(s)
This is how the communication works:

[Diagram: communication flow between the offshore team, the onsite BA and the client]
Of course there are other exchanges that take place, but the Business Q&A and the Technical Q&A are the most important pieces of concrete information exchanged.

Other companies, even traditional development outfits, have a role called the onsite co-ordinator. I believe this role facilitates similar discussions (although I hear they are more technical than business related).

The onsite business analyst's main role is to facilitate business and technical discussions between the team sitting offshore and the client. She is not restricted to being just an onsite co-ordinator. She can take up a new stream to analyze by herself, and even feed it to an onsite development team, if there is one. But all of this should be done keeping in mind that her main purpose is to bridge the communication gap between the client and the development team.

At the same time, the onsite business analyst shouldn't go overboard with this and become a single point of failure. The direct communication lines between the offshore team and the client should always be open.

Tuesday, March 2, 2010

Metrics

I have recently come out of a consulting assignment which gave me loads of time to read and think about processes, improvements and effectiveness (people who follow me on Google Reader must have noticed). It also got me thinking about introducing agile into a traditional IT outfit and what would make it more effective.

Top -> Down
The Top -> Down approach is where someone in top management realizes (or is convinced) that agile is the solution to all their problems and goes on to "mandate" agile. This is not necessarily bad, provided the organization hires the right consultants to bring agility into the project teams at the ground level.

Bottom -> Up
Bottom -> Up is not really an approach. It's just a team realizing that they can do their work better and trying to bring in improvements within the team without affecting the organization. Later, agility may spread to other teams virally, and this might trigger an organizational transformation with eventual buy-in from the management.

Here's what I think works best. Like any win-win solution, it's a mix of the two extreme approaches described above.

Top -> Down buy-in, Bottom -> Up implementation
The management decides that they need to be agile to meet market conditions, but instead of pushing "agile" down the throats of the people below, they let the people at the ground level realize what agile means and how it changes the way the team works. Agile coaches are useful in this transition.

Sounds simple enough? Well, here's the catch: how does the management know that the new process is effective?

The sad part is that in a traditional organization's hierarchy, the only interface between the management and the teams is "reports": reports on various parameters (metrics) that the management believes should be monitored to see how things are going. Whether the teams embrace agile themselves or the management asks the teams to be agile, the teams will need to report the right metrics to the management to gain their confidence. If the teams don't do this, the management might start tracking the wrong metrics and hence encourage the wrong practices. What you measure is what you get.

I raised this topic with a bunch of people from the client organization, and we came up with a set of metrics based on the principle that we should report consistency rather than hard numbers. Consistency gives a measure of effectiveness and also encourages the right activities.

Here's the list that we came up with (there can be many more; the basic principle is to use proportions / percentages instead of hard numbers, as sketched below the list):
  1. Velocity barometer without numbers (we mark completed stories in red on a barometer, but we take out the numbers; consistent color means good, ups and downs mean trouble)
  2. Cumulative Flow Diagram (consistent thickness of bands vs actual numbers)
  3. Ratio of open vs closed bugs (a percentage, instead of counts of bugs found and bugs fixed)
  4. Test coverage (percentage)
  5. Automated vs manual tests (percentage)
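As a small illustration (my own sketch, with made-up numbers), the proportion-based metrics above are just simple ratios computed from raw data the team already has:

    # A minimal, hypothetical sketch of reporting proportions instead of hard numbers.

    def percentage(part, whole):
        # Guard against an empty denominator.
        return 100.0 * part / whole if whole else 0.0

    # Made-up iteration data.
    bugs_open, bugs_closed = 12, 48
    tests_automated, tests_manual = 230, 70

    print(f"Open bugs       : {percentage(bugs_open, bugs_open + bugs_closed):.0f}% of all bugs")
    print(f"Automated tests : {percentage(tests_automated, tests_automated + tests_manual):.0f}% of all tests")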
Really speaking, it all boils down to using just control charts for project reporting and management. What do you think?
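For what it's worth, here is a minimal sketch (my own, with made-up cycle times, not real project data) of the arithmetic a control chart relies on: derive limits from a stable baseline and only investigate the points that fall outside them.

    # A hypothetical control-chart calculation: limits are mean +/- 3 sigma of a
    # stable baseline; only observations outside the limits warrant a conversation.
    import statistics

    baseline = [4, 5, 3, 6, 5, 4, 5, 4, 6, 5, 4, 5]  # cycle time in days (made up)
    current = [5, 4, 13, 6]                          # latest stories (made up)

    mean = statistics.mean(baseline)
    sigma = statistics.pstdev(baseline)
    ucl = mean + 3 * sigma
    lcl = max(mean - 3 * sigma, 0)

    print(f"baseline mean = {mean:.1f} days, control limits = [{lcl:.1f}, {ucl:.1f}]")
    for i, days in enumerate(current, start=1):
        status = "outside limits, investigate" if not lcl <= days <= ucl else "within limits"
        print(f"story {i}: {days} days ({status})")

Reporting at this level keeps the management informed about whether the process is stable without burdening the team with number-gathering for its own sake.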