Sunday, October 31, 2010

When in an inception...

Everyone is an analyst
Don't get bound by role boundaries. Everyone needs to understand the system. Talk to the clients, ask questions, draw diagrams, make suggestions, understand problems and solve them.

Make sure everyone facilitates sessions at least once
Especially the BAs since they have to interact the most with the clients. Clients need to feel confident about the BAs on the team. Other members should also be actively involved. Don't let a single person be the scribe all the time. The client might ask "why am I paying for this guy" at some point.

Have an intent for each session
and strive to achieve it. Have back up plans ready in case you find that your planned tools / techniques are not working for the client. Fail fast, change your approach, take a break and regroup, but make sure you get the right information out of a session.

Educate the client about the way you work
They should know what it would be like to work with you day-to-day. Clarify the time commitments and the kinds of input required from your clients (suggestions during story analysis, showcase feedback, testing). Have dedicated sessions that cover methodology, and take the subject offline if it comes up elsewhere; it can derail your session plans easily.

Be honest and upfront
About any shortcuts you have taken, any information that you don't have, any help that you need.

Understand each stakeholder
You should know the pressures that they are dealing with, individually. Know how you can help them.

Create. Shared. Vision.

Monday, July 19, 2010

Livescribe Pulse Smartpen

Ooooh! Ooooooh! I love it!!

This is the most awesomest thing I have bought this year.

There are enough reviews online for me not to write one of my own. But I'll mention the most useful features here:

  • Captures everything I write / draw. This is the basic promise. A very well kept one.

  • Captures good audio. It's not crystal clear or anything, but it works well in a meeting setting. I recorded three interviews today, along with notes. After I get back the headset that someone took by mistake, I'll experiment with the clearer 3D recording capabilities.

  • It links the audio and notes together so you can replay what you were saying when you wrote a particular word.

  • It searches. Even cursive handwriting.

  • And then there's other fun stuff like drawing a paper piano and playing it. Or using a calculator by tapping the paper calculator on the inside cover of the starter notebook. Or drawing a cross for navigating the pen's main menus. Or bookmarking audio at specific points.

  • It uses dot paper for all the magic. But you are not restricted to the notebooks supplied by the company. You can print your own dot paper for free using any half-decent printer!

All in all this is going to be very useful. The 2GB storage is more than enough. My 1:02:14 audio session used 12MB. So if you download the data even once a week, you'll be alright.
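A quick back-of-the-envelope check of that claim, assuming the ~12MB per hour rate from my session is typical:

```python
# Rough storage estimate for the 2GB pen, assuming the ~12 MB per hour
# of audio I observed is typical (handwriting data adds very little).
MB_PER_HOUR = 12        # observed: a 1:02:14 session used 12 MB
STORAGE_MB = 2 * 1024   # 2 GB model

hours = STORAGE_MB / MB_PER_HOUR
print(f"~{hours:.0f} hours of audio fit in 2 GB")  # ~171 hours
```

So even recording a couple of hours of meetings every day, a weekly download is plenty.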

I'm loving it!

Sunday, May 23, 2010


I facilitated immersion for the March 2010 batch along with Arun. On the first day we went through the ThoughtWorks values, culture and history, and then had a session on feedback. We value feedback a lot in ThoughtWorks because the way we deliver is based heavily on Constant Feedback and Continuous Improvement (thanks Sarah). On the second day we went through a brief description of the ThoughtWorks way of running projects (based on agile / lean principles). On the third day we were talking about iterations and about the problem of velocity becoming a target rather than remaining a basis for planning. I was saying that when we sign up for stories, we don't commit to the points signed up for; it is just a guess based on some information (yesterday's weather) and some gut feel. At this point someone asked the question.
When do we commit? What do you commit to? The client can't just trust you... there needs to be some concrete commitment.
I didn't realize how important this question was when it was asked. But thinking about it, I see that it is the basis of everything we do, isn't it? The basic agile principle of
Customer Collaboration over Contract Negotiation
The success of a software project does not depend on its completion on time or within budget. It depends on what benefit you get out of it.

Wednesday, March 17, 2010

Number Lust

I am an amateur photographer when I am not hacking project teams and building custom software at ThoughtWorks. Photographers are known to have bouts of lens lust* time and again, especially at the beginning. I have realized that some managers have similar urges when it comes to numbers and metrics. They suffer from acute number lust.

In ThoughtWorks, we believe in and encourage self organizing teams. BAs gather requirements, developers write code, QAs test and automate and the customer signs stories off in a flow. The Project Manager role is therefore reduced to making sure that nothing obstructs this flow.

The way to do this is to gather and analyze the right data and take action based on that analysis. For example:
  • Finding bottlenecks from the wall or a CFD and taking actions to fix the problems.
  • Looking critically at retrospective action items and seeing that all those issues are fixed (especially issues outside the team's control, like infrastructure).
  • Making sure that the right capabilities exist in the team at the right time. If not, get people from outside, arrange training, etc.
Some PMs are content with gathering just enough data to spot problems. But then there are some who can never get enough data.

They want to organise the whole world into neat boxes, label them, and track anything and everything possible. The walls soon fill up with useless charts, graphs and numbers that the team can't use in any way. The worst part is that no analysis is done around the data, even by the people who gather it. Having people in a 15-member team draw emotional seismographs every iteration and stick them on the wall is of no use if you are not going to make any decisions based on them.

Now I agree that gathering all possible information can enlighten us to some extent. But it's a matter of marginal utility. If I have to do 5 extra things on top of my job, just because we want to gather data, I am not doing it.

So guys, here's a humble request. Figure out with your client what kind of data they would like to see. Figure out as a team what you would like to improve and what data needs to be gathered to get you there. Don't burden your team with things that obstruct or slow down the flow. Keep it simple and keep it lean.

Our job is to provide quality software, not elaborate reports.

* lens lust - the dangerous urge to keep buying new, expensive lenses, not realizing that it's practice that will improve your photography, not lenses!!

Onsite Business Analyst

This has come up in various discussions recently and I want to put down my thoughts about the role and responsibilities.

For a ThoughtWorks team in India (or China) most of the work is offshore agile development. Clients are usually in the UK or the USA. The team is structured as follows:
  1. Offshore PM
  2. Offshore Devs
  3. Offshore QA(s)
  4. Offshore BA(s)
  5. Onsite BA(s)
This is how the communication works.

Of course there are other exchanges that take place, but the Business Q&A and the Technical Q&A are the most important pieces of concrete information exchanged.

Other companies, even traditional development outfits, have a role called the onsite co-ordinator. I believe this role facilitates similar discussions (although I hear they are more technical than business-related).

The onsite business analyst's main role is to facilitate business and technical discussions between the team sitting offshore and the client. She is not restricted to being just an onsite co-ordinator. She can take up a new stream to analyze by herself, and even feed it to an onsite development team, if there is one. But all this while keeping in mind that the main purpose is bridging the communication gap between the client and the development team.

At the same time the onsite business analyst shouldn't go overboard with this and become a single point of failure. The direct communication lines between the offshore team and client should always be open.

Tuesday, March 2, 2010


I have recently come out of a consulting assignment which gave me loads of time to read and think about processes, improvements and effectiveness. (People who follow me on Google Reader must have noticed.) It also got me thinking about introducing agile into a traditional IT outfit and what would make the introduction more effective.

Top -> Down
The Top -> Down approach is where someone in top management realizes (or is convinced) that agile is the solution to all their problems and goes on to "mandate" agile. This is not necessarily bad, if the organization hires the right consultants to bring agility into the project teams at the ground level.

Bottom -> Up
The Bottom -> Up is not really an approach. It's just a team realizing that they can do their stuff better and trying to bring in improvements without affecting the rest of the organization. Later, agility may spread into other teams virally, and this might trigger an organizational transformation with eventual buy-in from the management.

Here's what I think works best. Like any win-win solution, it's a mix of the two extreme approaches described above.

Top -> Down buy-in, Bottom-Up implementation
The management decides that they need to be agile to meet market conditions, but instead of pushing "agile" down the throats of the people below, they let the people at the ground level realize what agile means and how it changes the way the team works. Agile coaches are useful in this transition.

Sounds simple enough? Well here's the catch. How does the management know that the new process is effective?

The sad part is that in a traditional organization's hierarchy, the only interface between the management and the teams is "reports": reports on various parameters (metrics) that the management believes should be monitored to see how things are going. Whether the teams embrace agile themselves or the management asks them to be agile, the teams will need to report the right metrics to gain the management's confidence. If the teams don't do this, the management might start tracking the wrong metrics and hence encourage the wrong practices. What you measure is what you get.

I raised this topic with a bunch of people from the client organization, and we came up with a set of metrics based on the principle that we should report consistency rather than hard numbers. Consistency gives a measure of effectiveness and also encourages the right activities.

Here's the list that we came up with (there can be many more; the basic principle is to use proportions / percentages instead of hard numbers):
  1. Velocity Barometer without numbers (we mark completed stories in red on a barometer, but we take out the numbers; consistent colour means good, ups and downs mean trouble)
  2. Cumulative Flow Diagram (consistent thickness of bands vs actual numbers)
  3. Ratio of open vs closed bugs (percentage, instead of bugs found and bugs fixed)
  4. Test Coverage (percentage)
  5. Automated vs Manual tests (percentage)
Really, it all boils down to using just Control Charts for project reporting and management. What do you think?
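As a rough sketch of what reporting proportions instead of hard numbers could look like, here is a minimal example; the metric names and input counts are my own illustration, not from any real project or tool:

```python
# Sketch: report proportions rather than raw counts, so the numbers
# read as consistency signals instead of targets to game.
def percentage(part, whole):
    """Return part as a percentage of whole, or 0.0 for an empty whole."""
    return 0.0 if whole == 0 else 100.0 * part / whole

# Hypothetical iteration data
bugs_open, bugs_closed = 4, 36
tests_automated, tests_manual = 180, 20
lines_covered, lines_total = 8200, 10000

report = {
    "open bugs (%)": percentage(bugs_open, bugs_open + bugs_closed),
    "automated tests (%)": percentage(tests_automated, tests_automated + tests_manual),
    "test coverage (%)": percentage(lines_covered, lines_total),
}

for metric, value in report.items():
    print(f"{metric}: {value:.0f}%")
```

The point of dividing everything into percentages is that the management watches whether these values stay steady iteration over iteration, rather than chasing the raw counts themselves.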

Thursday, February 18, 2010

Testing considered wasteful??

A few days back @silvercatalyst posted on Twitter that one of the trainees in his session counted testing as waste. I retweeted it with a #funny, but @silvercatalyst said he actually agreed with it. So we twiscussed it for a while. (By the way, Twitter is just the wrong tool for discussing interesting things.) Back to the story.

Here's what we ended with after a few emails had been exchanged:
  • Testing is not wasteful. But testing as an activity after development (especially after a time gap) is wasteful
  • Some types of testing can be done upfront but other types still have to be done after the story is complete
  • There are ways to prevent bugs (rather than catch them) by Dev + QA and BA + QA pairing
Mixing this conversation with the Feature Injection technique and Mike Cohn's post on removing Finish-to-Start activities, I think that BA (or Customer), Dev and QA pairing on a story will provide a tremendous boost to cycle time and a significant reduction in bugs. But to achieve this, you need certain pre-conditions to be true.
  1. Co-location with the customer, or else a great BA who is an excellent customer proxy
  2. Poly-skilled team members (not just smart)
  3. Team members (including the Customer) open to working towards a moving target (with negotiable stories)
My next project will hopefully see this implemented, at least in small steps. Something a step ahead of the Ménage à trois that's already been tried out successfully.

Thanks @silvercatalyst for an interesting discussion and helping me put my thoughts together on a bunch of stuff I had read recently.

Friday, February 5, 2010

Back to the Basics - 1 - The problem

Reading Martin's ConversationalStories renewed my confidence in this draft post from about a year ago. I just couldn't put it in the right words and gave up on it. I keep talking about deterministic universes versus randomness and stuff like that, but the gist of the matter is simple.

Writing stories is not the JOB of a Business Analyst in agile development. Writing stories is a collaborative effort in which the Customer, BA, Dev and QA should all take part. This is the N in the INVEST principle.

And here's the post


I have been talking about tracking and trends and smells and quantum physics for some time, but here's a post that takes us back to the basics. This is about the very problems (with waterfall) that we are trying to solve using alternative approaches (agile, lean, hybrid, etc.).

As everything in life should start, we start by defining the problem. Waterfall is an approach that borrows heavily from Einstein's idea of a deterministic universe. The idea is that
If you know the position and velocity of each atom in the universe, you can accurately determine the state of the universe at a given point in future or the past.
The point is that this is just a theory. I am not contradicting it; I am just pointing to the simple fact that gathering knowledge about the position and velocity of every atom takes far too much time to be of any use.

Waterfall tries to do the same thing at a much smaller scale: at the project level. And the argument remains the same. It takes too much time to understand every aspect of a problem to be able to solve it successfully while it's still worth solving.

Agile is about improvisation. You accept the inherent randomness of the universe. This might be genuine randomness, or it might just feel random because we are not able to understand it, but at any rate the universe is random to human beings.

So we say: given that things are going to change in ways that cannot be determined, let's do our best to adapt to those changes as quickly as possible. To adapt is to understand what has changed, how it affects us, and what we can do to "maximize our happiness"* in the given situation.

This is where you require the key component of anything worth calling a success. Collaboration.

In the deterministic world of waterfall, the Business Analyst is supposed to be this wizard who understands the whole problem, who's affected and how, and formulates a solution to it all by herself. Few have succeeded at meeting this unrealistic expectation.

Agile says let us all work together towards solving the problem, and I think that's a more realistic way of getting to an optimum solution. What this means is that nobody's word is final on any solution unless everyone is happy: neither the client's, nor the Business Analyst's, nor the Developers'. That is what it means when we say that a user story is Negotiable (i-N-v-e-s-t).

* Increasing happiness as in the sole purpose of life.