The Agile Practitioner: Getting the Data

To understand what is meant by “long-range planning,” we need to go back to the traditional approach to product planning. Good product managers start by gathering evidence. In software development, they may receive usage data and user feedback directly from the product itself. They should also interact directly with users and other product stakeholders.

Data is king for product managers; those who operate without it, or with limited input from it, cannot consider themselves strong practitioners. In the software industry, where I work, there are two excellent sources for understanding the role of the product manager and the ways in which data can be collected: Marty Cagan is a leading consultant, and the Nielsen Norman Group offers a fountain of information on user experience (UX) design. I cite them here because it would be hard to separate what I've learned from them from what I've learned through my own experience. These thought leaders influence the practices of people in my own organization, and throughout the industry, so strongly that even when I learned something from someone else, it likely came through them originally.

Getting unbiased data is tricky. At ITHAKA, we have a User Insights team whose sole purpose is to help other teams gather actionable feedback that informs the product's direction. Because they are experts in this area, they keep us off the jagged rocks of bias. It turns out that we inject our own bias into our inquiries without realizing it, and respondents pick up on the message.

Some examples are obvious:

What do you like about this feature?

This question assumes that the respondent likes the feature, or something about it. It doesn't seem biased, but it is a typical loaded question. One might temper this by rephrasing the question:

Are there good qualities about this feature? If so, please describe.

We have now removed the loading, but the bias remains around the word “good.” To remove all bias, we need to stay away from these judgement words. So, our question becomes something more like:

What is your reaction to this feature?

There are no loaded words here. We are asking for a reaction, which is free to be good or bad. Having people with a strong background in the psychology of bias can be invaluable. Without that expertise, product managers often end up confirming their own biases without realizing it. They carry on thinking that they are gathering good data when, in fact, it has been tainted.

Of course, questioning users carries the risk of bias, whereas collecting data directly from user behavior is a safer, if more technically challenging, approach. Both are needed for a complete picture. Tools for data collection abound. Two popular options come from industry stalwarts Adobe and Google. Adobe has retired its SiteCatalyst product, which has been replaced by the Adobe Experience Platform. Probably the most popular is Google Analytics, which has become part of the Google Marketing Platform.

Both of these systems can collect some data in their basic form, but by embedding tracking codes into the user interface software, much more detailed information about user behavior can be collected and analyzed. Firms that are serious about understanding the behavior of their users will track and log most, if not all, user actions. This data is sometimes referred to as clickstream (or click path) data. I first heard this term from Ralph Kimball, who literally wrote the book on data warehousing. There are many other techniques for gathering user data from the software itself, including A/B testing and heat mapping.
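To make this concrete, here is a minimal sketch of how click-level tracking might be wired into a user interface. It is not tied to Adobe's or Google's actual SDKs; the /events endpoint and the event payload shape are hypothetical placeholders.

```typescript
// Minimal clickstream capture sketch (generic, not tied to any vendor SDK).
// The /events endpoint and the payload shape are hypothetical placeholders.

interface ClickEvent {
  page: string;      // path the user was on
  target: string;    // identifier of the element that was clicked
  timestamp: number; // client-side time of the click, in ms since epoch
}

function trackClicks(endpoint: string): void {
  document.addEventListener("click", (e: MouseEvent) => {
    const el = e.target as HTMLElement | null;
    if (!el) return;

    const event: ClickEvent = {
      page: window.location.pathname,
      target: el.id || el.tagName.toLowerCase(),
      timestamp: Date.now(),
    };

    // sendBeacon delivers the payload even if the user navigates away.
    navigator.sendBeacon(endpoint, JSON.stringify(event));
  });
}

trackClicks("/events");
```

In practice, a vendor tag or in-house collector would add session identifiers, batching, and consent handling, but the principle is the same: every meaningful interaction becomes a logged event that can be analyzed later.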

Using the Data

Assuming that you've gathered lots of useful data, the real art is in figuring out what to do with it. Some data will speak loudly and clearly about the need for action. Unvisited pages, for example, call for further research. Do users not understand the labeling? Do they not see a need for the information contained within? Is the navigation confusing? If the answers to these questions aren't evident in the data, then the next action is another experiment. However, if the answers are available, then remediation is in order.
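As a sketch of how that kind of signal might be pulled out of clickstream data, the snippet below counts distinct visitors per page and flags pages that fall below a threshold. The PageView shape, the sample data, and the threshold are illustrative assumptions, not a description of any particular analytics product.

```typescript
// A minimal sketch of mining clickstream data for under-visited pages.
// The PageView shape and the visitor threshold are illustrative assumptions.

interface PageView {
  page: string;   // path that was viewed
  userId: string; // anonymized visitor identifier
}

// Count distinct visitors per page and flag pages below the threshold.
function findUnderVisitedPages(views: PageView[], minVisitors: number): string[] {
  const visitors = new Map<string, Set<string>>();

  for (const view of views) {
    if (!visitors.has(view.page)) {
      visitors.set(view.page, new Set());
    }
    visitors.get(view.page)!.add(view.userId);
  }

  return [...visitors.entries()]
    .filter(([, users]) => users.size < minVisitors)
    .map(([page]) => page);
}

// Pages returned here are candidates for follow-up research, not automatic removal.
const flagged = findUnderVisitedPages(
  [
    { page: "/search", userId: "u1" },
    { page: "/search", userId: "u2" },
    { page: "/advanced-search", userId: "u1" },
  ],
  2
);
console.log(flagged); // ["/advanced-search"]
```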

Armed with a mountain of data, some of it clearly pointing to specific product changes and some suggesting further research, how does the product manager prioritize work? The Agile Manifesto provides!

Principle #1:

Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.

There are two organizing principles here: first, satisfying customers, and second, “early” and continuous delivery. These two principles do not always play nicely with each other. Product managers must weigh highly valuable new features that will require extensive work against less valuable features that can be delivered much more quickly.

A technique that I have used successfully to help with this is a weighted point system. If you value speed of delivery more highly than value to the user, you can give it more weight. I suspect most product managers would prefer to weigh user value more heavily. Even so, a feature with a very high speed-to-market score could ultimately outweigh a more valuable feature with a low speed-to-market score.

[Figure 1: Weighted feature prioritization]

As we see in this illustration, the feature with the lowest user value may rank higher because agile practices prize speed of iteration. Quick wins allow us to learn more quickly. While this may not always translate to solving the big user challenges, showing users continuous improvement provides a sense that progress is being made.
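To make the weighting concrete, here is a minimal sketch of a weighted point system. The feature names, scores, and weights are illustrative assumptions rather than values from the figure; note that with these weights, the quicker, less valuable feature edges out the more valuable one, which is exactly the trade-off described above.

```typescript
// A minimal sketch of a weighted point system for prioritization.
// Feature names, scores, and weights are illustrative assumptions.

interface Feature {
  name: string;
  userValue: number;     // 1-10: how valuable the feature is to users
  speedToMarket: number; // 1-10: how quickly it can be delivered
}

// Weights reflect how much the team favors user value versus delivery speed.
const USER_VALUE_WEIGHT = 0.6;
const SPEED_WEIGHT = 0.4;

function weightedScore(f: Feature): number {
  return f.userValue * USER_VALUE_WEIGHT + f.speedToMarket * SPEED_WEIGHT;
}

const backlog: Feature[] = [
  { name: "Redesigned search", userValue: 9, speedToMarket: 2 },
  { name: "Saved searches", userValue: 6, speedToMarket: 7 },
  { name: "Dark mode", userValue: 4, speedToMarket: 9 },
];

// Sort the backlog from highest to lowest weighted score.
const ranked = [...backlog].sort((a, b) => weightedScore(b) - weightedScore(a));
ranked.forEach((f) => console.log(`${f.name}: ${weightedScore(f).toFixed(1)}`));
// Saved searches: 6.4, Redesigned search: 6.2, Dark mode: 6.0
```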

Sometimes features can be broken down into smaller components that deliver partial value to users. Agile practitioners are always on the lookout for these types of opportunities, but big changes are often hard to deliver in this manner.

The Roadmap Conundrum

Armed with a ranked list of improvements and their corresponding work efforts, it is only natural that a product manager (PM) would want to present this list in the form of a product roadmap. In order to create the list, we had to assess the development effort, at least at a high level, so the PM should be able to put a rough time scale around the list of initiatives. This is, by definition, a product roadmap.

To be clear, I am not advocating against product roadmaps. They can be a useful tool. The challenge here is to avoid the traditional use of the roadmap. Traditionally, PMs would build a roadmap for a set period of time, typically one to five years depending on the scale of the effort involved.

Even when software developers were still using a waterfall technique for design and build, roadmaps rarely survived their initial plan. Some level of flexibility is always required. In an environment where agility is practiced, roadmaps have even less chance of surviving intact. Therefore, they must become dynamic documents that are continually changing, which means the way they are used must also change.

[Figure 2: Portfolio Roadmap]

This graphic attempts to illustrate the nature of roadmaps. The farther into the future we look, the more opaque and less reliable our vision becomes. This does not mean we shouldn't identify lower-priority opportunities. It simply means that we should not treat the roadmap as a work plan, but rather as a dynamic, living, breathing document that changes as we go along.

There are a number of key reasons why we must do this:

  • User preferences change
  • Market or other environmental conditions change
  • Approaches to implementing higher-priority features obviate the need for lower-priority features
  • Organizational priorities shift
  • New technologies emerge that present new opportunities

These reasons can (and usually do) act in combination to destabilize longer term plans. However, by having a plan, we can continually reevaluate and adjust — throwing out ideas that are no longer valid and adding new features as they are identified.

Leaders who create a product roadmap and then force the organization to adhere to it do their organizations a great disservice. Teams that work on product development every day see, up close, the many reasons why plans change. Some of the items on the list above are more visible to development teams than to leadership, so leaders may think the plans they validated in the past are still valid. If they force their view on teams, the teams will become disillusioned by doing work that they don't perceive as important or valuable.

The risk to the product in this scenario is twofold. Not only does product development fail to follow an optimal path, but team members will start to lose enthusiasm for the work. In a world in which users have many options, switching costs are dropping, and software development professionals are scarce and in high demand, having a product roadmap that is much more than a suggestion of a possible future is dangerous.

Cautious Optimism

Some agile purists will denigrate the product roadmap as an artifact of the waterfall past. But agile practice should not become rife with dogma. The whole point behind agile principles is that very few things are sacred: the customer comes first, iterate quickly to learn, be free to make changes based on lessons learned, and, most importantly, value communication (with teammates, stakeholders, leadership, and users).

A product roadmap can be a good guide to supporting all of these practices if it is used in a flexible manner. Just remember, the roadmap is not the thing, the customer is the thing. The roadmap is simply a tool to efficiently share ideas in support of improving the customer's experience.


Tom Bellinson

Mr. Bellinson has been working in information technology positions for over 30 years. His diverse background has allowed him to gain intimate working knowledge in technical, marketing, sales, and executive roles. Most recently, Mr. Bellinson serves as a Scrum Master for ITHAKA, a global online research service. From 2008 to 2011, Bellinson worked with at-risk businesses in Michigan through a state-funded program administered by the University of Michigan. Prior to working for the University of Michigan, Mr. Bellinson served as Vice President of an ERP software company, as an independent business and IT consultant, as chief information officer of an automotive engineering services company, and as founder and President of a systems integration firm that was a pioneer in the Internet services marketplace. Bellinson holds a degree in Communications with a minor in Management from Oakland University in Rochester, MI, and has a variety of technical certifications, including APICS CPIM and CSCP.