When thinking about analysis, people often think about a set of techniques, such as elicitation and modeling. While those techniques are important, the thought process you use when performing analysis is critical for determining the right thing to deliver.
To give you an idea of what a good thought process looks like, here are my 10 tips for analysis with an agile mindset. In Beyond Requirements: Analysis with an Agile Mindset, I described the agile mindset as a focus on seven guiding principles, including:
- Deliver value
- Consider context
- Decide wisely
- Reflect and adapt
1: Maximize outcome with minimal output.
Outcomes are changes in the world that happen because of your work. For IT projects, outcomes show up as changes in the organization or in your stakeholders' behavior. You deliver outputs in order to achieve some outcome. Outputs in IT projects can include code, tests, requirements, and documentation.
Because outputs are typically easier to measure than outcomes, a lot of teams and organizations measure progress and define scope by the number of outputs produced. But it's possible to produce a large number of outputs, spending a lot of money in the process, and still not reach the desired outcome. It's better to focus on the outcome that you want and as a team determine the minimum outputs necessary to deliver that outcome. This approach shortens the time required to deliver that outcome, reduces the cost of producing the outcome, and decreases long-term costs, as fewer outputs (code, tests, documentation) need to be maintained.
The secret is to build a shared understanding of the problem your stakeholders are trying to solve (outcome) and then determine the most appropriate solution (output) for addressing that outcome. Don't take stakeholder requests at face value; instead, dig a little deeper to understand what's really behind each request, and then decide whether the request is relevant to your given solution. If it is relevant, determine the underlying need that the stakeholder is trying to satisfy, and tackle that need. If the request is not relevant, explain to the stakeholder why it's not appropriate to address in this particular effort.
2: Start with the outcome and then consider outputs.
Once you understand the outcome you're trying to deliver, as well as the assumptions underlying that outcome, use that information to guide what you do next. Select the outputs (often expressed as features) that allow you to make progress toward meeting the targeted outcome, or that help validate assumptions. Which aspect to focus on first depends on how far along you are in the initiative. At the start, you'll spend more effort validating assumptions (you can also think of this as reducing uncertainty), followed by delivering features that you know provide the value you seek.
The key point here is to identify value first, and then iteratively identify the features that you need to deliver that value. Don't brainstorm a big list of possible changes and try to figure out what each feature could contribute to business value.
Measuring value at the granularity of a user story is very difficult and can generate wasted effort. Too often, a team spends time overanalyzing the value points assigned to a story, when it could have made the same priority decision another way; supporting prioritization is all that value points are really intended to do. By working from outcome (value) to output (features) instead of the other direction, you'll have fewer items to manage at any point, and you'll avoid the tricky business of trying to assign value to any specific change.
3: Do only what is absolutely necessary to deliver value.
In addition to delivering only required outputs, deliver those outputs using only necessary activities. Put into practice, this means that the approaches your team uses should be barely sufficient, adjusted along the way as experience dictates. You also shouldn't get too hung up on the arcane semantics of modeling techniques. Remember that you're generally using modeling to aid communication and build a shared understanding. Imperfect is okay, as long as everyone involved in the discussion understands what the model conveys. You can always talk through any remaining confusion.
Complicated processes or frameworks are rarely good ways to address complexity. In fact, the more complicated a process is, the less likely people are to follow it effectively—and, perversely, the more likely they are to hide behind the complicated process, to the detriment of the entire team. Keep your processes simple, and adjust them as you learn.
4: Delivering value is a team sport.
In a good collaboration, team members commit to meeting a joint goal, and they're not afraid to step outside their area of specialization to help others on the team. All the team members have a specialty (such as development, testing, or analysis) on which they spend a considerable part of their time, but when the need arises, they should be able to jump in and work on something else to help the team meet its overall goals.
For a business analyst, this might involve facilitating whole-team collaboration, using team member and stakeholder insights to aid in analysis, and helping out with testing and documentation when other team members get stuck. Analysts shouldn't hoard all of the analysis work for themselves, or restrict their contribution to analysis.
5: Before delivering a solution, build a shared understanding of the need.
Before starting an IT project, you should understand why you're doing it—in other words, the problem you're trying to solve. If you understand the problem you're trying to solve or the opportunity you're trying to exploit (the need), you have a better chance of choosing the most effective solution, and you'll avoid putting needless time and effort into creating a solution that isn't needed.
Understanding the need first and being able to describe it gives you the opportunity to build a shared understanding with your team about why you're considering starting (or continuing) a particular project. It also gives you a basis for asking the question, "Is this need worth satisfying?"
6: Be intentional about your decision-making.
Success in many types of organizations (for-profit, not-for-profit, governmental) depends on making well-informed, timely decisions. I've worked on successful projects in successful organizations, and one characteristic that always seemed to be present was clear decision-making. Conversely, in cases where I had the opportunity to learn from less-than-desirable situations, one factor that always seemed to be present was poor (or nonexistent) decision-making.
An important aspect of decision-making is who actually makes the decisions. That person should be as informed as possible and also be in a position to make the decisions stick. In many organizations, the people who are expected to make most decisions—senior leadership—are not the best informed; decisions require in-depth, detailed knowledge that leaders may not have.
One very effective way of resolving these issues is by spreading decision-making throughout the organization. This helps to ensure that the people with the relevant information are the ones who make certain decisions. A prime example is teams deciding the best way to approach a project, given that they have the proper understanding of the desired outcome for the project, and they understand the constraints under which they must work.
7: Shorten the feedback cycle.
Unlike operational work, IT projects and other types of knowledge work rarely use directly repeatable processes. When you're engaged in operational work, such as assembling a vehicle or processing a claim, many of the steps can be copied directly from one unit to the next. Identifying improvements becomes easier because often there is very little time between cycles of a particular set of work tasks. Operational work is repetitive and fairly predictable; you can always learn how to do it better.
Knowledge work projects are like snowflakes: No two are alike. Even if you get the opportunity to experience multiple projects, the lessons you learn from one probably won't apply directly to the next. A focus on continuous learning, with iterations as a key component, reminds your team to stop every so often and figure out what it can revise. This approach also helps to identify meaningful milestones, with progress shown as actual working output rather than intermediate artifacts.
8: Validate assumptions early and often.
It's important to validate assumptions early in your project, so that you can determine whether you have identified the right solution to the right problem. Asking your stakeholders for feedback is helpful, but due to the influence of cognitive biases, they could give you misleading information. The Build-Measure-Learn loop provides a way to validate assumptions in conjunction with talking to your stakeholders. It also encapsulates the overall approach to building and getting feedback, which is a key aspect of the guiding principle to reflect and adapt.
Quick cycles through the Build-Measure-Learn loop can help your team reduce the uncertainty that often comes along with IT projects. You can reduce the uncertainty bit by bit with rapid trips through the Build-Measure-Learn loop to validate assumptions. Start by tackling the biggest or riskiest assumptions. Eric Ries calls them the "leap-of-faith assumptions," but it may be easier to think of them as the assumptions that, if proven wrong, can really reduce the chances of the project being successful.
9: Learn from the past to improve your future.
Your team should continuously learn from its experiences if you want to improve your approach and the outcome of the project. Projects often last longer than a couple of months. During that time, business conditions, team member understanding of the purpose of the project, and the environment surrounding the project will all grow and change. Your team should seek to use that change to its advantage, to ensure that the project's outcome meets the needs of your stakeholders when the result is delivered—not just the perceived needs of the stakeholders when the project started.
Project teams have long done postmortems or lessons-learned sessions, where team members gather at the end of the project to talk about what happened—usually the negative aspects—in hopes that they can do better next time. If that end-of-project analysis is considered a good practice, wouldn't it make sense to do the same thing during the project, when the team still has time to make changes that affect the outcome? This is the idea behind retrospectives, which provide teams with a mechanism to discuss what has transpired on the project to date—things the team did well, along with opportunities for improvement—and decide what course corrections should be made.
10: Always remember, it all depends.
The term best practice frequently describes techniques or processes that were successful for one project or organization and are being copied by others. Unfortunately, what works for one project may not work as well in other situations. Many environmental factors can play a role in the effectiveness of a practice for a given project. For this reason, I prefer to use the term appropriate practices or good practices, emphasizing the fact that there really are no best practices across all projects.
Teams need to consider context when choosing which processes, practices, and techniques they use, so they can be sure they're doing whatever will make them successful—and skipping anything that's not necessary. Perhaps considering context is the only real best practice.
With the proper mindset and a great deal of self-discipline, a team can be successful with minimal process. Without that mindset, teams end up layering on process to force the collaboration that comes naturally to those who have it. I encourage you to think about how you can adopt an agile mindset, to help you achieve the right outcome for your stakeholders more effectively. If you're looking for assistance with how to do that, Beyond Requirements: Analysis with an Agile Mindset might be helpful. (Of course, I may be a little biased.)