The Kickstart Process

The kickstart process covered here is particularly effective when you’re faced with limited time or resources—in other words, when teams or organizations need to better manage workflow, but don’t have sufficient resources to develop those better ways. Naturally, this is not the only way to introduce Scrumban to teams.2 Always use your understanding of Scrumban principles to modify the process to fit your context.

Incidentally, this process is modeled after a Kanban kickstart process developed by Christophe Achouiantz, a Lean/Agile coach, and Johan Nordin, development support manager at Sandvik IT. Sandvik needed a way to empower its company’s teams to begin a process of continuous improvement that wouldn’t involve a significant initial investment of time or effort. Achouiantz and Nordin have documented this process and made it available to the public as a reference guide. While the particulars may not apply directly to your needs, it’s an outstanding summary of the key points and objectives relevant to any process.

Preparation

As systems thinkers, we must first understand the context of the environment before attempting to kickstart anything. At a minimum, preliminary preparation should include the following steps:

  • Identifying the current conditions and organizational priorities (e.g., increased quality, productivity, predictability)
  • Developing working relationships with key managers and team leads
  • Ascertaining areas of desired change
  • Introducing Kanban/Scrumban as a framework for evolutionary change
  • Starting to educate staff on Scrumban’s core principles and practices
  • Identifying any risks and potential impediments to introducing Scrumban to targeted teams

Regarding that last item, it’s important to recognize that some teams or contexts may not benefit from introducing Scrumban until certain conditions or impediments are addressed. These can include teams in an active state of reorganization, teams with serious internal conflicts, and so forth. Your context will determine how you address these situations. For example, Sandvik’s needs dictated that such teams be bypassed until circumstances changed.

Though it’s possible for teams to initiate Scrumban adoption without organizational buy-in, as previous chapters indicate, broader engagement is necessary to achieve the full breadth of desired outcomes and to sustain them over time. In this respect, Scrumban is not substantially different from a pure Scrum context.

In his book Succeeding with Agile: Software Development Using Scrum (Boston: Addison-Wesley, 2009), Mike Cohn addresses a variety of considerations that apply when you’re selecting the project context in which to roll out a new process (these considerations are made with an eye toward building off demonstrated success). Though less critical in a context where the Scrumban or Kanban framework is employed, they nonetheless represent worthy considerations (see Figure 5.1 for a graphical illustration of the convergence of these considerations):

Figure 5.1

Figure 5.1 The convergence of key considerations in selecting a pilot project for introducing Scrum. These same considerations apply to Scrumban.

Source: Michael Cohn. Succeeding with Agile: Software Development Using Scrum (Boston: Addison-Wesley, 2009).

  • Project size: For Scrum pilots, Mike suggests selecting a project that will not grow to more than five teams, and limiting initial efforts to just one or two of those teams. Coordinating work across a larger number of Scrum teams creates more overhead than a pilot effort can effectively absorb.

    For Scrumban rollouts, there’s substantially more leeway. With fewer concepts to learn and master (again, we start out respecting current roles and processes), working with a slightly larger number of teams is feasible.

  • Project duration: Select a pilot project of average duration for your organization, ideally around 3–4 months. This is plenty of time for teams to begin mastering the framework and seeing benefits. It’s usually also sufficient for demonstrating that your new approach will lead to similar success in other contexts.
  • Project importance: Mike suggests that with Scrum pilots, it’s critical to select high-profile projects. Unimportant projects will not get the organizational attention necessary to make Scrum successful, and team members may be disinclined to do all the hard things Scrum requires.

    There’s slightly more leeway with Scrumban. Building momentum and trust through success is still an important objective, but the framework’s service-oriented agenda and active management of psychological barriers to change represent additional capabilities that can be leveraged to ultimately satisfy these needs.

  • Business owner’s level of engagement: As previously mentioned, Scrum and Scrumban ultimately require changes in the way both the business and the technology sides of the house approach things. Having an engaged player on the business side can be invaluable for overcoming challenges and evangelizing success across the organization.

    Scrumban modifies the weight of this factor substantially. To function properly, Scrum requires active engagement on the business side—its prioritization and feedback functions depend on it. Scrumban won’t function well without business engagement, but it does afford teams the ability to create positive changes by allowing knowledge discovery to “stand in” for disengaged business players.

    In a similar vein, Scrumban is ultimately designed to respond to business demands. If the business isn’t demanding change, then Scrumban’s pragmatic “response” is to reflect the absence of any need to change (although a culture of continuous improvement will continue to spawn incremental changes to make what you’re already doing well even better).

Initial Considerations

Many topics can be addressed during a kickstart. What constitutes “information overload” for a particular group will vary from context to context, and must always be judged against the level of ongoing support available following the kickstart (for example, you will likely cover more material if teams will have coaching support following the kickstart than if they will have none at all).

With these considerations in mind, incorporating themes that reinforce the service-oriented philosophy and the organization’s core values will go a long way toward positioning teams to evolve more effectively. There are many ways to do this, and it’s one area where an experienced practitioner or coach will really add value to the process.3

One special consideration for existing Scrum teams may be to introduce the concept of Sprint 0 (if this tactic is employed within the involved team or organization). There’s much debate about the “appropriateness” of Sprint 0 in Scrum circles. These special sprints are often described as necessary vehicles for managing what needs to be done before a Scrum project can start. For example, the team may need to be assembled, hardware acquired and set up, and initial product backlogs developed. Purists’ feathers may be ruffled by the notion of differentiating sprints by type and, more importantly, by the idea that any sprint would be structured in a way that fails to deliver value.

The Scrumban framework obviates the need for debate on these issues. The tasks associated with ramping up a new project can be easily accommodated and managed within the Scrumban framework as a special work type. If it’s important for your environment to ensure Scrum ceremonies hold true to their Agile objectives, then Scrumban provides a ready solution.

It also makes sense to assess existing Scrum practices with an eye toward understanding their impact on performance. Larry Maccherone and his former colleagues at Rally Software have analyzed data from thousands of Scrum teams in an effort to assess how differences in the way Scrum is practiced affect various elements of performance. This data mining exposed some interesting realities about how our choices involving Scrum practices can influence various facets of performance. Incidentally, Maccherone’s analysis is consistent with data mined from hundreds of teams using my own Scrum project management platform (ScrumDo.com).

Maccherone elected to adopt a “Software Development Performance Index” as a mode of measuring the impact of specific practices. The index is composed of measurements for the following aspects:

  • Productivity: Average throughput—the number of stories completed in a given period of time
  • Predictability: The stability of throughput over time—how much throughput values varied from the average over time for a given team
  • Responsiveness: The average amount of time work on each user story is “in process”
  • Quality: Measured as defect density
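
As a rough illustration, all four components of such an index can be derived from data teams already track. The following Python sketch assumes simple per-story records of (sprint number, days in process, defect count); the record layout and the use of the coefficient of variation for predictability are assumptions for illustration, not Maccherone's actual formulas:

```python
from statistics import mean, pstdev

# Hypothetical completed-story records: (sprint, days_in_process, defects)
stories = [
    (1, 3, 0), (1, 5, 1), (1, 2, 0),
    (2, 4, 0), (2, 6, 0),
    (3, 3, 1), (3, 2, 0), (3, 4, 0),
]

# Productivity: average throughput (stories completed per sprint)
per_sprint = [sum(1 for s, _, _ in stories if s == n)
              for n in sorted({s for s, _, _ in stories})]
productivity = mean(per_sprint)

# Predictability: stability of throughput over time, expressed here as
# the coefficient of variation (lower = more stable)
variation = pstdev(per_sprint) / productivity

# Responsiveness: average days each story spends "in process"
responsiveness = mean(d for _, d, _ in stories)

# Quality: defect density (defects per completed story)
quality = sum(defects for _, _, defects in stories) / len(stories)

print(productivity, variation, responsiveness, quality)
```

In practice, each component would be normalized before being combined into a single index; the point here is only that none of the four requires data beyond what a board and a defect tracker already capture.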

How You Choose to Organize Makes a Difference

As noted, Larry Maccherone’s analysis reflects a definite correlation between certain choices and software development performance. Although correlation does not necessarily mean causation, we can use these relationships as a guide for tailoring a kickstart process to address specific factors relevant to a particular implementation. Teams can be guided to position their visualizations and practices to better understand and discover which choices regarding Scrum practices are likely to work best for their desired outcomes.4

Determining Team Size

Humans are the slowest-moving parts in any complex organization. Teams can help us counteract this reality by making us smarter and faster. However, this outcome is possible only if we get teams right. Team dynamics is Scrum’s strong suit—and an arena Kanban addresses only tangentially, if at all. In fact, providing a framework that helps us improve team dynamics is a great example of how Scrum boosts Kanban’s capabilities.

To this end, size stands out as the most significant predictor of team success. There’s a right size for every team, and like so many other aspects of managing systems, that size should be dictated by overall context. Even so, having guidelines doesn’t hurt.

The military is a great starting point for gaining perspective on ideal team size. The basic unit in the U.S. Army’s Special Forces is the 12-person team. The army arrived at this number after recognizing there are certain dynamics that arise only in small teams. For example, when a team is made up of 12 or fewer people, its members are more likely to care about one another. They’re more likely to share information, and far more likely to come to each other’s assistance. If the mission is important enough, they’ll even sacrifice themselves for the good of the team. This happens in business, too.

The founders of Scrum understood these dynamics. Ask anyone in the Scrum community about ideal team size, and you’ll hear 7 members plus or minus 2.

The reasons for maintaining small team sizes are valid, but not every 12-person military unit is right for a particular assignment. Likewise, not every 9-person Scrum team is right for a given environment. Your system ultimately will dictate your needs, and it may very well require expanding your teams to incorporate a larger number of specialists or address some other need.

Scrumban supports this flexibility through its enhanced information sharing and visualization capabilities. Even the cadence of daily stand-ups is different: We’ve seen daily stand-ups in a Scrumban environment involving almost 100 individuals that are both effective in addressing organizational needs and capable of being completed within 15 minutes. This would be impossible in a Scrum context.

Iteration Length

Many organizations seek to establish uniform iteration lengths because they mistakenly believe this is a good approach for aligning different systems to the same cadence. Yet for many years, the general consensus among Scrum practitioners has been to recommend two-week iterations. What does the data tell us?

Rally Software’s data shows that almost 60% of the software development teams using its tool adopt two-week sprints, and I’ve seen similar patterns among users of ScrumDo. The data also suggests a team’s overall performance tends to be greatest at this cadence (Figure 5.2).

Figure 5.2

Figure 5.2 A view into the relationship between iteration length and team performance.

Source: © 2014 Rally Software Development Corp. All rights reserved. Used with permission.

Dig more deeply, however, and differences begin to show up among the various components of the index. Throughput, for example, tends to be greatest among teams adopting one-week iterations (Figure 5.3).

Figure 5.3

Figure 5.3 A closer look at productivity.

Source: © 2014 Rally Software Development Corp. All rights reserved. Used with permission.

Quality, in contrast, trends highest among teams adopting four-week cadences (Figure 5.4).

Figure 5.4

Figure 5.4 A closer look at quality.

Source: © 2014 Rally Software Development Corp. All rights reserved. Used with permission.

Every organization possesses characteristics that drive different outcomes, which is where Scrumban’s visualization and added metrics can help teams discover what’s best for their unique circumstances. The higher throughput typically associated with two-week cadences won’t magically materialize if the historical lead time for most of your completed work items is three weeks. Similarly, delivering completed work every two weeks won’t magically produce better outcomes if the customers can’t accept it at that rate.
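
The first of these sanity checks can be made concrete: before committing to a cadence, compare your historical lead times against the candidate iteration length. A minimal sketch, where the 85% threshold and the sample history are assumptions chosen for illustration:

```python
def fits_iteration(lead_times_days, iteration_days, target_fraction=0.85):
    """Return True if at least target_fraction of historical work items
    completed within one iteration of the candidate length."""
    within = sum(1 for t in lead_times_days if t <= iteration_days)
    return within / len(lead_times_days) >= target_fraction

# Hypothetical history: most items take about three weeks to complete
history = [18, 21, 15, 22, 19, 25, 20, 17, 23, 21]

print(fits_iteration(history, 14))  # two-week cadence: False for this history
print(fits_iteration(history, 28))  # four-week cadence: True for this history
```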

Also, be wary of inferences drawn from data associated with teams that have adopted longer iterations. Frank Vega, for example, recently observed that some teams choose longer iteration periods as a way of opposing (consciously or subconsciously) Agile transformation efforts: agreeing to a shorter cycle would undercut their position that nothing of real value can be delivered in so short a time.

Estimating Story Points

Scrum prescribes the Sprint Planning ceremony as an event that determines which specific stories can be delivered in the upcoming sprint and how that work will be achieved.

Although Scrum itself doesn’t prescribe a particular approach or method that teams should use to assist with this decision making (other than expecting it to be based on experience), most Scrum teams use story points and team estimation techniques as components of this process.

Another process that teams use to aid their estimation efforts is Planning Poker—a consensus-based technique developed by Mike Cohn. With this approach, which is used primarily to forecast effort or relative size, team members offer estimates by placing numbered cards face-down on the table (rather than speaking them aloud). Once everyone has “played,” the cards are revealed and estimates discussed. This technique helps a group avoid the cognitive bias associated with anchoring, where the first suggestion sets a precedent for subsequent estimates.
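
The mechanics of a Planning Poker round can be sketched in code. In this hypothetical version, estimates are committed privately and revealed simultaneously to avoid anchoring, and a story is sent back for discussion when estimates diverge too widely; the deck values and the "adjacent cards are close enough" rule are assumptions, not a prescribed part of Cohn's technique:

```python
# A modified Fibonacci deck of the kind commonly used for Planning Poker
DECK = [0, 1, 2, 3, 5, 8, 13, 20, 40, 100]

def poker_round(private_estimates):
    """Reveal all estimates at once and decide whether discussion is needed.

    private_estimates: dict of member name -> chosen card value.
    Returns (consensus_value or None, needs_discussion).
    """
    values = list(private_estimates.values())
    if any(v not in DECK for v in values):
        raise ValueError("estimate not in deck")
    # Simultaneous reveal avoids anchoring: no one sees another member's
    # card before committing to their own.
    if len(set(values)) == 1:
        return values[0], False  # immediate consensus
    # Assumed convention: adjacent cards are "close enough" to take the
    # higher value; a wider spread sends the story back for discussion.
    lo, hi = min(values), max(values)
    if DECK.index(hi) - DECK.index(lo) <= 1:
        return hi, False
    return None, True

print(poker_round({"ana": 5, "ben": 5, "cal": 5}))   # consensus
print(poker_round({"ana": 3, "ben": 5, "cal": 5}))   # adjacent cards: take 5
print(poker_round({"ana": 2, "ben": 13, "cal": 5}))  # wide spread: discuss
```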

Scrumban enables teams to begin measuring the correlation between story point estimates and the actual time spent completing development, measured from the time work on a story begins until the story is completed. Some teams reflect a strong correlation between their estimates and actual work time. Others are more sporadic, especially as the estimated “size” of stories grows. Teams using exponentially based story size schemes tend to be more precise with their estimates. Comparing the two charts in Figure 5.5 and Figure 5.6, it’s fairly evident that exponential estimation reflects a tighter band of variability in lead time—especially among smaller stories. The spectrum of variability is much wider across the Fibonacci series.
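
Measuring that correlation requires nothing more than the Pearson coefficient over (estimate, lead time) pairs. A self-contained sketch, with hypothetical data standing in for a team's actual history:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: story point estimates and actual lead times (days)
points    = [1, 2, 3, 5, 8, 13, 2, 5, 8, 3]
lead_time = [2, 3, 4, 6, 9, 20, 2, 8, 7, 5]

r = pearson(points, lead_time)
print(f"correlation: {r:.2f}")  # closer to 1.0 = estimates track reality
```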

Figure 5.5

Figure 5.5 There tends to be only a loose correlation (if any at all) between estimated story points and actual lead time.

Figure 5.6

Figure 5.6 The correlation is slightly better in an exponential points scheme.

As with the data on sprint length, the way teams estimate the size and effort of work tends to correlate with overall performance (Figure 5.7).

Figure 5.7

Figure 5.7 The relationship between team performance and estimation schemes.

Source: © 2014 Rally Software Development Corp. All rights reserved. Used with permission.

It is not uncommon to find a lack of correlation between a team’s velocity (the average number of story points completed during a sprint) and its actual throughput. Nonetheless, there are many instances when the estimating process works well for individual teams. As mentioned previously, it’s more challenging to work with this metric on a portfolio level involving multiple teams. Calculating relative correlation is one way that Scrumban helps teams gain a better sense of whether the ceremonies they engage in, such as estimation events, are providing sufficient value to justify the investment of time and effort.
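
Checking this for your own team is straightforward: lay the per-sprint throughput and velocity series side by side. A short sketch with hypothetical sprint history:

```python
from statistics import mean

# Hypothetical sprint history: (stories_completed, points_completed)
sprints = [(6, 21), (4, 22), (7, 19), (5, 25), (6, 18)]

throughput = [stories for stories, _ in sprints]  # stories per sprint
velocity = [points for _, points in sprints]      # points per sprint

print(f"average throughput: {mean(throughput)} stories/sprint")
print(f"average velocity:   {mean(velocity)} points/sprint")
# If the two series do not rise and fall together (here the highest-velocity
# sprint, 25 points, delivered only 5 stories), velocity is a poor proxy
# for how much work actually ships.
```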

Contextual Responsibilities

As teams begin to employ Scrumban, members will assume a range of new “responsibilities.”

First and foremost, team members should be encouraged to challenge and question their team’s policies, fostering team interest in and ownership of how they work. Fortunately, this basic concept should not be foreign to people already used to a Scrum context. A great way to promote and support this mindset is through the establishment of “working agreements.”5

Not surprisingly, Scrum’s existing ceremonies and artifacts implicitly support the notion of working agreements. Scrumban’s framework goes one step further, calling for the team to make such agreements explicit and to visualize them on their boards as appropriate. In many respects, they are akin to establishing an internal service level agreement between team members.

Ultimately, we want to develop leaders who help all team members acquire the ability to see and understand concepts like waste, blockers, and bottlenecks. Good leaders also ensure teams don’t get stuck in a comfort zone, by prodding team members to always seek out ways to improve.

As previously mentioned, although teams will experience work improvements simply from employing Scrumban independently, an active management layer is essential to realizing benefits at scale. Systemic improvements are less likely to occur without a manager who maintains contact, helps resolve issues outside the team’s domain, and oversees the extent to which teams devote time and effort to pursuing discovered opportunities for improvement. Setting expectations for this need at the kickstart helps ensure a more sustainable implementation.
