12.4 MVP Planning

When a product is a new-market innovation, you can’t prioritize features reliably upfront because customers themselves often won’t know what they want until they see it. The lean startup approach,2 introduced earlier in this book, addresses this problem by running experiments on customers—short-circuiting “the ramp by killing things that don’t make sense fast and doubling down on the ones that do.”3

12.4.1 What Is an MVP?

A minimum viable product (MVP) is a low-cost, experimental version of a product or feature used to test hypotheses and determine whether the product is worth a full investment. According to Eric Ries, the inventor of lean startup, an MVP is “that version of the product that enables a full turn of the Build-Measure-Learn loop with a minimum of effort and the least amount of development.”4 An MVP is not (as often thought) the first version of the product released to the market. It’s a version meant for learning—a means to test hypotheses and to determine the minimum set of features to include in a market-ready product. The minimal releasable version of the product is referred to as the minimum marketable product (MMP).

12.4.2 MVP Case Study: Trint

You only really understand why MVPs are so crucial to the success of innovative product development when you see a real example of the process. That was the case as I followed the story of Trint, a company founded by Emmy-winning reporter, foreign and war correspondent (and good friend) Jeffrey Kofman. Like many late-stage entrepreneurs, Kofman set out to solve a problem he understood intimately because it had bothered him throughout much of his previous professional life: every time Kofman had to transcribe an interview by hitting PLAY, STOP, TRANSCRIBE, and REWIND, he couldn’t understand why he was still using a process that had remained virtually unchanged since the 1960s and 1970s. Why wasn’t artificial intelligence (AI) being used to automate the speech-to-text transcription? He knew the reason: journalists can’t risk inaccuracies. Since AI makes mistakes, journalists wouldn’t use an AI-based product unless there was a way to verify the content. The real problem, then, was how to leverage automated speech-to-text in order to get to 100 percent accuracy.

Kofman knew that if he could solve that problem, he would have a winning product. Furthermore, he knew that if his team could solve it for journalists—whom he knew to be unforgiving—they could solve it for anybody. He concluded, therefore, that the most important leap of faith hypothesis for the product was that the company could find a way for users to correct errors in place in order to deliver transcripts that could be verified and trusted. As Kofman saw it, his team needed to create a layer on top of AI (the automated speech-to-text component) so that the AI part would do the heavy lifting of transcription, allowing the user to focus on quicker tasks: search, verify, and correct. He believed that by using this approach, he could reduce the time to perform a task that would normally take hours to complete down to minutes or even seconds. From earlier chapters of this book, you’ll recognize Kofman’s steps as the beginning of the MVP process: the articulation of the problem, vision, and leap of faith hypotheses for the product.

To create the MVP, Kofman gathered a team of developers with experience in audio-to-text alignment using manually entered text. He challenged them to hack together an MVP version that would automatically transcribe speech to text and allow a user to edit it.

The company’s first MVP was built in just three months. Kofman decided to use some of his limited seed funding to invest in user lab testing. He brought in a group of journalists for the testing day. Interestingly (as is often the case), the first MVP was “wrong.” While the journalists liked the concept, they struggled to use the product, finding it annoying to switch back and forth between editing and playback modes. (The original design used the space bar as a toggle between modes and as the text space character during editing, confusing users.) As Kofman told me, “Good innovative products should solve workflow problems; this was creating new ones.” And so, using feedback from the MVP, he asked the developers to build a new user experience with a better workflow.

MVP isn’t just about one test; it’s a process. Fifteen months into the project, in early 2016, the company developed a more refined version of the MVP. Kofman was ready to prove his hypothesis that there was a strong market for the product. At this point, the product provided much of the core functionality needed by users, such as the ability to search for text to locate key portions of an interview. However, it still lacked key components required to make it fully ready for the market. For example, there were no mechanisms for payments or pricing.

Through his extensive network of journalistic colleagues, Kofman let it be known that they would be opening up the product for free usage during one week of beta testing. When the testing began, things proceeded normally until an influential journalist at National Public Radio sent out a highly enthusiastic tweet, causing usage to soar. At ten thousand users, the system crashed. It took the company two days to get back online, but the test proved beyond a doubt that there was a market for the product.

Today, Kofman views that one day of MVP lab testing as perhaps the most important action taken by the company in its early days because it caused developers to change direction before spending a lot of time and money on a failed solution. The lesson, as Kofman tells it, is this: “You have to test your ideas out on real people”—the people who will actually use your product.

In previous chapters, we examined how to identify the leap of faith hypotheses that must be tested and validated for the product to be viable. Now, we focus on the next step: planning the MVPs that will test those hypotheses.

12.4.3 Venues for MVP Experiments

Since an MVP is only a test version, one of the first things to consider is where to run the test and who the MVP’s testers will be. Let’s explore some options.

Testing in a Lab

A user testing lab may be internal or independently operated by a third party. Testing labs provide the safest venue for testing, making them appropriate for testing in highly regulated mainstream business sectors, such as banking or insurance, where there is minimal tolerance for errors. Because the lab setting provides an opportunity to gain deep insight into users’ experience of the product, it’s also an ideal venue for MVP testing at the beginning of innovative product development when it’s critical to understand customer motivations and the ways they use the product.

The testers should be real users. However, in cases where the requirements are stable, proxies may be used (e.g., product managers with a strong familiarity with the market). Include testers familiar with regulations governing the product, such as legal and compliance professionals, to identify potential regulatory issues.

Testing MVPs Directly in the Market

The most reliable feedback comes from MVP testing in the marketplace with a targeted group of real customers. Consider this option for new-market disruptions, where first adopters are often willing to overlook missing features for the sake of novelty. This option is also advised for low-end disruptions, where customers are willing to accept reduced quality in return for a lower price or greater convenience.

Dark Launch

Another way to limit negative impacts during MVP feature testing is to dark-launch it—to stealthily make it available to a small group of selected users before broadening its release. If the feature is not well received initially, it can be pulled back before it impacts the product’s reputation; if customers like it, it is developed fully, incorporated in the product, and supported.

Beta Testing

A beta version is an “almost-ready-for-prime-time” version—one that is mostly complete but may still be missing features planned for the market-ready version. Beta testing is real-world testing of a beta version by a wide range of customers performing real tasks. Its purpose is to uncover bugs and problems with usability, scalability, and performance before wide release.

Feedback and analytics from beta testing are used as inputs to fix remaining glitches and address user complaints before releasing the product or change to the market. Split testing may also be performed at this time—whereby one cohort of users is exposed to the beta version while a control group is not.
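A split test needs little more than a stable way to divide users into cohorts. The sketch below, with an illustrative (not prescribed) hashing scheme, assigns each user deterministically so the same person always sees the same version across sessions:

```python
import hashlib

def assign_cohort(user_id: str, beta_fraction: float = 0.5) -> str:
    """Deterministically assign a user to the beta or control cohort.

    Hashing the user ID (rather than randomizing on each visit) keeps
    each user in the same cohort for the duration of the test.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map the hash to [0, 1]
    return "beta" if bucket < beta_fraction else "control"
```

Because assignment depends only on the user ID, the beta and control groups stay cleanly separated while analytics are collected.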

Beta testing is not just for MVPs; it should be a final testing step after internal alpha testing for all new features and major changes before they are widely released.

12.4.4 MVP Types

When planning an MVP, the objective is to hack together a version of the product or feature that delivers the desired learning goals as quickly and inexpensively as possible. The following are strategies for achieving that. One MVP might incorporate any number of these strategies.

  • Differentiator MVP

  • Smoke-and-Mirrors MVP

  • Walking Skeleton

  • Value Stream Skeleton

  • Concierge MVP

  • Operational MVP

  • Preorders MVP

These MVPs are described in the following sections.

Differentiator MVP

At the start of new product development, the most common strategy is to develop a low-cost version that focuses on the product’s differentiators. This was the approach we saw taken earlier by Trint. Using existing components, the company was able to piece together an MVP demonstrating the differentiating features of its product (speech-to-text auto-transcription plus editing) and validating its value in just three months.

Another example is Google Docs, which began as Writely. Writely was an experiment by Sam Schillace to see what kind of editor could be created by combining AJAX’s (JavaScript in the browser) content-editable functionality with word-processing technology.5 Early versions focused on the product’s key differentiators—its speed, convenience, and collaborative capabilities—while leaving out many other word-processing features, such as rich formatting and pagination. The hypothesis was that users would be excited enough about the differentiators to ignore the lack of richness in other areas. Interestingly, real-time collaboration on documents—which became a differentiating feature—was not seen as a primary one at the time; it was included because it seemed like the most natural way to solve the problem of documents worked on by multiple people.

The first version of the original product was pulled together quickly, using the browser for most of the editing capabilities and JavaScript to merge the local user’s changes with those of other users. The client-side JavaScript amounted to only about ten pages of code.6 Over time, the company added more word-processing features when it became apparent that they were essential to users and in order to open up new markets. Just one year after Writely was introduced, it was acquired by Google. Within the first month of its adoption, about 90 percent of Google was using it.

Smoke-and-Mirrors MVP (or Swivel Chair)

A Smoke-and-Mirrors MVP approach provides the user with an experience that is a close facsimile of the real thing but is, in fact, an illusion—like the one created by the magician pulling strings behind the curtain in the movie The Wizard of Oz.
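A minimal sketch of the pattern, with hypothetical names throughout: the customer-facing handler looks automated, but all it does is queue each request for a human agent to re-key into an existing internal system.

```python
import queue
from dataclasses import dataclass, asdict

# Requests land here; a human support agent watches this queue.
pending_requests = queue.Queue()

@dataclass
class PlanChangeRequest:
    customer_id: str
    channels: list
    internet_speed_mbps: int

def submit_plan_change(request: PlanChangeRequest) -> str:
    """Customer-facing handler: appears fully automated, but simply queues
    the request for an agent to swivel-chair into the real back-office system."""
    pending_requests.put(asdict(request))
    return f"Thanks! Your new plan is being set up, {request.customer_id}."

def agent_worklist() -> list:
    """What the human agent behind the curtain actually processes."""
    items = []
    while not pending_requests.empty():
        items.append(pending_requests.get())
    return items
```

The customer sees a confirmation message; whether fulfillment is automated or manual is invisible from the outside, which is exactly what makes the hypothesis testable before the capability is built.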

One of my clients, a cable company, used this approach to provide an MVP frontend for customers to configure their own plans. The site operated in a sandbox, disconnected from operational systems. Behind the scenes, an internal support agent viewed the inputs and swivel-chaired to an existing internal system to process the request. The customer was unaware of the subterfuge. The MVP allowed the company to test the hypothesis that customers would want to customize their own plans before investing in developing the capability.

Walking Skeleton

A Walking Skeleton, or spanning application, validates technical (architectural) hypotheses by implementing a low-cost end-to-end scenario—a thin vertical slice that cuts through the architectural layers of the proposed solution. If the Walking Skeleton is successful, the business will invest in building the real product according to the proposed solution. If it is unsuccessful, the technical team goes back to the drawing board and pivots to a new technical hypothesis.
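As a sketch, a Walking Skeleton for a messaging product might be as thin as the following slice, which touches every layer (ingestion, persistence, presentation) without fleshing any of them out. Here, sqlite3 stands in for whatever database the proposed architecture actually calls for:

```python
import sqlite3

# Thin vertical slice: ingest a message -> persist it -> retrieve and list it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (source TEXT, body TEXT)")

def ingest(source: str, body: str) -> None:
    """Ingestion layer: accept one message from a social-network feed."""
    conn.execute("INSERT INTO messages VALUES (?, ?)", (source, body))
    conn.commit()

def list_messages() -> list:
    """Presentation layer: return all stored messages as a simple list."""
    rows = conn.execute("SELECT source, body FROM messages ORDER BY rowid")
    return [f"[{source}] {body}" for source, body in rows]
```

The point is not the features but the plumbing: if this slice works end to end, the proposed architecture has survived its first test.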

For example, in the Customer Engagement One (CEO) case study, the organization plans an end-to-end scenario for ingesting text messages from a social-network application, saving the messages using the proposed database solution, retrieving them, and viewing them as a list. Another example is Trint, whose first MVP incorporated the end-to-end scenario from speech to text to editing in order to validate the architectural design for the product.

Value Stream Skeleton

A Value Stream Skeleton implements a thin scenario that spans an operational value stream—an end-to-end workflow that ends with value delivery. It’s similar to a technical Walking Skeleton except that it validates market instead of technical hypotheses. It covers an end-to-end business flow but does not necessarily use the proposed architectural solution.

The intuitive sequence for delivering features is according to the order in which they’re used. For example, you might begin by delivering a feature to add new products to the product line for an online store and follow with features to receive inventory, place orders, and fulfill orders. Not only does this sequence minimize dependency issues, but it also enables users to perform valuable work while waiting for the rest of the system to be delivered. I usually took this approach in my early programming days. The problem with it, though, is that it results in a long lag until an end customer receives value (e.g., a fulfilled order). In a business environment where there is a strong advantage in being fast to market, that kind of lag is unacceptable. Another problem is that it can delay the time until a company can begin receiving revenue from customers.

A Value Stream Skeleton avoids these problems by delivering quick wins that implement thin versions of the end-to-end value stream, often with reduced functionality.
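For example, a first slice of a hypothetical online store might be little more than the sketch below. The products, prices, and region are invented for illustration; the essential property is that a real customer can already place an order and receive value, even though the catalog is hardcoded and delivery is limited to one region:

```python
# Hardcoded catalog: no database yet, which keeps development costs low.
CATALOG = {
    "mug": 12.50,
    "t-shirt": 20.00,
}
# Only one region is supported, simplifying business rules and delivery.
SUPPORTED_REGION = "Ontario"

orders = []

def place_order(product: str, region: str) -> str:
    """End-to-end value delivery: request in, confirmed order out."""
    if product not in CATALOG:
        return "Sorry, we don't carry that yet."
    if region != SUPPORTED_REGION:
        return "Sorry, we only deliver to Ontario for now."
    orders.append((product, CATALOG[product]))
    return f"Order confirmed: {product} at ${CATALOG[product]:.2f}"
```

As the business grows, the hardcoded pieces are replaced one at a time (catalog into a database, more regions, real payment processing) without ever breaking the end-to-end flow.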

The first version of a Value Stream Skeleton focuses on the value stream’s endpoints—the entry point where the customer makes a request and the endpoint where the customer receives value. Workarounds are often used for the missing steps. For example, the first MVP for an online store allows a customer to purchase a few select products. The product descriptions and prices are hardcoded into the interface instead of being pulled from a database. This lowers development costs. The products are offered only in a single geographic region—simplifying the business rules and delivery mechanisms that the MVP implements. Despite the thinness of the MVP, it provides learning value to the business and real value to an end customer, who can already order and receive the products with this early version. As the business grows, the MVP evolves to handle more products and a broader geographical region.

Concierge MVP

The Concierge MVP7 is based on the idea that it’s better to build for the few than the many. Early versions are aimed at a small submarket that is very enthusiastic about the product, and the learning gained from the experience is used to scale the product. One example of a Concierge MVP is Food on the Table,8 an Austin, Texas, company that began with a customer base of one parent. The company met with the parent once a week in a café to learn the parent’s needs and take orders. The orders were filled manually. The process was repeated for a few other customers until the company learned enough to build the product.

As the example illustrates, you begin the Concierge MVP approach by selecting a single, real customer. The first customer can be found through market research, using analytics to determine the desired customer profile and inviting a customer who fits the profile to act as an MVP tester. Alternatively, you can select the first customer from among individuals who have previously indicated an interest in the product. This customer is given the “concierge treatment”—served by a high-ranking executive (e.g., vice president of product development) who works very closely with the customer, adding and adjusting features as more is learned.

At this stage, internal processes are often mostly manual. A company might spend a few weeks working with the first customer in this way, learning what that person does and does not want, and then select the next customer. The process is repeated until the necessary learning has been obtained and manual operations are no longer viable—at which point the product is built and deployed.

Operational MVP

An MVP isn’t always created to validate software hypotheses and features; it can also be used to test operational hypotheses and changes. In a real-life example (which I’ll keep anonymous to protect the company), a company created an MVP to test the impact of a price hike on sales. The MVP displayed the higher price to a select group of customers, but behind the scenes, the customers were still being charged the regular, lower price. Once the learning objective was achieved, customers received an email notifying them that they had been part of a test group and that no extra charges were actually applied.

Preorders MVP

The most reliable and cost-effective way to test a value hypothesis that customers will pay for an innovative product is to offer a means to order it before it’s actually ready. The MVP can be something as simple as a promotional video or demonstration prototype. It may employ a stripped-down ordering process, such as order by email attachment, order by phone, or an online ordering site with hardcoded options. An MVP of this type might not require any stories—or it might need a few small stories (e.g., to set up a simple frontend for placing orders).
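A stripped-down ordering frontend can be sketched in a few lines. Everything here (the field names, the validation, the notion of a conversion rate against landing-page visitors) is hypothetical; the point is how little is needed to start measuring willingness to pay:

```python
# Minimal preorder capture: record who wants the product, so the value
# hypothesis can be judged from conversions before anything real is built.
preorders = []

def record_preorder(email: str, plan: str = "standard") -> bool:
    """Capture a preorder; reject obviously invalid email addresses."""
    if "@" not in email:
        return False
    preorders.append({"email": email, "plan": plan})
    return True

def conversion_rate(visitors: int) -> float:
    """Share of landing-page visitors who placed a preorder."""
    return len(preorders) / visitors if visitors else 0.0
```

Even a crude signal like this conversion rate is enough to decide whether the value hypothesis deserves real investment.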

My own company, Noble Inc., used this approach when we were considering developing a product to provide a 360-degree evaluation of the business analysis practice in an organization. For the MVP, we developed a facsimile of the product and demonstrated it to our clients in an attempt to generate presales. What we learned was that there wasn’t enough interest to justify building the real thing. Despite the failure of the test, I consider it money well spent. Imagine if we had learned it only after a large investment!

Dropbox’s version of this MVP strategy played out much better. Dropbox posted a video of its product,9 illustrating its main features. The video received enthusiastic and voluminous feedback from potential customers—making the case for the product and generating important suggestions about features and potential issues that were incorporated into the first marketed version.

12.4.5 MVP’s Iterative Process

You don’t just create an MVP and test it once. The MVP process is iterative. Its steps are as follows:

  1. Establish an MVP to test hypotheses.

    Specify an MVP to test one or more leap of faith hypotheses (e.g., using any of the MVP types discussed in the prior section).

  2. Tune the engine.

    Make incremental adjustments to fine-tune the product on the basis of feedback from customers as they use the product.

  3. Decision point: persevere or pivot.

    After tuning for a while, decide whether to persevere with the business model or pivot to a different hypothesis.

12.4.6 The Pivot

A pivot is a switch to a different hypothesis based on a failure of the original premise. A company may decide to pivot near the start of a product’s development due to the MVP process described previously. Alternatively, the pivot may occur at any time in a product’s life if it becomes apparent there is no market for the product, and the product should be reoriented toward a new market or usage.10 An example of a pivot to an established product is Ryanair, once Europe’s largest airline (based on passenger numbers).11 Back in 1987, when the company realized it was failing financially, it pivoted to a low-end, disruptive revenue model based on the hypothesis that customers would be willing to pay for meals and other perks in return for cheap fares. The hypothesis was borne out when customers flocked to the airline.12 More recently, in response to Brexit, the company has again pivoted—this time away from the United Kingdom to a business model based on growth outside of it.13

Constructive Failures

A pivot represents a failed premise, but, as the Ryanair example shows, the failure can often be constructive. In fact, many of today’s successful companies are a result of such failures. For example, Flickr resulted from the failure of a previous offering—Game Neverending.14 When the original product failed, the company pivoted by turning it into a successful photo-sharing app, leveraging the lessons it had learned about the value of community and the social features it had developed for the game (such as tagging and sharing). Groupon is another example. Conceived initially as an idealistic platform for social change, it then pivoted to become a platform for those seeking a bargain.

12.4.7 Incrementally Scaling the MVP

An effective way to develop a product is to start with a manual MVP and automate and scale it incrementally as the product grows. This approach was used by Zappos, an online shoe store.

Here’s how the process played out, as described by the company’s founder: “My Dad told me … I think the one you should focus on is the shoe thing. … So, I said okay … went to a couple of stores, took some pictures of the shoes, made a website, put them up and told the shoe store, if I sell anything, I’ll come here and pay full price. They said okay, knock yourself out. So, I did that, made a couple of sales.”15 In 1999, the company signed on a dozen brands—all men’s brown comfort shoes. As it added more respected brands, such as Doc Martens, the company and its market grew and, in tandem, Zappos automated and scaled its business systems and processes.

12.4.8 Using MVPs to Establish the MMP

Using the MVP process, a company can quickly and inexpensively validate through experimentation which features will make the most difference. These features are referred to as the minimal marketable features (MMFs). An MMF is the smallest version of a feature (the least functionality) that would be viewed as valuable by customers if released to the market. MMFs may deliver value in various ways, such as through competitive differentiation, revenue generation, or cost savings. Collectively, the MMFs define the minimum marketable product (MMP)—the “product with the smallest feature set that still addresses the user needs and creates the right user experience.”16
