mySAP: Determining a Suitable Stress-Test Business Process

This chapter will help you determine the most suitable SAP stress test for your organization, taking into consideration the nature and functioning of your particular business and what you need SAP to do.

Finally, we are in a position to give attention to one of the core matters surrounding testing and tuning: determining a suitable stress-test business process or other "input data" test mix. After all, it is the mix of activities, transactions, and processes executed under your guidance that ultimately simulates your financials' month-end close or helps you understand the load borne by your SAP customer-facing systems during the holidays or other seasonal peaks. And it's the test mix that brings together master data, transactional data, customer-specific data, and other input necessary to fuel a business process from beginning to end. You've certainly heard the phrase "garbage in, garbage out." It applies without question here, because poor test data will never allow you to achieve your testing and follow-on tuning goals.

Beyond SAP application-level data, though, input data can also consist of the scripts, batch files, configuration files, and so on required of lower level testing tools, like those associated with testing the performance of your disk subsystem, network infrastructure, and so on. As you know by now, sound testing of your SAP Technology Stack encompasses much more than strictly business process testing.

The goal of this chapter is therefore to walk you through the challenges surrounding data: how to select appropriate data, what to look for, and what to avoid. In this way, you'll be that much closer to conducting stress-test runs that not only "work" but also truly simulate the load planned for your production environment.

9.1 Overview—What's a Mix?

What exactly is a test mix? In true consulting fashion, the right answer is "it depends." The next couple of pages provide a high-level overview of a test mix, followed by more detailed discussions on this subject. For beginners, note that all test mixes must include a way to control timing, or the time it takes for a test run (or subtasks within a run) to actually execute. Sometimes this factor is controllable via the test mix itself or a test-tool configuration file, whereas other times it's left up to test-tool controller software or the team executing and monitoring the test runs.

Timing is not everything, though. Test mixes also typically allow the number of OS or other technology stack–based threads or processes to be controlled, another key input factor relevant to multiuser stress testing. In this way, the very load placed on a system may be controlled, making it possible to vary the workload rates of individual users or processes without actually changing the absolute number of users or processes. A simple sketch of such a control layer follows.
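To make this concrete, here is a minimal sketch, in Python and purely illustrative (no particular test tool's API is assumed), in which run duration, virtual-user count, and per-user pacing are all inputs to the test mix rather than properties of the scripts themselves:

    # Minimal load-driver control layer (hypothetical; not tied to any
    # specific test tool). Duration, concurrency, and pacing are test-mix
    # inputs, so load can be varied without rewriting the scripts.
    import threading
    import time

    TEST_MIX = {
        "duration_seconds": 300,    # total run time for this test run
        "virtual_users": 50,        # number of concurrent threads
        "think_time_seconds": 2.0,  # pacing between iterations per user
    }

    def business_script(user_id, stop_at, think_time):
        """Placeholder for one scripted transaction or business process."""
        while time.time() < stop_at:
            # ... execute one iteration of the scripted transaction here ...
            time.sleep(think_time)  # pacing controls each user's workload rate

    def run_test(mix):
        stop_at = time.time() + mix["duration_seconds"]
        threads = [threading.Thread(target=business_script,
                                    args=(i, stop_at, mix["think_time_seconds"]))
                   for i in range(mix["virtual_users"])]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

    run_test(TEST_MIX)

Note that halving think_time_seconds doubles each user's workload rate without touching virtual_users, exactly the kind of knob described above.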

Unfortunately, we must also keep in mind that the assorted tools and approaches we have examined thus far differ greatly in terms of their fundamental execution and goals, and therefore their specific data-input-mix requirements. On the flip side, test mixes tend to adhere to a set of general rules of thumb, too. But rather than trying to lay down a dictionary-style definition, let's instead look at test mixes by way of example, working our way up the technology stack, as follows:

  • For network infrastructure testing, a test mix consists of factors like the number of data transfers or other network operations performed, the size of those transfers/operations, the use of configuration files that define the two end points necessary for testing (i.e., server-to-server testing), and even the protocols that may be tested (although for SAP testing, there's usually little reason to test anything other than TCP/IP-based RFC, CPIC, and ALE network activity).

  • For disk subsystem testing, a test mix defines the number of reads and writes to be executed (either as a ratio or an absolute number), the types of operations to be executed (sequential versus direct/random, or inserts versus appends), data block sizes, and even the number of iterations (rather than leveraging timing criteria, a test mix might instead specify the number of I/O operations each thread, process, or test run will execute, such as 1,000). And the number of data files or partitions against which a test is executed is also often configurable, as is the size of each data file. For instance, I often execute tests against six different data files spread across six different disk partitions, each 5GB in size, thus simulating a small but realistic 30GB "database." (A sample configuration in this spirit appears just after this list.)

  • For server testing, a test mix might include the number of operations that specifically stress a particular subsystem or component of the server (e.g., the processor complex, RAM subsystem, system bus). Server testing often is intertwined with network infrastructure and disk subsystem testing, too, and therefore may leverage the same types of data input as previously discussed.

  • At a database level, a test mix must reflect operations understood and executable by the explicit DBMS being tested. Thus, SQL Server test mixes may differ from Oracle in terms of syntax, execution, and so forth. But, in all cases, a database-specific test will reflect a certain type of operation (read, write, join) executed against a certain database, which itself supports a host of database-specific tuning and configuration parameters.

  • A single R/3 or mySAP component's test mix will often reflect discrete transactions (or multiple transactions sequentially executed to form a business process) that are executed against only the system being tested. I call these "simple" or "single-component" tests, because they do not require external systems; CPIC-based communications, external program or event-driven factors, and interface issues therefore stay conveniently out of scope. (Note that CPIC, SAP's Common Programming Interface for Communications, allows for program-to-program communication.) The input necessary to drive these simple tests is therefore transaction-specific and relatively easy to identify: all input data that must be keyed into the SAPGUI (anything from date ranges to quantities to unique transaction-specific values like PO numbers, and more) and any master, transactional, or other client-specific data must be known, available, and plentiful. User accounts configured with the appropriate authorizations and other security considerations also act as input. The right mix essentially becomes a matter of balancing the quantity of high-quality input data against the amount of time a test run must execute to prove viable for performance tuning. (A sketch of staging such input data also follows this list.)

  • Complex R/3 or mySAP component test mixes, on the other hand, necessitate multiple components or third-party applications to support complex business process testing that spans more than a single system. Even so, the bulk of the input data may still be quite straightforward, possibly originating in a single core system. But like any complete business process, the output from one transaction typically becomes the input to a subsequent transaction. Cross-component business processes therefore require more detailed attention to input data than their simpler counterparts, because the originating system (along with other supporting data) must be identified as well.
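As promised, here is what a disk subsystem test mix might look like when captured as a simple configuration. This is an illustrative Python sketch only (real disk-testing tools define their own formats); the point is that every input factor named above becomes an explicit, repeatable parameter:

    # Hypothetical disk-subsystem test-mix definition (illustrative only).
    DISK_TEST_MIX = {
        "read_pct": 67,              # roughly a 2:1 read/write ratio
        "access_pattern": "random",  # "sequential" or "random"
        "block_size_kb": 8,          # aligned with a typical DB block size
        "ops_per_thread": 1000,      # iteration-based rather than timed
        "threads": 12,
        "data_files": [f"/vol{n}/testfile.dat" for n in range(1, 7)],
        "file_size_gb": 5,           # six 5GB files: a small 30GB "database"
    }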
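Likewise, here is a brief sketch of staging the keyed input that fuels a single-component script. Python again, and the CSV layout and field names are my own assumptions rather than any SAP format; VA01, however, is indeed SAP's sales order creation transaction:

    # Illustrative only: load and validate the rows a virtual user would key
    # into the SAPGUI for, say, sales order creation (transaction VA01).
    import csv

    REQUIRED = ["customer", "material", "plant", "storage_location",
                "quantity", "user_account"]

    def load_input_rows(path):
        with open(path, newline="") as f:
            rows = list(csv.DictReader(f))
        for i, row in enumerate(rows):
            missing = [k for k in REQUIRED if not row.get(k)]
            if missing:
                raise ValueError("row %d is missing %s" % (i, missing))
        # Only unique combinations "work" to create a sales order, so flag
        # duplicates before they fail mid-run.
        keys = [(r["customer"], r["material"], r["plant"],
                 r["storage_location"]) for r in rows]
        if len(set(keys)) != len(keys):
            raise ValueError("duplicate input combinations detected")
        return rows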

Many of these examples are discussed in more detail later in this chapter. Suffice it to say, though, that a single TU or stress-test run can quickly grow complex from an input data perspective if several technology stack layers or systems are involved, especially in light of the many stack-specific test tools that might be used.

9.1.1 Change-Driven Testing

It's important to remember one of the fundamental reasons behind testing—to quantify the delta in performance that a particular change or configuration creates. I refer to this generically as change-driven testing, and believe that most if not all performance-oriented testing falls into this big bucket. But it's impossible to quantify and characterize performance without a load behind the change. In other words, things don't tend to "break" until they're exercised. The strength and robustness of a solution, like the maximum load a weight-lifter can hoist, remains unproven until it is successfully borne. The "it" here is the workload: in essence, the manipulation or processing of input data. Thus, it only follows that a good performance test depends on a good mix of data, data that have been identified and characterized as appropriate for meeting your simulation and load-testing goals. Not surprisingly, stress tests executed without the benefit of adequate test-mix analysis, or workload characterization, often fall short of achieving their goals.

9.1.2 Characterizing Workloads

Given that the workload a test run is to process is central to the success of stress testing, characterizing or describing that workload is paramount as well. Workloads vary as much as the real world varies, unfortunately, so there's no easy answer. Key workload considerations as they pertain to the SAP application layer include the following:

  • The mix or ratio of online users to batch jobs, reports, and other similar processes, reflecting a particular business scenario.

  • The mix of functional areas, such that the resulting mix reflects a sought-after condition (e.g., month-end closing for the entire business) or state (e.g., an average daily load).

  • The pure number of online users, batch processes, report generators, and so on leveraged to create a representative load on an SAP system.

  • The predictability of a workload, so that apples-to-apples comparisons may be made against subsequent test runs—in other words, avoiding true randomization when it comes to input data is important. True randomization should instead be replaced with pseudorandom sequences that are at once "random" and repeatable (see the sketch following this list).

  • The quantity of data. Low quantities result in artificially high cache hit rates, because more of the data can be held in hardware- and software-based caches, exercising the disk subsystem less than the real workload otherwise would.

  • The quality of data. Only unique combinations of customers, materials, plants, and storage locations "work" to create a sales order, for example. Other combinations result in error messages in the best of cases, and more often in failed or simply incompletely executed business processes.
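Repeatable "randomness" is easy to achieve with a seeded generator. The following minimal sketch (Python, purely illustrative) draws the same pseudorandom sample of input records on every run, keeping back-to-back runs comparable:

    # Seeding the generator makes every run draw the same "random" sequence,
    # so comparisons across test runs stay apples-to-apples.
    import random

    def pick_inputs(rows, count, seed=42):
        rng = random.Random(seed)  # same seed yields the same selection
        return [rng.choice(rows) for _ in range(count)]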

The configuration of the system itself in relation to performance also represents "input" to some extent, though I prefer to keep this separate, under the guise of configuration or current-state documentation as discussed in the previous chapter.

But how do you determine what a representative workload or data mix looks like? In the past, I've spent most of my time speaking with various SAP team members to answer this question. Functional leads are your best bet, because they know the business as well as anyone else on the SAP technical team. But SAP power users representing each of the core functional areas are also valuable sources of workload information, as are management representatives of the various functional organizations found in most companies.

Outside of people resources/team members, you can also determine the "Top 40" online and batch processes through CCMS's ST03 or ST03N, both of which display detailed transaction data over defined periods of time. That is, you can drill down into the days preceding or following month-end to identify precisely the transactions that are weighing down the system most in terms of CPU, database, network, and other loads, along with relative response-time metrics. Particularly heavy hours, like 9 a.m. to 11 a.m., or 1 p.m. to 4 p.m., can be scrutinized closely as well. These "top transactions" can be sorted in any number of ways, too: by the total number executed, the peak hour executed, and even by greatest to least impact (highlighting the transactions that beat up the database or CPU or network the worst, for example). And, because CCMS since Basis release 4.6C allows you to globally view ST03 data across an entire SAP system, the once time-consuming job of reviewing individual application servers for your production BW system, for example, is no longer necessary.
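If you save such a transaction profile to a local file (standard SAPGUI list export), a few lines of code can then surface the heavy hitters offline. The file name and column names below are illustrative assumptions, not an SAP-defined format:

    # Hypothetical helper for sifting an exported ST03 transaction profile.
    import csv

    def top_transactions(path, metric="db_time_ms", n=40):
        with open(path, newline="") as f:
            rows = list(csv.DictReader(f))
        # Weight per-step time by dialog steps to rank by total impact
        # rather than by per-execution cost.
        rows.sort(key=lambda r: float(r[metric]) * int(r["steps"]),
                  reverse=True)
        return rows[:n]

    for r in top_transactions("st03_export.csv"):
        print(r["tcode"], r["steps"], r["db_time_ms"])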

Once you understand the particular transactions and business processes representative of a particular workload or regular event/condition, you must then consider how you will represent this workload in a stress test. To make the process of workload characterization as flexible and manageable as possible, I like to create "packages" of work. Within a package, the work is typically similar in nature—online transactions that focus on a particular business process or functional area, batch processes that execute against a particular set of data or for a similar period of time, and so on might make sense for you. Besides maintaining a certain functional consistency, I also chop up the packages into manageable user loads. For example, if I need to test 600 CRM users executing a standard suite of business activities, I'll not only create a set of business scripts to reflect those activities, but I'll also create perhaps six identical packages of 100 virtual users each, so as to easily control and manage the execution of a stress-test run. If the workload needs to be more granular and represent five core business activities, I might divide the packages instead into these five areas (where each package then reflects unique business process scripts), and work with the business or drill down into the CCMS to determine how many users need to be "behind" or associated with each package. Even then, if one of these granular and functionally focused packages represents many users, I would still be inclined to chop it up further as just mentioned and as shown in Figure 9-1. More details regarding the creation and divvying up of test packages are discussed later in this chapter.

Figure 9-1 Once your workload is characterized and sorted by tasks, functional areas, or business processes, it makes sense to further divide the workload into smaller and more manageable packages, each reflecting a representative user mix.
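The package approach is easy to automate, too. Here is a brief sketch (Python; the names and numbers mirror the CRM example above and are not any tool's syntax) that divides a characterized workload into per-process packages of no more than 100 virtual users each:

    # Illustrative sketch: divide a characterized workload into equal,
    # manageable packages of virtual users per business process.
    def build_packages(total_users, processes, package_size=100):
        per_process = total_users // len(processes)
        packages = []
        for proc in processes:
            remaining = per_process
            while remaining > 0:
                users = min(package_size, remaining)
                packages.append({"process": proc, "virtual_users": users})
                remaining -= users
        return packages

    # 600 CRM users across five core business activities yields ten
    # packages: one 100-user and one 20-user package per activity.
    for pkg in build_packages(600, ["activity_" + c for c in "abcde"]):
        print(pkg)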

9.1.3 Don't Forget to Baseline!

Beyond baselining and documenting the configuration to be stress tested, each test mix also needs to be characterized by way of a baseline. The baseline serves as documentation relevant to a particular state—the configuration of the system in relation to the workload that it can support—so that future changes to either the technology stack or workload are not only easily identified but also specifically quantified in terms of performance. In my experience, these initial performance baselines tend to multiply rather quickly. For instance, I'll often begin initial load testing by working through a number of specific configuration alternatives, testing the same test mix against each different configuration in the name of pretuning or to help start off a stress-test project on the best foot possible. Each alternative rates a documented baseline, but only the most promising end results act as a launch pad for further testing and pretuning iterations.

Once a particular configuration seems to work best, I then move into the next phase of baselining, called workload baselining. This phase is very much test mix–centric, as opposed to the first phase's configuration-centric approach. The idea during these workload characterization and baselining processes is to maintain a static system configuration (i.e., refrain from tuning SAP profiles, disk subsystems, etc.), so as to focus instead on monitoring the performance deltas achieved through executing different workloads. All of this is performed by way of single-unit testing, of course, to save time and energy. The goal is quite simple, too: to prove that a particular workload indeed seems to represent the load described by the business, technical teams, or CCMS data. The work of scripting a true multiuser load followed by real-world performance tuning comes later, then, after you've solidified your workloads and executed various stress-test scenarios as depicted in the next few chapters.

Baseline testing is useful from a number of different perspectives. For instance, it's useful even when it comes to testing a non-Production platform, like your Development or Test/QA systems deployed prior to building, configuring, and initially optimizing the Production platform. That is, you can still gain a fair bit of performance-related knowledge testing against a non-Production system, in terms of both single-unit and multiuser stress testing. And this knowledge can pay off simply by helping you create a faster development experience for your expensive ABAP and Java developers or a functional testing environment suited to your company's unique end-user base. This is especially true prior to Go-Live, when the production environment is still being built out but the need for initial single-unit or "contained" load testing is growing ever more critical. The same goes for any change-driven testing, such as that associated with pending functional upgrades, technical refreshes, and so on.
