The Future of Technology, Part 1
In my opinion, of understanding technology's past, managing it in the present, and anticipating its future, the last is by far the easiest task. The future of a technology can generally be mapped by following its logical progression from known precedents. When we observe new and pervasive technologies, almost all of them follow the simple S-curve pattern: substitution increases slowly at first, faster as acceptance grows, and more slowly again as saturation is reached.
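The S-curve itself has a standard mathematical form. As a minimal sketch, a logistic function captures all three phases of substitution; the midpoint and rate parameters below are purely hypothetical:

```python
# Sketch of the classic S-curve: the logistic function models market
# substitution f(t) rising slowly, then quickly, then saturating near 100%.
import math

def substitution(t, midpoint=10.0, rate=0.5):
    """Fraction of the market captured at time t (illustrative parameters)."""
    return 1.0 / (1.0 + math.exp(-rate * (t - midpoint)))

# Early, mid, and late adoption under these illustrative parameters:
early = substitution(0)    # slow start, well under 1%
mid = substitution(10)     # exactly half the market at the midpoint
late = substitution(20)    # saturation, above 99%
```

Plotting `substitution` over time reproduces the familiar shape: the curve's slope is greatest at the midpoint and flattens at both ends.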
But plotting technology is the easy part. Deciding which technologies to chart is more difficult. Yet that's what we're all asked to do, in one way or another, almost every day.
Of course, charting and predicting technology in the 21st century is drastically different from doing so in the 20th. The 19th and 20th centuries were marked by technological expansion in particular, well-defined areas. Vacuum tubes and transistors were the products of electronics; propellers and jet engines, of aeronautics. These and many others are tangible, insular examples of technological progress over time. But the Digital Age will be judged quite differently.
The pervasive nature of digital technology, and the role it plays in conjunction with other technologies, makes it more difficult to insulate areas of progress. Almost every branch of science, from biogenetics to plate tectonics to quantum physics, depends on the power of digital computers to process, record, and display collated data. Manufacturing and engineering have also become increasingly dependent on digital automation. The social and political sciences now rely heavily on digital statistical analysis and demographic data. Even the entertainment industry uses computer-generated characters. Knowing this, it becomes evident that even a small incremental improvement in digital technology can lead to a wide variety of advancements on numerous economic fronts.
Viewing the classic corporation as a whole, we can see that the modern manager faces a twofold problem. On one hand, corporate leaders must first understand what their internal dependence on digital computing systems means to their core businesses. On the other, they must understand how digital computing affects their entire interdependent value-chain network. As if it weren't hard enough to keep track of in-house expenditures on PCs alone, now you have to keep track of everyone you do business with. This is more than a little daunting. Inevitably, even the smallest business must be able to respond to, or at least understand, technology. Larger corporations, with their greater resources, must do more. Large industries and the corporations that make them up can no longer afford to wait for external innovations. Even the largest businesses must be more proactive in determining what type of technology they need to create, utilize, or disseminate.
Many nontechnology companies assume that they don't need to do R&D in technology. Yet insurance companies, banks, and retail chains all invest in technology research and development (R&D) to some extent. Technology may not have a formal category in the general ledger or a full-time staff, but every company, small or large, has some investment in computers, networks, and software, and therefore a commitment to get those technologies to work. Every time a company upgrades Microsoft Word, for example, it's actually reaching into the R&D budget, whether that budget is real or imagined. Usually, the investment pays off in time and effort saved by individuals on basic tasks, or in ensured workability.
But that's just the beginning of the investment. There will be more phone calls to the help desk when the company upgrades, perhaps significantly more. Not only is the corporate help desk the first line of defense in addressing new and unknown issues, but the help desk staff is expected to find answers. This is a formidable task, considering that few people with a Ph.D. in technology are available to answer calls and solve problems. My point is that once a company or institution decides to adopt technology, in whatever form, it's also agreeing either to make the technology conform to its needs or to have its needs conform to the technology. New technology results in one resolution or the other, because no technology, unless created for some express purpose, is going to meet 100% of an organization's needs right out of the box! There always has to be some degree of customization.
When technology is developed in-house, whether for internal or external use, the R&D costs are straightforward. The ambiguity in a firm's R&D expense begins when it purchases technology. The cost of integrating in-house technology is usually easy to assess, since it can be calculated quickly from the original budget and project plan. The costs are usually fixed in labor hours (time and materials) for analysis, design, coding, testing, and integration, with fixed costs for hardware, installation, and training. The breakdown is much the same for "purchased" technology. Very rarely, even in technology organizations, are the specific costs of design and coding taken into consideration, other than in man-hours. Most firms assume that since they don't have to design or code the application, these costs don't factor into total costs. That may be true, but design and coding costs are simply replaced with the internal costs of integration. It's widely assumed that as soon as a new technology is put into place, it starts paying dividends right away. Hard experience has taught almost all of us that this is a fallacy.
Change always means costs in one form or another. Perhaps this is one reason most people try to avoid change. In an economic sense, cost is closely tied to price, and price is an expression of value. Value is the meeting point between the supplier and the consumer: without these two parties agreeing on value, there is no interaction, and without interaction, there is no market. But although price and cost are often treated as synonymous, they are different. Price is a definite, agreed-upon value expressed in money, livestock, or some other goods or services that have merit to human beings.
Cost can be the same as price, but it can also include values not associated with material goods. Time has a definite value, but it can rarely carry a single defined price for everybody. The value of time is usually measured by the value of a service. But how do we value that service? How do we value our own time? Economics calls the cost of spending time, effort, and energy on one pursuit rather than another an opportunity cost. For instance, after graduating from high school, you can go to work for $25,000 per year or go to college and probably pay $25,000 per year. Initially, it appears that college puts you $25,000 in the hole. But without a college education, your "opportunities" for increasing your income over time are dramatically reduced. Going to college, on the other hand, increases opportunities (at least in theory). You may earn $25,000 a year straight out of high school, and let's say that 20 years later you are earning $50,000. However, the average starting salary for college graduates with software-related degrees is over $40,000. (Source: Remarks by Harris N. Miller, President, Information Technology Association of America, April 21, 1998, before the House Subcommittee on Immigration, http://www.house.gov/judiciary/6095.htm.) Theoretically, the average college graduate should take far less than 20 years to close that $10,000 difference. (Ideally, two individuals of equal talent and skill should eventually reach the same income, but that is an ideal.) Opportunity costs simply measure the tradeoff between what you have and what you could have had.
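The college example can be reduced to back-of-the-envelope arithmetic. This sketch uses only the round numbers above and deliberately ignores raises, inflation, and financial aid, so the break-even figure is purely illustrative:

```python
# Opportunity cost of a four-year degree, using the essay's round numbers.
# All figures are illustrative; salaries are assumed flat for simplicity.
years_in_college = 4
foregone_wages = 25_000 * years_in_college   # salary not earned while studying
tuition = 25_000 * years_in_college          # price actually paid
opportunity_cost = foregone_wages + tuition  # total tradeoff: $200,000

# Annual advantage once working: graduate start vs. high-school start.
salary_gap = 40_000 - 25_000                 # $15,000 per year

payback_years = opportunity_cost / salary_gap  # roughly 13 years
```

Under these flat-salary assumptions, the $200,000 tradeoff is recovered in about 13 years; faster salary growth for graduates, as the essay suggests, would shorten the payback considerably.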
However, the concept of measuring opportunity costs in a business rarely arises, except at the highest levels of management or under the direction of an actuary. Because the interpretation of these costs has always been somewhat subjective, either they are given scant notice, and thereby absorbed into the final "price tag," or they are simply ignored altogether. This can lead to enormous problems when whole businesses try to reengineer or outsource.
Projected budgets take into account only the price of actual goods and services over a given period. The future cost of doing business in a new way, call it System B, can always look more attractive than doing business the same old way, System A. What's difficult to measure is the true cost of migrating from System A to System B. Sooner or later, the reengineered company must retrain its staff to operate in the new way. Similarly, companies that outsource can't leave all technology development or deployment in the hands of strangers. Technology by itself can rarely solve large problems; technology in the hands of knowledgeable, trained people can solve almost any problem. This goes for technology applied within an organization as well as technology created for consumer use.
In light of the history of technology, we can theorize that options analysis (similar to the kind used for pricing financial futures) is the most reliable way of predicting the possible impact of new technologies. Options analysis is not a bet on a single opportunity that may or may not present itself; rather, it positions you for any number of opportunities that may or may not arise. The law of averages dictates that even if most of these opportunities don't pan out, eventually one or several must. Technology is just as conducive to hedging as a commodity or a stock price.
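The "law of averages" point can be made concrete with elementary probability. Assuming, purely for illustration, that each technology option is independent and has the same modest chance of paying off:

```python
# Why a portfolio of small technology options beats a single bet:
# the probability that at least one of n independent options succeeds.
def prob_at_least_one(p_success, n_options):
    """P(>= 1 success) among n independent options, each with prob p_success."""
    return 1.0 - (1.0 - p_success) ** n_options

single = prob_at_least_one(0.2, 1)       # one bet: a 1-in-5 chance
portfolio = prob_at_least_one(0.2, 10)   # ten bets: nearly 9 in 10
```

With ten independent options at a 20% success rate each, the chance that at least one pays off is close to 90%, which is the hedging logic in miniature: most options expire worthless, but holding many of them makes some payoff very likely.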
But the real question about technology is more than a matter of economic potential. Technology is powerful because it actively, genuinely improves our lives, at least in theory and occasionally in practice. To really take advantage of these improvements, we simply have to be able to predict "potentialities." Thomas Edison saw the need for motion pictures but never saw the need for motion picture projectors. This is what makes technology so interesting, both as an investment and as a mental exercise: this ability to branch into new and different technologies.