Architecting the Migration

The first phase in the SunTone migration methodology involves architecting the solution. At this stage, you assess the existing environment and design a first-cut architecture.

Assessing the Current Environment

The next step in the migration is the assessment of the existing application and its associated environment. Where appropriate, the assessment produces a risk list that identifies any areas of the project that might require a proof of concept to ensure that the project can be completed. It also produces a work breakdown structure that details the effort required to migrate the application and its environment. This work breakdown structure is then used to create a plan and to schedule activities, overlapping independent subtasks where appropriate.

For custom-written applications, provide the migration team with a snapshot of the application source and associated infrastructure to serve as a baseline for the migration activity. When possible, you should also acquire a build log for the application. This log will provide the following information:

  • Tools used

  • Options provided to these tools

  • Source that is compiled

  • Libraries that are linked

  • Order in which symbols are resolved

Although development documentation is welcome, a simple build log can serve as a guide to the "facts on the ground." It will show how the application is actually built.
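Even a plain transcript of `make` output can be mined for this information with standard utilities. The sketch below fabricates a three-line log for illustration (the file name `build.log` and its contents are assumptions, not part of the case study) and extracts the tools, options, and libraries it records:

```shell
# Mine a build log for the tools, options, and libraries it records.
# The file name build.log and its three lines are fabricated samples.
cat > build.log <<'EOF'
cc -O2 -DTRU64 -c inventory.c -o inventory.o
cc -O2 -DTRU64 -c report.c -o report.o
cc -o invtry inventory.o report.o -lm -lpthread
EOF

# Which tools are invoked, and how often?
awk '{print $1}' build.log | sort | uniq -c

# Which options are passed to them?
grep -o '\-[A-Za-z][A-Za-z0-9_]*' build.log | sort | uniq -c

# Which libraries are linked?
grep -o '\-l[A-Za-z0-9]*' build.log | sort -u
```

On a real log, the option tally is the part to study closely: every flag that appears must be mapped to a Solaris equivalent or flagged for investigation.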

In the following sections, we explore the assessment process.

Assessing the Application Infrastructure

Scripts provide an easy way for IT staff or administrators to create tools that administer an application, analyze or modify data, and provide functional support for an application. Scripts can leverage utilities that exist elsewhere in the operating environment to perform various administrative tasks. Examining the scripts therefore identifies which utilities are used, as well as the options that are specified to them.

When the application is migrated, the associated scripts must be migrated to the new environment as well. Although the script tool (for example, ksh, bash, sh, csh, Perl, or Python) might support the same syntax in the new environment, the location of the programs or files used by a script might differ, and the options of the programs called by the script might also require modification.

Ensure that a version of the script tool is available in the new environment.

Analyze Scripts

The Perl utility is becoming popular as a scripting tool because of its power and flexibility. However, the venerable shell is still the script tool of choice for most developers, primarily because of its availability across a variety of platforms and environments.

When assessing shell scripts, check each command for the following conditions:

  • Command is unavailable on the Solaris OS.

  • Command is in a different location and the location is not in the user's path.

  • Command uses a flag that does not exist on the Solaris OS.

  • Command uses a flag that has different functionality on the Solaris OS.

  • Output of a command is different and is redirected.

This check can be done manually or through the use of the scriptran tool.
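The manual approach can be approximated with standard utilities. The sketch below fabricates a small administrative script (its name and contents are assumptions for illustration), tallies the commands it calls, and flags any that do not resolve on the local system:

```shell
# Tally the commands called by a set of shell scripts, then flag any
# that do not resolve on the local system. The sample script and its
# contents are fabricated for illustration.
mkdir -p scripts
cat > scripts/nightly.sh <<'EOF'
#!/bin/sh
df -k
vmstat 5 2
printenv PATH
EOF

# Crude tally: first token of each non-comment line.
awk '!/^#/ && NF {print $1}' scripts/*.sh | sort | uniq -c

# Anything not resolvable here needs a closer look on the target.
for cmd in $(awk '!/^#/ && NF {print $1}' scripts/*.sh | sort -u); do
  command -v "$cmd" >/dev/null 2>&1 || echo "MISSING: $cmd"
done
```

A first-token tally misses commands buried later in pipelines, in backquotes, or behind variables, which is exactly why a purpose-built tool is preferable on a large script base.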

The following sample presents the analysis of the shell scripting used in the Tru64 example, listing each command of interest and the number of times it is called.

alias      |   27
ar         |  365
cc         |  86
colrm      |  1
df         |  14
du         |  1
e          |  2
ed         |  1
egrep      |  68
expr       |  11
fold       |  1 
get        |  1 
iostat     |  1 
ipcs       |  45 
ld         |  13 
lex        |  4 
ln         |  177 
lpr        |  2 
make       |  1 
mcopy      |  1 
more       |  12 
mt         |  3 
netstat    |  5 
printenv   |  5 
sleep      |  94 
stty       |  1 
style      |  219 
tail       |  61 
tset       |  1 
vmstat     |  26 
w          |  5 
wait       |  14 
whoami     |  2 
xconsole   |  3
xhost      |  4 
xlsclients |  1 
xset       |  14 
xsetroot   |  4 
xterm      |  1 
yacc       |  4 
Total: 40  |  1301

Analyze Build Tools

When working with a custom application, you also have to migrate the tools used to build the application executable. These usually include a compiler, a source code management system, and the build environment used to create the executable. Additionally, any third-party products that were used to build the application must be migrated.

Obtaining a build log created when the application was last built is the best way to ensure that the build process and the tools involved in that process are identified. Be certain that you understand the semantics of the options that were specified when the application was built. Although tools in the new environment will most likely support the required functionality, different options might have to be specified to invoke the desired behavior. For example, static linking, position-independent code, extended symbol table information, and the like might require the use of new and different options.
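One way to keep that option audit honest is to record each mapping decision as it is made. The sketch below keeps the table as a shell function; the Tru64 options shown (-non_shared, -call_shared, -g3, -om) and the Sun Studio equivalents (-Bstatic, -Bdynamic, -g) are illustrative and should be verified against the documentation for the compiler versions actually in use:

```shell
# A per-flag mapping table kept as a shell function, so build scripts
# can be audited mechanically. All mappings are illustrative; confirm
# each against the documentation for your compiler versions.
map_flag() {
  case "$1" in
    -non_shared)  printf '%s\n' "-Bstatic"    ;;  # static linking
    -call_shared) printf '%s\n' "-Bdynamic"   ;;  # dynamic linking
    -g3)          printf '%s\n' "-g"          ;;  # debug information
    *)            printf '%s\n' "REVIEW: $1"  ;;  # no mapping recorded
  esac
}

map_flag -non_shared   # -> -Bstatic
map_flag -om           # -> REVIEW: -om
```

Flags that fall through to REVIEW (such as Tru64's post-link optimization option here) are often the ones that hide real semantic differences, so they deserve the most attention.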

In this example, the assessment reveals that a number of development tools are currently available on another Sun platform within the enterprise. Although this development environment has not been used to create the Tru64 version of the application you want to port, you can leverage some of the existing tools that are available (for example, compilers and debuggers). Assume that you have determined that this platform can be used for the migration exercise.

Determine Third-Party Products Usage

While all applications depend on support from the operating environment and associated utilities, many applications are also designed to work with the functionality provided by third-party products that are integrated into the execution architecture. When the application is ported, this supporting software must be ported as well, as part of the application infrastructure. In the example, the most significant piece of third-party software is the Sybase database that is implemented on the Tru64 environment. However, additional third-party software is used to generate reports and administer the database.

When assessing third-party products, you must ensure that these or similar products are available for both the new OS and the new database.

This migration case study involves the conversion of a Sybase database implemented on the Tru64 platform to an Oracle database running on the Solaris platform. FIGURE 1 on page 7 provides an overview of the Sybase implementation.

When attempting to assess the database component of the application, be sure to assess the deployment of database technology, not just the database itself. Databases have evolved to become much more than simple repositories for data. Complex logic can be programmed into the database. Database vendors encourage developers and database administrators (DBAs) to store database-related (or data-intensive) logic inside the database. The program units that are locally stored in databases are often called stored procedures and triggers.

The practice of storing program logic in the database aids in the assessment because the majority of the database-related logic is centralized in a single location, although some interaction with the database will still be specified in the programs themselves. DBAs who are concerned about database performance also encourage storing program logic in the database, because locally stored logic has positive performance implications. These stored program units are written in procedural extensions of the Structured Query Language (SQL).

Regrettably, although there is an SQL standard, the degree of compliance with this standard varies greatly from one database vendor to another. Different database vendors might develop their own extensions to the SQL language to make it more powerful and easier to use and, in some cases, to address specific database performance issues through optimization.

The assessment of the database technology must address the stored procedures as well as database object behavior. Database objects with the same name (box 16 in FIGURE 1 on page 7) can behave differently from one database to another. For example, stored procedures, triggers, and temporary tables are supported in both Sybase and Oracle, but there are no standards for the behavior of these objects. Consequently, procedures, triggers, and temporary tables stored in Sybase behave differently than those stored in Oracle. These differences must be well understood before you can accurately assess the amount of change and effort that a migration will require.

Take extra care when migrating application logic from one version of SQL to another. In this example, translating a full-blown Sybase T-SQL application to Oracle's PL/SQL could result in an extensive modification or a total rewrite. You must carefully identify the use of language features that might require the reimplementation of logic on the new deployment because the SQL extensions and their underlying functionality might not be available. For this reason, the conversion of the Sybase implementation will be considered a reengineering or rearchitecture effort.

When assessing the database technology integration with the application, be aware that each database vendor has its own version of SQL and that these versions can vary considerably. Understanding the differences in SQL implementations will help you understand the nature and amount of work that is needed for a project of this nature.
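One way to size that gap early is to scan the .sql sources for constructs that are Sybase-specific, such as '#'-prefixed temporary tables, @@ global variables, and the 'go' batch separator. The sample file below is fabricated, and the pattern list is deliberately incomplete; a real assessment would grow it as vendor-specific idioms are discovered:

```shell
# Flag Sybase T-SQL constructs that will not carry over to Oracle
# unchanged: '#' temp tables, '@@' globals, and 'go' separators.
# The sample .sql file is fabricated for illustration.
cat > load_inv.sql <<'EOF'
create table #scratch (id int)
insert into #scratch select id from invtry
select @@identity
go
EOF

# Print each flagged line with its line number.
grep -n -e '#[A-Za-z]' -e '@@' -e '^go$' load_inv.sql
```

A count of flagged lines per file gives a rough, defensible first estimate of how much of the SQL code base needs reengineering.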

In addition to the Sybase database technology, our example makes use of third-party reporting tools (box 11) and DBA tools (box 13). If the tool vendor supports both the source and target databases and platforms, these tools can most likely be replaced with equivalent versions. If a tool cannot be replaced for any reason, then all the components that use it will most likely need to be rewritten. To keep the example simple, assume that you can replace all the third-party tools and libraries.

In the example, all the components that use SQL will be affected in the same manner. These components are:

  • Stored procedures and triggers (box 15). These are pure native SQL and are discussed above.

  • C programs that use embedded SQL (box 2). Embedded SQL allows developers to directly use SQL statements inside a programming language they are familiar with. In our example, the SQL statements are embedded inside C programs. These embedded SQL programs are then passed to a precompiler (box 3). The precompiler converts the embedded SQL to statements that directly call the native database API (box 4). The output is a generated C program that is then passed to the C compiler and linker.

  • Report programs that use third-party reporting tools (box 11). Note that for this scenario, it is not enough to replace the reporting tool. Report programs that use third-party tools usually issue SQL or SQL-like syntax (possibly allowing database vendor SQL language extensions), so they will have to be modified or rewritten.

  • DBA maintenance scripts (box 13). The database engine (box 17) stores data in objects called tables (box 16). The type of data that is going to be stored is defined at the table level by data types that are native to the database engine being used. When changing database engines, one of the first tasks is to determine whether all the data types used by the source database can be successfully mapped to data types in the target database.

Problems arise when the data types that are used in the source database cannot be mapped to the target database. If a data type cannot be mapped, you must find a way to mimic its functionality in the target database. This simple data type issue could potentially trigger a chain reaction of changes that need to be made to all components that reference the table. The extent of modifications will depend on the nature of the data type in question and how extensively it is used by all the components that are using the database.

In our example, all data types map from the Sybase implementation to the Oracle implementation without difficulty.
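For the data type check itself, a small mapping table is often enough for a first pass. The Sybase-to-Oracle pairs below are commonly cited defaults; treat them as assumptions to verify against the vendors' migration documentation rather than a complete or authoritative mapping:

```shell
# First-pass Sybase -> Oracle column type mapping; anything not in
# the table is flagged for manual review. Pairs are illustrative.
map_type() {
  case "$1" in
    int)      printf '%s\n' "NUMBER(10)"   ;;
    smallint) printf '%s\n' "NUMBER(5)"    ;;
    varchar)  printf '%s\n' "VARCHAR2"     ;;
    datetime) printf '%s\n' "DATE"         ;;
    money)    printf '%s\n' "NUMBER(19,4)" ;;
    text)     printf '%s\n' "CLOB"         ;;
    image)    printf '%s\n' "BLOB"         ;;
    *)        printf '%s\n' "REVIEW: $1"   ;;
  esac
}

map_type datetime   # -> DATE
map_type timestamp  # -> REVIEW: timestamp
```

Running every column type in the schema through such a table quickly separates the mechanical conversions from the types that need a design decision.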

Assess the Application

As detailed previously, you must acquire the code for the application. That code will help you estimate how much effort will be required for the migration. There are two issues to consider when assessing an application:

  • Understanding the composition of the code used by the application. Many legacy applications have significant size (for example, millions of lines of code). Simply trying to understand the layout of the source tree and the types of files can be a complex task.

  • Understanding which files within the source distribution are actually used to build the application. As an application evolves, business functionality might no longer be required and new functionality can be added. Although this can be reflected when the application is built, developers seldom remove the old, unused code from the source code directory. Avoid transforming code that isn't being used.

The following appsurvey output represents the composition of the files under the source code repository of the inventory application.

Module | FileType    | # Lines | # of Files | # API issues

invtry | .4          | 127     | 1          | 0 
invtry | .C          | 429661  | 605        | 44 
invtry | .H          | 24570   | 216        | 9 
invtry | .Make_files | 20174   | 126        | 0 
invtry | .Msg        | 6572    | 24         | 0 
invtry | .acf        | 1916    | 86         | 0
invtry | .bak        | 430     | 8          | 0 
invtry | .bld        | 1914    | 6          | 0 
invtry | .c          | 656575  | 415        | 14 
invtry | .cat        | 25      | 1          | 0 
invtry | .cfg        | 131     | 11         | 0 
invtry | .cl         | 5070    | 34         | 0 
invtry | .cpp        | 6017    | 2          | 0 
invtry | .ctl        | 27908   | 54         | 0 
invtry | .dat        | 20684   | 11         | 0 
invtry | .def        | 81      | 1          | 0 
invtry | .h          | 116618  | 356        | 1 
invtry | .sh         | 2790    | 6          | 0
invtry | .sql        | 301904  | 699        | 0 
invtry | .test       | 133     | 1          | 0 
invtry | .tidl       | 4780    | 51         | 0 
invtry | .tmp        | 453     | 1          | 0 
invtry | .tpl        | 8169    | 36         | 0 
invtry | .wpm        | 162     | 1          | 0 
invtry | .zip        | 146     | 2          | 0 
TOTAL  |             | 2672376 | 4066       | 68

Remember that it is possible that not all of these files will be used to create the application. An analysis of the build log will reveal which files are used when the application is created.
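That cross-check can be scripted: list the files in the tree, list the files the build log actually compiles, and compare the two sets. The directory layout, file names, and build.log contents below are fabricated for illustration:

```shell
# Compare the files in the source tree with the files the build log
# actually compiles; the remainder need not be migrated. All names
# and the build.log contents are fabricated for illustration.
mkdir -p src
touch src/main.c src/report.c src/legacy_old.c
cat > build.log <<'EOF'
cc -c src/main.c
cc -c src/report.c
EOF

ls src/*.c | sort > all_files.txt
grep -o 'src/[A-Za-z0-9_]*\.c' build.log | sort -u > built_files.txt

# In the tree but never built -- candidates to leave behind.
comm -23 all_files.txt built_files.txt
```

On a code base the size of this example (4066 files), excluding unused sources before the porting effort starts can remove a substantial amount of work.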

In this example, you are considering a custom application written in the C programming language. When implementing this sort of migration, focus on the differences between the APIs provided by the Tru64 environment and those provided by the Solaris OS. The following sample breaks down the API differences.

Total Files: 3717    LinesOfCode: 1185289   Statements: 388262
accept                       4   Weight: 5
acosd                        2   Weight: 5
asind                        4   Weight: 5
atand                        5   Weight: 5
bind                        33   Weight: 5
bind_to_cpu                  1   Weight: 25
connect                      5   Weight: 5
cosd                        15   Weight: 5
endhostent                   2   Weight: 5
exp                          1   Weight: 5
fork                        28   Weight: 3
freopen                      6   Weight: 5
fseek                       28   Weight: 5
gethostbyaddr                4   Weight: 5
gethostent                   1   Weight: 25
getsockname                  2   Weight: 5
getsockopt                  15   Weight: 25
getsysinfo                   6   Weight: 200
gettimeofday                90   Weight: 5
getuid                       1   Weight: 3
htonl                       44   Weight: 5
htons                       63   Weight: 5
inet_addr                   18   Weight: 3
inet_lnaof                   1   Weight: 5
inet_netof                   1   Weight: 5
inet_network                 1   Weight: 3
inet_ntoa                   14   Weight: 3
ioctl                      104   Weight: 25
kill                        26   Weight: 5
listen                       4   Weight: 5
log                          2   Weight: 5
min                          9   Weight: 25
mq_setattr                   1   Weight: 5
msgctl                       2   Weight: 5
msgrcv                      31   Weight: 5
munmap                       3   Weight: 5
nint                         9   Weight: 5
nintf                        4   Weight: 5
ntohl                       14   Weight: 25
ntohs                       15   Weight: 25
open                         6   Weight: 25
opendir                     13   Weight: 5
pfopen                       1   Weight: 200
pow                         22   Weight: 5
pthread_cleanup_pop          3   Weight: 5
pthread_cleanup_push         3   Weight: 5
pthread_delay_np            21   Weight: 25
pthread_get_expiration_np   13   Weight: 25
pthread_lock_global_np     125   Weight: 25
pthread_unlock_global_np   128   Weight: 25
recv                        14   Weight: 5
recvfrom                    12   Weight: 5
remainder                    1   Weight: 5
sched_getscheduler           1   Weight: 5
semctl                       4   Weight: 5
semget                       1   Weight: 3
semop                        8   Weight: 3
send                        11   Weight: 5
sendto                       8   Weight: 5
sethostent                   1   Weight: 5
setsid                       3   Weight: 3
setsockopt                  20   Weight: 25
setsysinfo                   1   Weight: 200
settimeofday                 4   Weight: 5
shmat                        7   Weight: 3
shmctl                      10   Weight: 3
shmdt                        5   Weight: 3
shmget                       7   Weight: 3
sigaction                    5   Weight: 25
sigwait                      5   Weight: 25
sind                        13   Weight: 5
socket                      55   Weight: 5
sqrt                       118   Weight: 5
statvfs                      2   Weight: 3
strftime                    49   Weight: 5
system                       2   Weight: 5
table                        3   Weight: 200
tand                         3   Weight: 5
template                     1   Weight: 3
times                        2   Weight: 3
ulimit                       8   Weight: 25
uswitch                      2   Weight: 200
wait                        40   Weight: 3
waitpid                      2   Weight: 3
write                        2   Weight: 5

Assess the Compute and Storage Platform

In the example, the capacity of the existing hardware platform is determined. Based on this information, a replacement platform is chosen from the Sun product line that will provide the required performance, reliability, scalability, and manageability. The details of hardware sizing are outside the scope of this document.

Assess the Network Infrastructure

Next, examine the networking facilities inside the enterprise's data center to determine whether they can support the capacity and load that the migrated environment will generate. Where appropriate, additional capacity might have to be acquired (for example, upgrading from 10BASE-T to 100BASE-T). All aspects of the network must be considered, from the transport technology (FDDI, Token Ring, Ethernet, and the like) to the number of ports available on the switch or hub that will be used to cable the network interface card (NIC).

Once you determine the networking technology, you can order the correct NIC for the hardware described above.

In the example, the 100-megabit network has sufficient capacity, and a port is available on the switch serving the data center. A 100BASE-T NIC is required for the platform, as well as a 10-meter cable to make the connection.

Assess Facilities

Next, assess the facilities and any changes that will be required to support the migrated solution. During this assessment, consider power, space, network connections, door frame size, and similar requirements.

In the example, the new platform is roughly the same size as the older platform, so it can fit through all the doorways. However, it will have to be installed in a previously unused corner of the data center because the old machine will not be retired for some time.

The newer Sun hardware in this example requires more power but produces less heat than the older platform. However, a new electrical receptacle will be required for compatibility with the new hardware. In this case, the client decides to re-route an existing cable run, improving cabling efficiency with the existing machines and bringing power to the new location.

Assess Management Tools

Next, you assess the existing management tools and determine how they can be moved to the target platform. In this case, the client uses BMC Patrol to monitor the old Tru64 environment. This product is also available for the Solaris environment and has already been deployed on other Sun platforms within the data center. Additional ad hoc system monitoring is performed by using the cron utility to schedule scripts that call conventional UNIX utilities such as iostat, vmstat, df, and the like.

Assess People and Process

The skills of the organization must be assessed to determine whether any gaps exist. A curriculum is then developed to address any shortfalls. In our example, the IT staff already supports a number of Sun/Solaris/Oracle environments, which means that no additional training should be required.

Understanding Threading Models

Applications use threads to implement fine-grained parallelism. Thread libraries have been created for most modern operating environments. The most common threading implementations are POSIX threads and Solaris threads, which have similar semantics. The Solaris OS supports both threading models.

DEC's implementation of threads differs slightly from these implementations. In the example, you would use a compatibility library to replace threading APIs that are found in the Tru64 environment, but not found in the Solaris environment.
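A quick way to size the compatibility library's job is to inventory the "_np" (non-portable) thread calls in the source, since DECthreads marks its nonstandard entry points with that suffix. The sample source file below is fabricated; on the real code base, the grep would run over the full tree:

```shell
# Tally DECthreads "_np" calls that a compatibility library must
# cover. The sample source file is fabricated for illustration.
mkdir -p src
cat > src/worker.c <<'EOF'
#include <pthread.h>
void critical(void) {
    pthread_lock_global_np();
    /* ... */
    pthread_unlock_global_np();
}
EOF

# Each distinct _np function is one entry point the compatibility
# library must provide; the counts show how widely each is used.
grep -ho 'pthread_[a-z_]*_np' src/*.c | sort | uniq -c
```

In the API breakdown shown earlier, pthread_lock_global_np and pthread_unlock_global_np alone account for more than 250 call sites, so covering a handful of _np entry points in one shim pays off across the whole code base.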
