Emerging technologies and business pressure: The driving forces for an information management reengineering strategy

Gomeni, Roberto

The pharmaceutical industry is characterized by mergers and restructuring; extensive use of outsourcing and global development strategies; new regulatory requirements for electronic records; new methodologies for conducting large clinical trials; and the need to effectively exchange information between the components of a corporate information system. Middleware connectivity and workflow management systems ensure comprehensive integration of a corporate information system. Middleware enables pharmaceutical companies to exchange information between multiple systems or components, and to use and distribute the information on a server using an event-driven mechanism. Workflow technology can streamline and secure the information flow and approval processes, supplying a tool to electronically view, manage, revise, share, and distribute virtually any information or document across the enterprise without paper. With these tools, information management can move from the traditional request/reply model, in which users and applications must ask for status updates on needed information, to an event-driven approach, in which they receive this information automatically when it becomes available. A strategy for an adaptive evolution of the information management process in response to continuous improvements in technology and the evolution of business strategies is presented.

Key Words: Middleware; Workflow; Information management; Data management; Information technology



Among the most information-intensive industries in the world, the pharmaceutical industry faces information management challenges not seen in most other areas of business (1). Pharmaceutical companies are under pressure to improve operational efficiency and to continually simplify and automate processes. While progress has been made in automating individual business areas, there is often room for improvement between those areas. Strategic business objectives and regulatory requirements force pharmaceutical companies to generate more comprehensive and better quality data in shorter periods of time. Pharmaceutical companies need a global strategy (2,3) to organize, manage, and dispose of information, and reduce the time to decision making and market. This pushes companies toward implementing a continuous adaptive process to integrate organizational changes and benefit from evolutions in information systems and technology.

Information technology (IT) concerns acquisition, processing, storage, and dissemination of information through computers, telecommunication, networks, and electronic devices. As the need for employees to work together increases, real-time interactive information sharing among the stakeholders in the research, development, and marketing processes becomes a strategic need. These stakeholders include patients, investigators, sponsors, study monitors, project managers, medical doctors, quality assurance, safety managers, data managers, laboratories, statisticians, and regulatory authorities.

Information systems identify computers and/or telecommunications-related equipment or interconnected system or subsystems that are used in the acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of voice and/or data. They include software, firmware, and hardware.

As Kubick points out, the pharmaceutical industry often seems reluctant to employ new technologies (4). This has frequently been attributed to the absence of competitive pressure, concern with regulatory acceptance, lack of vision, and lack of knowledge on effective technology implementation. Like any manufacturer, however, pharmaceutical companies are sensitive to improvements that affect efficiency and productivity, and can ensure competitive advantages. This paper presents a strategy for an adaptive evolution of information management in response to continuous improvements in technology and business pressure.


The measure of data management success is the satisfaction of its client: the regulatory authorities. Thus, pharmaceutical companies primarily viewed data management as a standalone process with its own rules, methods, regulatory requirements, and guidances that required specific information systems and technologies. These systems were developed independently of the corporate information system, using an integrated and centralized approach with software and database support for a clinical data management process. They have grown and changed through multiple generations of hardware and software. Batch, transactional, and client/server systems coexist. Most software systems have been modified to meet short-term, tactical goals; this often restricts the ability to react in the future.

Clinical research management must connect applications (clinical trials management, clinical data management, safety management, clinical project planning, electronic document management, medical information management, etc.) to optimize operations. Custom code and operational procedures, and inter-dependence of the applications, make it difficult and costly to accommodate business change. This slows an enterprise's ability to quickly react to change and leads to a progressive paralysis of communication channels. As organizations require more integrated applications to stay competitive, appropriate technologies must be used to avoid rigidity.


Business pressure requires technology that provides complete and integrated clinical research solutions and applications to ensure effective information management and dissemination. Modern drug development is characterized by mergers and restructuring; extensive outsourcing and global development strategies; new regulatory requirements for electronic records; new methodologies for conducting large-scale clinical trials; and the need to exchange information between the components of a corporate information system, as displayed in Figure 1.


Pharmaceutical companies require systems that can collect, import, exchange, and distribute information across corporate software systems and databases. When working with preferred contract research organizations or development partners, complex data transformation and synchronization tools may be required for secure and reliable data exchange and decision making. The exchanged information includes: electronic mail with attachments, scanned case report forms (CRFs) and regulatory document images, CRFs and clarification data sheets, analysis data sets, and desktop application files (eg, documents, spreadsheets, and slide presentations). An appropriate information technology infrastructure can help accelerate clinical development (5).

Large-Scale Clinical Trials

New developments in clinical trial methodology push pharmaceutical companies to organize large multinational trials (2) and megatrials (6,7). These large, simple randomized trials assess the effects of widely practicable treatments on mortality and morbidity in common diseases. They typically randomize tens of thousands of patients and provide clear and reliable information about the effects of several treatments. Unlike conventional clinical trials where the study stops when the last patient enrolled completes a predefined treatment duration (or discontinues), mega-trials terminate when the requisite number of clinical endpoints has been reached. Effective information management is required to identify, as early as possible, the end of the trial. Also, it has recently been shown (8) that the long duration (several years), multinational nature, and large volume of data generated in such trials require extensive, flexible, and adaptive IT/information systems to ensure consistent, timely, and effective data management.

Global Development

The ever-increasing time and cost to bring new drugs through the clinical process is a powerful incentive to pursue simultaneous multinational development. Global clinical development is becoming common, and data from various countries are used in multiple submissions. Global development requires an integrated database, so that even if clinical studies are performed in one country, integrated safety and efficacy analyses can be centrally performed (9). Multinational companies still submit duplicate and sometimes different paper drug applications to international agencies with different requirements and computer systems. With global development and registration, data collected in one country can support product registration in another (10). Successful global development requires state-of-the-art IT and centrally coordinated and effective teamwork (11).

Regulatory Issues

Regulatory agencies have encouraged a paperless regulatory process by adopting electronic submissions. Guidelines were released for electronic submissions without accompanying paper copies. Electronic signatures on electronic records that comply with the regulations are considered equivalent to handwritten signatures on paper records (12). Electronic standards are the rules by which data, text, and images are decomposed, stored, retrieved, and transmitted to computer systems. To make global submission feasible, pharmaceutical companies must develop the necessary IT infrastructure (13), such as network bandwidth, digital communication lines, and scalable hardware and software.

Corporate Information Systems

Recently, there has been a global movement toward widespread dissemination of Enterprise Resource Planning (ERP) software applications (14) such as SAP R/3, BaaN, and PeopleSoft. ERP uses one database, application, and user interface to assist employees enterprise-wide in monitoring and controlling the business. Such systems are expected to automate finance, accounting, human resources, and manufacturing in order to accurately schedule production, fully use capacity, reduce inventory, and meet shipping dates. ERP integrates and synchronizes isolated functions into streamlined business processes. Many ERP systems work through networked business objects and component-based architecture. These collaborative, flexible business-process elements automate and improve the manufacturing and supply chain. To complement ERP systems, corporate standard software (internally developed or acquired from third-party suppliers) has been adopted for decision making, simulation, and data analysis in areas such as: safety, clinical trials, clinical data, project, and electronic document management; medical information; and data mining (software and services that explore data to discover relationships and patterns that lead to proactive decision making). Information sharing among these systems is a challenge.


Pharmaceutical and biotechnology companies have undergone major changes in the last several years; more are likely. Financial analysts estimate that as the market growth rate decreases, the rate of restructuring, new business ventures, and mergers will increase (15,16). Optimizing decision making and ensuring smooth integration is critically important to an effective reorganization/merging process. A flexible and open IT system is a competitive advantage in establishing quick and effective links between heterogeneous systems with different application components.

The Internet

Internet technology offers new tools for successfully reengineering pharmaceutical research and development. The Internet, as a public infrastructure, permits the speedy and inexpensive creation of sites to promote company image; disseminate product information; dialogue with clients; and support secure networks to connect trial sites, contract research organizations, sponsors, clinical monitors, doctors, patients, and other trial participants. This more rapid acquisition and distribution of clinical trial information enables the company to do more in less time, creating an economic advantage for the clinical trial industry (17).

An infrastructure that ensures appropriate security for large-scale Internet use can be developed. Servers can be protected by a firewall (a set of related programs, located at a network gateway server, that safeguards a private network from other networks). Additional intrusion detection software with encryption systems can be used for clinical and sensitive data. Access to the system can be controlled by password/user identifiers or by more secure authentication systems such as SecurID cards, fingerprints, voiceprints, facial symmetry, or iris scanning (18).
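Password/user-identifier access control of the kind described above rests on never storing the password itself, only a salted derivation of it. A minimal sketch, using only Python's standard library; the function names are illustrative and not taken from any system mentioned in this paper:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash suitable for storage in place of the password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored):
    """Re-derive the digest and compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored)

salt, stored = hash_password("s3cret")
assert verify_password("s3cret", salt, stored)
assert not verify_password("wrong", salt, stored)
```

Stronger authentication mechanisms (tokens, biometrics) replace or supplement this check, but the stored-credential principle is the same.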

The Internet and the World Wide Web have been successfully used in managing large-scale clinical trials for remote randomization and data entry, distributing information on trial progress, and managing an electronic investigators' forum (19). The Internet approach emphasizes collaboration, interdependency, and interactive sharing of information among stakeholders in the research and development processes. The real advantages of the Internet, however, can only be achieved with an appropriate IT/information systems infrastructure.

New Data Capture Technologies

Most clinical trials require the collection of a massive amount of data at the investigator site. Traditionally, these data were collected on paper CRFs that are manually retrieved and verified by the study monitor or delivered by a courier or mail. Then, the central site manually codes and captures the information using double keystroke entry. Finally, computerized consistency and missing data checks generate data clarification forms that are manually delivered to the investigator for verification and corrections. Emerging technologies and new regulations that allow electronic submissions without paper copies encourage sponsors to use data capture systems. These technologies save sponsors time and money and improve data quality. A recent review of the most promising technologies (eg, interactive voice response systems, remote data entry, direct access to the electronic medical record, document scanning with optical character recognition, FAX to collect data, and hand-held portable equipment for data acquisition) (20) emphasized that they can improve data capture speed, decrease cost, and increase quality.

By effectively managing IT/information systems integration in the context of new business, regulatory, technological, research, and development issues, companies can reduce their costs and improve their products and customer service. New information technology architecture based on component-based information systems, component connectivity, and workflow management offers valuable tools to reengineer the information management process (21).


Component-based systems technology (22), a recent development, builds large software systems by integrating existing components. By enhancing the flexibility and maintainability of systems, this approach can reduce development costs, assemble systems rapidly, and reduce the spiraling maintenance costs of supporting and upgrading large systems. This approach assumes that parts of large software systems reappear often enough that common parts should be written only once, and that common systems should be assembled through reuse.

Large-scale software development is increasingly achieved through component selection, evaluation, and assembly processes with components from sources external to the system building organization. As pointed out by Brown (24), many factors drive this approach:

The World Wide Web (WWW) and the Internet have increased understanding and awareness of distributed computing. The WWW encourages users to consider systems to be loosely coordinated services that reside 'somewhere in hyperspace.' In accessing information it is not important to know where the information resides, what underlying engines are used to query and analyze the data, and so on.

The use of object-oriented software design techniques and languages, and the move from mainframe-based systems toward client/server computing, lead developers to consider application systems as separable, interacting components. In some applications, for example, computation-intensive storage and search engines are separated from visualization and display services.

Rapid technology evolution creates both advantages and problems. Organizations are struggling to build systems that can incrementally take advantage of technology improvements over the system's lifetime.

Flexibility in accepting technology upgrades is a key to gaining competitive advantage. Figure 2 shows a typical component-based clinical data management system. The components are organized by function (electronic CRF design, remote data entry, electronic monitoring and reporting, clinical data management, clinical trial management, medical data review, safety management, data analysis) around a data warehouse which is a central repository for clinical data storage. This architecture allows each component to be optimized for the task, and components to be updated without reconsidering the entire system's structure.
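The architecture just described can be sketched as independent components that share only a central data store; each component can then be replaced without touching the others. The class and field names below are illustrative, not drawn from any product:

```python
# Minimal sketch: functional components organized around a central
# clinical data repository. All names are illustrative.

class DataWarehouse:
    """Central repository shared by all functional components."""
    def __init__(self):
        self.records = {}

    def put(self, key, value):
        self.records[key] = value

    def get(self, key):
        return self.records.get(key)

class RemoteDataEntry:
    """One component: writes CRF data into the shared store."""
    def __init__(self, store):
        self.store = store

    def submit_crf(self, patient_id, crf):
        self.store.put(("crf", patient_id), crf)

class MedicalDataReview:
    """Another component: reads the same data for review."""
    def __init__(self, store):
        self.store = store

    def review(self, patient_id):
        return self.store.get(("crf", patient_id))

store = DataWarehouse()
RemoteDataEntry(store).submit_crf("P001", {"weight_kg": 72})
assert MedicalDataReview(store).review("P001") == {"weight_kg": 72}
```

Because the two components interact only through the warehouse interface, either one can be upgraded in isolation, which is the flexibility the text attributes to this architecture.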


Middleware connectivity software assures and controls the distribution and exchange of information across software components (databases and application software). It consists of a set of enabling services allowing multiple processes, running on different computers, to interact across a network. Middleware systems were developed to migrate mainframe applications to client/server applications and to communicate across heterogeneous platforms. They enable multiple systems or components to exchange and use information. Commercially available middleware initiatives include the Open Software Foundation's Distributed Computing Environment, Object Management Group's Common Object Request Broker Architecture, and Microsoft's COM/DCOM (25).

Within middleware technology, the object request broker (ORB) (26,27) manages communication and data exchange between objects. ORBs promote interoperability of distributed object systems and enable users to build systems by piecing together objects from different vendors that communicate with each other. ORB technology defines interfaces, locates and possibly activates remote objects, and communicates between clients and objects. An ORB acts like a telephone exchange; it provides a directory of services and helps establish connections between clients and services, as illustrated in Figure 3. Encouraged by the rapid adoption of distributed object and web technology, computer and software companies are developing and/or using component-based technology to change the way software-intensive systems are conceived, managed, developed, and deployed.
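The telephone-exchange analogy can be made concrete: a broker keeps a directory of named services, locates the requested object, and forwards the client's call. This is a deliberately simplified in-process sketch (real ORBs such as CORBA implementations also handle networking, marshalling, and activation); the service names are invented for illustration:

```python
# Minimal object-request-broker sketch: a directory of named services
# that connects clients to implementations. Names are illustrative.

class Broker:
    def __init__(self):
        self._services = {}

    def register(self, name, obj):
        """Add a service to the directory."""
        self._services[name] = obj

    def invoke(self, name, method, *args):
        """Locate the named object and forward the call to it."""
        service = self._services[name]
        return getattr(service, method)(*args)

class RandomizationService:
    """Example service a client might reach through the broker."""
    def assign(self, patient_id):
        return f"{patient_id}:ARM-A"

orb = Broker()
orb.register("randomization", RandomizationService())
assert orb.invoke("randomization", "assign", "P001") == "P001:ARM-A"
```

The client never holds a direct reference to the service, only its directory name, which is what lets vendors' objects be swapped behind a stable interface.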


The workflow management system is a new software application to design and manage the flow and the decisional process related to distributing and exchanging information (28). Workflow technology allows one to electronically view, manage, revise, share, and distribute virtually any information or document across the enterprise in a paperless environment, and to streamline and secure the approval and decision-making process. Workflow is the flow of information and control; workflow management comprises the procedures for managing that flow in a business process. Typical examples of processes are: order processing, purchasing, holiday requests, time sheets, expense reports, data flow and validation processes, research and production procedures, and customer support.

A typical example in clinical research is the data flow and validation process. Investigators or authorized site personnel complete CRFs; the monitor performs on-site validation, sends CRFs to data management, generates a monitoring report, and updates the CRFs and data clarification form tracking system; data management receives the CRFs, checks the monitoring report and CRFs, and in case of problems informs and sends data query sheets to the study monitor supervisor. The monitor forwards the queries to the investigator for resolution. The investigator answers, signs, and returns completed queries to the monitor who forwards them to data management.
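The query-resolution loop described above is, in essence, a small state machine: a data clarification form moves between data management, the monitor, and the investigator until it is resolved. A sketch with invented state and action names (the real process has more participants and exception paths):

```python
# Illustrative state machine for the data clarification (query)
# lifecycle described above; a simplification of the real process.

TRANSITIONS = {
    ("issued", "data_management_sends"): "with_monitor",
    ("with_monitor", "monitor_forwards"): "with_investigator",
    ("with_investigator", "investigator_answers"): "returned",
    ("returned", "monitor_forwards"): "resolved",
}

def advance(state, action):
    """Move the query to its next state, rejecting out-of-order actions."""
    try:
        return TRANSITIONS[(state, action)]
    except KeyError:
        raise ValueError(f"action {action!r} not allowed in state {state!r}")

state = "issued"
for action in ("data_management_sends", "monitor_forwards",
               "investigator_answers", "monitor_forwards"):
    state = advance(state, action)
assert state == "resolved"
```

Encoding the process this way is what allows a workflow engine to track every query's status and to refuse steps taken out of sequence.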

Implementing a workflow system requires a conceptual model using four key components of a business process:

1. The people who participate and the roles they play,

2. The information that is routed around and/or required by users,

3. The route the information takes, and

4. The decisions made in the process and their impact on the flow.
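The four elements above can be captured directly in a workflow definition: each step records who participates, what information they receive, and which decision leads where; the route is then the chain of decisions. All step, role, and decision names below are illustrative:

```python
# The four key components of a business process, as a minimal
# workflow definition. Names are illustrative.

from dataclasses import dataclass, field

@dataclass
class Step:
    role: str          # 1. the person/role who participates
    payload: str       # 2. the information routed to them
    next_steps: dict = field(default_factory=dict)  # 4. decision -> next step

workflow = {
    "review": Step("monitor", "completed CRF",
                   {"ok": "archive", "query": "clarify"}),
    "clarify": Step("investigator", "data query sheet",
                    {"answered": "review"}),
    "archive": Step("data_management", "validated CRF", {}),
}

# 3. the route the information takes emerges from the decisions made:
step, decisions = "review", ["query", "answered", "ok"]
for d in decisions:
    step = workflow[step].next_steps[d]
assert step == "archive"
```

A definition in this form is exactly what the process definition tools described next produce, usually through a graphical flowchart editor rather than by hand.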

Once these elements have been identified, workflow technology can be implemented using process definition tools such as the workflow process definition module, servers, client applications, and monitoring and management tools, as shown in Figure 4.

Workflow definition tools allow the user to define and map out the business process in the corporate information system. These tools are often graphical and look like a flowchart. Definition consists of creating nodes representing messages sent to the people taking part in the workflow, and adding past events and decisions to be made in the workflow. Workflow definition tools are address/message-driven, decision-driven, or event-driven. If they are address/message-driven, the main elements in the flowcharts are the messages sent to the participants. If they are decision-driven, the main elements are the decisions sent to the participants. If they are event-driven, the main elements are the events in the business process.

Workflow servers are programs that read workflow definitions, interpret the instructions, and execute and track them. Workflow client applications are programs that interact with the workflow. Some use a web browser, making installation and administration easier as users do not need to learn new software.

Workflow management tools define multiple user levels of security. Security rules can be defined at the level of each workflow step and modified as needed. Authorized users can change workflow jobs in process without altering the job definition. Monitoring tools track significant events, such as when a user does not approve or reject a document within a specified time frame, and generate appropriate actions. The workflow engine supports the need to structure the information flow process and enables authorized users to generate ad hoc events within this process.
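The timeout tracking just mentioned amounts to comparing each pending approval against a deadline and flagging the overdue ones so the engine can generate follow-up actions. A minimal sketch; the document names and the three-day limit are invented for illustration:

```python
# Illustrative overdue-approval check: flag documents not approved
# or rejected within the allowed time frame.

from datetime import datetime, timedelta

def overdue_items(pending, now, limit=timedelta(days=3)):
    """Return documents whose approval request exceeded the time frame."""
    return [doc for doc, requested_at in pending.items()
            if now - requested_at > limit]

now = datetime(2024, 1, 10)
pending = {
    "CSR-draft": datetime(2024, 1, 2),    # 8 days pending -> overdue
    "protocol-v2": datetime(2024, 1, 9),  # 1 day pending -> within limit
}
assert overdue_items(pending, now) == ["CSR-draft"]
```

In a real workflow engine the flagged items would trigger the "appropriate actions" of the text, such as an escalation e-mail to the approver's supervisor.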


Clinical research processes need to move from the traditional request/reply model, in which users ask for status updates on needed information across the distributed applications within the corporate information system, to an event-driven approach, in which they receive this information automatically and immediately. Information delivery is initiated by the information server. The end user automatically receives information as soon as it becomes available in any component of the corporate information system through a middleware connectivity environment on his/her desktop computer. Application components inform the server of end user changes or updates by generating an event. The workflow management system distributes information when a new event is detected or a decision is made by an authorized user. The result is an event-driven clinical research process: a proactive, instantly responsive entity, in which better informed decisions are made faster, and cycle times are dramatically shortened.
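The shift from request/reply to event-driven delivery is the publish/subscribe pattern: components publish events to a bus, and subscribers are notified the moment information appears instead of polling for it. A minimal in-process sketch with invented topic and payload names:

```python
# Minimal publish/subscribe sketch: subscribers receive information as
# soon as a component publishes it, instead of polling for status.

class EventBus:
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, callback):
        """Register a listener for a topic (eg, an end user's desktop)."""
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        """A component generates an event; all listeners are notified."""
        for callback in self._subscribers.get(topic, []):
            callback(payload)

received = []
bus = EventBus()
bus.subscribe("enrollment", received.append)
bus.publish("enrollment", {"patient": "P001", "site": 12})
assert received == [{"patient": "P001", "site": 12}]
```

Production middleware adds persistence, delivery guarantees, and network transport on top of this core dispatch loop.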

For example, imagine that when an investigator calls into an interactive voice response system (IVRS) to provide basic demographic data on a patient available for a study, the IVRS assigns the patient to a blinded, randomized protocol-specific treatment group. In the meantime, the workflow management system, through the middleware connectivity network, will:

Update the drug supplies subsystem within the SAP application, check the status of available stocks, and if necessary, inform the person responsible via e-mail,

Generate instructions for preparing the requested drug with appropriate labeling,

Notify the drug supply delivery service to send a package to the investigator,

Send an alert e-mail to the study monitor,

Update the clinical trial tracking system, and

Update the enrollment curve status report and distribute it to clinical research managers.
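The fan-out above (one randomization event triggering several independent actions) can be sketched as a dispatch table mapping an event name to its handlers. All handler names are hypothetical; none correspond to a real SAP or IVRS interface:

```python
# Sketch of a randomization event fanning out to several workflow
# actions, as in the example above. All handler names are hypothetical.

actions_log = []

def update_drug_supplies(event):
    actions_log.append(f"supplies checked for site {event['site']}")

def notify_monitor(event):
    actions_log.append(f"monitor alerted for patient {event['patient']}")

def update_enrollment_report(event):
    actions_log.append("enrollment curve updated and distributed")

HANDLERS = {
    "patient_randomized": [update_drug_supplies, notify_monitor,
                           update_enrollment_report],
}

def on_event(name, event):
    """Run every action registered for the detected event."""
    for handler in HANDLERS.get(name, []):
        handler(event)

on_event("patient_randomized", {"patient": "P001", "site": 12})
assert len(actions_log) == 3
```

Adding a new downstream action (say, a second tracking system) means registering one more handler, without touching the IVRS or the other handlers, which is the decoupling the event-driven model provides.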

Figure 5 displays a typical event-driven global information system.


Remaining competitive in today's increasingly global and interconnected marketplace means developing new ways of doing business. The underlying business transformations are driven by information and communication technologies, which are fast becoming key factors in enhancing efficiency and productivity. Business processes are dynamic and must change as the market, the organization, and regulations change. Workflow systems are essential for responding to this environment.

Clinical research and development requires information sharing beyond traditional corporate bounds to an extended team that may include investigators, project managers, central labs, contract research organizations, and others. Streamlining and securing the approval processes among team members, preventing important details and deadlines from being missed, and tracking the progress of time-critical action items are critical issues. The middleware connectivity system and workflow technology enable users to electronically view, manage, revise, share, and distribute virtually any document or information across the enterprise. In a global information system users no longer need to think about where information is stored or what application is used. Workflow technology is becoming essential to the information infrastructure of any global enterprise. The middleware connectivity system is ideal for incorporating emerging information management technologies without reconsidering the entire corporate information system.