Brent Reed, director of information technology, writes about elements required for managing data for a successful clinical trial.
Managing a successful clinical trial requires meticulous, multidisciplinary collaboration within an organization. With groups ranging from clinical research associates (CRAs) to IT, coordinating with all the subject matter experts to properly design the essential elements of a trial can be a daunting task.
Considering all of the steps involved in managing a trial, there’s a single element that establishes commonality across all groups, traverses the entirety of a trial and is ultimately the final product — data.
Data in clinical trials encompasses a wide variety of detail, including site and subject information, visit schedules, tests, results, financials and even logistics. It’s not surprising that dedicated data management teams have been created to handle the sheer volume of clinical trial fields and the associated data delivery tasks.
Mature data management procedures are paramount to ensure protocols are accurately captured and configured in a study database. However, procedures alone don’t eliminate many of the challenges attributed to operational data demands.
International clinical trials often include hundreds of sites enrolling patients. In many cases, to support global data collection, multiple regional databases are established during the trial startup process. This can lead to a multitude of issues. To mitigate the risks and challenges associated with managing disparate databases, PPD® Laboratories’ central lab leverages a sophisticated single global database infrastructure design, the Preclarus® central lab database. This enables all global lab locations to import and export data from a single database instance, dramatically improving the quality, integrity and efficiency of data access.
This article compares some of the data requirements in a trial and describes the benefits PPD Laboratories’ central lab has realized by leveraging its single global database.
Study setup and database creation
As is the case for most contract research organization (CRO) central labs, after a primary study database has been created, the data needs to be copied to the disparate, geographically separated regional database servers. Depending on IT staff resources and how many times the database needs to be replicated, this can delay study startup.
The Preclarus central lab database eliminates all the steps involved in configuring multiple databases, as it only needs to be set up once.
Study database verification
In a multiple database infrastructure design, each copy and configuration of the regional database presents an opportunity for errors and data integrity issues, depending on the unique constraints of each database server location. Additional data management procedures typically are introduced to ensure sufficient quality control and verification steps are in place to gauge the accuracy of each database instance. This can result in longer study startups based on how many regional databases are required.
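To illustrate why each additional instance adds verification work, here is a minimal Python sketch (purely illustrative, not PPD’s actual tooling) that compares regional replicas against a primary database using order-independent table fingerprints. Every new replica is one more comparison that data management must run and document.

```python
import hashlib

def table_fingerprint(rows):
    """Order-independent fingerprint of a table: hash each row,
    then XOR the digests together so row order does not matter."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(sorted(row.items())).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return acc

def verify_replicas(primary, replicas):
    """Compare each regional replica against the primary, table by table.
    Returns a list of (region, table) pairs that do not match."""
    mismatches = []
    for region, tables in replicas.items():
        for name, rows in tables.items():
            if table_fingerprint(rows) != table_fingerprint(primary.get(name, [])):
                mismatches.append((region, name))
    return mismatches

# Hypothetical data: one replica matches, one is missing a row.
primary = {"visits": [{"subject": "S001", "visit": 1},
                      {"subject": "S002", "visit": 1}]}
replicas = {
    "emea": {"visits": [{"subject": "S002", "visit": 1},
                        {"subject": "S001", "visit": 1}]},  # same rows, reordered
    "apac": {"visits": [{"subject": "S001", "visit": 1}]},  # missing a row
}
print(verify_replicas(primary, replicas))  # [('apac', 'visits')]
```

With one global database instance, this entire comparison step disappears: there is nothing to reconcile against.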
In contrast, PPD Laboratories only has to perform quality control and verification on a single instance of the global database. This has resulted in higher quality and faster study startups compared to the rest of the industry.
Study data entry
Under a multiple database design, the regional data collected from various sources, including labs, sites and other trial stakeholders, flows into the designated database instance. This results in multiple sources of data that must later be consolidated before they can be exported or analyzed.
The Preclarus central lab database receives study data from all sources worldwide through web-based portals and apps. This data is automatically consolidated as it enters via the various data portals.
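The single-intake pattern can be sketched in a few lines of Python. This is a conceptual illustration only (the class and field names are invented, not part of Preclarus): because every portal writes into the same store at ingest time, there is no downstream consolidation step.

```python
class CentralStore:
    """Minimal sketch of a single global intake: every regional portal
    writes to the same store, so data is consolidated on entry rather
    than merged later."""

    def __init__(self):
        self.records = {}

    def ingest(self, source, record):
        # Key each result by subject, visit, and test; tag its origin.
        key = (record["subject_id"], record["visit"], record["test"])
        self.records[key] = {**record, "source": source}

store = CentralStore()
store.ingest("portal-emea", {"subject_id": "S001", "visit": 1,
                             "test": "HbA1c", "value": 5.4})
store.ingest("portal-apac", {"subject_id": "S002", "visit": 1,
                             "test": "HbA1c", "value": 5.9})
print(len(store.records))  # 2
```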
Supporting that aggregation is PPD’s new project management (PM) dashboard, which transfers information from the Preclarus central lab database to provide a visual representation of a clinical trial’s progress. In addition to offering context for decisions, the dashboard helps keep studies on track operationally and supports the project team’s efforts to manage the study budget. (For more information on the PM dashboard, see the BioPharma Dive article, “PM dashboard provides visual overview of clinical trials.”)
Operational study data delivery
With a multiple database infrastructure design, datasets must be normalized during trial execution, then consolidated or merged. Due to the complexity and risk associated with this activity, the provisioning of data to sponsors and trial stakeholders traditionally occurs at certain milestones.
During the execution of a trial with a single global database model, data and results can be immediately provided on demand through sponsor and site portals. For example, the Preclarus investigator site portal can provide reports and visibility into the results of samples tested regardless of which lab location performed the testing. The data doesn’t need to be compiled from multiple sources, so there aren’t inherent delays imposed to access information. (For more information on the Preclarus investigator site portal, see the BioPharma Dive article, “Investigator site portal reduces errors and improves communication for more efficient clinical trial management.”)
Following the merger or migration of multiple databases, additional data integrity steps are required to maintain certain levels of quality assurance. The slightest misstep during a migration process could be disastrous for a clinical trial because of the potential for introducing inaccuracies in subject test data. This could raise additional questions surrounding the quality of broader data collection during the trial execution process.
Further, during a study it’s possible for changes to occur in one of the decentralized databases that didn’t occur in the other disparate databases. When the multiple sources of data converge, not only is there a potential for conflict, but there’s a significant risk that data will be lost or overwritten with incorrect data.
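The convergence risk described above can be made concrete with a short sketch (again illustrative, with hypothetical data): if a correction is applied in one regional database but not the others, the records diverge, and a naive merge would silently keep one version and discard the other.

```python
from collections import defaultdict

def find_divergences(databases):
    """Given {region: {record_key: record_dict}}, report keys whose
    field values differ between regions -- exactly the conflicts that
    surface when decentralized databases are merged."""
    seen = defaultdict(dict)  # record_key -> {region: record}
    for region, records in databases.items():
        for key, rec in records.items():
            seen[key][region] = rec
    conflicts = {}
    for key, by_region in seen.items():
        values = list(by_region.values())
        if any(v != values[0] for v in values[1:]):
            conflicts[key] = by_region
    return conflicts

# Hypothetical scenario: a result corrected in EMEA only.
emea = {("S001", 1): {"result": 5.6, "units": "%"}}  # corrected value
apac = {("S001", 1): {"result": 5.4, "units": "%"},  # original value
        ("S002", 1): {"result": 6.1, "units": "%"}}
conflicts = find_divergences({"emea": emea, "apac": apac})
print(list(conflicts))  # [('S001', 1)]
```

Every conflict found this way demands manual adjudication before database lock; with a single global database, the correction is applied once and there is nothing to diverge.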
Leveraging a single global database architecture, like the one used by PPD Laboratories’ central lab, eliminates many of these potential concerns. The need for exhaustive discrepancy management is reduced as there are no mergers or migrations of data. Every field collected during the trial flows directly into a centralized repository that can be exported and locked following the successful close of a trial.
Clinical trial data management success
In summary, multiple disparate databases are common throughout the clinical trials industry. Using a variety of databases to store clinical trial data can lead to additional operational challenges and introduce significant risk when trying to perform data management activities.
The single global database architecture used by PPD Laboratories’ central lab demonstrates that real-time, on-demand access to data throughout the clinical trial lifecycle is possible. Using a centralized data model for all lab locations, study setup and verification complexities are reduced. This method increases confidence levels surrounding data quality and integrity, producing a higher likelihood of success for the clinical trial. To build upon its achievements, PPD Laboratories is developing an app that will enable the company to communicate more efficiently with its clients, investigators and clinical trial managers, ultimately providing even greater value to its customers. (For more information on the mobile app version of the Preclarus investigator site portal, see the BioPharma Dive article, “There’s an app for that: Busy investigators can now manage clinical trials on the go.”)
Brent Reed is director of information technology for PPD’s global central, bioanalytical and vaccines labs.