Thoughts on the EMA Guideline on computerised systems and electronic data in clinical trials

Introduction

The European Medicines Agency (EMA) GCP Inspectors Working Group finalized its Guideline on computerised systems and electronic data in clinical trials on 7 March 2023, expanding on the position outlined in its 2010 'Reflection Paper on expectations for electronic source data and data transcribed to electronic data collection tools in clinical trials'. The Guideline has an effective date of 7 September 2023, and Sponsors are expected to have implemented appropriate practices and procedures to comply with this document from that date.

In this paper, 2Richards have selected salient points from the Guideline for further consideration by the reader.

Please note that this is our assessment of the potential impact to study conduct and data management. It is the responsibility of the reader to review the Guideline in its entirety to determine appropriate actions to be taken, if warranted.

Comments on selected aspects of the Guideline

Scope

This section clarifies that this Guideline applies to any electronic system used for data capture or manipulation or for the analysis and reporting of data. This includes electronic medical records, EDC, instruments, devices, software as a service, laboratory systems, portals, clinical trial management systems, eTMFs, safety systems, etc. Essentially, if the electronic system contains any study data or it is used to manage the study, it’s in scope! This may be a change in perspective for many organizations that have historically excluded systems such as their CTMS or Investigator Portal from full validation on the grounds that “it doesn’t contain data directly submitted to a Regulator.” Based on the stated scope of this Guideline, this position will not be deemed acceptable by EMA inspectors.

There is focus on the definition and control of source data, including a reminder to the user that this may include emails, spreadsheets, AV files, images, and database tables. Our recommendation is that all source data be defined and considered in an overall data flow map.

The Guideline emphasizes that robust data governance principles must be in place to address data ownership, control, training, and archiving. It is clearly stated that lack of integrity is deemed to be equivalent to data loss/destruction. In accordance with the principles of ICH E6 and the EU-CTR, a risk-based approach to management of computerized systems should be used.

Responsible Party

The term Responsible Party is introduced in Section 4.2 of the Guideline. While specialist activities may be delegated, the parties responsible for computerized systems are the Sponsors and Investigators. This is emphasized in Annex 1, which specifies that executed agreements documenting the assignment of tasks and responsibilities must be in place prior to provision of services or systems. Perhaps the most significant point of note here is that greater responsibility is placed on Investigators to ensure that systems they are using are compliant with EMA expectations. Given that Investigators are unlikely to have computerized system validation expertise, or indeed access to such expertise, it is less clear how compliance with this expectation will be secured and/or evidenced. This is an area where we would recommend increased Sponsor scrutiny, and perhaps support from Sponsors to Investigators.

System Validation

One area specifically called out is that if a vendor’s process is being used to validate a system, the Responsible Party should have assessed the vendor’s process to deem it suitable and the vendor must allow inspectors access to this assessment. The assessment may include an audit which, in turn, may be subject to review by EMA inspectors. Allied to this is an expectation that the agreement with the vendor requires that validation documentation be available even if the system is no longer available or the system vendor is no longer in business. The Guideline is also clear that if a vendor will not allow access to appropriate documentation, then the vendor should not be used. We have often experienced issues with vendors being unwilling to share in-house documentation for core system validation and we recommend this be addressed by Sponsors in their contracts with vendors.

Regarding specifics for testing, the Guideline unsurprisingly notes that validation should be traceable to user requirements and that systems should be routinely reviewed to ensure that a validated state is maintained throughout the system lifecycle. It also reiterates that testing documentation should include the actual results of testing, not merely a simple “pass/fail” indication. In our experience, this is not universally adopted by software vendors, many of whom record actual results only for failed test steps, not when test items function as expected.

Metadata

Section 4.3 of the Guideline opens with three short but critical sentences relating to electronic systems: “Electronic data consist of individual data points. Data become information when viewed in context. Metadata provide context for the data point.”  These points are reemphasized throughout the Guideline, for example: “Metadata form an integral part of the original record.” “Without the context provided by metadata, the data have no meaning.” “Loss of metadata may result in a lack of integrity and may render the data unusable.”  In our opinion, these statements form the crux of validation concepts but, in our experience, metadata mapping and audit trail management have been significant challenges for many organizations to defend during EMA inspections, especially when system upgrades and data or system migrations have occurred.

In accordance with ALCOA++ principles, the EMA underlines the importance of data traceability and the timeliness of data collection. There is an expectation that source data are minimally processed, and that data are collected as close to real-time as practicable. The Guideline makes it clear that the Investigator should review and sign off on their data in a timely manner, reminiscent of vote 0500203 of the EFG-05 on 17‑Feb-2020, which stated (our translation): “The lack of implementation of electronic signatures for regular data verification in the eCRF is therefore a serious deficiency for which the sponsor is responsible (ICH-GCP 5.1.1 and 5.1.3)”.  In our experience, timeliness of data collection and the transfer of data from devices to the clinical data repository has been subject to criticism in many EMA GCP inspections.

Linked to the timeliness of data collection is an expectation that system timestamps not only be unmodifiable, but also that they be independently generated through an external timestamp. This can be challenging, especially with wearables or when study participants are using their own devices, and is something that needs to be considered when evaluating system suitability. In attempting to address differences in time zones, we have seen both manual and programmatic manipulation of timestamps by Sponsors, with inevitable confusion and concern arising during inspection.
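A less fragile alternative to rewriting timestamps is to normalize to UTC while preserving the original local reading and its offset as metadata. The sketch below is purely illustrative; the `record_timestamp` helper and its field names are our own hypothetical example, not anything prescribed by the Guideline:

```python
from datetime import datetime, timezone, timedelta

def record_timestamp(local_dt):
    """Store the UTC equivalent alongside the original local reading,
    preserving the device's offset as metadata instead of rewriting it."""
    if local_dt.tzinfo is None:
        raise ValueError("timestamp must carry timezone information")
    return {
        "utc": local_dt.astimezone(timezone.utc).isoformat(),
        "original": local_dt.isoformat(),  # retains the device's local offset
    }

# hypothetical example: a wearable reading taken in a UTC+2 time zone
rec = record_timestamp(datetime(2023, 9, 7, 14, 30,
                                tzinfo=timezone(timedelta(hours=2))))
```

Because both representations are kept, reviewers can reconcile data across time zones without any value ever being altered after collection.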

The Guideline places emphasis on the importance of a system’s integrated audit trail being complete and subject to regular review for data anomalies, inconsistencies, and unusual activity. While key data collection systems, such as EDC, IRT and safety databases usually include appropriate audit trails, this can be a weak point in other systems such as CTMS, communication portals, datasets from smaller vendors (often .xls) and in-house trackers. It is noteworthy that the requirement for audit trail access and review also extends to the Investigator, for example reviewing eDiary audit trails as well as eDiary data. Sponsors are encouraged to take a more holistic view of audit trail functionality, access and review as part of system design and selection.

System Access & Use

The Guideline outlines in sections A2.6 and A2.7 the expectation that both the final system validation report prior to release to production and the formal release prior to initial use are approved by the Responsible Party. This implies that the Sponsor has access to individual(s) with appropriate expertise to perform these reviews and confirm acceptability.  Historically we have seen some organizations (often smaller Sponsors) inappropriately delegate these tasks to the application vendors. Such an approach is likely to cause inspection issues in the future.

EMA often focuses on the need for duties and assignment of responsibilities to be documented so it is not surprising that they expect agreements between parties to be in place before a system is used. However, there is an additional expectation that such agreements include confirmation that vendor personnel understand GCP, data protection and quality systems requirements. This is an area that is often overlooked by Sponsors, especially for software vendors providing bespoke, niche systems.

In keeping with a holistic approach to data management, the Guideline stresses the importance of user training (including study participants, when using Subject-facing systems) and control of system access. There should be regular reviews of system and data access, with particular focus on who can change data or see unblinded (or, for open-label studies, aggregate) data. Although we typically assume that unique usernames and passwords will be required for system access, we recommend considering how this may impact the required functionality of shared systems (e.g., those used for patient profile or protocol deviation review). We recommend controls are put in place to ensure individual users can be identified and their activity attributed to them within such shared systems.

Of note is an emphasis on the physical security of equipment in addition to the logical security of computerized systems. This was something routinely considered when systems were maintained on in-house servers, but the growth of cloud-based services and virtual server farms has made this evaluation potentially more difficult. Sponsors will need to determine how to adequately verify and demonstrate that critical data centers meet physical as well as logical security requirements.

An area where we often see inspection challenges is the access to systems once studies have been completed. It is not uncommon for systems to be decommissioned after a study is reported and this may occur many months in advance of a regulatory submission and potential GCP inspection. EMA inspectors will request direct access to systems and audit trails, and it is made clear that systems should be available when needed, with direct access to the investigators, Sponsors, and regulators. This is an area that needs to be considered by Sponsors and addressed in vendor contracts, for example whether systems will remain live with read-only access after study completion, or archived with an expectation that they can be reactivated prior to inspection.

One final point of interest, as if to contextualize their opening comment that the use of computerized systems is not a requirement, EMA states that use of Subject-facing computerized systems should be justified, lest they exclude certain participants and patient groups.

Our Assessment

This Guideline contains a wealth of good practices and commonsense approaches to better ensure data integrity while using computerized systems. Based on a detailed review of this Guideline, we suggest that Sponsors develop processes to assess and document the risk of each computerized system to the integrity of their critical data; this might include:

  • A detailed data flow map. This should clearly indicate how data are first recorded, by whom and in what system, where these systems are located, how these data are transferred from one system to another and how they are analyzed;

  • A list of the risk-based approaches used to ensure each critical system is validated, released to production and maintained in a validated state, with references to all relevant supporting documentation;

  • Processes to periodically review user training, system access and evidence of audit trail review.
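Keeping the data flow map machine-readable makes the first bullet easier to maintain, since the list of in-scope systems can be derived from it directly. The structure below is purely illustrative; the field names and systems are hypothetical assumptions, not a prescribed format:

```python
# hypothetical machine-readable data flow map: where each data stream
# originates, who records it, and every system it passes through
data_flows = [
    {
        "data": "vital signs",
        "recorded_by": "site staff",
        "source_system": "EDC",
        "transfers": [],  # source data entered directly, no onward transfer
    },
    {
        "data": "participant diary",
        "recorded_by": "study participant",
        "source_system": "eDiary app",
        "transfers": ["vendor portal", "clinical data repository"],
    },
]

def systems_in_scope(flows):
    """Every system touching study data -- all in scope per the Guideline."""
    scope = set()
    for flow in flows:
        scope.add(flow["source_system"])
        scope.update(flow["transfers"])
    return sorted(scope)

in_scope = systems_in_scope(data_flows)
```

Deriving the scope list from the map, rather than maintaining the two separately, reduces the risk of a system being validated but missing from the map, or vice versa.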

One outcome of this Guideline becoming effective may be the realization that Sponsors need more validation expertise so that, as the Responsible Party, they are able to assess the suitability of computerized systems to support their clinical development. Another outcome may be that we see greater consistency of computerized systems used during a clinical program. Often, we see different EDCs, IRTs, and ePROs used even between similar Phase III studies. Sponsors might consider committing to specific software applications for the duration of a program, since it is easier to conduct incremental, study-specific build validation than to validate new platforms. In addition, Sponsors now have a stronger argument for requiring even the largest software vendors to be more transparent with their validation documentation.

Conclusion

The ability to demonstrate data integrity through robust computerized system validation and control processes is essential. With this Guideline becoming effective, we anticipate seeing more observations relating to the use of computerized systems and data being issued during EMA GCP inspections. Such observations, in turn, may impact EMA’s opinion on the acceptability of data submitted by applicants in support of marketing authorization.

