Digital transformation in healthcare and clinical research has progressed more slowly than in other economic sectors, such as consumer goods and communications. Healthcare is heavily regulated and typically slow to adopt innovations. Against this landscape, how do we ensure that digital health technology (DHT) platforms meet regulatory requirements while staying agile and current with state-of-the-art digital and data science techniques?
What are the fundamental requirements for a fit-for-purpose DHT platform in a regulated environment? It must handle large volumes of raw sensor data, provide an auditable trace of each processing step, ensure reproducibility, and support version control. When dealing with data collected in an uncontrolled real-world environment, the platform must have a consistent and robust way to handle gaps in the data streams, varying sampling rates, duplicates, and other issues that inevitably appear in passive and continuous sensor data. This level of rigor and complexity is typically not met by platforms designed for academic research or consumer applications.
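To make the data-quality requirement concrete, the sketch below shows the kind of cleanup step such a platform must apply to a raw stream. It is illustrative only, not CentrePoint's actual pipeline: the column names, the 32 Hz target rate, and the decision to flag rather than interpolate gaps are all assumptions made for the example.

```python
# Illustrative cleanup of a passively collected accelerometer stream.
# Assumes a DataFrame with a "timestamp" column and numeric x/y/z
# columns; none of this reflects CentrePoint's real implementation.
import pandas as pd

def clean_sensor_stream(df: pd.DataFrame, target_hz: int = 32) -> pd.DataFrame:
    """Deduplicate, regularize, and gap-flag a raw sensor stream."""
    # Duplicate samples are common when devices retransmit data.
    df = df.sort_values("timestamp").drop_duplicates(subset="timestamp")
    df = df.set_index("timestamp")

    # Resample onto a fixed grid so downstream algorithms see one
    # consistent sampling rate, whatever the device actually produced.
    period = pd.Timedelta(seconds=1 / target_hz)
    regular = df.resample(period).mean()

    # Flag gaps instead of silently interpolating across them, so the
    # audit trail records exactly where data were missing.
    regular["is_gap"] = regular["x"].isna()
    return regular
```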
ActiGraph's CentrePoint platform is optimized for this specific type of environment. Every day, CentrePoint collects, stores, processes, and provides access to gigabytes of continuous sensor data captured passively by our wearable devices in regulated clinical trials. These data include raw accelerometer and gyroscope data, temperature, and much more. The technology team at ActiGraph has historically focused on the fundamental requirements for data use in a regulated environment and implemented a small set of validated and commonly used algorithms.
This focus on robustness, however, comes at the cost of agility. With new measures and algorithms published every day, that lack of agility becomes a limitation of the platform. In addition, fit-for-purpose science often demands multiple variations of an algorithm for different clinical populations.
Recognizing these difficulties, ActiGraph embarked on a revamp of its data processing back-end system. This ambitious project modularized our infrastructure, allowing us to decouple the execution of endpoint-extraction algorithms from the rest of the pipeline (data ingestion, filtering, resampling, etc.).
With these changes in place, it is no longer necessary to implement each algorithm in a specific language or with any specific technology. Algorithms can now be packaged in containers: lightweight, standalone, executable units of software that encapsulate an implementation and all of its dependencies, and that can run independently on almost any machine. These containers are inserted into the data processing pipeline, where they receive the raw data as files on disk and write their output to files that ActiGraph's processes then parse and store automatically.
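The sketch below illustrates what such a containerized algorithm's entry point might look like. The /data paths, the CSV input format, and the toy endpoint are all assumptions made for the example; they are not the documented CentrePoint interface.

```python
# Hypothetical entry point for a containerized algorithm: read raw
# sensor files from a mounted input directory, compute an endpoint,
# and write results where the platform can parse and store them.
import json
import pathlib

import pandas as pd

INPUT_DIR = pathlib.Path("/data/input")    # mounted by the pipeline (assumed path)
OUTPUT_DIR = pathlib.Path("/data/output")  # collected by the pipeline (assumed path)

def main() -> None:
    for raw_file in sorted(INPUT_DIR.glob("*.csv")):
        df = pd.read_csv(raw_file, parse_dates=["timestamp"])

        # Toy endpoint: mean accelerometer vector magnitude per day.
        # A real container would run a validated algorithm here.
        magnitude = (df[["x", "y", "z"]] ** 2).sum(axis=1) ** 0.5
        daily = magnitude.groupby(df["timestamp"].dt.date).mean()

        out_path = OUTPUT_DIR / f"{raw_file.stem}_endpoints.json"
        out_path.write_text(json.dumps({str(day): float(v) for day, v in daily.items()}))

if __name__ == "__main__":
    main()
```

Because the contract is simply files in and files out, the same container runs unchanged on a laptop during development and inside the production pipeline.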
Furthermore, these containers are versioned, meaning that new versions of an algorithm can be introduced without impacting ongoing studies. The input and output of each algorithm, as well as the algorithm itself, are described in metadata, so the platform can provide a human-readable description of how collected data are transformed into a digital endpoint.
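A descriptor along the following lines conveys the idea; the field names, the semantic-version scheme, and the notion of pinning a study to an exact version are illustrative assumptions, not CentrePoint's actual schema.

```python
# Sketch of versioned algorithm metadata. Everything here is
# hypothetical and only illustrates the concept described above.
from dataclasses import dataclass

@dataclass(frozen=True)
class AlgorithmDescriptor:
    name: str
    version: str                # new versions are added, never overwritten
    description: str            # human-readable summary of the transformation
    inputs: tuple[str, ...]     # raw data files the container reads
    outputs: tuple[str, ...]    # endpoint files it writes

REGISTRY = {
    ("step_count", "1.2.0"): AlgorithmDescriptor(
        name="step_count",
        version="1.2.0",
        description="Daily step counts derived from raw wrist accelerometry.",
        inputs=("accelerometer_raw.csv",),
        outputs=("daily_steps.json",),
    ),
}

def descriptor_for_study(pinned: tuple[str, str]) -> AlgorithmDescriptor:
    """An ongoing study stays pinned to the exact version it started
    with, so releasing step_count 1.3.0 cannot change its results."""
    return REGISTRY[pinned]
```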
ActiGraph's Science team is an internal client of this project, using it to quickly integrate new capabilities into our system. It is now possible for all our clients to package their desired algorithms inside containers and have them run on their data in a secure, compliant manner. This is the beginning of an exciting era for ActiGraph: we can now offer our customers all our expertise and operational excellence in collecting and shepherding the data, without constraining them to specific endpoints or algorithms.
Contact us now to learn more about our fit-for-purpose DHT platform.