Translational medicine, a natural extension of interventional epidemiology aimed at shortening the time needed to move treatment protocols from “bench to bedside,” is an emerging field with a great deal of scientific ground still to cover. Although a number of statistically significant findings have been made in recent months, the organizational technology that supports those findings is often as essential to developing solutions as the research itself.
Take, for example, the recent translational medicine work described in an August 2012 Bioscience Technology1 article. A team of Stanford University School of Medicine researchers has discovered a possible link between the peptide beta-amyloid (also known as A-beta) and multiple sclerosis (MS). A-beta is a “despised molecule” in the world of medicine, considered one of the key culprits in Alzheimer’s disease. According to Lennart Mucke, MD, a veteran Alzheimer’s researcher at the Gladstone Institute of Neurological Disease in San Francisco, although A-beta is made constantly in the human body and its toxicity in the brain is well established, “its normal function remains to be identified.”
When the Stanford study’s senior author, Lawrence Steinman, MD, decided to test the role of A-beta in MS using a mouse model that mimicked an important feature of the disease, the autoimmune attack on myelinated sections of the brain, he expected injections of the peptide to have, at best, no effect and, at worst, to cause significant damage. Instead, symptoms improved. The researchers chose to inject A-beta into the mice’s bellies rather than their brains, relying on the link between immune events outside the brain and neurological events within, and discovered that in mice whose immune systems had been conditioned to attack myelin, the addition of A-beta delayed or even prevented the onset of characteristic MS paralysis altogether. Steinman believes “there probably is a multiple-sclerosis drug in all this somewhere down the line.”
It is this getting “down the line,” however, that can slow or even halt translational medicine efforts like the one at Stanford or the blood stem cell transplant work being done at the University of Washington2. To ensure that large volumes of data are not only stored but also catalogued and handled properly, a number of scientific equipment firms, such as Thermo Fisher Scientific3 and Labtronics4, have developed what are known as laboratory information management systems, or LIMS.
The goal of a LIMS is to provide a link between biobanks, universities, pharmaceutical companies, and CROs, acting as a centralized repository for data. These systems let researchers query demographic and sample type information, remain in compliance with HIPAA regulations, prioritize work, and access laboratory data from a remote terminal. Because of the cross-disciplinary nature of translational medicine, it is crucial that any lab system standardize its reporting and collection practices; a LIMS allows easier communication between partner agencies and government regulatory bodies such as the FDA. And as the recent Stanford studies show, it is not only the number of biospecimens collected that aids drug development and treatment efforts but also their quality; the result is a large volume of high-value samples that must be catalogued correctly and must carry well-documented chains of custody.
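The record-keeping described above can be illustrated with a minimal sketch in Python. This is not the schema of any particular LIMS product; the class names, fields, and functions are hypothetical, chosen only to show the two ideas the text mentions: de-identified sample metadata that can be queried by type, and an append-only chain-of-custody log for each specimen.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Specimen:
    """A hypothetical biospecimen record as a LIMS might store it."""
    specimen_id: str
    sample_type: str           # e.g. "serum", "tissue"
    donor_age_group: str       # de-identified demographic field, not a name or birth date
    custody_log: list = field(default_factory=list)

    def transfer(self, from_party: str, to_party: str) -> None:
        """Append a custody event; entries are added, never edited or removed."""
        self.custody_log.append({
            "from": from_party,
            "to": to_party,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

def query_by_type(inventory: list, sample_type: str) -> list:
    """Return all specimens of a given sample type, as a remote query might."""
    return [s for s in inventory if s.sample_type == sample_type]

# Register two samples, record a hand-off, and run a query.
inventory = [
    Specimen("S-001", "serum", "40-49"),
    Specimen("S-002", "tissue", "60-69"),
]
inventory[0].transfer("collection site", "central biobank")
print([s.specimen_id for s in query_by_type(inventory, "serum")])  # ['S-001']
```

The append-only log is the key design choice: because custody entries are timestamped and never rewritten, the record doubles as the audit trail that regulators expect to accompany each sample.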
Translational medicine, whether part of a university research endeavor, a private study, or a corporate scientific project, can no longer rely on in-house technology to keep samples secure and audit trails accurately compiled. Just as “big data” has made its way into the world of technology, huge numbers of samples have become standard in the medical industry, necessitating electronic oversight to help eliminate human error.
A LIMS can be viewed as the second, perhaps silent half of research, and in many respects the conduit through which bench data travels to bedside treatment. Although front-line study is the first phase of any translational medicine effort, it requires robust underpinning in the form of laboratory management to help ensure samples and data are properly handled along the road to practical application.