The Smart Magazine About Medical Technology Innovations
The Lab Goes 4.0
Clinical laboratories are inherently technological wonders, filled with some of the world’s most advanced innovations. In this issue of MedicalExpo e-Magazine, you’ll learn how they’ll move on to the next stage.
In addition to being automated, next-gen labs will be smart, computerized and mobile. And therefore ready for coming challenges, such as personalized genomics and other elements of precision medicine.
Smartphones have revolutionized our communications, as tweets, text messages, and even occasional phone calls come flying at us. And these devices, actually handheld computers, are starting to change how healthcare is delivered. Smartphones now can replace blood pressure cuffs, thermometers and stethoscopes.
Today we inhabit an increasingly connected world populated by smart homes, self-driving cars and numerous other devices and systems that enhance productivity and quality of life. According to the McKinsey consulting firm, we will be surrounded by 30 billion connected devices by the year 2020.
Yet in laboratories, already filled with highly sophisticated digital equipment, the trend toward interconnectedness has lagged behind the curve. Despite the benefits of connectivity, lab equipment and operations have failed to acquire “smart” status as quickly as many other commercial sectors.
“Everything is still predominantly manual,” says Puneet Suri, vice president of Smart Lab and Digital Science at Massachusetts-based biotechnology company Thermo Fisher Scientific. “We still see handwritten lab notes, plates and consumables tagged with markers and inefficient sample tracking, all of which can lead to experimental error.”
A Connected Imperative
Laboratory equipment is made by a wide range of manufacturers, frequently giving rise to compatibility issues. Laboratory workers often must collect data from one device after another before compiling it manually.
“This is a huge waste of time.”
“This is a huge waste of time,” says Markus Gershater, chief scientific officer at Synthace. The London-based startup is developing Antha, a high-level language and operating system for laboratories and other scientific environments. “It is a waste of the capabilities of scientists, who should be focusing all their efforts on doing science.”
Generating and collecting experimental data by hand leads to operational inefficiencies and loss of key information. According to the scientific journal Nature, the annual cost of irreproducibility in biological research stands at a staggering $28 billion in the US alone.
BioBright equipment utilization Courtesy of BioBright
“A great deal of data is simply not being captured,” says Charles Fracchia, CEO of BioBright, an MIT-Harvard startup working to connect lab instruments with sensors and software. “This information could help scientists understand better what does and doesn’t work in an experiment, and help other scientists reproduce work.”
Synthace Antha screenshot Courtesy of Synthace
Synthace’s Antha software lets scientists specify what they’re trying to study, suggests experiments, compiles reports and controls lab equipment. To overcome the problems associated with heterogeneous equipment, Synthace has developed “drivers” for a diverse range of devices from different manufacturers.
“A particularly powerful function of Antha is running liquid-handling robots,” says Synthace’s Gershater. “These robots can perform the central functions of most experiments, but in practice aren’t often used because of the difficulty of programming them.”
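The driver approach Synthace describes can be pictured as a thin abstraction layer: protocols are written once against a generic instrument interface, and each vendor-specific driver translates those calls into its device's own commands. The sketch below is a minimal illustration of that pattern, not Antha's actual API; the `AcmePlateReader` vendor and all method names are hypothetical.

```python
from abc import ABC, abstractmethod

class InstrumentDriver(ABC):
    """Generic interface: protocols call this, never a vendor's own protocol."""

    @abstractmethod
    def run(self, command: str, **params) -> dict:
        """Execute a generic command and return structured results."""

class AcmePlateReaderDriver(InstrumentDriver):
    """Hypothetical vendor driver: maps generic calls onto device commands."""

    def run(self, command: str, **params) -> dict:
        if command == "read_absorbance":
            wavelength = params.get("wavelength_nm", 600)
            # ...vendor-specific serial/network communication would go here...
            return {"instrument": "AcmePlateReader",
                    "wavelength_nm": wavelength,
                    "values": []}
        raise ValueError(f"Unsupported command: {command}")

def run_protocol(driver: InstrumentDriver) -> dict:
    # The experiment is defined once; swapping manufacturers means
    # swapping the driver, not rewriting the protocol.
    return driver.run("read_absorbance", wavelength_nm=450)
```

Adding support for another manufacturer then means writing one new driver class, while every existing protocol keeps working unchanged.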
Predicting Failures And Cloud-Based Solutions
Predictive maintenance is another area where connected laboratories offer significant benefits.
“Smart laboratory equipment is designed to be self-healing,” says Thermo Fisher’s Suri. “Just like an iPhone, these instruments can update themselves with no overhead or need for a field service engineer. Using analytics, they can even predict failures in advance, preventing equipment downtime.”
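One simple way to predict a failure in advance, as Suri describes, is to watch a sensor reading drift toward a known failure threshold and extrapolate when it will cross it. The toy function below fits a straight line to recent readings and estimates that crossing time; it is a sketch of the general idea only, not Thermo Fisher's actual analytics.

```python
def predict_failure_time(readings, threshold):
    """Fit a line to (time, value) readings via least squares and estimate
    when the trend crosses the failure threshold. Returns None if the
    reading is flat or moving away from the threshold."""
    n = len(readings)
    ts = [t for t, _ in readings]
    vs = [v for _, v in readings]
    t_mean = sum(ts) / n
    v_mean = sum(vs) / n
    denom = sum((t - t_mean) ** 2 for t in ts)
    slope = sum((t - t_mean) * (v - v_mean)
                for t, v in zip(ts, vs)) / denom
    if slope <= 0:
        return None  # not trending toward failure
    intercept = v_mean - slope * t_mean
    # Solve threshold = slope * t + intercept for t
    return (threshold - intercept) / slope
```

With hourly pump-pressure readings of 10, 12 and 14 against a failure threshold of 20, the fitted trend (2 units per hour) predicts a crossing at hour 5, so a service call can be scheduled before any downtime occurs.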
A number of companies are also developing cloud-based solutions for connecting laboratories.
“Purchasing a cloud-based system can help a laboratory achieve significant cost savings,” says Michael Davis, a web architect with Atlanta-based cloud solution developer MediaLab. “It means less on-site hardware to maintain and fewer personnel required to maintain that hardware. The use of on-site solutions is also increasingly risky, especially with the growing threat of cyberattack.”
Other benefits of cloud-based solutions include faster turnaround times for the approval of new policies or procedures, and more efficient and effective training systems for employees.
Towards a Smarter Future
It will be many years before scientific research can be performed entirely by artificial intelligence and robots. Nevertheless, automation can play an increasing role in laboratory workflows requiring defined repetitive tasks and standard operating procedures.
“Enhanced connectivity can drive a transformational shift in the way laboratories operate and add value in many areas,” says Frank Kumli, an executive director in Ernst & Young’s Global Life Sciences division specializing in healthcare delivery innovation. “The fragmented and standalone nature of laboratory devices is both a problem and an opportunity that needs to be addressed now.”
The San Diego, California start-up, Portable Genomics, bridges two potentially transformative forces in health care. It spans an industry keen on accelerating precision medicine and patient advocacy groups pushing for advances in care.
What sets Portable apart is its innovative platform that gives patients rapid,...
The widespread expansion of electronic medical records, mobile technology and big data has led to a sea change in the practice of medicine. Once centralized at hospitals and health systems, diagnostic testing is increasingly done at the point of care, with diagnosis and treatment more rapid than ever thought possible.
Point-of-care testing (POCT) is rapidly spreading throughout health systems in the United States, according to Dr. David McClintock, director of Point of Care Testing at the University of Chicago Hospitals. McClintock spoke at the EuroMedLab conference in June, offering attendees a snapshot of the nascent discipline of pathology informatics and its relationship to point-of-care testing.
Breaking down this specialty, McClintock defined informatics as the “application of science of information as data plus meaning to biomedical issues.” It’s a very specialized discipline that requires designing and implementing “novel quantitative and computational methods that solve challenging problems across the spectrum of biology and medicine,” he added.
Turnaround Time Savings
Pathology informatics uses the processed data to provide actionable information. “The main gain for doing lab testing at the bedside or near the patient is the turnaround time savings,” said McClintock. For example, most lab tests sent STAT to central labs take a minimum of 30 to 90 minutes. This excludes the time it takes from the physician order (electronic, written or oral) to specimen collection to transport to the lab.
“Some tests performed in less than ten minutes.”
“By doing POCT, that time can be drastically shortened, with some tests performed in less than ten minutes,” he explained. Doing a cardiac troponin test in an emergency room, or a creatinine/eGFR measurement to assess the risk of contrast-induced nephropathy, are POC tests that improve patient outcomes and save patient time.
Doctor discussing with patient over digital tablet at the hospital
Large health systems are streamlining procedures via POCT. For example, McClintock pointed to University of Chicago Medicine’s adoption of a rapid “decision to admit” process that is based on point-of-care testing to expedite lab and radiology work. This permits emergency room physicians to assess and triage their patients more quickly.
Clinical informatics only became a board-certified medical specialty in 2011. Numerous hurdles need to be overcome before best practices are known. Wireless mobile technologies, electronic health records and mobile devices vary in their capabilities and capacity to interact. In addition, laboratory information systems were designed before the advent of wireless mobile communications. Multiple changes are expected in the years ahead, as pathology informaticists work to maximize functionality, interoperability and actionability at the point of care.
Expanding Array of POC Tests
Many tests are POC-compatible. McClintock mentioned glucose, pregnancy, dipstick urinalysis and PT/INR warfarin monitoring as the most common and important tests likely to provide invaluable immediate feedback. In the infectious disease domain, PCR tests now make it possible to get extremely sensitive and specific results in the office. This leads to quicker treatment for patients, and to less overprescribing of antibiotics in cases where preliminary test results are negative but the physician suspects lab testing will yield a positive.
Point-of-care testing also proves valuable in providing rapid diagnostics and data in the fight against epidemics, such as Ebola, Zika, chikungunya and dengue. In parts of the world where healthcare infrastructure is especially wanting, wireless mobile testing units are extremely helpful in quarantine and disease control.
Nîmes University Hospital in southern France launched its own medical device assessment center (IDIL) in 2016. The center recently organized a congress to discuss medical devices and how open innovation can further their development. We interviewed Thierry Chevalier, the doctor in charge of IDIL.
MedicalExpo e-Magazine: Why create a medical device evaluation unit? What does it offer you?
Dr Thierry Chevalier: The Medical Device Evaluation Institute (IDIL) is a new entity created in 2016 at Nîmes University Hospital. It’s a research unit where expert clinicians can evaluate medical devices.
IDIL grew out of a simple observation: proper evaluation of medical devices must be developed from scratch. There are several reasons for this. The range of devices is enormous, from sterile compresses to artificial hearts. In addition, the surgeon’s expertise plays an important role. And we must carry out long-term evaluations—5, 10 or 20 years.
ME e-Magazine: What’s the biggest need for proper evaluation of medical devices?
Dr Thierry Chevalier: The right methods. There are few guidelines for medical devices. Structures like ours will contribute to creating an evaluation process.
ME e-Magazine: You stress long-term evaluation. What’s changed in this area, especially with the emergence of connected objects?
A continuous flow of data will greatly improve these studies
Dr Thierry Chevalier: In the context of medical device epidemiology, we follow defined cohorts over the long term. A continuous flow of data will greatly improve these studies. The ability to receive information from a connected object at any time will be essential. For example, there are teams working on RFID chips implanted in hip prostheses.
ME e-Magazine: During the congress, the open innovation concept was highlighted. Can you explain what that means and how it applies to medical devices?
Dr Thierry Chevalier: Innovation in medical devices means having the right idea at the right time, one that meets an unfulfilled therapeutic need. That requires using multiple technologies, materials, electronics, etc. It’s at the intersection of three worlds: patients, doctors and engineers.
Open innovation is the idea that anyone can be an inventor. Anyone can come up with ideas that “overly structured” people who spend their days in an R&D lab won’t have.
ME e-Magazine: Is there a standout example?
Dr Thierry Chevalier: Connected objects. Most of the inventors come from the IT engineering world and offer truly new solutions. Of course, that raises evaluation questions: How to ensure device reliability? How to guarantee security? We have to invent the evaluation methods.
Doctors who suspect that a patient has had a myocardial infarction or is suffering from another heart disorder request a blood test to measure troponin, a protein released into the bloodstream by damaged heart muscle.
According to Lars Halvor Langmoen, CEO of SpinChip, an Oslo point-of-care (POC) start-up, this presents a dilemma. Doctors can order a POC troponin test taking 10-20 minutes, but offering unreliable results. Or they can wait an hour or two to get reliable results from a lab. And for cardiologists, time is muscle. The longer the wait, the greater the risk of further heart damage.
“Doctors in hospitals here in Norway are generally not comfortable using point-of-care instruments because in so many cases they can’t detect heart attacks. They basically don’t trust the level of detection they have today,” said Langmoen.
Like Coffee Cartridges
Prototype instrument Courtesy of SpinChip
Langmoen was part of management at solar manufacturer NorSun, Norway’s fastest growing technology company. He likens the SpinChip business model to the one that led to coffee makers using cartridges for different coffee flavors.
The secret is in the proprietary system developed by SpinChip founder and POC guru Stig Morten Borch. It provides “high-sensitivity” tests that can measure low concentrations of troponin and other targets.
SpinChip uses dual-axis centrifugation of the assay cartridges. The closed assay cartridge is fed into the instrument, automatically positioned and locked off-center in the rotor disc. The orientation of the cartridge relative to the centrifugal force is altered bi-directionally while spinning.
The SpinChip instrument contains two complementary optical readout systems. The spectral system plays a key role in real-time control of cartridge orientation and content. It also uses surface reflectance to read color intensity and transmittance to determine the optical density of liquids. The fluorescence readout system allows for high-sensitivity measurements.
“The combination of two complementary readouts makes the platform cover a wide sensitivity range and [offers] broad flexibility in the use of assaying readout principles,” said Langmoen.
“Our system forces reactions to happen more quickly, with better quality and sensitivity. It makes it possible to perform a broader range of analyses using one platform, and to transfer analyses from laboratories to point of care without loss in quality. All analyses are performed within a few minutes using a small droplet of blood and at a unit cost significantly lower than competing point-of-care platforms.”
Next-Gen Rapid Testing
SpinChip has its eye on next-generation, rapid point-of-care tests that deliver results comparable to labs.
We try to do the most challenging first.
“Troponin is the Holy Grail,” said Langmoen. “There is a lot of money in troponin analysis. If you can do troponin at POC at the same level as labs, you can do anything. It’s very, very attractive. We try to do the most challenging first because then we know we can do the rest.”
The company previously worked on a quick two-minute test for C-reactive protein (CRP), a marker of inflammation.
Since hospitals are looking to buy a portfolio of heart tests, SpinChip expects to start development in 2018 of a test for NT-proBNP, a peptide used to help detect, diagnose, and evaluate the severity of heart failure.
SpinChip aims for market introduction in late 2019. Other tests are expected to follow.