22 Sep 2020

Edge Computing: How New Technology Energizes This Old Concept

By Christophe Dore and Robert Cohen

Edge computing may seem like a new technology trend but, in reality, the concept has been with us since the 1980s. Simply explained, edge computing brings data processing as close as possible to the end user to reduce network latency and bandwidth demands. It works like this: instead of data being sent from a laptop or other networked device to faraway centralized servers, processed, and then delivered back to the user (a process that can be both time-consuming and expensive), most of the data access and analysis occurs locally, at the user’s location or a nearby network node, so the user can act on it more quickly.
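The latency argument above comes down to simple arithmetic: processing time is the same either way, but the centralized model pays the network round trip on every request. A rough illustration, using made-up figures rather than measured ones:

```python
def response_time_ms(network_round_trip_ms, processing_ms):
    """Total time to answer one request: network travel plus processing."""
    return network_round_trip_ms + processing_ms

# Illustrative numbers only: assume a distant data center adds ~100 ms of
# round-trip network delay, while an edge node on the local network adds ~2 ms.
cloud = response_time_ms(network_round_trip_ms=100, processing_ms=20)
edge = response_time_ms(network_round_trip_ms=2, processing_ms=20)
print(cloud, edge)  # prints: 120 22
```

With these assumed figures, moving the computation to the edge cuts the response time by over 80% even though the processing itself is no faster.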

It’s a computing model that supports efficiency and timely action. Perhaps most impressively, edge computing – although the term did not exist at the time – was used to take U.S. astronauts to the moon and return them safely home. Astronauts on the Apollo missions did not have the luxury of waiting for the data they entered into the Apollo Guidance Computers (AGCs) aboard the spacecraft and lunar module to be processed by the mainframe at Mission Control in Houston and sent back. The few seconds such a round trip took was still too long when piloting a spacecraft. The crew often needed immediate information on their location for navigation and on the status of the spacecraft’s various systems, even while exchanging status updates and receiving orders from NASA on the ground. That meant relying on the AGCs in the same vessel as the astronauts to support the decisions that would take them to the moon and back to Earth in one piece.

For those astronauts, the primary obstacle was the vast distance between Earth and the moon, and the time data took to travel between them. Fast-forward 51 years, and the same computing model is being used to tackle the challenge of scale. The mainframe computer in Houston that helped guide the Apollo 11 astronauts to the moon could perform several hundred thousand operations per second, which pales in comparison to the five trillion (with a T) operations per second of the iPhone 11’s chip. So while edge computing itself may not be new, the capabilities available today – thanks to faster processors, greater memory and advanced software – are worthy of attention across many industries, including healthcare.

The Medical Device Quandary

Healthcare is an ideal industry for adopting more edge computing into its workflows, given the growing number of monitoring and other devices in use at the point of care. The large and aging Baby Boomer population is expected to create a shortage of hospital beds. Hospitals need to shorten length of stay as much as possible to maximize capacity, and that requires detecting patient deterioration and launching an intervention as soon as possible. Medical devices – especially connected ones – are helping hospitals identify and even predict health deterioration. As such, the medical device industry is expected to grow by more than 25% in the next four years, according to Wolters Kluwer.

Connected medical devices provide crucial surveillance, efficiently tracking a patient’s health trajectory based on multiple vital signs collected over time. When medical devices are not connected, however, hospitals lose out on valuable data for decision-making and identifying trends. And even when devices are connected, if the data must be analyzed on a server and then delivered to a computer workstation, the insight can be delayed – or overlooked entirely if it is not easily accessible at the point of care and at the moment of care.

Edge Computing: Insight Where and When Needed

When the clinician is with the patient, logging into the electronic health record or separate application to view health trends can be distracting and an inconvenience to both parties, especially when the workstation has to stay in the corridor. Clinicians want that information at the bedside. That’s where edge computing comes in.

Thanks to the greater speed, power and capabilities of today’s devices, the insights available to clinicians at the point of care are abundant – but delivering them does not always require a client-server architecture, nor is one always feasible. Patient transport, for example, can be a high-risk situation depending on the patient’s status. Clinicians still want real-time, intelligent alerting and information about that patient, but they may be between facilities, in an elevator, or in an ambulance, where a server connection is inconsistent or impossible. Edge computing keeps the clinician fully informed about the patient’s status while caching the collected data during transport for later storage once a server connection is regained.
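The transport scenario described above is a classic store-and-forward pattern: analyze each reading locally the moment it arrives, cache it while the network is down, and drain the cache once connectivity returns. The sketch below is purely illustrative – the class name, fields, and alert threshold are hypothetical, not part of any real medical device API:

```python
from collections import deque


class EdgeCache:
    """Minimal store-and-forward sketch (hypothetical names and thresholds)."""

    def __init__(self):
        self._pending = deque()  # readings cached while offline
        self.connected = False

    def record(self, reading):
        # Analysis happens locally, immediately, regardless of network state.
        # Placeholder rule for illustration only -- not clinical guidance.
        alert = reading["heart_rate"] > 120
        self._pending.append(reading)  # cache for later upload
        return alert

    def flush(self, upload):
        # Once a server connection is regained, forward the cached readings.
        sent = 0
        while self.connected and self._pending:
            upload(self._pending.popleft())
            sent += 1
        return sent


cache = EdgeCache()
cache.record({"heart_rate": 130})  # analyzed at the edge while offline
cache.record({"heart_rate": 80})
cache.connected = True             # connection regained after transport
stored = []
print(cache.flush(stored.append))  # prints: 2
```

The point of the pattern is that the clinician-facing analysis in `record` never waits on the network; only the archival step in `flush` does.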

Enterprise-Wide Data Resiliency

Capsule Technologies’ Medical Device Information Platform (MDIP) offers these edge computing capabilities. Through Capsule’s Neuron 3 hub within the MDIP, up to nine devices can be connected to a single Neuron device to capture and analyze data regardless of network connection. With a long battery life and extended caching capability, medical devices can stay connected for several hours, depending on the configuration, without data loss during transport, power outages, or network failures – all while offering clinicians the real-time data and intelligence they need to care for the patient.

While perhaps not as impressive as flying to the moon, edge computing can be a small step forward for hospital clinical workflow efficiency, but a giant leap for patient safety and care quality. Connect with a specialist and learn more about how the edge computing capabilities of Capsule MDIP can transform the delivery of care at your facility.


Christophe Dore is the Cybersecurity Manager at Capsule Technologies.

Robert Cohen is the Senior Product Manager for Edge Computing at Capsule Technologies.