In an ideal world with information technology interconnectivity, device interoperability and universally implemented service and supply data standards, a patient's journey through the healthcare facility, from admission to the inpatient room to the surgical suite to recovery, then discharge, check-ups and payment, would be as fluid and flexible as possible, with data transmitted securely and electronically at every step.
We’re not there yet. But Gary Palgon, Vice President, Healthcare and Life Sciences, Liaison Technologies, Alpharetta, GA, shared his insights with Healthcare Purchasing News Senior Editor Rick Dana Barlow about effective and efficient data management throughout the healthcare organization, particularly the operating room.
HPN: What are some effective strategies and tactics for collecting and sharing data from the point of use of a product or service up and down the channel — from the patient in the bed or on the surgical table in the OR to the EHR/EMR, billing and insurance company on one side and to Supply Chain, distributor, manufacturer and raw materials supplier on the other side?
PALGON: Historically, organizations took an 'application-centric' approach to solving business problems. They purchased applications, or devices, in response to specific problems, only to be left supporting a myriad of applications and devices later. The biggest problem is that each application effectively bounds the data within it, making it difficult to share data across applications to make better decisions.
As more devices, and other physical objects for that matter, have become IP-enabled, generating data in unprecedented volume, velocity and variety, organizations are moving to a "data-centric" model in which data is liberated and can be made available to any application or device in a timely manner, allowing insights to be gained and decisions to be made quickly.
One model for this is Data Platform as a Service (dPaaS), where data is integrated and aggregated from a myriad of sources and can be transformed quickly for analysis by a wide variety of analytical or purpose-driven applications. Being "in the cloud" means the platform can bridge boundaries, for example, between a patient in a hospital bed and a life sciences organization searching for patients to participate in a clinical trial, something that today is a very manual and costly process. Creating a "data layer" that allows easy access to all of the data lets new applications be developed quickly to answer new questions. In this model, applications become analytical layers that do not bound the data in ways that prevent the sharing of information as models and research change.
What are the impediments that slow progress toward this scenario and why?
Initiation of the data-centric model is quick and easy, especially when the cloud is used to facilitate the integration and aggregation of information. This limits the investment, time and resources needed to develop the platform, since it already exists and must simply be configured to meet specific business needs.
The challenges then become two-fold: one technological and the other business. From a technology standpoint, legacy applications must enable the external sharing of the information they house. Application vendors must take the time to engineer methods to liberate their data, breaking out of the "bounded schemas" and batch processes that define them. Standards have evolved, and continue to evolve, to help with formats for exchanging data, but a standard is really only a guide, meaning that multiple applications using the same standard for document exchange, for instance, are not 100 percent equal.
From the business perspective, health IT vendors often charge what amounts to a tax to export information, one that is often prohibitively expensive, especially for smaller institutions like physician practices, which maintain the most patient data. This has been termed "data blocking." Meaningful Use originally, and subsequently MACRA regulatory requirements, require the external sharing of patient data to comply with industry guidelines and, more importantly, to qualify for full Medicare reimbursement. You would think EMR/EHR vendors would include this functionality as part of their maintenance fees, but they usually do not. Therefore, to comply, physicians have resorted to sending a single direct message to check the box of external data-sharing compliance.
How might healthcare organizations overcome these challenges?
First and foremost, all providers, patients and individuals on the caregiver side of the equation must let vendors know their frustration with data blocking and push them to include data sharing as part of their solution maintenance fees and/or offer it at a very reasonable cost. If we are all truly concerned about improving patient outcomes and the cost of healthcare, better access to patient data — financial, clinical and operational — must be at the foundation of all healthcare solutions.
Healthcare organizations must also be open to adopting newer technologies, including those that have proven successful in other industries.
How might a palm-implanted chip augment, supplement or even replace the need for a bar-coded or RFID-chipped patient wristband?
The thought of this has technical, regulatory and emotional implications, all of which must be addressed. However, taking a simplistic viewpoint, if we can make access to healthcare easier, and potentially at a lower cost, while improving health outcomes, that is directionally the way to go. Bar codes have served their purpose for many years and have in recent years been complemented by QR codes and RFID chip-enabled tags and devices. As newer and smaller technologies are invented and become broadly available, the healthcare and pharmaceutical industries should look to take advantage of them.
Data is the key to enabling advancements in decision-making, analytics and the like to improve health, and it is also the key to linking together the myriad of applications and devices that generate that data. If data can be securely stored on or generated by implanted chips, then they could help improve health outcomes. Again, this sets aside the security, regulatory and emotional concerns that come with the thought of human-embedded chips.
Given that employees of a Swedish company reportedly are using chips implanted in their hands to open doors and pay for lunch, while their employer tracks data such as work hours, how might a healthcare organization accomplish something similar that shares appropriate data with payers and suppliers?
The chips act as the "glue" or "connector" that links together all occurrences of patient information across the physical boundaries of caregivers as well as across applications and devices, but they should do so in a way that limits errors in the process. For instance, if the chip acts as a secure token, then bridging IP-enabled devices and applications should be more secure and free of the problems that come with handling this manually today. Leaving patients responsible for consistently writing down insurance numbers, Social Security numbers and the like surely causes many inconsistencies in linking patient information together today, especially for patients who are in no condition to provide this important information, whether because of a medical issue, an accident, or simply aging and being unable to access or remember it.
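The record-linking problem Palgon describes can be illustrated with a toy sketch (all data, field names and the token format here are hypothetical, not any vendor's API): a stable token collapses records that inconsistent hand-entered identifiers would fragment into separate identities.

```python
# Hypothetical sketch of record linking: a chip-backed stable token
# versus identifiers entered by hand at each encounter.
from collections import defaultdict

# Simulated records from three hospital systems; "token" is the stable
# identifier, "entered_id" is what was typed in at each encounter.
records = [
    {"system": "ED",       "token": "tok-123", "entered_id": "A-998-31"},
    {"system": "Billing",  "token": "tok-123", "entered_id": "A99831"},   # formatting drift
    {"system": "Pharmacy", "token": "tok-123", "entered_id": "A-998-13"}, # transposed digits
]

def link_by(recs, key):
    """Group records into identities by the chosen identifier field."""
    groups = defaultdict(list)
    for r in recs:
        groups[r[key]].append(r["system"])
    return dict(groups)

# Hand-entered IDs fragment one patient into three apparent identities.
print(len(link_by(records, "entered_id")))  # 3

# A stable token links every occurrence to a single identity.
print(len(link_by(records, "token")))       # 1
```

The point of the sketch is only the grouping behavior: any stable, machine-readable identifier, chip, wristband bar code or otherwise, avoids the transcription drift that manual entry introduces.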
Rick Dana Barlow | Senior Editor
Rick Dana Barlow is Senior Editor for Healthcare Purchasing News, an Endeavor Business Media publication. He can be reached at [email protected].