Monday, June 25, 2007

Knowledge, Standards and the Healthcare Crisis: Part 8

I started this topic with the statement that our healthcare system needs radical transformation since:
  • All patients "…are at risk for receiving poor health care, no matter where they live; why, where and from whom they seek care; or what their race, gender, or financial status is"[1]
  • Healthcare is increasingly more expensive and less accessible[2], with more than 46 million uninsured in the U.S. from every age group and at every income level, 8 out of 10 being in working families[3]
  • There is a "knowledge gap"-the healthcare community is drowning in oceans of information, yet doesn't know the best ways to prevent health problems and treat them cost-effectively.[4]
I then went on to explain how these daunting problems can be solved through creation of a knowledge-based healthcare system that drives continuous improvements in care safety, quality and affordability by enabling everyone to:
  • Know the best ways to prevent illness, avoid complications of chronic diseases, and treat health problems (i.e., in the most effective and efficient manner)
  • Use this knowledge to promote wellness, self-management, and recovery
  • Participate in evolving this knowledge to make it ever-more useful and effective.
And then I discussed how data and technology standards are essential for obtaining, sharing and using knowledge effectively, and how such standards are a double-edged sword (i.e., there are serious problems with many standards in use today).

I concluded with a review of how a secure, economical, node-to-node architecture (with universal translation, composite reporting, and application integration) is an essential component of the successful implementation of an intelligent and efficient quality improvement system.

I will now tie this all together as I present what might be considered the "holy grail" health-knowledge system.

Imagine patients and other healthcare consumers, along with clinicians and researchers, who collaborate to build an evergreen (i.e., continually growing and evolving) knowledge base of comprehensive information. This information comes from data obtained, in ways that protect patients' privacy, via controlled clinical studies and real-world outcomes research. These data are received from around the globe every day and are analyzed regularly to find associations between biological and psychological signs & symptoms, lab studies, diagnoses, genetic data, demographics, wellness interventions, sick-care treatments, patient preferences, care costs, and clinical outcomes. The knowledge emerging from all this information is then used to create and validate evidence-based guidelines promoting high-value (i.e., safe & cost-effective) care alternatives that are matched to the particular needs of different consumers and providers. Understanding the relationships between health problems, care interventions and results enables both professionals and consumers to make wiser decisions that improve outcomes and control costs.
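As a toy illustration of this association-finding step, here is a minimal Python sketch that groups de-identified records by treatment and compares mean outcomes; the records, field names, and scoring are entirely invented:

    # Toy association-finding over de-identified records (all values invented).
    from collections import defaultdict

    records = [
        {"treatment": "A", "outcome_score": 7},
        {"treatment": "A", "outcome_score": 9},
        {"treatment": "B", "outcome_score": 4},
        {"treatment": "B", "outcome_score": 6},
    ]

    scores_by_treatment = defaultdict(list)
    for r in records:
        scores_by_treatment[r["treatment"]].append(r["outcome_score"])

    for treatment, scores in scores_by_treatment.items():
        print(treatment, sum(scores) / len(scores))  # mean outcome per treatment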

Underlying this information, knowledge and understanding are the data obtained from patients/consumers' PHAs (personal health applications) and providers' EHRs (electronic health records).

Clinicians and hospitals that collaborate in a patient's care then share and view the patient data via next-generation CCRs (continuity of care records) tailored to each practitioner's particular needs. These CCRs go well beyond the ones being developed today by adding the following (a brief data-structure sketch follows the list):
  • Sophisticated clinical decision support capabilities that present warnings and alerts, as well as clinical guidelines (and pathways)
  • Patient self-care information (i.e., "information therapy")
  • Tools that track compliance to the guidelines, reasons for non-compliance (i.e., "variance"), and clinical & financial outcomes.
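To make this concrete, here is a minimal Python sketch of what such a next-generation CCR record might carry; every class and field name is hypothetical (the actual ASTM CCR standard defines its own XML structure):

    # A sketch of a next-generation CCR payload (all names hypothetical).
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class GuidelineStep:
        description: str           # e.g., "order HbA1c every 3 months"
        followed: bool             # compliance tracking
        variance_reason: str = ""  # reason for non-compliance, if any

    @dataclass
    class NextGenCCR:
        patient_id: str
        problems: List[str] = field(default_factory=list)
        medications: List[str] = field(default_factory=list)
        alerts: List[str] = field(default_factory=list)          # decision-support warnings
        self_care_info: List[str] = field(default_factory=list)  # "information therapy"
        guideline_steps: List[GuidelineStep] = field(default_factory=list)

    ccr = NextGenCCR(patient_id="patient-042")
    ccr.alerts.append("Warning: lab result A may be affected by medication B")
    ccr.guideline_steps.append(GuidelineStep("order HbA1c every 3 months", False, "patient declined"))
    print(ccr)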
In addition, researchers (in universities, public health facilities, etc.) receive and study the de-identified health data on an ongoing basis, including details about patient health, medications, procedures and other interventions, and the results of such care; these data are used to develop evolving guidelines.

The data and guidelines are transmitted through networks of networks[5] using a simple, secure, low-cost node-based architecture and e-mail that require no build-out of existing IT infrastructures. The nodes' "universal translation" function accommodates all data standards, as well as any non-standardized data sets and terminologies. Furthermore, since the nodes communicate asynchronously via a publisher-subscriber process, and since they can present interactive reports through "desktop/standalone" applications (i.e., they are not limited to Internet browsers), critical information can be accessed offline using rich, powerful tools (a sketch of such a node follows the list below). This means:
  • There is no loss of data when a network connection drops out unexpectedly, and there is no single point of failure: a central server that develops problems cannot disrupt an entire network
  • All the information can be accessed anywhere/anytime, even if there is no Internet or other network connection
  • A great deal of data can be exchanged even when bandwidth is low and connectivity is intermittent (e.g., using dial-up)
  • Each person controls their own data since the data are stored locally (on their own computer) in encrypted files
  • Total cost of ownership is minimized since there is no need to rely on expensive central servers and server administrators
  • Performance is greatly increased for complex, intensive computations since all data processing is done quickly using local computer resources, rather than waiting for a strained central server or being restricted by the limitations of a browser
  • You can integrate multiple desktop applications, which cannot be done securely using a browser.[6][7]
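Here is the node sketch promised above: a minimal Python illustration of a node that keeps its data in locally encrypted files so the data can be read offline. It uses the third-party "cryptography" package (Fernet) for symmetric encryption, and all file and key names are illustrative:

    import json
    from pathlib import Path
    from cryptography.fernet import Fernet  # third-party: pip install cryptography

    class LocalNode:
        def __init__(self, data_dir: str, key: bytes):
            self.data_dir = Path(data_dir)
            self.data_dir.mkdir(parents=True, exist_ok=True)
            self.cipher = Fernet(key)

        def store(self, record_id: str, record: dict) -> None:
            # Encrypt the record before it ever touches the disk.
            blob = self.cipher.encrypt(json.dumps(record).encode())
            (self.data_dir / f"{record_id}.enc").write_bytes(blob)

        def load(self, record_id: str) -> dict:
            # Works with no network connection at all: the data are local.
            blob = (self.data_dir / f"{record_id}.enc").read_bytes()
            return json.loads(self.cipher.decrypt(blob))

    # Each person holds their own key, hence controls their own data.
    node = LocalNode("my_health_data", Fernet.generate_key())
    node.store("labs-2007-06", {"glucose": 95, "unit": "mg/dL"})
    print(node.load("labs-2007-06"))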
These innovative technological solutions, however, are only part of the story. Other important issues include determining:
  • What data to collect and exchange
  • How to analyze, interpret and validate the data to generate useful information
  • How to organize, access, share and discuss the information so that useful knowledge emerges
  • How to use the knowledge to improve care quality and control costs.
Dealing with these issues, I contend, requires valid & reliable data, information and knowledge (dik) that are comprehensive, complex and comprehensible.
  • " Comprehensive dik is required for understanding the "big picture" clearly. This big picture reflects a person's physical and psychological risks, strengths, problems and preferences, as well as the evidence-based well care and sick care intervention options best suited to that individual.
  • Complex dik provides crucial insights into care that are not possible using today's "minimum data set" standards. This higher-level understanding comes from analyzing data that reveal such complexities as the following (see the sketch after this list):
    • Medication-related interactions (e.g., drug-drug, drug-supplement, drug-metabolism and drug-lab results interactions, as well as allergic reactions)
    • Mind-body and mind-body-environment interactions[8] (e.g., the adverse effect of psychological stress and emotions on one's immune system, the effect of one's belief systems on one's health, etc.)
    • Medication side-effects and biomedical conditions that present as psychological symptoms
    • Trends, including changes in lab test results, functionality and signs & symptoms over a person's lifetime
    • The correlation of treatments and outcomes for different patient populations and providers
    • The reasons for not following particular recommended treatment processes and the results of such variance.
  • Comprehensible dik:
    • Are readily understandable (e.g., unambiguous, valid, reliable, relevant and useful)
    • Maintain important nuances of meaning (e.g., using the correct terminology standards)
    • Do not overload people with irrelevancies or redundancies.
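Here is a minimal Python sketch of the kind of "complex dik" analysis these bullets describe, using a tiny, entirely illustrative interaction table and a lab-trend check (real interaction databases are far larger and professionally curated):

    from itertools import combinations

    INTERACTIONS = {  # hypothetical sample entry
        frozenset(["warfarin", "aspirin"]): "increased bleeding risk",
    }

    def drug_drug_warnings(medications):
        # Check every pair of active medications against the table.
        for a, b in combinations(medications, 2):
            note = INTERACTIONS.get(frozenset([a, b]))
            if note:
                yield f"{a} + {b}: {note}"

    def lab_trend(values):
        # Flag a consistent rise across serial results over time.
        return "rising" if all(y > x for x, y in zip(values, values[1:])) else "stable"

    print(list(drug_drug_warnings(["warfarin", "aspirin", "lisinopril"])))
    print(lab_trend([90, 98, 105, 118]))  # e.g., fasting glucose across four visits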
Two new questions now arise, which I will address in my next post; they are:
  1. How can we know if the data being collected are complete, appropriately complex, comprehensible, relevant and useful?
  2. What has to happen for good data to become useful knowledge that leads to ever-better and more affordable care?
References:

[1] The First National Report Card on Quality of Health Care in America by RAND Corp (2006)

[2] Health Care Coverage in America: Understanding the Issues and Proposed Solutions by The Alliance for Health Reform (March 2007)

[3] The Current Situation - WellnessWiki

[4] The Knowledge Gap - WellnessWiki

[5] Linking Providers Via Health Information Networks. Alliance for Health Reform. (Dec 2006).

[6] Is the Browser Singularly Capable of Everything? Software Development Times (June 15, 2007)

[7] New Google Tool Gets Offline Access in Gear. Eweek (June 11, 2007)


[8] Biopsychosocial Healthcare - WellnessWiki

Saturday, June 16, 2007

Knowledge, Standards and the Healthcare Crisis: Part 7

In my last post, I began discussing I.T. solutions for dealing with the healthcare standards problem. I started by describing the benefits of a "publisher-subscriber node-to-node architecture with universal translation." I now continue with a discussion of two more important parts of the solution: "composite reporting" and "application integration."

Composite Reporting


The node-to-node health information exchange architecture I've described has the added benefit of generating composite reports. These reports combine information sent from multiple publisher nodes to a single subscriber node. The subscriber node takes all that information and merges it into a single integrated patient health profile report.

For example, let's say this report is a "continuity of care record" (CCR) [1] that a primary care physician (PCP) wants to use to help keep track of what's going on with the treatment a patient is receiving from several provider specialists. The PCP's node, which serves as the subscriber, would send a request for CCR data from all the patient's specialists. Upon receipt, the specialists' nodes, which serve as the publishers, retrieve the requested data from their different EHR (electronic health record) databases and send the data automatically to the PCP's node. The PCP's node then incorporates the data into a composite report tailored to the PCP's needs and preferences, and presents it on screen for the PCP to view. The PCP's subscriber node could also be instructed to request data from the publisher node connected to the patient's PHR (personal health record) database and, upon receipt, include specific PHR data in the same CCR report, as authorized by the patient.

Now, if the nodes also utilized universal translation (see the previous post), the data being sent by the specialists' publisher nodes to the PCP's subscriber node would be transformed appropriately, so the data always arrive in the right format (structure) and with the right terminologies (semantics). The result would be a useful, cohesive, and understandable CCR report containing information from disparate databases and data standards.
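Here is a minimal Python sketch of the compositing step itself, with a subscriber node merging fragments from several publisher nodes into one report; the fragment shapes and field names are invented for illustration (a real CCR is a structured ASTM XML document):

    def build_composite_ccr(patient_id, fragments):
        # Merge section-by-section whatever each publisher node returned.
        report = {"patient_id": patient_id, "problems": [], "medications": [], "labs": []}
        for fragment in fragments:  # one fragment per publisher node
            for section in ("problems", "medications", "labs"):
                report[section].extend(fragment.get(section, []))
        return report

    # Fragments as they might arrive from a cardiologist's EHR node,
    # an endocrinologist's EHR node, and the patient's own PHR node.
    fragments = [
        {"problems": ["atrial fibrillation"], "medications": ["warfarin"]},
        {"problems": ["type 2 diabetes"], "labs": [{"test": "HbA1c", "value": 7.1}]},
        {"medications": ["fish oil (self-reported)"]},  # PHR data, patient-authorized
    ]
    print(build_composite_ccr("patient-042", fragments))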

Application Integration


Still another way to make information useful, while avoiding data standards problems, is through application integration. This refers to an ideal way to present information residing in different software applications to support healthcare decisions.

Software vendors have been inventing different ways to display patient data that reside in disparate databases and are presented through disparate applications.
  • First, there was single sign-on (SSO) with which the user name and password are entered once. Multiple applications then display a patient's data through different windows on the computer screen. In its most basic form, SSO simply bypasses the login and user-validation screens of all but the first application accessed. The person still has to navigate through each application individually, however.
  • Then came context management, which uses SSO, but goes a step further by synchronizing multiple applications. This enables a clinician to view a list of all his/her patients, select one, and have multiple windows, corresponding to multiple applications, refresh automatically. This means the information presented through the windows of all the applications is related to the selected patient.
  • Unified view takes context management one step further by presenting the data from different applications in a single window.
  • Data integration involves synthesizing data across multiple applications and presenting the data in a fully integrated manner. For example, when context management with a unified view is used to examine how a patient's medications affect his/her lab data, it simply presents the lab results in one pane and the patient's active medications in another. That is, the data in each pane are accessible only to the application that presents them; there is no interaction between the applications, so the data are static.

    A data integration system, however, recognizes the relationship between the data in the lab results pane and the data in the meds pane. This means the system can provide decision support by, for example, displaying a warning that lab result A may be affected by the patient's use of medication B. This goes beyond context management with a unified view since it applies analytic intelligence to the data being displayed by different applications. These data associations, however, are transient, i.e., the logic processes used for analytics and decision support must be run each time the same data are presented since there is no way to store the associations.
  • Application integration goes beyond data integration since it retains the associations between data from multiple sources by creating and storing new forms of data that can be accessed repeatedly. Using the example above, it means that an application integration system would generate and store a new piece of data indicating that the problematic lab result A may be due to the patient's use of medication B. This warning information can be retrieved at any time in the future without having to start up the medication and lab results applications, and without having to analyze their data all over again. And the warning information can even be shared with other authorized persons, such as in a CCR report.[2]
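To illustrate the distinction, here is a minimal Python sketch contrasting the transient associations of data integration with the persisted, derived data of application integration; the rule table and record shapes are illustrative only:

    # Data integration vs. application integration (illustrative rule table).
    KNOWN_EFFECTS = {("medication B", "lab result A"): "may be affected by"}

    def derive_warnings(medications, labs):
        # Data integration: the association is computed on the fly and
        # vanishes when the windows close; it must be rerun on each view.
        return [f"{lab} {KNOWN_EFFECTS[(med, lab)]} {med}"
                for med in medications for lab in labs
                if (med, lab) in KNOWN_EFFECTS]

    stored_warnings = []  # application integration: derived data persisted

    def persist_warnings(medications, labs):
        # The derived warning becomes a first-class record that can be
        # retrieved later (or shared in a CCR) without rerunning the logic.
        stored_warnings.extend(derive_warnings(medications, labs))

    persist_warnings(["medication B"], ["lab result A"])
    print(stored_warnings)  # available anytime, without reopening both apps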
In my next post, I will conclude the topic of knowledge, standards and the healthcare crisis. I will draw upon the information I've presented to answer the question posed in the first post of the series: What can we do to foster the widespread creation, use and evolution of healthcare knowledge without breaking the bank and without being hampered by the constraints imposed by data and technology standards?


References:
[1] American Academy of Family Physicians Center for Health Information Technology: ASTM Continuity of Care Record (2007)

[2] Holland, M. (Feb 2007). Improving Clinical Workflow with Unified Data Access and Management. IDC.

Saturday, June 09, 2007

Knowledge, Standards and the Healthcare Crisis: Part 6

In the previous posts [click here for first in series], I described many of the problems facing the healthcare industry as it attempts to deal with data and technology standards. These daunting problems include high cost, complexity, difficulty accommodating changes, loss of meaning and nuance, trouble defining quality, inadequate measures, political influence, etc. I now discuss how an innovative approach to the use of health information technology would help solve these problems.

Solving the Problems with Healthcare Standards

From a technological perspective, what's needed to solve the problem with standards is a simple, low-cost, reliable, secure, hassle-free way to exchange and view structured & unstructured health information, anywhere and anytime, in a way that:
  • Maintains the full meaning and nuance of the information being exchanged in order to maximize understanding and the information's usefulness, regardless of the data standards being used.

  • Supports fluid connectivity between all IT systems, regardless of their technology standards.

  • Gives all authorized consumers/patients, providers, suppliers, payers (insurers), and purchasers (employers and self-insured) the information they need, in the way they need it, to support decisions and guide actions.

  • Supplies researchers with the information they need to evaluate clinical outcomes and care processes, so they can create, continually evolve, and widely disseminate evidence-based guidelines.
I contend that the best way to do this is through a secure node-to-node network architecture using template-driven software applications having "publisher-subscriber," data translation, and personalized reporting capabilities. Let me explain.

Why a Node-to-Node Architecture

In a node-to-node architecture, each node is a software application in a computer that sends and receives information from other nodes. This architecture supports "peer-to-peer" (P2P) networks in which each node stores its data files locally and shares them with other nodes without being controlled by a centralized server.[1] The telephone system and e-mail are good examples of node-to-node networks. Every phone and every computer is a node. By picking up the phone and dialing a number, or by typing in an e-mail address, you can communicate with whomever you want, anytime and anywhere. Your call or e-mail is routed automatically to where you want it to go through a series of simple switches. This open network is quite different from a centralized system in which you must first sign on to a central server that determines whom you are authorized to contact before sending them your message, i.e., all information must pass through a central authority that controls all communications. In addition, such centralized systems typically require the costly development and ongoing maintenance of a centralized patient record locator to know where to find patient data.

The following make the case for a node-to-node/peer-to-peer architecture for exchanging healthcare data:

  • "The United States' National Health Information Network, or NHIN, will differ from the UK's project in a number of ways. Rather than having a single, closed network with a central database overseen by one government agency, the U.S. system will be decentralized, operating more like a peer-to-peer network, with records distributed across the system. Think Napster on steroids. …the NHIN will allow a doctor to quickly call up a patient's digital records from whatever databases they may reside in-at a hospital, at the family doctor's or dentist's office, at a clinical lab, wherever."[2]

  • "After initial testing using a centralized patient index, [Massachusetts' MA-Share HIE determined that] the maintenance for that looked like it would be more than users would want to pay. So the exchange uses distributed peer-to-peer networking. The MA-Share exchange provides an appliance to let members push financial transactions, e-prescriptions, and clinical summaries-so a doctor can send a file to another doctor or provide prescription data to a pharmacy." [3]

  • "To make significant gains in patient safety through the adoption of health IT, providers will need to adopt IT systems that can 'speak the same language' to each other. In computer terms, they should be 'interoperable.' But interoperability isn't enough. To communicate, different health IT systems must also be linked in some way. This is 'connectivity.' One model of connectivity, in a national health IT context, would be a non-proprietary 'network of networks.' …Several issues must be addressed if different health information systems are to communicate. …Some suggest that there should be one uniform national system with one central repository. This approach presents challenges: the sheer volume of data that would need to be handled, significant concerns about privacy and security threats, and likely disputes about governing and paying for a centralized system. Another option is a series of regional networks, as advocated by ONC [the Office of the National Coordinator of Health Information Technology, formally ONCHIT]. ONC's strategic frame- work suggests that a national network should be structured around regional health information organizations (RHIOs). RHIOs would store, organize and exchange patient health information within a defined geographic region, under local rather than national governance. These regional organizations would form a "network of networks" across the nation." [4]

Publisher-Subscriber Communications Model

The nodes in these P2P networks employ a publisher-subscriber communications model in which a publisher node uses its communications software application to publish (send) information to one or more authorized subscriber (receiver) nodes. Once transmitted, the subscriber nodes use their subscriber applications to retrieve that information and present it as reports. In other words, the publisher-subscriber model uses an "application to application" transfer process in which each participating node uses a particular software application for exchanging (sending and receiving) information.

The publisher and subscriber applications support a particular operating system (OS) and Internet connection using broadband or dial-up service. A node that uses an e-mail client (such as Microsoft Outlook on the Windows OS) is one such example.

At one end of the connection, the publisher node must authorize the information transfer by authenticating that the subscriber node is allowed to receive the information. At the other end of the connection, each subscriber node must allow the publisher to deposit the information into a directory (i.e., a folder on the computer's drive) as a file with a specific format (such as an MS Word, Excel, or "comma-separated value" file).
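Here is a minimal Python sketch of this file-drop exchange, with the publisher checking an authorization list before depositing a comma-separated value file into the subscriber's inbox directory; the node names and allow-list are hypothetical:

    import csv
    from pathlib import Path

    AUTHORIZED_SUBSCRIBERS = {"dr_smith_node"}  # hypothetical allow-list

    def publish(subscriber: str, inbox: str, rows: list) -> None:
        # The publisher authenticates the subscriber before sending anything.
        if subscriber not in AUTHORIZED_SUBSCRIBERS:
            raise PermissionError(f"{subscriber} is not an authorized subscriber")
        inbox_dir = Path(inbox)
        inbox_dir.mkdir(parents=True, exist_ok=True)
        # Deposit the information as a CSV file in the subscriber's folder.
        with open(inbox_dir / "lab_results.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["test", "value", "unit"])  # header row
            writer.writerows(rows)

    publish("dr_smith_node", "inbox/dr_smith_node",
            [["glucose", 95, "mg/dL"], ["HbA1c", 5.6, "%"]])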

Universal Translation

A node-to-node architecture incorporating "universal translation" provides a means for modifying (transforming, translating) information as it passes between nodes, so that each subscriber node receives from a publisher node the right information, in the right format (structure), and with the right terminologies (semantics).

This is where data and technology standards are handled. That is, the universal translator makes the necessary transformations to the information sent by a publisher node, so different subscriber nodes can use that information to generate their particular reports and, if desired, to store the information received in the subscriber nodes' databases. It can accommodate any data standards and operate with systems using any technology standards.
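Here is a minimal Python sketch of such a translation step; both vocabularies are invented, whereas a real universal translator would be driven by maintained terminology and format maps:

    # Map the publisher's local terms and field names to whatever the
    # subscriber expects (structure and semantics).
    TERM_MAP = {"HTN": "hypertension", "DM2": "type 2 diabetes"}
    FIELD_MAP = {"dx": "diagnosis", "pt_id": "patient_id"}

    def translate(record: dict) -> dict:
        out = {}
        for field, value in record.items():
            out[FIELD_MAP.get(field, field)] = TERM_MAP.get(value, value)
        return out

    # A publisher's record, restructured and re-termed for the subscriber:
    print(translate({"pt_id": "042", "dx": "HTN"}))
    # -> {'patient_id': '042', 'diagnosis': 'hypertension'}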

Advantages and Benefits of the Node-to-Node Architecture

The advantages and benefits of this asynchronous, publisher-subscriber, node-to-node architecture are many, including the following:
  • Is exceptionally flexible:
    • Accommodates any data and technology standards, so everyone gets the information they need in the way they need it
    • Allows anyone to communicate with anyone else in any way
    • Can use multiple connectivity options, i.e., radio transmission, satellite transmission, wire transmission, wireless transmission.

  • Has maximum reliability since it leverages the most reliable network in the world, i.e., the switched network (like the telephone system).

  • Is inexpensive to deploy and operate because it doesn't require changes to existing I.T. infrastructures and keeps implementation costs low by eschewing additional equipment and system purchases.
  • Is robust and resilient since there is no single point of failure; so, unlike centralized networks that are disrupted if a central server goes down, the node-to-node network is survivable in a disaster since it keeps going even if individual nodes are disabled.

  • Makes scalability a non-issue, which means there's no need to purchase new equipment or redesign software as the network grows; this is unlike a centralized system in which there tends to be significant costs in time and money to meet the needs of a growing network.
  • Is highly secure since there are no external database queries; firewalls are not crossed.
In my next post, I discuss other parts of the solution: Composite Reporting and Application Integration.

References:

[1] Wikipedia - Peer to Peer and WellnessWiki - Network Architectures (see Node Mesh Network)

[2] Charette, R.N. (Oct 2006). Dying for Data: A comprehensive system of electronic medical records promises to save lives and cut health care costs-but how do you build one? IEEE Spectrum Online

[3] Kolbasuk McGee, M. (May 28, 2007). Urgent Care. Informationweek.com

[4] Linking Providers Via Health Information Networks. Alliance for Health Reform. (Dec 2006).



Friday, June 01, 2007

Knowledge, Standards, and the Healthcare Crisis: Part 5

In the previous four posts [click here for first in series], I described the data and technology standards commonly used to enable the exchange of health information between disparate data sources. I also discussed why such information exchange is vital to the creation and use of knowledge leading to increased healthcare value. In addition, I mentioned several challenges to using standards effectively.

In this post, I delve into the problems faced by the healthcare industry when dealing with standards.

The Problems with Healthcare Standards

We confront one set of problems with data (terminology, care measurement and care process) standards, and another with technology (messaging) standards.

Problems with Terminology Data Standards

Problems associated with terminology standards are significant:
  • According to William Hammond, professor emeritus of community and family medicine at Duke University, there's "been ongoing discussion about implementing health data standards harmonization and cooperation for 20 years, yet no one has defined all the standards needed to support a national health information network, and no one has identified what's missing." Just agreeing on medical terminology is a big issue. And, according to Michael Rozen, vice chairman of the IEEE-USA Medical Technology Policy Committee, "When you say 'gross profit,' everyone in finance knows what that means [but] in medicine, there are 126 ways to say 'high blood pressure.' "[1]

  • While setting an arbitrary standard for health-related terms is a way to foster widespread communication between people from different regions, organizations and healthcare cultures/communities, there's also a downside to such standards: they lose information due to reduced "semantic precision and nuance." In other words, there's a good reason to have multiple ways of saying high blood pressure. For example, malignant hypertension refers to very high blood pressure with swelling of the optic nerve behind the eye, which is usually accompanied by other organ damage like heart failure, kidney failure, and hypertensive encephalopathy. Pregnancy-induced hypertension, on the other hand, is high blood pressure brought on by pregnancy (also called toxemia or preeclampsia). Referring to a patient's condition using the standard term "hypertension," while clearly conveying that the person has high blood pressure, loses these important details, which could very well affect treatment decisions and outcomes (see the sketch after this list).

  • Diagnostic code standards, including all versions of the ICD and DSM, have several serious limitations: (a) they are not detailed enough to describe the nuances of all diseases and conditions, and (b) some diagnoses are not useful in making treatment decisions.[2] Since treatment selection is based (or should be based) on a patient's diagnosis, we need diagnostic standards with greater precision. This requirement is amplified with personalized care, in which each patient's unique makeup (including genetics) and the mind-body connection are taken into account (not to mention one's abilities and preferences).
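To make the nuance loss concrete, here is a minimal Python sketch; the mapping table is entirely illustrative:

    # Mapping clinically distinct terms onto one standard term discards
    # nuance that may matter for treatment (illustrative table).
    TO_STANDARD = {
        "malignant hypertension": "hypertension",
        "pregnancy-induced hypertension": "hypertension",
        "essential hypertension": "hypertension",
    }

    original = "malignant hypertension"   # implies optic-nerve swelling, organ damage
    standardized = TO_STANDARD[original]  # "hypertension" - the detail is gone
    print(f"{original!r} -> {standardized!r}")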

Problems with Care Measurement and Process Data Standards

As I discussed in a previous post, care measurement and process standards relate to evaluating care quality and provider performance, and to establishing practice guidelines. Some of the problems associated with these standards include the following:
  • Achieving wide-ranging and meaningful quality standards requires many more years of dedicated effort by many people and substantial financial resources.[3]

  • Standards should evolve continuously, changing as necessary to accommodate new knowledge. Unfortunately, it typically takes 17 years before clinical evidence is implemented in practice guidelines.[4] [5]

  • Simply maintaining nation-wide data standards is a slow and costly process.

  • And, as I discussed in an earlier post, there are many problems with practice guideline and quality measurement standards:

    • It's difficult to determine when there is enough evidence supporting a practice guideline and there is no longer any need to spend time or money on its continuous evaluation.

    • It's difficult to determine when a definition of quality is too narrow, which can happen, for example, when measuring quality based on cost or symptom reduction, without giving adequate consideration to prevention or the continuity of care.

    • It's difficult to determine how best to measure quality when resources are scarce and optimal care for the community may require less than "the best" care for its individual members.

    • It's difficult to determine how best to measure quality if outcomes are more strongly affected by patient compliance than by physician orders.

    • It's difficult to determine if care quality is poor when a provider follows the recommended practice guideline, but the patient is atypical and responds poorly.

    • Using claims (administrative) data to measure care quality, as is often done today, is grossly inadequate.

    • Assessing care quality using process data may not be valid since they do not necessarily reflect care outcomes.

    • It's difficult to determine how to avoid political and ideological biases when determining what evidence to use as the basis for establishing the guidelines.

    • Many areas of healthcare lack care process standards and useful quality measures. Different healthcare disciplines and specialties require different types of data to evaluate quality.

Problems with Technology Messaging Standards

The problems with standards aren't limited to data standards; they also plague technology messaging standards:
  • When multiple information systems use the same messaging standard to communicate, changing the standard costs huge sums, since all the systems using it must be overhauled. A good real-world example is the Year 2000 problem: computer systems were built using a messaging standard that required only the last two digits of the year to be used when transmitting data containing dates. So, when 2000 rolled around, this standard made it impossible to differentiate between years beginning with 19 and those beginning with 20 (i.e., 4/5/05 could be Apr 5, 1905 or Apr 5, 2005), as the sketch after this list illustrates. This problem easily cost hundreds of billions of dollars to fix.

  • The Healthcare Information Technology Standards Panel, which is setting technical standards for a nationwide record system, identified an initial set of 90 medical and technology standards, out of an original list of about 600. These standards specify such things as how lab reports are to be exchanged electronically and entered into a patient's electronic record, as well as how past lab results are to be requested. More than 190 organizations (representing consumers, providers, government agencies, and standards development organizations) participated in the panel. It's no wonder, therefore, that consensus on medical standards is so difficult and fraught with politics, as standards-setting involves intense negotiations and delicate compromises. And once such IT standards are set, software systems and databases must be designed to conform to those standards.[6]
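Here is the sketch promised above: a minimal Python illustration of the two-digit-year ambiguity behind the Year 2000 example (the dates are arbitrary):

    from datetime import datetime

    # Two very different dates...
    candidates = [datetime(1905, 4, 5), datetime(2005, 4, 5)]

    # ...collapse to the same two-digit-year representation, which is
    # precisely why systems storing only "05" could not tell them apart.
    print([d.strftime("%m/%d/%y") for d in candidates])  # ['04/05/05', '04/05/05']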

Summary

While data and technology standards offer a way to handle information exchange challenges, they come with issues posing serious problems in terms of cost, effort, time, hassle, complexity, inefficiency, usability, reliability, information loss, political influence, etc.

In my next post, I will discuss ways to solve the daunting problems plaguing the use of healthcare standards.

References:
[1] Dying for Data: A comprehensive system of electronic medical records promises to save lives and cut health care costs—but how do you build one? IEEE Spectrum Online (Oct 2006)
[2] Current Diagnostic Codes are Inadequate – WellnessWiki
[3] U.S. Health Care Sector Moves Rapidly To Provide Consumer Information on Value. HHS (May 9, 2007)
[4] Balas, E. A., & Boren, S. A. (2000). Managing clinical knowledge for health care improvement. In J. Bemmel & A. T. McCray (Eds.), Yearbook of Medical Informatics (pp. 65-70). Stuttgart: Schattauer Verlagsgesellschaft mbH.
[5] Clancy, C. M., & Cronin, K. (2005). Evidence-based decision making: Global evidence, local decisions. Health Affairs, 24(1), 151-162.
[6] Dying for Data: A comprehensive system of electronic medical records promises to save lives and cut health care costs—but how do you build one? IEEE Spectrum Online (Oct 2006)