This article was first published on the HIMSS Conference site on January 29, 2017: http://www.himssconference.org/updates/himss17-perspective-connected-healthbots-will-wake-clinical-decision-support
The computer “woke up once it got connected to other people.” – Anil Dash
On a recent broadcast of Krista Tippett’s On Being, Anil Dash talked about how, for the first half of his life, the computer was an island that wasn’t plugged into anything. But the computer “woke up once it got connected to other people” via the Internet. He joked that the young people he now mentors find it hard to understand the use of a computer that doesn’t communicate with other computers; they wonder what users did with those early machines: stare at the screen?
Those of us who follow the development of EHRs and their effect on the doctor-patient relationship don’t think there’s anything funny about a computer that doesn’t communicate with other computers. Unfortunately, in the health IT community, we’re still pondering what we could do ‘if only’ data could flow more freely between computers.
But waking up computers and using them to their fullest requires more than just building a communications network between the boxes; it requires transmitting data that can be acted on programmatically, data that can be executed. My colleague at the InfoCommerce Group, Russell Perkins, who has been producing conferences on the theme of Data Content for over a decade, has long spoken of the power of “data that can do stuff.” HIMSS prefers the hashtag #PutData2Work. The sentiment is the same: we need data that are interoperable and can be communicated efficiently across networks, so that important information can be consulted at the point of need and integrated with other data to support health care decisions.
I view it as a stock-versus-flow problem. The major EHR vendors focus on creating record-keeping systems; they don’t specialize in the communications layer that moves information from one place to another. Their reluctance to develop, or even enable, inter-organization communication is similar to the reaction of the computer hardware and software vendors when faced with the Internet in the mid-90s. Those vendors underestimated the changes that would occur once users were allowed to communicate and exchange data across a wide network of computers. It took a major external development effort to invent the Web. In health IT, we don’t have to reinvent the Internet or the Web; they already exist. But, based on events to date, we should look to a new class of vendors that understand data science and APIs to introduce enhanced communications and cross-institution data analysis. The legacy EMR/EHR vendors that market record-keeping systems don’t have the skills, the imagination, or the urgency to extend their field of competency into data exchange.
Technology continues to make it easier and faster to find information. Early search engines required training courses to master the interfaces. Then Google came along and offered a simple interface that sat atop algorithms that aimed to present the most relevant results first.
Now, we’ve entered a new phase, where the search box itself is being replaced by bots that surface relevant information without requiring a search query. For example, with my Google Pixel phone, all I have to do is ask Google Assistant to find me direct flights from point A to point B and I’ll receive verbal information about the length of the flights, along with search results for specific flights that I can click on. If I’ve previously searched for flights to that destination, Google will remember the dates I chose and use them as the starting point. Once I’ve booked the flight, I can easily retrieve the information with a quick request to Google. Note that this is an incremental step: it focuses on advances in the chat interface and builds upon data extraction and presentation tools that Google has been developing for years.
Through cognitive computing advances, consumer chatbots like Google Assistant, Amazon Alexa, and Apple’s Siri have matured to the point where a bot can learn what is important to a user and continuously hone its ability to personalize its services. The popularity of these consumer chatbots is paving the way toward adoption by other segments; telemedicine interfaces are a good example.
I define healthbots as special-purpose bots that perform automated, context-specific information retrieval and analysis via voice, text, or other natural language interfaces.
One of the more obvious areas where healthbots could improve efficiency in health IT is in EHR interfaces. Bots are ideal for navigating complicated structured databases. Healthbots could speed up the process of finding the right location for both data entry and data retrieval, and they could perform countless advanced operations specialized for each use case.
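To make that concrete, here is a minimal, hypothetical sketch of the kind of routing such a healthbot might do: it maps a clinician’s spoken request to a structured lookup against an EHR that exposes a FHIR-style REST API. The endpoint URL, patient ID, and intent table are invented for illustration; a real interface would also handle authentication, data entry, and far richer language understanding.

```python
import requests

EHR_BASE = "https://ehr.example.org/fhir"  # hypothetical FHIR endpoint

# Tiny intent table: phrase fragment -> LOINC code for the observation to fetch.
INTENTS = {
    "a1c": "4548-4",  # Hemoglobin A1c
}

def answer(request_text: str, patient_id: str) -> str:
    """Map a natural-language request to the right spot in the record and read back the latest value."""
    for phrase, loinc in INTENTS.items():
        if phrase in request_text.lower():
            resp = requests.get(
                f"{EHR_BASE}/Observation",
                params={
                    "patient": patient_id,
                    "code": f"http://loinc.org|{loinc}",
                    "_sort": "-date",  # newest first
                    "_count": 1,
                },
                timeout=10,
            )
            resp.raise_for_status()
            entries = resp.json().get("entry", [])
            if not entries:
                return "No result on file for that test."
            obs = entries[0]["resource"]
            qty = obs.get("valueQuantity", {})
            when = obs.get("effectiveDateTime", "an unknown date")
            return f"Latest result: {qty.get('value')} {qty.get('unit', '')} on {when}."
    return "Sorry, I don't know how to answer that yet."

# e.g. answer("what's the latest A1c for this patient?", "12345")
```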
Still, to achieve higher-value implementations, healthbots need to operate across broader collections of data than what is stored in a single healthcare provider’s EHR system. With the massive amounts of data being produced by medical and life science researchers, devices, sensors, and health data analytics services, healthbots will need to work in tandem with APIs to enable the degree of information flow needed to take clinical decision support solutions to the next level, where data from multiple sources can be interpreted in context and delivered within the clinician’s or patient’s workflow.
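As a rough illustration of what “working in tandem with APIs” could look like, the sketch below queries several sources in parallel and merges the results for a single clinical question. The three endpoints are hypothetical stand-ins for an EHR, a device or sensor feed, and an evidence service; each real source would have its own authentication, query syntax, and licensing terms.

```python
import concurrent.futures
import requests

# Hypothetical stand-ins for independent data sources.
SOURCES = {
    "ehr":        "https://ehr.example.org/api/summary",
    "devices":    "https://sensors.example.org/api/recent",
    "literature": "https://evidence.example.org/api/search",
}

def fetch(name: str, url: str, query: dict) -> tuple[str, dict]:
    resp = requests.get(url, params=query, timeout=10)
    resp.raise_for_status()
    return name, resp.json()

def gather_context(patient_id: str, topic: str) -> dict:
    """Query every source in parallel and merge the results, keyed by source name."""
    query = {"patient": patient_id, "topic": topic}
    merged = {}
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(fetch, name, url, query) for name, url in SOURCES.items()]
        for future in concurrent.futures.as_completed(futures):
            name, payload = future.result()
            merged[name] = payload
    return merged  # the healthbot would interpret and summarize this within the workflow
```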
Each year that I’ve attended HIMSS since 2010, I’ve sought out signs of progress in integrating evidence-based medical information into the clinician’s workflow. Good progress has been made in appending links to relevant content at the point of need within patient records (via Infobuttons and, more recently, CDS Hooks), but there’s still a long way to go before we see a clinical decision support resource that pulls together relevant data on a topic from diverse collections of data and includes collaborative features that help construct a learning system based on collective experience. The biggest limiting factor has been business models for licensing or sharing data across competing publishers, and my hope is that APIs will continue to advance in how they manage business terms for data licensing or sharing.
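For readers unfamiliar with CDS Hooks, the pattern is simple: when the EHR fires a hook (for example, when a clinician opens a patient’s chart), an external service responds with “cards” that surface links or guidance inside the workflow. The sketch below shows the general shape of such a response; the field names follow the CDS Hooks card format, while the summary text and URL are invented.

```python
def patient_view_cards() -> dict:
    """Return a CDS Hooks-style card that links to relevant evidence at the point of need."""
    return {
        "cards": [
            {
                "summary": "New guideline relevant to this patient's condition",
                "indicator": "info",  # info | warning | critical
                "source": {"label": "Example Evidence Service"},
                "links": [
                    {
                        "label": "Read the guideline summary",
                        "url": "https://evidence.example.org/guidelines/123",
                        "type": "absolute",
                    }
                ],
            }
        ]
    }
```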
This year at HIMSS17, I’ll be seeking out companies that produce healthbots, as well as those that provide APIs that embed business rules for exchanging data. APIs are a critical piece of the solution needed to wake up data stored in siloed repositories so that healthbots can reinvent the way we communicate with broader collections of health data.
It’s important to recognize that siloed repositories exist across the entire health information landscape, not just in EHR/EMR systems. Medical and life science research information is scattered across many public and private publishing and research organizations. Publishers and information services companies try to aggregate and synthesize results of new research, but there’s no single source for such a complex and constantly-changing set of data, especially considering the number of potential use cases for the data.
We need a solution that allows collaboration across the many sources of data to create learning systems in healthcare. Medical publishers that are opening up their data silos through APIs will definitely catch my attention at HIMSS17.