Searching for Healthbots That Advance Research and Clinical Information Discovery at HIMSS17
Tuesday, January 24, 2017 at 10:42AM
Janice McCallum in cds; himss; healthIT; healthbots; chatbots; interoperability

Improving the flow of information is a consistent motivator in everything I do in my professional life. With early experience as a researcher and product manager at a pre-Internet-era search engine, my consulting practice has focused on helping information-centric companies disseminate their content more effectively within the context of their business objectives. With this long view of the publishing and information dissemination segments in mind, I’d like to offer some observations and predictions for trends at HIMSS17.

1) Growth in usage of digital devices and sensors will be a catalyst for progress in interoperability. This is a safe prediction, but I include it because it sets the stage for my other predictions. With the proliferation of devices and sensors, all of which produce data, we need some rationalization in the way the data are recorded and integrated for analysis. It won’t be sufficient for IT systems to manage each device and its data separately. The devices will have to interact with other devices and with EHRs & other systems. Another way to state this prediction: as the Internet of Things (IoT) develops into the Industrial IoT, transmitting data in multiple directions in real time for clinical and research purposes will be commonplace. Through consolidation among startups and the entry of big players, more resources will be devoted to resolving health data interoperability issues.

To make sense of all the data being produced by sensors and other devices, we will need information systems that can interpret the data in context (i.e., AI/cognitive computing systems that incorporate machine learning techniques). That brings me to my second prediction:

2) Healthbots will increasingly become the new interface to health information & health data for patients. Chatbots have evolved from simple voice recognition technologies to cognitive computing interfaces that can execute complex commands and improve their utility over time with machine learning. I expect success in the consumer space via Apple’s Siri, Google Assistant, Amazon’s Alexa and other examples to carry over to the patient engagement and patient education space quite rapidly, although a secure channel will be required for healthbots, whether the bot uses a voice, haptic, or typing interface. Telemedicine services represent an obvious segment where chatbot interfaces are already in place.

Applications in clinical decision support for professionals will emerge in areas where the knowledge base is complex but mostly confined to similar datasets (e.g., EHRs and medication reconciliation use cases). Chatbots make sense, too, in areas where hands-free use is important. But overall, adoption within clinical enterprises will be hampered by data access issues and will take longer to reach wide acceptance.

See http://www.pharmavoice.com/article/2016-11-health-bots/ for an excellent round-up of opinions from industry leaders on the future of chatbots/healthbots.

My final prediction is a cross-industry trend that will improve information flows in general, but I address it here in context of clinical decision support (CDS).

3) Information discovery will no longer require an active search. Search will still exist, but it will exist primarily in the background. In fact, Susannah Fox, former CTO at HHS, once called search “wallpaper technology, something we don’t even see anymore, yet it’s clearly an activity worth discussion.”[1] [This prescient quote comes from a post written in early 2010; Susannah is one of the best prognosticators in health care, after all!]

This shift from blank search boxes and look-up tables toward surfacing relevant information based on context, prior behavior, collaborative filtering algorithms & other patterns has been occurring for some time. One example that bridges the search and discovery paradigms is Google’s inclusion of knowledge graph items, displayed at the top of search results when one enters a disease name such as ‘diabetes’ in the search box.
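To make the collaborative filtering idea concrete, here is a minimal sketch of item-based collaborative filtering over a toy set of reading histories. All article names and users are hypothetical; production systems use far larger matrices and more sophisticated similarity measures.

```python
from collections import defaultdict
import math

# Hypothetical reading histories: which users read which articles.
reads = {
    "alice": {"diabetes-care", "insulin-pumps", "cgm-review"},
    "bob":   {"diabetes-care", "insulin-pumps"},
    "carol": {"cgm-review", "telehealth-trends"},
}

def item_similarity(a, b):
    """Cosine similarity between two articles, based on shared readers."""
    readers_a = {u for u, items in reads.items() if a in items}
    readers_b = {u for u, items in reads.items() if b in items}
    if not readers_a or not readers_b:
        return 0.0
    return len(readers_a & readers_b) / math.sqrt(len(readers_a) * len(readers_b))

def recommend(user, top_n=2):
    """Score unread articles by their similarity to what the user has read."""
    seen = reads[user]
    catalog = {i for items in reads.values() for i in items}
    scores = defaultdict(float)
    for candidate in catalog - seen:
        for item in seen:
            scores[candidate] += item_similarity(candidate, item)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("bob"))  # surfaces articles read by users with similar histories
```

Because the scores are computed from other readers’ behavior rather than from an explicit query, relevant items can be surfaced without the user ever typing into a search box.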

Another example is the TrendMD model[2], which appends personalized contextual links to the article someone is reading. The links can be sourced from any of the 3,000+ sources within the TrendMD network of scholarly publishers and professional news sites, which allows related information from other fields or specialties to be surfaced, but offers the assurance that links won’t be sourced from unwanted advertising sites. Like machine learning-enhanced healthbots, the quality of the related links improves over time with increased usage as the algorithms learn about an individual’s preferences and gain knowledge from the broader community of users.

Closer to home for the HIMSS audience, CDS Hooks, an open standard within the SMART on FHIR framework[3], will advance the clinical decision support goal of delivering the right information to the right person at the right time in the right format within the right channel. However, as described above, cognitive computing and machine learning technologies can take this type of information alert to the next level and act on the data that are surfaced. It will take time for executable CDS to become widespread; mistakes are too costly, and clear rules for executing clinical orders aren’t yet sufficiently established to create workflows that are generally acceptable.
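For readers unfamiliar with CDS Hooks: when the EHR fires a hook (such as opening a patient chart or drafting a prescription), the CDS service responds with a list of “cards” that the EHR renders inline. Below is a minimal sketch of building such a card response; the field names follow the CDS Hooks specification, while the helper function, drug names, and source label are hypothetical.

```python
import json

def make_cds_response(summary, detail, indicator="warning",
                      source_label="Drug Interaction KB"):
    """Build a minimal CDS Hooks card response (a hypothetical helper;
    the 'cards' structure and field names come from the CDS Hooks spec)."""
    if indicator not in {"info", "warning", "critical"}:
        raise ValueError("indicator must be info, warning, or critical")
    return {
        "cards": [{
            "summary": summary,      # short headline the EHR displays
            "detail": detail,        # optional longer body text
            "indicator": indicator,  # urgency: info | warning | critical
            "source": {"label": source_label},
        }]
    }

# Hypothetical alert for a medication-prescribe hook:
resp = make_cds_response(
    "Possible interaction: warfarin + aspirin",
    "Concurrent use may increase bleeding risk; consider an alternative.",
)
print(json.dumps(resp, indent=2))
```

The card model illustrates why CDS Hooks stops short of executable CDS: the service surfaces information and suggestions, but the clinician still decides whether to act.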

At this point, I’d like to insert a cautionary note about the importance of privacy and transparency in CDS and healthbot systems. Bots are becoming popular and can be rather addictive when they learn from large numbers of information sources and deliver personalized results. For example, it’s great when Google Maps redirects us around traffic accidents in near real time, but the downside of a mistake there, say, a 10-minute detour, doesn’t compare to the downside of an incorrect dosage or incorrect rehab instructions. We don’t want to become too dependent on bots without understanding how they calculate outputs and maintain the privacy of the individuals using them.

At HIMSS17, I look forward to reporting on notable advancements in these three areas and to putting the whole HIMSS experience in context of improved information flows and decision support for clinicians, researchers, and patients. 

See also this recent Firetalk video chat on chatbots and healthbots with Chuck Webster, MD and me: https://firetalk.com/events/rJ7WnvcUg  

[1] http://susannahfox.com/2010/01/31/whats-the-point-of-health-2-0/.

[2] TrendMD is a current client.

[3] See my blog on this topic: http://www.healthcontentadvisors.com/blog/2016/9/30/health-it-infrastructure-enables-clinical-decision-support-w.html

Article originally appeared on Health Content Advisors (http://www.healthcontentadvisors.com/).