Physicians at breaking point

Unsurprisingly, physicians who are constantly under peak pressure report the highest rates of burnout, averaging 45.8%. The source states that emergency physicians report a whopping 60% burnout rate. I also recently saw an unverified tweet about life expectancy in this specialty, and the news just gets worse: it claimed almost 20 years less than other specialties. I am unsure whether the gap is really that large; however, if you have ever ventured into an emergency department, you can see for yourself why it might be true.


NCQA Digital Summit workshop - streamlining HEDIS reporting with NLP

Fractured Fairy Tale - the Price of Quality

Recently, an esteemed colleague pointed me to an eye-opening research article while we were on the subject of quality measures and the expenses they incur in the digital age: "US Physician Practices Spend More Than $15.4 Billion Annually To Report Quality Measures".

This article was published in 2016; however, I am willing to wager that this annual expense has not gone down in the past couple of years. The expense is accrued not only by healthcare organizations (HCOs) but by the insurance companies (payers) as well, all in the name of trying to make our population healthier.

We know that time equates to money in the workforce. How much time does this reporting take on the clinical side, on top of the duties required for patient care? No wonder we are facing a clinician burnout epidemic. Medscape’s 2019 report found that 44% of physicians described themselves as burned out. And that report covered only physicians; according to statistics on nursing.org, nurses reported a burnout rate of just 15.6%. That almost sounds like a relief, until you learn that 41% of nurses reported feeling “unengaged”. How frightening would it be to be under the care of a nurse who was “checked out” of his or her job? Burned out and checked out. How do we achieve better outcomes this way? And how do we know payers are obtaining the correct information from the clinical staff?


Patient safety is an issue that healthcare organizations (HCOs) must prioritize – but how can they improve efficiency when it comes to reviewing the 80% of relevant patient information that is locked in unstructured data?

Under pressure to provide value-based care and adhere to quality measures, HCOs are increasingly turning to AI-based technologies such as Natural Language Processing (NLP), which makes unstructured data usable – thereby improving the efficiency of quality initiatives, quality measure reporting and, most importantly, patient safety.

Addressing the healthcare safety and quality challenge with NLP

While the U.S. health system has made progress in recent years, patient safety continues to be a challenge that all HCOs must prioritize. An estimated 1.7 million healthcare-associated infections occur each year in the U.S., leading to 99,000 deaths. Moreover, adverse medication events cause more than 770,000 injuries and deaths each year, at a cost as high as $5.6 billion annually, according to statistics cited by the Center for Patient Safety.

NLP workflows can help reduce the likelihood of error and improve patient safety by automating the identification and extraction of key concepts from large volumes of clinical documentation. Findings are transformed into structured data to simplify chart review and speed the identification of high-risk patients.
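To make the workflow concrete, here is a deliberately minimal, hypothetical sketch of that pipeline shape in Python. Real clinical NLP engines (such as Linguamatics) rely on ontologies and linguistic parsing rather than the toy regex patterns below; the pattern names, note text, and output fields are all illustrative assumptions. The point is the transformation itself: free-text notes in, structured finding rows out, ready for chart review or risk flagging.

```python
import re

# Hypothetical concept patterns -- a real engine would use ontology-backed
# terminologies, not regexes. These stand in for "key concepts" to extract.
ADVERSE_EVENT_PATTERNS = {
    "fall": re.compile(r"\b(fell|fall(en)?)\b", re.IGNORECASE),
    "sepsis": re.compile(r"\bsepsis\b", re.IGNORECASE),
    "adverse_drug_event": re.compile(r"\b(rash|anaphylaxis)\s+after\s+(\w+)", re.IGNORECASE),
}

def extract_concepts(note_id: str, text: str) -> list[dict]:
    """Turn one unstructured note into structured finding rows."""
    rows = []
    for concept, pattern in ADVERSE_EVENT_PATTERNS.items():
        for match in pattern.finditer(text):
            rows.append({
                "note_id": note_id,       # links the finding back to the chart
                "concept": concept,       # normalized concept label
                "evidence": match.group(0),  # the text span that triggered it
                "offset": match.start(),  # position, for reviewer navigation
            })
    return rows

# Illustrative note text (not real patient data).
note = "Pt fell in bathroom overnight. Developed rash after amoxicillin."
for row in extract_concepts("note-001", note):
    print(row)
```

Even at this toy scale, the output rows are queryable: a reviewer can filter thousands of notes to just those flagging a fall or a suspected adverse drug event, instead of reading every chart.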

Healthcare provider and payer NLP use cases

Multiple HCOs are now utilizing Linguamatics NLP in a wide variety of use cases to advance patient safety, streamline operations, and improve quality of care and reporting initiatives – with quantifiable ROI results.


Data is now a vital part of the healthcare industry, and structured data alone will not be sufficient for better member outcomes. However, much of the available data remains inaccessible to analysts and experts, locked away as unstructured text within electronic health records (EHRs). Natural Language Processing (NLP) technology can unearth the roughly 80% of information stored in the EHR in unstructured form.

NLP provides deeper, vital insights into members

NLP can help bring valuable missing pieces of information to the surface, enabling organizations to assess risk and monitor a patient’s health more closely. Disease severity is one such area: cancer stage, for example, is critical for proper member management, but it is not always captured consistently in structured claims, even though it appears in the unstructured data. Claims data provides a great deal of structured detail about an individual, such as their medications, the diseases they suffer from, and the procedures and treatments they have had, but that is only part of the full member picture.
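The cancer-stage example above can be sketched in a few lines. This is a hypothetical illustration, not a real payer integration: the member record fields and the stage-matching regex are assumptions, standing in for a claims feed and an NLP extraction step respectively.

```python
import re

# Illustrative structured claims record -- field names are assumptions.
claims_record = {
    "member_id": "M123",
    "diagnoses": ["C50.911"],     # ICD-10 code present in claims
    "medications": ["tamoxifen"],
    "cancer_stage": None,         # severity detail missing from claims
}

# Illustrative unstructured note text (not real patient data).
note_text = "Oncology follow-up: invasive ductal carcinoma, stage IIB, on tamoxifen."

# A toy stand-in for NLP extraction: find a TNM-style stage mention.
stage_match = re.search(r"\bstage\s+(0|I{1,3}V?)([ABC])?\b", note_text)
if stage_match:
    # Enrich the structured record with the value found in free text.
    claims_record["cancer_stage"] = stage_match.group(0).split()[-1]

print(claims_record["cancer_stage"])
```

The enriched record now carries the severity detail that claims alone omitted, which is exactly the "fuller member picture" the unstructured data provides.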


NLP for FAIRification of unstructured data

Data is the lifeblood of all research, and pharmaceutical research is no exception. Clean, accurate, integrated data sets promise to provide the substrate for artificial intelligence (AI) and machine learning (ML) models that can assist with better drug discovery and development. Using data in the most effective and efficient way is critical, and improving scientific data management and stewardship through the FAIR (findable, accessible, interoperable, reusable) principles will improve pharma efficiency and effectiveness.

What is FAIR and how can NLP contribute?

The FAIR principles were first proposed in 2016, and the initial paper triggered not just discussion (see the recent Pistoia review paper) but, in many organizations, action.

We are seeing applications of natural language processing (NLP) in FAIRification of unstructured data. Around 80% of information needed for pharma decision-making is in unstructured formats, from scientific literature to safety reports, clinical trial protocols to regulatory dossiers and more.

NLP can contribute to the FAIRification of these data in a number of ways. NLP enables effective use of ontologies, a key component in data interoperability. Ensuring that life science entities are referred to consistently across different data sources enables those data to be integrated and made accessible for machine operations. Using a combination of strategies (e.g., ontologies and linguistic rules), NLP can transform unstructured data into formal representations, whether individual concept identifiers or relationships expressed as RDF triples.
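The two steps named above, normalizing entity mentions to ontology identifiers and emitting formal relationships, can be sketched as follows. This is a minimal illustration: the mention-to-identifier table and the `treats` relation URI are assumptions, though the ChEBI and MeSH identifiers shown are real ontology entries (CHEBI:15365 is acetylsalicylic acid; MeSH D006261 is headache). A production pipeline would use an RDF library and full ontology services rather than hand-built strings.

```python
# Hypothetical mention -> ontology-identifier map. In practice this lookup
# is driven by large ontologies, not a hand-written dictionary.
ONTOLOGY_MAP = {
    "aspirin": "http://purl.obolibrary.org/obo/CHEBI_15365",
    "acetylsalicylic acid": "http://purl.obolibrary.org/obo/CHEBI_15365",
    "headache": "http://purl.bioontology.org/ontology/MESH/D006261",
}

def to_ntriples(subject_mention: str, predicate_uri: str, object_mention: str) -> str:
    """Normalize two entity mentions to ontology URIs and emit one N-Triples line."""
    s = ONTOLOGY_MAP[subject_mention.lower()]
    o = ONTOLOGY_MAP[object_mention.lower()]
    return f"<{s}> <{predicate_uri}> <{o}> ."

# Hypothetical relation URI for the extracted relationship.
treats = "http://example.org/relation/treats"

# Two surface forms of the same drug resolve to the same identifier,
# which is what makes the resulting triples interoperable.
print(to_ntriples("Aspirin", treats, "headache"))
print(to_ntriples("acetylsalicylic acid", treats, "headache"))
```

Because both surface forms map to the same ChEBI URI, the two sentences yield identical triples, so downstream systems can integrate statements from different documents without string matching.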