
What could ChatGPT mean for Medical Affairs?


The remarkable adoption of ChatGPT since its launch, and the subsequent responses from Microsoft, Google and others, have unleashed a wave of excitement and interest in these potentially game-changing technologies. Unlike many technology innovations we’ve seen in the past, this is something anyone, in any walk of life, whether in a personal or professional context, can start using immediately. What does this mean for Medical Affairs? We don’t yet know how this story plays out. But our team is already using this technology and other large language model (LLM)-based NLP methods in our work, so we are starting to see the potential practical applications.

ChatGPT for quick answers and content generation

For example, in advance of a drug launch, medical affairs teams need to collate and interrogate everything from the latest research and congress activity to clinical information about the indications and the perspectives of patients and other stakeholders. Beyond this, they need to communicate key messages and information relating to the indication for which they are launching a product. Tools like ChatGPT take the advances seen with models like GPT-3 and put that power into everyone’s hands. Medical affairs professionals can get quick answers about existing treatments, important factors relating to a therapy area, and far more. As users deepen their questioning within the chat conversation, they can probe detailed topics such as potential barriers to prescribing, patient challenges, a deeper understanding of modes of action, and more.

This immersive experience can unlock new ideas and allow a more comprehensive brief to be created. Some of the content generated by ChatGPT may be ready to use within digital content, some forms of external communication or internal presentations. In other cases the outputs may only be a useful starting point, and may even contain errors and issues, but can nevertheless help experts to frame and create their own materials or refine existing ones. Once the right content and materials are created, ChatGPT can then help to re-word or summarize them for different audiences, which is critical for effective dissemination of actionable and impactful information. For example: “ChatGPT, re-write this paragraph with simpler language and focus on what readers need to do differently”. Some content, such as Medical Information, does require 100% accuracy, and ChatGPT may not be suitable for creating it, or at least not without multiple layers of review and validation.

In creating compelling content, or answering questions in a meaningful and actionable way, GPT changes the dynamic for Pharma. Contrary to the hype, that change is not the removal of the human from the process - far from it. With such rich but vast data available to ChatGPT, the role of the user moves from providing data and conducting analysis to crafting the right question, then evaluating and editing the outputs until they are usable and actionable. ChatGPT does not rely only on its own smarts to generate output; it depends heavily on the details of the questions it is given. This 21st-century craft of ‘prompting’ is critical to getting the best results: the more detail and nuance you provide, the more accurate and usable the results. For example, “Tell me the top challenges with Diabetes” is much less likely to get the result you want, whereas “What are the top 10 medical challenges that patients experience with Type-2 Diabetes?” is likely to generate results much closer to what you, as the user, need. Once the right question is asked, and asked well, the rich outputs are generated and the user’s second critical role begins - evaluating the quality, risk and validity of the outputs. Outputs and solutions that skip this crucial step are doomed to fail.
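To make the vague-versus-specific contrast concrete, here is a minimal sketch in Python of how a prompt gains precision as details are layered in. The `build_prompt` helper and its parameter names are purely illustrative assumptions for this example, not part of ChatGPT or any product; the point is simply that perspective, condition subtype and a requested count each narrow the question.

```python
def build_prompt(topic: str, *, perspective: str = None,
                 condition: str = None, count: int = None) -> str:
    """Assemble a question for ChatGPT, adding specifics where available.

    Illustrative only: each optional detail (who is affected, which
    condition subtype, how many items) makes the prompt more targeted.
    """
    parts = ["What are the top"]
    if count:
        parts.append(str(count))
    parts.append(f"{topic} challenges")
    if perspective:
        parts.append(f"that {perspective} experience")
    parts.append(f"with {condition or 'Diabetes'}?")
    return " ".join(parts)

# Vague prompt: likely to return generic, less usable output
vague = build_prompt("medical")

# Specific prompt: perspective, subtype and count all narrow the answer
specific = build_prompt("medical", perspective="patients",
                        condition="Type-2 Diabetes", count=10)
```

The same principle applies whether prompts are typed into the chat window or built programmatically: the specific version tells the model who the answer is for and how it should be scoped, which is what converges the output toward something usable.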

What about compliance? 

Stepping back, it is also important to remember that the processes governing scientific rigour, legal and regulatory compliance, and corporate guidelines all remain in place, and AI-produced output is certainly not exempt from any of them. When it comes to commercial and privacy concerns, where the data goes is also key. Here we must remember that there are far more options than the cloud-hosted ChatGPT platform: myriad models can be deployed and run on corporate or private infrastructure, providing a way to work locally and safely, reduce risk and avoid uploading data to external systems - ideal for proprietary data like MSL notes.

Overall, these new approaches can energize professionals, act as efficiency drivers, unlock new insights, or serve as thought-starters and draft content creators. The richness and human voice of large language models (LLMs) are complementary to the precision and flexibility of supervised machine learning techniques. When you add the transparency and rigour of rules-based linguistics into the mix, we as an industry now have a toolkit that changes the game. Technology can truly empower medical affairs teams to work with the rich and vast data that has been building in volume through the digital revolution. Big data was always about potential, but now we finally have the tools and techniques to unlock that potential. This is an amazing time to accelerate improvements in patient outcomes and care through new ways of working.

To learn more or discuss your challenges, get in touch today 

