The AI in Healthcare Executive Forum was an overwhelming success, drawing a wide range of attendees – from students to C-level executives – who networked and contributed to the discussions. People came to hear what others are doing in the field and to learn from concrete scientific examples.
A Quick Panel Overview
Kimberly Powell, Nvidia VP of Healthcare and AI Business Development, kicked off the event with a fantastic overview of AI applications, such as Project Clara, highlighting the research fields and challenges Nvidia is trying to solve in collaboration with groups spanning basic research through patient care. Particularly important was her point that we need supercomputing closer to the instruments, patients, radiologists, and doctors. We also need to rethink the way we design systems that generate data, e.g. imaging devices: a small upgrade to a camera or sensor can produce 10 times more data that needs to be processed, moved, and analyzed. For more on this topic, see the great article by Dr. James Cuff, “Augmenting Pathology Labs with Big Data and Machine Learning.”
During the first panel discussion, Mark Michalski, Executive Director of the MGH & BWH Center for Clinical Data Science, Dan Mulcahy, VP of Product Strategy at the Center for Applied Data Science at Optum Labs, and Prashant Natarajan Iyer, HIMSS Chairperson and Sr. Director, AI Applications Healthcare and Life Sciences at H2O.ai, discussed the successes and challenges of artificial intelligence and machine learning in clinical deployments. The panelists led the audience through a fascinating discussion covering everything from machine learning strategies and validation to the FDA’s role in the data science space.
I was pleasantly surprised by the interest in clinical applications of AI – a very different world from early drug discovery, and one that must deal with the complexities and challenges of data privacy, e.g. GDPR. Things are becoming complex, as the true power of AI can only be realized by combining data from multiple sources. The 100,000 Genomes Project is a good example: whole genome sequencing data is combined with family history and medical records across the UK. However, privacy concerns limit data access to within the UK, as is the case in many countries around the world.
Our second panel focused on artificial intelligence in drug development. I joined Ankit Gupta, Co-Founder at Reverie Labs, Dr. Christopher Bouton, Founder at Vyasa, and Dr. Alex Zhavoronkov, Founder and CEO at Insilico Medicine, to discuss how deep learning is reducing the time and cost required to develop drugs. Alex broke it down: it normally takes 10-15 years to develop a drug; with AI, this could be reduced to five to seven years. A great drug discovery article by NIBR CEO Jay Bradner was referenced during the discussion. The panel then drilled down on best practices for defining an AI strategy, such as determining where AI can give the business a competitive edge, what data strategy makes sense, and how to crawl through data and gain insight at scale.
Common Themes and Challenges for AI in Healthcare
Although we have diverse backgrounds, industries, and problems to solve, a number of common themes and challenges emerged. Artificial intelligence, machine learning, and deep learning are tools that give scientists the time to take a step back and look at the bigger picture around the problems they are trying to solve. How so? With deep learning and AI, mundane and time-consuming work can be done faster, and in many cases more accurately, than a human can do it.
Data scientists are spending too much time solving issues around storage growth, data management, and application builds, leaving less time to tackle data science problems. An important theme that emerged was that data scientists and subject matter experts are the most valuable resources, so their productivity is key. For example, data preparation is critical and time-consuming, and there are no shortcuts. In the last couple of years, NIBR, Johnson & Johnson, and Merck have stated that 80-90% of their data scientists’ time is spent cleaning and curating data sets, and only 10-20% is spent analyzing the data. Investing in data librarians was recommended as a way to build the skills required to curate and manage data.
Metadata and metadata structure are key components of any data strategy and, in turn, any AI strategy. Hundreds of tools exist to query, transform, and ingest metadata, but ultimately the right strategic and tactical approach always comes back to the questions you want to ask of the data. Many companies see AI as a competitive advantage, but according to the panel, the competitive edge derives from the scientific and company expertise you leverage by applying AI. A number of off-the-shelf models are great for getting started in the field, but the development of proprietary tools and algorithms is what gives you the edge.
Collaboration, coordination, data sharing, infrastructure, and massive data growth are still incredibly difficult problems to solve, especially while carrying out research and discovery in such a rapidly evolving field. The volumes of data required for solving deep learning problems are orders of magnitude larger than those previously seen in the machine learning space. In just the last five years, we have seen the progression from analyzing gigabytes of 2D High Content Screening image data per day to multi-terabytes of 4D streaming data per hour.
We are creating petabytes of data a year that need to be moved, processed, and analyzed in near real time. The transportation, sharing, and long-term management of data at this scale is a real challenge, so it’s important that groups have a clear data strategy before going all in on digital and AI strategies. Pathology labs are a great example: they typically have tens of thousands of physical slides that need to be scanned, annotated, and indexed. In addition to the physical effort involved in digitizing slides, systems must be designed and integrated to handle the vast amounts of storage, networking, and computing infrastructure required to make a digital strategy a reality.
An important forum discussion focused on how to avoid potential friction when introducing new strategies around digital transformation and AI and how to bring people along. A few key takeaways:
Clearly define what you expect to achieve – a clear strategy;
Show why it’s important for the organization as a whole;
Create a culture of inclusion and trust;
Start with small, quick wins to show value;
Bring together diverse teams, ideas and skill sets;
Show it’s a strategic direction and not a fad;
Don’t get hung up on technology, focus on the data and science.
We continue to accelerate scientific discovery and apply it to every step of the drug discovery pipeline, from basic research through to patient care. If you are starting on this journey, you can streamline it in several ways. Don’t skimp on basic infrastructure and support – they help you become operational faster. Service organizations, platforms, infrastructure, and connectivity can really accelerate your AI journey. For example, Nvidia has an ecosystem of systems and AI tools; Vyasa and Insilico Medicine are developing deep learning platforms for biopharmaceutical applications; and Markley provides the environments to build, store, connect, and analyze data at scale. Most importantly, leverage your peers for networking, collaboration, and the sharing of ideas, as we saw at this AI forum.