Digital health companies and investors flocked to Nashville, Tennessee, this week for the second annual ViVE conference. Hot topics among the crowd of 7,500 included worries about being “creepy,” the economy and whether ChatGPT lives up to its hype. Here are five takeaways.
1. ChatGPT is all people want to talk about.
The potential of generative AI applications like ChatGPT and GPT-4 in healthcare was the dominant topic of conversation. Attendees fell into two schools of thought, a divide summed up nicely by Micky Tripathi, head of the Office of the National Coordinator for Health Information Technology.
“I think all of us feel tremendous excitement and you also want to feel tremendous fear,” Tripathi said.
Michael Hasselberg, chief digital health officer at University of Rochester Medical Center in New York, said he is a believer in the power of generative AI. The large language models from ChatGPT developer OpenAI, he said, are “light years ahead” of what he has seen from the various startups automating healthcare administrative and revenue cycle processes.
“It’s so easy to use. I have no formal training as a computer scientist and I can program with OpenAI,” Hasselberg said. “I can spin up an application pretty quickly that can solve a lot of the problems that are sitting on top of my workforce right now. Before I was looking at all of these companies to try to solve [those problems] for me.”
Both Hasselberg and Steve Kraus, a partner at Bessemer Venture Partners, said AI models have significant potential to automate processes like prior authorization.
But there was also a fair share of skepticism, and even fear, of generative AI models. Tripathi said that, when used inappropriately, algorithms can perpetuate problems with health equity and care quality. Tom Cassels, CEO of the advisory and consulting arm of Rock Health, agreed.
“What’s dangerous are [digital health companies] who touch clinical practice and are thinking about bringing in big data models by using existing biased medical guidelines and information,” Cassels said.