Doctors should not get caught up in the hype around artificial intelligence, because the technology is still too limited to handle many significant tasks in health care, said Ben Greenberg, vice president of product design at WebMD.
Greenberg, speaking at the 2018 VOICE Summit at the New Jersey Institute of Technology in Newark on Wednesday, said that the health care industry needs to wait until the tech industry fully considers the ethics behind AI.
For example, he said, if a self-driving car is traveling on a winding road along a steep cliff and innocent people suddenly appear in its path with no time to brake, does the AI controlling the car save the passengers by hitting the people, or save the people by swerving off the cliff and killing the passengers?
Greenberg told the audience that AI will eventually change the face of health care, but for now, “doctors don’t have to worry about robots replacing them.”
Instead, he said, AI can help doctors work more efficiently by letting them ask a voice-based system for dosage recommendations, check drug interactions, and quickly add notes to electronic health records.
“Calculators did not eliminate the need for mathematicians, they just freed them up to perform higher tasks,” Greenberg said.
He also warned against allowing pharmaceutical companies to use voice technology to deliver clinical information to doctors, because that information will likely be biased. “Can you create value and money and meaning for doctors using voice and AI technology? Right now advertising is prohibited on Alexa except for podcasts and other exceptions, but that might not be the case in the future,” he said. If a physician asks a voice speaker whether drug A interacts with drug B, physicians don’t want to hear that the answer “was brought to them by drug company A.”
Finally, Greenberg warned the medical community not to get caught up in buzzwords such as “artificial intelligence” without having a specific use for the technology.
“When you start as an industry to tell people that they need to develop Alexa skills right away, that’s dangerous,” said Greenberg. “If a company develops an Alexa skill without filling a need, it will have a scale problem.”