Friday, June 3, 2022 / Blog post
Image by mikemacmarketing / CC BY 2.0
AI is increasingly present in hospital environments: applications for the automatic segmentation of medical images, and algorithms that predict pathologies, hospital readmission or mortality, are just some examples. However, these models are prone to failure if they are not periodically reviewed. It is similar to what we would do with a car after many kilometres of use: an overhaul or, more precisely, quality control. Hospitals already carry out quality control and periodic reviews of the many technological devices they house, so why not do the same with AI applications?
This is the premise put forward by a group of American scientists in an article published in the prestigious journal Nature. They also shared their thoughts on Twitter: “Unlike drugs and medical devices, artificial intelligence needs to be continuously recalibrated because of dataset shifts and calibration drifts. This is why AI in healthcare should not be patented or sold.”
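To make the idea of calibration drift concrete, here is a minimal, self-contained sketch (our own illustration, not code from the article; all names and numbers are assumptions): a toy logistic risk model is well calibrated on the population it was trained on, drifts out of calibration when the patient population shifts, and is repaired by refitting only its intercept on recent data, a simple form of logistic recalibration.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def simulate(n, w, b_model, b_true):
    """Return (predicted_probability, outcome) pairs for n patients.
    The deployed model scores with intercept b_model; real outcomes follow
    b_true, so the model drifts out of calibration when b_true moves away."""
    pairs = []
    for _ in range(n):
        x = random.gauss(0.0, 1.0)          # a single risk feature
        p_model = sigmoid(w * x + b_model)  # what the deployed model predicts
        p_true = sigmoid(w * x + b_true)    # the actual (possibly shifted) risk
        y = 1 if random.random() < p_true else 0
        pairs.append((p_model, y))
    return pairs

def calibration_gap(pairs):
    """Calibration-in-the-large: mean predicted risk minus observed event rate.
    Near zero for a calibrated model; monitoring it flags drift."""
    mean_pred = sum(p for p, _ in pairs) / len(pairs)
    event_rate = sum(y for _, y in pairs) / len(pairs)
    return mean_pred - event_rate

def refit_intercept(pairs, steps=200, lr=0.5):
    """Logistic recalibration with the slope fixed at 1: gradient descent on
    the log-loss to find an intercept correction delta from recent data."""
    delta = 0.0
    for _ in range(steps):
        grad = sum(sigmoid(math.log(p / (1 - p)) + delta) - y
                   for p, y in pairs) / len(pairs)
        delta -= lr * grad
    return delta

def apply_correction(pairs, delta):
    """Shift every predicted probability by delta on the logit scale."""
    return [(sigmoid(math.log(p / (1 - p)) + delta), y) for p, y in pairs]
```

For example, if the event rate in the population rises (here, the true intercept moves from -1.0 to 0.0), the model starts to systematically underestimate risk; `calibration_gap` turns clearly negative, and `refit_intercept` on a batch of recent cases recovers the missing correction without retraining the whole model:

```python
shifted = simulate(5000, w=1.0, b_model=-1.0, b_true=0.0)  # population has shifted
delta = refit_intercept(shifted)                           # recalibrate on recent data
```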
We encourage you to read it.