The accuracy of any medical AI can only be as good as the quality of its training data. Developers must take particular care to identify and eliminate bias in their algorithms so that patients are not put at a disadvantage because of their race, age or gender. Keep reading to find out more about how movie tropes skew the view of clinical reality, a new prototype sensor scalpel for surgical training, health communication on antimicrobial resistance, and more. Enjoy reading!
Article • Need for diversity in training datasets
Artificial intelligence in healthcare: not always fair
Machine learning and AI are playing an increasingly important role in medicine and healthcare, and not just since ChatGPT. This is especially true in data-intensive specialties such as radiology, pathology or intensive care. The quality of ...