In September 2019, CNN published this article about an AI research study. The article buries the lede, seemingly intentionally: the spotlight is on the success of AI tools in diagnosis, whereas the study's central finding is that fewer than 1% of papers on AI tools follow robust reporting practices. Indeed, an expert quoted at the end of the article stresses that this is the study's real message.
Moreover, the article's cover image shows a robot arm shaking hands with a human, even though the study is about finding patterns in medical images. Such humanoid images give a misleading view of AI, as we’ve described here.
See for yourself where the article goes wrong. The contents of the article are reproduced in their entirety on the left, and our annotations are on the right. Link to the original article.
For more details about the pitfalls, read the blog post that introduces this annotated article here.
Copyright to the original article belongs to CNN, and the article can be read in its original form in full at cnn.com. It is republished here for the purposes of critical commentary.
We reused the code and the copyright notice above from Molly White et al.’s The Edited Latecomers’ Guide to Crypto to generate this annotated article.