An artist has discovered that private medical photos of her, taken nearly ten years ago, were included in an image set used to train artificial intelligence.

Have I Been Trained is a website that lets artists check whether their work has been used in an AI training image set. When the artist, who goes by Lapine, ran a reverse image search of her own face on the site, two of her private medical photos unexpectedly turned up.
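Under the hood, lookups like this are typically CLIP-based similarity searches against an index built over the dataset. For the curious, here is a minimal sketch using the open-source clip-retrieval client; the endpoint URL, index name, and file name are assumptions and may be out of date, so treat it as illustrative rather than a working recipe.

```python
# Illustrative sketch: how a "have I been trained?"-style lookup can work.
# It runs a nearest-neighbor search against a public LAION index via the
# open-source clip-retrieval client. Endpoint URL and index name are assumed.
from clip_retrieval.clip_client import ClipClient, Modality

client = ClipClient(
    url="https://knn.laion.ai/knn-service",  # assumed public kNN endpoint
    indice_name="laion5B-L-14",              # assumed LAION-5B index name
    modality=Modality.IMAGE,
    num_images=10,
)

# Search by image (a local path or URL), much like a reverse image search;
# text queries work the same way via client.query(text="...").
results = client.query(image="my_face.jpg")  # hypothetical local file
for match in results:
    print(match["url"], match["similarity"])
```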

Lapine said a doctor photographed her face as part of her medical records. The image she signed a consent form for, intended only for her doctor, was the one that ended up in the data set.

Chain of Custody

LAION-5B is supposed to contain only images that are publicly available on the internet, so you would think private photos of medical patients would be off limits. Apparently not.

The photos somehow made their way out of her doctor's files and onto the internet. Ars Technica found more potentially sensitive images of hospital patients in the data set as well.

LAION's web scrapers swept up those images automatically, and there is no telling what else they have collected.
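To see why, consider how indiscriminate this kind of collection is. The toy scraper below is a sketch, not LAION's actual pipeline (which reportedly filters data from Common Crawl): it grabs every image URL on a page with no way of knowing what the images depict or whether anyone consented to their use.

```python
# Illustrative only: a toy scraper that collects every <img> URL on a page.
# It has no notion of whether an image is public, private, or sensitive.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page_url = "https://example.org/"  # placeholder page
html = requests.get(page_url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

image_urls = [
    urljoin(page_url, img["src"])
    for img in soup.find_all("img")
    if img.get("src")
]
# The scraper cannot tell a stock photo from a leaked medical record.
print(image_urls)
```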

According to an Ars report, the most effective way to get an image removed from the database is to ask the website hosting it to take it down.

That process, however, requires handing personal information over to the very site in question.

Accountability may be difficult to pin down. Did the hospital or doctor err by failing to secure the photos, or is LAION's scraping simply too invasive? The answers aren't mutually exclusive.

It's bad enough that artists' works are being assimilated by artificial intelligence. Private medical photos ending up in front of an AI is another matter entirely, and one that should alarm everyone. If medical records aren't sacrosanct, what is?

