New Apple technology will warn parents and children about sexually explicit photos in Messages

Apple will soon release new tools that warn parents and children when a child sends or receives sexually explicit photos in the Messages app. The feature is one of several new technologies Apple is introducing to limit the spread of Child Sexual Abuse Material (CSAM) across its platforms and services. As part of the same effort, Apple will gain the ability to detect known CSAM images in photos uploaded to iCloud, using an approach it says is designed with user privacy in mind.

The new Messages feature is designed to empower parents to be more involved in helping their children navigate online communication. With a software update coming later this year, Messages will be able to use on-device machine learning to analyze image attachments and determine whether a shared photo is sexually explicit. Because all processing happens on the device, Apple does not read or gain access to children's private communications, and none of the information reaches Apple's cloud servers.

When a sensitive photo is detected in a message thread, the image will be blocked and a label reading "This may be sensitive" will appear below it, along with a link to view the photo. If the child chooses to view it, a second screen appears. It explains that sensitive photos and videos show private body parts, that it is not the child's fault, and that such images can be used to harm them. It also notes that the person in the photo or video may not want it to be seen and that it may have been shared without their knowledge. These warnings are intended to guide the child toward the right choice: not viewing the content.

If the child still clicks through to see the photo, they are shown one more screen. It informs them that their parents will be notified if they view the photo, reminds them that their parents want them to be safe, and encourages them to talk to someone they trust if they feel pressured. It also links to additional resources for help. The option to view the photo remains at the bottom of the screen, but it is not the default; the screen is designed so that the option not to view the image is highlighted.

These features could help protect children from sexual predators, not only by interrupting the communication and offering advice and resources, but also by alerting parents. Parents often don't realize that their child has begun talking to a predator online or by phone. Child predators can be highly manipulative, working to win the child's trust and then isolating them from their parents so the communication stays secret. In other cases, predators groom the parents as well. Apple's technology can help in both situations by intervening and flagging that explicit material is being shared.

A growing share of CSAM is what's known as self-generated material: imagery a child takes of themselves and then shares with a partner or with other children, commonly referred to as sexting or sharing nudes. Thorn, an organization that develops technology to combat the sexual exploitation and abuse of children, found that one in five girls aged 13 to 17 has shared nude images of herself, as has one in ten boys. Children may not understand how sharing this imagery puts them at risk of sexual abuse and exploitation.

The new Messages feature will offer similar protections in this scenario. If a child tries to send an explicit photo, they will be warned before it is sent.
If the child decides to send the photo anyway, their parents can receive a notification.

Apple says the new technology will arrive as part of a software update later this year for accounts set up as families in iCloud, on iOS 15, iPadOS 15, and macOS Monterey in the U.S.

An accompanying update to Siri and Search will add expanded guidance and resources to help parents and children stay safe online and get help in unsafe situations. Users will be able to ask Siri how to report CSAM or child exploitation. Siri and Search will also intervene when users search for queries related to CSAM, explaining that the topic is harmful and pointing to resources for getting help.
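For readers curious how such a flow might fit together, the sketch below is a purely hypothetical Swift illustration of the decision path the article describes: an on-device classifier examines the attachment, the photo is hidden behind a warning if it is judged sensitive, and the parents of a child account are notified only if the child chooses to proceed. All of the type and function names here (AttachmentVerdict, SensitiveImageClassifier, MessagesSafetyFlow) are invented for illustration and are not Apple's actual APIs or implementation.

```swift
import Foundation

// Hypothetical sketch only — these names are invented for illustration
// and are not Apple's APIs or implementation.

/// Result of analyzing an image attachment entirely on-device.
enum AttachmentVerdict {
    case safe
    case sensitive
}

/// Stand-in for the on-device machine learning classifier. In the feature the
/// article describes, no image data leaves the device or reaches Apple's servers.
protocol SensitiveImageClassifier {
    func classify(imageData: Data) -> AttachmentVerdict
}

/// Mirrors the user-facing flow described in the article: hide the photo behind a
/// "This may be sensitive" warning, show explanatory screens, and notify the
/// parents of a child account only if the child chooses to view the photo anyway.
struct MessagesSafetyFlow {
    let classifier: SensitiveImageClassifier
    let isChildAccount: Bool

    func handleIncoming(imageData: Data,
                        childChoosesToView: () -> Bool,
                        notifyParents: () -> Void) {
        // All analysis happens locally; only a safe/sensitive verdict is produced.
        guard classifier.classify(imageData: imageData) == .sensitive else {
            return // Safe photos display normally.
        }

        // Sensitive photo: blocked behind a warning label with a link to view it.
        print("Photo hidden: \"This may be sensitive.\"")

        // Warning screens explain the risks before the child can proceed.
        if childChoosesToView() {
            print("Child chose to view the photo.")
            if isChildAccount {
                // Parents are alerted only at this point, not before.
                notifyParents()
            }
        }
    }
}
```

The key design point the article emphasizes, that classification happens entirely on the device and nothing is sent to Apple's servers, is mirrored here by the classifier receiving the raw image data locally and returning only a verdict.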