August 6, 2021

Apple Inc. announced Thursday that it will implement a new system for checking iPhone images before they are uploaded to iCloud storage, to ensure that none match known images of child sexual abuse.

Apple stated that the service will convert images on the device into unreadable sequences of numbers, known as hashes. These will then be matched against a database maintained by the National Center for Missing and Exploited Children.

Apple's website notes that this is only one aspect of its new child safety initiative. First, new communication tools will allow parents to play a more informed part in helping their children navigate online communication. Apple will not be able to read private messages, but the Messages app will use on-device machine learning to warn about sensitive content.

Siri and Search in iOS will also play a role in fighting child abuse. They will give parents and children information, help in dangerous situations, and intervene when users try to search for abuse-related subjects.

Apple, which markets itself as a private and secure option for consumers, emphasized that these steps do not infringe upon privacy.

These features will be available in iOS 15, iPadOS 15, and macOS Monterey later this year.
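For readers curious how hash matching works in principle, here is a minimal sketch in Python. Note the assumptions: the hash database contents and the `KNOWN_HASHES` set below are hypothetical, and this example uses an ordinary cryptographic hash (SHA-256) for simplicity. Apple's actual system is reported to use a perceptual hash, so that visually similar images produce matching hashes, and to perform the comparison using cryptographic techniques that keep both the database and the match results unreadable on the device.

```python
import hashlib

# Hypothetical stand-in for a database of known-image hashes.
# (The real database is never distributed in readable form.)
KNOWN_HASHES = {
    "0f1e2d3c4b5a69788796a5b4c3d2e1f00f1e2d3c4b5a69788796a5b4c3d2e1f0",
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex fingerprint for an image's bytes.

    Illustrative only: SHA-256 changes completely if a single pixel
    changes, whereas a perceptual hash is designed to stay stable
    across minor edits like resizing or recompression.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known(image_bytes: bytes) -> bool:
    """Check whether an image's hash appears in the known-hash set."""
    return image_hash(image_bytes) in KNOWN_HASHES
```

The key property, which this sketch shares with the real system, is that only fingerprints are compared, never the images themselves: the checking side learns whether a hash is in the set, not what the photo contains.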