Apple CSAM FAQ addresses misconceptions and concerns about photo scanning
Ben Lovejoy
- Aug. 9th 2021 4:24 am PT
@benlovejoy
Apple has responded to misconceptions and concerns about its photo scanning announcements by publishing a CSAM FAQ – answering frequently asked questions about the features.
While child safety organizations welcomed Apple’s plans to help detect possession of child sexual abuse materials (CSAM), and to protect children from predators, there has been a mix of informed and uninformed criticism …
Background
Mainstream media confusion arose when Apple simultaneously announced three separate measures, with many non-technical people confusing the first two:
1. iMessage explicit photo warnings for children in iCloud Family groups
2. Detection of known CSAM photos by scanning for digital fingerprints
3. Responding to Siri and search requests for CSAM materials with a warning and links to help
There has also been a lack of understanding about the methods Apple is using. In the case of iMessage, it is using on-device AI to detect images that appear to be nudes; in the case of CSAM, it is comparing digital fingerprints with fingerprints generated from a user’s stored photos.
In neither case does anyone at Apple see any of the photos themselves. The sole exception is an account flagged for multiple CSAM matches, at which point someone at Apple will manually check low-resolution copies to ensure they are true matches before law enforcement is informed.
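The matching logic described above can be illustrated with a minimal sketch. To be clear, this is not Apple's implementation: the real system uses a perceptual hash (NeuralHash) and private set intersection so that matches are not visible on-device, and Apple has not published its exact flagging threshold. The `hash_photo` stand-in and `FLAG_THRESHOLD` value below are assumptions for illustration only.

```python
# Conceptual sketch of fingerprint matching with a review threshold.
# NOT Apple's system: the real pipeline uses NeuralHash (a perceptual
# hash robust to resizing/re-encoding) plus private set intersection.
from hashlib import sha256

FLAG_THRESHOLD = 3  # hypothetical; Apple has not disclosed its threshold


def hash_photo(photo_bytes: bytes) -> str:
    """Stand-in fingerprint function. sha256 only matches exact bytes,
    unlike a perceptual hash, but shows the compare-fingerprints idea."""
    return sha256(photo_bytes).hexdigest()


def count_matches(user_photos: list[bytes], known_fingerprints: set[str]) -> int:
    """Count how many of a user's photos match the known-CSAM database."""
    return sum(hash_photo(p) in known_fingerprints for p in user_photos)


def should_flag_for_review(user_photos: list[bytes],
                           known_fingerprints: set[str]) -> bool:
    """An account is only surfaced for human review past a match threshold,
    which is why a single false positive cannot trigger a report."""
    return count_matches(user_photos, known_fingerprints) >= FLAG_THRESHOLD
```

The threshold is the key design point: no single match, and no photo content, ever reaches a human reviewer – only an account exceeding the threshold does.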
There has also been confusion between privacy and misuse risks with the features as they stand today (which are nil to exceedingly low) and the potential for abuse by authoritarian governments at a future date. Cybersecurity experts have been warning about the latter, not the former.
Apple has already attempted to address the repressive-government concern by launching only in the US for now, and by stating that expansion would be on a country-by-country basis, factoring in the legislative environment in each. The FAQ now attempts to address this and other issues.
Apple CSAM FAQ
Apple has published a six-page FAQ designed to address some of the concerns that have been raised. It begins by acknowledging the mixed response.
The company then underlines that the first two features are entirely separate.
Other points stressed in the FAQ include:
iMessages to and from children are never shared with law enforcement
iMessages remain end-to-end encrypted
Children with abusive parents can safely seek help via iMessage if using only text
Parents are only notified if children aged 12 and under proceed despite a warning
CSAM fingerprint matches are manually reviewed before law enforcement is informed
The trickiest issue remains
The biggest concern raised by the EFF and others remains. While the system today only flags CSAM images, a repressive government could supply Apple with a fingerprint database that contains other materials, such as the famous Tank Man photo taken in Tiananmen Square, which is censored in China.
Apple has responded to this by saying it would not allow it.
That statement is, however, predicated on Apple having the legal freedom to refuse. In China, for example, Apple has been legally required to remove VPN, news, and other apps, and to store the iCloud data of Chinese citizens on a server owned by a government-controlled company.
There is no realistic way for Apple to promise that it will not comply with future requirements to process government-supplied databases of “CSAM images” that also include matches for materials used by critics and protestors. As the company has often said when defending its actions in countries like China, Apple complies with the law in each of the countries in which it operates.