Decisions about which companies received the first tranche of notices were based on factors such as the number of complaints made to eSafety, the company's reach and how much information is already publicly available. More notices are likely to be issued.
Inman Grant said some in the industry took the attitude that if they weren't aware of a problem, they weren't responsible for it, even though some organisations had technology capable of detecting and removing harmful material.
Each company will be asked different questions to obtain information that is not publicly available. “We have a variety of questions for Meta and WhatsApp, in terms of where are they scanning, what are they scanning, how are they scanning,” Inman Grant said.
Responses will also be reviewed on a case-by-case basis. If businesses are found to be non-compliant after 28 days, they can be fined $550,000 per day.
“In my experience, having worked in the industry [at Microsoft for 17 years], businesses are affected by anything that challenges their revenue, anything that damages their reputation, and any significant regulatory threat,” Inman Grant said.
The Internet has led to a flourishing industry of online child exploitation, involving both shared and live images. “For the last 15 years there has been a live streaming trade in child exploitation material,” Inman Grant said.
“With lockdowns around the world, what we started to see was the Philippines at the epicenter of pay-per-view child abuse material. We now have so many video conferencing platforms that can facilitate that sexual abuse material.”
NSW Police Detective Superintendent Jayne Doherty, commander of the Child Abuse and Sex Crimes Squad, said officers “welcome any opportunity to help identify, target and prosecute those involved in the abuse of children”.
Federal Communications Minister Michelle Rowland said the companies’ reports “would help inform future government decisions about what needs to be done to protect Australians online and improve transparency for the public.”
Apple faced significant backlash from privacy advocates last year when it announced a planned CSAM-detection feature that would scan iCloud photo libraries for known child sexual abuse material (images that have been validated by at least two agencies).
The company’s website no longer makes reference to the CSAM-detection technology. Apple has instead added a feature that intervenes when users search for child exploitation material through its search tools, explaining “that interest in this topic is harmful and problematic” and providing “partner resources for help with this problem.”