UK cybersecurity chiefs back a plan to check phones for child abuse images.

Providers such as Facebook or Apple could build software that checks messages for suspicious material without sending the messages to a central server.

In a joint paper, officials from GCHQ and the NCSC argued that client-side scanning could protect children and their privacy.

Government cybersecurity chiefs say tech firms should move forward with controversial technology that looks for photos or videos of child abuse on users' phones.

Social media and other service providers such as Facebook and Apple could design software that monitors communications for suspicious material without requiring a central server to receive and store the data. This approach is called "client-side scanning."
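The core idea can be sketched in a few lines. The toy below checks an image, on the device, against a local list of hashes of known illegal material, so the image itself never leaves the phone. This is a heavily simplified illustration, not any vendor's actual design: real proposals (such as Apple's) use perceptual hashes like NeuralHash rather than a cryptographic hash, and report matches via encrypted vouchers rather than a plain boolean; the `KNOWN_HASHES` entry and function names here are hypothetical.

```python
import hashlib

# Hypothetical local database of hashes of known illegal images.
# A real system would ship perceptual hashes (robust to resizing,
# re-encoding, etc.); SHA-256 is used only to keep this sketch
# self-contained. This entry is the SHA-256 of the bytes b"foo".
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def scan_on_device(image_bytes: bytes) -> bool:
    """Return True if the image matches a known hash.

    The check runs entirely on the user's device: the image is never
    sent to a central server. Only the match/no-match signal (or, in
    some designs, an encrypted "safety voucher") would leave the phone.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(scan_on_device(b"foo"))  # True: matches the listed hash
print(scan_on_device(b"bar"))  # False: unknown image
```

The design choice the paper's authors emphasise is exactly the one this sketch makes visible: the matching logic and the hash list live on the device, so the provider never sees message content, only match results.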

According to Ian Levy, technical director of the NCSC, and Crispin Robinson, technical director for cryptanalysis at GCHQ, the technology could safeguard children while protecting users' privacy.

In a discussion paper published on Thursday, which they stressed was "not government policy," they wrote that they had found "no reason why client-side scanning techniques cannot be applied properly in many situations one may face."

Objections to client-side scanning proposals, such as Apple's plan to scan images before they are uploaded to its image-sharing service, were based on specific problems that could be remedied in practice, they argued.

They suggested, for example, that multiple child protection NGOs be involved in overseeing the scanning system, to prevent any single government from using it to spy on civilians. They also recommended encryption as a way of ensuring the platform never sees any images that are passed to human moderators.

"Details are important when discussing this topic," Levy and Robinson wrote, warning that vague language or hyperbole would likely lead to an erroneous conclusion.

Child welfare organizations welcomed the paper. Andy Burrows, the NSPCC's head of child safety online policy, called it "a significant and highly credible action" that "breaks through the false binary that children's fundamental right to safety online can only be fulfilled at the expense of adult privacy."


He said there was little doubt that regulation could incentivize corporations to develop technical solutions and provide safer, more private internet services.

Critics who champion end-to-end encryption argue that the focus should be on non-technical solutions to child abuse. The paper "completely ignores the consequences of their suggestions jeopardizing the privacy of billions of individuals globally," said Alec Muffett, a cryptography specialist who led Facebook's work to encrypt Messenger.

Muffett said it was strange that the authors call abuse a "societal problem" yet seek only technological remedies for it, asking whether more social workers could instead be employed to carry out harm-reduction strategies.

This is not the first time Levy and Robinson have waded into contentious policy territory. In 2018 they proposed a so-called "ghost protocol," under which GCHQ could surreptitiously add itself as a second recipient of messages sent to and from a target device.

Adding a law enforcement participant to an online group chat or phone call is "quite simple," they argued at the time, and no more intrusive than the "virtual crocodile clips" already sanctioned by democratically elected representatives and the judiciary.
