Drugs, stolen credit card data, and images of child sexual abuse are being openly traded on encrypted apps like Telegram, a BBC Radio 4 investigation has found.
Cybersecurity experts told the programme that as law enforcement had cracked down on criminal markets hosted on the dark web, some criminals were shifting towards end-to-end encrypted apps as the primary platforms for their operations.
Secure apps like Telegram and Discord have both public and encrypted areas. While the public sides of these apps are often listed prominently online, criminal dealings are conducted in the private areas, where end-to-end encryption prevents messages from being intercepted and read, and where messages can also be deleted.
The Radio 4 investigation found that criminals are flogging stolen credit card details, illegal drugs, and child sex abuse images using these apps.
Some criminals promoted their groups in the public comments sections of YouTube videos, using code words that would be indexed by search engines and lead potential customers to the dark marketplaces, where dealings and exchanges could take place. Researchers confirmed that at least one of these groups contained hundreds of explicit images of children.
According to Dr Victoria Baines, a former Europol officer and advisor to the Serious and Organised Crime Agency, this allowed the sellers to attract potential buyers who might not otherwise know how to access the groups.
“YouTube is indexed by Google, which means if you are an ‘entry level’ – for want of a better phrase – viewer of child abuse material you may start Googling,” she said. “And while Google tries to put restrictions on that, [the links] are publicly accessible on the web, so it is a means of getting people who are curious or idly searching into a closed space, where they can access material.”
The investigation also found that predators exploit the Discord app – which is widely used for innocent purposes such as chatting while gaming – to persuade minors to share explicit photographs. Ariel Ainhoren, head of research at IntSights, said that the security firm had recently identified a Discord group in which a user had promoted access to child sexual abuse and rape forums, and posted prices for child abuse imagery: 9GB for $50 (£39), 50GB for $500 (£390), and 2.2TB for $2,500 (£1,900).
The Security Minister Ben Wallace said that the government had set up a £1.9bn cybersecurity programme to help the police infiltrate web-based criminal groups, and that the government’s upcoming White Paper would consider whether technology companies should have a legal duty of care towards their users in order to oblige them to remove illegal material from their platforms.
“We are exploring in the online harm White Paper the area of duty of care, and if they don’t fulfil that, then one of the things we are exploring is that there will be a regulator involved,” he told the programme.
Meanwhile, the Digital Minister Margot James has confirmed that the government will introduce some regulation to force websites to remove illegal content.
Discord said these types of violations made up a “tiny percentage” of usage on the platform, and that the company was committed to making it even smaller, using a combination of computer, human and community intelligence to cut down on forbidden content. A spokesperson for Telegram said that the company processed user-submitted reports and engaged in “proactive searches” to keep the platform free of abuse, including child sexual exploitation and terrorist propaganda. Reports of child abuse are usually processed within an hour, the spokesperson said.
In January 2018, Prime Minister Theresa May told the World Economic Forum in Davos that technology platforms can be exploited as “home to criminals and terrorists”, and that their content should be “automatically removed”.