When secure messaging apps Telegram and Telegram X disappeared from the App Store last week for “inappropriate content,” it left a lot of users scratching their heads. Now, the absolute worst appears to be true. Having confirmed the authenticity of an email between Apple marketing chief Phil Schiller and a Telegram user, 9to5Mac reports that the “inappropriate content” was child pornography.

“The Telegram apps were taken down off the App store because the App Store team was alerted to illegal content, specifically child pornography, in the apps,” Schiller wrote in the email. He also explained to the user that the content had been verified before the app was taken down, and that the developer and necessary authorities had since been alerted.

Most online companies deploy a range of digital protections to prevent the distribution of child pornography on their platforms. As The Verge notes, these preventative measures let companies detect such content almost immediately: uploaded images are hashed and checked against federal databases of known illegal material. Companies are also held responsible if their technology is used to propagate illicit content. Unfortunately for Telegram, its safeguards weren’t as robust as they ought to have been. “We were alerted by Apple that inappropriate content was made available to our users and both apps were taken off the store,” Telegram CEO Pavel Durov said in a statement. “Once we have protections in place we expect the apps to be back on the App Store,” he continued.
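The hash-matching approach described above can be sketched in a few lines. This is a simplified illustration, not any platform's actual implementation: real systems compare perceptual hashes (such as Microsoft's PhotoDNA) against databases maintained by organizations like NCMEC, so that slightly altered copies still match; a plain cryptographic hash, used here only for clarity, matches exact copies only. The hash set and function names below are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known illegal images. Real platforms
# sync these from a federal/NGO-maintained database rather than hard-coding
# them, and use perceptual hashes instead of SHA-256.
KNOWN_BAD_HASHES = {
    # SHA-256 of the bytes b"test", standing in for a flagged image.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_sha256(data: bytes) -> str:
    """Return the hex SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_illicit(data: bytes) -> bool:
    """Flag an upload whose hash matches a known-bad entry."""
    return file_sha256(data) in KNOWN_BAD_HASHES

print(is_known_illicit(b"test"))   # True  (hash is in the database)
print(is_known_illicit(b"other"))  # False (no match)
```

Because matching happens at upload time, flagged content can be blocked before it is ever distributed, which is the gap Durov's statement suggests Telegram had to close.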

It’s worth noting that both apps were back online in about a day.