Live streaming app 'LiveMe' makes major changes following award-winning FOX 11 investigation

Major changes have been made to a popular live streaming app called LiveMe after a FOX 11 investigation revealed pedophiles were actively using it to sexually exploit young children.

The original reports can be seen here:

1) Pedophiles using app to manipulate underage girls into sexual acts, sell recordings as child porn

2) LiveMe deletes 600k accounts after FOX 11 reveals pedophiles use app to sexually exploit kids

It revealed that young girls, some less than 10 years old, were live streaming themselves dancing and taking their clothes off at the request of pedophiles, who showered them with virtual gifts and currency and made crude, sexual demands.

FOX 11 also found some pedophiles were recording the young girls' live streams, and discovered multiple porn sites directing customers to live streams or web captures of underage girls on LiveMe.

The links advertised the girls as young "jailbait" who had been secretly recorded without their knowledge. The FOX 11 investigation won a prestigious Edward R. Murrow award and was seen online millions of times.

LiveMe's reps tell FOX 11 the story was a wake-up call for them. "Seeing this was something that definitely got our attention," said Blake Barrett Curry, LiveMe's head of global partnerships.

Curry sat down with FOX 11 to talk about what he says are the major changes LiveMe has made in response. “We removed 600,000 accounts after the original story aired, and recognized that this is something we knew we had to do, and that meant anyone that was under the age of 18, anyone that was suspicious, anyone that could possibly be unwelcome in our community,” Curry said.

Curry told FOX 11 that LiveMe also made the decision to raise the app's age limit from 13 to 18. "If they lie about their age and it's not them, we're going to be most concerned if they are broadcasting," Curry said.

That's why, Curry says, LiveMe is now using facial detection software to help identify anyone underage who slips through the cracks. "This isn't facial recognition, it's facial detection, so that is a systemic algorithmic program that looks at not only the visual aspects, but also the audio aspects, so that analyzes the voice, and it also looks at the facial detection and determines from that with about 80% accuracy, if it is in fact someone who is 18 or above," he said.

And because there are people who are indeed 18 or above but may not look like it, human moderators will make the final decision. LiveMe says it has also launched a safety advocate program that allowed the app to grow from 200 moderators to 1,200 moderators, a five hundred percent increase.

“We would take those who raise their hand and say I want to help make a difference, criminally background check them, ID verify them, train them, and then give them expedited priority to flag any content before it could become a bigger issue,” Curry said. 

LiveMe has also partnered with BARK, an award winning parental monitoring app that allows parents to monitor their kids texts, emails, and 24 social media platforms, including LiveMe, for any bad behavior. “At the end of the day, parents are still going to be a child’s best protector,” he said. 

"I think that LiveMe is in a challenging position, it certainly seems to me that they're trying," said Dr. Lisa Strohman, a clinical psychologist. Dr. Strohman first alerted FOX 11 to the dangers of LiveMe last year, and she acknowledges the platform is making an effort to clean up.

"I think that LiveMe is doing a fairly good job, when you look at the number of kids that are under 18 it's less than it's ever been," she said.

"There's obviously still some there, but I think they've made some efforts in that space. Are they perfect? Absolutely not."

Curry said LiveMe has a message for any predators who may be looking to use the app for nefarious purposes. "We're on to you, and it's not just us, we have backup," he said.
