Sexual chat programs
Filters and moderators are essential for a clean experience, said Claire Quinn, safety chief at a smaller site aimed at kids and young teens, Wee World.
But the programs and people cost money and can depress ad rates.
Even sites that run such software should still have one professional on safety patrol for every 2,000 users online at the same time, said Sacramento-based Metaverse Mod Squad, a moderating service.
“There are companies out there that are more concerned about profitability.”

Two recent incidents are raising new questions about companies’ willingness to invest in safety.
“The manner and speed with which they contacted us gave us the ability to respond as soon as possible,” said Duncan, one of a half-dozen law enforcement officials interviewed who praised Facebook for triggering inquiries.
Facebook is among the many companies that are embracing a combination of new technologies and human monitoring to thwart sex predators.
The better software typically starts as a filter, blocking the exchange of abusive language and personal contact information such as email addresses, phone numbers and Skype login names.
But instead of looking at just one set of messages, it will examine whether a user has asked for contact information from dozens of people or tried to develop multiple deeper and potentially sexual relationships, a process known as grooming.
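The two-stage approach described above could be sketched roughly as follows. This is an illustrative sketch only: the article does not disclose any vendor's actual algorithm, and the regex patterns, class names, and the escalation threshold below are all assumptions.

```python
import re
from collections import defaultdict

# Hypothetical patterns for contact-information exchange (assumption,
# not any vendor's real rule set).
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\b\+?\d[\d\s().-]{7,}\d\b")

# Hypothetical cut-off: how many distinct users one sender may solicit
# before a human moderator is alerted.
CONTACT_REQUEST_THRESHOLD = 5

class ChatFilter:
    """Sketch of a filter that blocks contact-info exchange per message,
    then looks across conversations for a grooming-like pattern."""

    def __init__(self):
        # sender -> set of distinct recipients they tried to solicit
        self.contact_requests = defaultdict(set)

    def check_message(self, sender, recipient, text):
        """Return (allowed, flagged_for_review)."""
        # Stage 1: block any message carrying an email address or
        # phone number, and record the solicitation attempt.
        if EMAIL_RE.search(text) or PHONE_RE.search(text):
            self.contact_requests[sender].add(recipient)
            allowed = False
        else:
            allowed = True
        # Stage 2: examine the sender's behavior across many
        # conversations, not just this one -- soliciting contact info
        # from many different users escalates to a human moderator.
        flagged = (len(self.contact_requests[sender])
                   >= CONTACT_REQUEST_THRESHOLD)
        return allowed, flagged
```

In this sketch the filter alone handles the easy case (a single blocked message), while the cross-conversation counter captures the behavioral pattern the article says the better software looks for.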