For most game designers, the excitement of making an MMO for the under-twelve set is quashed by the thought of having to make game design decisions aimed at reducing the odds that your game becomes a pedophile's wet dream.
One person’s design problem is another person’s business opportunity. From the New York Times:
As the number of these virtual worlds grows, so, too, does the demand for sophisticated monitoring software and people, called moderators, who can act as virtual playground monitors. Tamara Littleton, chief executive of eModeration, a company based in London that provides moderation services, says the most common dangers that children and teenagers face online are bullying and young people’s own efforts to share personal information that could enable strangers to identify and contact them in the real world. Sexual predators are always a concern, she says, though she described the likelihood of a child being targeted by an adult with malicious intent as “statistically low.”
…
NetModerator, a software tool built by Crisp Thinking, a private company based in Leeds, England, can monitor online chat “for intent as well as content,” says Andrew Lintell, the company’s chief executive. To build the tool, he says, Crisp Thinking analyzed roughly 700 million lines of chat traffic, some from conversations between children and some, like conversations between children and sexual predators, provided by law enforcement groups.
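The article doesn't explain how NetModerator actually scores "intent," but the setup it describes (a large corpus of labeled chat, some of it known predator conversations supplied by law enforcement) is, in broad strokes, supervised text classification. Here's a minimal sketch of that general idea, with invented example lines and hypothetical labels, and no claim that this resembles Crisp Thinking's actual system:

```python
# Toy illustration of intent classification over chat lines.
# The training data below is invented, standing in for the kind of
# labeled chat logs the article says Crisp Thinking analyzed.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

chat_lines = [
    "want to trade my rare dragon pet for your sword",
    "lol that boss fight took forever",
    "you seem really mature, do your parents check your messages",
    "what school do you go to, maybe we live close",
    "anyone want to join my guild for the raid tonight",
    "don't tell your mom we talk, it can be our secret",
]
labels = ["normal", "normal", "grooming", "grooming", "normal", "grooming"]

# Bag-of-words features plus a linear classifier: crude, but it captures
# the idea of flagging intent rather than just blacklisted words.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(chat_lines, labels)

incoming = "this is just between us, ok? don't tell anyone"
proba = dict(zip(model.classes_, model.predict_proba([incoming])[0]))
# High scores would be routed to a human moderator rather than auto-banned.
print(f"grooming score: {proba['grooming']:.2f}")
```

In practice a tool like this would be one signal feeding the human moderators the article describes, not a replacement for them.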