Dec. 3, 2009 Just visited the excellent Connect Safely forum to pose the industry question: is there ANYthing that digital moderators or online advocates could’ve done in the child safety realm to prevent the loss of 13-year-old Hope Witsell in the suicide/sexting case? Obviously, Hope’s case (visual at left/MSNBC) played out on mobile phones, but cyberbullying/bodysnarking and photo posting are common in online communities as well. So as we continue Part 3 of our ‘behind the scenes’ chat with industry pros from eModeration, we’ll get into the nitty gritty:
Who’s monitoring what behavior online? How do reputable sites handle ‘worst case scenarios?’ (like predator panic, cyberbullying extremes and other instances that surface and splash all over mainstream media news and talk shows) How can sites better recognize and track bullying and predatory behavior? When do they intervene? And how?
Today we’re asking moderation pros the tough stuff beyond an internet safety guide overview…Specifically, what are the fire drills and specialized training in place to spot grooming, bullying, age violations, red alerts, and legal vulnerabilities that reputable sites spend countless hours and dollars monitoring?
As you can see by the CSM Internet Safety Guide visual, below…‘moderation’ of new media forms is not such a simple task, as there are many platforms and it’s hard to keep eyeballs in all places at once. It’s one of the huge reasons I get REALLY tired of parents being plopped in the driver’s seat with a dismissive, “kids are your responsibility” mode.
I’m thrilled to have the pros at eModeration lift the curtain for us like the Wizard of Oz for a peek behind the scenes of online community moderation (defined here as virtual worlds, social networks, MMORPGs, kids’ texting ‘in-game,’ chat rooms, etc.) because frankly, even for a ‘media maven’ it’s a daunting prospect to ‘keep up.’
What ages and stages need strict, airtight enforcement to address regulatory and legal vulnerabilities?
How often is this really being done and what can happen when kids ‘slip under the curtain’ and enter social media platforms without the emotional wherewithal to navigate their own internal compass?
We can all relate on some level to Hope’s case, in terms of parental guilt on basic discipline (grounding/denial of media privileges, a crash course on dealing with texting nightmares) as well as the youthful mindset of invincible bravado…Thinking she could ‘handle it,’ she bravely confronted her mistake and took an insurmountable amount of guff to ‘own up,’ without sharing or revealing the severity of the daily brutality she endured.
This is where we absolutely must engage in a dialog across the generations…
With new roles and rules for navigating new media, it’s imperative to bring forth a ‘Meeting of the Minds’ as Harvard’s GoodPlay project, CSM and Global Kids have tackled in their newly released Focus Dialogues, a series of cross-generational online dialogues about how youth and adults relate to life online. (I’ll post on the Focus Dialogues separately…meanwhile…onward!)
For those who missed the rest of the eModeration interview, Part One is about the need for 21st century media literacies (Hastac/Howard Rheingold roundup of “musts”); Part Two discusses the challenges of ‘safe chat’ and the vigilance required from online communities to do their job right.
And though I want to give full focus to the serious instances and concerns in Part Three, I’d also like to focus on solutions-based approaches in advance, so that parents AND kids know ‘what to look for,’ how to screen sites and situations, and how to navigate conundrums with common sense so these tragedies won’t occur…
Before we get to making sense of “automated vs. human moderators,” and how reliance on internet filters and a false sense of security can put kids in peril, etc., I’d like to caution parents NOT to overreact, and instead offer my favorite outreach vehicle: surf through ConnectSafely’s forum and SEE FOR YOURSELF what’s rearing its head…
It’s a great way to get a solid touchpoint on what kids are facing today, while giving a poignant peek at the reality behind the sensationalism.
Yes, there are raw emotional cries for ‘help’ on serious issues ranging from imposter profiles to removing content that’s seeped onto the net…But there are also firsthand peer to peer youth stories and tips, adults skilled in everything from cybersafety to forensics and law enforcement giving their two cents and sharing knowledge, and ALL are intent on helping kids navigate as global citizens.
Example? “Social Networking Abuse & Peer Tech Support” has an ongoing thread of about 1400 voices in dialog; “Sexting, Cyberbullying & other online risks” offers about 300 comments, which is where I posted my Hope Witsell query.
Site co-founders Anne Collier of NetFamilyNews and Larry Magid of SafeKids.com both take a very level-headed approach to ‘balancing safety and fun’ and give youth and parents the tools to triumph in the cybersafety realm, while pointing them in the proper direction to troubleshoot issues that are site-specific or out of their area of expertise.
So with that caveat, here’s Part Three with eModeration…
Amy Jussel, Shaping Youth: How does eModeration handle a serious breach of safety, such as grooming, predatory behavior, stalking, and that kind of thing?
Do you have tracking tools that are automated as well as human? Rely on human relationships with forensics/law enforcement people that can trackback and hunt down the internet service provider as a source or what?
eModeration: We have a serious incident escalation procedure for each project, which is drawn up with the client at the start of a contract. We have to be able to reach clients 24/7 in the event of a time-crucial incident such as a bomb or suicide threat – something where we need to be able to report an incident to the police with an IP address as quickly as possible.
All suicide or bomb threats are taken seriously; they have to be, and our moderators are trained in what to do in terms of taking threads down, reporting to clients and management, sending evidence through to reporting bodies and following up. Not all serious incidents are time-crucial: for example, uploading child abuse images, whilst extremely serious, isn’t time-crucial in the same way.
We do what is necessary on the site in terms of take-down, logging and reporting, then follow it up with the clients and the relevant authorities – in the UK this means the Child Exploitation and Online Protection Centre (CEOP) (Amy’s note: also, check out CEOP’s ThinkUKnow microsite, which gives a helpful age/stage media literacy snapshot of basic ‘need to knows’ for teaching safety with hands-on sources) and the Internet Watch Foundation (who work internationally as well), otherwise the Virtual Global Taskforce or (in the US) CyberTipline…
Obviously, because eModeration is a specialist firm, with workflows and protocol in place, we may be able to provide a greater degree of efficiency than an in-house team, including counseling for any moderators who might feel they need it post-incident…Many of our larger clients are also geared up for this type of escalation and have well-oiled systems too.
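For the technically curious, the triage eModeration describes above (time-crucial threats needing immediate 24/7 escalation vs. serious-but-not-time-crucial incidents, each with its own take-down, logging and reporting steps) could be sketched roughly like this. To be clear, the category names, steps and structure below are my own illustrative assumptions, not eModeration’s actual workflow:

```python
# Hypothetical sketch of a serious-incident escalation triage,
# loosely modeled on the workflow eModeration describes above.
# Category names and response steps are illustrative assumptions only.

from dataclasses import dataclass

# Incidents demanding immediate police contact (with an IP address)
TIME_CRUCIAL = {"suicide_threat", "bomb_threat", "child_in_immediate_danger"}

# Extremely serious, but not time-crucial in the same way
SERIOUS = {"child_abuse_image", "grooming_suspected"}

@dataclass
class Incident:
    category: str
    user_ip: str
    content_id: str

def escalate(incident: Incident) -> list:
    """Return the ordered response steps for an incident."""
    # Every serious incident starts with take-down and evidence logging
    steps = ["take_down_content", "log_evidence"]
    if incident.category in TIME_CRUCIAL:
        # Reach the client 24/7 and report to police with the IP, fast
        steps += ["notify_client_24_7",
                  "report_to_police(ip=%s)" % incident.user_ip]
    elif incident.category in SERIOUS:
        # Report through to the relevant authority (e.g. CEOP/IWF in the UK)
        steps += ["notify_client", "report_to_authority"]
    else:
        # Ordinary terms-of-use breach: routine scheduled client reporting
        steps += ["include_in_scheduled_client_report"]
    return steps

print(escalate(Incident("suicide_threat", "203.0.113.7", "post-123")))
```

Again, this is just a toy model of the branching logic; the real procedures are drawn up per-client at the start of a contract, as noted above.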
Amy Jussel, Shaping Youth: What about the more common ‘imposter profiles’ and security breaches like hacking or cyberbullying with inappropriate content?
eModeration: Breaches of security, such as suspected hacking of profiles, fall into another category, and would be reported through to the client.
We log all breaches of terms which result in moderation actions and report through to clients on an agreed basis.
If a child was in immediate danger we would deal directly with the police to intervene immediately.
Also, we’re a member of the IWF, which is the body that deals with UK ISPs in relation to any hosting of child abuse material…so we support CEOP and work with them to further their aims in every way.
Amy Jussel, Shaping Youth: What about ‘language barriers?’ If 21st century connectivity is about global reach and digital multi-culturalism, how can we keep kids safe yet let them explore multi-cultured learning as global citizens?
How do you even begin to ‘moderate’ that? (On Shaping Youth I get a lot of incoming comments and links I can’t moderate on a global scale, so I’ve ratcheted up my spam filters and taken a stern ‘when in doubt, delete’ approach.) How do you moderate multilingual communities specifically?
eModeration: We provide moderation in over 30 languages, but ‘foreign language’ criteria are different for every client.
Some projects are set up to be run in just one language – and hopefully the terms would state this – and we are instructed to delete any UGC not in that language. For others, there is a flexible approach to other languages, and we moderate them on an ad hoc basis.
For a lot of recent clients, though, we are set up to moderate in several different languages right from the start, providing the same service level for each. All our moderators are native or fluent in English, and we have a fantastic team of bilingual moderators, some fluent in several languages.
It’s not all about language, though…it’s also about culture and nuance. For example, with some projects we use UK native moderators only, because they need to deeply understand the social context of the young people posting. For other projects it’s vital we have moderators who are native speakers of Chinese, for example, rather than just bilingual, so they can pick up on cultural issues as well.
Amy Jussel, Shaping Youth: So what exactly does a moderator ‘do?’ Can you explain the job of a ‘host’ moderator or highly visible moderator in a virtual world for kids?
eModeration: Well, here I can quote directly from our white paper on How to Encourage Participation and Player Loyalty in Virtual Worlds:
“Today, there are two types of moderators. The first and more traditional type is the silent moderator, who stays in the background blocking offensive material from participants, warning users, defusing confrontation and reacting to abusive or illegal behavior… The second and increasingly popular type is the in-game moderator, who actively participates as a character or avatar on the site, helping other players engage with the various activities within the game.
This type of moderator may also act as an in-game host – i.e. visible to the children – and can be compared to the host of a children’s party: the role is about encouraging children to explore and try new things and have as positive an experience as possible, but stay safe and secure while doing so.”
Amy Jussel, Shaping Youth: So what happens if this ‘host’ devolves into a ‘peer’? (I’ve seen this on some sites where there are perceived ‘favorites’ and game play is impacted.)
Is it better to have moderators visible or invisible…and why?
eModeration: It’s very important for moderators to keep a certain level of detachment from the children and not become their friends, ensuring they remain impartial and act consistently. To this end, moderators should be clearly identifiable as such within the game so that a child can never confuse them with another player…often the moderator becomes an active character or “host” in the game.
Moderators can blend right into the game itself, letting children know they are there without becoming overbearing. This also deters children from wanting to chat to the moderator, which could distract them from the game itself.
However, as Izzy Neis has observed:
“[Young people’s moderation teams] have a tightrope to walk… keep the audience engaged/happy/online, while also maintaining community, individual safety and the feeling of fantastical freedom almost required in virtual sandboxes…”
…“Youth want you there when they need you; otherwise, they don’t even want to see you – [you’re the] elephant in the corner. A child’s behavior changes when an adult is noticeably present – no matter how “good” the child is. Adults become role models, scapegoats, wardens, security cameras, mayors, etc. – adults become “the man”, and that issues a shift in social control.”
So, in-game moderation isn’t all win, by any means.
(Amy’s note: The avatar/viking visual is one of safety guru Izzy Neis’ many personas, you can see a whole bunch of ‘em on her site to give you a feel for the range and tonality within kids’ worlds in order to ‘blend’…)
Amy Jussel, Shaping Youth: What about automated filters as moderators…how does that all work? Can you explain the science behind “content analysis?”
How sophisticated are these ‘engines’ within virtual communities? Are they sort of a ‘first tier strike’ safety measure to weed out crud like a spam filter, or are they more robust?
eModeration: Rather than simply using a blacklist or whitelist to restrict chat (safe chat dictionaries, etc.), intelligent content analysis engines such as Crisp’s NetModerator™ not only detect inappropriate content but also the first warning signs of cyberbullying and predatory behavior.
For example it can reveal when one correspondent is trying to make direct contact with another or when someone is revealing personal information which may compromise their future safety…
We all know sexual predatory behavior is purposefully subtle and long-term in nature. So the engine analyzes content and relationships over the long term, looking at speech which in isolation contains nothing untoward (and so would not be picked up by a blacklist), but whose patterns correspond to recognized grooming behavior.
The NetModerator™ engine then prioritizes these alerts, and can handle low-level code of conduct breaches automatically with the ABM (gagging/silencing, blocking/banning, etc. according to client-defined workflows), alerting the moderators to the more serious threats.
This helps us a lot because it leaves the moderation staff freer to focus their energies on more potentially serious offenders. It also means that clients do not need to scale up their moderation resources at the same rate that their membership base grows…
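To make the distinction concrete, here’s a toy sketch (in Python) of why analyzing patterns across a whole conversation can catch what a simple word blacklist misses. The ‘warning sign’ patterns and scoring below are purely illustrative assumptions on my part, NOT how Crisp’s NetModerator™ actually works:

```python
# Illustrative contrast between simple blacklist filtering and
# pattern analysis over a conversation history. This is NOT Crisp's
# NetModerator algorithm -- just a toy model of the idea that
# individually innocuous messages can form a risky pattern over time.

import re

BLACKLIST = {"badword"}  # a plain word filter catches only these

# Hypothetical "warning sign" patterns: each is harmless in isolation
PATTERNS = {
    "asks_age": re.compile(r"\bhow old are you\b", re.I),
    "asks_contact": re.compile(r"\b(phone|address|meet up|what school)\b", re.I),
    "asks_secrecy": re.compile(r"\b(our secret|don't tell)\b", re.I),
}

def blacklist_flags(message):
    """A blacklist only looks at one message at a time."""
    return any(word in message.lower() for word in BLACKLIST)

def risk_score(history):
    """Count distinct warning-sign categories across the whole conversation."""
    return sum(1 for p in PATTERNS.values()
               if any(p.search(msg) for msg in history))

chat = ["hey, how old are you?",
        "cool! what school do you go to?",
        "this is our secret, ok?"]

# No single line contains a blacklisted word...
assert not any(blacklist_flags(m) for m in chat)
# ...but the pattern across the conversation scores high enough to alert.
if risk_score(chat) >= 3:
    print("escalate to human moderator")
```

The real engines are far more sophisticated, of course (they track relationships over weeks, not three lines of chat), but the core idea above is the same: the alert comes from the pattern, not from any one message.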
Amy Jussel, Shaping Youth: Thanks for this, there are some amazing new resources for keeping kids safe…Appreciate your taking the time to explain the ‘back end’ behind the curtain, Tamara…
I’d like to encourage all readers to leave questions/comments on topics that weren’t covered or you want to hear more about, and I’ll send them into the ‘Twitterstream’ for various moderation pros to have at it.
As far as Hope’s tragedy and MSM reporting, here’s Wired magazine’s blurb yesterday about the latest MTV/AP survey on sexting (again, grain of salt required thinking ‘chicken or the egg’ there, but hat tip to public health pro Andre Blackman)
Keep your media literacy hat on (and head level) on that one…as most of the teens I talked to at our ‘sex ed’ high school discussion last night (and other peers) ‘know better’ and gave me the wince and ‘doh, we KNOW that’ routine…so I’ll need to dive into the research methodology, regional samplings, ages and context next.
I’m THRILLED that MTV is addressing digital citizenship/peer-personal privacy issues AND partnering with public health pros to give it mindshare for prevention. They even have a contest and PSAs upcoming with their “Redraw the Line Challenge” to develop projects to address digital abuse via web-based tools/games for education and media literacy, woohoo!
Here’s a mini-resource roundup of other pertinent pieces (I particularly love the UK’s CyberMentoring program, as I’m a huge advocate of peer to peer knowledge sharing!)
And…with that in mind, here’s a step-by-step video “by kids for kids” on privacy settings called “Hailey Hacks.” Hailey uses screenshots to set her Facebook profile settings to minimize leakage and seepage, offering tips for teens on the ‘frenemy’ front too. (I know some adults that could use this advice!)
Too much for one post, crossing between moderation, bullying/sexting, digital conduct and beyond, so ‘to be continued…’
…Stay tuned for more on the importance of open conversation between generations as the Harvard GoodPlay/CSM/Global Kids’ Meeting of the Minds study shares insights from over 250 participants and 2500 posts highlighting similarities and differences in mediating ‘life online’ in the digital sphere!
1. Take an inventory. Ask your children to show you all of the gadgets in the house that can take or store photos or videos. These can include cell phones, Webcams, video game consoles and iPods.
2. Ask them to show you images they have stored. Promise you won’t hit the roof if you find something bad — then keep your word.
3. Have a talk. This should be a conversation, not a lecture. Be sure to mention a range of unintended consequences, which could include criminal charges that would jeopardize admission to college.
4. Watch what you buy. Think twice before purchasing devices that can take or send images. Drop the image-sending capability from your child’s cell phone service.