Nov. 26, 2009

Who’s minding the kids online? What’s ‘moderation’ got to do with it?
It doesn’t take a retro flashback to know that “Kids say the darndest things,” but who is there to help them online when they do?
In part one, we summed up media literacy’s vital role in online safety for creating global citizens from the get-go. In part two today, we’re diving a bit deeper with a behind-the-scenes look at the challenging role of the moderator, to hear what these pros DO to strike a delicate balance between safety and fun…Seems particularly fitting for the days of inclement winter weather ahead, which can drive kids indoors.
The way I see it, moderators set the tone and the vibe for ‘what’s acceptable’ in play and their role is to help kids sift through the digital sandbox without intruding too far onto the beach.
Kind of like lifeguards at the ocean, they casually blend in with the chat scene, and though you hope you won’t need them for serious stuff, they’re alert and at the ready with skill sets that can save your hide in a sudden current shift or wicked undertow.
This lifeguard analogy is helpful to assuage parental predator panic without removing personal responsibility for ‘knowing how to swim’ in the waters to begin with. Just like you wouldn’t send a kid into the surf with a life-vest expecting flotation to replace core water skills, you shouldn’t expect moderators to shore up digital knowledge sans basic media literacy…
Yes, a moderator is a ‘deterrent’ by sheer presence, and the more highly trained staff have the ability to sniff out a ‘situation’ in multiple contexts…
But that doesn’t let parents, educators, and even youth themselves off the hook for personal accountability and responsibility for their own basic media literacy life skills.
I don’t envy these moderation folks…it’s nuanced, and takes finesse to achieve solid moderation without being heavy-handed and tamping down peer to peer fun.
After all, if it’s too tight a ship, the kids will jump off and go elsewhere to test less ‘controlled’ waters…Moderation that’s limited to automated dictionary filters and pre-set text phrases will have the savviest ‘digizens’ initiating workarounds pronto…
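To see why kids outfox pure dictionary filters so quickly, here’s a minimal sketch (the blocked-word list and the character swap are purely illustrative, not any real moderation product’s logic) of a naive filter and the kind of workaround that sails right past it:

```python
# Minimal sketch of a naive dictionary filter (hypothetical word list,
# NOT any real moderation product) and a typical "workaround."

BLOCKED_WORDS = {"phone", "address", "school"}  # illustrative list only

def naive_filter(message: str) -> bool:
    """Return True if the message passes (contains no blocked words)."""
    words = message.lower().split()
    return not any(w.strip(".,!?") in BLOCKED_WORDS for w in words)

# A direct mention is caught...
naive_filter("what's your phone number")   # blocked (returns False)
# ...but one swapped letter slips straight through the word list.
naive_filter("what's your ph0ne number")   # passes (returns True)
```

This is why the article’s point holds: pre-set text matching alone can’t keep pace with creative spelling, which is where trained human moderators come in.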
If sites are moderated TOO loosely with disrespectful antics, you can end up with a 21st century wild west frontier devolving into a ‘tween’ slamfest of cliques and tribes…
So how do the best-run sites strike that intuitive balance of giving kids the respect and ‘silent code’ of the community itself to embrace one another as the keepers of the flame?
“Adults have a clear responsibility to help steer children through their online environments…Teens and tweens are finding their voices as their brains develop, and in these days of instant communication, sometimes they may later regret online behavior. Moderation has a role in helping to guide them as well as keeping them safe.”
eModeration’s core team help us sort through the hype and holes of age verifications and workarounds in part two today…
In doing so, they help provide kids what they need to know for healthy chat, and provide parents a handy checklist to get beyond a cursory ‘looks ok to me’ glance. (e.g. eyeballs on terms of service, privacy, data collection, commercial interests, etc.)
Amy Jussel, Shaping Youth: What is the biggest ‘hole’ in moderation safety?
Is it kids assuming they ‘know it all?’ Password/PIN protection? Sharing information as social currency? Imposter profiles by ‘frenemies?’ Cyberbullying, or what?
eModeration: Definitely the sharing of personal information – either the most direct (names, email addresses, geo locations) or little bits which add up to the jigsaw effect.
The kids aren’t trying to be inappropriate or to indulge in risky behavior. They just want to be friends, and obviously telling people who you are, what school you go to, how they can see more of you in any other social space – or that you’ll be at the Jonas Brothers gig next week – is just a part of that reaching out process.
Amy Jussel, Shaping Youth: Do you think it’s ‘developmental’ for even the younger (tween) kids to find ‘workarounds,’ diss the age verifications, or even ‘game the system’ as a testing ground for preteen behaviors we’d see offline? When we see this in online communities is it a sign of lousy moderation or just very sharp kids?
eModeration: My instinctive answer is that in the majority of cases, this is kids being kids. They want to give out this information but they don’t have the maturity yet to understand the dangers, why the restrictions are in place. So they’re going to apply their intelligence and creativity to finding a workaround.
The teenage years are all about how to carve themselves their own niche, learn from each other, buck against the imposition of adult rules, delight in getting one over the system … I think online behavior is just mirroring what they’d do offline. Kind of the equivalent of telling Mum you’re staying over at a friend’s when you’re actually at an all-night party. Natural, but dangerous.
The trick is to stay one step ahead but also reinforce safety messages. It’s one thing to buck the system so you can say rude words, it’s another when kids are sharing personal info.
Amy Jussel, Shaping Youth: What should parents look for in the ‘about us’ and privacy/terms of service sections when it comes to moderation, security from data mining, etc?
eModeration: They should look for a site that is transparent about its moderation and filtering systems, with 24/7 moderation available directly through ‘help’ buttons.
It should be COPPA compliant (or the local region equivalent) and have links to relevant help and advice centres for the major regions.
We’d recommend a separate section for parents, telling them what the site is about and how to help educate their children in online safety and supervise their online usage (keeping computers out of bedrooms etc), and signs to look for if their child is a victim of cyberbullying or online abuse/grooming – perhaps with links offsite for more information or help resources.
The terms for the children to agree to should be drafted in an age-appropriate way, and parents should go through them with their kids at sign-up.
Of course, the data stored should never be sold on to third parties.
The European Network and Information Security Agency (ENISA), a stalwart of child safety in virtual worlds, has released tips on how parents can actively guard and interact with children who participate in popular virtual worlds. Quoting from them:
“The most useful benchmark of a reputable virtual world with child safety as a priority is the presence of parental control. There should be measures in place where you can always be in control and aware of all the activities, transactions and events inside the virtual world.
“Although applications and web sites always give out warnings about sending personal and sensitive data over the Internet, sometimes these are written in very fine print, or, more often, the children just want to get past all the registration slowdowns and complete certain transactions. This is why you should ensure that parental controls have monitoring capabilities when sensitive data is used in virtual worlds. Sensitive data includes name, age, birthday, credit card number, address and mobile telephone number.”
Amy Jussel, Shaping Youth: Who’s doing great work in the live chat realm aside from eModeration? (specific to advocacy, safety/monitoring, transparency & regulation)
eModeration: We’re glad to say, there’s an awful lot going on. In the UK, the UK Council for Child Internet Safety (UKCCIS) has been formed, uniting over 100 organisations from the public and private sector who are working with the UK Government to deliver recommendations from Dr Tanya Byron’s report ‘Safer Children in a Digital World.’ eModeration is currently part of a sub-group reviewing the moderation guidelines.
We’re really impressed by the developments that our partners Crisp Thinking are making in the live chat field.
(Amy’s note: this profile piece on Crisp Thinking’s Head of Safety, Rebecca Newton has detailed info on detection and analysis of predatory behavior; a must-read interview via the Social Media Portal)
Crisp Thinking has just released a new feature for their live chat moderation tool, NetModerator™, called Automated Behaviour Management™ (ABM). In a nutshell, ABM (Amy’s note: link to press release/’world’s first’ just launched this fall ’09) automates what are now manual processes for ‘low-level’ infringements – warnings, muting, etc. This makes responses quicker (educating users) and leaves moderators free to concentrate on more serious infringements such as grooming, self-harm or bullying.
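The idea behind that kind of automation can be sketched as an escalation ladder: repeated low-level infringements trigger progressively firmer automatic responses, while serious categories always route to a human moderator. The thresholds and category names below are hypothetical, purely to illustrate the concept – this is not Crisp Thinking’s actual ABM implementation:

```python
# Illustrative escalation ladder for "low-level" chat infringements.
# Categories, thresholds, and responses are hypothetical sketches,
# NOT Crisp Thinking's actual ABM logic.

from collections import defaultdict

SERIOUS = {"grooming", "self-harm", "bullying"}   # always go to a human
LADDER = ["warn", "mute_5min", "mute_1hr"]        # automated responses

strikes = defaultdict(int)  # per-user count of low-level infringements

def respond(user: str, category: str) -> str:
    """Automate low-level responses; escalate serious cases to a human."""
    if category in SERIOUS:
        return "escalate_to_moderator"
    strikes[user] += 1
    step = min(strikes[user], len(LADDER)) - 1  # cap at the top rung
    return LADDER[step]

respond("kid42", "spam")      # first offence: automated warning
respond("kid42", "rudeword")  # second offence: short automated mute
respond("kid42", "grooming")  # serious: straight to a human moderator
```

The design point mirrors the article’s: automating the routine rungs frees scarce human attention for the cases that genuinely need judgment.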
CyberMentors.org.uk is a safe social networking site providing information and support for young people being bullied or cyberbullied. The big thing here is that it is young people helping each other: youngsters aged 11-25 are trained as CyberMentors, in schools and online, so that they can offer support to their peers.
At the time of writing this, they had about 200,000 unique users, 1,500 trained CyberMentors and 50 registered counsellors. They are doing a really fantastic job (oh, and they use Crisp NetModerator too).
Amy Jussel, Shaping Youth: Can you explain the biggest differences in moderating tweens vs. teens?
eModeration: There’s obviously a huge overlap in the middle, but if we take the opposite ends of the spectrum, say, 7 and 8 year olds versus 17 year olds, we can clearly see the difference. First, you need to look at what they’re doing in the online space and why they’re doing it.
The younger end of the tween age group is still intent on play. With the unaccustomed feeling of power and control they have over this virtual universe, they’re juggling imaginative play with the beginnings of social interaction. This age group is generally more accepting of overt, rigid rules (indeed, actively needs them); they will respond to the moderator’s authority, and will acknowledge and appreciate the moderator’s role in their world.
Basically, the same dramas from the playground (bullying, rejection, betrayal) are happening in their online world, and need the careful intervention of an adult just the same. Because these kids are there to PLAY, visible moderation, performed correctly, can really play a great part in enriching their experience.
How much risk to allow the different age groups is a moot point as well. Tanya Byron argues that we have perhaps restricted our youth too much and that they need to find their own way: otherwise they won’t learn and develop. But that doesn’t mean letting them loose unsupervised onto the internet.
Kids as young as 11 or 12 are being required to make decisions about risky behaviour that their parents didn’t have to make until they were 16 or 17.
Moderators need to be aware of this – aware of the fact that many tweens and teens *are* in these spaces unsupervised – and inevitably, to some degree, take on a role “in loco parentis” to guide their online charges.
Later teens need different and lighter handling. Some like to compete and play Warcraft-type games, others prefer to socialize and be creative. They probably know better what they are doing, and are more resilient and savvy overall. They’re also more worldly-wise, cynical, and potentially capable of greater infringements…
Moderation can more easily be perceived as an obstacle in their path. They have reached a level of control and autonomy within their offline lives which means that having their chat moderated may feel like a large hand reaching over their shoulder and gagging them when they’re in the club with their mates.
If the objective of moderation is to keep the space clean for other users, educate the users about acceptable behaviour, and enhance their enjoyment and engagement, then you can see how you’re going to need really different approaches for the two age groups.
Amy Jussel, Shaping Youth: Stay tuned for more with eModeration in part three as we get specific and ask the dicey questions parents are eager to know by putting the moderators in the hot seat of ‘what ifs’…
From recognizing grooming or predator behaviour to bullying intervention, handling multilingual sites and global communities with cultural differences, and how to spot whether a site’s being ‘compliant’ with legal and regulatory issues…we’ll ask the tough stuff so parents know what to ask, how to screen, and ways to make sense of the automated vs. human moderators, filters, and safety conundrums online.
Also, for those who are thinking about a ‘console’ gaming system of some sort for holiday gifting, or families that live in a FOG (my own coinage for ‘friends of gamers’…those households turning a blind eye to their kids’ online safety with the ol’ ‘I don’t have one, so I don’t need to worry about that’ mindset), part three with eModeration is a must-read for media-savvy parents.
What to know as kids grow to stay engaged, aware, and involved in helping them navigate their online lives for safety and fun…
Later we’ll look at some upcoming subscription-based moderated chat communities, from ‘adventurous virtual worlds’ in beta that are coming onto the scene, like Wiglington and Wenks, launching on Christmas Eve, to one which I’ve been wanting to dive into and explore for learning fun…WonderRotunda, a ‘theme park for the mind’ for kids 7-12…
We’ll also check back in on our new affiliate, New Moon Girl Media to see how they’re doing with their December 31 deadline to SAVE NEW MOON’s online girls’ community…a haven for creativity and safe chat!
Hint, hint for holiday gifting and great, safe, moderation!
Similarly, there are some new, emerging ‘tween girl’ communities launching like Chica Circle where girls unite and share their own designs…so I really hope that all of these emerging players in the digital world read the industry tips from eModeration as they shape their OWN worlds to create a safer place to play.
Finally, here are those great starter lists from online community safety guru Izzy Neis on ‘worthy worlds’ for tweens – a bit of a ‘cheat sheet’ to get parents primed, since she’s hand-selected some that have passed her moderation ‘safe eyes’ and offered do’s and don’ts for ages and stages with moderation in mind. Izzy’s site is in the midst of updating, so this is ‘oldie but goodie’ material, as some of these worlds have already ‘come and gone’…it’s rough out there in the digital frontier!
Favorite round-up resources from Izzy Neis:
- Worthy Tween/Twid/kid Communities (and virtual worlds)
- Communities NOT for Tweens/Kids
- A Breakdown of Tween/Twid/Kid Virtual Worlds & Terminology
- Future Pre-Beta Virtual Worlds
Also, along these lines of safety and moderation, here’s a recap of our three-part series on gaming and ethics, with links that might be of interest: