[thelist] Content filtering for kids site (slightly OT)

Iva Koberg iva at livestoryboard.com
Wed Oct 8 10:05:32 CDT 2003

Hi all,

We are in the planning stages of a portal for kids. One area of the site
is a kid-friendly CMS - something like a simplified CMS/blogging tool
for young audiences (10 to 17 age group roughly). The client is very
sensitive to security and privacy issues and has asked us to consider
filtering content for profanity and some taboo keyword phrases as an
additional measure towards a predictable quality of content contributed
by users. We are facing a dilemma here because, on the one hand, the
goal is to encourage participation and steer clear of censorship, while
on the other, we would like to offer kids a safer environment. NOTE:
this post is not meant to start a pro-censorship/anti-censorship thread
- it is a requirement we are researching.

We are interested to hear whether any of you have had to deal with
content filtering specifically, and what your thoughts on it are.
Assume that security is taken care of through registration and
authentication, etc. - we're only concerned with content contributed by
registered, authenticated users here.

Some of our choices are:
- delay publishing content (monitoring by humans, which has the downside
of possibly discouraging participation and increasing the need for
offline staff support)
- a filter that catches keyword phrases and disallows publishing (the
downside being that it would clearly be tough to catch all "bad" words
without sacrificing content that may be perfectly fine when read by a
human. For example, if we check for misspellings, "fcuk" may be
describing a clothing company popular with teens,
http://www.fcuk.com/flash/index_large.html, or not :) )
- a filter that replaces presumed "bad" words with *** or something
similar (same concerns as above, plus it makes it obvious that the
site is censored)
- no filtering; rely on clearly communicating the rules of
participation to kids, and allow kids to self-monitor by voting on each
other's entries (the downside being that we can't really "promise" kids
or parents that no inappropriate content will sneak in)
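To make the second and third options concrete, here is a minimal sketch (in Python, purely illustrative - the word list, function names, and whole-word matching strategy are our assumptions, not an existing product) of both a block-on-match filter and a mask-with-asterisks filter:

```python
import re

# Illustrative list only - a real deployment would use a maintained,
# regularly reviewed word list.
BLOCKED_WORDS = {"badword", "fcuk"}

def contains_blocked(text):
    """Return True if any blocked word appears as a whole word."""
    words = re.findall(r"[a-z]+", text.lower())
    return any(w in BLOCKED_WORDS for w in words)

def mask_blocked(text):
    """Replace each blocked word with asterisks of the same length."""
    def repl(match):
        word = match.group(0)
        return "*" * len(word) if word.lower() in BLOCKED_WORDS else word
    return re.sub(r"[A-Za-z]+", repl, text)
```

Note that this sketch reproduces exactly the dilemma above: contains_blocked("I love fcuk jeans") returns True and blocks a harmless brand mention, and whole-word matching in turn misses deliberate obfuscations like "f.c.u.k". No purely mechanical filter escapes that trade-off, which is why a human-review fallback keeps coming up.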

In case none of these solutions ends up being reasonable for the client
and users, perhaps the approach should be a combination of very basic
filtering, encouraging self-monitoring, and periodic human review. Any
recommendations on alternatives would be appreciated, as would accounts
of past positive or negative experiences on the subject. We are also
interested in the thoughts of those of you who have kids of your own -
what kind of site would you rather your kids visited? And last, are
there any server solutions out there that address these needs?


Iva A. Koberg 
