Tools vs. Humans – Content management

This is a topic that will always be debated and discussed when it comes to moderation services.

“Can’t we just allow our members to moderate themselves?” While this may be an acceptable tactic within some communities, it can be a recipe for disaster in others. You will always have groups of people within an online community who become friends (much like high school). Whether it is based on demographics, likes and dislikes, or something else entirely, this can create an unwelcoming feeling for “newbies” in your community. Members may, and often will, end up removing content simply because they do not like what is being said, and you may end up with a community that has no content at all. There always needs to be some form of governance within online communities: an overseeing authority to assist. Imagine a football game without any refs, or a baseball game without any umps… I think you get the picture.

vs.

“The tools should be advanced enough not to need any human intervention.” Tools can only capture so much, and “false positives” can ruin a community. Filters can only catch what they are told to catch; words and phrases can be taken out of context and removed incorrectly. A system where content is removed after “x” number of people report it is not an effective approach either. Members will figure out how the system works and game it; not to mention, tools are only as effective as their owners 😉

Obviously those are the most extreme statements on either side of the fence, but these types of thoughts do still exist within online community management and moderation. As I have posted time and time again across many of my blog posts, there needs to be a balance between tools and humans. Tools cannot catch 100% of the content that may fall outside of your stated policies, and neither can humans. Allowing your members to report content that they believe falls outside of your policies is very effective and should be included in every community. It lets your members feel that they have an “individual” say in what is and is not allowed to be posted. Not to mention, when your members report something and you end up removing it, they will recognize the removal, and it will encourage them to report other items as well, assisting in the management and moderation of your community. This balance between appropriate tools and human interaction and oversight is key to the success of every online community. Thoughts?
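To make that balance concrete, here is a minimal sketch in Python of the flow described above. The names (BANNED_TERMS, REPORT_THRESHOLD, Post, review_queue) are hypothetical, not any real platform's API; the point is that both the automated filter and the member reports only flag content into a queue, and a human moderator makes every removal decision.

```python
# Minimal sketch of a tools-plus-humans moderation flow.
# Assumptions: BANNED_TERMS, REPORT_THRESHOLD, Post, and review_queue
# are illustrative placeholders, not a real product's API.

from dataclasses import dataclass

BANNED_TERMS = {"spamword"}   # placeholder filter list, not a real policy
REPORT_THRESHOLD = 3          # member reports needed before a human looks

@dataclass
class Post:
    author: str
    text: str
    reports: int = 0
    flagged: bool = False

review_queue: list[Post] = []

def flag(post: Post) -> None:
    # Queue a post for human review exactly once, however it was triggered.
    if not post.flagged:
        post.flagged = True
        review_queue.append(post)

def on_submit(post: Post) -> None:
    # Automated filter: it can only catch what it is told to catch,
    # so it escalates to a human instead of deleting outright,
    # which keeps false positives from removing legitimate content.
    if any(term in post.text.lower() for term in BANNED_TERMS):
        flag(post)

def on_report(post: Post) -> None:
    # Member reports: after "x" reports the post is escalated for review
    # rather than auto-removed, since auto-removal is what members game.
    post.reports += 1
    if post.reports >= REPORT_THRESHOLD:
        flag(post)

def moderator_decides(post: Post, remove: bool) -> None:
    # The human makes the final call with full context.
    review_queue.remove(post)
    if not remove:
        post.flagged = False  # false positive; the post stays up
```

The design choice is the whole argument in miniature: the tools narrow the haystack, the reports give members their “individual” say, and the human keeps context and false positives from wrecking the community.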
