You are currently browsing the monthly archive for January 2009.

Online communities are about relinquishing as much control as possible, while at the same time providing straightforward Rules and Regulations / Terms of Service policies so that members know what is expected.

Time and time again, clients come our way wanting to pre-moderate every piece of User Generated Content (UGC). Pre-moderation is very time consuming and can be very costly, and it does not facilitate interactions between members within online communities. If you are looking for interactions and communications between the members of your community, then you should provide the best user experience possible; holding every post for approval can actually have a negative effect on your community.

At the same time, there are certain times and places where pre-moderation is an effective tool. In this post, I want to discuss some of the situations in which we think pre-moderation can be an appropriate approach.

1. Visual content – Photos and videos that can be uploaded and streamed within an online community should probably be pre-moderated. I say this because visual content can be very damaging in nature and is easily “seen”. Textual content, by contrast, requires you to take the time to read each and every post to determine whether it is actually damaging or violates your policies (and there are pretty good tools that can be leveraged to find that content ahead of time). Photos and videos can be evaluated at a glance – especially those that are obviously unacceptable. I realize that there are photo and video screening tools out there as well, but nothing is 100% reliable.

2. Sites that appeal to a sensitive demographic – Young children, financial advice, health care, the automotive industry…there are others, but I just wanted to point out a few. There are many rules, regulations, and laws within these specific areas that need to be followed and addressed. You do not want to be liable for information posted within your community because you failed to follow these regulations and laws. You also want to ensure that if a piece of content does fall under these laws, you take the appropriate action and report it to the proper authorities.

3. Blog comments – If a member owns a blog within your community and wants to ensure that the content posted there is clean and accurate, you might consider allowing them to pre-moderate their own individual blog. Some members truly want to “own” their blog and decide what content should be allowed, especially if it is located within their profile page. Now, they may not approve posts that attack or question their views, but in those cases they will discredit themselves.
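The hold-then-approve flow described above can be sketched as a minimal submission queue. This is an illustrative sketch only; the names (`ModerationQueue`, `Status`, and so on) are hypothetical and not any particular platform’s API. The key property is that new content starts as pending and is never visible to the community until a moderator approves it.

```python
from enum import Enum


class Status(Enum):
    PENDING = "pending"
    APPROVED = "approved"
    REJECTED = "rejected"


class ModerationQueue:
    """Holds member submissions until a moderator reviews them."""

    def __init__(self):
        self._items = {}
        self._next_id = 0

    def submit(self, author, content):
        """New content enters the queue as PENDING and is not yet visible."""
        item_id = self._next_id
        self._next_id += 1
        self._items[item_id] = {
            "author": author,
            "content": content,
            "status": Status.PENDING,
        }
        return item_id

    def review(self, item_id, approve):
        """A moderator approves or rejects a pending item."""
        self._items[item_id]["status"] = (
            Status.APPROVED if approve else Status.REJECTED
        )

    def visible(self):
        """Only approved content is ever shown to the community."""
        return [
            item["content"]
            for item in self._items.values()
            if item["status"] == Status.APPROVED
        ]
```

The cost noted above falls out of this structure: every single submission requires a `review` call from a human before anyone else sees it, which is why pre-moderating an entire community rarely scales.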

Can you think of other instances where pre-moderation would be beneficial, or where you have seen it work?

Mike

Note: Cross-posted at mzinga.com


For those of you who have an online community, or have managed one in the past, you are well aware of what “Trolls” are.

For those of you new to online communities, a “Troll” is someone who comes to your community with the main goal of causing problems and wreaking havoc. Causing a problem just to cause a problem is not acceptable behavior within any community; any malicious behavior should be dealt with swiftly.

At the same time, there are certain types of behavior that these members (“Semi-Trolls” if you will) can bring to an online community that may, and I stress MAY be acceptable/beneficial.

These members generally choose the opposite side of any story in order to get into a debate with other members. As long as the debate is productive and stays within the posted rules and guidelines, it should be allowed to continue – albeit you should keep a close eye on it. Before it spirals out of control, step in and let your community know that you are watching the discussion, so they are aware that specific actions can and will be taken, both against the content and against any member’s account. Public debates can make a community stronger and healthier – again, as long as they are conducted in a productive fashion. It is extremely important that all members understand others’ views and get the full picture, so they can form their own opinions (both of the member and of the topic at hand). Once a member oversteps the line, they discredit themselves within the community, which in turn helps bring out the personalities and views of other members.

Have you observed this type of behavior in the past? Did it help you to form a better opinion concerning a specific member or topic?

Mike

Note: Cross-posted at mzinga.com

As I have discussed in prior posts, Moderation Services almost always extend beyond content removal and enforcing a Usage Policy.

These services include, but are not limited to:

  • Proactive scanning
  • Managing member accounts
  • Responding to Violation reports
  • Facilitation and Interaction
  • Customer Support
  • Reporting Escalations
  • Report Generation

I would like to touch upon the next-to-last bullet point with this post (Reporting Escalations).

There are situations where a Moderator and/or client needs to get others involved – in some cases Law Enforcement. Threats of any kind (to one-self or to others) should not be tolerated or taken lightly. Reports of abuse of any nature should be handled appropriately.

Appropriate courses of action need to be outlined ahead of time so that everyone is prepared and the process is streamlined. While this is definitely a “worst-case scenario,” escalation paths need to be defined. In most cases, we assist our clients in the escalation process, helping them define what needs to be escalated, when, to whom, and what the client should do once they receive a report from us. In other cases, our clients already have escalation paths in place; rather than “re-invent the wheel,” we duplicate what is already there.
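A pre-agreed escalation path like the one described above can be sketched as a simple lookup table that maps a report category to who gets notified and how quickly. Everything below is a hypothetical placeholder, not an actual client configuration: the categories, contact names, and response windows would be defined with each client up front, exactly as the post describes.

```python
# Hypothetical escalation paths, agreed upon with the client ahead of time.
# "notify" lists who receives the report; "sla_hours" is the agreed
# maximum response window for that category.
ESCALATION_PATHS = {
    "threat_of_harm": {
        "notify": ["law_enforcement", "client_contact"],
        "sla_hours": 1,
    },
    "abuse_report": {
        "notify": ["client_contact"],
        "sla_hours": 4,
    },
    "policy_violation": {
        "notify": ["moderation_team"],
        "sla_hours": 24,
    },
}

# Default path so that an unrecognized category never falls through
# the cracks: route it to the moderation team for triage.
DEFAULT_PATH = {"notify": ["moderation_team"], "sla_hours": 24}


def escalate(category):
    """Return the pre-agreed escalation path for a report category."""
    return ESCALATION_PATHS.get(category, DEFAULT_PATH)
```

The design point is the default path: defining the escalation routes in advance means the worst case (a threat that must reach law enforcement) is never improvised, and anything unexpected still lands with a human.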

Have you ever been involved in a situation and were unsure how to handle it within your community? If you had prepared, would the situation have been resolved in a timely fashion?

Mike

Note: Cross-posted at mzinga.com

Employment

I am currently employed at Autodesk as a Senior Manager of Online Community and Social Engagement. My team is responsible for customer support initiatives across all of our social channels.
