You are currently browsing the monthly archive for February 2009.

I recently read a couple of blog posts about how much information you should share online, and it got me thinking about how important Moderation is in that respect. It really struck me when I read a blog post by Jim Storer, located here.

Now, I understand that transparency is the ultimate goal of many online communities, but I want to describe this sharing of information in real-life terms.

Online communities have been described as a party, with the company, business, or individual being the host. When you register with that community, you give certain information to the "host" of the party, but there are legal rules and restrictions on what they can do with that information. Then you are allowed to attend the party and share whatever information you would like with other individuals.

You may stand or sit with a group of four or five people (the number does not really matter) whom you feel comfortable with and interact in a truly open and honest way. You may also mingle around the room and stumble upon others with whom you do not feel the same comfort level, so you just sit back and observe the situation and the interactions, not truly participating. You are allowed to ignore those you do not want to interact with or do not like. You may pass your business card to some and choose not to give it to others. You may follow up with some people with a phone call or email, and others you may never see again.

Now, to put this in online terms: you can try to interact with certain groups of people, and you can try to keep some information "close to your hip," but the fact of the matter is that the information is open, available, and searchable to everyone, at any time. You may have made the conscious decision to be open in one context but not as open in others, but within an online community that distinction does not matter. All of the information that you shared the entire time you were there is readily accessible.
Now, I realize that there are private communities, and that some tools and communities have restrictions on who can see what. Even so, any and all information that you post is openly available to anyone within that community – including those you may not want to have that information.

I understand transparency, and I understand that many want to be open and honest – but Moderation of the content that you share online is very important. I am not saying that you should not be open and honest, I am just saying to be careful, because as my grandmother used to say, “There are a lot of crazies out there”. Thoughts?

Note: Cross-posted at

This question is one that is asked time and time again by potential clients, and the answer is not as cut-and-dried as you would think.

Let me first state that, in general, I do not like it when someone answers a question with another question, but the fact of the matter is that there are certain times and places when it is appropriate, and this is one of them.

When you think about Moderation services, there are different ways and services by which you can support your online community initiative, as I have outlined in previous posts. There are also "times" associated with those "ways/services." Outlining a strategy in advance will help you figure out how much time is needed – as a baseline. It is very important to have an understanding of this "baseline" coverage so that everyone involved understands the importance of investing time. If you invest no time, or only minimal time, in your community, it is destined to fail. You get out of your community what you put into it.
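To make the idea of a "baseline" concrete, here is a minimal sketch of how you might estimate weekly moderation hours. The content volumes and per-item review times below are purely illustrative assumptions – the right numbers depend entirely on your community and your moderation approach:

```python
# Hypothetical baseline estimator for weekly moderation time.
# All volumes and per-item review times are made-up assumptions,
# not figures from any real community.

# Estimated items submitted per week, by content type
weekly_volume = {"text": 400, "photo": 150, "video": 20}

# Assumed average review time per item, in minutes
# (photos are often quicker to review than text; videos take longer)
review_minutes = {"text": 0.5, "photo": 0.25, "video": 2.0}

def baseline_hours(volume, minutes_per_item):
    """Return the estimated weekly moderation hours for the given volumes."""
    total_minutes = sum(volume[k] * minutes_per_item[k] for k in volume)
    return total_minutes / 60

hours = baseline_hours(weekly_volume, review_minutes)
print(f"Baseline moderation time: {hours:.1f} hours/week")
```

Even a rough estimate like this gives everyone involved a shared starting point, which you can then adjust as the questions below are answered.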

Here are some questions to think about before you discuss “How much time is needed”:

What types of UGC are you allowing?
Photos may be easier to review than textual content. Videos may be 30 seconds long, or they may be 2 minutes.

What types of Moderation will be implemented?
Are you going to review every piece of content before it goes live (pre-moderation), or are you going to allow content to be posted immediately? Are you going to allow your members to report content that they feel violates your policies (a very scalable solution)? Are you planning to proactively scan content? Proactively scanning content may not take as long as reviewing member-reported posts, where you need to pay attention to the details of each report and process it appropriately.

What/Who is your target demographic?
How familiar are they with online communities? Will they need much hand-holding? Do they understand how they should behave, and what is acceptable? Is the subject or audience sensitive?

Will member accounts be managed – for example, suspended?

At times, members deserve or need a time-out. Whether the suspension is for a month or indefinite, are you going to allow them to appeal the decision?
Do you have a proper escalation path in place so that appeals can be handled efficiently? Are you requiring registration at all (which makes accounts easier to manage)?

Will members' questions and concerns be addressed (customer support) within your community?

Proactive involvement is important for almost every community. While the goal is for the community to eventually become as self-supporting as possible, an initial investment is needed to educate members and to answer their questions.

How many people are going to be involved in participating?
Is support involved? Are other departments planning to participate? Is a third party supplementing your internal coverage? The more people you can leverage to share this time commitment, the better off everyone will be. Spread the wealth; harness the power of the masses.

This is by no means an all-inclusive list of questions, but it is a beginning – to get the ball rolling and the juices flowing.


Note: Cross-posted at


I am currently employed at Bose as the Digital Platform Manager, leveraging Ratings & Reviews and Community content to increase customer acquisition and retention.
