
When was the last time that you walked by a restaurant with no one in it and decided to walk in?
Empty seats, how inviting is that?

As I have stated many times, Moderation and Management services have transformed over the last year or so. In the past, Moderation services were thought of as strictly reactive – removing content and the associated member accounts.

Now, the roles and responsibilities of Community Moderators have expanded to include more proactive work. Communities that are just beginning need to be jump-started. Content needs to be posted and interactions need to be modeled within the community, so that when potential members stumble upon the area, they are aware of what is acceptable and allowed and what the community is geared towards. Not only that, members need to be aware of all of the features and functionality that are available to them.

Moderation services can assist with seeding content and actively facilitating interactions within a community to get it on track and moving forward. Some communities may need more than others, and these services can be tailored to meet any company's needs.

After all, when you visit a restaurant or an online community, it is nice to know what you are getting into.

The next time you think about an online community and how to manage and moderate it, consider what your needs and goals are and how you expect to accomplish them – while, at the same time, taking a look from the outside.

Moderation and Management (whether internally or externally driven) of any online community is not something that should be an afterthought; if it is, you are already setting yourself up to sink.

"How much time is needed?" is a question potential clients ask time and time again, and the answer is not as cut-and-dried as you might think.

Let me first state that, in general, I do not like it when someone answers a question with another question, but the fact of the matter is that there are certain times and places when it is appropriate, and this is one of them.

When you think about Moderation services, there are different ways you can support your online community initiative, as I have outlined in previous posts. There are also "times" associated with those approaches. Outlining a strategy in advance will help you figure out how much time is needed – as a baseline. It is very important to understand this "baseline" coverage so that everyone involved understands the importance of investing time. If you invest little or no time in your community, it is destined to fail. You get out of your community what you put into it.

Here are some questions to think about before you discuss “How much time is needed”:

What types of UGC are you allowing?
Photos may be easier to review than textual content. Videos may be 30 seconds long, or they may be two minutes.

What types of Moderation will be implemented?
Are you going to review every piece of content before it appears (pre-moderation), or are you going to allow content to be posted immediately? Are you going to allow your members to report content that they feel violates your policies (a very scalable solution)? Are you planning to proactively scan content? Proactively scanning content may not take as long as reviewing member-reported posts, where you need to pay attention to the detail within each report and process it appropriately.

Who is your target demographic?
How familiar are they with online communities? Will they need much hand-holding? Do they understand how they should behave, and what is acceptable? Is the subject or audience sensitive?

Will member accounts be managed – will you suspend them?

At times, members deserve or need a time-out. Whether it is indefinite or for a month, are you going to allow them to "appeal the decision"?
Do you have a proper escalation path in place so that appeals can be handled efficiently? Are you requiring registration at all (which makes accounts easier to manage)?

Will members' questions and concerns be addressed (customer support) within your community?

Proactive involvement is important for almost every community. While the goal is eventually to have the community be as self-supporting as possible, an initial investment is needed to educate members and help answer their questions.

How many people are going to be involved in participating?
Is support involved? Are other departments planning to participate? Is a third party supplementing your internal coverage? The more people you can leverage to share this workload, the better off everyone will be. Spread the wealth; harness the power of the masses.

This is by no means an all-inclusive list of questions, but just a beginning – to get the ball rolling and the juices flowing.
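
To make the "baseline" idea a bit more concrete, here is a rough back-of-the-envelope sketch in Python. Every volume and per-item review time below is a hypothetical placeholder – substitute the numbers you actually expect for your own community.

```python
# Rough, illustrative estimate of weekly moderation coverage.
# All volumes and per-item handling times are hypothetical placeholders.

SECONDS_PER_ITEM = {
    "text_post": 20,       # skim a forum post or comment
    "photo": 10,           # glance at an image
    "video": 90,           # spot-check a short clip
    "member_report": 180,  # investigate a member-reported violation
}

EXPECTED_WEEKLY_VOLUME = {
    "text_post": 2000,
    "photo": 300,
    "video": 50,
    "member_report": 40,
}

def weekly_moderation_hours(volumes, seconds_per_item, overhead=1.25):
    """Estimate baseline moderator-hours per week.

    `overhead` pads the raw review time for escalations, account
    management, and facilitation work that is hard to itemize.
    """
    total_seconds = sum(
        count * seconds_per_item[item] for item, count in volumes.items()
    )
    return total_seconds * overhead / 3600.0

if __name__ == "__main__":
    hours = weekly_moderation_hours(EXPECTED_WEEKLY_VOLUME, SECONDS_PER_ITEM)
    print(f"Estimated baseline coverage: {hours:.1f} moderator-hours/week")
```

Even a crude estimate like this gives everyone involved a shared starting point for the staffing and coverage conversation.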

Thoughts?

Note: Cross-posted at mzinga.com

Online communities are about relinquishing as much control as possible while, at the same time, providing straightforward Rules and Regulations / Terms of Service policies so that members know what is expected.

Time and time again, clients come our way thinking that they would like to pre-moderate every piece of User Generated Content (UGC). Pre-moderation is very time-consuming, can be very costly, and does not facilitate interactions between members within online communities. If you are looking for interaction and communication between the members of your community, you should provide the best user experience possible – and pre-moderating everything can actually have a negative effect on your community.

At the same time, there are certain times and places where pre-moderation is an effective tool. In this post, I want to discuss some of the situations where we think pre-moderation can be an appropriate approach.

1. Visual content – Photos and videos that can be uploaded and streamed within an online community should probably be pre-moderated. Visual content can be very damaging in nature and is easily "seen". Textual content, by contrast, has to be read post by post to determine whether it is actually damaging or violates your policies (and there are pretty good tools that can be leveraged to find that content ahead of time). Photos and videos can be assessed at a glance – especially those that are obviously unacceptable. I realize there are photo and video screening tools out there as well, but nothing is 100% accurate.

2. Sites that appeal to a sensitive demographic – Young children, financial advice, health care, the automotive industry… there are others, but those are a few examples. These areas are subject to many rules, regulations, and laws that need to be followed and addressed. You do not want to be liable for information posted within your community because you were not following those regulations and laws. You also want to ensure that if a piece of content does fall under those laws, you take the appropriate action and report it to the proper authorities.

3. Blog comments – If a member owns their own blog and wants to ensure that the comments being posted are clean and accurate, you may think about allowing that member to pre-moderate their individual blog. Some members want to truly "own" their blog and decide what content should be allowed, especially if it lives within their profile page. They may not approve comments that attack or question their views, but in those cases they will only discredit themselves.
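
As a rough illustration (and not a description of any particular platform's feature set), here is a minimal sketch of how that kind of selective pre-moderation might be expressed in code. The content types, the sensitivity flag, and the queue names are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Submission:
    content_type: str                 # e.g. "photo", "video", "forum_post", "blog_comment"
    community_is_sensitive: bool      # e.g. children's, financial, or health-care community
    owner_premoderates_blog: bool = False

def route(submission: Submission) -> str:
    """Decide where a new piece of UGC goes, following the three cases above."""
    if submission.content_type in ("photo", "video"):
        return "premoderation_queue"        # 1. visual content is reviewed before it appears
    if submission.community_is_sensitive:
        return "premoderation_queue"        # 2. sensitive demographics get pre-moderation
    if submission.content_type == "blog_comment" and submission.owner_premoderates_blog:
        return "owner_approval_queue"       # 3. the blog owner approves their own comments
    return "publish_immediately"            # everything else goes live, backed by member reporting

if __name__ == "__main__":
    print(route(Submission("photo", community_is_sensitive=False)))      # premoderation_queue
    print(route(Submission("forum_post", community_is_sensitive=False))) # publish_immediately
```

The point is not the code itself but the decision: pre-moderation is applied where the risk justifies the cost, and everything else stays as open and immediate as possible.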

Can you think of other instances where pre-moderation would be beneficial, or where you have witnessed it working?

Mike

Note: Cross-posted at mzinga.com

As I have discussed in prior posts, Moderation Services almost always extend beyond content removal and enforcing a Usage Policy.

These services include, but are not limited to:

  • Proactive scanning
  • Managing member accounts
  • Responding to Violation reports
  • Facilitation and Interaction
  • Customer Support
  • Reporting Escalations
  • Report Generation

I would like to touch upon the next-to-last bullet point with this post (Reporting Escalations).

There are situations where a Moderator and/or client needs to get others involved – in some cases, law enforcement. Threats of any kind (to oneself or to others) should not be tolerated or taken lightly. Reports of abuse of any nature should be handled appropriately.

Appropriate courses of action need to be outlined so that everyone can be prepared and the process can be streamlined. While this is definitely a "worst-case scenario" thought, escalation paths need to be defined.
In most cases, we work through the escalation process with our clients, helping them define what needs to be escalated, when, to whom, and what the client should do once they receive a report from us.
In other cases, our clients already have escalation paths in place, so we don't "re-invent the wheel" – we simply adopt what is already in place.
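
As a purely hypothetical illustration of what an escalation path can look like once it is written down, here is a small sketch. Every category, contact, and response window below is a placeholder to be replaced with whatever you and your client agree on.

```python
# Hypothetical escalation matrix: what gets escalated, to whom, and how quickly.
ESCALATION_PATHS = {
    "threat_of_self_harm": {
        "notify": ["client_safety_lead", "law_enforcement_if_imminent"],
        "respond_within_minutes": 15,
    },
    "threat_to_others": {
        "notify": ["client_safety_lead", "legal", "law_enforcement"],
        "respond_within_minutes": 15,
    },
    "abuse_report": {
        "notify": ["community_manager"],
        "respond_within_minutes": 120,
    },
}

def escalate(category: str) -> dict:
    """Look up the agreed-upon path; anything unrecognized goes to a default owner."""
    return ESCALATION_PATHS.get(
        category,
        {"notify": ["community_manager"], "respond_within_minutes": 240},
    )

if __name__ == "__main__":
    path = escalate("threat_to_others")
    print(f"Notify {', '.join(path['notify'])} within {path['respond_within_minutes']} minutes")
```

Whether it lives in code, a wiki page, or a one-page document, the value is the same: when a serious report comes in, no one has to stop and figure out who to call.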

Have you ever been involved in a situation within your community and been unsure how to handle it? If you had prepared, would the situation have been resolved in a more timely fashion?

Mike

Note: Cross-posted at mzinga.com

As I have outlined in the Introduction to Moderation webinars and blog posts that I have hosted and posted over the last two months, there are many forms of online Moderation – seeding, participating, scanning, removing, managing (member accounts and content)… Here I would like to describe the service that will assist with your initial launch – seeding content – which could be the most important thing that you do.

When you are building and developing your online community area, you will have the tools and the outline/framework of what your community is going to look like, but you will not have any content. Supplying content within your community before it launches is an extremely important first step. The content that you, or a third party, supply is key to your launch and to starting off on the right foot. When members come in, review the content, and see an area that is vibrant, robust, and informative, odds are they will stop and stay a while, maybe even register, post, and interact within your community. When they see an area that is empty, they will simply look, leave, and likely never return.

To picture this in everyday terms, it is just like visiting a town, walking down its streets, and suddenly feeling hungry. You walk by the first restaurant and notice that there is no one in it except the staff, waiting to work, so you walk right by. Granted, there would be no wait to sit down and eat, because the restaurant is empty, but you are not willing to invest your time and money. Not to mention, no one else is even there – how good could it be?

Now, the second restaurant that you walk by (located right next door) is full of people laughing, eating, and drinking, and the wait is 15 minutes. The food and atmosphere are very inviting, so you go in, grab one of their "buzzers", and take a seat at the bar. While you are waiting for your table, you listen to the conversation, maybe watch the TV, and observe the behavior happening all around you. This may even be a place that you revisit over time – you may even become a regular.

*Note – I do realize that I make a lot of restaurant references.

The success of your community depends on the investments that you make. The tools that you offer are only one aspect of launching a successful community. Proper management techniques and Moderation services are extremely important. This does not mean that you have to work with a third party to support your efforts, but more often than not, it is better to work with a team of professionals with established credibility than to try to wing it.

As always, thanks for reading.

Mike

I was reading Jeremiah Owyang's blog post about the "Bozo" feature within community platforms, and it got me thinking about how valuable the feature can be when used properly.

There are always cases where you run through all of your options when managing members (gagging, locking out, banning, restricting, pre-moderating…). These are the first steps or tools typically used to manage member accounts and restrict their access to a community. The "Bozo" option or flag (or whatever you want to call it) is generally a last course of action that the moderation team or a community manager can take. While decisions may never make it down to this level, it is an option that is available.

Limiting the actions that you can take against a member's account is not a great way to moderate or manage a community. I was always taught to leave all of my options open and never back myself into a corner. In general, the flag is used against members who have no interest in your community, members, brand, company, or employees – their only reason for being there is to try to demolish and destroy, at all costs.

Also, by leveraging this flag, you can "buy yourself some time". It gives the moderation team or Community Manager time to review the account, look at any registration information, and note it. In many cases this member, like banned members, will try to come back. If you have reviewed the current account, you can often find any new account(s) this member registers within minutes and "beat them to the punch". Granted, this is always a cat-and-mouse game and can take a lot of time, but if you are able to do your research and "buy yourself this time", it will only help you in the long run.
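
For readers less familiar with the feature: the "Bozo" flag is commonly implemented as a silent restriction – the flagged member can still see their own posts, but nobody else can. Here is a minimal sketch of that behavior, assuming that common interpretation rather than any specific platform's API; the names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Member:
    username: str
    is_bozo: bool = False  # silently restricted; their posts are hidden from everyone else

@dataclass
class Post:
    author: Member
    body: str

def visible_posts(posts: list[Post], viewer: Member) -> list[Post]:
    """Return the posts a given viewer should see.

    A bozo-flagged member still sees their own posts (so they rarely notice
    anything has changed), while the rest of the community is spared the disruption.
    """
    return [
        p for p in posts
        if not p.author.is_bozo or p.author.username == viewer.username
    ]

if __name__ == "__main__":
    troll = Member("troll42", is_bozo=True)
    regular = Member("helpful_member")
    thread = [Post(regular, "Welcome!"), Post(troll, "This community is garbage.")]

    print(len(visible_posts(thread, viewer=regular)))  # 1 – the troll's post is hidden
    print(len(visible_posts(thread, viewer=troll)))    # 2 – the troll still sees their own post
```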

Mike

Note: Cross-posted at mzinga.com

