So Much UGC, So Little Time: Options for Moderating User-Generated Content
By Christie Flanagan on May 23, 2013
We all know the value of incorporating social computing features such as comments, ratings, reviews, polls and blogs into your web presence. These kinds of interactive capabilities are essential for driving customer engagement and fostering community around your brand. Ratings and reviews can also play an important role in influencing the buying behavior of other site visitors. Checking out online ratings and reviews is something many of us commonly do before selecting a particular product or service. These social capabilities help cultivate loyal and satisfied customers who in turn recommend your brand to others. But before enabling these kinds of features on your web presence, there are some things you need to consider.
Whether you want to enable product reviews or commenting on articles, you’ll want to ensure that the dialogue on your website remains focused and relevant. To accomplish this, you’ll need to consider how best to approach the moderation of the user-contributed content your site visitors will be generating.
With an open community approach, there really is no moderation. All can contribute to the conversation and all contributions are published on your website automatically. While this approach eliminates barriers to participation, encourages a very free dialogue and requires little oversight, there are some serious drawbacks. Open communities can invite spam, trolling or other inappropriate behavior that can be damaging to your brand. For many enterprises, an open community is simply an inappropriate choice.
On the opposite end of the spectrum is a controlled community approach, in which all user-generated content must be reviewed and approved by a community manager prior to publication. While this encourages contributors to be on their best behavior and gives the community manager the most control, it can considerably delay the publishing of user-generated content and discourage social interaction on your web presence.
Enterprises that wish to encourage social interactivity while safeguarding brand integrity may want to strike a balance between these two approaches. If your web experience management platform permits it, this can be done by introducing some level of automated or community-member-triggered moderation. For example, you might employ user whitelists and blacklists to determine which site visitors require moderation for their comments. You could also use customizable keyword filters that trigger the moderation of comments containing specific keywords such as profanity. Another option is to enlist members of the community in the moderation process by enabling site visitors to flag content they consider inappropriate. When a comment is flagged for moderation by an automated filter or by site visitors, it can automatically be assigned to the appropriate community manager for review, editing, approval or deletion. In this way, low-risk user-generated content can be published to your site quickly, while content that may be a cause for concern enters a moderation queue for further review.
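The decision logic above can be sketched in a few lines of Python. This is a hypothetical illustration, not the Oracle WebCenter Sites API: the names (`Comment`, `moderate`, the lists and threshold) are all illustrative, and a real platform would manage whitelists, blacklists and keyword filters through its own configuration.

```python
# Hypothetical sketch of the hybrid moderation flow: publish low-risk
# content immediately, queue anything a filter or the community flags.
from dataclasses import dataclass

PUBLISH = "publish"
QUEUE = "queue"

# Illustrative lists; a real deployment would configure these per community.
WHITELIST = {"trusted_user"}           # authors whose comments skip review
BLACKLIST = {"known_spammer"}          # authors whose comments always get review
KEYWORD_FILTER = {"darn", "heck"}      # placeholder profanity/keyword list
FLAG_THRESHOLD = 3                     # visitor flags that trigger review

@dataclass
class Comment:
    author: str
    text: str
    flags: int = 0                     # times site visitors flagged it

def moderate(comment: Comment) -> str:
    """Return PUBLISH for low-risk content, QUEUE for manager review."""
    if comment.author in BLACKLIST:
        return QUEUE                   # blacklisted authors always reviewed
    if comment.author in WHITELIST:
        return PUBLISH                 # trusted authors bypass moderation
    words = {w.strip(".,!?").lower() for w in comment.text.split()}
    if words & KEYWORD_FILTER:
        return QUEUE                   # keyword filter triggered
    if comment.flags >= FLAG_THRESHOLD:
        return QUEUE                   # community flagged the content
    return PUBLISH                     # low risk: publish immediately
```

For example, `moderate(Comment("visitor", "Nice article"))` publishes immediately, while the same comment with five visitor flags would land in the moderation queue for a community manager.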
See how simple it is to deploy user-generated content features and moderation using Oracle WebCenter Sites.