Content moderation has become an integral activity for many online communities and websites, especially since it became clear that a certain fraction of registered users post offensive material in the hope of attracting attention from their peers and from the community's content moderators. Such behavior must carry some social penalty, and content moderation professionals face the challenge of devising procedures that lessen this threat while still giving each community control over the kinds of material it chooses to accept. One illustration of this dilemma is the case study conducted by the University of California, Los Angeles' Electronic Press Office on its efforts to moderate the content of several different online forums.
As you might surmise, regulating the content of online forums falls under the heading of "content management," which is itself a subset of "social media management." This division of labor first became possible when members of these online communities agreed to work together to identify offensive postings and take the measures needed to keep them out of their chat rooms, bulletin boards, and discussion forums.
This mutual accountability is considered one of the major reasons behind the popularity of the forum concept, particularly where must-carry rules are applied. In this context, a must-carry rule is civil or criminal legislation that dictates how online communities must handle content likely to offend other members. For instance, a content management team might post information about an upcoming event on a public website without first ensuring that the information is free of offensive material and does not breach any existing social media laws. The same principle can apply when an employer wants to bar employees from using certain social networking platforms, such as Facebook, in the workplace.
The Internet offers a variety of must-carry activities that online communities can undertake for the benefit of their users. One of these is adopting a content moderation policy and posting it on the main page of the network. Such a policy does more than define the procedures for upholding freedom of speech on the site; it also sets the standards by which content is reviewed. This is where Zeran's service can come into play.
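In its simplest form, a posted moderation policy of the kind described above amounts to a list of rules applied to each submission before it appears. The sketch below is a minimal, hypothetical illustration in Python; the rule list, function name, and verdict labels are assumptions made for demonstration, not any real forum's policy.

```python
# Minimal sketch of a rule-based moderation check (hypothetical policy).
# Each rule is a banned term; a post matching any rule is held for review.

BANNED_TERMS = {"spam", "scam"}  # placeholder terms, not a real blocklist


def review_post(text: str) -> str:
    """Return 'approved' or 'held' according to the toy policy above."""
    # Normalize each word: strip trailing punctuation, lowercase.
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & BANNED_TERMS:
        return "held"      # flagged for a human moderator
    return "approved"      # passes the automated check
```

In practice, a real system would combine such keyword rules with user reports and human review rather than relying on an automated blocklist alone.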