IAB Australia releases social media comment moderation guidelines for organisations

IAB Australia has today launched its Social Media Comment Moderation Guidelines, which set out recommendations on how organisations should moderate user-generated comments posted to their social media channels.

Designed to build upon the guidance issued by the Australian Competition and Consumer Commission (ACCC), IAB's guidelines also highlight the tools made available by the social media platform operators, identify good business practice with respect to moderating user comments, and provide general guidance on how to manage these conversations.

The Guidelines already have the support of social media platform operators and organisations that use these platforms to engage, as well as agencies that support their clients' efforts in this space. Signatories include the IAB Australia membership (including Google, Facebook, LinkedIn, Mi9 and Media Mind), as well as industry organisations including AIMIA, Tourism Australia, Connecting Up, Dialogue Consulting, Quiip, brr media, the Australian Government Bureau of Meteorology and Valued Interactive Media. The Guidelines have also drawn support from the international community, with sister associations IAB NZ and IAB UK also lending their support.

According to Samantha Yorke, Director of Regulatory Affairs at IAB Australia, there has been some confusion amongst the business community about how to manage user comments on social media platforms: "After a careful analysis of existing laws and regulations and industry practice around social media, IAB Australia has reached the view that user comments directed towards an organisation or social media platform, or to other users who are drawn to a particular organisation, do not constitute advertising.

“There is a real risk that organisations who treat user comments as advertising will err on the side of caution and moderate user comments very conservatively, which will adversely impact their presence on social platforms and which arguably undermines the very spirit under which social media thrives,” said Yorke.

In the Social Media Comment Moderation Guidelines, IAB acknowledges that all stakeholders have a role in managing user comments:

- Users should think about the appropriateness of their content before they post it and take responsibility for their comments.

- Platforms should remove reported comments that are illegal or violate their terms and conditions, and should empower organisations using their platforms with tools to assist them in moderating their properties.

- The community should report comments that violate applicable rules.

- Organisations should engage in responsible moderation of user comments posted to their social media channels.

As well as outlining how user comments are treated under Australian Consumer Law, which is enforced by the ACCC, the Guidelines set down the following best practice recommendations:

1. Develop moderation guidelines and publish them on your social media property so that your community is very clear about how behaviour is being managed.

2. Consider developing an internal moderation schedule, appropriate to your resourcing levels, which identifies who is moderating which social media properties and at which times.

3. Develop a crisis management plan for issues that arise on your social media properties and need to be escalated.

4. Moderate the user comments on your branded social media properties to the extent your resources allow.

5. If you don’t have the resources within your organisation to moderate user comments, or your internal risk analysis has deemed your use of social media platforms to be high risk, consider hiring a specialist moderation business that has all the necessary clearances and is well versed in conflict management and jurisdictional matters.

6. If you directly solicit responses or user-generated content by posting a provocative or edgy question on your social media channels, and controversial responses are likely, ensure you have adequate resources to review all responses and any submitted user-generated content promptly.

7. If your business or product is directed towards children, be aware that there may be specific legal or regulatory requirements you need to meet. You should employ moderators who have passed a working-with-children check or police check and who are trained to identify suspicious behaviour that could be indicative of grooming or other predatory behaviour.

8. Regularly review the tools that are available to you when you develop a presence on social media and consider which tools are appropriate for you to implement.

9. Provide feedback to the platform operators around how the tools work and any suggestions for improvement.