Glossary

Moderation

What is video moderation?

Video moderation is the process of vetting videos for content before they are published, ensuring that every published video meets the requirements of the company or website.

Moderation is typically done by a team of people assigned to watch each video and judge its content against the allowed criteria. More recently, machine learning algorithms have emerged to assist with the moderation process.

Why do I need video moderation?

Adding User Generated Content (UGC) to your site is becoming a popular way to build a sense of community around your content. UGC is also highly trusted - reviews by fellow consumers, for example, generally carry significant weight with viewers. However, with more and more videos being uploaded, many companies are understandably concerned that a customer could, in bad faith, upload a video with inappropriate content, harming the company's reputation. By moderating every video prior to publication, companies can ensure that each one meets their publication standards.

How can I implement moderation?

If the number of videos being uploaded is small, you can have a person (or a team of people) moderate the videos submitted. Videos that meet the submission guidelines are published, while videos that do not are removed from the pipeline.

There are a growing number of moderation services that use machine learning to examine videos for content. These can be used as a "pre-filter" to protect your human moderators from the most egregious content. If you develop a high level of trust in the moderation software, it could also be used as stand-alone moderation (without human moderators).
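As a rough illustration, the pre-filter logic might look like the sketch below. The `get_moderation_scores` function and both thresholds are hypothetical stand-ins for whichever moderation service and confidence levels you choose:

```python
REJECT_THRESHOLD = 0.95   # confident the content violates guidelines
APPROVE_THRESHOLD = 0.05  # confident the content is safe

def get_moderation_scores(video_id: str) -> dict[str, float]:
    # Stand-in for a real moderation provider; a real implementation would
    # call the provider's API and return one confidence score per category.
    return {"adult": 0.02, "smoking": 0.01, "weapons": 0.0, "hate": 0.0}

def route_video(video_id: str) -> str:
    scores = get_moderation_scores(video_id)
    worst = max(scores.values())

    if worst >= REJECT_THRESHOLD:
        return "rejected"       # auto-reject: never reaches human moderators
    if worst <= APPROVE_THRESHOLD:
        return "published"      # auto-approve: confidently safe
    return "human_review"       # ambiguous: queue for a human moderator
```

Tightening or loosening the two thresholds controls the trade-off: stricter thresholds send more videos to the human queue, while looser ones move you toward fully automated moderation.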

Moderation at api.video

Videos uploaded to api.video are not moderated or flagged for content. If you would like to moderate videos uploaded into your account, we have built a demo showing an example implementation; the sample app is available at moderate.a.video. When a new video is uploaded, it is tagged "uncategorized". Videos with this tag are not displayed on the site, but are sent to a moderation API that tests the video for adult content, smoking, weapons, and hate groups. When the results are returned to the server, the video is retagged with the algorithm's findings (for example, 'safe for work' or 'not safe for work'). The website can then use these tags to decide which videos are appropriate to show to its customers.
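The tag swap at the heart of that flow could be sketched as follows. This is a minimal sketch, not the demo's actual code: `check_video` is a hypothetical stand-in for the moderation API, the PATCH request assumes api.video's video-update endpoint accepts a `tags` array, and `ACCESS_TOKEN` is a placeholder for whatever credentials your account uses:

```python
import requests

API_BASE = "https://ws.api.video"   # api.video REST endpoint
ACCESS_TOKEN = "..."                # placeholder: obtained with your api.video API key

def check_video(video_id: str) -> str:
    # Stand-in for the moderation API that scores the video for adult
    # content, smoking, weapons, and hate groups.
    return "safe for work"

def retag_video(video_id: str) -> None:
    # 1. Get the moderation verdict for the newly uploaded video.
    verdict = check_video(video_id)

    # 2. Replace the "uncategorized" tag with the verdict so the
    #    website can filter on it.
    requests.patch(
        f"{API_BASE}/videos/{video_id}",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"tags": [verdict]},
    )
```

Because the verdict lives in the video's tags, the front end needs no moderation logic of its own; it simply queries for videos tagged 'safe for work' and never shows 'uncategorized' ones.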