For today's social networks, success can be a double-edged sword. Not only are more people using social media, they're using it more frequently, for longer periods, and they're sharing more content than ever, especially videos and images, all of which need to be moderated to ensure that users see only content the social network defines as appropriate.
Until now, social networks have been forced to depend on the only option available for moderation: humans reviewing and filtering explicit videos, images and text. This method has two enormous problems:
1. Human moderation doesn't work in real-time
2. Humans (no matter how many you hire) can't keep pace with the massive volume of images and videos being shared
ImageVision changes all this. Serving as a first line of defense, ImageVision technology scans for nudity in videos and images and inappropriate language in text. This occurs automatically and in real time, as the content is being shared. Content that's regarded as "safe" loads immediately to the network with no further action taken. Suspect videos, images and text are flagged for human moderators to review.
By reserving human moderation for suspect content only, rather than for all content, social media companies serve their users more efficiently and cut human moderation costs by up to 80%, all while delivering a more enjoyable social networking experience.
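The routing logic described above can be sketched in a few lines. This is an illustrative outline only, not ImageVision's actual API: the function names, the placeholder scorer, and the confidence threshold are all assumptions made for the example.

```python
# Minimal sketch of a real-time pre-moderation pipeline.
# All names (scan_for_nudity, SAFE_THRESHOLD) are illustrative,
# not ImageVision's actual interface.

SAFE_THRESHOLD = 0.2  # hypothetical confidence cutoff


def scan_for_nudity(content: bytes) -> float:
    """Placeholder scorer returning a 0-1 likelihood that content is explicit.

    A production system would run a trained classifier here.
    """
    return 0.0


def moderate(content: bytes) -> str:
    """Route content: publish safe items immediately, flag the rest."""
    score = scan_for_nudity(content)
    if score < SAFE_THRESHOLD:
        return "publish"          # loads to the network, no further action
    return "flag_for_review"      # queued for a human moderator


print(moderate(b"example upload"))
```

Because only flagged items reach the human review queue, moderator effort scales with the volume of suspect content rather than with total upload volume, which is the source of the cost savings described above.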