
Time-Based Digital Assets are now Mission Critical

How much video are you watching online? I’m pretty confident it’s more than last year, or the year before. It seems that every website now features video in some form or other, and video is becoming increasingly prevalent across social media platforms as well. There’s a good reason: studies have shown that video is more engaging than text or still imagery.

A video with a well-told story that provides value or entertainment (or better yet, both) is often commented on and shared. Video is everywhere in the digital world. In fact, a report by Cisco suggests that this year (2017) video will account for 69% of all consumer-driven traffic on the web. Video assets have also become important for findability, with YouTube now ranked as the second-largest search engine, processing three billion searches a month.

Video has become mission critical

The rise of voice-activated applications and devices means audio is not far behind. Voice-driven search is growing rapidly, with some estimates suggesting that 50% of search queries will be made by voice by 2020.

Audio is becoming mission critical

Both video and audio can be considered time-based digital assets, and they need to be managed, tagged, and produced in a controlled workflow just like more traditional media assets such as photography. The OpenText™ Media Management (OTMM) platform is well positioned to handle traditional media while providing the functionality needed to manage and deliver the growing demand for time-based media.

OpenText™ Media Management now offers an optional Advanced Video Workflow that extends OTMM functionality into the editing suite, addressing the needs of time-based media assets in three specific areas: richer metadata, more control over the asset, and improved integration with preferred editing suites and workflows.

  • OTMM now automatically pulls additional metadata from time-based assets to improve search results and asset handling.
  • New Logging functionality means you can now add annotations and metadata to single scenes, single frames, or sound bites. The metadata selection buttons are fully configurable and can be driven by a controlled language, domain terminology, or other defined terminology sets to provide intuitive tagging. Ranges of frames can also be tagged to create defined sub-clips.
  • The editing tool integration allows frame-by-frame broadcast quality interactions, frame search, and the support of multiple audio channels all within a browser environment. One-button toggling between low-res editing streams and a hi-res preview makes the editing workflow more efficient.
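As a rough illustration of what frame-accurate logging data might look like, a sub-clip can be modeled as a frame range on a source asset plus tags drawn from a controlled vocabulary. The names and structure below are hypothetical for illustration only, not the actual OTMM data model or API:

```python
from dataclasses import dataclass, field

# Hypothetical controlled terminology set driving the tagging buttons.
CONTROLLED_VOCAB = {"interview", "b-roll", "logo", "voice-over"}

@dataclass
class SubClip:
    """Illustrative sketch of a logged sub-clip annotation."""
    asset_id: str
    start_frame: int
    end_frame: int          # inclusive
    tags: set = field(default_factory=set)

    def add_tag(self, tag: str) -> None:
        # Restrict tagging to the defined terminology set.
        if tag not in CONTROLLED_VOCAB:
            raise ValueError(f"unknown tag: {tag}")
        self.tags.add(tag)

    def timecode(self, frame: int, fps: int = 25) -> str:
        # Convert a frame number to an HH:MM:SS:FF timecode.
        secs, ff = divmod(frame, fps)
        mins, ss = divmod(secs, 60)
        hh, mm = divmod(mins, 60)
        return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

clip = SubClip("asset-001", start_frame=250, end_frame=499)
clip.add_tag("interview")
print(clip.timecode(clip.start_frame))  # frame 250 at 25 fps -> 00:00:10:00
```

Keeping tags within a defined vocabulary, as sketched here, is what makes frame-level annotations searchable later rather than a pile of free-text notes.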

Once the tagging and editing work is complete, the finalized assets are sent back to OTMM for storage and retrieval from a single digital asset platform that provides the single source for all your brand-approved assets.

The Advanced Video Workflow option for OpenText™ Media Management provides key video tools so your teams can provide compelling and attention-getting content.

About Alan Porter

Alan J. Porter is the Senior Product Marketing Manager for the OpenText Customer Experience Suite. He is a regular writer and industry speaker on various aspects of Customer Experience and Content Strategy.

