For years, the primary goal of publishers and platforms has been viewer acquisition. But with today’s viewers overwhelmed with options, engagement has taken center stage in the battle for more viewers, more loyal viewers, and increased revenue. No longer can publishers live by the maxim “build it and they will come.” Rather, it’s about delivering viewing experiences that are transformed from passive to active, broad to localized, and generic to contextual.
It’s against this backdrop that interactivity becomes a true competitive differentiator. And metadata has emerged as the key ingredient that drives it.
What Is Video Metadata?
Before we can understand the potential metadata holds for engaging your audiences, let’s start by understanding what it is. In its simplest form, video metadata is the descriptive information that is embedded within a video file. Essentially, it is data about data. It answers content questions such as:
What was created?
When was it made?
Where was it created?
Who made the content?
How should the content be shown?
Which platforms should the content be shown on?
Why was the content made?
Why Is Video Metadata Important?
For video, there are three crucial roles that metadata plays.
Enables Search Engine Indexing
Just as with metadata from websites and other uploaded content, video metadata tells search engines what the content is about. It helps them determine which search results a video should appear in for a given query and, ultimately, how it will rank against related content. Three key parts of this metadata to consider for search engines are the title, description, and tags.
Title. Make sure your title includes essential keywords that users are likely to search for. However, the title should also accurately reflect the content in the video to help ensure longer viewing times, which lead to higher rankings in the search engine.
Description. Since search engines don’t “watch” your content, descriptions are needed to tell search engines what the content is about. These descriptions are often included in search engine results and can help videos rank better on some platforms.
Tags. Tags help search engines better understand what the video is about, which will also aid in search engine discoverability. They also play an important role in helping publishers organize content in logical ways that align with how users search. And finally, as your content library grows, they’ll be essential in helping you to internally organize and quickly find content.
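One common way to expose the title, description, and tags to search engines is a schema.org VideoObject block embedded in the page. Here is a minimal sketch in Python; the title, tags, and dates are placeholders, not real video data:

```python
import json

# Minimal sketch: build a schema.org VideoObject JSON-LD block that exposes
# a video's title, description, and tags to search engine crawlers.
# All example values below are placeholders.
def video_jsonld(title, description, tags, upload_date, duration_iso8601):
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "VideoObject",
        "name": title,                # the searchable title
        "description": description,   # often shown in search results
        "keywords": ", ".join(tags),  # tags as a comma-separated list
        "uploadDate": upload_date,    # ISO 8601 date
        "duration": duration_iso8601, # e.g. "PT4M30S" for 4 min 30 s
    }, indent=2)

print(video_jsonld(
    "How to Poach an Egg",
    "A four-minute walkthrough of poaching eggs at home.",
    ["cooking", "eggs", "tutorial"],
    "2024-01-15",
    "PT4M30S",
))
```

Embedding the resulting JSON in a `<script type="application/ld+json">` tag makes the same metadata a crawler reads available in one machine-readable place.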
Improves User Experience
Metadata also offers a better experience on video sites and OTT apps. On these platforms, metadata helps viewers better understand what the video is about and whether they should invest the time to watch. It also helps optimize the organization of content and increases discoverability as users browse through genres and episodic content.
Increases Audience Engagement
Perhaps the most interesting role metadata plays is its potential to increase viewer engagement. It’s the foundation for creating the interactivity today’s viewers want.
How Video Metadata Can Drive Interactivity
Keep in mind that while interactive video is highly engaging, it should not be incorporated into every piece of video content. It’s important to build the combination of the right video for the right audience and the most engaging interactive experience. That said, there are four use cases where interactive video can be particularly potent, especially when driven by metadata.
Sports
Sports is a powerful use case because of the huge amount of metadata that is generated during a typical game, match, or season. Combine this with the historical data that can be mined and utilized for both live broadcasts and video on demand (VOD), and you can start to see how it can enrich the viewing experience in a variety of ways. Common areas where it’s used include displaying player and team stats in real time, instant replays, and highlight reels.
For example, the German Football League media hub, which stores all video and metadata related to German football matches, holds 11 petabytes of data—the equivalent of 2.75 billion smartphone photos. The metadata generated includes all the official match data, live logging of action types, and data generated by machine learning and artificial intelligence.
When building out an interactive framework for sports, there are several important questions to consider.
How does the metadata you have access to support the game or match?
Is the game being broadcast live, or will you only have access to VOD?
How often will the metadata be updated during the game?
How will you get this data to the content and the audience?
How will you show this data to the audience in a way that is meaningful but also doesn’t distract?
But the biggest challenge with sports is that the data and interactivity have to sync with what is happening in the game. Audiences will expect scores and points to be updated in a livestream in the same way as in a TV broadcast. Because of the numerous systems involved and gates to pass through in the workflow, this can be quite tricky.
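To make the sync problem concrete, here is a minimal sketch of how a player could filter a timestamped event feed against the viewer’s playhead; the event feed and latency figure are invented for illustration:

```python
# Illustrative sketch (not a real Brightcove API): keep a timestamped feed of
# match events in sync with a viewer's playhead, allowing for stream latency.
from bisect import bisect_right

# Hypothetical event feed: (seconds since kickoff, event description)
EVENTS = [
    (132.0, "Goal - Home 1:0"),
    (1470.5, "Yellow card - Away #4"),
    (2705.0, "Goal - Away 1:1"),
]

def visible_events(events, playhead_s, latency_s=0.0):
    """Return events the viewer should see: everything that happened at or
    before the playhead, shifted back by the stream's known latency."""
    cutoff = playhead_s - latency_s
    times = [t for t, _ in events]
    return [desc for _, desc in events[:bisect_right(times, cutoff)]]

# At 25 minutes in, with 20 s of stream latency, the viewer sees the first
# goal and the yellow card, but not the later equalizer.
print(visible_events(EVENTS, 1500.0, latency_s=20.0))
```

The latency offset is the crux: without it, a livestream viewer could see a score update before the goal appears on screen, which is exactly the spoiler problem the paragraph above describes.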
Entertainment
Metadata can be an effective way to enrich the viewer’s experience during their favorite TV shows. Interactive features like actor bios provide a new layer of depth and additional context that encourages engagement. Metadata also offers publishers new revenue streams through product placements and targeted, contextual advertising that’s closely aligned with the content itself.
Implementing metadata-driven interactivity into entertainment content has its own set of considerations.
How will you ensure the metadata supports the content as well as the target audience and their interests?
How can you make the interactions seamless and meaningful by adding new layers of context and depth?
What will be your approach for ensuring the interactivity aligns with both viewer preference and the overall nature of the content?
While metadata-driven interactions are relatively straightforward on web and mobile platforms, there can be challenges when bringing this experience to living room devices—where long-form content is typically consumed. Current operating systems, app frameworks, SDKs, and ad servers may not fully support this as an optimal user experience. However, as these technologies continue to evolve, we’ll see even more engaging metadata-driven experiences on the web, on mobile, and in the home.
News and Weather
Metadata is already changing the way we consume news- and weather-related content. It provides the opportunity to personalize the viewing experience without disrupting the livestream by adding localized context to live news feeds and weather reports.
When implementing metadata-driven approaches for news and weather, you’ll want to consider the sophisticated workflows you’ll need. They should be capable of serving metadata in real time across different locations in order to deliver a personalized, immediate, and local experience.
However, this isn’t without challenges. Naturally, viewers expect their news and weather to be both accurate and timely. Given their dynamic nature, having metadata that’s rich enough to add real-time context can be particularly difficult.
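As a rough sketch of that workflow, a single national stream can carry region-keyed metadata, with the player picking the payload for the viewer’s location. The regions and values below are invented for illustration:

```python
# Hypothetical sketch: one national live stream, with region-keyed metadata
# so each viewer sees localized context without a separate stream per city.
LOCAL_METADATA = {
    "berlin": {"temp_c": 4, "alert": None},
    "munich": {"temp_c": -1, "alert": "Snow expected after 18:00"},
}

def payload_for(region, fallback="berlin"):
    """Pick the metadata overlay for a viewer's region, falling back to a
    default region when we have no local data."""
    return LOCAL_METADATA.get(region.lower(), LOCAL_METADATA[fallback])

print(payload_for("Munich"))
```

In a real deployment the lookup table would be refreshed continuously from the broadcaster’s data sources, which is where the real-time workflow requirement above comes from.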
Education
In an increasingly remote world, metadata offers the opportunity to transform traditional university lectures into interactive experiences for virtual students. As professors give live lectures, metadata can allow them to incorporate links and overlays into the content. Common examples could include linking to supplemental materials, related web pages for further research, or additional context to aid the comprehension of a particular lecture.
Institutions should take into consideration the detailed level of planning this can entail. Instructors will need to meticulously prepare the links and overlays and preload them into the livestream using timestamps. Otherwise, metadata-driven interactivity can become counterproductive, with unnecessary interruptions distracting students from the lecture itself.
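That preparation can be partly automated. A small sketch, with hypothetical cue times and labels, that sorts preloaded overlay cues and flags any spaced too closely together, which would risk the distracting interruptions mentioned above:

```python
# Illustrative sketch: validate a lecturer's preloaded overlay cues before a
# session - returned sorted by time, with warnings for cues spaced too
# closely together. Cue times and labels are hypothetical.
def check_cues(cues, min_gap_s=30.0):
    """cues: list of (seconds, label). Returns (sorted_cues, warnings)."""
    ordered = sorted(cues, key=lambda c: c[0])
    warnings = []
    for (t1, a), (t2, b) in zip(ordered, ordered[1:]):
        if t2 - t1 < min_gap_s:
            warnings.append(f"'{b}' at {t2:.0f}s is only {t2 - t1:.0f}s after '{a}'")
    return ordered, warnings

cues = [(300, "Reading: chapter 3"), (305, "Slide deck link"), (60, "Syllabus")]
ordered, warnings = check_cues(cues)
print(warnings)  # flags the two cues that are only 5 seconds apart
```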
It’s important to note that incorporating metadata will require a fair amount of technical proficiency from the lecturer or the educational institution since this isn’t a typical workflow. A workaround for this could be utilizing Brightcove’s Simulive Player and pre-recorded lectures to incorporate interactivity. This allows instructors to still benefit from the interactive elements while maintaining the real-time effect of a live session.
Setting Up Metadata-Driven Interactivity
Creating an interactive player using metadata is very straightforward with Brightcove, as long as the data is well structured and complete.
Create a data feed. This can be done using XML or JSON. You can also use WebVTT, which is used for closed captions. You’ll want to create a reusable feed so that it can be leveraged across multiple videos. A sports feed can usually be generated from sports data platforms such as Opta Sports.
Upload the feed. Add the data feed to an internet-accessible server or bucket. Alternatively, you can use Brightcove’s Media module and CMS API to ingest the feed to create cue-points within the video.
Publish your custom player and associated videos. Now you’re ready to embed the player on your webpage.
Synchronize the player plugin and the video. During video playback, the player plugin will control the interactivity UX, metadata, and associated timestamps.
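Since WebVTT is plain text, the feed from the first step can be generated programmatically. Here is a sketch, with invented event data, that turns timestamped events into a WebVTT metadata track carrying JSON cue payloads for a player plugin to parse:

```python
import json

# Sketch: generate a reusable WebVTT metadata track from timestamped events.
# Each cue's payload is a JSON string a player plugin could parse.
def to_timestamp(seconds):
    """Format seconds as a WebVTT HH:MM:SS.mmm timestamp."""
    h, rem = divmod(int(seconds), 3600)
    m, s = divmod(rem, 60)
    ms = int(round((seconds - int(seconds)) * 1000))
    return f"{h:02d}:{m:02d}:{s:02d}.{ms:03d}"

def events_to_webvtt(events, display_s=8.0):
    """events: list of (start_seconds, payload_dict); each cue stays on
    screen for display_s seconds."""
    lines = ["WEBVTT", ""]
    for start, payload in sorted(events, key=lambda e: e[0]):
        lines.append(f"{to_timestamp(start)} --> {to_timestamp(start + display_s)}")
        lines.append(json.dumps(payload))
        lines.append("")
    return "\n".join(lines)

feed = events_to_webvtt([(12.5, {"type": "stat", "player": "#10", "shots": 3})])
print(feed)
```

Because the feed is just a text file keyed to timestamps, the same generator can be reused across videos, which is the reusability point made in the first step.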
As mentioned, the plugin can read the data feed to power the UX, or cue-points can be created in Brightcove. For a more robust and complex interactive experience, it would be best to use the plugin method as you have more control over the complexity of the data sets.
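For the cue-point route, the request body might look like the following. The field names reflect the Brightcove CMS API’s cue_points structure as the author understands it; verify them against the current API reference before use. No request is actually sent in this sketch:

```python
import json

# Sketch: build the JSON body for updating a video's cue points via the
# Brightcove CMS API. Field names are assumptions - check the current CMS
# API reference. The match events are invented for illustration.
def cue_point(name, time_s, metadata):
    return {
        "name": name,
        "type": "CODE",        # code cue points carry arbitrary metadata
        "time": time_s,        # seconds from the start of the video
        "metadata": metadata,  # string payload read by the player plugin
    }

body = json.dumps({"cue_points": [
    cue_point("goal", 132.0, "Goal - Home 1:0"),
    cue_point("card", 1470.5, "Yellow card - Away #4"),
]})
# This body would go in a PATCH to the video's CMS API endpoint
# (cms.api.brightcove.com), authenticated with an OAuth access token.
print(body)
```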
Livestreaming is a little more complex than VOD as you need to ensure that the data is in sync with what is happening in the live broadcast.
Use an encoder with cue-point support. This allows you to insert cue-points and markers into the live stream before it’s ingested. An alternative is adding ID3 tags to the stream using Brightcove’s Live API.
Ingest the live stream. Using the Brightcove Live API, you can ingest the live stream. The same API allows you to add ID3 tags to it via a feed.
Confirm timed descriptions. When adding an ID3 tag, ensure it contains a description of what occurred at a specific time. Once the tag is added, the metadata will display at the time specified in the tag.
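As an illustration, the tag payload could be assembled like this. The JSON shape and timecode format are assumptions, so consult the Brightcove Live API reference for the actual request format:

```python
import json

# Hypothetical sketch: assemble an ID3 timed-metadata payload for a live
# stream. The field names and timecode format are assumptions - verify
# against the Brightcove Live API reference.
def id3_tag(name, value, timecode=None):
    """Build a tag whose value describes what occurred; an optional timecode
    pins it to a moment in the stream instead of inserting immediately."""
    tag = {"id3_tag": {"name": name, "value": value}}
    if timecode is not None:
        tag["id3_tag"]["timecode"] = timecode  # assumed HH:MM:SS:FF format
    return json.dumps(tag)

print(id3_tag("event", "Goal - Home 1:0"))
```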
Again, a plugin is built to control the interactivity UX and read the ID3 tags in the live stream. Based on what is present in the tags, the plugin will display the metadata at the right time and in sync with the content.
Metadata: The Essential Tool for Interactivity
The power of metadata can deliver new levels of interactivity and engagement that are much needed in today’s ever-competitive content landscape. It offers the perfect marriage of engagement and personalization that’s required to build and maintain a loyal user base.
Beyond the interactivity itself, metadata can increase existing revenue by optimizing discoverability both in search engines and within platforms. It can also open new revenue opportunities through avenues like product placement and contextual ads.
From delivering rich interactive VOD experiences to enhancing live broadcasts, we’ll continue to see metadata play an increasing role in many different types of content. There’s no better time than now to start exploring the benefits it can bring to your content and your bottom line.
While the setup, particularly for live metadata-driven experiences, can be complex, the experts at Brightcove Global Services have the knowledge and experience to help you achieve your content and revenue goals.