How LinkedIn Uses Data to Improve Video Performance


At LinkedIn, we use data to improve our members’ experiences while using our site. On the video team, we value metrics that yield insight on how long our videos take to load, why certain videos draw more attention than others, and how our members tend to interact with videos across web, iOS, and Android. In short, the various data points collected during video playback on LinkedIn are leveraged to drive powerful improvements in video performance.

Technical terms

This post will make mention of the following terms, defined below for your convenience:

  • Iframe: An element that can render the content of an external web page inside it. This is very useful in the case of video, as it allows us to render videos from third-party domains (e.g., YouTube, Vimeo) directly within our site.

  • Viewport: The portion of a website that is visible on the screen.

  • DOM (Document Object Model): The browser's representation of the web page as a tree made up of many content nodes.

Capturing data during playback

Our systems capture troves of data on how a video performs during playback. We have found that by focusing on the following data points, we have been able to dramatically improve video performance on LinkedIn.com:

  • Media Initialization Start: When the player starts to initialize.

    • For videos played through an iframe, such as third-party videos, this metric marks when the iframe is first rendered on the page.

    • For HTML5, or native, videos that are rendered directly on the page, this metric marks when the loadstart event is emitted by the video player.

  • Media Initialization End: When the player initialization has completed.

  • Media Buffering Start: When the media first begins to buffer, prior to the video playing.

  • Media Buffering End: When the media stops buffering, just before the video begins to play.

  • Time to Start (TTS): The time between when the player is initialized and when the player is ready to play the video.

  • Perceived Time to Start (PTTS): The time between when a member requests a video to play and when the video actually starts playing.

  • Media Initialization Time: The amount of time between the Media Initialization Start and the Media Initialization End events.

  • Media Initialization Rate: The percentage of videos that enter the viewport and successfully load before they exit the viewport.
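
To make these definitions concrete, here is a minimal sketch of how a few of these timestamps could be captured for a native HTML5 video in the browser. The event names (loadstart, canplay, waiting, playing) come from the standard HTMLMediaElement API, but the mapping of canplay to Media Initialization End and the surrounding structure are our own illustrative assumptions, not LinkedIn's actual instrumentation:

```typescript
// Illustrative sketch only — not LinkedIn's production instrumentation.
interface PlaybackTimings {
  mediaInitializationStart?: number;
  mediaInitializationEnd?: number;
  mediaBufferingStart?: number;
  mediaBufferingEnd?: number;
}

function instrumentPlayback(video: HTMLVideoElement): PlaybackTimings {
  const timings: PlaybackTimings = {};

  // Media Initialization Start: the native player begins loading ('loadstart').
  video.addEventListener('loadstart', () => {
    timings.mediaInitializationStart = performance.now();
  }, { once: true });

  // Media Initialization End: assumed here to be when the player has enough
  // data to begin playback ('canplay').
  video.addEventListener('canplay', () => {
    timings.mediaInitializationEnd = performance.now();
  }, { once: true });

  // Media Buffering Start: the media begins buffering before playback.
  video.addEventListener('waiting', () => {
    timings.mediaBufferingStart ??= performance.now();
  });

  // Media Buffering End: playback begins after the initial buffering.
  video.addEventListener('playing', () => {
    if (timings.mediaBufferingStart !== undefined && timings.mediaBufferingEnd === undefined) {
      timings.mediaBufferingEnd = performance.now();
    }
  });

  return timings;
}
```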

Later in this post, we'll zoom in on a couple of experiments that leveraged many of the above data points to improve one of our most important metrics, PTTS.

Using data to benefit our members

Now that we have amassed a wealth of insightful video playback data, how can we use it to improve our members' experiences? We tackle this problem with two approaches.

Detailed, real-time metrics reporting
At LinkedIn, we leverage multiple internal tools and services that allow us to store data and visualize changes in that data in real time. One tool that is particularly helpful is InGraphs, which lets us visualize the many data points collected across our products.

In addition to the charts that InGraphs provides, we have services that notify relevant teams if any core metrics fall below a pre-set threshold. These tools allow us to take immediate action should we detect a degradation in member experience within one of our products.
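
As an illustration only (the notification services described above are internal, and their APIs are not public), the essence of such a check is comparing a core metric's latest value against a pre-set floor:

```typescript
// Hypothetical names throughout — a sketch of threshold-based alerting.
interface AlertRule {
  metric: string;                    // e.g., a media initialization rate
  minAcceptable: number;             // the pre-set threshold
  notify: (message: string) => void; // hypothetical paging/notification hook
}

function evaluate(latest: number, rule: AlertRule): void {
  if (latest < rule.minAcceptable) {
    rule.notify(`${rule.metric} fell to ${latest} (threshold ${rule.minAcceptable})`);
  }
}
```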

Continuous A/B testing of features
We are constantly experimenting with new features, as well as tweaks to existing ones, with the overarching goal of producing the best experience for our members. We use metrics—in conjunction with reporting tools like InGraphs—to paint a clear picture of how a given experiment affects user interaction across our site.
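
LinkedIn's experimentation platform is internal, but one common way such splits are implemented, sketched below purely for illustration, is a deterministic hash of the member ID and experiment name, so a given member always lands in the same bucket:

```typescript
// Illustrative sketch of stable A/B bucketing; not LinkedIn's platform.
function bucketFor(
  memberId: string,
  experiment: string,
  treatmentPct: number
): 'treatment' | 'control' {
  // Simple 32-bit FNV-1a hash; any stable hash works for illustration.
  let hash = 0x811c9dc5;
  for (const ch of `${experiment}:${memberId}`) {
    hash ^= ch.charCodeAt(0);
    hash = Math.imul(hash, 0x01000193);
  }
  return (hash >>> 0) % 100 < treatmentPct ? 'treatment' : 'control';
}

// Example: place 10% of members into a hypothetical video-only-feed treatment.
const bucket = bucketFor('member-12345', 'video_only_feed', 10);
```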

For example, imagine a fictitious experiment in which we tested the effect of showing only video content for the first thirty posts in each member's feed. The experiment may initially seem like a success, as the amount of video watched by our members goes up. However, after a closer look at InGraphs, we notice that the number of posts shared by our members has dropped. Weighing both impacts together, we would terminate the experiment for having a net negative effect on member experience.

Ensuring that our data is accurate

Data is only as useful as it is accurate. If we cannot trust the data we store, there is no basis for evaluating the experiments we run. In addition to the data monitoring services mentioned above, we make heavy use of automated (unit, integration, and acceptance) tests to ensure that a given feature works correctly. As you might imagine, it is not scalable at LinkedIn's size to manually test all existing features during the development of new ones. Instead, automated tests exercise each feature in isolation and verify that, through various interactions, the feature performs as expected.

For example, we might write a test that asserts that clicking a video's play button causes the video to start playing and captures data about the video's loading performance. Automated tests therefore allow our engineers to guarantee that the metrics emitted by a feature remain accurate long after the feature was created. In addition to automated tests, LinkedIn engineers have handy tools at their disposal (see the Tracking Overlay mentioned in a prior blog post on engineering infrastructure at scale) to facilitate validating the metrics emitted by a given feature.
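
As a hypothetical sketch of such a test (the Jest + jsdom setup and all names below are our assumptions, not LinkedIn's actual test suite):

```typescript
// Illustrative test: clicking the play control should request playback.
test('clicking play requests video playback', () => {
  const video = document.createElement('video');
  const playButton = document.createElement('button');

  // jsdom does not implement real media playback, so stub play().
  const play = jest.fn().mockResolvedValue(undefined);
  video.play = play;

  // Wire the control the way a player component might.
  playButton.addEventListener('click', () => {
    void video.play();
  });

  playButton.click();
  expect(play).toHaveBeenCalledTimes(1);
});
```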

Using data for video performance

Due to the naturally large size of a video asset, video performance requires a unique approach: we need a way to download enough of the video so that it starts to play right away, while also ensuring that we do not slow down the rest of the elements rendering on the page.
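
One common browser-side technique for balancing these goals, sketched here under our own assumptions rather than as LinkedIn's implementation, is to defer fetching video bytes until the element approaches the viewport:

```typescript
// Illustrative sketch: lazy-load a video as it nears the viewport so video
// bytes do not compete with the rest of the page render.
function lazyLoadVideo(video: HTMLVideoElement, src: string): void {
  video.preload = 'none'; // do not fetch any media data up front

  const observer = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        video.src = src;        // attach the source only near the viewport
        video.preload = 'auto'; // let the browser buffer enough to start quickly
        observer.disconnect();
      }
    }
  }, { rootMargin: '200px' }); // begin loading slightly before it scrolls in

  observer.observe(video);
}
```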

Case study: Perceived time to start (PTTS)
At LinkedIn, we measure perceived loading times to get an understanding of how long our members are waiting for videos to play. The principal metric that we use to gain insight into how long a video takes to load is perceived time to start (PTTS). PTTS measures how long the browser takes to download a video, in addition to the time the video spends buffering prior to playback.
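
As a rough sketch of the idea (assumed names; not LinkedIn's production code), PTTS can be approximated in the browser by starting a clock when playback is requested and stopping it at the first 'playing' event, capturing both download and initial buffering time:

```typescript
// Illustrative PTTS measurement: member request ('play') to actual playback
// ('playing'). Both events come from the standard HTMLMediaElement API.
function measurePTTS(video: HTMLVideoElement, report: (ms: number) => void): void {
  let requestedAt: number | null = null;

  // The member's request to play, e.g., via the play control.
  video.addEventListener('play', () => {
    requestedAt = performance.now();
  }, { once: true });

  // Playback actually begins once enough data has buffered.
  video.addEventListener('playing', () => {
    if (requestedAt !== null) {
      report(performance.now() - requestedAt);
    }
  }, { once: true });
}
```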


