Aug 8, 2016 - By Ben Swartz

Improving Chat Rendering Performance

Chat is a core part of the Twitch experience. Rendering new chat lines dozens of times a second on the client turns out to be a very difficult task.

As Twitch has grown, the number of chat lines we render has increased. Combined with higher-quality video and a more complex chat feature set, this has resulted in occasional frame drops while interacting with the core Twitch experience. Over the past couple of months, I could constantly be seen around the office making the NotLikeThis face as I attempted to watch streams. I knew something had to be done.

For the client side of our website, we use Ember. Our chat backend, a service we call “TMI”, sends each chat message in a given room to subscribed clients. From there, we generate a list of messages and render a message-line component for each message in that list.

We also support 60fps video on our site. This means that if we want to prevent video streams from skipping frames, rendering and painting of both video and chat messages need to happen within 16.667ms — a tall order.

When I started this investigation, additional messages were taking over 200ms of render time. TMI allows for tens of messages per second to go through the system, meaning that in the worst case we were skipping a ton of frames. The end result was choppy video for many streams with an active chat.

Methodology

When we started investigating these issues, we wanted to see how long it took for a given message to render and how long it took for the first 100 lines to appear on the page. We used a test channel that has both a high-quality 60fps stream and the maximum number of chat messages per second.

We used the Chrome Developer Tools to look into how long render events were taking, and we used ef4’s initial-render-perf along with some R to gather render performance data programmatically.
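
For a rough sense of what that kind of programmatic measurement can look like in the browser, here is a small sketch using the standard User Timing API; the mark names and the idea of timing a single batch are ours, not part of initial-render-perf.

{% raw %}
// Illustrative only: timing how long a batch of chat lines takes to render,
// using the standard User Timing API. Mark and measure names are made up.
import Ember from 'ember';

function reportChatBatchTiming() {
  performance.mark('chat-batch-end');
  performance.measure('chat-batch-render', 'chat-batch-start', 'chat-batch-end');

  const [measure] = performance.getEntriesByName('chat-batch-render');
  console.log(`chat batch rendered in ${measure.duration.toFixed(1)}ms`);

  performance.clearMarks();
  performance.clearMeasures();
}

// Wrap the code that pushes a batch of messages to the template.
export function timeChatBatch(pushBatch) {
  performance.mark('chat-batch-start');
  pushBatch();
  // Measure once Ember has flushed its render queue for this run loop.
  Ember.run.scheduleOnce('afterRender', null, reportChatBatchTiming);
}
{% endraw %}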

Solutions

Batching message updates

The first thing we noticed was that there was no need to render message lines the moment we received them from the chat backend. If we buffered and batched the message updates over a given time frame — say every 100ms — we would significantly reduce the number of render events.

This was the first performance change we made. Whenever we received a message from TMI on the client side, we buffered that message into a queue. Every 100ms, we pushed all messages from that queue into the list of messages that the message-line template was bound to.
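
A minimal sketch of that batching, assuming the buffering lives in an Ember object that receives messages from TMI (the names and exact shape here are illustrative, not the actual Twitch source):

{% raw %}
// Hypothetical sketch of the 100ms message batching; names are illustrative.
import Ember from 'ember';

const FLUSH_INTERVAL_MS = 100;

export default Ember.Object.extend({
  messages: null,         // the array the chat template's {{#each}} is bound to
  pendingMessages: null,  // buffer of messages received since the last flush

  init() {
    this._super(...arguments);
    this.set('messages', Ember.A());
    this.set('pendingMessages', []);
    this.scheduleFlush();
  },

  // Called for every message TMI pushes to the client.
  receiveMessage(msgObject) {
    this.get('pendingMessages').push(msgObject);
  },

  scheduleFlush() {
    this._flushTimer = Ember.run.later(this, this.flushPendingMessages, FLUSH_INTERVAL_MS);
  },

  flushPendingMessages() {
    const pending = this.get('pendingMessages');
    if (pending.length > 0) {
      // One pushObjects call means one render pass for the whole batch,
      // instead of one render per message.
      this.get('messages').pushObjects(pending);
      this.set('pendingMessages', []);
    }
    if (!this.get('isDestroying')) {
      this.scheduleFlush();
    }
  },

  willDestroy() {
    this._super(...arguments);
    Ember.run.cancel(this._flushTimer);
  }
});
{% endraw %}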

In busy chat rooms this halved the amount of time it took to render lines. Instead of multiple renders causing 100–200ms render events, we were down to a single render event every 100ms. This meant that even in really fast chats, we reduced the number of frame skips.

Removing extraneous component calls

In Ember, each additional component you render increases the time to render. Our initial template for chat looked like the following:

{% raw %}
// chat/chat-display.hbs
{% endraw %}
{% raw %}
{{#each messages as |message|}}
  {{chat/chat-line
    msgObject=message
    ...
  }}
{{/each}}
{% endraw %}

Where chat/chat-line.hbs looked like:

{% raw %}
// chat/chat-line.hbs
{% endraw %}
{% raw %}
{{#if isWhisper}}
  {{chat/whisper-line
    msgObject=msgObject
    ...
  }}
{{else}}
  {{chat/message-line
    msgObject=msgObject
    ...
  }}
{{/if}}
{% endraw %}

Generating that intermediate chat/chat-line component for every message added to the render time. One quick fix that Robert Jackson, an Ember Core Team member and Twitch engineer, took advantage of was pulling that conditional up into chat/chat-display.hbs:

{% raw %}
// chat/chat-display.hbs
{% endraw %}
{% raw %}
{{#each messages as |message|}}
  {{#if (eq message.style 'whisper')}}
    {{chat/whisper-line
      msgObject=message
      ...
    }}
  {{else}}
    {{chat/message-line
      msgObject=message
      ...
    }}
  {{/if}}
{{/each}}
{% endraw %}

With this realization in mind, I went on a mission to remove extraneous component calls. Another common component use case was tooltips. For a very long time at Twitch, we have been using a jQuery plugin called Tipsy to make nice-looking tooltips. Tipsy is a pretty old piece of software that relies heavily on jQuery. To keep Tipsy maintainable in our Ember code base, we created a tipsy-wrapper component around certain tags that would correctly create and destroy Tipsy tooltips as those elements were created and destroyed.
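
The wrapper pattern looks roughly like the following; this is a hedged sketch of the shape of such a component, not the actual tipsy-wrapper source.

{% raw %}
// Hypothetical sketch of a tipsy-wrapper-style component, not the actual one:
// attach a Tipsy tooltip when the element enters the DOM and tear it down
// when the element is removed.
import Ember from 'ember';

export default Ember.Component.extend({
  tagName: 'span',
  attributeBindings: ['title'],

  didInsertElement() {
    this._super(...arguments);
    // Tipsy reads the title attribute and builds its tooltip markup via jQuery.
    this.$().tipsy({ gravity: 's' });
  },

  willDestroyElement() {
    this._super(...arguments);
    // Hide the tooltip before the element goes away so nothing dangles.
    this.$().tipsy('hide');
  }
});
{% endraw %}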

Since we used Tipsy for both emoticon and badge tooltips, a single message could invoke more than 10 tipsy-wrapper components. Each of these new components was adding 0.25–0.5ms to the total render time in Chrome on my i7 MacBook Pro.

Our solution was one that our design team had wanted to use for a while: pure CSS tooltips. Pure CSS tooltips have a bunch of advantages: there’s no JavaScript to deal with, all of the styling lives in CSS, and there’s nothing to clean up — there is no risk of dangling tooltips.

When we made this change, render times improved dramatically. As a bonus, our tooltips look much cleaner:

[Screenshots: old Tipsy tooltip vs. new pure CSS tooltip]

Removing Computed Properties

When I started looking deep into the stack trace of message-line renders, I realized that a lot of time was spent managing the dependencies of computed properties. Computed properties are instrumental to Ember for making sure that your templates are bound to model changes. In chat’s case, it’s pretty unlikely that the message’s underlying information is going to change while it’s still on the page. This means that message-line can effectively be a function from message object to HTML.

Knowing that, we looked into removing computed properties from our message-line component. In order to keep our code looking familiar to developers already comfortable with computed properties, Robert Jackson made a Getter class that lazily calculates a value once, avoiding dependency bookkeeping and recalculation. This Getter class works nearly identically to ES6 getters but makes sure to include the constructor/teardown functions that Ember expects.

This turned convenience computed properties like this one:

{% raw %}
// chat/message-line.js
{% endraw %}
{% raw %}
systemMsg: computed('msgObject.tags.system-msg', function () {
  return this.get('msgObject.tags.system-msg');
}),
{% endraw %}

Into this:

{% raw %}
// chat/message-line.js
{% endraw %}
{% raw %}
systemMsg: getter(function () {
  return this.get('msgObject.tags.system-msg');
}),
{% endraw %}
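
For illustration, here is a rough approximation of the idea behind such a getter helper (compute the value once on first access, cache it, and skip dependency tracking entirely); the getter function and GetterSupport mixin below are our own sketch, not the actual implementation:

{% raw %}
// Rough approximation of the idea behind the Getter helper: compute the value
// once on first access, cache it, and do no dependency bookkeeping at all.
// The names and wiring below are a sketch, not the actual implementation.
import Ember from 'ember';

const UNSET = Symbol('unset');

class Getter {
  constructor(fn) {
    this.fn = fn;
  }

  // Define a lazily-caching native getter for `keyName` on this instance.
  install(obj, keyName) {
    const fn = this.fn;
    let cached = UNSET;
    Object.defineProperty(obj, keyName, {
      configurable: true,
      get() {
        if (cached === UNSET) {
          cached = fn.call(this);
        }
        return cached;
      }
    });
  }
}

export function getter(fn) {
  return new Getter(fn);
}

// Mixin that installs any getter() declarations when the object is created.
export const GetterSupport = Ember.Mixin.create({
  init() {
    this._super(...arguments);
    for (const key in this) {
      if (this[key] instanceof Getter) {
        this[key].install(this, key);
      }
    }
  }
});
{% endraw %}

In this sketch, a component declaring properties with getter() would also mix in GetterSupport. The trade-off is deliberate: the value never invalidates, which is fine for a chat message that won't change once it's on the page.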

Removing computed properties was a significant change: it reduced the time to render 100 messages on our test channel by over 10%. Going forward, it seems clear that we should be careful about using computed properties in code that needs to be very performant.

Next Steps

While I was able to accomplish a lot and increase performance by quite a bit, there’s still plenty more we can do. One of the things next up for us is upgrading to the newest version of Ember. During the course of these performance improvements, Robert Jackson found some V8 deoptimizations in Ember itself. He has since pushed fixes upstream, but we are waiting to clean up a few deprecations before we can upgrade (Ember Views — BibleThump).

Matthew Beale, another Ember Core Team member, also noticed while we were looking into things that we were running code unimportant to chat rendering (user activity tracking) before allowing the browser to completely lay out and paint chat lines.

Moving these functions to a shared runloop if needed, or deferring them until after chat renders if possible, will minimize their competition with chat rendering and reduce the number of times that we skip frames.
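
As a sketch of what that deferral could look like (trackUserActivity below is a stand-in name for that tracking code, not the real function):

{% raw %}
// Illustrative sketch: keep non-rendering work out of the chat render path.
import Ember from 'ember';

// Stand-in for the user activity tracking work mentioned above.
function trackUserActivity() {
  // ...expensive, non-rendering work...
}

// Called right after a batch of chat messages is pushed to the template.
export function afterChatBatchPushed() {
  // Share the run loop, but wait until the render queue has been flushed:
  Ember.run.scheduleOnce('afterRender', null, trackUserActivity);

  // Or defer past the current frame entirely, giving the browser a chance
  // to lay out and paint the new chat lines first:
  // Ember.run.later(null, trackUserActivity, 0);
}
{% endraw %}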

Finally, one huge takeaway from all of this is to reduce the amount of logic we put into rendering the chat. In an ideal world, the message-line component would be a simple function from a message object to HTML. We need to continue to be vigilant about keeping our message-line renderer side-effect free and only calculating things when we need to.

Things are looking better now, but there’s still plenty of work to do. We’re hiring engineers, so come join us if this sounds interesting!

Special thanks to Robert Jackson, Matthew Beale, Tiffany Huang, Chris Carroll, and Timothy Yen for their help with performance improvements and reading drafts of this blog post.
