
WebCodecs Decoder

Decode and render H.264, H.265 and AV1 streams in web browsers using the WebCodecs API, the new Web standard for hardware-accelerated video decoding.

Compared to the TinyH264 decoder, it's faster, uses fewer hardware resources, and supports more profiles and levels.

npm install @yume-chan/scrcpy-decoder-webcodecs

Browser support

[caniuse.com: data on support for the WebCodecs API across the major browsers]

Because the WebCodecs API is still relatively new, you should check whether it's supported before using it.

import { WebCodecsVideoDecoder } from "@yume-chan/scrcpy-decoder-webcodecs";

console.log(WebCodecsVideoDecoder.isSupported);

Internally, it checks whether VideoDecoder is defined on the global object.

class WebCodecsVideoDecoder {
  static get isSupported() {
    return typeof globalThis.VideoDecoder !== "undefined";
  }
}

If the WebCodecs API is not supported, you can fall back to the TinyH264 decoder, but be aware that it only supports H.264 main profile level 4, and its performance is much worse.
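For example, a minimal fallback might look like this (a sketch: codec and renderer are created in the sections below, and the TinyH264 decoder's exact constructor options are documented on its own page):

import type { ScrcpyVideoCodecId } from "@yume-chan/scrcpy";
import { TinyH264Decoder } from "@yume-chan/scrcpy-decoder-tinyh264";
import type { VideoFrameRenderer } from "@yume-chan/scrcpy-decoder-webcodecs";
import { WebCodecsVideoDecoder } from "@yume-chan/scrcpy-decoder-webcodecs";

declare const codec: ScrcpyVideoCodecId; // from the video stream metadata
declare const renderer: VideoFrameRenderer; // see "Create a renderer" below

// Prefer the WebCodecs decoder when available; otherwise fall back to TinyH264.
const decoder = WebCodecsVideoDecoder.isSupported
  ? new WebCodecsVideoDecoder({ codec, renderer })
  : new TinyH264Decoder();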

Codec support

Scrcpy v2.0 added the videoCodec option to specify which video codec the server uses. H.265 and AV1 can provide better video quality than H.264 at the same bitrate.

The WebCodecs specification supports H.264, H.265 and AV1, but runtime support requires a joint effort from the browser, operating system, graphics card and driver. H.264 should be supported by every browser that supports the WebCodecs API, but support for the newer codecs may vary.

The VideoDecoder.isConfigSupported static method can be used to check whether a given codec is supported. It takes a codec parameter string, for example "hev1.1.60.L153.B0.0.0.0.0.0" for H.265 or "av01.0.05M.08" for AV1.

const result = await VideoDecoder.isConfigSupported({
  codec: "hev1.1.60.L153.B0.0.0.0.0.0",
});
const isHevcSupported = result.supported === true;
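For example, a sketch that probes decoder support to choose a value for Scrcpy's videoCodec option (the function name is hypothetical, and the parameter strings should match the profile and level the device encoder will actually produce):

async function pickVideoCodec(): Promise<"av1" | "h265" | "h264"> {
  // Prefer newer codecs when the browser can decode them.
  const av1 = await VideoDecoder.isConfigSupported({
    codec: "av01.0.05M.08",
  });
  if (av1.supported) {
    return "av1";
  }

  const hevc = await VideoDecoder.isConfigSupported({
    codec: "hev1.1.60.L153.B0.0.0.0.0.0",
  });
  if (hevc.supported) {
    return "h265";
  }

  // H.264 should be supported by every browser with the WebCodecs API.
  return "h264";
}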

Microsoft Edge on Windows

By default, Chromium-based browsers use FFmpeg internally for the WebCodecs API. However, Microsoft Edge on Windows uses Media Foundation decoders instead.

Decoding H.265 requires the HEVC Video Extensions app ($0.99) or the HEVC Video Extensions from Device Manufacturer app (free, but no longer available) from the Microsoft Store.

Decoding AV1 requires the AV1 Video Extension app (free) from the Microsoft Store.

Firefox

Firefox 133 supports playing H.265 videos, but does not support decoding H.265 streams through the WebCodecs API.

Renderer

The WebCodecs API decodes video frames into VideoFrame objects. There are multiple ways to render those VideoFrame objects onto the page. This package provides three renderers:

info

These renderers are not tied to our WebCodecsVideoDecoder; they can also be used separately to render any VideoFrame objects from the WebCodecs API.

  • InsertableStreamVideoFrameRenderer: Renders to a <video> element using the Insertable Streams API. See the quirks below.
  • WebGLVideoFrameRenderer: Renders to a <canvas> or OffscreenCanvas using WebGL. It only works with hardware-accelerated WebGL, because without hardware acceleration, its performance is even worse than the bitmap renderer below.
  • BitmapVideoFrameRenderer: Renders to a <canvas> or OffscreenCanvas using the bitmap renderer.
info

VideoFrame objects can also be rendered using a 2D canvas context. However, because it's slower than the bitmap renderer, which is already available on all devices, we didn't think it was necessary to implement one.
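For reference, a 2D context can draw a VideoFrame directly (a sketch; canvasElement and videoFrame are assumed to exist):

declare const canvasElement: HTMLCanvasElement;
declare const videoFrame: VideoFrame;

// CanvasRenderingContext2D accepts a VideoFrame as an image source.
const context = canvasElement.getContext("2d")!;
context.drawImage(videoFrame, 0, 0);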

Quirks of Insertable Stream renderer

The Insertable Streams renderer should be considered experimental, because there are several issues with it:

Performance

The Insertable Streams API is specifically designed to render video frames from the WebCodecs API, but in reality it's only easier to integrate, not faster, so it has no performance advantage over the other renderers.

Compatibility

Its specification has two versions: the old MediaStreamTrackGenerator API and the new VideoTrackGenerator API. Only Chrome implemented the old API. The new API was added to the specification in mid-2023, but as of the end of 2024, nobody (including Chrome, which authored the specification) has implemented it (Chrome issue, Firefox issue).

As a result, we implemented the Insertable Stream renderer using the old MediaStreamTrackGenerator API. We will monitor the situation and update the renderer if necessary.

Lifecycle

Because it renders to a <video> element, the video is automatically paused when the element is removed from the DOM tree (e.g. to move it into another element or another page). You need to call renderer.element.play() to resume playback after adding it back to the DOM tree.

It sets the autoplay attribute on the <video> element, so playback starts automatically the first time.
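For example, when re-parenting the element (newContainer here is any hypothetical target element):

import type { InsertableStreamVideoFrameRenderer } from "@yume-chan/scrcpy-decoder-webcodecs";

declare const renderer: InsertableStreamVideoFrameRenderer;
declare const newContainer: HTMLElement;

// Moving the <video> element pauses it; resume after re-inserting.
newContainer.appendChild(renderer.element);
void renderer.element.play();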

Create a renderer

Generally, the performance ranking is InsertableStream ≈ WebGL >> Bitmap. However, because the Insertable Stream and WebGL renderers are not available on all devices, we recommend the following method to choose the best renderer available on the current device:

InsertableStreamVideoFrameRenderer and WebGLVideoFrameRenderer both have an isSupported static property to check whether they are supported by the current browser and hardware:

import type { VideoFrameRenderer } from "@yume-chan/scrcpy-decoder-webcodecs";
import {
  InsertableStreamVideoFrameRenderer,
  WebGLVideoFrameRenderer,
  BitmapVideoFrameRenderer,
} from "@yume-chan/scrcpy-decoder-webcodecs";

function createVideoFrameRenderer(): {
  renderer: VideoFrameRenderer;
  element: HTMLVideoElement | HTMLCanvasElement;
} {
  if (InsertableStreamVideoFrameRenderer.isSupported) {
    const renderer = new InsertableStreamVideoFrameRenderer();
    return { renderer, element: renderer.element };
  }

  if (WebGLVideoFrameRenderer.isSupported) {
    const renderer = new WebGLVideoFrameRenderer();
    return { renderer, element: renderer.canvas as HTMLCanvasElement };
  }

  const renderer = new BitmapVideoFrameRenderer();
  return { renderer, element: renderer.canvas as HTMLCanvasElement };
}

When the constructors are called without arguments, they create a rendering target automatically (a <video> element for InsertableStreamVideoFrameRenderer, a <canvas> for WebGLVideoFrameRenderer and BitmapVideoFrameRenderer). You will need to insert the created element into the page to display the video:

const { renderer, element } = createVideoFrameRenderer();
document.body.appendChild(element);

Or, all renderers accept existing rendering targets:

new InsertableStreamVideoFrameRenderer(videoElement);
new WebGLVideoFrameRenderer(canvasElementOrOffscreenCanvas);
new BitmapVideoFrameRenderer(canvasElementOrOffscreenCanvas);

Create a decoder

import type { ScrcpyMediaStreamPacket } from "@yume-chan/scrcpy";
import { ScrcpyVideoCodecId } from "@yume-chan/scrcpy";
import type {
  ScrcpyVideoDecoder,
  ScrcpyVideoDecoderCapability,
} from "@yume-chan/scrcpy-decoder-tinyh264";
import { WritableStream } from "@yume-chan/stream-extra";
import type { VideoFrameRenderer } from "./render/index.js";

export declare class WebCodecsVideoDecoder implements ScrcpyVideoDecoder {
  static get isSupported(): boolean;
  static readonly capabilities: Record<string, ScrcpyVideoDecoderCapability>;

  get codec(): ScrcpyVideoCodecId;
  get writable(): WritableStream<ScrcpyMediaStreamPacket>;
  get renderer(): VideoFrameRenderer;

  get framesRendered(): number;
  get framesSkipped(): number;

  get sizeChanged(): import("@yume-chan/event").AddEventListener<
    {
      width: number;
      height: number;
    },
    unknown
  >;

  /**
   * Create a new WebCodecs video decoder.
   */
  constructor(options: WebCodecsVideoDecoder.Options);

  snapshot(): Promise<Blob | undefined>;
  dispose(): void;
}

export declare namespace WebCodecsVideoDecoder {
  interface Options {
    /**
     * The video codec to decode
     */
    codec: ScrcpyVideoCodecId;

    renderer: VideoFrameRenderer;
  }
}

The constructor requires two options:

  • codec: the video codec to be decoded. It can be retrieved from the video stream metadata, or hard-coded if you only use a specific video codec.
  • renderer: a renderer created in the previous section.

Similar to the TinyH264 decoder, after creating a decoder instance, you need to pipe the video stream into the writable stream:

import type {
  ScrcpyMediaStreamPacket,
  ScrcpyVideoCodecId,
} from "@yume-chan/scrcpy";
import type { VideoFrameRenderer } from "@yume-chan/scrcpy-decoder-webcodecs";
import { WebCodecsVideoDecoder } from "@yume-chan/scrcpy-decoder-webcodecs";

declare const videoPacketStream: ReadableStream<ScrcpyMediaStreamPacket>;
declare const videoMetadata: { codec: ScrcpyVideoCodecId };
declare const renderer: VideoFrameRenderer; // created in the previous section

const decoder = new WebCodecsVideoDecoder({
  codec: videoMetadata.codec,
  renderer: renderer,
});

void videoPacketStream.pipeTo(decoder.writable).catch((e) => {
  console.error(e);
});

Handle size changes

When the device orientation changes, the Scrcpy server creates a new video encoder with the new size. The decoder parses the new video configuration and updates the canvas size automatically.

However, the video size is also useful for other purposes, such as injecting touch events. The sizeChanged event is emitted when the video size changes:

decoder.sizeChanged(({ width, height }) => {
  console.log(width, height);
});

Use in Web Worker

Although the WebCodecs API already runs in its own thread, and the renderers are very fast, you might still want to run them in a dedicated Web Worker, so that other tasks on the main thread won't affect their responsiveness.

Only WebGLVideoFrameRenderer and BitmapVideoFrameRenderer are supported in Web Workers. There are two ways to render the frames:

Create OffscreenCanvas directly

When their constructors are called without arguments, they will create an OffscreenCanvas object internally.

This OffscreenCanvas is not bound to a <canvas> element. See Synchronous display of frames produced by an OffscreenCanvas on MDN for how to send the image data to a normal <canvas> element.
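A minimal sketch of that technique, assuming the worker posts each rendered frame as an ImageBitmap (the offscreenCanvas, worker and canvas variables are assumed to exist):

// In the worker: snapshot the OffscreenCanvas after rendering a frame,
// then transfer the resulting ImageBitmap to the main thread.
declare const offscreenCanvas: OffscreenCanvas;
const bitmap = offscreenCanvas.transferToImageBitmap();
postMessage({ bitmap }, [bitmap]);

// On the main thread: paint each received ImageBitmap onto a visible <canvas>.
declare const worker: Worker;
declare const canvas: HTMLCanvasElement;
const context = canvas.getContext("bitmaprenderer")!;
worker.addEventListener("message", (e) => {
  context.transferFromImageBitmap(e.data.bitmap);
});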

Create OffscreenCanvas from a <canvas> element

An OffscreenCanvas object can also be created by calling the transferControlToOffscreen() method on an existing <canvas> element.

It can then be transferred to the worker via postMessage and passed into a renderer constructor. When the renderer draws on the OffscreenCanvas, the content is displayed on the source <canvas> element automatically.

<!-- index.html -->
<canvas id="canvas"></canvas>
// index.js
import type { ScrcpyMediaStreamPacket } from "@yume-chan/scrcpy";
import { ScrcpyVideoCodecId } from "@yume-chan/scrcpy";

declare const videoPacketStream: ReadableStream<ScrcpyMediaStreamPacket>;

const canvas = document.getElementById("canvas") as HTMLCanvasElement;
const offscreenCanvas = canvas.transferControlToOffscreen();

const worker = new Worker("worker.js");
worker.postMessage(
  {
    codec: ScrcpyVideoCodecId.H264,
    canvas: offscreenCanvas,
    stream: videoPacketStream,
  },
  [offscreenCanvas, videoPacketStream],
);
worker.addEventListener("message", (e) => {
  // The decoder resizes the transferred OffscreenCanvas automatically, which
  // also resizes the placeholder <canvas>. Setting `width` or `height` on a
  // placeholder canvas throws, so only update its CSS size here.
  const { width, height } = e.data;
  canvas.style.width = `${width}px`;
  canvas.style.height = `${height}px`;
});
// worker.js
import type {
  ScrcpyVideoCodecId,
  ScrcpyMediaStreamPacket,
} from "@yume-chan/scrcpy";
import {
  WebGLVideoFrameRenderer,
  WebCodecsVideoDecoder,
} from "@yume-chan/scrcpy-decoder-webcodecs";

self.addEventListener("message", (e) => {
  const { codec, canvas, stream } = e.data as {
    codec: ScrcpyVideoCodecId;
    canvas: OffscreenCanvas;
    stream: ReadableStream<ScrcpyMediaStreamPacket>;
  };

  const renderer = new WebGLVideoFrameRenderer(canvas);
  const decoder = new WebCodecsVideoDecoder({ codec, renderer });

  decoder.sizeChanged(({ width, height }) => {
    postMessage({ width, height });
  });

  void stream.pipeTo(decoder.writable).catch((error) => {
    console.error(error);
  });
});

Take a screenshot

Because the WebCodecs decoder can render to different types of targets, manually capturing the latest frame is more complicated.

To help with that, the decoder provides the snapshot method to easily capture the last rendered frame as a PNG image.

const blob: Blob | undefined = await decoder.snapshot();

If no frame has been rendered yet, the return value will be undefined. Otherwise, it will be a Blob object containing the PNG image data.
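For example, to offer the screenshot as a file download (a sketch using the common temporary-anchor pattern; the file name is arbitrary):

const blob = await decoder.snapshot();
if (blob) {
  // Wrap the PNG data in an object URL and trigger a download.
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = "screenshot.png";
  a.click();
  URL.revokeObjectURL(url);
}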