
Method to directly render the VideoDecoder to a Canvas element #88

@koush

Description

Currently, sending data from the VideoDecoder to a Canvas involves the following steps (sketched in code after this list):

  • VideoDecoder.decode
  • VideoFrame.createImageBitmap
  • a second createImageBitmap call to apply imageOrientation: flipY (it seems that Chrome currently doesn't support the ImageBitmapOptions parameter on VideoFrame)
  • ImageBitmapRenderingContext.transferFromImageBitmap
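
Roughly, this is the shape of the code today. A minimal sketch, assuming the VideoFrame.createImageBitmap() method as currently shipped in Chrome and a 'bitmaprenderer' canvas context; exact names may differ between spec drafts:

```ts
const canvas = document.querySelector('canvas')!;
const ctx = canvas.getContext('bitmaprenderer') as ImageBitmapRenderingContext;

const decoder = new VideoDecoder({
  output: async (frame: VideoFrame) => {
    // 1. VideoFrame -> ImageBitmap (presumably a YUV -> RGB conversion).
    const bitmap: ImageBitmap = await (frame as any).createImageBitmap();
    // 2. A second createImageBitmap just to flip vertically, since the
    //    options bag doesn't seem to be honored on the VideoFrame overload.
    const flipped = await createImageBitmap(bitmap, { imageOrientation: 'flipY' });
    // 3. Hand the bitmap over to the canvas.
    ctx.transferFromImageBitmap(flipped);
    // Lifecycle bookkeeping: release what we no longer need.
    bitmap.close();
    frame.close(); // close()/destroy() depending on the draft
  },
  error: (e: DOMException) => console.error(e),
});
// decoder.configure(...) and decoder.decode(chunk) elided for brevity.
```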

Under the hood, I suspect this converts the VideoFrame's YUV to ImageBitmap RGB rather than rendering the YUV planes of the video frame directly in the WebGL canvas. (Please let me know if I'm wrong.)
There's also some lifecycle management needed on the frames and bitmaps to close and destroy them.

Ideally, the video frames (or perhaps the decoder) should have a method to render directly to a canvas. This is similar to how Android's decoder operates, and for lack of a better description it "feels" closer to achieving zero copy.
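
To make that concrete, here is a purely hypothetical sketch of what such a method could look like; renderTo and transferFromVideoFrame are made-up names, not existing or proposed API:

```ts
const canvas = document.querySelector('canvas')!;
const ctx = canvas.getContext('bitmaprenderer')!;

const decoder = new VideoDecoder({
  output: (frame: VideoFrame) => {
    // Option A (hypothetical): the frame can paint itself onto a canvas,
    // much like releasing an output buffer to a Surface on Android.
    (frame as any).renderTo(canvas);
    // Option B (hypothetical): a rendering context accepts frames directly.
    // (ctx as any).transferFromVideoFrame(frame);
    frame.close();
  },
  error: (e: DOMException) => console.error(e),
});
```

Either shape would let the implementation keep the frame in its native YUV representation all the way to compositing, instead of forcing the round trip through ImageBitmap.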
