Chromium GStreamer Backend

Chromium, GStreamer, MediaProcess, Sandbox, MSE, EME, Zero-Copy, GstPlayer, GstGL, GstChromiumHttpSrc, Build, Tips, Maintenance, UnitTests, Upstream, Issues, GstConf2015

Current branching point from official chromium/src

1553708cc722dcc8ae421838d8232d3a0e10faff (Mon Apr 4 2016) - tag 51.0.2700.0

Project description

This is an experimental project that aims to use GStreamer as the media player in the Chromium browser. We introduced a dedicated Media Process that maintains the GStreamer pipelines. The Media Process is sandboxed at the same level as the GPU Process. GstPlayer is used to construct and maintain the GStreamer pipelines. Each HTML5 video tag is backed by a GStreamer pipeline that lives in the Media Process.

License

Same as official chromium/src source code: here.

Current supported features

  • Progressive streaming (HTTP)
  • Adaptive streaming (HLS, DASH)
  • Media Source Extensions (YouTube)
  • Encrypted Media Extensions (protected content)
  • Zero-copy (dmabuf export / EGLImage import / cross-process)

Talk at GStreamer Conference 2015

List of modified and added files

05/10/2015: (just to give an idea of the delta from official chromium/src)
79 files modified, 996 insertions(+), 23 deletions(-)
64 files added, 10474 insertions(+)
git diff --diff-filter=AM --stat sha-1 HEAD

Build

Start from a working build of official Chromium Browser. Then refer to this section to build the Chromium GStreamer Backend.

Media Process overview

There is only one Media Process, regardless of the number of video tags on a page or the number of pages (i.e. tabs). GStreamer is used almost exclusively through the high-level GstPlayer API (e.g. gst_player_new, gst_player_play, gst_player_pause, gst_player_seek). This greatly reduces the amount of GStreamer code and avoids the mistakes that come with using the lower-level GStreamer API. The video rendering part is an exception, because GstGL needs to be set up to use the GPU Process; in short, we pass an external get_proc_address to GstGL. Indeed, the Media Process does not load OpenGL libraries; it uses the Chromium API to forward OpenGL calls to the GPU Process, which is the only sandboxed process that is allowed to load GL libraries. GstChromiumHttpSrc is another exception: it is a GStreamer element that wraps the Chromium API to load a URL.
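
To give an idea of how little player code this requires, here is a minimal standalone GstPlayer sketch (outside of any Chromium integration, using the default renderer rather than the GPU Process path described below); it only exercises public GstPlayer API:

// Minimal standalone GstPlayer sketch: the same high-level calls
// (new / set_uri / play / pause / seek) that MediaPlayerGStreamer relies on.
#include <gst/gst.h>
#include <gst/player/player.h>

int main(int argc, char* argv[]) {
  gst_init(&argc, &argv);

  // NULL renderer and dispatcher: let GstPlayer pick defaults. In the Media
  // Process the renderer is a glimagesink driven through the GPU Process.
  GstPlayer* player = gst_player_new(nullptr, nullptr);
  gst_player_set_uri(player, "http://www.w3schools.com/html/mov_bbb.ogg");
  gst_player_play(player);

  // In the real backend these calls are issued from IPC handlers, not inline.
  gst_player_seek(player, 10 * GST_SECOND);
  gst_player_pause(player);

  GMainLoop* loop = g_main_loop_new(nullptr, FALSE);
  g_main_loop_run(loop);
  g_object_unref(player);
  return 0;
}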

Media Process stack

GstGLContextGPUProcess: a new GstGL backend that allows building GstGL's vtable from a given get_proc_address, i.e. chromium::gles2::GetGLFunctionPointer. See gpu-accelerated-compositing-in-chrome: “From the client's perspective, an application has the option to either write commands directly into the command buffer or use the GL ES 2.0 API via a client side library that we provide which handles the serialization behind the scenes. Both the compositor and WebGL currently use the GL ES client side library for convenience. On the server side, commands received via the command buffer are converted to calls into either desktop OpenGL or Direct3D via ANGLE.” The Media Process has its own connection to the GPU Process and uses this second mechanism to forward GL calls to it. In the Media Process all GL calls happen on the GLThread.
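
Conceptually, the backend only needs a resolver to populate GstGL's function table. The snippet below is purely illustrative (the struct and names are not the real GstGLContextGPUProcess code); it just shows the "vtable from get_proc_address" idea:

// Illustration only: how a GL dispatch table can be filled from an external
// resolver such as chromium::gles2::GetGLFunctionPointer instead of dlopen().
// The real implementation is the GstGLContextGPUProcess backend.
typedef void (*GLFunc)();
typedef GLFunc (*GetProcAddressFunc)(const char* name);

struct GLES2Table {  // illustrative subset of the GstGL vtable
  void (*GenTextures)(int n, unsigned int* textures);
  void (*BindTexture)(unsigned int target, unsigned int texture);
};

static void FillGLTable(GLES2Table* table, GetProcAddressFunc get_proc_address) {
  // Every entry point is resolved through the command-buffer client library,
  // so no GL library is ever loaded inside the (sandboxed) Media Process.
  table->GenTextures = reinterpret_cast<void (*)(int, unsigned int*)>(
      get_proc_address("glGenTextures"));
  table->BindTexture = reinterpret_cast<void (*)(unsigned int, unsigned int)>(
      get_proc_address("glBindTexture"));
}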

MediaPlayerGStreamer: created in the Media Process for each video tag. It receives player commands (load, play, pause, seek) from the WebMediaPlayerGStreamer that lives in a Renderer Process, and it uses the gst-player library to play a stream. In the handler of glimagesink's “client-draw” signal, the current GL texture is wrapped using glGenMailboxCHROMIUM / glProduceTextureDirectCHROMIUM (see mailbox in gpu-accelerated-compositing-in-chrome). The mailbox name for each GL texture is forwarded to WebMediaPlayerGStreamer through IPC (MediaChannel). This could be avoided by making MediaPlayerGStreamer inherit from cc::VideoFrameProvider, so that the GL texture id does not have to be sent to the Renderer Process (to stay in sync with the web compositor). Indeed, according to oop-iframes-rendering, it is theoretically possible to make our Media Process act as a kind of SubFrame renderer. By registering a cc::VideoLayer in the Media Process, the ubercompositor would handle the synchronization with the Frame in the Renderer Process. We suspect it is mostly a matter of having a cc::VideoFrameProvider::Client in the Media Process.
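
A hedged sketch of what the “client-draw” handler boils down to is shown below. The GStreamer calls are real API; the CHROMIUM mailbox entry points are assumed to be exposed by the GLES2 client library (as in gpu/GLES2/gl2extchromium.h at this branch point), and SendMailboxToRenderer() is a hypothetical stand-in for the MediaChannel IPC send:

// Hedged sketch of a glimagesink "client-draw" handler in the Media Process:
// grab the current GL texture from the sample and publish it via a CHROMIUM
// mailbox so the Renderer Process can reference it.
#include <gst/gst.h>
#include <gst/video/video.h>
#include <gst/gl/gl.h>
#include <GLES2/gl2.h>
#include "gpu/GLES2/gl2extchromium.h"  // CHROMIUM mailbox entry points (assumed available)

static void SendMailboxToRenderer(const GLbyte* mailbox);  // hypothetical IPC helper

static gboolean OnClientDraw(GstElement* sink, GstGLContext* context,
                             GstSample* sample, gpointer user_data) {
  GstVideoInfo info;
  gst_video_info_from_caps(&info, gst_sample_get_caps(sample));

  GstVideoFrame frame;
  if (!gst_video_frame_map(&frame, &info, gst_sample_get_buffer(sample),
                           (GstMapFlags)(GST_MAP_READ | GST_MAP_GL)))
    return FALSE;

  // With GstGLMemory, data[0] holds the GL texture id.
  GLuint texture = *(GLuint*)frame.data[0];

  // Wrap the texture in a mailbox so another command-buffer client can use it.
  GLbyte mailbox[64];  // 64 == GL_MAILBOX_SIZE_CHROMIUM
  glGenMailboxCHROMIUM(mailbox);
  glProduceTextureDirectCHROMIUM(texture, GL_TEXTURE_2D, mailbox);
  SendMailboxToRenderer(mailbox);

  gst_video_frame_unmap(&frame);
  return TRUE;  // skip glimagesink's own drawing
}

// Connected with: g_signal_connect(sink, "client-draw", G_CALLBACK(OnClientDraw), NULL);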

WebMediaPlayerGStreamer: created in a Renderer Process for each video tag. It inherits from blink::WebMediaPlayer and cc::VideoFrameProvider. The GL texture is retrieved from a mailbox name and wrapped into a media::VideoFrame using media::VideoFrame::WrapNativeTexture. At any time the compositing engine is free to call GetCurrentFrame(). This last part could be avoided, see the MediaPlayerGStreamer description.

GstChromiumHttpSrc: a GStreamer element that wraps the Chromium media::BufferedDataSource and instantiates a content::WebURLLoader. It was not possible to re-use WebKitWebSrc because some interfaces, such as PlatformMediaResourceLoader, are missing or have been removed. It still borrows some parts of WebKitWebSrc, though it is not clear that this is necessary. Unlike WebKitWebSrc it does not use appsrc; due to the media::BufferedDataSource design it is more similar to filesrc. This is clearly an area to improve: maybe we can get rid of media::BufferedDataSource and implement the missing interfaces in order to re-use WebKitWebSrc (indeed, internally it uses ResourceDispatcher, see multi-process-resource-loading), or maybe we can design a completely new element and interfaces that could be shared between Blink and WebKit.

Media Process sandbox

The Media Process has limited access to resources. For example, it cannot create network sockets; it has to use the Chromium API to ask the Browser Process to create a socket on its behalf, similar to a Renderer Process. Also, all system calls are filtered using the Linux kernel feature Seccomp-BPF (link).

We opted for a design similar to the GPU Process sandbox, using Seccomp-BPF, see the chromium sandbox doc. The whole sequence described below already exists for the GPU Process; in a sense the Media Process just defines its own policy for system calls, see bpf_media_policy_linux.cc. A main difference is that the Media Process allows loading GStreamer plugins dynamically, but the list can be restricted and the open calls are done in the broker process.

In the Media Process main, the sandbox is set up before any thread creation. There is also a pre-sandbox step where it is possible to dlopen particular resources. A routine is executed once per system call number; in this loop it is decided whether to allow the call, refuse it, or install a handler to emulate it. All of this is set up through Seccomp-BPF. A trap handler (SECCOMP_RET_TRAP) is installed for open and access. Then, when starting the sandbox, the process forks itself (see BrokerProcess::Init in chromium/src/sandbox/linux/syscall_broker/broker_process). The child becomes the so-called Media Broker Process, which instantiates a BrokerHost; the Media Process instantiates a BrokerClient. When an open or access call happens, a catchable SIGSYS is raised. In the trap handler the BrokerClient emulates the open/access call: it asks the BrokerHost, through IPC, to do the real call. Depending on the policy that has been set up, the path is allowed or rejected. If the call is accepted, the return value is passed back to the BrokerClient, i.e. the Media Process, for use.
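
As an illustration, a media policy in the style of Chromium's bpf_dsl could look like the sketch below. This is not the real bpf_media_policy_linux.cc; the handler name and broker plumbing are assumptions, and the real policy is far more selective about what it allows:

// Hedged sketch, in the style of Chromium's bpf_dsl sandbox policies, of the
// idea behind bpf_media_policy_linux.cc: most syscalls GStreamer needs are
// allowed, sockets are refused, and open/access are trapped so the broker
// process can emulate them against a path whitelist.
#include <errno.h>
#include <sys/syscall.h>

#include "sandbox/linux/bpf_dsl/bpf_dsl.h"
#include "sandbox/linux/bpf_dsl/policy.h"

using sandbox::bpf_dsl::Allow;
using sandbox::bpf_dsl::Error;
using sandbox::bpf_dsl::ResultExpr;
using sandbox::bpf_dsl::Trap;

// Hypothetical SIGSYS handler: asks the broker (over IPC) to perform the real
// open()/access() call and returns the broker's result to the caller.
intptr_t HandleMediaSIGSYS(const struct arch_seccomp_data& args, void* aux);

class MediaProcessPolicy : public sandbox::bpf_dsl::Policy {
 public:
  explicit MediaProcessPolicy(void* broker_client) : broker_client_(broker_client) {}

  ResultExpr EvaluateSyscall(int sysno) const override {
    switch (sysno) {
      case __NR_open:
      case __NR_access:
        // SECCOMP_RET_TRAP: deliver SIGSYS and emulate the call via the broker.
        return Trap(HandleMediaSIGSYS, broker_client_);
      case __NR_socket:
        // No direct network access; sockets go through the Browser Process.
        return Error(EPERM);
      default:
        // Everything else GStreamer/GLib legitimately needs is allowed here
        // (the real policy is much more selective).
        return Allow();
    }
  }

 private:
  void* broker_client_;
};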

Media Process IPC

IPC between the media process and a render process is set up via the browser process. The browser process's MediaProcessHost is initialized with a connection to the media process during browser startup. Later, WebMediaPlayerDispatcher is used when the render process decides to create a media player for an HTML media element. A first message is sent to the media process, and a direct IPC channel from the render process to the media process is created.

The establish-media-channel request comes from WebMediaPlayerGStreamer to RenderThreadImpl (the main render thread), which creates a MediaChannelHost for the render process and sends an establish-channel message to the MediaMessageFilter in the browser process, which redirects the message through MediaProcessHost to the media process.

Upon receiving the request, the media process establishes the channel (MediaChannel) with the help of MediaChannelFilter and sends back the channel handle, which is then passed by the browser process to the render process media player and used for further IPC communication between the render process and the media process. Messages from the media process to the render process are filtered by MediaChannelHost and dispatched by WebMediaPlayerMessageDispatcher to the right WebMediaPlayerGStreamer instance.

In the other direction, WebMediaPlayerMessageDispatcher dispatches messages from WebMediaPlayerGStreamer via MediaChannelHost to the MediaChannel in the media process. MediaChannel maps each message to the associated media player and calls the callback.
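
For illustration, the player messages carried over the MediaChannel could be declared with Chromium's IPC macros roughly as below; the message names and parameter lists are assumptions, not the actual definitions in this tree:

// Hedged sketch of player messages in Chromium's IPC macro style. The names
// and parameters are illustrative only.
#include <string>

#include "base/time/time.h"
#include "ipc/ipc_message_macros.h"

#define IPC_MESSAGE_START MediaPlayerMsgStart

// Renderer -> Media Process: commands issued by WebMediaPlayerGStreamer,
// routed by player id so MediaChannel can find the right MediaPlayerGStreamer.
IPC_MESSAGE_ROUTED1(MediaPlayerMsg_Load, std::string /* url */)
IPC_MESSAGE_ROUTED0(MediaPlayerMsg_Play)
IPC_MESSAGE_ROUTED0(MediaPlayerMsg_Pause)
IPC_MESSAGE_ROUTED1(MediaPlayerMsg_Seek, base::TimeDelta /* position */)

// Media Process -> Renderer: a new frame is available, identified by the
// CHROMIUM mailbox name that wraps the GL texture.
IPC_MESSAGE_ROUTED3(MediaPlayerHostMsg_FrameAvailable,
                    std::string /* mailbox name */,
                    int /* width */,
                    int /* height */)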

MSE

Media Source Extensions specification.

When the load_type parameter of blink::WebMediaPlayer::load is equal to LoadTypeMediaSource, the URL is formatted to start with mediasourceblob://.

This allows GstPlayer's pipeline to select GstChromiumMediaSrc. This GstBin adds and removes appsrc elements when it receives AddSourceBuffer and RemoveSourceBuffer.
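
A hedged sketch of the appsrc management inside such a bin is shown below; only the GStreamer calls are real API, while the function names and the pad-exposure details are simplified assumptions:

// Hedged sketch of MSE-style source buffer handling: one appsrc per
// SourceBuffer, added to and removed from the source bin on demand.
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

// Called when the render process forwards MediaSource.addSourceBuffer().
static GstElement* AddSourceBuffer(GstBin* media_src_bin, const gchar* id) {
  GstElement* appsrc = gst_element_factory_make("appsrc", id);
  // Timed input; the demuxer/parser downstream interprets the byte stream.
  g_object_set(appsrc, "format", GST_FORMAT_TIME, "is-live", FALSE, NULL);

  gst_bin_add(media_src_bin, appsrc);
  gst_element_sync_state_with_parent(appsrc);
  // The new appsrc's src pad is then exposed on the bin so that a
  // parser/decoder branch can be linked to it.
  return appsrc;
}

// Encoded data arriving over IPC from SourceBuffer.appendBuffer().
static void AppendData(GstElement* appsrc, const guint8* data, gsize size) {
  GstBuffer* buffer = gst_buffer_new_allocate(NULL, size, NULL);
  gst_buffer_fill(buffer, 0, data, size);
  gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer);  // takes ownership
}

// Called on MediaSource.removeSourceBuffer().
static void RemoveSourceBuffer(GstBin* media_src_bin, GstElement* appsrc) {
  gst_element_set_state(appsrc, GST_STATE_NULL);
  gst_bin_remove(media_src_bin, appsrc);
}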

Encoded data is currently sent through IPC messages. We could later consider using shared memory, but this seems efficient enough for now, and it is also what Android does.

Seeking is not implemented yet, but WebKitGTK does not support it either. There is ongoing work on this elsewhere, and it should become easier to handle with the future multiappsrc element.

EME

Encrypted Media Extensions specification.

The ChromiumCommonEncryptionDecrypt GStreamer element factory is registered in MediaPlayerGStreamer to decrypt encrypted streams inside MP4 and WebM containers.

When MediaPlayerGStreamer receives an asynchronous need-key event from ChromiumCommonEncryptionDecrypt (triggered by a GStreamer protection event), it passes the init data from the encrypted stream's pssh box to the render process.

On the key-needed event, WebMediaPlayerGStreamer passes the init data received from the media process to the default CDM. It registers an OnCdmKeysReady callback in order to receive the parsed key data from the web application. The new keys are then sent back to the media process.

When it gets the add-key IPC message from the render process, MediaPlayerGStreamer passes the key down to the ChromiumCommonEncryptionDecrypt element, which unblocks and starts decrypting the encrypted frames in place.
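
A hedged sketch of how the init data can be pulled out of a GStreamer protection event is shown below; gst_event_parse_protection() is real GStreamer API, while SendNeedKeyToRenderer() is a hypothetical stand-in for the IPC to the render process:

// Hedged sketch of the need-key path: extract the init data (pssh payload)
// carried by a GStreamer protection event so it can be posted to the renderer.
#include <gst/gst.h>

static void SendNeedKeyToRenderer(const gchar* system_id,
                                  const guint8* init_data, gsize size);  // hypothetical

static void OnProtectionEvent(GstEvent* event) {
  if (GST_EVENT_TYPE(event) != GST_EVENT_PROTECTION)
    return;

  const gchar* system_id = NULL;  // e.g. the key system UUID
  const gchar* origin = NULL;     // which element produced the event
  GstBuffer* data = NULL;         // raw init data (pssh box payload for mp4)
  gst_event_parse_protection(event, &system_id, &data, &origin);

  GstMapInfo map;
  if (gst_buffer_map(data, &map, GST_MAP_READ)) {
    // Forward the init data; WebMediaPlayerGStreamer hands it to the CDM and
    // later sends the keys back with the add-key IPC message.
    SendNeedKeyToRenderer(system_id, map.data, map.size);
    gst_buffer_unmap(data, &map);
  }
}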

Zero-Copy

Khronos specification.

The main idea is to keep decoded frames in GPU memory all the time. In other words, any round trip to CPU memory has to be avoided. As the rendering is done through OpenGL, the decoded surface has to be converted to a GL texture.

There are two solutions, depending on the hardware:

  • Export the HW-decoded surface as a DMA buffer in the Media Process, then import the DMA buffer into an EGLImage in the GPU Process. When using VA-API and gstreamer-vaapi, this means exporting the surface with vaAcquireBufferHandle and importing the DMA buffer with EGL_EXT_image_dma_buf_import in the GPU Process.
  • Export from the GPU Process using EGL_MESA_image_dma_buf_export and import in the Media Process. This second way is required when using gst-omx, which wraps OpenMAX.

In our experiments we selected the first solution because EGL_EXT_image_dma_buf_import is an EXT extension, whereas EGL_MESA_image_dma_buf_export is still a MESA-specific extension (though it should move to EXT at some point). To experiment with the second solution on desktop, the OpenMAX backend provided by Gallium3D first needs to support Tizonia instead of Bellagio, similar to the improvements we made for the VA-API backend provided by Gallium3D.
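
For reference, the GPU Process side of the first solution boils down to an EGL_EXT_image_dma_buf_import call. The sketch below is simplified (single-plane RGB format, no error handling) and assumes the extension prototypes are available; in practice the entry points may need to be resolved via eglGetProcAddress:

// Hedged sketch: import a dma-buf fd exported by the decoder (e.g. via
// vaAcquireBufferHandle) into an EGLImage and bind it to a GL texture.
#define EGL_EGLEXT_PROTOTYPES
#define GL_GLEXT_PROTOTYPES
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

GLuint ImportDmabufAsTexture(EGLDisplay display, int dmabuf_fd,
                             int width, int height, int stride, int fourcc) {
  const EGLint attribs[] = {
      EGL_WIDTH, width,
      EGL_HEIGHT, height,
      EGL_LINUX_DRM_FOURCC_EXT, fourcc,  // e.g. DRM_FORMAT_ARGB8888
      EGL_DMA_BUF_PLANE0_FD_EXT, dmabuf_fd,
      EGL_DMA_BUF_PLANE0_OFFSET_EXT, 0,
      EGL_DMA_BUF_PLANE0_PITCH_EXT, stride,
      EGL_NONE};

  // No EGLContext and no client buffer are needed for this import target.
  EGLImageKHR image = eglCreateImageKHR(display, EGL_NO_CONTEXT,
                                        EGL_LINUX_DMA_BUF_EXT, NULL, attribs);
  if (image == EGL_NO_IMAGE_KHR)
    return 0;

  // Bind the EGLImage to a texture; the decoded frame never leaves GPU memory.
  // YUV formats would typically use GL_TEXTURE_EXTERNAL_OES instead.
  GLuint texture = 0;
  glGenTextures(1, &texture);
  glBindTexture(GL_TEXTURE_2D, texture);
  glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, (GLeglImageOES)image);
  return texture;
}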

For additional information see slides 17, 18, 19 and 20, and Demo2 from the live recording or the slides.

The following diagram shows the call stack using Gallium drivers ("nouveau" in this case), but we also support the Intel VA-API driver. In theory it should also work on AMD GPUs, because our solution is generic in the sense that it supports all Gallium drivers.

[Diagram: zero-copy call stack with Gallium drivers]

HTML5 video element

# Progressive streaming:
http://www.w3schools.com/html/mov_bbb.ogg
http://www.w3schools.com/html/html5_video.asp

<video width="560" height="320" controls>
    <source src="http://video.webmfiles.org/big-buck-bunny_trailer.webm" type="video/webm">
</video>

# Adaptive streaming
<!-- HLS -->
<video width="480" height="360" controls>
    <source src="http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8" type="application/x-mpegURL">
</video>

Build steps

# GStreamer
gstreamer >= 1.8 is required.

# clone official chromium repositories
git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git (then add it in front of your PATH)
git clone https://chromium.googlesource.com/chromium/chromium

# fetch and sync all sub repos. (run all of this from chromium directory, not chromium/src)
fetch --nohooks chromium # only the very first time
git rebase-update
gclient sync # see proxy section
gclient runhooks # see proxy section
build/install-build-deps.sh

# go to chromium/src and build everything.
cd src
build/install-build-deps.sh
ninja -C out/Release chrome

# checkout Chromium GStreamer Backend
git remote add github_gstbackend https://github.com/Samsung/ChromiumGStreamerBackend
git fetch github_gstbackend
git checkout -b gstbackend --track github_gstbackend/master
git replace 94383df24b3f7dad19633061d583d92a2e7f7a11 1553708cc722dcc8ae421838d8232d3a0e10faff

# build Chromium GStreamer Backend
cd .. # to go to root chromium directory
git rebase-update
gclient sync # see proxy section
gclient runhooks # see proxy section
cd src

# 2 ways to generate ninja build files:
GYP (old but stable, so we recommend it for now) and GN (newer, introduced a few months ago)

# Using GYP to generate ninja build files
GYP_DEFINES="use_sysroot=0 proprietary_codecs=1 ffmpeg_branding=Chrome" build/gyp_chromium -D component=shared_library # if icecc then add linux_use_debug_fission=0 linux_use_bundled_binutils=0 clang=0
ninja -C out/Release chrome chrome_sandbox -jN # if icecc -j60

# Using GN to generate ninja build files
gn clean out/mybuild/
gn args out/mybuild --list
gn args out/mybuild
It should open a file; put the following flags inside:
  is_debug = false
  use_debug_fission = false
  linux_use_bundled_binutils = false
  is_clang = false
  use_sysroot = false
  proprietary_codecs = true
  ffmpeg_branding = Chrome
  is_component_build = true
  enable_nacl = false
  media_use_ffmpeg = false
  media_use_libvpx = false
  media_use_libwebm = false
ninja -C out/mybuild chrome chrome_sandbox -jN # if icecc -j60

# Debug build
If you hit a linker error or a crash, try adding linux_use_bundled_gold=0 or linux_use_gold_flags=0 to GYP_DEFINES

# Using clang instead of gcc
GYP_DEFINES="clang=1 clang_use_chrome_plugins=0 use_sysroot=0" # if icecc then add make_clang_dir=/opt/icecream/ linux_use_debug_fission=0 linux_use_bundled_binutils=0
When launching chrome you might need to adjust LD_LIBRARY_PATH.

# Run without any sandbox
./out/Release/chrome --no-sandbox http://www.w3schools.com/html/mov_bbb.ogg

# Run while disabling setuid and media sandbox
./out/Release/chrome --disable-media-sandbox --disable-setuid-sandbox http://www.w3schools.com/html/mov_bbb.ogg

# Run with all sandboxes enabled
CHROME_DEVEL_SANDBOX=out/Release/chrome_sandbox ./out/Release/chrome http://www.w3schools.com/html/mov_bbb.ogg

# Run with EME enabled:
./out/Release/chrome --no-sandbox --enable-prefixed-encrypted-media http://yt-dash-mse-test.commondatastorage.googleapis.com/unit-tests/2015.html

# Run with zero-copy decoding/rendering:
gst-inspect-1.0 vaapi
LIBVA_DRIVER_NAME=gallium vainfo # or LIBVA_DRIVER_NAME=i965 if you have an Intel GPU; use gallium otherwise
./out/Release/chrome --no-sandbox --use-gl=egl  http://www.w3schools.com/html/html5_video.asp

Proxy

# touch ~/.boto and copy the following 3 lines into ~/.boto
[Boto]
proxy = _proxy_ip
proxy_port = _proxy_port

# run
NO_AUTH_BOTO_CONFIG=~/.boto gclient sync
NO_AUTH_BOTO_CONFIG=~/.boto gclient runhooks

Maintenance

# We have not managed to push the original history of chromium because:
# "remote: error: File chrome_frame/tools/test/reference_build/chrome_frame/chrome_dll.pdb is
#   124.32 MB; this exceeds GitHub's file size limit of 100.00 MB"
# This file is not there anymore but it was there in the past.
# Even using tools like "bfg --strip-blobs-bigger-than 50M" would change all subsequent SHAs
# from the point where it strips a file.
# Also there is no official chromium/src repo on github that we could fork.

# solution: truncate the history to a point where there is no file bigger than 100.00 MB,
# in order to be able to push to github.
# We use "git replace" to allow rebasing between our truncated branch and original branch.

# fake our branch point because we truncated the history
git replace 94383df24b3f7dad19633061d583d92a2e7f7a11 1553708cc722dcc8ae421838d8232d3a0e10faff

# replay chromium original upstream commits on top of our branch with truncated history
git checkout NEW_ORIGIN_SHA
git checkout -b master_new
git rebase 94383df24b3f7dad19633061d583d92a2e7f7a11
git replace $(git rev-parse HEAD) NEW_ORIGIN_SHA

# replay gst backend
git checkout -b gstbackend --track github_gstbackend/master
git rebase master_new
git push github_gstbackend gstbackend:master --force

# replace are located in .git/refs/replace/
git replace -l
git replace -d SHA

Tips

# disable ffmpeg and other decoders
Insert media_use_ffmpeg=0 media_use_libvpx=0 media_use_libwebm=0 into GYP_DEFINES, see build steps.

# disable nacl to reduce build time
Insert disable_nacl=1 into GYP_DEFINES, see build steps.

# command lines options for ./out/Release/chrome
A list of all command line switches is available here: http://peter.sh/experiments/chromium-command-line-switches/

# disable gpu process and particular sandbox at runtime
CHROME_SANDBOX_DEBUGGING=1 CHROME_DEVEL_SANDBOX=out/Release/chrome_sandbox ./out/Release/chrome --allow-sandbox-debugging --disable-render-sandbox --disable-hang-monitor  http://localhost/test/mov_bbb.ogg
CHROME_DEVEL_SANDBOX=out/Release/chrome_sandbox ./out/Release/chrome --disable-render-sandbox --disable-gpu --disable-hang-monitor about:blank

# run chromium in debug mode
CHROME_DEVEL_SANDBOX=out/Debug/chrome_sandbox ./out/Debug/chrome http://www.w3schools.com/html/mov_bbb.ogg
CHROME_DEVEL_SANDBOX=out/Debug/chrome_sandbox ./out/Debug/chrome --media-startup-dialog --allow-sandbox-debugging about:blank

# gdb
./out/Release/chrome --disable-media-sandbox --disable-setuid-sandbox --allow-sandbox-debugging --media-launcher='xterm -title renderer -e gdb --args'
CHROME_DEVEL_SANDBOX=out/Release/chrome_sandbox ./out/Release/chrome --disable-render-sandbox --disable-gpu --disable-hang-monitor --media-startup-dialog --allow-sandbox-debugging  about:blank

# Enable / disable gpu workarounds
./out/Release/chrome --no-sandbox --use-gl=egl --use_virtualized_gl_contexts=0
All workarounds are listed in chromium/src/gpu/config/gpu_driver_bug_workaround_type.h

# disable all extra processes, i.e. run everything in-process inside the browser process.
--single-process

# Run the media process in in-process mode, so it runs as a thread in the Browser Process
--in-media-process

# logging
DVLOG(level) shows in debug builds, e.g. DVLOG(1)
VLOG(level) shows in release builds, e.g. VLOG(1)
--enable-logging=stderr --v=N (level)
--enable-logging=stderr --v=1

# attach debugger
Pass --media-startup-dialog to pause the media process at startup and print its pid.
Then attach a debugger: gdb -p the_pid
In gdb, type "signal SIGUSR1" to resume from the pause, then "c" to continue.

# indent code
Just run "git cl format" to indent the latest commit.

Build and run unit tests

# build and run "content" unit tests
ninja -C out/Release content_unittests
./out/Release/content_unittests

# build several groups of unit tests at once
ninja -C out/Release media_blink_unittests content_unittests media_unittests base_unittests gl_unittests gpu_unittests

# find other groups of unit tests and run one
find chromium/src -name "*unittests.isolate"  # e.g. ./media/blink/media_blink_unittests.isolate
ninja -C out/Release media_blink_unittests

# list all tests within "gpu" unit tests group
./out/Release/gpu_unittests --gtest_list_tests

# run all tests in the "gpu" unit tests group whose name contains "TexSubImage2DFloatDoesClearOnGLES3"
./out/Release/gpu_unittests --gtest_filter=*TexSubImage2DFloatDoesClearOnGLES3* --single-process-tests

# run all tests in the "gpu" unit tests group whose name contains "GPUConfig"
./out/Release/gpu_unittests --gtest_filter=*GPUConfig* --single-process-tests

Build and run browser tests

# list
./content/test/gpu/run_gpu_test.py list

# run
CHROME_DEVEL_SANDBOX=out/Release/chrome_sandbox ./content/test/gpu/run_unittests.py
CHROME_DEVEL_SANDBOX=out/Release/chrome_sandbox ./content/test/gpu/run_gpu_test.py gpu_process

# run browser tests with visible stdout
CHROME_DEVEL_SANDBOX=out/Release/chrome_sandbox content/test/gpu/run_gpu_test.py --browser=release --show-stdout webgl_conformance

Build and run layout tests

# build
ninja -C out/Debug content_shell

# run
./out/Debug/content_shell --run-layout-test --no-sandbox inspector/network/network-domain-filter.html

# official steps
https://www.chromium.org/developers/testing/webkit-layout-tests

Contributing to upstream Chromium

Contributor License Agreements (CLA)

If you signed the CLA with a non-Gmail account, create a Google account from that external email: SignUpWithoutGmail

Submitting a patch

Sign in to codereview.chromium.org using the email account you used to sign the CLA.
Before uploading the patch, run "depot-tools-auth login https://codereview.chromium.org" to authenticate using OAuth2.
It should open a new tab in your browser at "localhost:8090", and after you log in it should say: "The authentication flow has completed." Then you are ready to upload your patch by running "git cl upload".
To submit a patch to an existing CL, just type "git cl issue 1415793003" before uploading.
To make sure it creates a new CL, just type "git cl issue 0".

Issues and roadmap
