<p><em>Base-Art - Projects</em></p>
<h1>GNOME Web Canary is back</h1>
<p>Philippe Normand, 2023-07-08</p>
<p>This is a short <span class="caps">PSA</span> post announcing the return of the <span class="caps">GNOME</span> Web Canary builds.
Read on for the crunchy details.</p>
<p>A couple years ago I was blogging about the <a href="https://base-art.net/Articles/introducing-the-gnome-web-canary-flavor/"><span class="caps">GNOME</span> Web Canary
flavor</a>.
In summary, this special build of <span class="caps">GNOME</span> Web provides a preview of the upcoming
version of the underlying WebKitGTK engine. It is potentially unstable, but it
allows testing features that have not yet shipped in a stable release.</p>
<p>Unfortunately, Canary broke right after <span class="caps">GNOME</span> Web switched to <span class="caps">GTK4</span>, because back
then the WebKit <span class="caps">CI</span> was missing build bots and infrastructure for hosting
WebKitGTK4 build artefacts. Recently, thanks to the efforts of my
<a href="https://igalia.com">Igalia</a> colleagues, <a href="https://www.igalia.com/team/pabelenda">Pablo
Abelenda</a>, <a href="https://www.igalia.com/team/lmoura">Lauro
Moura</a>, <a href="https://www.igalia.com/team/dpino">Diego
Pino</a> and <a href="https://www.igalia.com/team/clopez">Carlos
López</a>, the WebKit <span class="caps">CI</span> now provides WebKitGTK4
build artefacts, hosted on a server kindly provided by Igalia.</p>
<p><img src="https://base-art.net/png/Epiphany-Canary-GTK4.png"/></p>
<p>The installation instructions were already given in the introductory post, but here they are again:</p>
<div class="highlight"><pre><span></span><code>flatpak<span class="w"> </span>--user<span class="w"> </span>remote-add<span class="w"> </span>--if-not-exists<span class="w"> </span>webkit<span class="w"> </span>https://software.igalia.com/flatpak-refs/webkit-sdk.flatpakrepo
flatpak<span class="w"> </span>--user<span class="w"> </span>install<span class="w"> </span>https://nightly.gnome.org/repo/appstream/org.gnome.Epiphany.Canary.flatpakref
</code></pre></div>
<p><strong>Update</strong>:</p>
<p>If you installed the older, pre-<span class="caps">GTK4</span> version of Canary, you might see an error
related to an expired <span class="caps">GPG</span> key. This is due to how I update the WebKit runtime,
and I’ll try to avoid it in future updates. For the time being, you can remove
the flatpak remote and re-add it:</p>
<div class="highlight"><pre><span></span><code>flatpak<span class="w"> </span>--user<span class="w"> </span>remote-delete<span class="w"> </span>webkit
flatpak<span class="w"> </span>--user<span class="w"> </span>remote-add<span class="w"> </span>webkit<span class="w"> </span>https://software.igalia.com/flatpak-refs/webkit-sdk.flatpakrepo
</code></pre></div>
<p>That’s all folks, happy hacking and happy testing.</p>
<h1>WebRTC in WebKitGTK and WPE, status updates, part I</h1>
<p>Philippe Normand, 2023-02-16</p>
<script type="module">
import mermaid from 'https://cdn.jsdelivr.net/npm/mermaid@9/dist/mermaid.esm.min.mjs';
mermaid.initialize({ startOnLoad: true });
</script>
<p>Some time ago we at <a href="https://igalia.com">Igalia</a> embarked on the journey to ship
a <a href="https://gstreamer.freedesktop.org">GStreamer</a>-powered WebRTC backend. This is
a long journey; it is not over, but we have made some progress. This post is the
first of a series providing some insights into the challenges we are facing and
our plans for the next release cycle(s).</p>
<p>Most web engines nowadays bundle a version of <a href="https://webrtc.org/">LibWebRTC</a>;
it is indeed a pragmatic approach. WebRTC is a huge spec, spanning many
protocols, RFCs and codecs. LibWebRTC is in fact a multimedia framework in its
own right, and it’s a very big code-base. I still remember the surprised face of
<a href="https://crisal.io/">Emilio</a> at the <a href="https://webengineshackfest.org/2022/">2022 WebEngines
conference</a> when I told him we had unusual
plans regarding WebRTC support in GStreamer WebKit ports. There are several
reasons for this plan, explained in the <a href="https://wpewebkit.org/about/faq.html"><span class="caps">WPE</span>
<span class="caps">FAQ</span></a>. We worked on a LibWebRTC backend for
the WebKit GStreamer ports, my colleague Thibault Saunier <a href="https://blogs.gnome.org/tsaunier/2018/07/31/webkitgtk-and-wpe-gains-webrtc-support-back/">blogged about
it</a>
but unfortunately this backend has remained disabled by default and not shipped
in the tarballs, for the reasons explained in the <span class="caps">WPE</span> <span class="caps">FAQ</span>.</p>
<p>The GStreamer project nowadays provides a library and a plugin allowing
applications to interact with third-party WebRTC actors. This is in my opinion a
paradigm shift, because it enables new ways of interoperability between the
so-called Web and traditional native applications. Since the GstWebRTC
announcement back in late 2017 I’ve been experimenting with the idea of shipping
an alternative to LibWebRTC in WebKitGTK and <span class="caps">WPE</span>. The <a href="https://github.com/WebKit/WebKit/commit/a530847a7ce739e1b8b7a55cf1f5ae41a0938a11">initial GstWebRTC WebKit
backend</a>
was merged upstream on March 18, 2022.</p>
<p>As you might already know, before any audio/video call your browser may ask for
permission to access your webcam and microphone, and during the call you can now
share your screen. From the WebKitGTK/<span class="caps">WPE</span> perspective the procedure is of
course the same. Let’s dive in.</p>
<h2>WebCam/Microphone capture</h2>
<p>Back in 2018 for the LibWebRTC backend, Thibault added support for
GStreamer-powered media capture to WebKit, meaning that capture devices such as
microphones and webcams would be accessible from WebKit applications using the
<a href="https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia">getUserMedia</a>
spec. Under the hood, a GStreamer source element is created, using the
<a href="https://gstreamer.freedesktop.org/documentation/gstreamer/gstdevice.html?gi-language=c">GstDevice</a>
<span class="caps">API</span>. This implementation is now re-used for the GstWebRTC backend, it works
fine, still has room for improvements but that’s a topic for a follow-up post.</p>
<p>A MediaStream can be rendered in an <code><audio></code> or <code><video></code> element through a
custom GStreamer source element that we also provide in WebKit. This is all
internally wired up, so that the following <span class="caps">JS</span> code makes the WebView
natively capture and render a webcam device using GStreamer:</p>
<div class="highlight"><pre><span></span><code><span class="p"><</span><span class="nt">html</span><span class="p">></span>
<span class="p"><</span><span class="nt">head</span><span class="p">></span>
<span class="p"><</span><span class="nt">script</span><span class="p">></span>
<span class="w"> </span><span class="nx">navigator</span><span class="p">.</span><span class="nx">mediaDevices</span><span class="p">.</span><span class="nx">getUserMedia</span><span class="p">({</span><span class="nx">video</span><span class="o">:</span><span class="w"> </span><span class="kc">true</span><span class="p">,</span><span class="w"> </span><span class="nx">audio</span><span class="o">:</span><span class="w"> </span><span class="kc">false</span><span class="w"> </span><span class="p">}).</span><span class="nx">then</span><span class="p">((</span><span class="nx">mediaStream</span><span class="p">)</span><span class="w"> </span><span class="p">=></span><span class="w"> </span><span class="p">{</span>
<span class="w"> </span><span class="kd">const</span><span class="w"> </span><span class="nx">video</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="nb">document</span><span class="p">.</span><span class="nx">querySelector</span><span class="p">(</span><span class="s1">'video'</span><span class="p">);</span>
<span class="w"> </span><span class="nx">video</span><span class="p">.</span><span class="nx">srcObject</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="nx">mediaStream</span><span class="p">;</span>
<span class="w"> </span><span class="nx">video</span><span class="p">.</span><span class="nx">onloadedmetadata</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">()</span><span class="w"> </span><span class="p">=></span><span class="w"> </span><span class="p">{</span>
<span class="w"> </span><span class="nx">video</span><span class="p">.</span><span class="nx">play</span><span class="p">();</span>
<span class="w"> </span><span class="p">};</span>
<span class="w"> </span><span class="p">});</span>
<span class="w"> </span><span class="p"></</span><span class="nt">script</span><span class="p">></span>
<span class="p"></</span><span class="nt">head</span><span class="p">></span>
<span class="p"><</span><span class="nt">body</span><span class="p">></span>
<span class="p"><</span><span class="nt">video</span><span class="p">/></span>
<span class="p"></</span><span class="nt">body</span><span class="p">></span>
<span class="p"></</span><span class="nt">html</span><span class="p">></span>
</code></pre></div>
<p>When this web page is rendered, and after the user has granted access to capture
devices, the GStreamer backend will create not one but two pipelines.</p>
<pre class="mermaid">
flowchart LR
pipewiresrc-->videoscale-->videoconvert-->videorate-->valve-->appsink
</pre>
<pre class="mermaid">
flowchart LR
subgraph mediastreamsrc
appsrc-->srcghost[src]
end
subgraph playbin3
subgraph decodebin3
end
subgraph webkitglsink
end
decodebin3-->webkitglsink
end
srcghost-->decodebin3
</pre>
<p>The first pipeline routes video frames from the capture device using
<code>pipewiresrc</code> to an <code>appsink</code>. From the <code>appsink</code>, our capturer, which leverages
the Observer design pattern, notifies its observers. In this case there is only
one observer, a GStreamer source element internal to WebKit called
<code>mediastreamsrc</code>. The playback pipeline shown above is heavily simplified, in
reality more elements are involved, but what matters most is that thanks to the
flexibility of GStreamer, we can leverage the existing MediaPlayer backend that
we at Igalia have been maintaining for more than 10 years to render
MediaStreams. All we needed was a custom source element; the rest of our
MediaPlayer didn’t need many changes to support this use-case.</p>
<p>One notable change we made since the initial implementation, though, is that for us
a MediaStream can be raw, encoded or even encapsulated in an <span class="caps">RTP</span> payload. So
depending on which component is going to render the MediaStream, we have enough
flexibility to allow zero-copy, in most scenarios. In the example above,
typically the stream will be raw from source to renderer. However, some webcams
can provide encoded streams. <span class="caps">WPE</span> and WebKitGTK will be able to internally
leverage these and in some cases allow for direct streaming from hardware device
to outgoing PeerConnection without third-party encoding.</p>
<h2>Desktop capture</h2>
<p>There is another <span class="caps">JS</span> <span class="caps">API</span> that allows capturing your screen or a window,
called
<a href="https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getDisplayMedia">getDisplayMedia</a>,
and yes, we also support it! Thanks to the ground-breaking progress of the Linux
desktop in recent years, such as <a href="https://pipewire.org">PipeWire</a> and
<a href="https://flatpak.github.io/xdg-desktop-portal/">xdg-desktop-portal</a>, we can now
stream your favorite desktop environment over WebRTC. Under the hood, when the
WebView is granted access to the desktop capture through the portal, our backend
creates a <code>pipewiresrc</code> GStreamer element, configured to source from the file
descriptor provided by the portal, and we have a healthy raw video stream.</p>
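<p>For illustration, here is a minimal, hypothetical page-side snippet using this <span class="caps">API</span>; the frame-rate cap and the helper names are illustrative assumptions, not part of the backend:</p>

```javascript
// Pure helper building the constraints object handed to getDisplayMedia().
// The frame-rate cap is an illustrative assumption.
function displayConstraints(maxFrameRate = 30) {
  return { video: { frameRate: { max: maxFrameRate } }, audio: false };
}

// In a page, this triggers the xdg-desktop-portal dialog under WebKitGTK/WPE;
// once granted, the resulting MediaStream is previewed in a <video> element.
async function previewDesktop(videoElement, maxFrameRate = 30) {
  const stream = await navigator.mediaDevices.getDisplayMedia(
    displayConstraints(maxFrameRate));
  videoElement.srcObject = stream;
  await videoElement.play();
  return stream;
}
```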
<p>Here’s a <a href="https://www.youtube.com/live/fAbbiyRnJ6I">demo</a>:</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/fAbbiyRnJ6I" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
<h2>WebAudio capture</h2>
<p>What’s more, you can also create a <a href="https://developer.mozilla.org/en-US/docs/Web/API/MediaStreamAudioDestinationNode">MediaStream from a WebAudio
node</a>.
On the backend side, the
<a href="https://github.com/WebKit/WebKit/blob/main/Source/WebCore/Modules/webaudio/MediaStreamAudioSourceGStreamer.cpp">GStreamerMediaStreamAudioSource</a>
fills GstBuffers from the audio bus channels and notifies third-parties
internally observing the MediaStream, such as outgoing media sources, or simply
an <code><audio></code> element that was configured to source from the given MediaStream. I
have no demo for this, you’ll have to take my word for it.</p>
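<p>In lieu of a demo, a minimal, hypothetical sketch of what such a page could look like: an oscillator routed into a MediaStreamAudioDestinationNode, whose stream can then feed an <code><audio></code> element or an outgoing PeerConnection. The tone frequency and helper name are illustrative:</p>

```javascript
// Wire an oscillator into a MediaStreamAudioDestinationNode; the returned
// MediaStream behaves like any capture stream (the frequency is arbitrary).
function buildToneStream(audioContext, frequency = 440) {
  const oscillator = audioContext.createOscillator();
  oscillator.frequency.value = frequency;
  const destination = audioContext.createMediaStreamDestination();
  oscillator.connect(destination);
  oscillator.start();
  return destination.stream;
}
```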
<h2>Canvas capture</h2>
<p>But wait, there is more. Did I hear canvas? Yes, we can feed your favorite
<code><canvas></code> to a MediaStream. The <span class="caps">JS</span> <span class="caps">API</span> is called
<a href="https://developer.mozilla.org/en-US/docs/Web/API/HTMLCanvasElement/captureStream">captureStream</a>,
<a href="https://github.com/WebKit/WebKit/blob/main/Source/WebCore/Modules/mediastream/CanvasCaptureMediaStreamTrack.cpp">its code is actually
cross-platform</a>
but defers to the <code>HTMLCanvasElement::toVideoFrame()</code> method which has a
GStreamer implementation. The code is not optimal yet, though, due to
shortcomings of our current graphics pipeline implementation. Here is a <a href="https://www.youtube.com/live/gW0OzeRpzeU">demo of
Canvas to WebRTC</a> running in the
WebKitGTK MiniBrowser:</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/gW0OzeRpzeU" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
<h2>Wrap-up</h2>
<p>So we’ve got MediaStream support covered. This is only one part of the puzzle
though. We are facing challenges now on the
<a href="https://developer.mozilla.org/en-US/docs/Web/API/RTCPeerConnection">PeerConnection</a>
implementation. MediaStreams are cool, but it’s even better when you can share
them with your friends on the fancy A/V conferencing websites, and we’re not
entirely ready for this yet in WebKitGTK and <span class="caps">WPE</span>. For this reason, WebRTC is not
yet enabled by default in the upcoming WebKitGTK and <span class="caps">WPE</span> 2.40 releases. We’re
just not there yet. In the next part of this series I’ll tackle the
PeerConnection backend, on which we’re working hard these days, both in WebKit
and in GStreamer.</p>
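<p>For context, here is a hypothetical sketch of the standard offer dance any PeerConnection backend has to support; the <span class="caps">STUN</span> server <span class="caps">URL</span> and the signaling object are illustrative assumptions:</p>

```javascript
// Pure helper building an RTCConfiguration (the STUN URL is a placeholder).
function iceConfig(stunUrl = 'stun:stun.example.org') {
  return { iceServers: [{ urls: stunUrl }] };
}

// Standard offer/answer flow: attach local tracks, then send an SDP offer
// through an application-provided signaling channel.
async function startCall(localStream, signaling) {
  const pc = new RTCPeerConnection(iceConfig());
  for (const track of localStream.getTracks()) {
    pc.addTrack(track, localStream);
  }
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send(offer);
  return pc;
}
```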
<p>Happy hacking and as always, all my gratitude goes to my fellow
<a href="https://igalia.com">Igalia</a> comrades for allowing me to keep working on these
domains and to Metrological for funding some of this work. Is your organization
or company interested in leveraging modern WebRTC APIs from WebKitGTK and/or
<span class="caps">WPE</span>? If so please <a href="https://www.igalia.com/contact/">get in touch with us</a> in
order to help us speed up the implementation work.</p>
<h1>Introducing the GNOME Web Canary flavor</h1>
<p>Philippe Normand, 2021-08-02</p>
<p>Today I am happy to unveil <span class="caps">GNOME</span> Web Canary which aims to provide bleeding edge,
most likely very unstable builds of Epiphany, depending on daily builds of the
WebKitGTK development version. Read on to learn more about this.</p>
<p>Until recently the <a class="reference external" href="https://wiki.gnome.org/Apps/Web"><span class="caps">GNOME</span> Web browser</a> was available for end-users in two
flavors. The primary, stable release provides the vanilla experience of the
upstream Web browser. It is shipped as part of the <span class="caps">GNOME</span> release cycle and in
distros. The second flavor, called <a class="reference external" href="https://blogs.gnome.org/mcatanzaro/2018/01/24/announcing-epiphany-technology-preview/">Tech Preview</a>, is oriented towards early
testers of <span class="caps">GNOME</span> Web. It is available as a Flatpak, included in the <span class="caps">GNOME</span>
nightly repo. The builds represent the current state of the <span class="caps">GNOME</span> Web master
branch, the <a class="reference external" href="https://webkitgtk.org">WebKitGTK</a> version it links to is the one provided by the <span class="caps">GNOME</span>
nightly runtime.</p>
<p>Tech Preview is great for users testing the latest development of <span class="caps">GNOME</span> Web, but
what if you want to test features that are not yet shipped in <em>any</em> WebKitGTK
version? Or what if you are a <span class="caps">GNOME</span> Web developer and you want to implement new
features in Web that depend on <span class="caps">API</span> that has not been released yet in WebKitGTK?</p>
<p>Historically, the answer was simply “you can <a class="reference external" href="https://trac.webkit.org/wiki/BuildingGtk">build WebKitGTK yourself</a>“.
However, this requires some knowledge and a good build machine (or a lot of
patience). Even as WebKit developer builds have become easier to produce thanks
to the Flatpak <span class="caps">SDK</span> we provide, you would still need to somehow make Epiphany
detect your local build of WebKit. Other browsers offer nightly or “Canary”
builds which don’t have such requirements. This is exactly what Epiphany Canary
aims to do! Without building WebKit yourself!</p>
<p>A brief interlude about the term: Canary typically refers to highly unstable
builds of a project; they are named after <a class="reference external" href="https://en.wikipedia.org/wiki/Sentinel_species">Sentinel species</a>. Canary birds were
taken into mines to warn coal miners of the presence of carbon monoxide. For instance
<a class="reference external" href="https://www.google.com/chrome/canary/">Chrome has been providing Canary builds</a> of its browser for a long time. These
builds are useful because they allow early testing, by end-users. Hence
potentially early detection of bugs that might not have been detected by the
usual automated test harness that buildbots and <span class="caps">CI</span> systems run.</p>
<p>To similar ends, a new build profile and icon were added in Epiphany, along with
a new Flatpak manifest. Everything is now nicely integrated in the Epiphany
project <span class="caps">CI</span>. WebKit builds are already done for every upstream commit using the
<a class="reference external" href="https://build.webkit.org/">WebKit Buildbot</a>. As those builds are made with the WebKit Flatpak <span class="caps">SDK</span>, they
can be reused elsewhere (x86_64 is the only arch supported for now) as long as
the WebKit Flatpak platform runtime is being used as well. Build artifacts are
saved, compressed, and <a class="reference external" href="https://webkitgtk-release.igalia.com/built-products/">uploaded to a web server</a> kindly hosted and provided by
<a class="reference external" href="https://igalia.com">Igalia</a>. The <span class="caps">GNOME</span> Web <span class="caps">CI</span> now has a new job, called <tt class="docutils literal">canary</tt>, that generates a
build manifest installing the WebKitGTK build artifacts in the build sandbox,
where they can be detected during the Epiphany Flatpak build. The resulting Flatpak
bundle can be downloaded and locally installed. The runtime environment is the
one provided by the WebKit <span class="caps">SDK</span> though, so not exactly the same as the one
provided by <span class="caps">GNOME</span> Nightly.</p>
<p>Back to the two main use-cases, and who would want to use this:</p>
<ul class="simple">
<li>You are a <span class="caps">GNOME</span> Web developer looking for <span class="caps">CI</span> coverage of some shiny new
WebKitGTK <span class="caps">API</span> you want to use from <span class="caps">GNOME</span> Web. Every new merge request on the
<span class="caps">GNOME</span> Web Gitlab repo now produces installable Canary bundles, that can be
used to test the code changes being submitted for review. This bundle is not
automatically updated, though; it’s good only for one-off testing.</li>
<li>You are an early tester of <span class="caps">GNOME</span> Web, looking for bleeding edge versions of
both <span class="caps">GNOME</span> Web and WebKitGTK. You can <a class="reference external" href="https://nightly.gnome.org/repo/appstream/org.gnome.Epiphany.Canary.flatpakref">install Canary</a> using the provided
Flatpakref. Every commit on the <span class="caps">GNOME</span> Web master branch produces an update of
Canary, that users can get through the usual <tt class="docutils literal">flatpak update</tt> or through their
flatpak-enabled app-store.</li>
</ul>
<p><strong>Update:</strong></p>
<p>Due to an issue in the Flatpakref file, the WebKit <span class="caps">SDK</span> flatpak remote
is not automatically added during the installation of <span class="caps">GNOME</span> Web Canary. So it
needs to be manually added before attempting to install the flatpakref:</p>
<pre class="literal-block">
$ flatpak --user remote-add --if-not-exists webkit https://software.igalia.com/flatpak-refs/webkit-sdk.flatpakrepo
$ flatpak --user install https://nightly.gnome.org/repo/appstream/org.gnome.Epiphany.Canary.flatpakref
</pre>
<p>As you can see in the screenshot below, the <span class="caps">GNOME</span> Web branding is clearly
modified compared to the other flavors of the application. The updated logo,
kindly provided by Tobias Bernard, has some yellow tones and the Tech Preview
stripes. Also, the careful reader will notice that the reported WebKitGTK version in
the screenshot is a development build of <span class="caps">SVN</span> revision r280382. Users are
strongly advised to add this information to bug reports.</p>
<img src="https://base-art.net/png/Epiphany-Canary.png"/><p>As WebKit developers we are always interested in getting users’ feedback. I hope
this new flavor of <span class="caps">GNOME</span> Web will be useful for both <span class="caps">GNOME</span> and WebKitGTK
communities. Many thanks to <a class="reference external" href="https://igalia.com">Igalia</a> for sponsoring WebKitGTK build artifacts
hosting and some of the work time I spent on this side project. Also thanks to
Michael Catanzaro, Alexander Mikhaylenko and Jordan Petridis for the reviews in Gitlab.</p>
<h1>Catching up on WebKit GStreamer WebAudio backends maintenance</h1>
<p>Philippe Normand, 2020-11-29</p>
<p>Over the past few months the WebKit development team has been working on
modernizing support for the <a class="reference external" href="https://www.w3.org/TR/webaudio/">WebAudio</a> specification. This post highlights some
of the changes that were recently merged, focusing on the GStreamer ports.</p>
<p>My fellow WebKit colleague, Chris Dumez, has been very active lately, updating
the WebAudio implementation for the Mac ports in order to comply with the latest
changes to the specification. His contributions have been documented in the
Safari Technology Preview release notes for <a class="reference external" href="https://webkit.org/blog/11294/release-notes-for-safari-technology-preview-113/">version 113</a>, <a class="reference external" href="https://webkit.org/blog/11294/release-notes-for-safari-technology-preview-114/">version 114</a>,
<a class="reference external" href="https://webkit.org/blog/11294/release-notes-for-safari-technology-preview-115/">version 115</a> and <a class="reference external" href="https://webkit.org/blog/11294/release-notes-for-safari-technology-preview-116/">version 116</a>. This is great for the WebKit project! Since
the initial implementation landed around 2011, there wasn’t much activity and
over the years our implementation started lagging behind other web engines in
terms of features and spec compliance. So, many thanks, Chris; I think you’re
making a lot of WebAudio web developers very happy these days :)</p>
<p>The flip side of the coin is that some of these changes broke the GStreamer
backends; as Chris focuses mostly on the Apple ports, a few bugs slipped in,
noticed by the <span class="caps">CI</span> test bots and dutifully <a class="reference external" href="https://trac.webkit.org/wiki/WebKitGTK/Gardening/Howto">gardened</a> by our bot sheriffs. <a class="reference external" href="https://base-art.net/Articles/1/">Those
backends were upstreamed in 2012</a> and since then I didn’t devote much time to
their maintenance, aside from casual bug-fixing.</p>
<p>One of the WebAudio features recently supported by WebKit is <a class="reference external" href="https://webaudio.github.io/web-audio-api/#audioworklet">the Audio Worklet
interface</a> which allows applications to perform audio processing in a dedicated
thread, thus relieving some pressure off the main thread and ensuring a
glitch-free WebAudio rendering. I added support for this feature in <a class="reference external" href="https://trac.webkit.org/changeset/268579">r268579</a>.
Folks eager to test this can try the <span class="caps">GTK</span> nightly MiniBrowser with the demos:</p>
<pre class="literal-block">
$ wget https://trac.webkit.org/export/270226/webkit/trunk/Tools/Scripts/webkit-flatpak-run-nightly
$ chmod +x webkit-flatpak-run-nightly
$ python3 webkit-flatpak-run-nightly --gtk MiniBrowser https://googlechromelabs.github.io/web-audio-samples/audio-worklet/
</pre>
<p>For many years our AudioFileReader implementation was limited to mono and stereo
audio layouts. This limitation was lifted in <a class="reference external" href="https://trac.webkit.org/changeset/269104">r269104</a>, allowing for
processing of up to 5.1 surround audio files in the <a class="reference external" href="https://www.w3.org/TR/webaudio/#AudioBufferSourceNode">AudioBufferSourceNode</a>.</p>
<p>Our AudioDestination, used for audio playback, was only able to render stereo.
It is now able to probe the GStreamer platform audio sink for the maximum number
of channels it can handle, since <a class="reference external" href="https://trac.webkit.org/changeset/268727">r268727</a>. Support for <a class="reference external" href="https://www.w3.org/TR/webaudio/#dom-audiocontext-getoutputtimestamp">AudioContext
getOutputTimestamp</a> was hooked up in the GStreamer backend in <a class="reference external" href="https://trac.webkit.org/changeset/266109">r266109</a>.</p>
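<p>As an aside, here is a hypothetical sketch of how a page can use getOutputTimestamp to estimate output latency; the helper name is illustrative:</p>

```javascript
// Estimate output latency: currentTime is the context's write position, while
// getOutputTimestamp().contextTime is the position currently audible on the
// device; the difference approximates the pipeline latency in seconds.
function outputLatencySeconds(currentTime, outputTimestamp) {
  return Math.max(0, currentTime - outputTimestamp.contextTime);
}

// In a page: outputLatencySeconds(ctx.currentTime, ctx.getOutputTimestamp());
```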
<p>The WebAudio spec has a <a class="reference external" href="https://www.w3.org/TR/webaudio/#mediastreamaudiosourcenode">MediaStreamAudioDestinationNode</a> for MediaStreams,
allowing audio samples coming from the WebAudio pipeline to be fed to outgoing
WebRTC streams. Since <a class="reference external" href="https://trac.webkit.org/changeset/269827">r269827</a> the GStreamer ports support this feature as
well! Similarly, incoming WebRTC streams or capture devices can stream their
audio samples to a WebAudio pipeline; this has been supported for a couple of
years already, contributed by my colleague Thibault Saunier.</p>
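<p>The page-side counterpart of that feature is the MediaStreamAudioSourceNode; a hypothetical sketch feeding a captured MediaStream into an analyser node:</p>

```javascript
// Route a captured MediaStream (microphone, incoming WebRTC track) into a
// WebAudio analyser, from which the page can read time/frequency domain data.
function analyseStream(audioContext, mediaStream) {
  const source = audioContext.createMediaStreamSource(mediaStream);
  const analyser = audioContext.createAnalyser();
  source.connect(analyser);
  return analyser;
}
```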
<p>Our GStreamer FFTFrame implementation was broken for a few weeks, while Chris
was landing various improvements for the platform-agnostic and mac-specific
implementations. I finally fixed it in <a class="reference external" href="https://trac.webkit.org/changeset/267471">r267471</a>.</p>
<p>This is only the tip of the iceberg. A few more patches were merged, including
some security-related bug-fixes. As the Web Platform keeps growing, supporting
more and more multimedia-related use-cases, we, at the <a class="reference external" href="https://www.igalia.com/technology/multimedia">Igalia Multimedia team</a>,
are committed to maintaining our position as GStreamer experts in the WebKit community.</p>
<h1>Web-augmented graphics overlay broadcasting with WPE and GStreamer</h1>
<p>Philippe Normand, 2020-07-02</p>
<p>Graphics overlays are everywhere nowadays in the live video broadcasting
industry. In this post I introduce a new demo relying on <a class="reference external" href="https://gstreamer.freedesktop.org">GStreamer</a> and
<a class="reference external" href="https://wpewebkit.org">WPEWebKit</a> to deliver low-latency web-augmented video broadcasts.</p>
<p>Readers of this blog might remember a few posts about <a class="reference external" href="https://wpewebkit.org">WPEWebKit</a> and a
<a class="reference external" href="https://gstreamer.freedesktop.org">GStreamer</a> element we at <a class="reference external" href="https://igalia.com">Igalia</a> worked on. In december 2018 I <a class="reference external" href="https://base-art.net/Articles/web-overlay-in-gstreamer-with-wpewebkit/">introduced
GstWPE</a> and a few months later blogged about a <a class="reference external" href="https://base-art.net/Articles/html-overlays-with-gstwpe-the-demo/">proof-of-concept application</a>
I wrote for it. So, learning from this first iteration, I wrote another demo!</p>
<p>The first demo was already quite cool, but it had a few downsides:</p>
<ol class="arabic simple">
<li>It worked only on desktop (running in a Wayland compositor). The Wayland
compositor dependency can be a burden in some cases. Ideally we could
imagine GstWPE applications running “in the cloud”, on machines without a <span class="caps">GPU</span>,
on bare metal.</li>
<li>While it was cool to stream to Twitch, YouTube and the like, these platforms
can currently ingest only <span class="caps">RTMP</span> streams. That means the latency introduced can
be quite significant, depending on the network conditions of course, but even
in ideal conditions it was between one and two seconds. This is not
great, in the world we live in.</li>
</ol>
<p>To address the first point, <span class="caps">WPE</span> founding engineer <a class="reference external" href="https://blogs.igalia.com/zdobersek/">Žan Doberšek</a> enabled
software rasterizing support in <span class="caps">WPE</span> and its <a class="reference external" href="https://github.com/igalia/wpebackend-fdo"><span class="caps">FDO</span> backend</a>. This is great because
it allows <span class="caps">WPE</span> to run on machines without <span class="caps">GPU</span> (like continuous integration
builders, test bots) but also “in the cloud” where machines with <span class="caps">GPU</span> are less
affordable than bare metal! Following up, I enabled this feature in GstWPE. The
source element caps template now has <cite>video/x-raw</cite>, in addition to
<cite>video/x-raw(memory:GLMemory)</cite>. To force swrast, you need to set the
<cite>LIBGL_ALWAYS_SOFTWARE=true</cite> environment variable. The downside of swrast is
that you need a good <span class="caps">CPU</span>. Of course it depends on the video resolution and
framerate you want to target.</p>
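<p>For instance, here is a hypothetical pipeline running GstWPE fully on the <span class="caps">CPU</span>; the <span class="caps">URL</span>, caps and sink are just examples, adjust them to your setup:</p>

```shell
# Force software rasterization so wpesrc runs without any GPU.
# wpesrc ships with the GstWPE plugin from gst-plugins-bad.
$ LIBGL_ALWAYS_SOFTWARE=true gst-launch-1.0 \
    wpesrc location="https://gstreamer.freedesktop.org" \
    ! "video/x-raw,width=1280,height=720,framerate=30/1" \
    ! videoconvert ! autovideosink
```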
<p>On the latency front, I decided to switch from <span class="caps">RTMP</span> to WebRTC! This <span class="caps">W3C</span> spec
isn’t only about video chat! With WebRTC, sub-second live one-to-many
broadcasting can be achieved, without much effort, given you have a good <span class="caps">SFU</span>.
For this demo I chose <a class="reference external" href="https://janus.conf.meetecho.com">Janus</a>, because its APIs are well documented, and it’s a
cool project! I’m not sure it would scale very well in large deployments, but
for my modest use-case, it fits very well.</p>
<p>Janus has a plugin called <a class="reference external" href="https://janus.conf.meetecho.com/docs/videoroom.html">video-room</a> which allows multiple participants to
chat. But now imagine a participant only publishing its video stream, and
multiple “clients” connecting to that room without sharing any video or audio
stream of their own: that is one-to-many broadcasting. As it turns out, <a class="reference external" href="https://gstreamer.freedesktop.org">GStreamer</a> applications can
already connect to this video-room plugin using <a class="reference external" href="https://gstreamer.freedesktop.org/documentation/webrtc/index.html?gi-language=c#webrtcbin-page">GstWebRTC</a>! A demo was developed
by <a class="reference external" href="https://github.com/tobiasfriden">tobiasfriden</a> and <a class="reference external" href="https://github.com/saket424">saket424</a> in Python; it recently moved to the <a class="reference external" href="https://gitlab.freedesktop.org/gstreamer/gst-examples">gst-examples</a>
repository. As I kind of prefer to use Rust nowadays (whenever I can, anyway) I
ported this demo to Rust; it was upstreamed in <a class="reference external" href="https://gitlab.freedesktop.org/gstreamer/gst-examples">gst-examples</a> as well. This
specific demo streams the video test pattern to a Janus instance.</p>
<p>Adapting this Janus demo was then quite trivial. By relying on a similar video
mixer approach I used for the first GstWPE demo, I had a GstWPE-powered WebView
streaming to Janus.</p>
<p>The next step was the actual graphics overlays infrastructure. In the first
GstWPE demo I had a basic <span class="caps">GTK</span> <span class="caps">UI</span> for editing the overlays on-the-fly. That
couldn’t be used for this new demo, because I wanted to run it headless. After
doing some research I found a really nice NodeJS app on GitHub, developed
by <a class="reference external" href="https://github.com/moschopsuk">Luke Moscrop</a>, who’s actually one of the main developers of the <a class="reference external" href="https://github.com/bbc/brave">Brave
<span class="caps">BBC</span></a> project. The <a class="reference external" href="https://github.com/moschopsuk/Roses-2015-CasparCG-Graphics">Roses CasparCG Graphics</a> app was developed in the context of
the <a class="reference external" href="https://www.la1tv.co.uk">Lancaster University Students’ Union <span class="caps">TV</span> Station</a>. It starts a
web-server on port 3000 with two main entry points:</p>
<ul class="simple">
<li>An admin web-<span class="caps">UI</span> (in <cite>/admin/</cite>) for creating and managing overlays, like sports score
boards, info banners, and so on.</li>
<li>The target overlay page (in the root location of the server), which is a
web-page without predetermined background, displaying the overlays with <span class="caps">HTML</span>,
<span class="caps">CSS</span> and <span class="caps">JS</span>. This web-page is meant to be fed to <a class="reference external" href="https://casparcg.com">CasparCG</a> (or GstWPE :))</li>
</ul>
<p>After making a few tweaks in this NodeJS app, I can now:</p>
<ol class="arabic simple">
<li>Start the NodeJS app, load the admin <span class="caps">UI</span> in a browser and enable some overlays</li>
<li>Start my native Rust GStreamer/<span class="caps">WPE</span> application, which:<ul>
<li>connects to the overlay web-server</li>
<li>mixes a live video source (a webcam, for instance) with the <span class="caps">WPE</span>-powered overlay</li>
<li>encodes the video stream to H.264, <span class="caps">VP8</span> or <span class="caps">VP9</span></li>
<li>sends the encoded <span class="caps">RTP</span> stream using WebRTC to a Janus server</li>
</ul>
</li>
<li>Let “consumer” clients connect to Janus with their browser, in order to see
the resulting live broadcast.</li>
</ol>
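<p>The mixing and encoding stages of step 2 can be sketched with gst-launch. This is a rough approximation only: the real demo is a Rust application driving webrtcbin and the Janus signalling, which a fakesink stands in for here, and the overlay <span class="caps">URL</span> assumes the NodeJS server runs locally on its default port:</p>

```shell
# Webcam mixed with a WPE-rendered overlay, VP8-encoded and RTP-payloaded.
# The WebRTC/Janus part is omitted; fakesink is a placeholder for it.
$ gst-launch-1.0 compositor name=mix ! videoconvert ! vp8enc deadline=1 \
    ! rtpvp8pay ! fakesink \
    v4l2src ! videoconvert ! mix. \
    wpesrc location="http://127.0.0.1:3000/" ! videoconvert ! mix.
```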
<iframe width="560" height="315" src="https://www.youtube.com/embed/QNZJYOuVGiE" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><p>(If the video doesn’t display, here is the <a class="reference external" href="https://youtu.be/QNZJYOuVGiE">Youtube link</a>.)</p>
<p>This is pretty cool and fun, as my colleague <a class="reference external" href="https://bkardell.com">Brian Kardell</a> mentions in the
video. Working on this new version gave me more ideas for the next one. And very
recently the <a class="reference external" href="https://github.com/Igalia/WPEBackend-fdo/commit/fd14bac16e465083b755675c441e19cc60db04b3">audio rendering protocol</a> was merged in WPEBackend-<span class="caps">FDO</span>! That
means even more use-cases are now unlocked for GstWPE.</p>
<p>This demo’s source code is hosted on <a class="reference external" href="https://github.com/Igalia/gst-wpe-webrtc-demo">Github</a>. Feel free to open issues there, I
am always interested in getting feedback, good or bad!</p>
<p>GstWPE is maintained upstream in <a class="reference external" href="https://gstreamer.freedesktop.org">GStreamer</a> and relies heavily on <a class="reference external" href="https://wpewebkit.org">WPEWebKit</a> and
its <a class="reference external" href="https://github.com/igalia/wpebackend-fdo"><span class="caps">FDO</span> backend</a>. Don’t hesitate to <a class="reference external" href="https://www.igalia.com/contact/">contact us</a> if you have specific
requirements or issues with these projects :)</p>
GStreamer’s playbin3 overview for application developers2020-06-13T12:05:12+02:002020-06-13T12:05:12+02:00Philippe Normandtag:base-art.net,2020-06-13:/Articles/gstreamers-playbin3-overview-for-application-developers/<p>Multimedia applications based on GStreamer usually handle playback with the
<a class="reference external" href="https://gstreamer.freedesktop.org/documentation/tutorials/playback/playbin-usage.html">playbin</a> element. I recently added support for <a class="reference external" href="https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-playbin3.html">playbin3</a> in WebKit. This post
aims to document the changes needed on the application side to support this new
generation flavour of playbin.</p>
<p>So, first off, why is it named playbin3 anyway? The GStreamer 0.10.x series had a
playbin element but a first rewrite (playbin2) made it obsolete in the GStreamer
1.x series. So playbin2 was renamed to playbin. That’s why a second rewrite is
nicknamed playbin3, I suppose :)</p>
<p>Why should you care about playbin3? Playbin3 (and the elements it’s using
internally: parsebin, decodebin3, uridecodebin3 among others) is the result of a
deep re-design of playbin2 (along with decodebin2 and uridecodebin) to better support:</p>
<ul class="simple">
<li>gapless playback</li>
<li>audio cross-fading support (not yet implemented)</li>
<li>adaptive streaming</li>
<li>reduced <span class="caps">CPU</span>, memory and I/O resource usage</li>
<li>faster stream switching and full control over the stream selection process</li>
</ul>
<p>This work was carried out mostly by Edward Hervey, who presented it in
detail at three GStreamer conferences. If you want to learn more about this and the
internals of playbin3 make sure to watch his awesome presentations at the <a class="reference external" href="https://gstconf.ubicast.tv/videos/decodebin3-or-dealing-with-modern-playback-use-cases/">2015
gst-conf</a>, <a class="reference external" href="https://gstconf.ubicast.tv/videos/the-new-gststream-api-design-and-usage/">2016 gst-conf</a> and <a class="reference external" href="https://gstconf.ubicast.tv/videos/lightning-talks/#start=1418&autoplay&timeline">2017 gst-conf</a>.</p>
<p>Playbin3 was added in GStreamer 1.10. It is still considered experimental but in
my experience it works already very well. Just keep in mind you should use at
least the latest GStreamer 1.12 (or even the upcoming 1.14) release before
reporting any issue in Bugzilla. Playbin3 is not a drop-in replacement for
playbin, both elements share only a sub-set of GObject properties and signals.
However, if you don’t want to modify your application source code just yet, it’s
very easy to try playbin3 anyway:</p>
<pre class="literal-block">
$ USE_PLAYBIN3=1 my-playbin-based-app
</pre>
<p>Setting the <cite>USE_PLAYBIN3</cite> environment variable enables a code path inside the
GStreamer playback plugin which swaps the playbin element for the playbin3
element. This trick provides a glimpse of the playbin3 element for the laziest
among us :) The problem is that, depending on your use of playbin, you might get
runtime warnings; here’s an example with the Totem player:</p>
<pre class="literal-block">
$ USE_PLAYBIN3=1 totem ~/Videos/Agent327.mp4
(totem:22617): GLib-GObject-WARNING **: ../../../../gobject/gsignal.c:2523: signal 'video-changed' is invalid for instance '0x556db67f3170' of type 'GstPlayBin3'
(totem:22617): GLib-GObject-WARNING **: ../../../../gobject/gsignal.c:2523: signal 'audio-changed' is invalid for instance '0x556db67f3170' of type 'GstPlayBin3'
(totem:22617): GLib-GObject-WARNING **: ../../../../gobject/gsignal.c:2523: signal 'text-changed' is invalid for instance '0x556db67f3170' of type 'GstPlayBin3'
(totem:22617): GLib-GObject-WARNING **: ../../../../gobject/gsignal.c:2523: signal 'video-tags-changed' is invalid for instance '0x556db67f3170' of type 'GstPlayBin3'
(totem:22617): GLib-GObject-WARNING **: ../../../../gobject/gsignal.c:2523: signal 'audio-tags-changed' is invalid for instance '0x556db67f3170' of type 'GstPlayBin3'
(totem:22617): GLib-GObject-WARNING **: ../../../../gobject/gsignal.c:2523: signal 'text-tags-changed' is invalid for instance '0x556db67f3170' of type 'GstPlayBin3'
sys:1: Warning: g_object_get_is_valid_property: object class 'GstPlayBin3' has no property named 'n-audio'
sys:1: Warning: g_object_get_is_valid_property: object class 'GstPlayBin3' has no property named 'n-text'
sys:1: Warning: ../../../../gobject/gsignal.c:3492: signal name 'get-video-pad' is invalid for instance '0x556db67f3170' of type 'GstPlayBin3'
</pre>
<p>As mentioned previously, playbin and playbin3 don’t share the same set of
GObject properties and signals, so some changes in your application are required
in order to use playbin3.</p>
<p>If your application is based on the <a class="reference external" href="https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad-libs/html/GstPlayer.html">GstPlayer</a> library then you should set the
<cite>GST_PLAYER_USE_PLAYBIN3</cite> environment variable. GstPlayer already handles both
playbin and playbin3, so no changes are needed in your application if you use GstPlayer!</p>
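<p>For instance, assuming a GstPlayer-based application binary named <cite>my-gstplayer-app</cite> (a made-up name):</p>

```shell
$ GST_PLAYER_USE_PLAYBIN3=1 my-gstplayer-app
```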
<p>Ok, so what if your application relies directly on playbin? Some
changes are needed! If you previously used playbin stream selection
properties and signals, you will now need to handle the GstStream and
GstStreamCollection APIs. Playbin3 will emit a <a class="reference external" href="https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstMessage.html#GST-MESSAGE-STREAM-COLLECTION:CAPS">stream collection
message</a> on the bus; this is very nice because the collection
includes information (metadata!) about the streams (or tracks) the
media asset contains. In playbin this was handled with a bunch of
signals (audio-tags-changed, audio-changed, etc), properties (n-audio,
n-video, etc) and action signals (get-audio-tags, get-audio-pad, etc).
The new <a class="reference external" href="https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/gstreamer-GstStream.html">GstStream <span class="caps">API</span></a> provides a centralized and
non-playbin-specific access point for all this information. To
select streams with playbin3 you now need to send a <a class="reference external" href="https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstEvent.html#gst-event-new-select-streams">select_streams
event</a> so that the demuxer can know exactly which streams should be
exposed to downstream elements. That means potentially improved
performance! Once playbin3 completed the stream selection it will emit
a <a class="reference external" href="https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstMessage.html#GST-MESSAGE-STREAMS-SELECTED:CAPS">streams selected message</a>, the application should handle this
message and potentially update its internal state about the selected
streams. This is also the best moment to update your <span class="caps">UI</span> regarding the
selected streams (like audio track language, video track dimensions, etc).</p>
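<p>To make this more concrete, here is a minimal sketch in C of the message handling described above. It selects every audio and video stream found in the collection; a real player would typically pick only one of each, based on user preferences:</p>

```c
#include <gst/gst.h>

/* Sketch: handle playbin3's stream collection message and tell the
 * demuxer which streams to expose. Not a complete player. */
static gboolean
bus_message_cb (GstBus *bus, GstMessage *msg, gpointer user_data)
{
  GstElement *playbin3 = GST_ELEMENT (user_data);

  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_STREAM_COLLECTION) {
    GstStreamCollection *collection = NULL;
    GList *selected = NULL;
    guint i;

    gst_message_parse_stream_collection (msg, &collection);
    for (i = 0; i < gst_stream_collection_get_size (collection); i++) {
      GstStream *stream = gst_stream_collection_get_stream (collection, i);

      /* Per-stream metadata (caps, tags, flags) is available here. */
      if (gst_stream_get_stream_type (stream) &
          (GST_STREAM_TYPE_AUDIO | GST_STREAM_TYPE_VIDEO))
        selected = g_list_append (selected,
            (gchar *) gst_stream_get_stream_id (stream));
    }

    /* Ask the demuxer to expose only the selected streams downstream. */
    gst_element_send_event (playbin3,
        gst_event_new_select_streams (selected));
    g_list_free (selected);
    gst_object_unref (collection);
  }
  return TRUE;
}
```

<p>The subsequent <cite>streams-selected</cite> message can be handled in the same callback to update the application state and <span class="caps">UI</span>.</p>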
<p>Another small difference between playbin and playbin3 is about the source
element setup. In playbin there is a <cite>source</cite> read-only GObject property and a
<cite>source-setup</cite> GObject signal. In playbin3 only the latter is available, so your
application should rely on <cite>source-setup</cite> instead of the <cite>notify::source</cite>
GObject signal.</p>
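<p>A sketch of that setup follows; the <cite>user-agent</cite> value is made up for illustration, and the property only exists on some source elements (souphttpsrc, for example):</p>

```c
#include <gst/gst.h>

/* Sketch: playbin3 has no "source" property to watch with notify::source,
 * so configure the source element from the "source-setup" signal instead. */
static void
source_setup_cb (GstElement *playbin, GstElement *source, gpointer user_data)
{
  /* The available properties depend on the actual source element in use. */
  if (g_object_class_find_property (G_OBJECT_GET_CLASS (source), "user-agent"))
    g_object_set (source, "user-agent", "MyPlayer/1.0", NULL);
}

/* At setup time:
 *   g_signal_connect (playbin3, "source-setup",
 *                     G_CALLBACK (source_setup_cb), NULL);
 */
```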
<p>The <a class="reference external" href="https://cgit.freedesktop.org/gstreamer/gst-plugins-base/tree/tools/gst-play.c">gst-play-1.0</a> playback utility program already supports playbin3 so it
provides a good source of inspiration if you consider porting your application
to playbin3. As mentioned at the beginning of this post, WebKit also now
supports playbin3, however it needs to be enabled at build time using the CMake
<cite>-DUSE_GSTREAMER_PLAYBIN3=<span class="caps">ON</span></cite> option. This feature is not part of the WebKitGTK+
2.20 series but should be shipped in 2.22. As a final note I wanted to
acknowledge my favorite worker-owned coop <a class="reference external" href="https://igalia.com">Igalia</a> for allowing me to work on
this WebKit feature and also our friends over at <a class="reference external" href="https://centricular.com">Centricular</a> for all the
quality work on playbin3.</p>
The GNOME-Shell Gajim extension maintenance2020-06-13T12:05:12+02:002020-06-13T12:05:12+02:00Philippe Normandtag:base-art.net,2020-06-13:/Articles/the-gnome-shell-gajim-extension-maintenance/<p>Back in January 2011 I wrote a <span class="caps">GNOME</span>-Shell <a class="reference external" href="https://extensions.gnome.org/extension/565/gajim-im-integration/">extension</a> allowing Gajim users to
carry on with their chats using the Empathy infrastructure and <span class="caps">UI</span> present in the
Shell. For some time the extension was also part of the official
gnome-shell-extensions module and then I had to move it to Github as a
<a class="reference external" href="https://github.com/philn/gnome-shell-gajim-extension">standalone extension</a>. Sadly I stopped using Gajim a few years ago and my
interest in maintaining this extension has decreased quite a lot.</p>
<p>I don’t know if this extension is actively used by anyone beyond the few bugs
reported in Github, so this is a call for help. If anyone still uses this
extension and wants it supported in future versions of <span class="caps">GNOME</span>-Shell, please send
me a mail so I can transfer ownership of the Github repository and see what I
can do for the extensions.gnome.org page as well.</p>
<p>(Huh, also. Hi blogosphere again! My last post was in 2014 it seems :))</p>
WebKitGTK and WPE now supporting videos in the img tag2020-06-09T18:00:00+02:002020-06-10T09:00:00+02:00Philippe Normandtag:base-art.net,2020-06-09:/Articles/webkitgtk-and-wpe-now-supporting-videos-in-the-img-tag/<p>Using videos in the <cite><img></cite> <span class="caps">HTML</span> tag can lead to more responsive web-page loads
in most cases. Colin Bendell blogged about this topic, make sure to read his
<a class="reference external" href="https://cloudinary.com/blog/evolution_of_img_gif_without_the_gif">post on the cloudinary website</a>. As it turns out, this feature has been
supported for more than 2 years in Safari, but only recently the WebKitGTK and
WPEWebKit ports caught up. Read on for the crunchy details or skip to the end of
the post if you want to try this new feature.</p>
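<p>From a web author’s point of view, using the feature is as simple as pointing an <cite><img></cite> at a video file; a hypothetical example, with a made-up file name:</p>

```html
<!-- An MP4 used where an animated GIF would traditionally go; the
     engine decodes it with its ImageDecoder implementation. -->
<img src="animation.mp4" alt="Looping animation" width="320" height="240">
```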
<p>As WebKitGTK and WPEWebKit already heavily use GStreamer for their multimedia
backends, it was natural for us to also use GStreamer to provide the video
ImageDecoder implementation.</p>
<p>The preliminary step is to hook our new decoder into the <a class="reference external" href="https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/MIMETypeRegistry.cpp">MIMETypeRegistry</a> and
into the <a class="reference external" href="https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/graphics/ImageDecoder.cpp">ImageDecoder.cpp</a> platform-agnostic module. This is where the main
decoder branches out to platform-specific backends. Then we need to add a new
class implementing WebKit’s <a class="reference external" href="https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/graphics/ImageDecoder.h">ImageDecoder</a> virtual interface.</p>
<p>First you need to implement <cite>supportsMediaType()</cite>. For this method we already
had all the code in place. WebKit scans the GStreamer plugin registry and,
depending on the plugins available on the target platform, a mime-type cache is
built by the <a class="reference external" href="https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/graphics/gstreamer/GStreamerRegistryScanner.cpp">RegistryScanner</a>. Our new image decoder just needs to hook into
this component (exposed as a singleton) so that we can be sure that the decoder
will be used only for media types supported by GStreamer.</p>
<p>The next most important parts of the decoder are its constructor and the
<cite>setData()</cite> method. This is the place where the decoder receives encoded data,
as a SharedBuffer, and performs the decoding. Because this method is
synchronously called from a secondary thread, we run the GStreamer pipeline
there, until the video has been decoded entirely. Our pipeline relies on the
GStreamer decodebin element and WebKit’s internal video sink. For the time being
hardware-accelerated decoding is not supported. We should soon be able to fix
this issue though. Once all samples have been received by the sink, the decoder
notifies its caller using a callback. The caller then knows it can request
decoded frames.</p>
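<p>Outside of WebKit, the decoding part of this setup can be approximated on the command line (a rough sketch with a made-up file name; the actual decoder feeds the SharedBuffer data programmatically and uses WebKit’s internal sink rather than a generic one):</p>

```shell
$ gst-launch-1.0 filesrc location=animation.mp4 ! decodebin \
    ! videoconvert ! autovideosink
```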
<p>Last, the decoder needs to provide decoded frames! This is implemented using the
<cite>createFrameImageAtIndex()</cite> method. Our decoder implementation keeps an internal
Map of the decoded samples. We sub-classed the <a class="reference external" href="https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.h">MediaSampleGStreamer</a> to provide
an <cite>image()</cite> method, which returns the Cairo surface representing the decoded
frame. Again, here we don’t support <span class="caps">GL</span> textures yet. Some more infrastructure
work is likely going to be needed in that area of our WebKit ports.</p>
<p>Our implementation of the video ImageDecoder lives in <a class="reference external" href="https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/graphics/gstreamer/ImageDecoderGStreamer.cpp">ImageDecoderGStreamer</a>
which will be shipped in WebKitGTK and WPEWebKit 2.30, around September/October.
But what if you want to try this already? Well, building WebKit can be a tedious
task. We’ve been hard at work at <a class="reference external" href="https://igalia.com">Igalia</a> to make this a bit easier using a <a class="reference external" href="https://base-art.net/Articles/introducing-the-webkit-flatpak-sdk/">new Flatpak <span class="caps">SDK</span></a>.
So, you can either try this feature in Epiphany Tech Preview or
(surprise!) with our new tooling for downloading and running nightly binaries
from the upstream WebKit build bots:</p>
<pre class="literal-block">
$ wget https://raw.githubusercontent.com/WebKit/webkit/master/Tools/Scripts/webkit-flatpak-run-nightly
$ chmod +x webkit-flatpak-run-nightly
$ python3 webkit-flatpak-run-nightly MiniBrowser https://colinbendell.github.io/webperf/img-mp4/
</pre>
<img src="https://base-art.net/png/WebKitGTK-video-img-tag.png"/><p>This script locally installs our new Flatpak-based developer <span class="caps">SDK</span> in
<cite>~/.cache/wk-nightly</cite> and then downloads a zip archive of the build artefacts
from servers recently brought up by my colleague <a class="reference external" href="http://blog.neutrino.es/">Carlos Alberto Lopez Perez</a>,
many thanks to him :). The downloaded zip file is unpacked in <cite>/tmp</cite> and kept
around in case you want to run this again without re-downloading the build
archive. Flatpak is then used to run the binaries inside a sandbox! This is a
nice way to run the bleeding edge of the web-engine, without having to build it
or install any distro package.</p>
<p>Implementing new features in WebKit is one of the many expertise domains we are
involved in at <a class="reference external" href="https://igalia.com">Igalia</a>. <a class="reference external" href="https://www.igalia.com/technology/multimedia">Our multimedia team</a> is always on the lookout to help
folks in their projects involving either GStreamer or WebKit or both! Don’t
hesitate to reach out.</p>
Introducing the WebKit Flatpak SDK2020-06-08T18:50:00+02:002020-06-11T14:50:00+02:00Philippe Normandtag:base-art.net,2020-06-08:/Articles/introducing-the-webkit-flatpak-sdk/<p>Working on a web-engine often requires a complex build infrastructure. This post
documents our transition from JHBuild to <a class="reference external" href="https://flatpak.org">Flatpak</a> for the WebKitGTK and
WPEWebKit development builds.</p>
<p>For the last 10 years, <a class="reference external" href="https://webkitgtk.org">WebKitGTK</a> has been relying on a custom <a class="reference external" href="https://developer.gnome.org/jhbuild/">JHBuild</a>
moduleset to handle its dependencies and (try to) ensure a reproducible test
environment for the build bots. When <a class="reference external" href="https://wpewebkit.org">WPEWebKit</a> was upstreamed several years
ago, a similar approach was used. We ended up with two slightly different
modulesets to maintain. The biggest problem with that is that we still depend on
the host <span class="caps">OS</span> for some dependencies. Another set of scripts was then written to
install those, depending on which distro the host is running… This is a bit
unfortunate. There are more issues with JHBuild: when a moduleset is updated,
the bots wipe all the resulting builds and start a new build from scratch! Every
WebKitGTK and <span class="caps">WPE</span> developer is strongly advised to use this setup, so everybody
ends up building the dependencies.</p>
<p>In 2018, my colleague <a class="reference external" href="https://blogs.gnome.org/tsaunier/">Thibault Saunier</a> worked on a new approach, based on
Flatpak. WebKit could be built as a Flatpak app relying on the <span class="caps">GNOME</span> <span class="caps">SDK</span>. This
experiment was quite interesting but didn’t work for several reasons:</p>
<ul class="simple">
<li>Developers were not really aware of this new approach</li>
<li>The <span class="caps">GNOME</span> <span class="caps">SDK</span> OSTree commits we were pinning were being removed from the
upstream repo periodically, triggering issues on our side</li>
<li>Developers still had to build all the WebKit “app” dependencies</li>
</ul>
<p>In late 2019 I started having a look at this, with a different workflow. I
started to experiment with a custom <span class="caps">SDK</span>, based on the <a class="reference external" href="https://freedesktop-sdk.io/">Freedesktop <span class="caps">SDK</span></a>. The
goal was to distribute this to WebKit developers, who wouldn’t need to worry as
much about build dependencies anymore and just focus on building the damn web-engine.</p>
<p>I first learned about <a class="reference external" href="https://docs.flatpak.org/en/latest/flatpak-builder.html">flatpak-builder</a>, built an <span class="caps">SDK</span> with that, and started
playing with it for WebKit builds. I was almost happy with that, but soon
realized flatpak-builder is cool for app packaging, but for big SDKs, it
doesn’t really work out. Flatpak-builder builds a single recipe at a time and if
I want to update one, everything below in the manifest is rebuilt. As the
journey goes on, I found <a class="reference external" href="https://buildstream.build/">Buildstream</a>, which is actually used by the <span class="caps">FDO</span> and
<span class="caps">GNOME</span> folks nowadays. After converting my flatpak-builder manifest to
Buildstream I finally achieved happiness. Buildstream is a bit like Yocto,
Buildroot and all the similar <span class="caps">NIH</span> sysroot builders. It was one more thing to
learn, but worth it.</p>
<p>The <a class="reference external" href="https://github.com/WebKit/webkit/tree/master/Tools/buildstream"><span class="caps">SDK</span> build definitions</a> are hosted in WebKit’s repository. The resulting
Flatpak images are hosted on Igalia’s server and locally installed in a custom
Flatpak UserDir so as to not interfere with the rest of the host <span class="caps">OS</span> Flatpak
apps. The usual mix of python, perl, ruby WebKit scripts rely on the <span class="caps">SDK</span> to
perform their job of building WebKit, running the various test suites (<span class="caps">API</span>
tests, layout tests, <span class="caps">JS</span> tests, etc). All you need to do is:</p>
<ol class="arabic simple">
<li>clone the WebKit git repo</li>
<li>run <cite>Tools/Scripts/update-webkit-flatpak</cite></li>
<li>run <cite>Tools/Scripts/build-webkit --gtk</cite></li>
<li>run the build artefacts, like the MiniBrowser: <cite>Tools/Scripts/run-minibrowser --gtk</cite></li>
</ol>
<p>Under the hood, the <span class="caps">SDK</span> will be installed in <cite>WebKitBuild/UserFlatpak</cite> and
transparently used by the various build scripts. The sandbox is started using a
<cite>flatpak run</cite> call which bind-mounts the WebKit checkout and build directory.
This is great! We finally have a unified workflow, not depending on specific
host distros. We can easily update toolchains, package handy debug tools like
<a class="reference external" href="https://rr-project.org">rr</a>, <span class="caps">LSP</span> tooling such as <a class="reference external" href="https://github.com/MaskRay/ccls">ccls</a>, etc and let the lazy developers actually focus
on development, rather than tooling infrastructure.</p>
<p>Another nice tool we now support is <a class="reference external" href="https://github.com/mozilla/sccache">sccache</a>. By deploying a “cloud” of
builders on our <a class="reference external" href="https://igalia.com">Igalia</a> servers we can now achieve improved build times. The <span class="caps">SDK</span>
generates a custom sccache config based on the toolchains it includes (currently
<span class="caps">GCC</span> 9.3.0 and clang 8). You can optionally provide an authentication token and
you’re set. Access to the Igalia build cloud is restricted to Igalians, but
folks who have their own sccache infra can easily set it up for WebKit. In my
home office where I used to wait more than 20 minutes for a build to complete, I
can now have a build done in around 12 minutes. Once we have Redis-powered cloud
storage we might reach even better results. This is great because the buildbots
should be able to keep the Redis cache warm enough. Also, developers who rely
on this won’t need powerful build machines anymore; a standard workstation or
laptop should be sufficient because the heavy C++ compilation jobs happen in the “cloud”.</p>
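<p>As a sketch of what such a setup can look like, assuming a self-hosted Redis instance (the endpoint below is a placeholder, not Igalia’s infrastructure) and a build that honors the <cite>CC</cite>/<cite>CXX</cite> environment variables:</p>
<pre class="literal-block">
$ # Use a shared Redis cache for compilation results (placeholder host).
$ export SCCACHE_REDIS="redis://sccache.example.com:6379"
$ # Route compiler invocations through sccache.
$ export CC="sccache gcc" CXX="sccache g++"
$ Tools/Scripts/build-webkit --gtk
</pre>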
<p>If you don’t like sccache, we still support <a class="reference external" href="https://github.com/icecc/icecream">IceCC</a> of course. The main
difference is that IceCC works better on local networks.</p>
<p>Hacking on WebKit often involves hacking on its dependencies, such as <span class="caps">GTK</span>,
GStreamer, libsoup and so on. With JHBuild it was fairly easy to vendor patches
in the moduleset. With Buildstream the process is a bit more complicated, but
actually not too bad! As we depend on the <span class="caps">FDO</span> <span class="caps">SDK</span> we can easily “patch” the
junction file, and adding new dependencies from scratch is also quite easy. Many
thanks to the Buildstream developers for the hard work invested in the project!</p>
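<p>As an illustration of how vendoring a patch can look in a Buildstream element (the element kind, repository <span class="caps">URL</span> and patch path below are made up for the example):</p>
<pre class="literal-block">
kind: autotools
sources:
- kind: git
  url: https://gitlab.gnome.org/GNOME/libsoup.git
  track: master
- kind: patch
  path: patches/libsoup-under-review.patch
</pre>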
<p>As a conclusion, all our WebKitGTK and WPEWebKit <a class="reference external" href="https://build.webkit.org/">bots</a> are now using the <span class="caps">SDK</span>.
JHBuild remains available for now, but on an opt-in basis. The goal is to gently
migrate most of our developers to this new setup and eventually JHBuild will be
phased out. We still have some tasks pending, but we achieved very good progress
on improving the WebKit developer workflow. A few more things are now easier to
achieve with this new setup. Stay tuned for more! Thanks for reading!</p>
HTML overlays with GstWPE, the demo2019-12-08T15:00:00+01:002019-12-08T15:00:00+01:00Philippe Normandtag:base-art.net,2019-12-08:/Articles/html-overlays-with-gstwpe-the-demo/<p>Once again this year I attended the <a class="reference external" href="https://gstreamer.freedesktop.org/conference/2019/">GStreamer conference</a> and just before
that, <a class="reference external" href="https://events19.linuxfoundation.org/events/embedded-linux-conference-europe-2019/">Embedded Linux conference Europe</a> which took place in Lyon (France).
Both events were a good opportunity to demo one of the use-cases I have in mind
for <a class="reference external" href="https://base-art.net/Articles/web-overlay-in-gstreamer-with-wpewebkit/">GstWPE</a>, <span class="caps">HTML</span> overlays!</p>
<p>As we, at <a class="reference external" href="https://igalia.com">Igalia</a>, usually have a …</p><p>Once again this year I attended the <a class="reference external" href="https://gstreamer.freedesktop.org/conference/2019/">GStreamer conference</a> and just before
that, <a class="reference external" href="https://events19.linuxfoundation.org/events/embedded-linux-conference-europe-2019/">Embedded Linux conference Europe</a> which took place in Lyon (France).
Both events were a good opportunity to demo one of the use-cases I have in mind
for <a class="reference external" href="https://base-art.net/Articles/web-overlay-in-gstreamer-with-wpewebkit/">GstWPE</a>, <span class="caps">HTML</span> overlays!</p>
<p>As we, at <a class="reference external" href="https://igalia.com">Igalia</a>, usually have a booth at <span class="caps">ELC</span>, I thought a GstWPE demo would be
nice to have so we can show it there. The demo is a rather simple <span class="caps">GTK</span>
application presenting a live preview of the webcam video capture with an <span class="caps">HTML</span>
overlay blended in. The <span class="caps">HTML</span> and <span class="caps">CSS</span> can be modified using the embedded text
editor and the overlay will be updated accordingly. The final video stream can
even be streamed over <span class="caps">RTMP</span> to the main streaming platforms (Twitch, Youtube,
Mixer)! Here is a screenshot:</p>
<img src="https://base-art.net/png/gst-wpe-broadcast-demo-screenshot.png"/><p>The code of the demo is available on <a class="reference external" href="https://github.com/Igalia/gst-wpe-broadcast-demo">Igalia’s GitHub</a>. Interested people
should be able to try it as well, the app is packaged as a Flatpak. See the
instructions in the GitHub repo for more details.</p>
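<p>A pipeline along these lines gives an idea of how the pieces fit together. This is a simplified sketch, not the demo’s exact code; the <span class="caps">RTMP</span> location and overlay path are placeholders:</p>
<pre class="literal-block">
$ gst-launch-1.0 glvideomixer name=mix ! glcolorconvert ! gldownload \
    ! x264enc tune=zerolatency ! flvmux streamable=true \
    ! rtmpsink location="rtmp://live.example.com/app/stream-key" \
  v4l2src ! videoconvert ! glupload ! mix. \
  wpesrc location="file:///path/to/overlay.html" draw-background=0 ! mix.
</pre>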
<p>Having this demo running at our booth greatly helped us explain GstWPE and
how it can be used in real-life <a class="reference external" href="https://gstreamer.freedesktop.org/">GStreamer</a> applications. Combining the
flexibility of the Multimedia framework with the wide (wild) features of the Web
Platform will for sure increase the synergy and foster collaboration!</p>
<p>As a reminder, GstWPE is available in <a class="reference external" href="https://gstreamer.freedesktop.org/">GStreamer</a> since the 1.16 version. On the
roadmap for the mid-term I plan to add Audio support, thus allowing even better
integration between WPEWebKit and GStreamer. Imagine injecting <span class="caps">PCM</span> audio coming
from WPEWebKit into an audio mixer in your GStreamer-based application! Stay tuned.</p>
Review of the Igalia Multimedia team Activities (2019/H1)2019-08-05T15:30:00+02:002019-08-05T15:30:00+02:00Philippe Normandtag:base-art.net,2019-08-05:/Articles/review-of-the-igalia-multimedia-team-activities-2019h1/<p>This blog post takes a look back at the various Multimedia-related tasks the
<a class="reference external" href="https://www.igalia.com/technology/multimedia">Igalia Multimedia team</a> was involved in during the first half of 2019.</p>
<div class="section" id="gstreamer-editing-services">
<h2>GStreamer Editing Services</h2>
<p>Thibault added support for the <a class="reference external" href="https://github.com/PixarAnimationStudios/OpenTimelineIO">OpenTimelineIO</a> open format for editorial
timeline information. Having many editorial timeline information formats
supported by OpenTimelineIO reduces …</p></div><p>This blog post takes a look back at the various Multimedia-related tasks the
<a class="reference external" href="https://www.igalia.com/technology/multimedia">Igalia Multimedia team</a> was involved in during the first half of 2019.</p>
<div class="section" id="gstreamer-editing-services">
<h2>GStreamer Editing Services</h2>
<p>Thibault added support for the <a class="reference external" href="https://github.com/PixarAnimationStudios/OpenTimelineIO">OpenTimelineIO</a> open format for editorial
timeline information. Having many editorial timeline information formats
supported by OpenTimelineIO reduces vendor lock-in in the tools used by video artists in
post-production studios. For instance a movie project edited in Final Cut Pro
can now be easily reimported into the <a class="reference external" href="https://pitivi.org">Pitivi</a> free and open-source video editor.
This accomplishment was made possible by implementing an OpenTimelineIO <span class="caps">GES</span>
<a class="reference external" href="https://github.com/PixarAnimationStudios/OpenTimelineIO/pull/412">adapter</a> and <a class="reference external" href="https://gitlab.freedesktop.org/gstreamer/gst-editing-services/merge_requests/67">formatter</a>, both upstreamed respectively in OpenTimelineIO and <span class="caps">GES</span> by Thibault.</p>
<img src="https://base-art.net/png/FcpAndPitivi.png"/><p>Another important feature for non-linear video editors is nested timeline
support. It allows teams to split big editing projects into smaller chunks that
can later on be assembled for the final product. Another use-case is
pre-filling the timeline with boilerplate scenes, so that an initial version of
the movie can be assembled before all teams involved in the project have
provided the final content. To support this, Thibault implemented a <a class="reference external" href="https://gitlab.freedesktop.org/gstreamer/gst-editing-services/blob/master/plugins/ges/gesdemux.c"><span class="caps">GES</span> demuxer</a> which transparently enables
playback support for <span class="caps">GES</span> files (through <a class="reference external" href="file://path/to/file.xges">file://path/to/file.xges</a> URIs) in any GStreamer-based
media player.</p>
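<p>In practice this means a project file can be played like any other media, for instance (the path is a placeholder):</p>
<pre class="literal-block">
$ gst-play-1.0 file:///path/to/project.xges
</pre>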
<p>As if this wasn’t impressive enough yet, Thibault greatly improved the <span class="caps">GES</span>
unit-tests, fixing a lot of memory leaks, race conditions and generally
improving the reliability of the test suite. This is very important because the
Gitlab continuous integration now executes the test harness for every submitted
merge request.</p>
<p>For more information about this, the curious readers can dive in <a class="reference external" href="https://blogs.gnome.org/tsaunier/2019/04/22/gstreamer-editing-services-opentimelineio-support/">Thibault’s blog post</a>.
Thibault was invited to talk about these on-going efforts at <a class="reference external" href="https://s2019.siggraph.org/"><span class="caps">SIGGRAPH</span></a> during
the <a class="reference external" href="https://s2019.siggraph.org/presentation/?id=bof_124&sess=sess314">OpenTimelineIO <span class="caps">BOF</span></a>.</p>
<p>Finally, Thibault is mentoring <a class="reference external" href="https://swaynethoughts.wordpress.com/">Swayamjeet Swain</a> as part of the <a class="reference external" href="https://summerofcode.withgoogle.com/">GSoC</a> program,
the project is about adding nested timeline support in <a class="reference external" href="https://pitivi.org">Pitivi</a>.</p>
</div>
<div class="section" id="gstreamer-va-api">
<h2>GStreamer <span class="caps">VA</span>-<span class="caps">API</span></h2>
<p>Víctor performed a good number of code reviews for GStreamer-<span class="caps">VAAPI</span> contributors,
he has also started investigating GStreamer-<span class="caps">VAAPI</span> bugs specific to the <span class="caps">AMDGPU</span>
Gallium driver, in order to improve the support of <span class="caps">AMD</span> hardware in multimedia applications.</p>
<p>As part of the on-going GStreamer community efforts to improve continuous
integration (<span class="caps">CI</span>) in the project, we purchased Intel and <span class="caps">AMD</span> powered devices
intended to run validation tests. Running <span class="caps">CI</span> on real end-user hardware will help
ensure regressions remain under control.</p>
</div>
<div class="section" id="servo-and-gstreamer-rs">
<h2>Servo and GStreamer-rs</h2>
<p>As part of our on-going involvement in the <a class="reference external" href="https://servo.org/">Servo</a> Mozilla project, <a class="reference external" href="https://blogs.igalia.com/vjaquez">Víctor</a> has
enabled zero-copy video rendering support in the GStreamer-based <a class="reference external" href="https://github.com/servo/media">Servo-media</a>
crate, along with <a class="reference external" href="https://github.com/rust-windowing/glutin/pull/1082">bug fixes in Glutin</a> and finally in <a class="reference external" href="https://github.com/servo/servo/pull/23483">Servo itself</a>.</p>
<p>A prior requirement for this major milestone was to enable gst-gl in the
GStreamer Rust bindings. After around 20 patches were merged in the
repository, we are pleased to announce that the Linux and Android platforms are
supported. Windows and macOS will also be supported, soon.</p>
<p>The following screencast shows how hardware-accelerated video rendering performs
on Víctor’s laptop:</p>
<video controls src="https://s3.amazonaws.com/media-p.slid.es/videos/105177/rzteE40V/hwacceleration.mp4"/><p>Víctor also delivered a talk at the <a class="reference external" href="https://www.meetup.com/MadRust/events/259059099/">MadRust</a> meetup, about GStreamer-rs, the
<a class="reference external" href="https://people.igalia.com/vjaquez/talks/madrust2019/">slides are available</a> on his website.</p>
</div>
<div class="section" id="webkit-s-media-source-extensions">
<h2>WebKit’s Media Source Extensions</h2>
<p>As part of our on-going collaboration with various device manufacturers, <a class="reference external" href="https://www.igalia.com/igalian/aboya">Alicia</a>
and <a class="reference external" href="https://www.igalia.com/igalian/eocanha">Enrique</a> validated the <a class="reference external" href="https://ytlr-cert.appspot.com/2019/main.html">Youtube <span class="caps">TV</span> <span class="caps">MSE</span> 2019 test suite</a> in WPEWebKit-based products.</p>
<p>Alicia developed a new GstValidate plugin to improve GStreamer pipelines
testing, called <a class="reference external" href="https://gstreamer.freedesktop.org/documentation/gst-devtools/plugins/validateflow.html?gi-language=c">validateflow</a>. Make sure to read her <a class="reference external" href="https://blogs.igalia.com/aboya/2019/05/14/validateflow-a-new-tool-to-test-gstreamer-pipelines/">blog post about it</a>!</p>
<p>In her quest to further improve <span class="caps">MSE</span> support, especially seek support, Alicia
rewrote the GStreamer source element we use in WebKit for <span class="caps">MSE</span> playback. The code
<a class="reference external" href="https://bugs.webkit.org/show_bug.cgi?id=199719">review is on-going in Bugzilla</a> and on track for inclusion in WPEWebKit and
WebKitGTK 2.26. This new design of the <span class="caps">MSE</span> source element requires the playbin3
GStreamer element and at least GStreamer 1.16. Another feature we plan to work
on in the near future is multi-track support; stay tuned!</p>
</div>
<div class="section" id="webkit-webrtc">
<h2>WebKit WebRTC</h2>
<p>We announced <a class="reference external" href="https://blogs.gnome.org/tsaunier/2018/07/31/webkitgtk-and-wpe-gains-webrtc-support-back/">LibWebRTC support for WPEWebKit and WebKitGTK</a> one year ago.
Since then, Thibault has been implementing new features and fixing bugs in the
backend. Bridging between the WebAudio and WebRTC backend was implemented,
allowing tight integration of WebAudio and WebRTC web apps. More WebKit WebRTC
layout tests were unskipped in the buildbots, allowing better tracking of regressions.</p>
<p>Thibault also fixed various performance issues on Raspberry Pi platforms during
<a class="reference external" href="https://appr.tc/">apprtc</a> video-calls. As part of this effort he upstreamed a
<a class="reference external" href="https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/commit/ee108d0ed20c376b5871b40108ba1cd54156baf9">GstUVCH264DeviceProvider</a> in GStreamer, allowing applications to use
already-encoded H264 streams from webcams that provide it, thus removing the
need for applications to encode raw video streams.</p>
<p>Additionally, Thibault upstreamed a <a class="reference external" href="https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/commit/ac5d0f7da688c1010ab2656f32c40e854e5b8bf2">device provider for <span class="caps">ALSA</span></a> in GStreamer,
allowing applications to probe for microphones and speakers supported through
the <span class="caps">ALSA</span> kernel driver.</p>
<p>Finally, Thibault tweaked the GStreamer encoders used by the WebKit LibWebRTC
backend, in order to match the behavior of the Apple implementation, also
fixing a few performance and rendering issues along the way.</p>
</div>
<div class="section" id="webkit-gstreamer-multimedia-maintenance">
<h2>WebKit GStreamer Multimedia maintenance</h2>
<p>The whole team is always keeping an eye on WebKit’s Bugzilla, watching out for
multimedia-related bugs reported by the community members. <a class="reference external" href="https://www.igalia.com/igalian/cturner">Charlie</a> recently
fixed a few annoying <a class="reference external" href="https://bugs.webkit.org/show_bug.cgi?id=197358">volume</a> <a class="reference external" href="https://bugs.webkit.org/show_bug.cgi?id=199505">bugs</a> along with an issue related with <a class="reference external" href="https://bugs.webkit.org/show_bug.cgi?id=197355">youtube</a>.</p>
<p>I rewrote the <a class="reference external" href="https://trac.webkit.org/changeset/243058/webkit">WebKitWebSrc</a> GStreamer source element we use in WebKit to
download <span class="caps">HTTP</span>(S) media resources and feed the data to the playback pipeline.
This new element is now based on the GStreamer pushsrc base class, instead of
appsrc. A few seek-related issues were fixed on the way but unfortunately some
regressions also slipped in; those should all be fixed by now and shipped in
WPEWebKit/WebKitGTK 2.26.</p>
<p>An initial version of the <a class="reference external" href="https://w3c.github.io/media-capabilities/">MediaCapabilities</a> <a class="reference external" href="https://bugs.webkit.org/show_bug.cgi?id=191191">backend</a> was also upstreamed in
WebKit. It allows web-apps (such as Youtube <span class="caps">TV</span>) to probe the user-agent for
media decoding and encoding capabilities. The GStreamer backend relies on the
GStreamer plugin registry to provide accurate information about the supported
codecs and containers. <a class="reference external" href="https://bugs.webkit.org/show_bug.cgi?id=198569">H264 <span class="caps">AVC1</span> profile and level</a> information are also probed.</p>
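<p>From the web-app side, a probe might look like this. This is an illustrative use of the spec’s JavaScript <span class="caps">API</span>, not WebKit-internal code:</p>
<pre class="literal-block">
// Ask the user-agent whether 1080p H.264 decoding is supported,
// smooth and power-efficient (per the Media Capabilities spec).
const info = await navigator.mediaCapabilities.decodingInfo({
  type: 'media-source',
  video: {
    contentType: 'video/mp4; codecs="avc1.640028"',
    width: 1920,
    height: 1080,
    bitrate: 3000000,
    framerate: 30,
  },
});
console.log(info.supported, info.smooth, info.powerEfficient);
</pre>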
</div>
<div class="section" id="wpeqt">
<h2>WPEQt</h2>
<p>Well, this isn’t directly related with multimedia, but I finally announced the
initial release of <a class="reference external" href="https://base-art.net/Articles/introducing-wpeqt-a-wpe-api-for-qt5/">WPEQt</a>. Any feedback is welcome. QtWebKit has been phased out and
if anyone out there relies on it for web-apps embedded in native <span class="caps">QML</span> apps, now
is the time to try WPEQt!</p>
</div>
Introducing WPEQt, a WPE API for Qt52019-04-08T12:20:00+02:002019-04-08T12:20:00+02:00Philippe Normandtag:base-art.net,2019-04-08:/Articles/introducing-wpeqt-a-wpe-api-for-qt5/<p>WPEQt provides a <span class="caps">QML</span> plugin implementing an <span class="caps">API</span> very similar to the <a class="reference external" href="https://doc.qt.io/qt-5/qml-qtwebview-webview.html">QWebView
<span class="caps">API</span></a>. This blog post explains the rationale behind this new project aimed for
QtWebKit users.</p>
<p>Qt5 already provides multiple WebView APIs, one based on QtWebKit (deprecated)
and one based on QWebEngine (aka Chromium). WPEQt aims to provide …</p><p>WPEQt provides a <span class="caps">QML</span> plugin implementing an <span class="caps">API</span> very similar to the <a class="reference external" href="https://doc.qt.io/qt-5/qml-qtwebview-webview.html">QWebView
<span class="caps">API</span></a>. This blog post explains the rationale behind this new project aimed for
QtWebKit users.</p>
<p>Qt5 already provides multiple WebView APIs, one based on QtWebKit (deprecated)
and one based on QWebEngine (aka Chromium). WPEQt aims to provide a viable
alternative to the former. QtWebKit is being retired and has by now fallen far
behind upstream WebKit in terms of features and security fixes. WPEQt can also
be considered as an alternative to QWebEngine but bear in mind the underlying
Chromium web-engine doesn’t support the same <span class="caps">HTML5</span> features as WebKit.</p>
<p>WPEQt is included in WPEWebKit, starting from the 2.24 series. Bugs should be
reported in WebKit’s Bugzilla. WPEQt’s code is published under the same licenses
as WPEWebKit, the <span class="caps">LGPL2</span> and <span class="caps">BSD</span>.</p>
<p>At <a class="reference external" href="https://igalia.com">Igalia</a> we have compared <a class="reference external" href="https://people.igalia.com/pnormand/wpe-qt-webkit-comparison/">WPEQt and QtWebKit</a> using the <a class="reference external" href="https://browserbench.org">BrowserBench tests</a>.
The JetStream1.1 results show that WPEQt completes all the tests twice
as fast as QtWebKit. The Speedometer benchmark doesn’t even finish due to a
crash in the QtWebKit <span class="caps">DFG</span> <span class="caps">JIT</span>. Although the memory consumption looks similar in
both engines, the upstream WPEQt engine is well maintained and includes security
bug-fixes. Another advantage of WPEQt compared to QtWebKit is that its
multimedia support is much stronger, with specs such as <span class="caps">MSE</span>, <span class="caps">EME</span> and
media-capabilities being covered. WebRTC support is coming along as well!</p>
<p>So to everybody still stuck with QtWebKit in their apps and not yet ready (or
reluctant) to migrate to QtWebEngine, please have a look at WPEQt! The
remainder of this post explains how to build and test it.</p>
<div class="section" id="building-wpeqt">
<h2>Building WPEQt</h2>
<p>For the time being, WPEQt only targets Linux platforms using graphics drivers
compatible with wayland-egl. Therefore, the end-user Qt application has to
use the wayland-egl Qt <span class="caps">QPA</span> plugin. Under certain circumstances the <span class="caps">EGLFS</span> <span class="caps">QPA</span>
might also work, <span class="caps">YMMV</span>.</p>
<div class="section" id="using-a-svn-git-webkit-snapshot">
<h3>Using a <span class="caps">SVN</span>/git WebKit snapshot</h3>
<p>If you have a <a class="reference external" href="https://trac.webkit.org/wiki/UsingGitWithWebKit#Checkout"><span class="caps">SVN</span>/git development checkout of upstream WebKit</a>, then you can build
WPEQt with the following commands on a Linux desktop platform:</p>
<pre class="literal-block">
$ Tools/wpe/install-dependencies
$ Tools/Scripts/webkit-flatpak --wpe --wpe-extension=qt update
$ Tools/Scripts/build-webkit --wpe --cmakeargs="-DENABLE_WPE_QT=ON"
</pre>
<p>The first command will install the main <span class="caps">WPE</span> host build dependencies. The second
command will setup the remaining build dependencies (including Qt5) using
Flatpak. The third command will build WPEWebKit along with WPEQt.</p>
</div>
<div class="section" id="using-the-wpewebkit-2-24-source-tarball">
<h3>Using the WPEWebKit 2.24 source tarball</h3>
<p>This procedure is already documented in the <a class="reference external" href="https://trac.webkit.org/wiki/WPE#BuildingWPEWebKitfromareleasetarball"><span class="caps">WPE</span> Wiki</a> page. The only
change required is the new CMake option for WPEQt, which needs to be
explicitly enabled as follows:</p>
<pre class="literal-block">
$ cmake -DCMAKE_BUILD_TYPE=RelWithDebInfo -DENABLE_WPE_QT=ON -GNinja
</pre>
<p>Then, invoke <cite>ninja</cite>, as documented in the Wiki.</p>
</div>
<div class="section" id="using-yocto">
<h3>Using Yocto</h3>
<p>At <a class="reference external" href="https://igalia.com">Igalia</a> we’re maintaining a <a class="reference external" href="https://github.com/igalia/meta-webkit/">Yocto overlay for <span class="caps">WPE</span></a> (and WebKitGTK). It was
tested for the rocko, sumo and thud Yocto releases. The target platform we
tested so far is the Zodiac <span class="caps">RDU2</span> board, which is based on the Freescale i.<span class="caps">MX6</span>
QuadPlus SoC. The backend we used is WPEBackend-fdo which fits very naturally in
the Mesa open-source graphics environment, inside Weston 5. The underlying
graphics driver is etnaviv. In addition to this platform, WPEQt should also run
on Raspberry Pi (with the WPEBackend-rdk or -fdo). Please let us know how it goes!</p>
<p>To enable WPEQt in meta-webkit, the <cite>qtwpe</cite> option needs to be enabled in the
wpewebkit recipe:</p>
<pre class="literal-block">
PACKAGECONFIG_append_pn-wpewebkit = " qtwpe"
</pre>
<p>The resulting <span class="caps">OS</span> image can also include WPEQt’s sample browser application:</p>
<pre class="literal-block">
IMAGE_INSTALL_append = " wpewebkit-qtwpe-qml-plugin qt-wpe-simple-browser"
</pre>
<p>Then, on device, the sample application can be executed either in Weston:</p>
<pre class="literal-block">
$ qt-wpe-simple-browser -platform wayland-egl https://wpewebkit.org
</pre>
<p>Or with the <span class="caps">EGLFS</span> <span class="caps">QPA</span>:</p>
<pre class="literal-block">
$ # stop weston
$ qt-wpe-simple-browser -platform eglfs https://wpewebkit.org
</pre>
</div>
</div>
<div class="section" id="using-wpeqt-in-your-application">
<h2>Using WPEQt in your application</h2>
<p>A sample MiniBrowser application is included in WebKit, in the
<a class="reference external" href="https://trac.webkit.org/browser/webkit/trunk/Tools/MiniBrowser/wpe/qt">Tools/MiniBrowser/wpe/qt</a> directory. If you have a desktop build of WPEQt you
can launch it with the following command:</p>
<pre class="literal-block">
$ Tools/Scripts/run-qt-wpe-minibrowser -platform wayland <url>
</pre>
<p>Here’s the <span class="caps">QML</span> code used for the WPEQt MiniBrowser. As you can see it’s fairly straightforward!</p>
<pre class="literal-block">
import QtQuick 2.11
import QtQuick.Window 2.11
import org.wpewebkit.qtwpe 1.0

Window {
    id: main_window
    visible: true
    width: 1280
    height: 720
    title: qsTr("Hello WPE!")

    WPEView {
        url: initialUrl
        focus: true
        anchors.fill: parent
        onTitleChanged: {
            main_window.title = title;
        }
    }
}
</pre>
<p>As explained in this blog post, WPEQt is a simple alternative to QtWebKit.
Migrating existing applications should be straightforward because the <span class="caps">API</span>
provided by WPEQt is very similar to the QWebView <span class="caps">API</span>. We look forward to
hearing your feedback or inquiries on the <a class="reference external" href="https://lists.webkit.org/mailman/listinfo/webkit-wpe">webkit-wpe</a> mailing list and you are
welcome to file bugs in <a class="reference external" href="https://bugs.webkit.org/">Bugzilla</a>.</p>
<p>I wouldn’t close this post without acknowledging the support of my company
<a class="reference external" href="https://igalia.com">Igalia</a> and Zodiac, many thanks to them!</p>
</div>
Web overlay in GStreamer with WPEWebKit2018-12-08T15:09:36+01:002018-12-08T15:09:36+01:00Philippe Normandtag:base-art.net,2018-12-08:/Articles/web-overlay-in-gstreamer-with-wpewebkit/<p>After a year or two of hiatus I attended the <a class="reference external" href="https://gstreamer.freedesktop.org/conference/2018/">GStreamer conference</a> which
happened in beautiful Edinburgh. It was great to meet the friends from the
community again and learn about what’s going on in the multimedia world. The
quality of the talks was great, the videos are published …</p><p>After a year or two of hiatus I attended the <a class="reference external" href="https://gstreamer.freedesktop.org/conference/2018/">GStreamer conference</a> which
happened in beautiful Edinburgh. It was great to meet the friends from the
community again and learn about what’s going on in the multimedia world. The
quality of the talks was great, the videos are published online as usual in
<a class="reference external" href="https://gstconf.ubicast.tv/channels/#gstreamer-conference-2018">Ubicast</a>. I delivered a talk about the Multimedia support in <a class="reference external" href="https://wpewebkit.org">WPEWebKit</a>, you can
<a class="reference external" href="https://gstconf.ubicast.tv/videos/multimedia-support-in-webkitgtk-and-wpe-current-status-and-plans/">watch it there</a> and <a class="reference external" href="https://gstreamer.freedesktop.org/data/events/gstreamer-conference/2018/Philippe%20Normand%20-%20Multimedia%20support%20in%20WebKitGTK%20and%20WPE,%20current%20status%20and%20plans.pdf">the slides are also available</a>.</p>
<p>One of the many interesting presentations was about <a class="reference external" href="https://gstconf.ubicast.tv/videos/gstreamer-for-cloud-based-live-video-handling/">GStreamer for cloud-based
live video</a>. Usually anything with the word cloud would tend to draw my
attention away but for some reason I attended this presentation, and didn’t
regret it! The last demo presented by the <span class="caps">BBC</span> folks was about overlaying Web
content on native video streams. It’s an interesting use-case for live <span class="caps">TV</span>
broadcasting for instance. A web page provides dynamic notifications popping up
and down, the web page is rendered with a transparent background and blended
over the live video stream. The <span class="caps">BBC</span> folks implemented a <a class="reference external" href="https://github.com/bbc/brave/tree/master/gst-WebRenderSrc">GStreamer source
element relying on <span class="caps">CEF</span></a> for their Brave project.</p>
<p>So here you wonder, why am I talking about Chromium Embedded Framework (<span class="caps">CEF</span>)?
Isn’t this post about <a class="reference external" href="https://wpewebkit.org">WPEWebKit</a>? After seeing the demo from the Brave
developers I immediately thought <span class="caps">WPE</span> could be a great fit for this <span class="caps">HTML</span> overlay
use-case too! So a few weeks after the conference I finally had the time to
start working on the <a class="reference external" href="https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/tree/master/ext/wpe"><span class="caps">WPE</span> GStreamer plugin</a>. My colleague <a class="reference external" href="https://twitter.com/falconsigh">Žan Doberšek</a>,
<span class="caps">WPE</span>’s founding hacker, provided a nice solution for the initial rendering issues
of the prototype, many thanks to him!</p>
<p>Here’s a first example, a basic web-browser with gst-play:</p>
<pre class="literal-block">
$ gst-play-1.0 --videosink gtkglsink wpe://https://gnome.org
</pre>
<p>A <span class="caps">GTK</span> window opens up and the <span class="caps">GNOME</span> homepage should load. You can click on links
too! To overlay a web page on top of a video you can use a pipeline like this one:</p>
<pre class="literal-block">
$ gst-launch-1.0 glvideomixer name=m sink_1::zorder=0 sink_0::height=818 sink_0::width=1920 ! gtkglsink \
wpesrc location="file:///home/phil/Downloads/plunk/index.html" draw-background=0 ! m. \
uridecodebin uri="http://192.168.1.44/Sintel.2010.1080p.mkv" name=d d. ! queue ! glupload \
! glcolorconvert ! m.
</pre>
<p>which can be represented with this simplified graph:</p>
<img src="https://base-art.net/svg/wpesrc.svg"/><p>The advantage of this approach is that many heavy-lifting tasks happen in the
<span class="caps">GPU</span>. <span class="caps">WPE</span> loads the page using its WPENetworkProcess external process, parses
everything (<span class="caps">DOM</span>, <span class="caps">CSS</span>, <span class="caps">JS</span>, …) and renders it as a EGLImage, shared with the
UIProcess (the GStreamer application, gst-launch in this case). In most
situations <cite>decodebin</cite> will use a hardware decoder. The decoded video frames are
uploaded to the <span class="caps">GPU</span> and composited with the EGLImages representing the web-page,
in a single OpenGL scene, using the <cite>glvideomixer</cite> element.</p>
<p>The initial version of the GstWPE plugin is now part of the gst-plugins-bad
staging area, where most new plugins are uploaded for further improvements later
on. Speaking of improvements, the following tasks have been identified:</p>
<ul class="simple">
<li>The wpesrc <cite>draw-background</cite> property is not yet operational due to missing
WPEWebKit <span class="caps">API</span> for background-color configuration support. I expect to complete
this task very soon, interested people can follow <a class="reference external" href="https://bugs.webkit.org/show_bug.cgi?id=192305">this bugzilla ticket</a></li>
<li>Audio support, WPEWebKit currently provides only EGLImages to application
side. The audio session is rendered directly to GStreamer’s <cite>autoaudiosink</cite> in
WebKit, so there’s currently no audio sharing support in wpesrc.</li>
<li>DMABuf support as an alternative to EGLImages. WPEWebKit internally leverages
linux-dmabuf support already but doesn’t expose the file descriptors and plane information.</li>
<li>Better navigation events support. GStreamer’s navigation events <span class="caps">API</span> was
initially designed mostly for <span class="caps">DVD</span> menu navigation use-cases; the exposed
input event information is not a perfect match for WPEWebKit, which expects
hardware-level information from keyboard, mouse and touch devices.</li>
</ul>
<p>There are more ways to use <span class="caps">WPE</span>, and I expect to unveil another
<span class="caps">WPE</span> embedding project very soon. Watch this space! As usual, many thanks to my
<a class="reference external" href="https://igalia.com">Igalia</a> colleagues for sponsoring this work. We are always happy to hear what
others are doing with <span class="caps">WPE</span> and to help improve it, so don’t hesitate to get in touch!</p>
Mirabeau on Maemo2010-02-11T16:20:39+01:002010-02-11T16:20:39+01:00Philippe Normandtag:base-art.net,2010-02-11:/Articles/115/<p>At <span class="caps">FOSDEM</span> Frank and I showed the work we did on <a class="reference external" href="http://coherence.beebits.net/wiki/Mirabeau">Mirabeau</a>, a
screencast was made a few days before <span class="caps">FOSDEM</span> and we showed it, but I
wanted to add some comments and re-arrange some parts of it, so
here it is now, re-arranged with PiTiVi :)</p>
<video controls>
<source src="http://base-art.net/static/mirabeau2.ogg" type='video/ogg; codecs="theora, vorbis"'>
Please install an <span class="caps">HTML5</span>-compliant …</source></video><p>At <span class="caps">FOSDEM</span> Frank and I showed the work we did on <a class="reference external" href="http://coherence.beebits.net/wiki/Mirabeau">Mirabeau</a>, a
screencast was made a few days before <span class="caps">FOSDEM</span> and we showed it, but I
wanted to add some comments and re-arrange some parts of it, so
here it is now, re-arranged with PiTiVi :)</p>
<video controls>
<source src="http://base-art.net/static/mirabeau2.ogg" type='video/ogg; codecs="theora, vorbis"'>
Please install an <span class="caps">HTML5</span>-compliant browser. Meanwhile you can download the video from http://base-art.net/static/mirabeau2.ogg
</video><p>(<a class="reference external" href="http://base-art.net/static/mirabeau2.ogg">Video here</a> in ogg/theora/vorbis if your browser fails to play it).</p>
<p>This is still a work in progress: we have some issues with
<a class="reference external" href="http://telepathy.freedesktop.org">Telepathy</a> <span class="caps">MUC</span> Tubes, and I promised Sjoerd of Telepathy fame to
file some new bugs in Bugzilla. The <span class="caps">UI</span> itself also still needs
work, especially the MediaRenderer <span class="caps">UI</span>. I will also at some point
add a chatroom window.</p>
<p>Oh, and we also won an N900 at the <span class="caps">XMPP</span> developer contest thanks to
this application! Thanks a lot to the <a class="reference external" href="http://www.xmpp.org/xsf/"><span class="caps">XSF</span></a> and Nokia :)</p>
Coherence team at GCDS2009-07-09T12:58:30+02:002009-07-12T20:27:47+02:00Philippe Normandtag:base-art.net,2009-07-09:/Articles/109/<p>I gave a <a class="reference external" href="http://www.grancanariadesktopsummit.org/node/203">talk</a> about <a class="reference external" href="http://coherence-project.org">Coherence</a> and <a class="reference external" href="http://telepathy.freedesktop.org">Telepathy</a> at Gran Canaria
Desktop Summit with <a class="reference external" href="http://netzflocken.de">Frank</a> on Tuesday. It went well; there were
about 50 people. I think most people got the concept we
tried to explain: UPnP devices and services are not bound to
the local network anymore, thanks …</p><p>I gave a <a class="reference external" href="http://www.grancanariadesktopsummit.org/node/203">talk</a> about <a class="reference external" href="http://coherence-project.org">Coherence</a> and <a class="reference external" href="http://telepathy.freedesktop.org">Telepathy</a> at Gran Canaria
Desktop Summit with <a class="reference external" href="http://netzflocken.de">Frank</a> on Tuesday. It went well; there were
about 50 people. I think most people got the concept we
tried to explain: UPnP devices and services are not bound to
the local network anymore, thanks to the combination of Coherence and
<span class="caps">XMPP</span> through Telepathy. The slides should be online soon; I will post
an update to this post when I know the <span class="caps">URL</span> ;)</p>
<p>Part of the Coherence gang, including Benjamin, Frank and me, is
here in Las Palmas, so feel free to grab us if you have any
questions. It was also nice to meet Karl Vollmer, the <a class="reference external" href="http://ampache.org">Ampache</a>
lead developer :)</p>
<p>So we are quite excited about Mirabeau; work on it will continue
during the summer and a first release should happen in
September. I’m writing a little howto explaining how to easily
try the current (work-in-progress) version of Mirabeau; it will
land in the wiki in the coming days ;)</p>
<p><strong>update</strong>: Slides are <a class="reference external" href="http://coherence-project.org/download/mirabeau-guadec-2009.odp">there</a></p>