<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Media orchestration</title>
</head>
<body>
<header>
<h1>Media orchestration</h1>
<p>In media content, time is of the essence, and when playing media content simultaneously either within a single device or across devices, ensuring that audio, video and other time-based content are properly synchronized is critical to deliver a good user experience.</p>
</header>
<main>
<section class="featureset well-deployed">
<h2>Well-deployed technologies</h2>
      <p data-feature="Timeline management">The basic metronome for reacting to media playback over time is provided by the <a data-featureid="timeupdate"><code>timeupdate</code> event</a> of HTML’s <code>&lt;audio&gt;</code> and <code>&lt;video&gt;</code> elements.</p>
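      <p>As a rough sketch of how the event can drive time-based UI, the snippet below keeps a chapter indicator in step with playback. The chapter list and the idea of a chapter indicator are illustrative, not part of any specification:</p>

```javascript
// Hypothetical chapter markers for a piece of media (times in seconds).
const chapters = [
  { title: 'Intro', start: 0 },
  { title: 'Main part', start: 30 },
  { title: 'Credits', start: 120 }
];

// Pure helper: find the chapter active at time t, or null before the first one.
function activeChapter(list, t) {
  let current = null;
  for (const c of list) {
    if (c.start <= t) current = c;
  }
  return current;
}

// In a browser, wire the helper to a <video> element's timeupdate event.
if (typeof document !== 'undefined') {
  const video = document.querySelector('video');
  video.addEventListener('timeupdate', () => {
    const chapter = activeChapter(chapters, video.currentTime);
    if (chapter) console.log('Now playing:', chapter.title);
  });
}
```

      <p>Note that <code>timeupdate</code> only fires a few times per second, so this approach suits coarse-grained updates such as chapter titles rather than frame-accurate overlays.</p>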
<p data-feature="Audio synchronization">The <a data-featureid="webaudio">Web Audio API</a> defines a full-fledged audio processing API that exposes the precise time at which audio will be played, exposing the latency introduced by the output pipeline in particular. This allows for very tight synchronization between different audio processing events happening in the local audio context.</p>
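      <p>A minimal sketch of latency compensation with the Web Audio API: to be heard at a given physical moment, a sound must be started that much earlier than the output latency the context reports. The oscillator setup below is illustrative:</p>

```javascript
// Pure helper: given the context's currentTime, its output latency, and the
// desired physical playout time (all in seconds on the context's clock),
// compute the value to pass to start(). If the compensated time is already
// in the past, start as soon as possible.
function scheduleTime(currentTime, outputLatency, desiredPlayoutTime) {
  return Math.max(currentTime, desiredPlayoutTime - outputLatency);
}

// In a browser, schedule an oscillator so that it is heard ~1s from now.
if (typeof AudioContext !== 'undefined') {
  const ctx = new AudioContext();
  const osc = ctx.createOscillator();
  osc.connect(ctx.destination);
  const when = scheduleTime(ctx.currentTime, ctx.outputLatency || 0,
                            ctx.currentTime + 1);
  osc.start(when);
  osc.stop(when + 0.5);
}
```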
<div data-feature="Close Captioning">
            <p>To associate an external text track (e.g. closed captions) with a video, the <a data-featureid="webvtt">WebVTT</a> format can be plugged into a <code>&lt;video&gt;</code> element.</p>
<p>While WebVTT is the main format used to render captions in browsers, <a data-featureid="ttml">TTML 2</a> provides a richer language for describing timed text that can be used as an interchange format among authoring systems.</p>
</div>
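          <p>A minimal illustration of the markup involved, with hypothetical file names: the <code>&lt;track&gt;</code> element points the video at a WebVTT file.</p>

```html
<video controls src="lecture.mp4">
  <track kind="captions" src="lecture.en.vtt"
         srclang="en" label="English" default>
</video>
```

          <p>where <code>lecture.en.vtt</code> might contain:</p>

```
WEBVTT

00:00:01.000 --> 00:00:04.000
Hello, and welcome.

00:00:05.000 --> 00:00:09.000
Cues stay synchronized with the video timeline.
```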
          <p data-feature="Timestamp accuracy in captured content">The <a data-featureid="getusermedia">Media Capture and Streams API</a>, which allows Web applications to control microphones and cameras, exposes the targeted <a href="https://www.w3.org/TR/mediacapture-streams/#def-constraint-latency">latency</a> of the configuration, which can be used to relate timestamps in captured content to events happening in the real world.</p>
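          <p>A sketch of how the latency figure could be used: a sample stamped at capture time <code>t</code> entered the microphone roughly <code>latency</code> seconds earlier. The <code>latency</code> constraint name matches the specification; the variable names and the 40&nbsp;ms target below are illustrative:</p>

```javascript
// Pure helper: estimate when a captured sample actually occurred in the
// real world, given its capture timestamp and the reported latency (both
// in seconds).
function eventTime(captureTime, latency) {
  return captureTime - latency;
}

// In a browser, request a microphone with a target latency and read back
// the value the browser actually applied.
if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
  navigator.mediaDevices.getUserMedia({ audio: { latency: 0.04 } })
    .then((stream) => {
      const track = stream.getAudioTracks()[0];
      const applied = track.getSettings().latency;
      console.log('capture latency (s):', applied);
    });
}
```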
</section>
<section class="featureset in-progress">
<h2>Specifications in progress</h2>
<p data-feature="Timeline management">Beyond text tracks, Web pages may contain many other time-based animations with which synchronization can be useful; the <a data-featureid="web-animations">Web Animations API</a> offers the tools needed to set up these synchronization points. However, it does not allow synchronizing animations to playback of audio or video media.</p>
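          <p>Pending such a mechanism, a common workaround is to pause an animation and re-seek it from the media element's clock on every <code>timeupdate</code>. The element ids, keyframes, and the 2-second start offset below are illustrative:</p>

```javascript
// Pure helper: map a media time (seconds) to an animation currentTime
// (milliseconds), given the media time at which the animation should begin.
function animationTime(mediaTime, animationStart) {
  return Math.max(0, (mediaTime - animationStart) * 1000);
}

if (typeof document !== 'undefined') {
  const video = document.querySelector('video');
  const box = document.querySelector('#box');
  const anim = box.animate(
    [{ transform: 'translateX(0)' }, { transform: 'translateX(300px)' }],
    { duration: 5000, fill: 'both' }
  );
  anim.pause();
  // Re-seek the animation on every timeupdate so it follows the video,
  // starting when the video reaches t = 2s.
  video.addEventListener('timeupdate', () => {
    anim.currentTime = animationTime(video.currentTime, 2);
  });
}
```

          <p>Because <code>timeupdate</code> is coarse-grained, the animation only stays loosely aligned; this is precisely the gap that a built-in media-synchronization mechanism would close.</p>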
</section>
<section class="featureset exploratory-work">
<h2>Exploratory work</h2>
<p data-feature="Close Captioning">When media resources enclose their own text tracks, having this in-band information exposed to the Web application enables creating richer interactions; the <a data-featureid="inband">Sourcing In-band Media Resource Tracks from Media Containers into HTML</a> document offers guidance as to how that in-band information should be exposed in browsers.</p>
          <p data-feature="Synchronization to a media stream">The <a data-featureid="me-media-timed-events">Media Timed Events</a> document collects use cases and requirements for improved support for timed events related to audio or video media on the Web (such as subtitles, captions, or other web content that needs to stay synchronized with a playing audio or video stream), and makes recommendations for new or changed Web APIs to realize these requirements.</p>
          <p data-feature="Multi-device synchronization">A number of scenarios require synchronizing several tracks in the same page, for instance to keep the sign-language transcript of an audio track in step with its associated video. The <code>MediaController</code> interface, initially defined in HTML5, was dropped from the HTML specification due to very limited implementation support. The <a data-featureid="timing">Timing Object</a> specification proposes a mechanism to bring shared online clocks to browsers and ease the synchronization of heterogeneous content within a single device and across devices. The proposed timing model is generic: it could be applied to any timed media, from audio and video playback to the presentation or animation of web content in general.</p>
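          <p>The core of that timing model can be sketched in a few lines: a position evolves linearly from a (position, velocity, timestamp) vector read against a shared clock. The toy class below only mirrors the model; a real timing object would keep its vector in sync with an online timing provider:</p>

```javascript
// Toy model of a timing object's motion vector.
class TimingVector {
  constructor(position, velocity, timestamp) {
    this.position = position;   // media position at `timestamp` (seconds)
    this.velocity = velocity;   // playback rate (1 = normal speed)
    this.timestamp = timestamp; // shared-clock reading when the vector was set
  }
  // Query the current position against a shared clock reading `now`.
  query(now) {
    return this.position + this.velocity * (now - this.timestamp);
  }
}

// Two devices holding the same vector compute the same position from the
// same shared clock, which is what keeps their playback aligned.
const vector = new TimingVector(10, 1, 100);
```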
</section>
<section>
<h2>Discontinued features</h2>
<dl>
<dt>The <code>MediaController</code> interface</dt>
          <dd>The <code>MediaController</code> interface was introduced in HTML5 to ease synchronized playback of different media elements within a single page. The interface was dropped from the HTML5.1 specification due to very limited implementation support and concerns about the performance and technical viability of such solutions on constrained devices.</dd>
</dl>
</section>
<section>
<h2>Features not covered by ongoing work</h2>
<dl>
<dt>Media-synchronised Web animations</dt>
          <dd><a data-featureid="web-animations">Web Animations</a> allows Web applications to synchronize CSS Transitions, CSS Animations, and SVG animations with a global clock. There is no built-in mechanism to synchronize animations with the playback of audio or video media, which could enable tighter synchronization in libraries such as Mozilla's <a href="https://github.com/menismu/popcorn-js">Popcorn.js</a>. Possible solutions include an extension to Web Animations to associate animations with a media element, or integration with the more generic Timing Object solution.</dd>
</dl>
</section>
</main>
<script src="../js/generate.js"></script>
</body>
</html>