HTTP Live Streaming
HTTP Live Streaming (HLS) is an Apple standard for adaptive bitrate (ABR) media streaming that uses fMP4 or MPEG-TS containers and H.264 or H.265 video codecs.
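For illustration, here is a minimal sketch (Python; the segment names, init segment URI, and durations are hypothetical) of the kind of media playlist an HLS packager produces: a plain-text .m3u8 index that points the player at an fMP4 initialization segment and a sequence of media fragments.

```python
# A minimal sketch of an HLS media playlist for fMP4 segments.
# Segment names, the init segment URI, and durations are hypothetical.
segments = ["seg1.m4s", "seg2.m4s", "seg3.m4s"]
target_duration = 6  # nominal segment duration in seconds (assumed)

lines = [
    "#EXTM3U",
    "#EXT-X-VERSION:7",                       # fMP4 segments require protocol version 7+
    f"#EXT-X-TARGETDURATION:{target_duration}",
    '#EXT-X-MAP:URI="init.mp4"',              # initialization segment (codec/config data)
]
for seg in segments:
    lines.append(f"#EXTINF:{target_duration}.0,")  # duration of the next segment
    lines.append(seg)
lines.append("#EXT-X-ENDLIST")                # marks a complete (VOD) playlist

print("\n".join(lines))
```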
High Efficiency Video Coding (HEVC/H.265) is a video compression standard, published in 2013, with the goal of succeeding AVC/H.264 by reducing bitrate by up to 50% while maintaining the same perceptual video quality (PVQ).
A Universally Unique Identifier (UUID) is a 128-bit identifier, usually represented as a string of 32 hexadecimal digits. UUIDs are extensively used to uniquely identify entities in computing systems, including media streaming workflows.
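A quick illustration using Python's standard uuid module (the values printed will differ on every run):

```python
import uuid

u = uuid.uuid4()   # random (version 4) UUID
print(u)           # canonical hyphenated form, e.g. xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
print(u.hex)       # the same 128 bits as 32 hexadecimal digits without hyphens
```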
Fragmented MP4 (fMP4) is an MP4 container with its media logically partitioned into moof-mdat pairs (fragments). An fMP4 presentation can be a single .mp4 file containing all fragments, streamed using byte-range requests into that file, or many .m4s files (one per fragment), each streamed with a regular request.
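As a rough illustration, the sketch below (Python; it assumes a local file named fragmented.mp4) walks the top-level ISO BMFF boxes of an fMP4 file and counts the moof boxes, i.e. the fragments:

```python
import struct

def top_level_boxes(path):
    """Yield the four-character type of each top-level box in an MP4 file."""
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            size, box_type = struct.unpack(">I4s", header)
            yield box_type.decode("ascii", "replace")
            if size == 1:                       # 64-bit "largesize" follows the type
                size = struct.unpack(">Q", f.read(8))[0]
                f.seek(size - 16, 1)
            elif size == 0:                     # box extends to the end of the file
                break
            else:
                f.seek(size - 8, 1)

# Each moof box (paired with the mdat that follows it) is one fragment.
boxes = list(top_level_boxes("fragmented.mp4"))
print(boxes)
print(f"{boxes.count('moof')} fragments found")
```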
Encoded Bitrate is the bitrate of the compressed (i.e. ABR-encoded or transcoded) audio or video media objects; each bitrate level represents a different quality level in the ABR ladder. The concept of Encoded Bitrate is inherent in various ABR-related definitions, e.g. the CMCD br key, the BANDWIDTH attribute of an HLS Variant Stream, and the DASH Representation @bandwidth attribute.
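To make the mapping concrete, here is a hedged sketch (Python; the bitrate ladder, resolutions, and URIs are hypothetical) showing how each encoded bitrate surfaces as the BANDWIDTH attribute of an HLS Variant Stream; DASH carries the same information in the Representation @bandwidth attribute.

```python
# Hypothetical ABR ladder: (peak bits per second, resolution, media playlist URI).
ladder = [
    (5_000_000, "1920x1080", "video_1080p.m3u8"),
    (3_000_000, "1280x720",  "video_720p.m3u8"),
    (1_200_000, "640x360",   "video_360p.m3u8"),
]

lines = ["#EXTM3U"]
for bandwidth, resolution, uri in ladder:
    # BANDWIDTH advertises the encoded bitrate of this variant to the player.
    lines.append(f"#EXT-X-STREAM-INF:BANDWIDTH={bandwidth},RESOLUTION={resolution}")
    lines.append(uri)

print("\n".join(lines))
```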
Dynamic Adaptive Streaming over HTTP (DASH) is an MPEG international standard that enables non-proprietary ABR media streaming using fMP4 containers and any codec format.
A codec is hardware or software that encodes or decodes a data stream (e.g. audio, video, closed captions).
Common Media Server Data (CMSD) is a CTA standard (CTA-5006), published in 2022, defining how origins and CDNs send status information to downstream workflow nodes (e.g. CDNs, players) with every object response.
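A hedged sketch of the idea (Python; the header name CMSD-Static comes from CTA-5006, but the specific keys and values shown here are illustrative only):

```python
# Response headers an origin or CDN might attach to a media object response.
# ot = object type (v = video), d = object duration in milliseconds (illustrative).
headers = {
    "CMSD-Static": "ot=v,d=6000",
    "Content-Type": "video/mp4",
}

# Downstream nodes (CDNs, players) can read these server-side status values
# from every object response they receive.
print(headers)
```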
Common Media Client Data (CMCD) is a CTA standard (CTA-5004), published in 2020, defining how player clients send playout and error status information to upstream workflow nodes (e.g. CDNs) with every object request.
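For example, a hedged sketch (Python) of the query-argument transmission mode: the player serializes its CMCD keys (e.g. br, the encoded bitrate of the requested variant in kbps, and bl, the current buffer length in ms) and appends them as a single URL-encoded CMCD query parameter on each object request. The URL and values below are illustrative.

```python
from urllib.parse import urlencode

# Illustrative CMCD payload for one segment request.
cmcd = {
    "br": 3000,    # encoded bitrate of the requested variant, in kbps
    "bl": 12000,   # current playback buffer length, in ms
    "sid": "6e2fb550-c457-11e9-bb97-0800200c9a66",  # playback session id (a UUID)
}

# Keys are serialized as comma-separated key=value pairs, strings quoted,
# and sent in a single URL-encoded "CMCD" query argument.
payload = ",".join(
    f'{k}="{v}"' if isinstance(v, str) else f"{k}={v}"
    for k, v in sorted(cmcd.items())
)
segment_url = "https://cdn.example.com/video_720p/seg42.m4s"  # hypothetical
print(f"{segment_url}?{urlencode({'CMCD': payload})}")
```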
Common Media Application Format (CMAF) is an MPEG standard (ISO/IEC 23000-19), initiated by Apple and Microsoft, aiming to simplify workflows by defining concurrent use of multiple ABR methods (e.g. DASH, HLS) over one set of audio and video media files.