fragmented MP4

A streaming media format based on Part 12 of the MPEG-4 standard (the ISO base media file format). Unlike the older MPEG-2 Transport Stream (M2TS) format used in Apple's HTTP Live Streaming platform, fragmented MP4 (fMP4) does not multiplex the audio and video together. The streams can be stored separately, and the total number of files needed to support multiple bit rates, resolutions and languages is dramatically lower than with M2TS. Microsoft's Smooth Streaming was the first streaming platform to adopt fMP4, and Adobe's HTTP Dynamic Streaming followed suit. See MPEG-2 TS, Smooth Streaming and HTTP Dynamic Streaming.
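
Because fMP4 keeps the ISO-BMFF box layout, a file's fragmentation can be seen by walking its top-level boxes: a fragmented file carries "moof" (movie fragment) boxes in addition to the "moov" header. The following minimal Python sketch illustrates that layout; the input file name is a placeholder and the code is an illustration, not a full ISO-BMFF parser.

    # Minimal sketch: walk the top-level boxes of an MP4/ISO-BMFF file and
    # report whether it is fragmented (i.e., contains 'moof' movie-fragment
    # boxes alongside 'moov'). The file path is illustrative.
    import struct

    def top_level_boxes(path):
        """Yield (box_type, size) for each top-level ISO-BMFF box."""
        with open(path, "rb") as f:
            while True:
                header = f.read(8)
                if len(header) < 8:
                    break
                size, box_type = struct.unpack(">I4s", header)
                if size == 1:               # 64-bit "largesize" follows the type
                    size = struct.unpack(">Q", f.read(8))[0]
                    payload = size - 16
                elif size == 0:             # box extends to end of file
                    yield box_type.decode("latin-1"), size
                    break
                else:
                    payload = size - 8
                yield box_type.decode("latin-1"), size
                f.seek(payload, 1)          # skip the box body

    if __name__ == "__main__":
        boxes = list(top_level_boxes("example.mp4"))   # hypothetical input file
        print([t for t, _ in boxes])
        print("fragmented MP4" if any(t == "moof" for t, _ in boxes)
              else "progressive (non-fragmented) MP4")
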
References in periodicals archive
By way of background, HLS originally used only MPEG-2 transport stream segments, while Smooth Streaming and DASH used fragmented MP4 files.
This ushered in a new era in which storytellers have to rely more on large text overlays than voiceovers; Twitter ponied up to exclusively stream Thursday night NFL games; virtual reality, augmented reality, 4K, and 360-degree video became household names; Apple announced it will add fragmented MP4 support to HLS (HTTP Live Streaming); more consumers migrated to unlimited plans on 4G and LTE networks for seamless mobile video streaming; and branded video content became more widespread.
Fragmented MP4 Support: Enables segmented streaming of an MP4 file as well as the multiplexing of multiple streams with different bitrates into a single MP4 file, essential for streaming in conditions with fluctuating bandwidth.
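
Such fragments are typically produced at encode time. Below is a minimal sketch that calls ffmpeg's mov muxer fragmentation flags from Python; ffmpeg being on the PATH, the file names, and the bitrate are assumptions for the example, and it produces a single fragmented rendition rather than the multi-bitrate multiplexing described above.

    # Minimal sketch: produce one fragmented MP4 rendition with ffmpeg.
    # frag_keyframe starts a new fragment at each keyframe; empty_moov writes
    # an initial moov without sample tables so the fragments carry the media.
    import subprocess

    subprocess.run(
        [
            "ffmpeg", "-i", "input.mp4",             # hypothetical source file
            "-c:v", "libx264", "-b:v", "1500k",      # one example bitrate rung
            "-c:a", "aac",
            "-movflags", "frag_keyframe+empty_moov+default_base_moof",
            "fragmented_1500k.mp4",                  # hypothetical output name
        ],
        check=True,
    )
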
This transcoding media service is one of the first to offer high quality, Microsoft Smooth Streaming-compliant, fragmented MP4 file sets for Video-on-Demand (VOD) services on Windows Azure Media Services.
Advances made in packaging of segments were first addressed in late 2011, with Adobe and Microsoft making the joint case for the use of fragmented MP4 files that would allow delivery of multiple permutations of video streams.
The HTTP Dynamic Streaming workflow includes content preparation tools, fragmented MP4 files that are HTTP cache friendly and options for protected streaming powered by Adobe Flash Access™ 2.
The DASH implementation of content protection began with the Common Encryption Scheme (CENC), which allowed five specific digital rights management (DRM) solutions to be interchangeably used for DASH content that was delivered as fragmented MP4 files.
Previously, HLS only supported files packaged in the MPEG-2 Transport Stream container, while the Dynamic Adaptive Streaming over HTTP standard (DASH), and proprietary technologies like Microsoft's Smooth Streaming and Adobe's HTTP Dynamic Streaming (HDS), supported files packaged in the fragmented MP4 container (fMP4).
The ISO Base Media File Format (ISO-BMFF) is the MP4 file container, and this is key to being able to stream fragmented MP4 (fMP4) using byte-range addressing without the need to create hundreds, thousands, or even hundreds of thousands of standalone segments or chunks before streaming commences.
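
The following minimal sketch illustrates that byte-range addressing: the client requests only the byte range covering one fragment of the single fMP4 file instead of fetching a pre-generated segment file. The URL and offsets are hypothetical; real players obtain them from a manifest or the file's segment index ("sidx") box.

    # Minimal sketch of byte-range addressing: fetch one fragment of an fMP4
    # directly from the full file with an HTTP Range request, instead of
    # pre-cutting the file into standalone segment files.
    import urllib.request

    url = "https://example.com/media/video_fmp4.mp4"    # hypothetical asset
    first_byte, last_byte = 1_048_576, 2_097_151        # one fragment's byte range

    req = urllib.request.Request(
        url, headers={"Range": f"bytes={first_byte}-{last_byte}"}
    )
    with urllib.request.urlopen(req) as resp:
        fragment = resp.read()                # the moof+mdat bytes in that range
        print(resp.status, len(fragment))     # expect 206 Partial Content
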
Whether it's for Apple's HTTP Live Streaming (HLS), which segments the master MP4 files into MPEG-2 Transport Stream (M2TS), or the use of the industry-standard Dynamic Adaptive Streaming over HTTP (DASH), which creates fragmented MP4 files, the need for ABR content delivery is clear.
Provisions have been made within the DASH specification to accommodate AVC coding via M2TS, as a way to entice Apple to throw in with the MPEG standards committee and toss aside HLS, but the majority of effort to date has focused on the use of fragmented MP4 files delivered without the need for M2TS as the transport stream.
Due to DASH's ability to deliver any of several types of files--from the fragmented MP4 version of ISO Base Media File Format (ISOBMFF) to the Apple-modified MPEG-2 Transport Stream (M2TS)--the DASH specification reads like an encyclopedia of encoding, encryption, and delivery technologies.