  1. @chapter Muxers
  2. @c man begin MUXERS
  3. Muxers are configured elements in FFmpeg which allow writing
  4. multimedia streams to a particular type of file.
  5. When you configure your FFmpeg build, all the supported muxers
  6. are enabled by default. You can list all available muxers using the
  7. configure option @code{--list-muxers}.
  8. You can disable all the muxers with the configure option
  9. @code{--disable-muxers} and selectively enable / disable single muxers
  10. with the options @code{--enable-muxer=@var{MUXER}} /
  11. @code{--disable-muxer=@var{MUXER}}.
  12. The option @code{-muxers} of the ff* tools will display the list of
  13. enabled muxers. Use @code{-formats} to view a combined list of
  14. enabled demuxers and muxers.
  15. A description of some of the currently available muxers follows.
  16. @anchor{aiff}
  17. @section aiff
  18. Audio Interchange File Format muxer.
  19. @subsection Options
  20. It accepts the following options:
  21. @table @option
  22. @item write_id3v2
  23. Enable ID3v2 tags writing when set to 1. Default is 0 (disabled).
  24. @item id3v2_version
Select the ID3v2 version to write. Currently only versions 3 and 4 (a.k.a.
ID3v2.3 and ID3v2.4) are supported. The default is version 4.
  27. @end table
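For example, a minimal sketch of writing an AIFF file with ID3v2.3 tags might look like this (the metadata value is a placeholder):
@example
ffmpeg -i INPUT -metadata title="Example title" -write_id3v2 1 -id3v2_version 3 out.aif
@end example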
  28. @anchor{asf}
  29. @section asf
  30. Advanced Systems Format muxer.
  31. Note that Windows Media Audio (wma) and Windows Media Video (wmv) use this
  32. muxer too.
  33. @subsection Options
  34. It accepts the following options:
  35. @table @option
  36. @item packet_size
  37. Set the muxer packet size. By tuning this setting you may reduce data
  38. fragmentation or muxer overhead depending on your source. Default value is
  39. 3200, minimum is 100, maximum is 64k.
  40. @end table
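As a hedged sketch, a larger packet size could be tried to reduce muxer overhead for a high-bitrate source (the value below is illustrative, not a recommendation):
@example
ffmpeg -i INPUT -c:v wmv2 -c:a wmav2 -packet_size 6400 out.wmv
@end example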
  41. @anchor{avi}
  42. @section avi
  43. Audio Video Interleaved muxer.
  44. @subsection Options
  45. It accepts the following options:
  46. @table @option
  47. @item reserve_index_space
  48. Reserve the specified amount of bytes for the OpenDML master index of each
  49. stream within the file header. By default additional master indexes are
  50. embedded within the data packets if there is no space left in the first master
  51. index and are linked together as a chain of indexes. This index structure can
  52. cause problems for some use cases, e.g. third-party software strictly relying
  53. on the OpenDML index specification or when file seeking is slow. Reserving
  54. enough index space in the file header avoids these problems.
  55. The required index space depends on the output file size and should be about 16
  56. bytes per gigabyte. When this option is omitted or set to zero the necessary
  57. index space is guessed.
  58. @item write_channel_mask
  59. Write the channel layout mask into the audio stream header.
  60. This option is enabled by default. Disabling the channel mask can be useful in
  61. specific scenarios, e.g. when merging multiple audio streams into one for
  62. compatibility with software that only supports a single audio stream in AVI
  63. (see @ref{amerge,,the "amerge" section in the ffmpeg-filters manual,ffmpeg-filters}).
  64. @item flipped_raw_rgb
If set to true, store a positive height for raw RGB bitmaps, which indicates
that the bitmap is stored bottom-up. Note that this option does not flip the
bitmap, which has to be done manually beforehand, e.g. by using the vflip filter.
Default is @var{false}, which indicates the bitmap is stored top-down.
  69. @end table
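For instance, following the estimate above of about 16 bytes per gigabyte, a file expected to be roughly 64 GB could reserve about 1024 bytes of index space (the size estimate is an assumption for illustration):
@example
ffmpeg -i INPUT -c copy -reserve_index_space 1024 out.avi
@end example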
  70. @anchor{chromaprint}
  71. @section chromaprint
  72. Chromaprint fingerprinter.
This muxer feeds audio data to the Chromaprint library, which generates
a fingerprint for the provided audio data. See @url{https://acoustid.org/chromaprint}.
It takes a single signed native-endian 16-bit raw audio stream of at most 2 channels.
  76. @subsection Options
  77. @table @option
  78. @item silence_threshold
  79. Threshold for detecting silence. Range is from -1 to 32767, where -1 disables
  80. silence detection. Silence detection can only be used with version 3 of the
  81. algorithm.
  82. Silence detection must be disabled for use with the AcoustID service. Default is -1.
  83. @item algorithm
  84. Version of algorithm to fingerprint with. Range is 0 to 4.
  85. Version 3 enables silence detection. Default is 1.
  86. @item fp_format
  87. Format to output the fingerprint as. Accepts the following options:
  88. @table @samp
  89. @item raw
  90. Binary raw fingerprint
  91. @item compressed
  92. Binary compressed fingerprint
  93. @item base64
  94. Base64 compressed fingerprint @emph{(default)}
  95. @end table
  96. @end table
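A possible invocation, assuming FFmpeg was built with @code{--enable-chromaprint}, downmixes the input and prints a base64 fingerprint to stdout (the resampling parameters are illustrative):
@example
ffmpeg -i INPUT -ac 2 -ar 44100 -f chromaprint -fp_format base64 -
@end example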
  97. @anchor{crc}
  98. @section crc
  99. CRC (Cyclic Redundancy Check) testing format.
  100. This muxer computes and prints the Adler-32 CRC of all the input audio
  101. and video frames. By default audio frames are converted to signed
  102. 16-bit raw audio and video frames to raw video before computing the
  103. CRC.
  104. The output of the muxer consists of a single line of the form:
  105. CRC=0x@var{CRC}, where @var{CRC} is a hexadecimal number 0-padded to
  106. 8 digits containing the CRC for all the decoded input frames.
  107. See also the @ref{framecrc} muxer.
  108. @subsection Examples
  109. For example to compute the CRC of the input, and store it in the file
  110. @file{out.crc}:
  111. @example
  112. ffmpeg -i INPUT -f crc out.crc
  113. @end example
  114. You can print the CRC to stdout with the command:
  115. @example
  116. ffmpeg -i INPUT -f crc -
  117. @end example
  118. You can select the output format of each frame with @command{ffmpeg} by
  119. specifying the audio and video codec and format. For example to
  120. compute the CRC of the input audio converted to PCM unsigned 8-bit
  121. and the input video converted to MPEG-2 video, use the command:
  122. @example
  123. ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f crc -
  124. @end example
  125. @section flv
  126. Adobe Flash Video Format muxer.
  127. This muxer accepts the following options:
  128. @table @option
  129. @item flvflags @var{flags}
  130. Possible values:
  131. @table @samp
  132. @item aac_seq_header_detect
  133. Place AAC sequence header based on audio stream data.
  134. @item no_sequence_end
  135. Disable sequence end tag.
  136. @item no_metadata
  137. Disable metadata tag.
  138. @item no_duration_filesize
Disable the duration and filesize metadata values when they are equal to zero
at the end of the stream. (Useful for non-seekable live streams.)
  141. @item add_keyframe_index
Used to facilitate seeking, particularly for HTTP pseudo-streaming.
  143. @end table
  144. @end table
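For example, when publishing a live stream to a non-seekable output, the duration and filesize metadata can be omitted (the RTMP URL and codec choices are placeholders):
@example
ffmpeg -re -i INPUT -c:v libx264 -c:a aac -flvflags no_duration_filesize -f flv rtmp://example.com/live/stream
@end example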
  145. @anchor{dash}
  146. @section dash
  147. Dynamic Adaptive Streaming over HTTP (DASH) muxer that creates segments
  148. and manifest files according to the MPEG-DASH standard ISO/IEC 23009-1:2014.
  149. For more information see:
  150. @itemize @bullet
  151. @item
  152. ISO DASH Specification: @url{http://standards.iso.org/ittf/PubliclyAvailableStandards/c065274_ISO_IEC_23009-1_2014.zip}
  153. @item
  154. WebM DASH Specification: @url{https://sites.google.com/a/webmproject.org/wiki/adaptive-streaming/webm-dash-specification}
  155. @end itemize
It creates an MPD manifest file and segment files for each stream.
  157. The segment filename might contain pre-defined identifiers used with SegmentTemplate
  158. as defined in section 5.3.9.4.4 of the standard. Available identifiers are "$RepresentationID$",
  159. "$Number$", "$Bandwidth$" and "$Time$".
  160. In addition to the standard identifiers, an ffmpeg-specific "$ext$" identifier is also supported.
When specified, ffmpeg will replace $ext$ in the file name with the muxing format's extension, such as mp4, webm, etc.
  162. @example
  163. ffmpeg -re -i <input> -map 0 -map 0 -c:a libfdk_aac -c:v libx264 \
  164. -b:v:0 800k -b:v:1 300k -s:v:1 320x170 -profile:v:1 baseline \
  165. -profile:v:0 main -bf 1 -keyint_min 120 -g 120 -sc_threshold 0 \
  166. -b_strategy 0 -ar:a:1 22050 -use_timeline 1 -use_template 1 \
  167. -window_size 5 -adaptation_sets "id=0,streams=v id=1,streams=a" \
  168. -f dash /path/to/out.mpd
  169. @end example
  170. @table @option
  171. @item min_seg_duration @var{microseconds}
This is a deprecated option to set the segment length in microseconds; use @var{seg_duration} instead.
  173. @item seg_duration @var{duration}
  174. Set the segment length in seconds (fractional value can be set). The value is
  175. treated as average segment duration when @var{use_template} is enabled and
  176. @var{use_timeline} is disabled and as minimum segment duration for all the other
  177. use cases.
  178. @item frag_duration @var{duration}
  179. Set the length in seconds of fragments within segments (fractional value can be set).
  180. @item frag_type @var{type}
  181. Set the type of interval for fragmentation.
  182. @item window_size @var{size}
  183. Set the maximum number of segments kept in the manifest.
  184. @item extra_window_size @var{size}
  185. Set the maximum number of segments kept outside of the manifest before removing from disk.
  186. @item remove_at_exit @var{remove}
  187. Enable (1) or disable (0) removal of all segments when finished.
  188. @item use_template @var{template}
  189. Enable (1) or disable (0) use of SegmentTemplate instead of SegmentList.
  190. @item use_timeline @var{timeline}
  191. Enable (1) or disable (0) use of SegmentTimeline in SegmentTemplate.
  192. @item single_file @var{single_file}
  193. Enable (1) or disable (0) storing all segments in one file, accessed using byte ranges.
  194. @item single_file_name @var{file_name}
  195. DASH-templated name to be used for baseURL. Implies @var{single_file} set to "1". In the template, "$ext$" is replaced with the file name extension specific for the segment format.
  196. @item init_seg_name @var{init_name}
DASH-templated name to be used for the initialization segment. Default is "init-stream$RepresentationID$.$ext$". "$ext$" is replaced with the file name extension specific for the segment format.
@item media_seg_name @var{segment_name}
DASH-templated name to be used for the media segments. Default is "chunk-stream$RepresentationID$-$Number%05d$.$ext$". "$ext$" is replaced with the file name extension specific for the segment format.
  200. @item utc_timing_url @var{utc_url}
  201. URL of the page that will return the UTC timestamp in ISO format. Example: "https://time.akamai.com/?iso"
  202. @item method @var{method}
  203. Use the given HTTP method to create output files. Generally set to PUT or POST.
  204. @item http_user_agent @var{user_agent}
  205. Override User-Agent field in HTTP header. Applicable only for HTTP output.
  206. @item http_persistent @var{http_persistent}
  207. Use persistent HTTP connections. Applicable only for HTTP output.
  208. @item hls_playlist @var{hls_playlist}
  209. Generate HLS playlist files as well. The master playlist is generated with the filename @var{hls_master_name}.
  210. One media playlist file is generated for each stream with filenames media_0.m3u8, media_1.m3u8, etc.
  211. @item hls_master_name @var{file_name}
  212. HLS master playlist name. Default is "master.m3u8".
  213. @item streaming @var{streaming}
  214. Enable (1) or disable (0) chunk streaming mode of output. In chunk streaming
  215. mode, each frame will be a moof fragment which forms a chunk.
  216. @item adaptation_sets @var{adaptation_sets}
Assign streams to AdaptationSets. Syntax is "id=x,streams=a,b,c id=y,streams=d,e", with x and y being the IDs
of the adaptation sets and a, b, c, d and e being the indices of the mapped streams.
To map all video (or audio) streams to an AdaptationSet, "v" (or "a") can be used as the stream identifier instead of IDs.
When no assignment is defined, this defaults to an AdaptationSet for each stream.
Optional syntax is "id=x,seg_duration=x,frag_duration=x,frag_type=type,descriptor=descriptor_string,streams=a,b,c id=y,seg_duration=y,frag_type=type,streams=d,e" and so on.
The descriptor is useful for the scheme defined by ISO/IEC 23009-1:2014/Amd.2:2015.
For example, -adaptation_sets "id=0,descriptor=<SupplementalProperty schemeIdUri=\"urn:mpeg:dash:srd:2014\" value=\"0,0,0,1,1,2,2\"/>,streams=v".
Please note that the descriptor string should be a self-closing XML tag.
seg_duration, frag_duration and frag_type override the global option values for each adaptation set.
For example, -adaptation_sets "id=0,seg_duration=2,frag_duration=1,frag_type=duration,streams=v id=1,seg_duration=2,frag_type=none,streams=a".
trick_id marks an adaptation set as containing streams meant to be used for Trick Mode for the referenced adaptation set.
For example, -adaptation_sets "id=0,seg_duration=2,frag_type=none,streams=0 id=1,seg_duration=10,frag_type=none,trick_id=0,streams=1".
  229. @item timeout @var{timeout}
  230. Set timeout for socket I/O operations. Applicable only for HTTP output.
  231. @item index_correction @var{index_correction}
Enable (1) or disable (0) the segment index correction logic. Applicable only when
@var{use_template} is enabled and @var{use_timeline} is disabled.
When enabled, the logic monitors the flow of segment indexes. If a stream's
segment index value is not at the expected real-time position, then the logic
corrects that index value.
Typically this logic is needed in live streaming use cases. Network bandwidth
fluctuations are common during long streaming runs, and each fluctuation can cause
the segment indexes to fall behind the expected real-time position.
  240. @item format_options @var{options_list}
  241. Set container format (mp4/webm) options using a @code{:} separated list of
  242. key=value parameters. Values containing @code{:} special characters must be
  243. escaped.
  244. @item global_sidx @var{global_sidx}
  245. Write global SIDX atom. Applicable only for single file, mp4 output, non-streaming mode.
  246. @item dash_segment_type @var{dash_segment_type}
  247. Possible values:
  248. @table @option
  249. @item auto
If this flag is set, the DASH segment file format will be selected based on the stream codec. This is the default mode.
@item mp4
If this flag is set, the DASH segment files will be in ISOBMFF format.
@item webm
If this flag is set, the DASH segment files will be in WebM format.
  255. @end table
  256. @item ignore_io_errors @var{ignore_io_errors}
  257. Ignore IO errors during open and write. Useful for long-duration runs with network output.
  258. @item lhls @var{lhls}
Enable Low-latency HLS (LHLS). Adds the #EXT-X-PREFETCH tag with the current segment's URI.
Apple does not have an official spec for LHLS; meanwhile, the hls.js player developers are
trying to standardize an open LHLS spec. The draft spec is available at https://github.com/video-dev/hlsjs-rfcs/blob/lhls-spec/proposals/0001-lhls.md
This option will try to comply with the above open spec until Apple's spec officially supports it.
  263. Applicable only when @var{streaming} and @var{hls_playlist} options are enabled.
  264. This is an experimental feature.
  265. @item ldash @var{ldash}
  266. Enable Low-latency Dash by constraining the presence and values of some elements.
  267. @item master_m3u8_publish_rate @var{master_m3u8_publish_rate}
Republish the master playlist after every specified number of segment intervals.
  269. @item write_prft @var{write_prft}
  270. Write Producer Reference Time elements on supported streams. This also enables writing
  271. prft boxes in the underlying muxer. Applicable only when the @var{utc_url} option is enabled.
  272. It's set to auto by default, in which case the muxer will attempt to enable it only in modes
  273. that require it.
  274. @item mpd_profile @var{mpd_profile}
  275. Set one or more manifest profiles.
  276. @item http_opts @var{http_opts}
  277. A :-separated list of key=value options to pass to the underlying HTTP
  278. protocol. Applicable only for HTTP output.
  279. @item target_latency @var{target_latency}
  280. Set an intended target latency in seconds (fractional value can be set) for serving. Applicable only when @var{streaming} and @var{write_prft} options are enabled.
This is an informative field that clients can use to measure the latency of the service.
  282. @item min_playback_rate @var{min_playback_rate}
  283. Set the minimum playback rate indicated as appropriate for the purposes of automatically
  284. adjusting playback latency and buffer occupancy during normal playback by clients.
  285. @item max_playback_rate @var{max_playback_rate}
  286. Set the maximum playback rate indicated as appropriate for the purposes of automatically
  287. adjusting playback latency and buffer occupancy during normal playback by clients.
  288. @item update_period @var{update_period}
Set the MPD update period, in seconds, for dynamic content.
  291. @end table
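As a further hedged sketch, several of the options above can be combined for a low-latency DASH output (codec choices and durations are assumptions, not recommendations):
@example
ffmpeg -re -i INPUT -map 0 -c:v libx264 -c:a aac \
-streaming 1 -ldash 1 -use_template 1 -use_timeline 0 \
-seg_duration 4 -frag_duration 1 -frag_type duration \
-window_size 10 -f dash /path/to/out.mpd
@end example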
  292. @anchor{framecrc}
  293. @section framecrc
  294. Per-packet CRC (Cyclic Redundancy Check) testing format.
  295. This muxer computes and prints the Adler-32 CRC for each audio
  296. and video packet. By default audio frames are converted to signed
  297. 16-bit raw audio and video frames to raw video before computing the
  298. CRC.
  299. The output of the muxer consists of a line for each audio and video
  300. packet of the form:
  301. @example
  302. @var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, 0x@var{CRC}
  303. @end example
  304. @var{CRC} is a hexadecimal number 0-padded to 8 digits containing the
  305. CRC of the packet.
  306. @subsection Examples
  307. For example to compute the CRC of the audio and video frames in
  308. @file{INPUT}, converted to raw audio and video packets, and store it
  309. in the file @file{out.crc}:
  310. @example
  311. ffmpeg -i INPUT -f framecrc out.crc
  312. @end example
  313. To print the information to stdout, use the command:
  314. @example
  315. ffmpeg -i INPUT -f framecrc -
  316. @end example
  317. With @command{ffmpeg}, you can select the output format to which the
  318. audio and video frames are encoded before computing the CRC for each
  319. packet by specifying the audio and video codec. For example, to
  320. compute the CRC of each decoded input audio frame converted to PCM
  321. unsigned 8-bit and of each decoded input video frame converted to
  322. MPEG-2 video, use the command:
  323. @example
  324. ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f framecrc -
  325. @end example
  326. See also the @ref{crc} muxer.
  327. @anchor{framehash}
  328. @section framehash
  329. Per-packet hash testing format.
  330. This muxer computes and prints a cryptographic hash for each audio
  331. and video packet. This can be used for packet-by-packet equality
  332. checks without having to individually do a binary comparison on each.
  333. By default audio frames are converted to signed 16-bit raw audio and
  334. video frames to raw video before computing the hash, but the output
  335. of explicit conversions to other codecs can also be used. It uses the
  336. SHA-256 cryptographic hash function by default, but supports several
  337. other algorithms.
  338. The output of the muxer consists of a line for each audio and video
  339. packet of the form:
  340. @example
  341. @var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, @var{hash}
  342. @end example
  343. @var{hash} is a hexadecimal number representing the computed hash
  344. for the packet.
  345. @table @option
  346. @item hash @var{algorithm}
  347. Use the cryptographic hash function specified by the string @var{algorithm}.
  348. Supported values include @code{MD5}, @code{murmur3}, @code{RIPEMD128},
  349. @code{RIPEMD160}, @code{RIPEMD256}, @code{RIPEMD320}, @code{SHA160},
  350. @code{SHA224}, @code{SHA256} (default), @code{SHA512/224}, @code{SHA512/256},
  351. @code{SHA384}, @code{SHA512}, @code{CRC32} and @code{adler32}.
  352. @end table
  353. @subsection Examples
  354. To compute the SHA-256 hash of the audio and video frames in @file{INPUT},
  355. converted to raw audio and video packets, and store it in the file
  356. @file{out.sha256}:
  357. @example
  358. ffmpeg -i INPUT -f framehash out.sha256
  359. @end example
  360. To print the information to stdout, using the MD5 hash function, use
  361. the command:
  362. @example
  363. ffmpeg -i INPUT -f framehash -hash md5 -
  364. @end example
  365. See also the @ref{hash} muxer.
  366. @anchor{framemd5}
  367. @section framemd5
  368. Per-packet MD5 testing format.
  369. This is a variant of the @ref{framehash} muxer. Unlike that muxer,
  370. it defaults to using the MD5 hash function.
  371. @subsection Examples
  372. To compute the MD5 hash of the audio and video frames in @file{INPUT},
  373. converted to raw audio and video packets, and store it in the file
  374. @file{out.md5}:
  375. @example
  376. ffmpeg -i INPUT -f framemd5 out.md5
  377. @end example
  378. To print the information to stdout, use the command:
  379. @example
  380. ffmpeg -i INPUT -f framemd5 -
  381. @end example
  382. See also the @ref{framehash} and @ref{md5} muxers.
  383. @anchor{gif}
  384. @section gif
  385. Animated GIF muxer.
  386. It accepts the following options:
  387. @table @option
  388. @item loop
  389. Set the number of times to loop the output. Use @code{-1} for no loop, @code{0}
  390. for looping indefinitely (default).
  391. @item final_delay
  392. Force the delay (expressed in centiseconds) after the last frame. Each frame
  393. ends with a delay until the next frame. The default is @code{-1}, which is a
  394. special value to tell the muxer to re-use the previous delay. In case of a
  395. loop, you might want to customize this value to mark a pause for instance.
  396. @end table
For example, to encode a GIF looping 10 times, with a 5-second delay between
the loops:
  399. @example
  400. ffmpeg -i INPUT -loop 10 -final_delay 500 out.gif
  401. @end example
  402. Note 1: if you wish to extract the frames into separate GIF files, you need to
  403. force the @ref{image2} muxer:
  404. @example
  405. ffmpeg -i INPUT -c:v gif -f image2 "out%d.gif"
  406. @end example
  407. Note 2: the GIF format has a very large time base: the delay between two frames
can therefore not be smaller than one centisecond.
  409. @anchor{hash}
  410. @section hash
  411. Hash testing format.
  412. This muxer computes and prints a cryptographic hash of all the input
  413. audio and video frames. This can be used for equality checks without
  414. having to do a complete binary comparison.
  415. By default audio frames are converted to signed 16-bit raw audio and
  416. video frames to raw video before computing the hash, but the output
  417. of explicit conversions to other codecs can also be used. Timestamps
  418. are ignored. It uses the SHA-256 cryptographic hash function by default,
  419. but supports several other algorithms.
  420. The output of the muxer consists of a single line of the form:
  421. @var{algo}=@var{hash}, where @var{algo} is a short string representing
  422. the hash function used, and @var{hash} is a hexadecimal number
  423. representing the computed hash.
  424. @table @option
  425. @item hash @var{algorithm}
  426. Use the cryptographic hash function specified by the string @var{algorithm}.
  427. Supported values include @code{MD5}, @code{murmur3}, @code{RIPEMD128},
  428. @code{RIPEMD160}, @code{RIPEMD256}, @code{RIPEMD320}, @code{SHA160},
  429. @code{SHA224}, @code{SHA256} (default), @code{SHA512/224}, @code{SHA512/256},
  430. @code{SHA384}, @code{SHA512}, @code{CRC32} and @code{adler32}.
  431. @end table
  432. @subsection Examples
  433. To compute the SHA-256 hash of the input converted to raw audio and
  434. video, and store it in the file @file{out.sha256}:
  435. @example
  436. ffmpeg -i INPUT -f hash out.sha256
  437. @end example
  438. To print an MD5 hash to stdout use the command:
  439. @example
  440. ffmpeg -i INPUT -f hash -hash md5 -
  441. @end example
  442. See also the @ref{framehash} muxer.
  443. @anchor{hls}
  444. @section hls
  445. Apple HTTP Live Streaming muxer that segments MPEG-TS according to
  446. the HTTP Live Streaming (HLS) specification.
  447. It creates a playlist file, and one or more segment files. The output filename
  448. specifies the playlist filename.
  449. By default, the muxer creates a file for each segment produced. These files
  450. have the same name as the playlist, followed by a sequential number and a
  451. .ts extension.
  452. Make sure to require a closed GOP when encoding and to set the GOP
  453. size to fit your segment time constraint.
  454. For example, to convert an input file with @command{ffmpeg}:
  455. @example
  456. ffmpeg -i in.mkv -c:v h264 -flags +cgop -g 30 -hls_time 1 out.m3u8
  457. @end example
  458. This example will produce the playlist, @file{out.m3u8}, and segment files:
  459. @file{out0.ts}, @file{out1.ts}, @file{out2.ts}, etc.
  460. See also the @ref{segment} muxer, which provides a more generic and
  461. flexible implementation of a segmenter, and can be used to perform HLS
  462. segmentation.
  463. @subsection Options
  464. This muxer supports the following options:
  465. @table @option
  466. @item hls_init_time @var{duration}
  467. Set the initial target segment length. Default value is @var{0}.
  468. @var{duration} must be a time duration specification,
  469. see @ref{time duration syntax,,the Time duration section in the ffmpeg-utils(1) manual,ffmpeg-utils}.
The segment will be cut on the next key frame after this time has passed on the first m3u8 list.
After the initial playlist is filled, @command{ffmpeg} will cut segments
at a duration equal to @code{hls_time}.
  473. @item hls_time @var{duration}
  474. Set the target segment length. Default value is 2.
  475. @var{duration} must be a time duration specification,
  476. see @ref{time duration syntax,,the Time duration section in the ffmpeg-utils(1) manual,ffmpeg-utils}.
The segment will be cut on the next key frame after this time has passed.
  478. @item hls_list_size @var{size}
  479. Set the maximum number of playlist entries. If set to 0 the list file
  480. will contain all the segments. Default value is 5.
  481. @item hls_delete_threshold @var{size}
  482. Set the number of unreferenced segments to keep on disk before @code{hls_flags delete_segments}
deletes them. Increase this to allow clients to continue downloading segments which
were recently referenced in the playlist. Default value is 1, meaning segments older than
  485. @code{hls_list_size+1} will be deleted.
  486. @item hls_ts_options @var{options_list}
  487. Set output format options using a :-separated list of key=value
  488. parameters. Values containing @code{:} special characters must be
  489. escaped.
  490. @item hls_wrap @var{wrap}
This is a deprecated option; you can use @code{hls_list_size}
and @code{hls_flags delete_segments} instead.
This option is useful to avoid filling the disk with many segment
files, as it limits the maximum number of segment files written to disk
to @var{wrap}.
  496. @item hls_start_number_source
  497. Start the playlist sequence number (@code{#EXT-X-MEDIA-SEQUENCE}) according to the specified source.
Unless @code{hls_flags single_file} is set, it also specifies the source of the starting sequence numbers of
the segment and subtitle filenames. In any case, if @code{hls_flags append_list}
is set and the sequence number read from the playlist is greater than the specified start sequence number,
then that value will be used as the start value.
  502. It accepts the following values:
  503. @table @option
  504. @item generic (default)
  505. Set the starting sequence numbers according to @var{start_number} option value.
  506. @item epoch
  507. The start number will be the seconds since epoch (1970-01-01 00:00:00)
  508. @item epoch_us
  509. The start number will be the microseconds since epoch (1970-01-01 00:00:00)
  510. @item datetime
  511. The start number will be based on the current date/time as YYYYmmddHHMMSS. e.g. 20161231235759.
  512. @end table
  513. @item start_number @var{number}
  514. Start the playlist sequence number (@code{#EXT-X-MEDIA-SEQUENCE}) from the specified @var{number}
  515. when @var{hls_start_number_source} value is @var{generic}. (This is the default case.)
  516. Unless @code{hls_flags single_file} is set, it also specifies starting sequence numbers of segment and subtitle filenames.
  517. Default value is 0.
  518. @item hls_allow_cache @var{allowcache}
  519. Explicitly set whether the client MAY (1) or MUST NOT (0) cache media segments.
  520. @item hls_base_url @var{baseurl}
  521. Append @var{baseurl} to every entry in the playlist.
  522. Useful to generate playlists with absolute paths.
  523. Note that the playlist sequence number must be unique for each segment
  524. and it is not to be confused with the segment filename sequence number
  525. which can be cyclic, for example if the @option{wrap} option is
  526. specified.
  527. @item hls_segment_filename @var{filename}
  528. Set the segment filename. Unless @code{hls_flags single_file} is set,
  529. @var{filename} is used as a string format with the segment number:
  530. @example
  531. ffmpeg -i in.nut -hls_segment_filename 'file%03d.ts' out.m3u8
  532. @end example
  533. This example will produce the playlist, @file{out.m3u8}, and segment files:
  534. @file{file000.ts}, @file{file001.ts}, @file{file002.ts}, etc.
@var{filename} may contain a full or relative path specification,
  536. but only the file name part without any path info will be contained in the m3u8 segment list.
  537. Should a relative path be specified, the path of the created segment
  538. files will be relative to the current working directory.
  539. When strftime_mkdir is set, the whole expanded value of @var{filename} will be written into the m3u8 segment list.
  540. When @code{var_stream_map} is set with two or more variant streams, the
  541. @var{filename} pattern must contain the string "%v", this string specifies
  542. the position of variant stream index in the generated segment file names.
  543. @example
  544. ffmpeg -i in.ts -b:v:0 1000k -b:v:1 256k -b:a:0 64k -b:a:1 32k \
  545. -map 0:v -map 0:a -map 0:v -map 0:a -f hls -var_stream_map "v:0,a:0 v:1,a:1" \
  546. -hls_segment_filename 'file_%v_%03d.ts' out_%v.m3u8
  547. @end example
This example will produce the playlists and segment file sets:
  549. @file{file_0_000.ts}, @file{file_0_001.ts}, @file{file_0_002.ts}, etc. and
  550. @file{file_1_000.ts}, @file{file_1_001.ts}, @file{file_1_002.ts}, etc.
  551. The string "%v" may be present in the filename or in the last directory name
  552. containing the file, but only in one of them. (Additionally, %v may appear multiple times in the last
  553. sub-directory or filename.) If the string %v is present in the directory name, then
  554. sub-directories are created after expanding the directory name pattern. This
  555. enables creation of segments corresponding to different variant streams in
  556. subdirectories.
  557. @example
  558. ffmpeg -i in.ts -b:v:0 1000k -b:v:1 256k -b:a:0 64k -b:a:1 32k \
  559. -map 0:v -map 0:a -map 0:v -map 0:a -f hls -var_stream_map "v:0,a:0 v:1,a:1" \
  560. -hls_segment_filename 'vs%v/file_%03d.ts' vs%v/out.m3u8
  561. @end example
This example will produce the playlists and segment file sets:
  563. @file{vs0/file_000.ts}, @file{vs0/file_001.ts}, @file{vs0/file_002.ts}, etc. and
  564. @file{vs1/file_000.ts}, @file{vs1/file_001.ts}, @file{vs1/file_002.ts}, etc.
  565. @item use_localtime
Same as the strftime option; it will be deprecated.
  567. @item strftime
  568. Use strftime() on @var{filename} to expand the segment filename with localtime.
The segment number is also available in this mode, but to use it, you need to specify the second_level_segment_index
hls_flag, and %%d will be the specifier.
  571. @example
  572. ffmpeg -i in.nut -strftime 1 -hls_segment_filename 'file-%Y%m%d-%s.ts' out.m3u8
  573. @end example
  574. This example will produce the playlist, @file{out.m3u8}, and segment files:
  575. @file{file-20160215-1455569023.ts}, @file{file-20160215-1455569024.ts}, etc.
  576. Note: On some systems/environments, the @code{%s} specifier is not available. See
  577. @code{strftime()} documentation.
  578. @example
  579. ffmpeg -i in.nut -strftime 1 -hls_flags second_level_segment_index -hls_segment_filename 'file-%Y%m%d-%%04d.ts' out.m3u8
  580. @end example
  581. This example will produce the playlist, @file{out.m3u8}, and segment files:
  582. @file{file-20160215-0001.ts}, @file{file-20160215-0002.ts}, etc.
  583. @item use_localtime_mkdir
Same as the strftime_mkdir option; it will be deprecated.
@item strftime_mkdir
Used together with -strftime, it will create all subdirectories which
are expanded in @var{filename}.
  588. @example
  589. ffmpeg -i in.nut -strftime 1 -strftime_mkdir 1 -hls_segment_filename '%Y%m%d/file-%Y%m%d-%s.ts' out.m3u8
  590. @end example
This example will create a directory 20160215 (if it does not exist), and then
  592. produce the playlist, @file{out.m3u8}, and segment files:
  593. @file{20160215/file-20160215-1455569023.ts}, @file{20160215/file-20160215-1455569024.ts}, etc.
  594. @example
  595. ffmpeg -i in.nut -strftime 1 -strftime_mkdir 1 -hls_segment_filename '%Y/%m/%d/file-%Y%m%d-%s.ts' out.m3u8
  596. @end example
  597. This example will create a directory hierarchy 2016/02/15 (if any of them do not exist), and then
  598. produce the playlist, @file{out.m3u8}, and segment files:
  599. @file{2016/02/15/file-20160215-1455569023.ts}, @file{2016/02/15/file-20160215-1455569024.ts}, etc.
  600. @item hls_key_info_file @var{key_info_file}
  601. Use the information in @var{key_info_file} for segment encryption. The first
  602. line of @var{key_info_file} specifies the key URI written to the playlist. The
  603. key URL is used to access the encryption key during playback. The second line
  604. specifies the path to the key file used to obtain the key during the encryption
  605. process. The key file is read as a single packed array of 16 octets in binary
  606. format. The optional third line specifies the initialization vector (IV) as a
  607. hexadecimal string to be used instead of the segment sequence number (default)
  608. for encryption. Changes to @var{key_info_file} will result in segment
  609. encryption with the new key/IV and an entry in the playlist for the new key
  610. URI/IV if @code{hls_flags periodic_rekey} is enabled.
  611. Key info file format:
  612. @example
  613. @var{key URI}
  614. @var{key file path}
  615. @var{IV} (optional)
  616. @end example
  617. Example key URIs:
  618. @example
  619. http://server/file.key
  620. /path/to/file.key
  621. file.key
  622. @end example
  623. Example key file paths:
  624. @example
  625. file.key
  626. /path/to/file.key
  627. @end example
  628. Example IV:
  629. @example
  630. 0123456789ABCDEF0123456789ABCDEF
  631. @end example
  632. Key info file example:
  633. @example
  634. http://server/file.key
  635. /path/to/file.key
  636. 0123456789ABCDEF0123456789ABCDEF
  637. @end example
  638. Example shell script:
  639. @example
  640. #!/bin/sh
  641. BASE_URL=$@{1:-'.'@}
  642. openssl rand 16 > file.key
  643. echo $BASE_URL/file.key > file.keyinfo
  644. echo file.key >> file.keyinfo
  645. echo $(openssl rand -hex 16) >> file.keyinfo
  646. ffmpeg -f lavfi -re -i testsrc -c:v h264 -hls_flags delete_segments \
  647. -hls_key_info_file file.keyinfo out.m3u8
  648. @end example
  649. @item -hls_enc @var{enc}
  650. Enable (1) or disable (0) the AES128 encryption.
  651. When enabled every segment generated is encrypted and the encryption key
  652. is saved as @var{playlist name}.key.
  653. @item -hls_enc_key @var{key}
  654. 16-octet key to encrypt the segments, by default it
  655. is randomly generated.
  656. @item -hls_enc_key_url @var{keyurl}
  657. If set, @var{keyurl} is prepended instead of @var{baseurl} to the key filename
  658. in the playlist.
  659. @item -hls_enc_iv @var{iv}
  660. 16-octet initialization vector for every segment instead
  661. of the autogenerated ones.
  662. @item hls_segment_type @var{flags}
  663. Possible values:
  664. @table @samp
  665. @item mpegts
  666. Output segment files in MPEG-2 Transport Stream format. This is
  667. compatible with all HLS versions.
  668. @item fmp4
  669. Output segment files in fragmented MP4 format, similar to MPEG-DASH.
  670. fmp4 files may be used in HLS version 7 and above.
  671. @end table
  672. @item hls_fmp4_init_filename @var{filename}
Set the filename of the fragment files' header file; the default filename is @file{init.mp4}.
  674. Use @code{-strftime 1} on @var{filename} to expand the segment filename with localtime.
  675. @example
  676. ffmpeg -i in.nut -hls_segment_type fmp4 -strftime 1 -hls_fmp4_init_filename "%s_init.mp4" out.m3u8
  677. @end example
This will produce an init file like this:
  679. @file{1602678741_init.mp4}
  680. @item hls_fmp4_init_resend
Resend the init file after every m3u8 file refresh; default is @var{0}.
  682. When @code{var_stream_map} is set with two or more variant streams, the
  683. @var{filename} pattern must contain the string "%v", this string specifies
  684. the position of variant stream index in the generated init file names.
  685. The string "%v" may be present in the filename or in the last directory name
  686. containing the file. If the string is present in the directory name, then
  687. sub-directories are created after expanding the directory name pattern. This
  688. enables creation of init files corresponding to different variant streams in
  689. subdirectories.
  690. @item hls_flags @var{flags}
  691. Possible values:
  692. @table @samp
  693. @item single_file
  694. If this flag is set, the muxer will store all segments in a single MPEG-TS
file, and will use byte ranges in the playlist. HLS playlists generated
this way will have version number 4.
  697. For example:
  698. @example
  699. ffmpeg -i in.nut -hls_flags single_file out.m3u8
  700. @end example
  701. Will produce the playlist, @file{out.m3u8}, and a single segment file,
  702. @file{out.ts}.
  703. @item delete_segments
  704. Segment files removed from the playlist are deleted after a period of time
  705. equal to the duration of the segment plus the duration of the playlist.
  706. @item append_list
  707. Append new segments into the end of old segment list,
  708. and remove the @code{#EXT-X-ENDLIST} from the old segment list.
  709. @item round_durations
  710. Round the duration info in the playlist file segment info to integer
  711. values, instead of using floating point.
  712. @item discont_start
  713. Add the @code{#EXT-X-DISCONTINUITY} tag to the playlist, before the
  714. first segment's information.
  715. @item omit_endlist
  716. Do not append the @code{EXT-X-ENDLIST} tag at the end of the playlist.
  717. @item periodic_rekey
The file specified by @code{hls_key_info_file} will be checked periodically to
detect updates to the encryption info. Be sure to replace this file atomically,
  720. including the file containing the AES encryption key.
  721. @item independent_segments
Add the @code{#EXT-X-INDEPENDENT-SEGMENTS} tag to playlists that have video segments
and whose segments are all guaranteed to start with a keyframe.
@item iframes_only
Add the @code{#EXT-X-I-FRAMES-ONLY} tag to playlists that have video segments
and can play only I-frames in @code{#EXT-X-BYTERANGE} mode.
  727. @item split_by_time
  728. Allow segments to start on frames other than keyframes. This improves
  729. behavior on some players when the time between keyframes is inconsistent,
  730. but may make things worse on others, and can cause some oddities during
  731. seeking. This flag should be used with the @code{hls_time} option.
  732. @item program_date_time
  733. Generate @code{EXT-X-PROGRAM-DATE-TIME} tags.
  734. @item second_level_segment_index
  735. Makes it possible to use segment indexes as %%d in hls_segment_filename expression
  736. besides date/time values when strftime is on.
  737. To get fixed width numbers with trailing zeroes, %%0xd format is available where x is the required width.
  738. @item second_level_segment_size
  739. Makes it possible to use segment sizes (counted in bytes) as %%s in hls_segment_filename
  740. expression besides date/time values when strftime is on.
  741. To get fixed width numbers with trailing zeroes, %%0xs format is available where x is the required width.
  742. @item second_level_segment_duration
  743. Makes it possible to use segment duration (calculated in microseconds) as %%t in hls_segment_filename
  744. expression besides date/time values when strftime is on.
  745. To get fixed width numbers with trailing zeroes, %%0xt format is available where x is the required width.
  746. @example
  747. ffmpeg -i sample.mpeg \
  748. -f hls -hls_time 3 -hls_list_size 5 \
  749. -hls_flags second_level_segment_index+second_level_segment_size+second_level_segment_duration \
  750. -strftime 1 -strftime_mkdir 1 -hls_segment_filename "segment_%Y%m%d%H%M%S_%%04d_%%08s_%%013t.ts" stream.m3u8
  751. @end example
  752. This will produce segments like this:
  753. @file{segment_20170102194334_0003_00122200_0000003000000.ts}, @file{segment_20170102194334_0004_00120072_0000003000000.ts} etc.
  754. @item temp_file
  755. Write segment data to filename.tmp and rename to filename only once the segment is complete. A webserver
  756. serving up segments can be configured to reject requests to *.tmp to prevent access to in-progress segments
  757. before they have been added to the m3u8 playlist. This flag also affects how m3u8 playlist files are created.
If this flag is set, all playlist files will be written into temporary files and renamed after they are complete, similarly to how segments are handled.
But playlists with the @code{file} protocol and with a type (@code{hls_playlist_type}) other than @code{vod}
are always written into a temporary file regardless of this flag. Master playlist files (@code{master_pl_name}), if any, with the @code{file} protocol,
are always written into a temporary file regardless of this flag if the @code{master_pl_publish_rate} value is other than zero.
  762. @end table
  763. @item hls_playlist_type event
  764. Emit @code{#EXT-X-PLAYLIST-TYPE:EVENT} in the m3u8 header. Forces
  765. @option{hls_list_size} to 0; the playlist can only be appended to.
  766. @item hls_playlist_type vod
  767. Emit @code{#EXT-X-PLAYLIST-TYPE:VOD} in the m3u8 header. Forces
  768. @option{hls_list_size} to 0; the playlist must not change.
  769. @item method
  770. Use the given HTTP method to create the hls files.
  771. @example
  772. ffmpeg -re -i in.ts -f hls -method PUT http://example.com/live/out.m3u8
  773. @end example
  774. This example will upload all the mpegts segment files to the HTTP
  775. server using the HTTP PUT method, and update the m3u8 files every
  776. @code{refresh} times using the same method.
  777. Note that the HTTP server must support the given method for uploading
  778. files.
  779. @item http_user_agent
  780. Override User-Agent field in HTTP header. Applicable only for HTTP output.
  781. @item var_stream_map
  782. Map string which specifies how to group the audio, video and subtitle streams
  783. into different variant streams. The variant stream groups are separated
  784. by space.
  785. Expected string format is like this "a:0,v:0 a:1,v:1 ....". Here a:, v:, s: are
  786. the keys to specify audio, video and subtitle streams respectively.
  787. Allowed values are 0 to 9 (limited just based on practical usage).
  788. When there are two or more variant streams, the output filename pattern must
  789. contain the string "%v", this string specifies the position of variant stream
  790. index in the output media playlist filenames. The string "%v" may be present in
  791. the filename or in the last directory name containing the file. If the string is
  792. present in the directory name, then sub-directories are created after expanding
  793. the directory name pattern. This enables creation of variant streams in
  794. subdirectories.
  795. @example
  796. ffmpeg -re -i in.ts -b:v:0 1000k -b:v:1 256k -b:a:0 64k -b:a:1 32k \
  797. -map 0:v -map 0:a -map 0:v -map 0:a -f hls -var_stream_map "v:0,a:0 v:1,a:1" \
  798. http://example.com/live/out_%v.m3u8
  799. @end example
  800. This example creates two hls variant streams. The first variant stream will
  801. contain video stream of bitrate 1000k and audio stream of bitrate 64k and the
  802. second variant stream will contain video stream of bitrate 256k and audio
stream of bitrate 32k. Here, two media playlists with file names out_0.m3u8 and
out_1.m3u8 will be created. If you want meaningful text instead of indexes
in the resulting names, you may specify names for each or some of the variants,
as in the following example.
  807. @example
  808. ffmpeg -re -i in.ts -b:v:0 1000k -b:v:1 256k -b:a:0 64k -b:a:1 32k \
  809. -map 0:v -map 0:a -map 0:v -map 0:a -f hls -var_stream_map "v:0,a:0,name:my_hd v:1,a:1,name:my_sd" \
  810. http://example.com/live/out_%v.m3u8
  811. @end example
  812. This example creates two hls variant streams as in the previous one.
But here, the two media playlists with file names out_my_hd.m3u8 and
  814. out_my_sd.m3u8 will be created.
  815. @example
  816. ffmpeg -re -i in.ts -b:v:0 1000k -b:v:1 256k -b:a:0 64k \
  817. -map 0:v -map 0:a -map 0:v -f hls -var_stream_map "v:0 a:0 v:1" \
  818. http://example.com/live/out_%v.m3u8
  819. @end example
  820. This example creates three hls variant streams. The first variant stream will
  821. be a video only stream with video bitrate 1000k, the second variant stream will
  822. be an audio only stream with bitrate 64k and the third variant stream will be a
video only stream with bitrate 256k. Here, three media playlists with file names
  824. out_0.m3u8, out_1.m3u8 and out_2.m3u8 will be created.
  825. @example
  826. ffmpeg -re -i in.ts -b:v:0 1000k -b:v:1 256k -b:a:0 64k -b:a:1 32k \
  827. -map 0:v -map 0:a -map 0:v -map 0:a -f hls -var_stream_map "v:0,a:0 v:1,a:1" \
  828. http://example.com/live/vs_%v/out.m3u8
  829. @end example
  830. This example creates the variant streams in subdirectories. Here, the first
  831. media playlist is created at @file{http://example.com/live/vs_0/out.m3u8} and
  832. the second one at @file{http://example.com/live/vs_1/out.m3u8}.
  833. @example
  834. ffmpeg -re -i in.ts -b:a:0 32k -b:a:1 64k -b:v:0 1000k -b:v:1 3000k \
  835. -map 0:a -map 0:a -map 0:v -map 0:v -f hls \
  836. -var_stream_map "a:0,agroup:aud_low a:1,agroup:aud_high v:0,agroup:aud_low v:1,agroup:aud_high" \
  837. -master_pl_name master.m3u8 \
  838. http://example.com/live/out_%v.m3u8
  839. @end example
  840. This example creates two audio only and two video only variant streams. In
  841. addition to the #EXT-X-STREAM-INF tag for each variant stream in the master
  842. playlist, #EXT-X-MEDIA tag is also added for the two audio only variant streams
  843. and they are mapped to the two video only variant streams with audio group names
  844. 'aud_low' and 'aud_high'.
  845. By default, a single hls variant containing all the encoded streams is created.
  846. @example
  847. ffmpeg -re -i in.ts -b:a:0 32k -b:a:1 64k -b:v:0 1000k \
  848. -map 0:a -map 0:a -map 0:v -f hls \
  849. -var_stream_map "a:0,agroup:aud_low,default:yes a:1,agroup:aud_low v:0,agroup:aud_low" \
  850. -master_pl_name master.m3u8 \
  851. http://example.com/live/out_%v.m3u8
  852. @end example
  853. This example creates two audio only and one video only variant streams. In
  854. addition to the #EXT-X-STREAM-INF tag for each variant stream in the master
  855. playlist, #EXT-X-MEDIA tag is also added for the two audio only variant streams
and they are mapped to the one video only variant stream with audio group name
'aud_low'; within the audio group, the DEFAULT status of each stream is set to NO or YES as specified.
  858. By default, a single hls variant containing all the encoded streams is created.
  859. @example
  860. ffmpeg -re -i in.ts -b:a:0 32k -b:a:1 64k -b:v:0 1000k \
  861. -map 0:a -map 0:a -map 0:v -f hls \
  862. -var_stream_map "a:0,agroup:aud_low,default:yes,language:ENG a:1,agroup:aud_low,language:CHN v:0,agroup:aud_low" \
  863. -master_pl_name master.m3u8 \
  864. http://example.com/live/out_%v.m3u8
  865. @end example
  866. This example creates two audio only and one video only variant streams. In
  867. addition to the #EXT-X-STREAM-INF tag for each variant stream in the master
  868. playlist, #EXT-X-MEDIA tag is also added for the two audio only variant streams
and they are mapped to the one video only variant stream with audio group name
'aud_low'; within the audio group, the DEFAULT status of each stream is set to NO or YES as specified,
the language of one audio stream is set to ENG and the language of the other to CHN.
  872. By default, a single hls variant containing all the encoded streams is created.
  873. @example
  874. ffmpeg -y -i input_with_subtitle.mkv \
  875. -b:v:0 5250k -c:v h264 -pix_fmt yuv420p -profile:v main -level 4.1 \
  876. -b:a:0 256k \
  877. -c:s webvtt -c:a mp2 -ar 48000 -ac 2 -map 0:v -map 0:a:0 -map 0:s:0 \
  878. -f hls -var_stream_map "v:0,a:0,s:0,sgroup:subtitle" \
  879. -master_pl_name master.m3u8 -t 300 -hls_time 10 -hls_init_time 4 -hls_list_size \
  880. 10 -master_pl_publish_rate 10 -hls_flags \
  881. delete_segments+discont_start+split_by_time ./tmp/video.m3u8
  882. @end example
  883. This example adds @code{#EXT-X-MEDIA} tag with @code{TYPE=SUBTITLES} in
  884. the master playlist with webvtt subtitle group name 'subtitle'. Please make sure
the input file has at least one text subtitle stream.
  886. @item cc_stream_map
  887. Map string which specifies different closed captions groups and their
  888. attributes. The closed captions stream groups are separated by space.
  889. Expected string format is like this
  890. "ccgroup:<group name>,instreamid:<INSTREAM-ID>,language:<language code> ....".
  891. 'ccgroup' and 'instreamid' are mandatory attributes. 'language' is an optional
  892. attribute.
  893. The closed captions groups configured using this option are mapped to different
  894. variant streams by providing the same 'ccgroup' name in the
  895. @code{var_stream_map} string. If @code{var_stream_map} is not set, then the
  896. first available ccgroup in @code{cc_stream_map} is mapped to the output variant
  897. stream. The examples for these two use cases are given below.
  898. @example
  899. ffmpeg -re -i in.ts -b:v 1000k -b:a 64k -a53cc 1 -f hls \
  900. -cc_stream_map "ccgroup:cc,instreamid:CC1,language:en" \
  901. -master_pl_name master.m3u8 \
  902. http://example.com/live/out.m3u8
  903. @end example
  904. This example adds @code{#EXT-X-MEDIA} tag with @code{TYPE=CLOSED-CAPTIONS} in
  905. the master playlist with group name 'cc', language 'en' (english) and
  906. INSTREAM-ID 'CC1'. Also, it adds @code{CLOSED-CAPTIONS} attribute with group
  907. name 'cc' for the output variant stream.
  908. @example
  909. ffmpeg -re -i in.ts -b:v:0 1000k -b:v:1 256k -b:a:0 64k -b:a:1 32k \
  910. -a53cc:0 1 -a53cc:1 1\
  911. -map 0:v -map 0:a -map 0:v -map 0:a -f hls \
  912. -cc_stream_map "ccgroup:cc,instreamid:CC1,language:en ccgroup:cc,instreamid:CC2,language:sp" \
  913. -var_stream_map "v:0,a:0,ccgroup:cc v:1,a:1,ccgroup:cc" \
  914. -master_pl_name master.m3u8 \
  915. http://example.com/live/out_%v.m3u8
  916. @end example
  917. This example adds two @code{#EXT-X-MEDIA} tags with @code{TYPE=CLOSED-CAPTIONS} in
  918. the master playlist for the INSTREAM-IDs 'CC1' and 'CC2'. Also, it adds
  919. @code{CLOSED-CAPTIONS} attribute with group name 'cc' for the two output variant
  920. streams.
  921. @item master_pl_name
  922. Create HLS master playlist with the given name.
  923. @example
  924. ffmpeg -re -i in.ts -f hls -master_pl_name master.m3u8 http://example.com/live/out.m3u8
  925. @end example
This example creates an HLS master playlist with the name master.m3u8, which is
published at http://example.com/live/.
  928. @item master_pl_publish_rate
Republish the master playlist after every specified number of segment intervals.
  930. @example
  931. ffmpeg -re -i in.ts -f hls -master_pl_name master.m3u8 \
  932. -hls_time 2 -master_pl_publish_rate 30 http://example.com/live/out.m3u8
  933. @end example
This example creates an HLS master playlist with the name master.m3u8 and keeps
republishing it after every 30 segments, i.e. every 60 seconds.
  936. @item http_persistent
  937. Use persistent HTTP connections. Applicable only for HTTP output.
  938. @item timeout
  939. Set timeout for socket I/O operations. Applicable only for HTTP output.
  940. @item -ignore_io_errors
  941. Ignore IO errors during open, write and delete. Useful for long-duration runs with network output.
  942. @item headers
Set custom HTTP headers; can override built-in default headers. Applicable only for HTTP output.
  944. @end table
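As a hedged example combining several of the options above, the following writes fMP4 segments with a VOD playlist (codec and duration choices are illustrative assumptions):
@example
ffmpeg -i INPUT -c:v libx264 -c:a aac -flags +cgop -g 60 \
-hls_time 4 -hls_playlist_type vod -hls_segment_type fmp4 \
-hls_fmp4_init_filename init.mp4 -hls_segment_filename 'seg_%03d.m4s' out.m3u8
@end example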
  945. @anchor{ico}
  946. @section ico
  947. ICO file muxer.
  948. Microsoft's icon file format (ICO) has some strict limitations that should be noted:
  949. @itemize
  950. @item
  951. Size cannot exceed 256 pixels in any dimension
  952. @item
  953. Only BMP and PNG images can be stored
  954. @item
  955. If a BMP image is used, it must be one of the following pixel formats:
  956. @example
BMP Bit Depth   FFmpeg Pixel Format
1bit            pal8
4bit            pal8
8bit            pal8
16bit           rgb555le
24bit           bgr24
32bit           bgra
  964. @end example
  965. @item
  966. If a BMP image is used, it must use the BITMAPINFOHEADER DIB header
  967. @item
  968. If a PNG image is used, it must use the rgba pixel format
  969. @end itemize
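For example, a PNG-based icon respecting the limitations above could be produced like this (the scaling parameters are illustrative):
@example
ffmpeg -i INPUT -vf scale=256:256 -frames:v 1 -c:v png -pix_fmt rgba -f ico favicon.ico
@end example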
  970. @anchor{image2}
  971. @section image2
  972. Image file muxer.
  973. The image file muxer writes video frames to image files.
  974. The output filenames are specified by a pattern, which can be used to
  975. produce sequentially numbered series of files.
The pattern may contain the string "%d" or "%0@var{N}d"; this string
  977. specifies the position of the characters representing a numbering in
  978. the filenames. If the form "%0@var{N}d" is used, the string
  979. representing the number in each filename is 0-padded to @var{N}
  980. digits. The literal character '%' can be specified in the pattern with
  981. the string "%%".
  982. If the pattern contains "%d" or "%0@var{N}d", the first filename of
  983. the file list specified will contain the number 1, all the following
  984. numbers will be sequential.
  985. The pattern may contain a suffix which is used to automatically
  986. determine the format of the image files to write.
  987. For example the pattern "img-%03d.bmp" will specify a sequence of
  988. filenames of the form @file{img-001.bmp}, @file{img-002.bmp}, ...,
  989. @file{img-010.bmp}, etc.
  990. The pattern "img%%-%d.jpg" will specify a sequence of filenames of the
  991. form @file{img%-1.jpg}, @file{img%-2.jpg}, ..., @file{img%-10.jpg},
  992. etc.
  993. The image muxer supports the .Y.U.V image file format. This format is
special in that each image frame consists of three files, one for
each of the YUV420P components. To read or write this image file format,
  996. specify the name of the '.Y' file. The muxer will automatically open the
  997. '.U' and '.V' files as required.
  998. @subsection Options
  999. @table @option
  1000. @item frame_pts
  1001. If set to 1, expand the filename with pts from pkt->pts.
  1002. Default value is 0.
  1003. @item start_number
  1004. Start the sequence from the specified number. Default value is 1.
  1005. @item update
  1006. If set to 1, the filename will always be interpreted as just a
  1007. filename, not a pattern, and the corresponding file will be continuously
  1008. overwritten with new images. Default value is 0.
  1009. @item strftime
  1010. If set to 1, expand the filename with date and time information from
  1011. @code{strftime()}. Default value is 0.
  1012. @item protocol_opts @var{options_list}
  1013. Set protocol options as a :-separated list of key=value parameters. Values
  1014. containing the @code{:} special character must be escaped.
  1015. @end table
  1016. @subsection Examples
  1017. The following example shows how to use @command{ffmpeg} for creating a
  1018. sequence of files @file{img-001.jpeg}, @file{img-002.jpeg}, ...,
  1019. taking one image every second from the input video:
  1020. @example
  1021. ffmpeg -i in.avi -vsync cfr -r 1 -f image2 'img-%03d.jpeg'
  1022. @end example
  1023. Note that with @command{ffmpeg}, if the format is not specified with the
  1024. @code{-f} option and the output filename specifies an image file
  1025. format, the image2 muxer is automatically selected, so the previous
  1026. command can be written as:
  1027. @example
  1028. ffmpeg -i in.avi -vsync cfr -r 1 'img-%03d.jpeg'
  1029. @end example
Note also that the pattern does not necessarily have to contain "%d" or
  1031. "%0@var{N}d", for example to create a single image file
  1032. @file{img.jpeg} from the start of the input video you can employ the command:
  1033. @example
  1034. ffmpeg -i in.avi -f image2 -frames:v 1 img.jpeg
  1035. @end example
  1036. The @option{strftime} option allows you to expand the filename with
  1037. date and time information. Check the documentation of
  1038. the @code{strftime()} function for the syntax.
  1039. For example to generate image files from the @code{strftime()}
  1040. "%Y-%m-%d_%H-%M-%S" pattern, the following @command{ffmpeg} command
  1041. can be used:
  1042. @example
  1043. ffmpeg -f v4l2 -r 1 -i /dev/video0 -f image2 -strftime 1 "%Y-%m-%d_%H-%M-%S.jpg"
  1044. @end example
  1045. You can set the file name with current frame's PTS:
  1046. @example
ffmpeg -f v4l2 -r 1 -i /dev/video0 -copyts -f image2 -frame_pts true %d.jpg
  1048. @end example
  1049. A more complex example is to publish contents of your desktop directly to a
  1050. WebDAV server every second:
  1051. @example
  1052. ffmpeg -f x11grab -framerate 1 -i :0.0 -q:v 6 -update 1 -protocol_opts method=PUT http://example.com/desktop.jpg
  1053. @end example
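Similarly, the @option{start_number} option can be used to begin the numbering
at an arbitrary value (an illustrative sketch with placeholder file names):
@example
ffmpeg -i in.avi -f image2 -start_number 100 'img-%03d.png'
@end example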
  1054. @section matroska
  1055. Matroska container muxer.
  1056. This muxer implements the matroska and webm container specs.
  1057. @subsection Metadata
  1058. The recognized metadata settings in this muxer are:
  1059. @table @option
  1060. @item title
Set the title provided for a single track. For a stream written as an
attachment this gets mapped to the FileDescription element.
  1063. @item language
  1064. Specify the language of the track in the Matroska languages form.
  1065. The language can be either the 3 letters bibliographic ISO-639-2 (ISO
  1066. 639-2/B) form (like "fre" for French), or a language code mixed with a
  1067. country code for specialities in languages (like "fre-ca" for Canadian
  1068. French).
  1069. @item stereo_mode
  1070. Set stereo 3D video layout of two views in a single video track.
  1071. The following values are recognized:
  1072. @table @samp
  1073. @item mono
  1074. video is not stereo
  1075. @item left_right
  1076. Both views are arranged side by side, Left-eye view is on the left
  1077. @item bottom_top
  1078. Both views are arranged in top-bottom orientation, Left-eye view is at bottom
  1079. @item top_bottom
  1080. Both views are arranged in top-bottom orientation, Left-eye view is on top
  1081. @item checkerboard_rl
  1082. Each view is arranged in a checkerboard interleaved pattern, Left-eye view being first
  1083. @item checkerboard_lr
  1084. Each view is arranged in a checkerboard interleaved pattern, Right-eye view being first
  1085. @item row_interleaved_rl
  1086. Each view is constituted by a row based interleaving, Right-eye view is first row
  1087. @item row_interleaved_lr
  1088. Each view is constituted by a row based interleaving, Left-eye view is first row
  1089. @item col_interleaved_rl
  1090. Both views are arranged in a column based interleaving manner, Right-eye view is first column
  1091. @item col_interleaved_lr
  1092. Both views are arranged in a column based interleaving manner, Left-eye view is first column
  1093. @item anaglyph_cyan_red
  1094. All frames are in anaglyph format viewable through red-cyan filters
  1095. @item right_left
  1096. Both views are arranged side by side, Right-eye view is on the left
  1097. @item anaglyph_green_magenta
  1098. All frames are in anaglyph format viewable through green-magenta filters
  1099. @item block_lr
  1100. Both eyes laced in one Block, Left-eye view is first
  1101. @item block_rl
  1102. Both eyes laced in one Block, Right-eye view is first
  1103. @end table
  1104. @end table
  1105. For example a 3D WebM clip can be created using the following command line:
  1106. @example
  1107. ffmpeg -i sample_left_right_clip.mpg -an -c:v libvpx -metadata stereo_mode=left_right -y stereo_clip.webm
  1108. @end example
  1109. @subsection Options
  1110. This muxer supports the following options:
  1111. @table @option
  1112. @item reserve_index_space
  1113. By default, this muxer writes the index for seeking (called cues in Matroska
  1114. terms) at the end of the file, because it cannot know in advance how much space
  1115. to leave for the index at the beginning of the file. However for some use cases
  1116. -- e.g. streaming where seeking is possible but slow -- it is useful to put the
  1117. index at the beginning of the file.
  1118. If this option is set to a non-zero value, the muxer will reserve a given amount
  1119. of space in the file header and then try to write the cues there when the muxing
  1120. finishes. If the reserved space does not suffice, no Cues will be written, the
  1121. file will be finalized and writing the trailer will return an error.
  1122. A safe size for most use cases should be about 50kB per hour of video.
  1123. Note that cues are only written if the output is seekable and this option will
  1124. have no effect if it is not.
  1125. @item default_mode
  1126. This option controls how the FlagDefault of the output tracks will be set.
  1127. It influences which tracks players should play by default. The default mode
  1128. is @samp{infer}.
  1129. @table @samp
  1130. @item infer
  1131. In this mode, for each type of track (audio, video or subtitle), if there is
  1132. a track with disposition default of this type, then the first such track
  1133. (i.e. the one with the lowest index) will be marked as default; if no such
  1134. track exists, the first track of this type will be marked as default instead
  1135. (if existing). This ensures that the default flag is set in a sensible way even
  1136. if the input originated from containers that lack the concept of default tracks.
  1137. @item infer_no_subs
  1138. This mode is the same as infer except that if no subtitle track with
  1139. disposition default exists, no subtitle track will be marked as default.
  1140. @item passthrough
  1141. In this mode the FlagDefault is set if and only if the AV_DISPOSITION_DEFAULT
  1142. flag is set in the disposition of the corresponding stream.
  1143. @end table
  1144. @item flipped_raw_rgb
  1145. If set to true, store positive height for raw RGB bitmaps, which indicates
  1146. bitmap is stored bottom-up. Note that this option does not flip the bitmap
  1147. which has to be done manually beforehand, e.g. by using the vflip filter.
  1148. Default is @var{false} and indicates bitmap is stored top down.
  1149. @end table
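For example, the following (illustrative) command remuxes a file while
reserving space for the cues up front and avoiding a default subtitle track;
the reserved size is a placeholder:
@example
ffmpeg -i in.mp4 -c copy -reserve_index_space 50000 -default_mode infer_no_subs out.mkv
@end example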
  1150. @anchor{md5}
  1151. @section md5
  1152. MD5 testing format.
  1153. This is a variant of the @ref{hash} muxer. Unlike that muxer, it
  1154. defaults to using the MD5 hash function.
  1155. @subsection Examples
  1156. To compute the MD5 hash of the input converted to raw
  1157. audio and video, and store it in the file @file{out.md5}:
  1158. @example
  1159. ffmpeg -i INPUT -f md5 out.md5
  1160. @end example
  1161. You can print the MD5 to stdout with the command:
  1162. @example
  1163. ffmpeg -i INPUT -f md5 -
  1164. @end example
  1165. See also the @ref{hash} and @ref{framemd5} muxers.
  1166. @section mov, mp4, ismv
  1167. MOV/MP4/ISMV (Smooth Streaming) muxer.
  1168. The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4
  1169. file has all the metadata about all packets stored in one location
  1170. (written at the end of the file, it can be moved to the start for
  1171. better playback by adding @var{faststart} to the @var{movflags}, or
  1172. using the @command{qt-faststart} tool). A fragmented
  1173. file consists of a number of fragments, where packets and metadata
  1174. about these packets are stored together. Writing a fragmented
  1175. file has the advantage that the file is decodable even if the
  1176. writing is interrupted (while a normal MOV/MP4 is undecodable if
  1177. it is not properly finished), and it requires less memory when writing
  1178. very long files (since writing normal MOV/MP4 files stores info about
  1179. every single packet in memory until the file is closed). The downside
  1180. is that it is less compatible with other applications.
  1181. @subsection Options
  1182. Fragmentation is enabled by setting one of the AVOptions that define
  1183. how to cut the file into fragments:
  1184. @table @option
  1185. @item -moov_size @var{bytes}
  1186. Reserves space for the moov atom at the beginning of the file instead of placing the
  1187. moov atom at the end. If the space reserved is insufficient, muxing will fail.
  1188. @item -movflags frag_keyframe
  1189. Start a new fragment at each video keyframe.
  1190. @item -frag_duration @var{duration}
  1191. Create fragments that are @var{duration} microseconds long.
  1192. @item -frag_size @var{size}
  1193. Create fragments that contain up to @var{size} bytes of payload data.
  1194. @item -movflags frag_custom
  1195. Allow the caller to manually choose when to cut fragments, by
  1196. calling @code{av_write_frame(ctx, NULL)} to write a fragment with
  1197. the packets written so far. (This is only useful with other
  1198. applications integrating libavformat, not from @command{ffmpeg}.)
  1199. @item -min_frag_duration @var{duration}
  1200. Don't create fragments that are shorter than @var{duration} microseconds long.
  1201. @end table
  1202. If more than one condition is specified, fragments are cut when
  1203. one of the specified conditions is fulfilled. The exception to this is
  1204. @code{-min_frag_duration}, which has to be fulfilled for any of the other
  1205. conditions to apply.
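For example, the following (illustrative) command cuts fragments roughly every
two seconds, but never shorter than half a second; the durations are
placeholders expressed in microseconds:
@example
ffmpeg -i in.mp4 -c copy -frag_duration 2000000 -min_frag_duration 500000 frag.mp4
@end example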
  1206. Additionally, the way the output file is written can be adjusted
  1207. through a few other options:
  1208. @table @option
  1209. @item -movflags empty_moov
  1210. Write an initial moov atom directly at the start of the file, without
  1211. describing any samples in it. Generally, an mdat/moov pair is written
  1212. at the start of the file, as a normal MOV/MP4 file, containing only
  1213. a short portion of the file. With this option set, there is no initial
  1214. mdat atom, and the moov atom only describes the tracks but has
  1215. a zero duration.
  1216. This option is implicitly set when writing ismv (Smooth Streaming) files.
  1217. @item -movflags separate_moof
  1218. Write a separate moof (movie fragment) atom for each track. Normally,
  1219. packets for all tracks are written in a moof atom (which is slightly
  1220. more efficient), but with this option set, the muxer writes one moof/mdat
  1221. pair for each track, making it easier to separate tracks.
  1222. This option is implicitly set when writing ismv (Smooth Streaming) files.
  1223. @item -movflags skip_sidx
Skip writing the sidx atom. When the bitrate overhead due to the sidx atom is
high, this option can be used in cases where the sidx atom is not mandatory.
When the global_sidx flag is enabled, this option is ignored.
  1227. @item -movflags faststart
  1228. Run a second pass moving the index (moov atom) to the beginning of the file.
  1229. This operation can take a while, and will not work in various situations such
  1230. as fragmented output, thus it is not enabled by default.
  1231. @item -movflags rtphint
  1232. Add RTP hinting tracks to the output file.
  1233. @item -movflags disable_chpl
  1234. Disable Nero chapter markers (chpl atom). Normally, both Nero chapters
  1235. and a QuickTime chapter track are written to the file. With this option
  1236. set, only the QuickTime chapter track will be written. Nero chapters can
  1237. cause failures when the file is reprocessed with certain tagging programs, like
mp3Tag 2.61a and iTunes 11.3; most likely other versions are affected as well.
  1239. @item -movflags omit_tfhd_offset
  1240. Do not write any absolute base_data_offset in tfhd atoms. This avoids
  1241. tying fragments to absolute byte positions in the file/streams.
  1242. @item -movflags default_base_moof
Similarly to omit_tfhd_offset, this flag avoids writing the
absolute base_data_offset field in tfhd atoms, but does so by using
the default-base-is-moof flag instead. This flag was introduced in
14496-12:2012. This may make the fragments easier to parse in certain
  1247. circumstances (avoiding basing track fragment location calculations
  1248. on the implicit end of the previous track fragment).
  1249. @item -write_tmcd
  1250. Specify @code{on} to force writing a timecode track, @code{off} to disable it
  1251. and @code{auto} to write a timecode track only for mov and mp4 output (default).
  1252. @item -movflags negative_cts_offsets
  1253. Enables utilization of version 1 of the CTTS box, in which the CTS offsets can
  1254. be negative. This enables the initial sample to have DTS/CTS of zero, and
  1255. reduces the need for edit lists for some cases such as video tracks with
  1256. B-frames. Additionally, eases conformance with the DASH-IF interoperability
  1257. guidelines.
  1258. This option is implicitly set when writing ismv (Smooth Streaming) files.
  1259. @item -write_prft
  1260. Write producer time reference box (PRFT) with a specified time source for the
  1261. NTP field in the PRFT box. Set value as @samp{wallclock} to specify timesource
  1262. as wallclock time and @samp{pts} to specify timesource as input packets' PTS
  1263. values.
Setting the value to @samp{pts} is applicable only for a live encoding use case,
where PTS values are set as wallclock time at the source. For example, an
  1266. encoding use case with decklink capture source where @option{video_pts} and
  1267. @option{audio_pts} are set to @samp{abs_wallclock}.
  1268. @end table
  1269. @subsection Example
  1270. Smooth Streaming content can be pushed in real time to a publishing
  1271. point on IIS with this muxer. Example:
  1272. @example
  1273. ffmpeg -re @var{<normal input/transcoding options>} -movflags isml+frag_keyframe -f ismv http://server/publishingpoint.isml/Streams(Encoder1)
  1274. @end example
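A fragmented MP4 for local playback or later segmentation can be created in a
similar way, for instance (an illustrative sketch with placeholder file names):
@example
ffmpeg -i in.mp4 -c copy -movflags +frag_keyframe+empty_moov frag.mp4
@end example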
  1275. @section mp3
  1276. The MP3 muxer writes a raw MP3 stream with the following optional features:
  1277. @itemize @bullet
  1278. @item
  1279. An ID3v2 metadata header at the beginning (enabled by default). Versions 2.3 and
  1280. 2.4 are supported, the @code{id3v2_version} private option controls which one is
  1281. used (3 or 4). Setting @code{id3v2_version} to 0 disables the ID3v2 header
  1282. completely.
  1283. The muxer supports writing attached pictures (APIC frames) to the ID3v2 header.
The pictures are supplied to the muxer in the form of a video stream with a single
  1285. packet. There can be any number of those streams, each will correspond to a
  1286. single APIC frame. The stream metadata tags @var{title} and @var{comment} map
  1287. to APIC @var{description} and @var{picture type} respectively. See
  1288. @url{http://id3.org/id3v2.4.0-frames} for allowed picture types.
  1289. Note that the APIC frames must be written at the beginning, so the muxer will
  1290. buffer the audio frames until it gets all the pictures. It is therefore advised
  1291. to provide the pictures as soon as possible to avoid excessive buffering.
  1292. @item
  1293. A Xing/LAME frame right after the ID3v2 header (if present). It is enabled by
  1294. default, but will be written only if the output is seekable. The
  1295. @code{write_xing} private option can be used to disable it. The frame contains
  1296. various information that may be useful to the decoder, like the audio duration
  1297. or encoder delay.
  1298. @item
  1299. A legacy ID3v1 tag at the end of the file (disabled by default). It may be
  1300. enabled with the @code{write_id3v1} private option, but as its capabilities are
  1301. very limited, its usage is not recommended.
  1302. @end itemize
  1303. Examples:
  1304. Write an mp3 with an ID3v2.3 header and an ID3v1 footer:
  1305. @example
  1306. ffmpeg -i INPUT -id3v2_version 3 -write_id3v1 1 out.mp3
  1307. @end example
  1308. To attach a picture to an mp3 file select both the audio and the picture stream
  1309. with @code{map}:
  1310. @example
  1311. ffmpeg -i input.mp3 -i cover.png -c copy -map 0 -map 1
  1312. -metadata:s:v title="Album cover" -metadata:s:v comment="Cover (Front)" out.mp3
  1313. @end example
  1314. Write a "clean" MP3 without any extra features:
  1315. @example
  1316. ffmpeg -i input.wav -write_xing 0 -id3v2_version 0 out.mp3
  1317. @end example
  1318. @section mpegts
  1319. MPEG transport stream muxer.
  1320. This muxer implements ISO 13818-1 and part of ETSI EN 300 468.
  1321. The recognized metadata settings in mpegts muxer are @code{service_provider}
  1322. and @code{service_name}. If they are not set the default for
  1323. @code{service_provider} is @samp{FFmpeg} and the default for
  1324. @code{service_name} is @samp{Service01}.
  1325. @subsection Options
  1326. The muxer options are:
  1327. @table @option
  1328. @item mpegts_transport_stream_id @var{integer}
  1329. Set the @samp{transport_stream_id}. This identifies a transponder in DVB.
  1330. Default is @code{0x0001}.
  1331. @item mpegts_original_network_id @var{integer}
Set the @samp{original_network_id}. This is the unique identifier of a
network in DVB. Its main use is in the unique identification of a service
  1334. through the path @samp{Original_Network_ID, Transport_Stream_ID}. Default
  1335. is @code{0x0001}.
  1336. @item mpegts_service_id @var{integer}
  1337. Set the @samp{service_id}, also known as program in DVB. Default is
  1338. @code{0x0001}.
  1339. @item mpegts_service_type @var{integer}
  1340. Set the program @samp{service_type}. Default is @code{digital_tv}.
  1341. Accepts the following options:
  1342. @table @samp
  1343. @item hex_value
  1344. Any hexadecimal value between @code{0x01} and @code{0xff} as defined in
  1345. ETSI 300 468.
  1346. @item digital_tv
  1347. Digital TV service.
  1348. @item digital_radio
  1349. Digital Radio service.
  1350. @item teletext
  1351. Teletext service.
  1352. @item advanced_codec_digital_radio
  1353. Advanced Codec Digital Radio service.
  1354. @item mpeg2_digital_hdtv
  1355. MPEG2 Digital HDTV service.
  1356. @item advanced_codec_digital_sdtv
  1357. Advanced Codec Digital SDTV service.
  1358. @item advanced_codec_digital_hdtv
  1359. Advanced Codec Digital HDTV service.
  1360. @end table
  1361. @item mpegts_pmt_start_pid @var{integer}
  1362. Set the first PID for PMTs. Default is @code{0x1000}, minimum is @code{0x0020},
maximum is @code{0x1ffa}. This option has no effect in m2ts mode, where the PMT
PID is fixed to @code{0x0100}.
  1365. @item mpegts_start_pid @var{integer}
  1366. Set the first PID for elementary streams. Default is @code{0x0100}, minimum is
  1367. @code{0x0020}, maximum is @code{0x1ffa}. This option has no effect in m2ts mode
  1368. where the elementary stream PIDs are fixed.
  1369. @item mpegts_m2ts_mode @var{boolean}
  1370. Enable m2ts mode if set to @code{1}. Default value is @code{-1} which
  1371. disables m2ts mode.
  1372. @item muxrate @var{integer}
  1373. Set a constant muxrate. Default is VBR.
  1374. @item pes_payload_size @var{integer}
  1375. Set minimum PES packet payload in bytes. Default is @code{2930}.
  1376. @item mpegts_flags @var{flags}
  1377. Set mpegts flags. Accepts the following options:
  1378. @table @samp
  1379. @item resend_headers
  1380. Reemit PAT/PMT before writing the next packet.
  1381. @item latm
  1382. Use LATM packetization for AAC.
  1383. @item pat_pmt_at_frames
  1384. Reemit PAT and PMT at each video frame.
  1385. @item system_b
  1386. Conform to System B (DVB) instead of System A (ATSC).
  1387. @item initial_discontinuity
  1388. Mark the initial packet of each stream as discontinuity.
  1389. @end table
  1390. @item mpegts_copyts @var{boolean}
  1391. Preserve original timestamps, if value is set to @code{1}. Default value
  1392. is @code{-1}, which results in shifting timestamps so that they start from 0.
  1393. @item omit_video_pes_length @var{boolean}
  1394. Omit the PES packet length for video packets. Default is @code{1} (true).
  1395. @item pcr_period @var{integer}
  1396. Override the default PCR retransmission time in milliseconds. Default is
@code{-1}, which means that the PCR interval will be determined automatically:
20 ms is used for CBR streams; for VBR streams the highest multiple of the
frame duration which is less than 100 ms is used.
  1400. @item pat_period @var{duration}
  1401. Maximum time in seconds between PAT/PMT tables. Default is @code{0.1}.
  1402. @item sdt_period @var{duration}
  1403. Maximum time in seconds between SDT tables. Default is @code{0.5}.
  1404. @item tables_version @var{integer}
Set PAT, PMT and SDT version (default @code{0}, valid values are from 0 to 31 inclusive).
This option allows updating the stream structure so that a standard consumer may
detect the change. To do so, reopen the output @code{AVFormatContext} (in case of API
usage) or restart the @command{ffmpeg} instance, cyclically changing the
@option{tables_version} value:
  1410. @example
  1411. ffmpeg -i source1.ts -codec copy -f mpegts -tables_version 0 udp://1.1.1.1:1111
  1412. ffmpeg -i source2.ts -codec copy -f mpegts -tables_version 1 udp://1.1.1.1:1111
  1413. ...
  1414. ffmpeg -i source3.ts -codec copy -f mpegts -tables_version 31 udp://1.1.1.1:1111
  1415. ffmpeg -i source1.ts -codec copy -f mpegts -tables_version 0 udp://1.1.1.1:1111
  1416. ffmpeg -i source2.ts -codec copy -f mpegts -tables_version 1 udp://1.1.1.1:1111
  1417. ...
  1418. @end example
  1419. @end table
  1420. @subsection Example
  1421. @example
  1422. ffmpeg -i file.mpg -c copy \
  1423. -mpegts_original_network_id 0x1122 \
  1424. -mpegts_transport_stream_id 0x3344 \
  1425. -mpegts_service_id 0x5566 \
  1426. -mpegts_pmt_start_pid 0x1500 \
  1427. -mpegts_start_pid 0x150 \
  1428. -metadata service_provider="Some provider" \
  1429. -metadata service_name="Some Channel" \
  1430. out.ts
  1431. @end example
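A constant-bitrate stream with a fixed PCR interval can be produced along these
lines (an illustrative sketch; the bitrates are placeholders and the muxrate
must exceed the combined bitrate of all streams):
@example
ffmpeg -i in.mkv -c:v mpeg2video -b:v 5M -c:a mp2 -b:a 192k \
-muxrate 8000000 -pcr_period 20 -f mpegts out.ts
@end example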
  1432. @section mxf, mxf_d10, mxf_opatom
  1433. MXF muxer.
  1434. @subsection Options
  1435. The muxer options are:
  1436. @table @option
  1437. @item store_user_comments @var{bool}
Set whether user comments should be stored when available, or never.
IRT D-10 does not allow user comments. The default is thus to write them for
mxf and mxf_opatom but not for mxf_d10.
  1441. @end table
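For example, to write a plain MXF file without user comments (an illustrative
sketch; the encoders and the 48 kHz audio reflect common MXF constraints):
@example
ffmpeg -i in.mov -c:v mpeg2video -c:a pcm_s16le -ar 48000 -store_user_comments 0 out.mxf
@end example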
  1442. @section null
  1443. Null muxer.
  1444. This muxer does not generate any output file, it is mainly useful for
  1445. testing or benchmarking purposes.
  1446. For example to benchmark decoding with @command{ffmpeg} you can use the
  1447. command:
  1448. @example
  1449. ffmpeg -benchmark -i INPUT -f null out.null
  1450. @end example
  1451. Note that the above command does not read or write the @file{out.null}
  1452. file, but specifying the output file is required by the @command{ffmpeg}
  1453. syntax.
  1454. Alternatively you can write the command as:
  1455. @example
  1456. ffmpeg -benchmark -i INPUT -f null -
  1457. @end example
  1458. @section nut
  1459. @table @option
  1460. @item -syncpoints @var{flags}
  1461. Change the syncpoint usage in nut:
  1462. @table @option
  1463. @item @var{default} use the normal low-overhead seeking aids.
@item @var{none} do not use the syncpoints at all, reducing the overhead but making the stream non-seekable;
Use of this option is not recommended, as the resulting files are very sensitive
to damage and seeking is not possible. Also, in general the overhead from
syncpoints is negligible. Note that @code{-write_index 0} can be used to disable
all growing data tables, making it possible to mux endless streams with limited
memory and without these disadvantages.
  1470. @item @var{timestamped} extend the syncpoint with a wallclock field.
  1471. @end table
  1472. The @var{none} and @var{timestamped} flags are experimental.
  1473. @item -write_index @var{bool}
  1474. Write index at the end, the default is to write an index.
  1475. @end table
  1476. @example
  1477. ffmpeg -i INPUT -f_strict experimental -syncpoints none - | processor
  1478. @end example
  1479. @section ogg
  1480. Ogg container muxer.
  1481. @table @option
  1482. @item -page_duration @var{duration}
  1483. Preferred page duration, in microseconds. The muxer will attempt to create
  1484. pages that are approximately @var{duration} microseconds long. This allows the
  1485. user to compromise between seek granularity and container overhead. The default
  1486. is 1 second. A value of 0 will fill all segments, making pages as large as
  1487. possible. A value of 1 will effectively use 1 packet-per-page in most
  1488. situations, giving a small seek granularity at the cost of additional container
  1489. overhead.
  1490. @item -serial_offset @var{value}
Serial value from which to set the streams' serial numbers.
Setting it to different and sufficiently large values ensures that the produced
  1493. ogg files can be safely chained.
  1494. @end table
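For example, to trade some container overhead for finer seek granularity (an
illustrative sketch; the page duration is a placeholder in microseconds):
@example
ffmpeg -i in.wav -c:a libvorbis -page_duration 100000 out.ogg
@end example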
  1495. @anchor{segment}
  1496. @section segment, stream_segment, ssegment
  1497. Basic stream segmenter.
  1498. This muxer outputs streams to a number of separate files of nearly
  1499. fixed duration. Output filename pattern can be set in a fashion
  1500. similar to @ref{image2}, or by using a @code{strftime} template if
  1501. the @option{strftime} option is enabled.
  1502. @code{stream_segment} is a variant of the muxer used to write to
  1503. streaming output formats, i.e. which do not require global headers,
  1504. and is recommended for outputting e.g. to MPEG transport stream segments.
  1505. @code{ssegment} is a shorter alias for @code{stream_segment}.
  1506. Every segment starts with a keyframe of the selected reference stream,
  1507. which is set through the @option{reference_stream} option.
  1508. Note that if you want accurate splitting for a video file, you need to
  1509. make the input key frames correspond to the exact splitting times
  1510. expected by the segmenter, or the segment muxer will start the new
  1511. segment with the key frame found next after the specified start
  1512. time.
  1513. The segment muxer works best with a single constant frame rate video.
  1514. Optionally it can generate a list of the created segments, by setting
  1515. the option @var{segment_list}. The list type is specified by the
  1516. @var{segment_list_type} option. The entry filenames in the segment
  1517. list are set by default to the basename of the corresponding segment
  1518. files.
  1519. See also the @ref{hls} muxer, which provides a more specific
  1520. implementation for HLS segmentation.
  1521. @subsection Options
  1522. The segment muxer supports the following options:
  1523. @table @option
  1524. @item increment_tc @var{1|0}
If set to @code{1}, increment the timecode between each segment.
If this is selected, the input needs to have
a timecode in the first video stream. Default value is
@code{0}.
  1529. @item reference_stream @var{specifier}
  1530. Set the reference stream, as specified by the string @var{specifier}.
  1531. If @var{specifier} is set to @code{auto}, the reference is chosen
  1532. automatically. Otherwise it must be a stream specifier (see the ``Stream
  1533. specifiers'' chapter in the ffmpeg manual) which specifies the
  1534. reference stream. The default value is @code{auto}.
  1535. @item segment_format @var{format}
  1536. Override the inner container format, by default it is guessed by the filename
  1537. extension.
  1538. @item segment_format_options @var{options_list}
  1539. Set output format options using a :-separated list of key=value
  1540. parameters. Values containing the @code{:} special character must be
  1541. escaped.
  1542. @item segment_list @var{name}
  1543. Generate also a listfile named @var{name}. If not specified no
  1544. listfile is generated.
  1545. @item segment_list_flags @var{flags}
  1546. Set flags affecting the segment list generation.
  1547. It currently supports the following flags:
  1548. @table @samp
  1549. @item cache
  1550. Allow caching (only affects M3U8 list files).
  1551. @item live
  1552. Allow live-friendly file generation.
  1553. @end table
  1554. @item segment_list_size @var{size}
  1555. Update the list file so that it contains at most @var{size}
  1556. segments. If 0 the list file will contain all the segments. Default
  1557. value is 0.
  1558. @item segment_list_entry_prefix @var{prefix}
  1559. Prepend @var{prefix} to each entry. Useful to generate absolute paths.
  1560. By default no prefix is applied.
  1561. @item segment_list_type @var{type}
  1562. Select the listing format.
  1563. The following values are recognized:
  1564. @table @samp
  1565. @item flat
  1566. Generate a flat list for the created segments, one segment per line.
  1567. @item csv, ext
  1568. Generate a list for the created segments, one segment per line,
  1569. each line matching the format (comma-separated values):
  1570. @example
  1571. @var{segment_filename},@var{segment_start_time},@var{segment_end_time}
  1572. @end example
  1573. @var{segment_filename} is the name of the output file generated by the
  1574. muxer according to the provided pattern. CSV escaping (according to
  1575. RFC4180) is applied if required.
  1576. @var{segment_start_time} and @var{segment_end_time} specify
  1577. the segment start and end time expressed in seconds.
  1578. A list file with the suffix @code{".csv"} or @code{".ext"} will
  1579. auto-select this format.
@samp{ext} is deprecated in favor of @samp{csv}.
  1581. @item ffconcat
  1582. Generate an ffconcat file for the created segments. The resulting file
  1583. can be read using the FFmpeg @ref{concat} demuxer.
  1584. A list file with the suffix @code{".ffcat"} or @code{".ffconcat"} will
  1585. auto-select this format.
  1586. @item m3u8
  1587. Generate an extended M3U8 file, version 3, compliant with
  1588. @url{http://tools.ietf.org/id/draft-pantos-http-live-streaming}.
  1589. A list file with the suffix @code{".m3u8"} will auto-select this format.
  1590. @end table
  1591. If not specified the type is guessed from the list file name suffix.
  1592. @item segment_time @var{time}
  1593. Set segment duration to @var{time}, the value must be a duration
  1594. specification. Default value is "2". See also the
  1595. @option{segment_times} option.
  1596. Note that splitting may not be accurate, unless you force the
  1597. reference stream key-frames at the given time. See the introductory
  1598. notice and the examples below.
  1599. @item segment_atclocktime @var{1|0}
  1600. If set to "1" split at regular clock time intervals starting from 00:00
  1601. o'clock. The @var{time} value specified in @option{segment_time} is
  1602. used for setting the length of the splitting interval.
  1603. For example with @option{segment_time} set to "900" this makes it possible
  1604. to create files at 12:00 o'clock, 12:15, 12:30, etc.
  1605. Default value is "0".
  1606. @item segment_clocktime_offset @var{duration}
  1607. Delay the segment splitting times with the specified duration when using
  1608. @option{segment_atclocktime}.
  1609. For example with @option{segment_time} set to "900" and
  1610. @option{segment_clocktime_offset} set to "300" this makes it possible to
  1611. create files at 12:05, 12:20, 12:35, etc.
  1612. Default value is "0".
  1613. @item segment_clocktime_wrap_duration @var{duration}
  1614. Force the segmenter to only start a new segment if a packet reaches the muxer
  1615. within the specified duration after the segmenting clock time. This way you
  1616. can make the segmenter more resilient to backward local time jumps, such as
  1617. leap seconds or transition to standard time from daylight savings time.
  1618. Default is the maximum possible duration which means starting a new segment
  1619. regardless of the elapsed time since the last clock time.
  1620. @item segment_time_delta @var{delta}
Specify the accuracy tolerance when selecting the start time for a
segment, expressed as a duration specification. Default value is "0".
  1623. When delta is specified a key-frame will start a new segment if its
  1624. PTS satisfies the relation:
  1625. @example
  1626. PTS >= start_time - time_delta
  1627. @end example
  1628. This option is useful when splitting video content, which is always
  1629. split at GOP boundaries, in case a key frame is found just before the
  1630. specified split time.
In particular it may be used in combination with the @command{ffmpeg} option
@var{force_key_frames}. The key frame times specified by
@var{force_key_frames} may not be set accurately because of rounding
issues, with the consequence that a key frame time may end up just
before the specified time. For constant frame rate videos a value of
  1636. 1/(2*@var{frame_rate}) should address the worst case mismatch between
  1637. the specified time and the time set by @var{force_key_frames}.
  1638. @item segment_times @var{times}
  1639. Specify a list of split points. @var{times} contains a list of comma
  1640. separated duration specifications, in increasing order. See also
  1641. the @option{segment_time} option.
  1642. @item segment_frames @var{frames}
  1643. Specify a list of split video frame numbers. @var{frames} contains a
  1644. list of comma separated integer numbers, in increasing order.
This option specifies to start a new segment whenever a reference
stream key frame is found and the sequential number (starting from 0)
of the frame is greater than or equal to the next value in the list.
  1648. @item segment_wrap @var{limit}
  1649. Wrap around segment index once it reaches @var{limit}.
  1650. @item segment_start_number @var{number}
  1651. Set the sequence number of the first segment. Defaults to @code{0}.
  1652. @item strftime @var{1|0}
  1653. Use the @code{strftime} function to define the name of the new
  1654. segments to write. If this is selected, the output segment name must
  1655. contain a @code{strftime} function template. Default value is
  1656. @code{0}.
  1657. @item break_non_keyframes @var{1|0}
  1658. If enabled, allow segments to start on frames other than keyframes. This
  1659. improves behavior on some players when the time between keyframes is
  1660. inconsistent, but may make things worse on others, and can cause some oddities
  1661. during seeking. Defaults to @code{0}.
  1662. @item reset_timestamps @var{1|0}
  1663. Reset timestamps at the beginning of each segment, so that each segment
  1664. will start with near-zero timestamps. It is meant to ease the playback
  1665. of the generated segments. May not work with some combinations of
  1666. muxers/codecs. It is set to @code{0} by default.
  1667. @item initial_offset @var{offset}
  1668. Specify timestamp offset to apply to the output packet timestamps. The
  1669. argument must be a time duration specification, and defaults to 0.
  1670. @item write_empty_segments @var{1|0}
  1671. If enabled, write an empty segment if there are no packets during the period a
  1672. segment would usually span. Otherwise, the segment will be filled with the next
  1673. packet written. Defaults to @code{0}.
  1674. @end table
  1675. Make sure to require a closed GOP when encoding and to set the GOP
  1676. size to fit your segment time constraint.
  1677. @subsection Examples
  1678. @itemize
  1679. @item
  1680. Remux the content of file @file{in.mkv} to a list of segments
  1681. @file{out-000.nut}, @file{out-001.nut}, etc., and write the list of
  1682. generated segments to @file{out.list}:
  1683. @example
  1684. ffmpeg -i in.mkv -codec hevc -flags +cgop -g 60 -map 0 -f segment -segment_list out.list out%03d.nut
  1685. @end example
  1686. @item
  1687. Segment input and set output format options for the output segments:
  1688. @example
  1689. ffmpeg -i in.mkv -f segment -segment_time 10 -segment_format_options movflags=+faststart out%03d.mp4
  1690. @end example
  1691. @item
  1692. Segment the input file according to the split points specified by the
  1693. @var{segment_times} option:
  1694. @example
  1695. ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 out%03d.nut
  1696. @end example
  1697. @item
  1698. Use the @command{ffmpeg} @option{force_key_frames}
  1699. option to force key frames in the input at the specified location, together
with the segment option @option{segment_time_delta} to account for
possible rounding performed when setting key frame times.
  1702. @example
  1703. ffmpeg -i in.mkv -force_key_frames 1,2,3,5,8,13,21 -codec:v mpeg4 -codec:a pcm_s16le -map 0 \
  1704. -f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 -segment_time_delta 0.05 out%03d.nut
  1705. @end example
  1706. In order to force key frames on the input file, transcoding is
  1707. required.
  1708. @item
  1709. Segment the input file by splitting the input file according to the
  1710. frame numbers sequence specified with the @option{segment_frames} option:
  1711. @example
  1712. ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.csv -segment_frames 100,200,300,500,800 out%03d.nut
  1713. @end example
  1714. @item
  1715. Convert the @file{in.mkv} to TS segments using the @code{libx264}
  1716. and @code{aac} encoders:
  1717. @example
  1718. ffmpeg -i in.mkv -map 0 -codec:v libx264 -codec:a aac -f ssegment -segment_list out.list out%03d.ts
  1719. @end example
  1720. @item
  1721. Segment the input file, and create an M3U8 live playlist (can be used
  1722. as live HLS source):
  1723. @example
  1724. ffmpeg -re -i in.mkv -codec copy -map 0 -f segment -segment_list playlist.m3u8 \
  1725. -segment_list_flags +live -segment_time 10 out%03d.mkv
  1726. @end example
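@item
Segment a live input at wall-clock boundaries, starting a new segment every 15
minutes aligned to the clock (an illustrative sketch; the @code{strftime}
pattern and segment length are placeholders):
@example
ffmpeg -re -i in.mkv -codec copy -map 0 -f segment -segment_time 900 \
-segment_atclocktime 1 -strftime 1 -reset_timestamps 1 out-%H%M.ts
@end example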
  1727. @end itemize
  1728. @section smoothstreaming
The Smooth Streaming muxer generates a set of files (Manifest, chunks) suitable for serving with a conventional web server.
  1730. @table @option
  1731. @item window_size
  1732. Specify the number of fragments kept in the manifest. Default 0 (keep all).
  1733. @item extra_window_size
  1734. Specify the number of fragments kept outside of the manifest before removing from disk. Default 5.
  1735. @item lookahead_count
  1736. Specify the number of lookahead fragments. Default 2.
  1737. @item min_frag_duration
  1738. Specify the minimum fragment duration (in microseconds). Default 5000000.
  1739. @item remove_at_exit
  1740. Specify whether to remove all fragments when finished. Default 0 (do not remove).
  1741. @end table
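For example, a live Smooth Streaming output directory might be generated along
these lines (an illustrative sketch; the output path and encoder settings are
placeholders):
@example
ffmpeg -re -i in.mkv -c:v libx264 -c:a aac -f smoothstreaming \
-window_size 10 -extra_window_size 5 /var/www/stream
@end example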
  1742. @anchor{streamhash}
  1743. @section streamhash
  1744. Per stream hash testing format.
  1745. This muxer computes and prints a cryptographic hash of all the input frames,
  1746. on a per-stream basis. This can be used for equality checks without having
  1747. to do a complete binary comparison.
  1748. By default audio frames are converted to signed 16-bit raw audio and
  1749. video frames to raw video before computing the hash, but the output
  1750. of explicit conversions to other codecs can also be used. Timestamps
  1751. are ignored. It uses the SHA-256 cryptographic hash function by default,
  1752. but supports several other algorithms.
  1753. The output of the muxer consists of one line per stream of the form:
  1754. @var{streamindex},@var{streamtype},@var{algo}=@var{hash}, where
  1755. @var{streamindex} is the index of the mapped stream, @var{streamtype} is a
  1756. single character indicating the type of stream, @var{algo} is a short string
  1757. representing the hash function used, and @var{hash} is a hexadecimal number
  1758. representing the computed hash.
  1759. @table @option
  1760. @item hash @var{algorithm}
  1761. Use the cryptographic hash function specified by the string @var{algorithm}.
  1762. Supported values include @code{MD5}, @code{murmur3}, @code{RIPEMD128},
  1763. @code{RIPEMD160}, @code{RIPEMD256}, @code{RIPEMD320}, @code{SHA160},
  1764. @code{SHA224}, @code{SHA256} (default), @code{SHA512/224}, @code{SHA512/256},
  1765. @code{SHA384}, @code{SHA512}, @code{CRC32} and @code{adler32}.
  1766. @end table
  1767. @subsection Examples
  1768. To compute the SHA-256 hash of the input converted to raw audio and
  1769. video, and store it in the file @file{out.sha256}:
  1770. @example
  1771. ffmpeg -i INPUT -f streamhash out.sha256
  1772. @end example
  1773. To print an MD5 hash to stdout use the command:
  1774. @example
  1775. ffmpeg -i INPUT -f streamhash -hash md5 -
  1776. @end example
  1777. See also the @ref{hash} and @ref{framehash} muxers.
  1778. @anchor{fifo}
  1779. @section fifo
The fifo pseudo-muxer allows the separation of encoding and muxing by using a
first-in-first-out queue and running the actual muxer in a separate thread. This
  1782. is especially useful in combination with the @ref{tee} muxer and can be used to
  1783. send data to several destinations with different reliability/writing speed/latency.
  1784. API users should be aware that callback functions (interrupt_callback,
  1785. io_open and io_close) used within its AVFormatContext must be thread-safe.
The behavior of the fifo muxer if the queue fills up or if the output fails is
selectable:
  1788. @itemize @bullet
  1789. @item
  1790. output can be transparently restarted with configurable delay between retries
  1791. based on real time or time of the processed stream.
  1792. @item
encoding can be blocked during temporary failure, or continue transparently
dropping packets in case the fifo queue fills up.
  1795. @end itemize
  1796. @table @option
  1797. @item fifo_format
  1798. Specify the format name. Useful if it cannot be guessed from the
  1799. output name suffix.
  1800. @item queue_size
  1801. Specify size of the queue (number of packets). Default value is 60.
  1802. @item format_opts
  1803. Specify format options for the underlying muxer. Muxer options can be specified
  1804. as a list of @var{key}=@var{value} pairs separated by ':'.
  1805. @item drop_pkts_on_overflow @var{bool}
  1806. If set to 1 (true), in case the fifo queue fills up, packets will be dropped
  1807. rather than blocking the encoder. This makes it possible to continue streaming without
  1808. delaying the input, at the cost of omitting part of the stream. By default
  1809. this option is set to 0 (false), so in such cases the encoder will be blocked
  1810. until the muxer processes some of the packets and none of them is lost.
  1811. @item attempt_recovery @var{bool}
  1812. If failure occurs, attempt to recover the output. This is especially useful
  1813. when used with network output, since it makes it possible to restart streaming transparently.
  1814. By default this option is set to 0 (false).
  1815. @item max_recovery_attempts
Set the maximum number of successive unsuccessful recovery attempts after which
the output fails permanently. By default this option is set to 0 (unlimited).
  1818. @item recovery_wait_time @var{duration}
Set the waiting time before the next recovery attempt after a previous
unsuccessful recovery attempt. Default value is 5 seconds.
  1821. @item recovery_wait_streamtime @var{bool}
  1822. If set to 0 (false), the real time is used when waiting for the recovery
  1823. attempt (i.e. the recovery will be attempted after at least
  1824. recovery_wait_time seconds).
  1825. If set to 1 (true), the time of the processed stream is taken into account
  1826. instead (i.e. the recovery will be attempted after at least @var{recovery_wait_time}
  1827. seconds of the stream is omitted).
  1828. By default, this option is set to 0 (false).
  1829. @item recover_any_error @var{bool}
  1830. If set to 1 (true), recovery will be attempted regardless of type of the error
  1831. causing the failure. By default this option is set to 0 (false) and in case of
  1832. certain (usually permanent) errors the recovery is not attempted even when
  1833. @var{attempt_recovery} is set to 1.
  1834. @item restart_with_keyframe @var{bool}
  1835. Specify whether to wait for the keyframe after recovering from
  1836. queue overflow or failure. This option is set to 0 (false) by default.
  1837. @item timeshift @var{duration}
  1838. Buffer the specified amount of packets and delay writing the output. Note that
  1839. @var{queue_size} must be big enough to store the packets for timeshift. At the
  1840. end of the input the fifo buffer is flushed at realtime speed.
  1841. @end table
  1842. @subsection Examples
  1843. @itemize
  1844. @item
Stream something to an rtmp server, continue processing the stream at real-time
  1846. rate even in case of temporary failure (network outage) and attempt to recover
  1847. streaming every second indefinitely.
  1848. @example
  1849. ffmpeg -re -i ... -c:v libx264 -c:a aac -f fifo -fifo_format flv -map 0:v -map 0:a
  1850. -drop_pkts_on_overflow 1 -attempt_recovery 1 -recovery_wait_time 1 rtmp://example.com/live/stream_name
  1851. @end example
  1852. @end itemize
  1853. @anchor{tee}
  1854. @section tee
  1855. The tee muxer can be used to write the same data to several outputs, such as files or streams.
  1856. It can be used, for example, to stream a video over a network and save it to disk at the same time.
  1857. It is different from specifying several outputs to the @command{ffmpeg}
  1858. command-line tool. With the tee muxer, the audio and video data will be encoded only once.
  1859. With conventional multiple outputs, multiple encoding operations in parallel are initiated,
  1860. which can be a very expensive process. The tee muxer is not useful when using the libavformat API
  1861. directly because it is then possible to feed the same packets to several muxers directly.
  1862. Since the tee muxer does not represent any particular output format, ffmpeg cannot auto-select
  1863. output streams. So all streams intended for output must be specified using @code{-map}. See
  1864. the examples below.
  1865. Some encoders may need different options depending on the output format;
the auto-detection of this cannot work with the tee muxer, so they need to be explicitly specified.
  1867. The main example is the @option{global_header} flag.
  1868. The slave outputs are specified in the file name given to the muxer,
separated by '|'. If any of the slave names contains the '|' separator,
leading or trailing spaces or any special character, those must be
  1871. escaped (see @ref{quoting_and_escaping,,the "Quoting and escaping"
  1872. section in the ffmpeg-utils(1) manual,ffmpeg-utils}).
  1873. @subsection Options
  1874. @table @option
  1875. @item use_fifo @var{bool}
  1876. If set to 1, slave outputs will be processed in separate threads using the @ref{fifo}
muxer. This makes it possible to compensate for different speed/latency/reliability
of the outputs and to set up transparent recovery. By default this feature is turned off.
  1879. @item fifo_options
  1880. Options to pass to fifo pseudo-muxer instances. See @ref{fifo}.
  1881. @end table
  1882. Muxer options can be specified for each slave by prepending them as a list of
  1883. @var{key}=@var{value} pairs separated by ':', between square brackets. If
  1884. the options values contain a special character or the ':' separator, they
  1885. must be escaped; note that this is a second level escaping.
  1886. The following special options are also recognized:
  1887. @table @option
  1888. @item f
  1889. Specify the format name. Required if it cannot be guessed from the
  1890. output URL.
  1891. @item bsfs[/@var{spec}]
  1892. Specify a list of bitstream filters to apply to the specified
  1893. output.
  1894. It is possible to specify to which streams a given bitstream filter
  1895. applies, by appending a stream specifier to the option separated by
  1896. @code{/}. @var{spec} must be a stream specifier (see @ref{Format
  1897. stream specifiers}).
If the stream specifier is not specified, the bitstream filters will be
applied to all streams in the output. This will cause the output operation
to fail if the output contains streams to which the bitstream filter cannot
be applied, e.g. @code{h264_mp4toannexb} being applied to an output containing an audio stream.
  1902. Options for a bitstream filter must be specified in the form of @code{opt=value}.
  1903. Several bitstream filters can be specified, separated by ",".
  1904. @item use_fifo @var{bool}
This allows overriding the tee muxer use_fifo option for an individual slave muxer.
@item fifo_options
This allows overriding the tee muxer fifo_options for an individual slave muxer.
  1908. See @ref{fifo}.
  1909. @item select
  1910. Select the streams that should be mapped to the slave output,
  1911. specified by a stream specifier. If not specified, this defaults to
all the mapped streams. This will cause the output operation to fail
if the output format does not accept all mapped streams.
  1914. You may use multiple stream specifiers separated by commas (@code{,}) e.g.: @code{a:0,v}
  1915. @item onfail
Specify behaviour on output failure. This can be set to either @code{abort} (which is
the default) or @code{ignore}. @code{abort} will cause the whole process to fail in case
of failure on this slave output. @code{ignore} will ignore failure on this output, so
other outputs will continue without being affected.
  1920. @end table
  1921. @subsection Examples
  1922. @itemize
  1923. @item
Encode something and both archive it in a Matroska file and stream it
  1925. as MPEG-TS over UDP:
  1926. @example
  1927. ffmpeg -i ... -c:v libx264 -c:a mp2 -f tee -map 0:v -map 0:a
  1928. "archive-20121107.mkv|[f=mpegts]udp://10.0.1.255:1234/"
  1929. @end example
  1930. @item
  1931. As above, but continue streaming even if output to local file fails
  1932. (for example local drive fills up):
  1933. @example
  1934. ffmpeg -i ... -c:v libx264 -c:a mp2 -f tee -map 0:v -map 0:a
  1935. "[onfail=ignore]archive-20121107.mkv|[f=mpegts]udp://10.0.1.255:1234/"
  1936. @end example
  1937. @item
  1938. Use @command{ffmpeg} to encode the input, and send the output
  1939. to three different destinations. The @code{dump_extra} bitstream
filter is used to add extradata information to all the output video
keyframe packets, as requested by the MPEG-TS format. The select
  1942. option is applied to @file{out.aac} in order to make it contain only
  1943. audio packets.
  1944. @example
  1945. ffmpeg -i ... -map 0 -flags +global_header -c:v libx264 -c:a aac
  1946. -f tee "[bsfs/v=dump_extra=freq=keyframe]out.ts|[movflags=+faststart]out.mp4|[select=a]out.aac"
  1947. @end example
  1948. @item
  1949. As above, but select only stream @code{a:1} for the audio output. Note
  1950. that a second level escaping must be performed, as ":" is a special
  1951. character used to separate options.
  1952. @example
  1953. ffmpeg -i ... -map 0 -flags +global_header -c:v libx264 -c:a aac
  1954. -f tee "[bsfs/v=dump_extra=freq=keyframe]out.ts|[movflags=+faststart]out.mp4|[select=\'a:1\']out.aac"
  1955. @end example
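@item
As above, but run each slave output through the @ref{fifo} muxer so that a slow
or failing destination does not stall the other one (an illustrative sketch
with placeholder output names):
@example
ffmpeg -i ... -map 0 -flags +global_header -c:v libx264 -c:a aac
-f tee -use_fifo 1 "[onfail=ignore]local.mkv|[f=mpegts]udp://10.0.1.255:1234/"
@end example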
  1956. @end itemize
  1957. @section webm_dash_manifest
  1958. WebM DASH Manifest muxer.
  1959. This muxer implements the WebM DASH Manifest specification to generate the DASH
  1960. manifest XML. It also supports manifest generation for DASH live streams.
  1961. For more information see:
  1962. @itemize @bullet
  1963. @item
  1964. WebM DASH Specification: @url{https://sites.google.com/a/webmproject.org/wiki/adaptive-streaming/webm-dash-specification}
  1965. @item
  1966. ISO DASH Specification: @url{http://standards.iso.org/ittf/PubliclyAvailableStandards/c065274_ISO_IEC_23009-1_2014.zip}
  1967. @end itemize
  1968. @subsection Options
  1969. This muxer supports the following options:
  1970. @table @option
  1971. @item adaptation_sets
  1972. This option has the following syntax: "id=x,streams=a,b,c id=y,streams=d,e" where x and y are the
  1973. unique identifiers of the adaptation sets and a,b,c,d and e are the indices of the corresponding
  1974. audio and video streams. Any number of adaptation sets can be added using this option.
  1975. @item live
  1976. Set this to 1 to create a live stream DASH Manifest. Default: 0.
  1977. @item chunk_start_index
  1978. Start index of the first chunk. This will go in the @samp{startNumber} attribute
  1979. of the @samp{SegmentTemplate} element in the manifest. Default: 0.
  1980. @item chunk_duration_ms
  1981. Duration of each chunk in milliseconds. This will go in the @samp{duration}
  1982. attribute of the @samp{SegmentTemplate} element in the manifest. Default: 1000.
  1983. @item utc_timing_url
  1984. URL of the page that will return the UTC timestamp in ISO format. This will go
  1985. in the @samp{value} attribute of the @samp{UTCTiming} element in the manifest.
  1986. Default: None.
  1987. @item time_shift_buffer_depth
  1988. Smallest time (in seconds) shifting buffer for which any Representation is
  1989. guaranteed to be available. This will go in the @samp{timeShiftBufferDepth}
  1990. attribute of the @samp{MPD} element. Default: 60.
  1991. @item minimum_update_period
  1992. Minimum update period (in seconds) of the manifest. This will go in the
  1993. @samp{minimumUpdatePeriod} attribute of the @samp{MPD} element. Default: 0.
  1994. @end table
  1995. @subsection Example
  1996. @example
  1997. ffmpeg -f webm_dash_manifest -i video1.webm \
  1998. -f webm_dash_manifest -i video2.webm \
  1999. -f webm_dash_manifest -i audio1.webm \
  2000. -f webm_dash_manifest -i audio2.webm \
  2001. -map 0 -map 1 -map 2 -map 3 \
  2002. -c copy \
  2003. -f webm_dash_manifest \
  2004. -adaptation_sets "id=0,streams=0,1 id=1,streams=2,3" \
  2005. manifest.xml
  2006. @end example
  2007. @section webm_chunk
  2008. WebM Live Chunk Muxer.
  2009. This muxer writes out WebM headers and chunks as separate files which can be
  2010. consumed by clients that support WebM Live streams via DASH.
  2011. @subsection Options
  2012. This muxer supports the following options:
  2013. @table @option
  2014. @item chunk_start_index
  2015. Index of the first chunk (defaults to 0).
  2016. @item header
  2017. Filename of the header where the initialization data will be written.
  2018. @item audio_chunk_duration
  2019. Duration of each audio chunk in milliseconds (defaults to 5000).
  2020. @end table
  2021. @subsection Example
  2022. @example
  2023. ffmpeg -f v4l2 -i /dev/video0 \
  2024. -f alsa -i hw:0 \
  2025. -map 0:0 \
  2026. -c:v libvpx-vp9 \
  2027. -s 640x360 -keyint_min 30 -g 30 \
  2028. -f webm_chunk \
  2029. -header webm_live_video_360.hdr \
  2030. -chunk_start_index 1 \
  2031. webm_live_video_360_%d.chk \
  2032. -map 1:0 \
  2033. -c:a libvorbis \
  2034. -b:a 128k \
  2035. -f webm_chunk \
  2036. -header webm_live_audio_128.hdr \
  2037. -chunk_start_index 1 \
  2038. -audio_chunk_duration 1000 \
  2039. webm_live_audio_128_%d.chk
  2040. @end example
  2041. @c man end MUXERS