  1. @chapter Muxers
  2. @c man begin MUXERS
  3. Muxers are configured elements in FFmpeg which allow writing
  4. multimedia streams to a particular type of file.
  5. When you configure your FFmpeg build, all the supported muxers
  6. are enabled by default. You can list all available muxers using the
  7. configure option @code{--list-muxers}.
  8. You can disable all the muxers with the configure option
  9. @code{--disable-muxers} and selectively enable / disable single muxers
  10. with the options @code{--enable-muxer=@var{MUXER}} /
  11. @code{--disable-muxer=@var{MUXER}}.
  12. The option @code{-formats} of the ff* tools will display the list of
  13. enabled muxers.
  14. A description of some of the currently available muxers follows.
  15. @anchor{aiff}
  16. @section aiff
  17. Audio Interchange File Format muxer.
  18. @subsection Options
  19. It accepts the following options:
  20. @table @option
  21. @item write_id3v2
  22. Enable ID3v2 tags writing when set to 1. Default is 0 (disabled).
  23. @item id3v2_version
Select ID3v2 version to write. Currently only versions 3 and 4 (a.k.a.
ID3v2.3 and ID3v2.4) are supported. The default is version 4.
  26. @end table
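For example, to write ID3v2.3 tags into the output AIFF file, a command along
these lines could be used (input and output names are illustrative):
@example
ffmpeg -i INPUT -write_id3v2 1 -id3v2_version 3 out.aiff
@end example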
  27. @anchor{asf}
  28. @section asf
  29. Advanced Systems Format muxer.
  30. Note that Windows Media Audio (wma) and Windows Media Video (wmv) use this
  31. muxer too.
  32. @subsection Options
  33. It accepts the following options:
  34. @table @option
  35. @item packet_size
  36. Set the muxer packet size. By tuning this setting you may reduce data
  37. fragmentation or muxer overhead depending on your source. Default value is
  38. 3200, minimum is 100, maximum is 64k.
  39. @end table
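For instance, a sketch that lowers the packet size from the 3200-byte default
(the value here is only illustrative):
@example
ffmpeg -i INPUT -f asf -packet_size 1600 out.asf
@end example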
  40. @anchor{chromaprint}
  41. @section chromaprint
  42. Chromaprint fingerprinter
  43. This muxer feeds audio data to the Chromaprint library, which generates
  44. a fingerprint for the provided audio data. It takes a single signed
  45. native-endian 16-bit raw audio stream.
  46. @subsection Options
  47. @table @option
  48. @item silence_threshold
Threshold for detecting silence. It ranges from 0 to 32767; use -1 for the
default, which is required for use with the AcoustID service.
  51. @item algorithm
  52. Algorithm index to fingerprint with.
  53. @item fp_format
  54. Format to output the fingerprint as. Accepts the following options:
  55. @table @samp
  56. @item raw
  57. Binary raw fingerprint
  58. @item compressed
  59. Binary compressed fingerprint
  60. @item base64
  61. Base64 compressed fingerprint
  62. @end table
  63. @end table
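As a sketch, assuming FFmpeg was built with Chromaprint support, a base64
fingerprint of the input audio could be printed to stdout with:
@example
ffmpeg -i INPUT -f chromaprint -fp_format base64 -
@end example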
  64. @anchor{crc}
  65. @section crc
  66. CRC (Cyclic Redundancy Check) testing format.
  67. This muxer computes and prints the Adler-32 CRC of all the input audio
  68. and video frames. By default audio frames are converted to signed
  69. 16-bit raw audio and video frames to raw video before computing the
  70. CRC.
  71. The output of the muxer consists of a single line of the form:
  72. CRC=0x@var{CRC}, where @var{CRC} is a hexadecimal number 0-padded to
  73. 8 digits containing the CRC for all the decoded input frames.
  74. See also the @ref{framecrc} muxer.
  75. @subsection Examples
  76. For example to compute the CRC of the input, and store it in the file
  77. @file{out.crc}:
  78. @example
  79. ffmpeg -i INPUT -f crc out.crc
  80. @end example
  81. You can print the CRC to stdout with the command:
  82. @example
  83. ffmpeg -i INPUT -f crc -
  84. @end example
  85. You can select the output format of each frame with @command{ffmpeg} by
  86. specifying the audio and video codec and format. For example to
  87. compute the CRC of the input audio converted to PCM unsigned 8-bit
  88. and the input video converted to MPEG-2 video, use the command:
  89. @example
  90. ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f crc -
  91. @end example
  92. @anchor{framecrc}
  93. @section framecrc
  94. Per-packet CRC (Cyclic Redundancy Check) testing format.
  95. This muxer computes and prints the Adler-32 CRC for each audio
  96. and video packet. By default audio frames are converted to signed
  97. 16-bit raw audio and video frames to raw video before computing the
  98. CRC.
  99. The output of the muxer consists of a line for each audio and video
  100. packet of the form:
  101. @example
  102. @var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, 0x@var{CRC}
  103. @end example
  104. @var{CRC} is a hexadecimal number 0-padded to 8 digits containing the
  105. CRC of the packet.
  106. @subsection Examples
  107. For example to compute the CRC of the audio and video frames in
  108. @file{INPUT}, converted to raw audio and video packets, and store it
  109. in the file @file{out.crc}:
  110. @example
  111. ffmpeg -i INPUT -f framecrc out.crc
  112. @end example
  113. To print the information to stdout, use the command:
  114. @example
  115. ffmpeg -i INPUT -f framecrc -
  116. @end example
  117. With @command{ffmpeg}, you can select the output format to which the
  118. audio and video frames are encoded before computing the CRC for each
  119. packet by specifying the audio and video codec. For example, to
  120. compute the CRC of each decoded input audio frame converted to PCM
  121. unsigned 8-bit and of each decoded input video frame converted to
  122. MPEG-2 video, use the command:
  123. @example
  124. ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f framecrc -
  125. @end example
  126. See also the @ref{crc} muxer.
  127. @anchor{framehash}
  128. @section framehash
  129. Per-packet hash testing format.
  130. This muxer computes and prints a cryptographic hash for each audio
  131. and video packet. This can be used for packet-by-packet equality
  132. checks without having to individually do a binary comparison on each.
  133. By default audio frames are converted to signed 16-bit raw audio and
  134. video frames to raw video before computing the hash, but the output
  135. of explicit conversions to other codecs can also be used. It uses the
  136. SHA-256 cryptographic hash function by default, but supports several
  137. other algorithms.
  138. The output of the muxer consists of a line for each audio and video
  139. packet of the form:
  140. @example
  141. @var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, @var{hash}
  142. @end example
  143. @var{hash} is a hexadecimal number representing the computed hash
  144. for the packet.
  145. @table @option
  146. @item hash @var{algorithm}
  147. Use the cryptographic hash function specified by the string @var{algorithm}.
  148. Supported values include @code{MD5}, @code{murmur3}, @code{RIPEMD128},
  149. @code{RIPEMD160}, @code{RIPEMD256}, @code{RIPEMD320}, @code{SHA160},
  150. @code{SHA224}, @code{SHA256} (default), @code{SHA512/224}, @code{SHA512/256},
  151. @code{SHA384}, @code{SHA512}, @code{CRC32} and @code{adler32}.
  152. @end table
  153. @subsection Examples
  154. To compute the SHA-256 hash of the audio and video frames in @file{INPUT},
  155. converted to raw audio and video packets, and store it in the file
  156. @file{out.sha256}:
  157. @example
  158. ffmpeg -i INPUT -f framehash out.sha256
  159. @end example
  160. To print the information to stdout, using the MD5 hash function, use
  161. the command:
  162. @example
  163. ffmpeg -i INPUT -f framehash -hash md5 -
  164. @end example
  165. See also the @ref{hash} muxer.
  166. @anchor{framemd5}
  167. @section framemd5
  168. Per-packet MD5 testing format.
  169. This is a variant of the @ref{framehash} muxer. Unlike that muxer,
  170. it defaults to using the MD5 hash function.
  171. @subsection Examples
  172. To compute the MD5 hash of the audio and video frames in @file{INPUT},
  173. converted to raw audio and video packets, and store it in the file
  174. @file{out.md5}:
  175. @example
  176. ffmpeg -i INPUT -f framemd5 out.md5
  177. @end example
  178. To print the information to stdout, use the command:
  179. @example
  180. ffmpeg -i INPUT -f framemd5 -
  181. @end example
  182. See also the @ref{framehash} and @ref{md5} muxers.
  183. @anchor{gif}
  184. @section gif
  185. Animated GIF muxer.
  186. It accepts the following options:
  187. @table @option
  188. @item loop
  189. Set the number of times to loop the output. Use @code{-1} for no loop, @code{0}
  190. for looping indefinitely (default).
  191. @item final_delay
  192. Force the delay (expressed in centiseconds) after the last frame. Each frame
  193. ends with a delay until the next frame. The default is @code{-1}, which is a
special value to tell the muxer to re-use the previous delay. For a looping
output, for instance, you might want to customize this value to mark a pause.
  196. @end table
For example, to encode a gif looping 10 times, with a 5 second delay between
the loops:
  199. @example
  200. ffmpeg -i INPUT -loop 10 -final_delay 500 out.gif
  201. @end example
Note 1: if you wish to extract the frames into separate GIF files, you need to
force the @ref{image2} muxer:
  204. @example
  205. ffmpeg -i INPUT -c:v gif -f image2 "out%d.gif"
  206. @end example
Note 2: the GIF format has a very small time base: the delay between two frames
cannot be smaller than one centisecond.
  209. @anchor{hash}
  210. @section hash
  211. Hash testing format.
  212. This muxer computes and prints a cryptographic hash of all the input
  213. audio and video frames. This can be used for equality checks without
  214. having to do a complete binary comparison.
  215. By default audio frames are converted to signed 16-bit raw audio and
  216. video frames to raw video before computing the hash, but the output
  217. of explicit conversions to other codecs can also be used. Timestamps
  218. are ignored. It uses the SHA-256 cryptographic hash function by default,
  219. but supports several other algorithms.
  220. The output of the muxer consists of a single line of the form:
  221. @var{algo}=@var{hash}, where @var{algo} is a short string representing
  222. the hash function used, and @var{hash} is a hexadecimal number
  223. representing the computed hash.
  224. @table @option
  225. @item hash @var{algorithm}
  226. Use the cryptographic hash function specified by the string @var{algorithm}.
  227. Supported values include @code{MD5}, @code{murmur3}, @code{RIPEMD128},
  228. @code{RIPEMD160}, @code{RIPEMD256}, @code{RIPEMD320}, @code{SHA160},
  229. @code{SHA224}, @code{SHA256} (default), @code{SHA512/224}, @code{SHA512/256},
  230. @code{SHA384}, @code{SHA512}, @code{CRC32} and @code{adler32}.
  231. @end table
  232. @subsection Examples
  233. To compute the SHA-256 hash of the input converted to raw audio and
  234. video, and store it in the file @file{out.sha256}:
  235. @example
  236. ffmpeg -i INPUT -f hash out.sha256
  237. @end example
  238. To print an MD5 hash to stdout use the command:
  239. @example
  240. ffmpeg -i INPUT -f hash -hash md5 -
  241. @end example
  242. See also the @ref{framehash} muxer.
  243. @anchor{hls}
  244. @section hls
  245. Apple HTTP Live Streaming muxer that segments MPEG-TS according to
  246. the HTTP Live Streaming (HLS) specification.
  247. It creates a playlist file, and one or more segment files. The output filename
  248. specifies the playlist filename.
  249. By default, the muxer creates a file for each segment produced. These files
  250. have the same name as the playlist, followed by a sequential number and a
  251. .ts extension.
  252. For example, to convert an input file with @command{ffmpeg}:
  253. @example
  254. ffmpeg -i in.nut out.m3u8
  255. @end example
  256. This example will produce the playlist, @file{out.m3u8}, and segment files:
  257. @file{out0.ts}, @file{out1.ts}, @file{out2.ts}, etc.
  258. See also the @ref{segment} muxer, which provides a more generic and
  259. flexible implementation of a segmenter, and can be used to perform HLS
  260. segmentation.
  261. @subsection Options
  262. This muxer supports the following options:
  263. @table @option
  264. @item hls_time @var{seconds}
Set the target segment length in seconds. Default value is 2.
The segment will be cut on the next key frame after this time has passed.
  267. @item hls_list_size @var{size}
  268. Set the maximum number of playlist entries. If set to 0 the list file
  269. will contain all the segments. Default value is 5.
  270. @item hls_ts_options @var{options_list}
  271. Set output format options using a :-separated list of key=value
parameters. Values containing the @code{:} special character must be
escaped.
  274. @item hls_wrap @var{wrap}
Set the number after which the segment filename number (the number
specified in each segment file) wraps. If set to 0 the number will
never wrap. Default value is 0.
This option is useful to avoid filling the disk with many segment
files, and limits the maximum number of segment files written to disk
to @var{wrap}.
  281. @item start_number @var{number}
  282. Start the playlist sequence number from @var{number}. Default value is
  283. 0.
  284. @item hls_allow_cache @var{allowcache}
  285. Explicitly set whether the client MAY (1) or MUST NOT (0) cache media segments.
  286. @item hls_base_url @var{baseurl}
  287. Append @var{baseurl} to every entry in the playlist.
  288. Useful to generate playlists with absolute paths.
  289. Note that the playlist sequence number must be unique for each segment
  290. and it is not to be confused with the segment filename sequence number
which can be cyclic, for example if the @option{hls_wrap} option is
specified.
  293. @item hls_segment_filename @var{filename}
  294. Set the segment filename. Unless @code{hls_flags single_file} is set,
  295. @var{filename} is used as a string format with the segment number:
  296. @example
  297. ffmpeg in.nut -hls_segment_filename 'file%03d.ts' out.m3u8
  298. @end example
  299. This example will produce the playlist, @file{out.m3u8}, and segment files:
  300. @file{file000.ts}, @file{file001.ts}, @file{file002.ts}, etc.
  301. @item use_localtime
  302. Use strftime on @var{filename} to expand the segment filename with localtime.
  303. The segment number (%d) is not available in this mode.
  304. @example
  305. ffmpeg in.nut -use_localtime 1 -hls_segment_filename 'file-%Y%m%d-%s.ts' out.m3u8
  306. @end example
  307. This example will produce the playlist, @file{out.m3u8}, and segment files:
  308. @file{file-20160215-1455569023.ts}, @file{file-20160215-1455569024.ts}, etc.
  309. @item use_localtime_mkdir
  310. Used together with -use_localtime, it will create up to one subdirectory which
  311. is expanded in @var{filename}.
  312. @example
  313. ffmpeg in.nut -use_localtime 1 -use_localtime_mkdir 1 -hls_segment_filename '%Y%m%d/file-%Y%m%d-%s.ts' out.m3u8
  314. @end example
This example will create a directory 20160215 (if it does not exist), and then
produce the playlist, @file{out.m3u8}, and segment files:
@file{20160215/file-20160215-1455569023.ts}, @file{20160215/file-20160215-1455569024.ts}, etc.
  318. @item hls_key_info_file @var{key_info_file}
  319. Use the information in @var{key_info_file} for segment encryption. The first
  320. line of @var{key_info_file} specifies the key URI written to the playlist. The
  321. key URL is used to access the encryption key during playback. The second line
  322. specifies the path to the key file used to obtain the key during the encryption
  323. process. The key file is read as a single packed array of 16 octets in binary
  324. format. The optional third line specifies the initialization vector (IV) as a
  325. hexadecimal string to be used instead of the segment sequence number (default)
  326. for encryption. Changes to @var{key_info_file} will result in segment
  327. encryption with the new key/IV and an entry in the playlist for the new key
  328. URI/IV.
  329. Key info file format:
  330. @example
  331. @var{key URI}
  332. @var{key file path}
  333. @var{IV} (optional)
  334. @end example
  335. Example key URIs:
  336. @example
  337. http://server/file.key
  338. /path/to/file.key
  339. file.key
  340. @end example
  341. Example key file paths:
  342. @example
  343. file.key
  344. /path/to/file.key
  345. @end example
  346. Example IV:
  347. @example
  348. 0123456789ABCDEF0123456789ABCDEF
  349. @end example
  350. Key info file example:
  351. @example
  352. http://server/file.key
  353. /path/to/file.key
  354. 0123456789ABCDEF0123456789ABCDEF
  355. @end example
  356. Example shell script:
  357. @example
  358. #!/bin/sh
  359. BASE_URL=$@{1:-'.'@}
  360. openssl rand 16 > file.key
  361. echo $BASE_URL/file.key > file.keyinfo
  362. echo file.key >> file.keyinfo
  363. echo $(openssl rand -hex 16) >> file.keyinfo
  364. ffmpeg -f lavfi -re -i testsrc -c:v h264 -hls_flags delete_segments \
  365. -hls_key_info_file file.keyinfo out.m3u8
  366. @end example
  367. @item hls_flags single_file
  368. If this flag is set, the muxer will store all segments in a single MPEG-TS
file, and will use byte ranges in the playlist. HLS playlists generated
this way will have the version number 4.
  371. For example:
  372. @example
  373. ffmpeg -i in.nut -hls_flags single_file out.m3u8
  374. @end example
  375. Will produce the playlist, @file{out.m3u8}, and a single segment file,
  376. @file{out.ts}.
  377. @item hls_flags delete_segments
  378. Segment files removed from the playlist are deleted after a period of time
  379. equal to the duration of the segment plus the duration of the playlist.
  380. @item hls_flags round_durations
  381. Round the duration info in the playlist file segment info to integer
  382. values, instead of using floating point.
@item hls_flags discont_start
Add the @code{#EXT-X-DISCONTINUITY} tag to the playlist, before the
first segment's information.
@item hls_flags omit_endlist
Do not append the @code{#EXT-X-ENDLIST} tag at the end of the playlist.
  388. @item hls_flags split_by_time
  389. Allow segments to start on frames other than keyframes. This improves
  390. behavior on some players when the time between keyframes is inconsistent,
  391. but may make things worse on others, and can cause some oddities during
  392. seeking. This flag should be used with the @code{hls_time} option.
  393. @item hls_playlist_type event
  394. Emit @code{#EXT-X-PLAYLIST-TYPE:EVENT} in the m3u8 header. Forces
  395. @option{hls_list_size} to 0; the playlist can only be appended to.
  396. @item hls_playlist_type vod
  397. Emit @code{#EXT-X-PLAYLIST-TYPE:VOD} in the m3u8 header. Forces
  398. @option{hls_list_size} to 0; the playlist must not change.
  399. @item method
Use the given HTTP method to create the HLS files.
  401. @example
  402. ffmpeg -re -i in.ts -f hls -method PUT http://example.com/live/out.m3u8
  403. @end example
This example will upload all the MPEG-TS segment files to the HTTP
server using the HTTP PUT method, and update the m3u8 playlist files,
using the same method, every time they are refreshed.
  407. Note that the HTTP server must support the given method for uploading
  408. files.
  409. @end table
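Putting several of the above options together, a sketch producing 4-second
segments, a rolling playlist of 6 entries and custom segment names (all values
are illustrative) could be:
@example
ffmpeg -i in.nut -hls_time 4 -hls_list_size 6 -hls_segment_filename 'seg%03d.ts' out.m3u8
@end example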
  410. @anchor{ico}
  411. @section ico
  412. ICO file muxer.
  413. Microsoft's icon file format (ICO) has some strict limitations that should be noted:
  414. @itemize
  415. @item
  416. Size cannot exceed 256 pixels in any dimension
  417. @item
  418. Only BMP and PNG images can be stored
  419. @item
  420. If a BMP image is used, it must be one of the following pixel formats:
  421. @example
BMP Bit Depth   FFmpeg Pixel Format
1bit            pal8
4bit            pal8
8bit            pal8
16bit           rgb555le
24bit           bgr24
32bit           bgra
  429. @end example
  430. @item
  431. If a BMP image is used, it must use the BITMAPINFOHEADER DIB header
  432. @item
  433. If a PNG image is used, it must use the rgba pixel format
  434. @end itemize
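Given these constraints, a minimal sketch that converts a picture to an icon,
assuming an input image @file{logo.png} and using the scale filter to respect
the 256-pixel limit, could be:
@example
ffmpeg -i logo.png -vf scale=256:256 -f ico favicon.ico
@end example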
  435. @anchor{image2}
  436. @section image2
  437. Image file muxer.
  438. The image file muxer writes video frames to image files.
  439. The output filenames are specified by a pattern, which can be used to
  440. produce sequentially numbered series of files.
The pattern may contain the string "%d" or "%0@var{N}d", which
specifies the position of the characters representing a sequential
number in the filenames. If the form "%0@var{N}d" is used, the string
representing the number in each filename is 0-padded to @var{N}
digits. The literal character '%' can be specified in the pattern with
the string "%%".
If the pattern contains "%d" or "%0@var{N}d", the first filename will
contain the number 1, and all the following numbers will be
sequential.
  450. The pattern may contain a suffix which is used to automatically
  451. determine the format of the image files to write.
  452. For example the pattern "img-%03d.bmp" will specify a sequence of
  453. filenames of the form @file{img-001.bmp}, @file{img-002.bmp}, ...,
  454. @file{img-010.bmp}, etc.
  455. The pattern "img%%-%d.jpg" will specify a sequence of filenames of the
  456. form @file{img%-1.jpg}, @file{img%-2.jpg}, ..., @file{img%-10.jpg},
  457. etc.
  458. @subsection Examples
  459. The following example shows how to use @command{ffmpeg} for creating a
  460. sequence of files @file{img-001.jpeg}, @file{img-002.jpeg}, ...,
  461. taking one image every second from the input video:
  462. @example
  463. ffmpeg -i in.avi -vsync 1 -r 1 -f image2 'img-%03d.jpeg'
  464. @end example
  465. Note that with @command{ffmpeg}, if the format is not specified with the
  466. @code{-f} option and the output filename specifies an image file
  467. format, the image2 muxer is automatically selected, so the previous
  468. command can be written as:
  469. @example
  470. ffmpeg -i in.avi -vsync 1 -r 1 'img-%03d.jpeg'
  471. @end example
Note also that the pattern does not necessarily have to contain "%d" or
"%0@var{N}d"; for example, to create a single image file
@file{img.jpeg} from the input video you can employ the command:
  475. @example
  476. ffmpeg -i in.avi -f image2 -frames:v 1 img.jpeg
  477. @end example
  478. The @option{strftime} option allows you to expand the filename with
  479. date and time information. Check the documentation of
  480. the @code{strftime()} function for the syntax.
  481. For example to generate image files from the @code{strftime()}
  482. "%Y-%m-%d_%H-%M-%S" pattern, the following @command{ffmpeg} command
  483. can be used:
  484. @example
  485. ffmpeg -f v4l2 -r 1 -i /dev/video0 -f image2 -strftime 1 "%Y-%m-%d_%H-%M-%S.jpg"
  486. @end example
  487. @subsection Options
  488. @table @option
  489. @item start_number
  490. Start the sequence from the specified number. Default value is 0.
  491. @item update
  492. If set to 1, the filename will always be interpreted as just a
  493. filename, not a pattern, and the corresponding file will be continuously
  494. overwritten with new images. Default value is 0.
  495. @item strftime
  496. If set to 1, expand the filename with date and time information from
  497. @code{strftime()}. Default value is 0.
  498. @end table
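As an illustration of the @option{update} option, a sketch that keeps
overwriting a single file @file{preview.jpg} with the most recent frame, one
frame per second, might be:
@example
ffmpeg -i in.avi -r 1 -f image2 -update 1 preview.jpg
@end example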
The image muxer supports the .Y.U.V image file format. This format is
special in that each image frame consists of three files, one for
each of the YUV420P components. To read or write this image file format,
  502. specify the name of the '.Y' file. The muxer will automatically open the
  503. '.U' and '.V' files as required.
  504. @section matroska
  505. Matroska container muxer.
  506. This muxer implements the matroska and webm container specs.
  507. @subsection Metadata
  508. The recognized metadata settings in this muxer are:
  509. @table @option
  510. @item title
  511. Set title name provided to a single track.
  512. @item language
  513. Specify the language of the track in the Matroska languages form.
  514. The language can be either the 3 letters bibliographic ISO-639-2 (ISO
  515. 639-2/B) form (like "fre" for French), or a language code mixed with a
  516. country code for specialities in languages (like "fre-ca" for Canadian
  517. French).
  518. @item stereo_mode
  519. Set stereo 3D video layout of two views in a single video track.
  520. The following values are recognized:
  521. @table @samp
  522. @item mono
  523. video is not stereo
  524. @item left_right
  525. Both views are arranged side by side, Left-eye view is on the left
  526. @item bottom_top
  527. Both views are arranged in top-bottom orientation, Left-eye view is at bottom
  528. @item top_bottom
  529. Both views are arranged in top-bottom orientation, Left-eye view is on top
  530. @item checkerboard_rl
  531. Each view is arranged in a checkerboard interleaved pattern, Left-eye view being first
  532. @item checkerboard_lr
  533. Each view is arranged in a checkerboard interleaved pattern, Right-eye view being first
  534. @item row_interleaved_rl
  535. Each view is constituted by a row based interleaving, Right-eye view is first row
  536. @item row_interleaved_lr
  537. Each view is constituted by a row based interleaving, Left-eye view is first row
  538. @item col_interleaved_rl
  539. Both views are arranged in a column based interleaving manner, Right-eye view is first column
  540. @item col_interleaved_lr
  541. Both views are arranged in a column based interleaving manner, Left-eye view is first column
  542. @item anaglyph_cyan_red
  543. All frames are in anaglyph format viewable through red-cyan filters
  544. @item right_left
  545. Both views are arranged side by side, Right-eye view is on the left
  546. @item anaglyph_green_magenta
  547. All frames are in anaglyph format viewable through green-magenta filters
  548. @item block_lr
  549. Both eyes laced in one Block, Left-eye view is first
  550. @item block_rl
  551. Both eyes laced in one Block, Right-eye view is first
  552. @end table
  553. @end table
  554. For example a 3D WebM clip can be created using the following command line:
  555. @example
  556. ffmpeg -i sample_left_right_clip.mpg -an -c:v libvpx -metadata stereo_mode=left_right -y stereo_clip.webm
  557. @end example
  558. @subsection Options
  559. This muxer supports the following options:
  560. @table @option
  561. @item reserve_index_space
  562. By default, this muxer writes the index for seeking (called cues in Matroska
  563. terms) at the end of the file, because it cannot know in advance how much space
  564. to leave for the index at the beginning of the file. However for some use cases
  565. -- e.g. streaming where seeking is possible but slow -- it is useful to put the
  566. index at the beginning of the file.
  567. If this option is set to a non-zero value, the muxer will reserve a given amount
  568. of space in the file header and then try to write the cues there when the muxing
  569. finishes. If the available space does not suffice, muxing will fail. A safe size
  570. for most use cases should be about 50kB per hour of video.
  571. Note that cues are only written if the output is seekable and this option will
  572. have no effect if it is not.
  573. @end table
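For instance, following the 50kB-per-hour guideline above, a sketch reserving
cue space for roughly a two-hour remux could be:
@example
ffmpeg -i in.mkv -c copy -reserve_index_space 100000 out.mkv
@end example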
  574. @anchor{md5}
  575. @section md5
  576. MD5 testing format.
  577. This is a variant of the @ref{hash} muxer. Unlike that muxer, it
  578. defaults to using the MD5 hash function.
  579. @subsection Examples
  580. To compute the MD5 hash of the input converted to raw
  581. audio and video, and store it in the file @file{out.md5}:
  582. @example
  583. ffmpeg -i INPUT -f md5 out.md5
  584. @end example
  585. You can print the MD5 to stdout with the command:
  586. @example
  587. ffmpeg -i INPUT -f md5 -
  588. @end example
  589. See also the @ref{hash} and @ref{framemd5} muxers.
  590. @section mov, mp4, ismv
  591. MOV/MP4/ISMV (Smooth Streaming) muxer.
  592. The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4
  593. file has all the metadata about all packets stored in one location
  594. (written at the end of the file, it can be moved to the start for
  595. better playback by adding @var{faststart} to the @var{movflags}, or
  596. using the @command{qt-faststart} tool). A fragmented
  597. file consists of a number of fragments, where packets and metadata
  598. about these packets are stored together. Writing a fragmented
  599. file has the advantage that the file is decodable even if the
  600. writing is interrupted (while a normal MOV/MP4 is undecodable if
  601. it is not properly finished), and it requires less memory when writing
  602. very long files (since writing normal MOV/MP4 files stores info about
  603. every single packet in memory until the file is closed). The downside
  604. is that it is less compatible with other applications.
  605. @subsection Options
  606. Fragmentation is enabled by setting one of the AVOptions that define
  607. how to cut the file into fragments:
  608. @table @option
  609. @item -moov_size @var{bytes}
  610. Reserves space for the moov atom at the beginning of the file instead of placing the
  611. moov atom at the end. If the space reserved is insufficient, muxing will fail.
  612. @item -movflags frag_keyframe
  613. Start a new fragment at each video keyframe.
  614. @item -frag_duration @var{duration}
  615. Create fragments that are @var{duration} microseconds long.
  616. @item -frag_size @var{size}
  617. Create fragments that contain up to @var{size} bytes of payload data.
  618. @item -movflags frag_custom
  619. Allow the caller to manually choose when to cut fragments, by
  620. calling @code{av_write_frame(ctx, NULL)} to write a fragment with
  621. the packets written so far. (This is only useful with other
  622. applications integrating libavformat, not from @command{ffmpeg}.)
  623. @item -min_frag_duration @var{duration}
  624. Don't create fragments that are shorter than @var{duration} microseconds long.
  625. @end table
  626. If more than one condition is specified, fragments are cut when
  627. one of the specified conditions is fulfilled. The exception to this is
  628. @code{-min_frag_duration}, which has to be fulfilled for any of the other
  629. conditions to apply.
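For instance, a minimal sketch that produces a fragmented MP4 by cutting a new
fragment at each video keyframe, using only the options described above, could
be:
@example
ffmpeg -i INPUT -c copy -movflags frag_keyframe fragmented.mp4
@end example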
  630. Additionally, the way the output file is written can be adjusted
  631. through a few other options:
  632. @table @option
  633. @item -movflags empty_moov
  634. Write an initial moov atom directly at the start of the file, without
  635. describing any samples in it. Generally, an mdat/moov pair is written
  636. at the start of the file, as a normal MOV/MP4 file, containing only
  637. a short portion of the file. With this option set, there is no initial
  638. mdat atom, and the moov atom only describes the tracks but has
  639. a zero duration.
  640. This option is implicitly set when writing ismv (Smooth Streaming) files.
  641. @item -movflags separate_moof
  642. Write a separate moof (movie fragment) atom for each track. Normally,
  643. packets for all tracks are written in a moof atom (which is slightly
  644. more efficient), but with this option set, the muxer writes one moof/mdat
  645. pair for each track, making it easier to separate tracks.
  646. This option is implicitly set when writing ismv (Smooth Streaming) files.
  647. @item -movflags faststart
  648. Run a second pass moving the index (moov atom) to the beginning of the file.
  649. This operation can take a while, and will not work in various situations such
  650. as fragmented output, thus it is not enabled by default.
  651. @item -movflags rtphint
  652. Add RTP hinting tracks to the output file.
  653. @item -movflags disable_chpl
  654. Disable Nero chapter markers (chpl atom). Normally, both Nero chapters
  655. and a QuickTime chapter track are written to the file. With this option
  656. set, only the QuickTime chapter track will be written. Nero chapters can
cause failures when the file is reprocessed with certain tagging programs, like
mp3Tag 2.61a and iTunes 11.3; most likely other versions are affected as well.
  659. @item -movflags omit_tfhd_offset
  660. Do not write any absolute base_data_offset in tfhd atoms. This avoids
  661. tying fragments to absolute byte positions in the file/streams.
  662. @item -movflags default_base_moof
Similarly to omit_tfhd_offset, this flag avoids writing the
absolute base_data_offset field in tfhd atoms, but does so by using
the new default-base-is-moof flag instead. This flag is new in
ISO/IEC 14496-12:2012. This may make the fragments easier to parse in certain
  667. circumstances (avoiding basing track fragment location calculations
  668. on the implicit end of the previous track fragment).
  669. @end table
  670. @subsection Example
  671. Smooth Streaming content can be pushed in real time to a publishing
  672. point on IIS with this muxer. Example:
  673. @example
  674. ffmpeg -re @var{<normal input/transcoding options>} -movflags isml+frag_keyframe -f ismv http://server/publishingpoint.isml/Streams(Encoder1)
  675. @end example
  676. @subsection Audible AAX
Audible AAX files are encrypted M4B files, and they can be decrypted by specifying a 4-byte activation secret.
  678. @example
  679. ffmpeg -activation_bytes 1CEB00DA -i test.aax -vn -c:a copy output.mp4
  680. @end example
  681. @section mp3
  682. The MP3 muxer writes a raw MP3 stream with the following optional features:
  683. @itemize @bullet
  684. @item
  685. An ID3v2 metadata header at the beginning (enabled by default). Versions 2.3 and
2.4 are supported; the @code{id3v2_version} private option controls which one is
  687. used (3 or 4). Setting @code{id3v2_version} to 0 disables the ID3v2 header
  688. completely.
  689. The muxer supports writing attached pictures (APIC frames) to the ID3v2 header.
The pictures are supplied to the muxer in the form of a video stream with a single
  691. packet. There can be any number of those streams, each will correspond to a
  692. single APIC frame. The stream metadata tags @var{title} and @var{comment} map
  693. to APIC @var{description} and @var{picture type} respectively. See
  694. @url{http://id3.org/id3v2.4.0-frames} for allowed picture types.
  695. Note that the APIC frames must be written at the beginning, so the muxer will
  696. buffer the audio frames until it gets all the pictures. It is therefore advised
  697. to provide the pictures as soon as possible to avoid excessive buffering.
  698. @item
  699. A Xing/LAME frame right after the ID3v2 header (if present). It is enabled by
  700. default, but will be written only if the output is seekable. The
  701. @code{write_xing} private option can be used to disable it. The frame contains
  702. various information that may be useful to the decoder, like the audio duration
  703. or encoder delay.
  704. @item
  705. A legacy ID3v1 tag at the end of the file (disabled by default). It may be
  706. enabled with the @code{write_id3v1} private option, but as its capabilities are
  707. very limited, its usage is not recommended.
  708. @end itemize
  709. Examples:
  710. Write an mp3 with an ID3v2.3 header and an ID3v1 footer:
  711. @example
  712. ffmpeg -i INPUT -id3v2_version 3 -write_id3v1 1 out.mp3
  713. @end example
  714. To attach a picture to an mp3 file select both the audio and the picture stream
  715. with @code{map}:
  716. @example
  717. ffmpeg -i input.mp3 -i cover.png -c copy -map 0 -map 1
  718. -metadata:s:v title="Album cover" -metadata:s:v comment="Cover (Front)" out.mp3
  719. @end example
  720. Write a "clean" MP3 without any extra features:
  721. @example
  722. ffmpeg -i input.wav -write_xing 0 -id3v2_version 0 out.mp3
  723. @end example
  724. @section mpegts
  725. MPEG transport stream muxer.
  726. This muxer implements ISO 13818-1 and part of ETSI EN 300 468.
  727. The recognized metadata settings in mpegts muxer are @code{service_provider}
  728. and @code{service_name}. If they are not set the default for
  729. @code{service_provider} is "FFmpeg" and the default for
  730. @code{service_name} is "Service01".
  731. @subsection Options
  732. The muxer options are:
  733. @table @option
  734. @item mpegts_original_network_id @var{number}
Set the original_network_id (default 0x0001). This is the unique identifier
  736. of a network in DVB. Its main use is in the unique identification of a
  737. service through the path Original_Network_ID, Transport_Stream_ID.
  738. @item mpegts_transport_stream_id @var{number}
  739. Set the transport_stream_id (default 0x0001). This identifies a
  740. transponder in DVB.
  741. @item mpegts_service_id @var{number}
  742. Set the service_id (default 0x0001) also known as program in DVB.
  743. @item mpegts_service_type @var{number}
Set the program service_type (default @var{digital_tv}); see below
for a list of predefined values.
  746. @item mpegts_pmt_start_pid @var{number}
  747. Set the first PID for PMT (default 0x1000, max 0x1f00).
  748. @item mpegts_start_pid @var{number}
  749. Set the first PID for data packets (default 0x0100, max 0x0f00).
  750. @item mpegts_m2ts_mode @var{number}
  751. Enable m2ts mode if set to 1. Default value is -1 which disables m2ts mode.
  752. @item muxrate @var{number}
  753. Set a constant muxrate (default VBR).
@item pcr_period @var{number}
  755. Override the default PCR retransmission time (default 20ms), ignored
  756. if variable muxrate is selected.
  757. @item pat_period @var{number}
  758. Maximal time in seconds between PAT/PMT tables.
  759. @item sdt_period @var{number}
  760. Maximal time in seconds between SDT tables.
  761. @item pes_payload_size @var{number}
  762. Set minimum PES packet payload in bytes.
  763. @item mpegts_flags @var{flags}
  764. Set flags (see below).
  765. @item mpegts_copyts @var{number}
  766. Preserve original timestamps, if value is set to 1. Default value is -1, which
  767. results in shifting timestamps so that they start from 0.
  768. @item tables_version @var{number}
Set PAT, PMT and SDT version (default 0, valid values are from 0 to 31, inclusive).
This option allows updating the stream structure so that a standard consumer may
detect the change. To do so, reopen the output AVFormatContext (in case of API
usage) or restart the ffmpeg instance, cyclically changing the tables_version value:
  773. @example
  774. ffmpeg -i source1.ts -codec copy -f mpegts -tables_version 0 udp://1.1.1.1:1111
  775. ffmpeg -i source2.ts -codec copy -f mpegts -tables_version 1 udp://1.1.1.1:1111
  776. ...
  777. ffmpeg -i source3.ts -codec copy -f mpegts -tables_version 31 udp://1.1.1.1:1111
  778. ffmpeg -i source1.ts -codec copy -f mpegts -tables_version 0 udp://1.1.1.1:1111
  779. ffmpeg -i source2.ts -codec copy -f mpegts -tables_version 1 udp://1.1.1.1:1111
  780. ...
  781. @end example
  782. @end table
  783. Option @option{mpegts_service_type} accepts the following values:
  784. @table @option
  785. @item hex_value
Any hexadecimal value between 0x01 and 0xff, as defined in ETSI EN 300 468.
  787. @item digital_tv
  788. Digital TV service.
  789. @item digital_radio
  790. Digital Radio service.
  791. @item teletext
  792. Teletext service.
  793. @item advanced_codec_digital_radio
  794. Advanced Codec Digital Radio service.
  795. @item mpeg2_digital_hdtv
  796. MPEG2 Digital HDTV service.
  797. @item advanced_codec_digital_sdtv
  798. Advanced Codec Digital SDTV service.
  799. @item advanced_codec_digital_hdtv
  800. Advanced Codec Digital HDTV service.
  801. @end table
  802. Option @option{mpegts_flags} may take a set of such flags:
  803. @table @option
  804. @item resend_headers
  805. Reemit PAT/PMT before writing the next packet.
  806. @item latm
  807. Use LATM packetization for AAC.
  808. @item pat_pmt_at_frames
  809. Reemit PAT and PMT at each video frame.
  810. @item system_b
  811. Conform to System B (DVB) instead of System A (ATSC).
  812. @end table
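For example, a sketch that re-emits the PAT/PMT before every video frame while
copying the input streams could look like:
@example
ffmpeg -i INPUT -c copy -f mpegts -mpegts_flags pat_pmt_at_frames out.ts
@end example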
  813. @subsection Example
  814. @example
  815. ffmpeg -i file.mpg -c copy \
  816. -mpegts_original_network_id 0x1122 \
  817. -mpegts_transport_stream_id 0x3344 \
  818. -mpegts_service_id 0x5566 \
  819. -mpegts_pmt_start_pid 0x1500 \
  820. -mpegts_start_pid 0x150 \
  821. -metadata service_provider="Some provider" \
  822. -metadata service_name="Some Channel" \
  823. -y out.ts
  824. @end example
  825. @section mxf, mxf_d10
  826. MXF muxer.
  827. @subsection Options
  828. The muxer options are:
  829. @table @option
  830. @item store_user_comments @var{bool}
Set whether user comments should be stored if available, or never stored.
IRT D-10 does not allow user comments. The default is thus to write them for
mxf but not for mxf_d10.
  834. @end table
  835. @section null
  836. Null muxer.
  837. This muxer does not generate any output file, it is mainly useful for
  838. testing or benchmarking purposes.
  839. For example to benchmark decoding with @command{ffmpeg} you can use the
  840. command:
  841. @example
  842. ffmpeg -benchmark -i INPUT -f null out.null
  843. @end example
  844. Note that the above command does not read or write the @file{out.null}
  845. file, but specifying the output file is required by the @command{ffmpeg}
  846. syntax.
  847. Alternatively you can write the command as:
  848. @example
  849. ffmpeg -benchmark -i INPUT -f null -
  850. @end example
  851. @section nut
  852. @table @option
  853. @item -syncpoints @var{flags}
  854. Change the syncpoint usage in nut:
  855. @table @option
  856. @item @var{default} use the normal low-overhead seeking aids.
  857. @item @var{none} do not use the syncpoints at all, reducing the overhead but making the stream non-seekable;
Use of this option is not recommended, as the resulting files are very
damage-sensitive and seeking is not possible. Also, in general the overhead from
syncpoints is negligible. Note that @code{-write_index 0} can be used to disable
all growing data tables, allowing endless streams to be muxed with limited memory
and without these disadvantages.
  863. @item @var{timestamped} extend the syncpoint with a wallclock field.
  864. @end table
  865. The @var{none} and @var{timestamped} flags are experimental.
  866. @item -write_index @var{bool}
Write the index at the end; writing an index is the default.
  868. @end table
  869. @example
  870. ffmpeg -i INPUT -f_strict experimental -syncpoints none - | processor
  871. @end example
  872. @section ogg
  873. Ogg container muxer.
  874. @table @option
  875. @item -page_duration @var{duration}
  876. Preferred page duration, in microseconds. The muxer will attempt to create
  877. pages that are approximately @var{duration} microseconds long. This allows the
  878. user to compromise between seek granularity and container overhead. The default
  879. is 1 second. A value of 0 will fill all segments, making pages as large as
  880. possible. A value of 1 will effectively use 1 packet-per-page in most
  881. situations, giving a small seek granularity at the cost of additional container
  882. overhead.
  883. @item -serial_offset @var{value}
Serial value from which to set the streams' serial numbers.
  885. Setting it to different and sufficiently large values ensures that the produced
  886. ogg files can be safely chained.
  887. @end table
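As a sketch, assuming libvorbis is available, an Ogg file with pages of roughly
100 milliseconds (100000 microseconds) could be produced with:
@example
ffmpeg -i INPUT -c:a libvorbis -page_duration 100000 out.ogg
@end example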
  888. @anchor{segment}
  889. @section segment, stream_segment, ssegment
  890. Basic stream segmenter.
  891. This muxer outputs streams to a number of separate files of nearly
  892. fixed duration. Output filename pattern can be set in a fashion
  893. similar to @ref{image2}, or by using a @code{strftime} template if
  894. the @option{strftime} option is enabled.
  895. @code{stream_segment} is a variant of the muxer used to write to
  896. streaming output formats, i.e. which do not require global headers,
  897. and is recommended for outputting e.g. to MPEG transport stream segments.
  898. @code{ssegment} is a shorter alias for @code{stream_segment}.
  899. Every segment starts with a keyframe of the selected reference stream,
  900. which is set through the @option{reference_stream} option.
  901. Note that if you want accurate splitting for a video file, you need to
  902. make the input key frames correspond to the exact splitting times
  903. expected by the segmenter, or the segment muxer will start the new
  904. segment with the key frame found next after the specified start
  905. time.
  906. The segment muxer works best with a single constant frame rate video.
  907. Optionally it can generate a list of the created segments, by setting
  908. the option @var{segment_list}. The list type is specified by the
  909. @var{segment_list_type} option. The entry filenames in the segment
  910. list are set by default to the basename of the corresponding segment
  911. files.
  912. See also the @ref{hls} muxer, which provides a more specific
  913. implementation for HLS segmentation.
  914. @subsection Options
  915. The segment muxer supports the following options:
  916. @table @option
  917. @item increment_tc @var{1|0}
If set to @code{1}, increment the timecode between each segment.
If this is selected, the input needs to have
a timecode in the first video stream. Default value is
@code{0}.
  922. @item reference_stream @var{specifier}
  923. Set the reference stream, as specified by the string @var{specifier}.
  924. If @var{specifier} is set to @code{auto}, the reference is chosen
  925. automatically. Otherwise it must be a stream specifier (see the ``Stream
  926. specifiers'' chapter in the ffmpeg manual) which specifies the
  927. reference stream. The default value is @code{auto}.
  928. @item segment_format @var{format}
Override the inner container format; by default it is guessed from the filename
extension.
  931. @item segment_format_options @var{options_list}
  932. Set output format options using a :-separated list of key=value
  933. parameters. Values containing the @code{:} special character must be
  934. escaped.
  935. @item segment_list @var{name}
Also generate a list file named @var{name}. If not specified, no
list file is generated.
  938. @item segment_list_flags @var{flags}
  939. Set flags affecting the segment list generation.
  940. It currently supports the following flags:
  941. @table @samp
  942. @item cache
  943. Allow caching (only affects M3U8 list files).
  944. @item live
  945. Allow live-friendly file generation.
  946. @end table
  947. @item segment_list_size @var{size}
  948. Update the list file so that it contains at most @var{size}
  949. segments. If 0 the list file will contain all the segments. Default
  950. value is 0.
  951. @item segment_list_entry_prefix @var{prefix}
  952. Prepend @var{prefix} to each entry. Useful to generate absolute paths.
  953. By default no prefix is applied.
  954. @item segment_list_type @var{type}
  955. Select the listing format.
  956. The following values are recognized:
  957. @table @samp
  958. @item flat
  959. Generate a flat list for the created segments, one segment per line.
  960. @item csv, ext
  961. Generate a list for the created segments, one segment per line,
  962. each line matching the format (comma-separated values):
  963. @example
  964. @var{segment_filename},@var{segment_start_time},@var{segment_end_time}
  965. @end example
  966. @var{segment_filename} is the name of the output file generated by the
  967. muxer according to the provided pattern. CSV escaping (according to
  968. RFC4180) is applied if required.
  969. @var{segment_start_time} and @var{segment_end_time} specify
  970. the segment start and end time expressed in seconds.
  971. A list file with the suffix @code{".csv"} or @code{".ext"} will
  972. auto-select this format.
@samp{ext} is deprecated in favor of @samp{csv}.
  974. @item ffconcat
  975. Generate an ffconcat file for the created segments. The resulting file
  976. can be read using the FFmpeg @ref{concat} demuxer.
  977. A list file with the suffix @code{".ffcat"} or @code{".ffconcat"} will
  978. auto-select this format.
  979. @item m3u8
  980. Generate an extended M3U8 file, version 3, compliant with
  981. @url{http://tools.ietf.org/id/draft-pantos-http-live-streaming}.
  982. A list file with the suffix @code{".m3u8"} will auto-select this format.
  983. @end table
  984. If not specified the type is guessed from the list file name suffix.
  985. @item segment_time @var{time}
  986. Set segment duration to @var{time}, the value must be a duration
  987. specification. Default value is "2". See also the
  988. @option{segment_times} option.
  989. Note that splitting may not be accurate, unless you force the
  990. reference stream key-frames at the given time. See the introductory
  991. notice and the examples below.
  992. @item segment_atclocktime @var{1|0}
  993. If set to "1" split at regular clock time intervals starting from 00:00
  994. o'clock. The @var{time} value specified in @option{segment_time} is
  995. used for setting the length of the splitting interval.
  996. For example with @option{segment_time} set to "900" this makes it possible
  997. to create files at 12:00 o'clock, 12:15, 12:30, etc.
  998. Default value is "0".
  999. @item segment_clocktime_offset @var{duration}
  1000. Delay the segment splitting times with the specified duration when using
  1001. @option{segment_atclocktime}.
  1002. For example with @option{segment_time} set to "900" and
  1003. @option{segment_clocktime_offset} set to "300" this makes it possible to
  1004. create files at 12:05, 12:20, 12:35, etc.
  1005. Default value is "0".
  1006. @item segment_clocktime_wrap_duration @var{duration}
  1007. Force the segmenter to only start a new segment if a packet reaches the muxer
  1008. within the specified duration after the segmenting clock time. This way you
  1009. can make the segmenter more resilient to backward local time jumps, such as
  1010. leap seconds or transition to standard time from daylight savings time.
  1011. Assuming that the delay between the packets of your source is less than 0.5
  1012. second you can detect a leap second by specifying 0.5 as the duration.
  1013. Default is the maximum possible duration which means starting a new segment
  1014. regardless of the elapsed time since the last clock time.
  1015. @item segment_time_delta @var{delta}
  1016. Specify the accuracy time when selecting the start time for a
  1017. segment, expressed as a duration specification. Default value is "0".
  1018. When delta is specified a key-frame will start a new segment if its
  1019. PTS satisfies the relation:
  1020. @example
  1021. PTS >= start_time - time_delta
  1022. @end example
  1023. This option is useful when splitting video content, which is always
  1024. split at GOP boundaries, in case a key frame is found just before the
  1025. specified split time.
In particular, it may be used in combination with the @command{ffmpeg}
option @option{force_key_frames}. The key frame times specified by
@option{force_key_frames} may not be set accurately because of rounding
issues, with the consequence that a key frame time may end up set just
before the specified time. For constant frame rate videos a value of
1/(2*@var{frame_rate}) should address the worst case mismatch between
the specified time and the time set by @option{force_key_frames}.
  1033. @item segment_times @var{times}
  1034. Specify a list of split points. @var{times} contains a list of comma
  1035. separated duration specifications, in increasing order. See also
  1036. the @option{segment_time} option.
  1037. @item segment_frames @var{frames}
  1038. Specify a list of split video frame numbers. @var{frames} contains a
  1039. list of comma separated integer numbers, in increasing order.
  1040. This option specifies to start a new segment whenever a reference
  1041. stream key frame is found and the sequential number (starting from 0)
  1042. of the frame is greater or equal to the next value in the list.
  1043. @item segment_wrap @var{limit}
  1044. Wrap around segment index once it reaches @var{limit}.
  1045. @item segment_start_number @var{number}
  1046. Set the sequence number of the first segment. Defaults to @code{0}.
  1047. @item strftime @var{1|0}
  1048. Use the @code{strftime} function to define the name of the new
  1049. segments to write. If this is selected, the output segment name must
  1050. contain a @code{strftime} function template. Default value is
  1051. @code{0}.
  1052. @item break_non_keyframes @var{1|0}
  1053. If enabled, allow segments to start on frames other than keyframes. This
  1054. improves behavior on some players when the time between keyframes is
  1055. inconsistent, but may make things worse on others, and can cause some oddities
  1056. during seeking. Defaults to @code{0}.
  1057. @item reset_timestamps @var{1|0}
Reset timestamps at the beginning of each segment, so that each segment
  1059. will start with near-zero timestamps. It is meant to ease the playback
  1060. of the generated segments. May not work with some combinations of
  1061. muxers/codecs. It is set to @code{0} by default.
  1062. @item initial_offset @var{offset}
  1063. Specify timestamp offset to apply to the output packet timestamps. The
  1064. argument must be a time duration specification, and defaults to 0.
  1065. @item write_empty_segments @var{1|0}
  1066. If enabled, write an empty segment if there are no packets during the period a
  1067. segment would usually span. Otherwise, the segment will be filled with the next
  1068. packet written. Defaults to @code{0}.
  1069. @end table
  1070. @subsection Examples
  1071. @itemize
  1072. @item
  1073. Remux the content of file @file{in.mkv} to a list of segments
  1074. @file{out-000.nut}, @file{out-001.nut}, etc., and write the list of
  1075. generated segments to @file{out.list}:
  1076. @example
  1077. ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.list out%03d.nut
  1078. @end example
  1079. @item
  1080. Segment input and set output format options for the output segments:
  1081. @example
  1082. ffmpeg -i in.mkv -f segment -segment_time 10 -segment_format_options movflags=+faststart out%03d.mp4
  1083. @end example
  1084. @item
  1085. Segment the input file according to the split points specified by the
  1086. @var{segment_times} option:
  1087. @example
  1088. ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 out%03d.nut
  1089. @end example
  1090. @item
  1091. Use the @command{ffmpeg} @option{force_key_frames}
option to force key frames in the input at the specified locations, together
with the segment option @option{segment_time_delta} to account for
possible rounding performed when setting key frame times.
  1095. @example
  1096. ffmpeg -i in.mkv -force_key_frames 1,2,3,5,8,13,21 -codec:v mpeg4 -codec:a pcm_s16le -map 0 \
  1097. -f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 -segment_time_delta 0.05 out%03d.nut
  1098. @end example
  1099. In order to force key frames on the input file, transcoding is
  1100. required.
  1101. @item
  1102. Segment the input file by splitting the input file according to the
  1103. frame numbers sequence specified with the @option{segment_frames} option:
  1104. @example
  1105. ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.csv -segment_frames 100,200,300,500,800 out%03d.nut
  1106. @end example
  1107. @item
  1108. Convert the @file{in.mkv} to TS segments using the @code{libx264}
  1109. and @code{libfaac} encoders:
  1110. @example
  1111. ffmpeg -i in.mkv -map 0 -codec:v libx264 -codec:a libfaac -f ssegment -segment_list out.list out%03d.ts
  1112. @end example
  1113. @item
  1114. Segment the input file, and create an M3U8 live playlist (can be used
  1115. as live HLS source):
  1116. @example
  1117. ffmpeg -re -i in.mkv -codec copy -map 0 -f segment -segment_list playlist.m3u8 \
  1118. -segment_list_flags +live -segment_time 10 out%03d.mkv
  1119. @end example
  1120. @end itemize
  1121. @section smoothstreaming
The Smooth Streaming muxer generates a set of files (Manifest, chunks) suitable for serving with a conventional web server.
  1123. @table @option
  1124. @item window_size
  1125. Specify the number of fragments kept in the manifest. Default 0 (keep all).
  1126. @item extra_window_size
  1127. Specify the number of fragments kept outside of the manifest before removing from disk. Default 5.
  1128. @item lookahead_count
  1129. Specify the number of lookahead fragments. Default 2.
  1130. @item min_frag_duration
  1131. Specify the minimum fragment duration (in microseconds). Default 5000000.
  1132. @item remove_at_exit
  1133. Specify whether to remove all fragments when finished. Default 0 (do not remove).
  1134. @end table
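As an illustration, a Smooth Streaming output could be produced with a
command along the following lines (the output directory
@file{/var/www/smooth} is only an example and must be adjusted to your
setup):
@example
ffmpeg -re -i in.mkv -map 0 -c:v libx264 -c:a aac -strict experimental \
-f smoothstreaming -window_size 10 -extra_window_size 5 /var/www/smooth
@end example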
  1135. @section tee
  1136. The tee muxer can be used to write the same data to several files or any
  1137. other kind of muxer. It can be used, for example, to both stream a video to
  1138. the network and save it to disk at the same time.
  1139. It is different from specifying several outputs to the @command{ffmpeg}
  1140. command-line tool because the audio and video data will be encoded only once
  1141. with the tee muxer; encoding can be a very expensive process. It is not
  1142. useful when using the libavformat API directly because it is then possible
  1143. to feed the same packets to several muxers directly.
  1144. The slave outputs are specified in the file name given to the muxer,
  1145. separated by '|'. If any of the slave names contains the '|' separator,
  1146. leading or trailing spaces, or any special character, it must be
  1147. escaped (see @ref{quoting_and_escaping,,the "Quoting and escaping"
  1148. section in the ffmpeg-utils(1) manual,ffmpeg-utils}).
  1149. Muxer options can be specified for each slave by prepending them as a list of
  1150. @var{key}=@var{value} pairs separated by ':', between square brackets. If
  1151. the option values contain a special character or the ':' separator, they
  1152. must be escaped; note that this is a second level of escaping.
  1153. The following special options are also recognized:
  1154. @table @option
  1155. @item f
  1156. Specify the format name. Useful if it cannot be guessed from the
  1157. output name suffix.
  1158. @item bsfs[/@var{spec}]
  1159. Specify a list of bitstream filters to apply to the specified
  1160. output.
  1161. It is possible to specify to which streams a given bitstream filter
  1162. applies, by appending a stream specifier to the option separated by
  1163. @code{/}. @var{spec} must be a stream specifier (see @ref{Format
  1164. stream specifiers}). If the stream specifier is not specified, the
  1165. bitstream filters will be applied to all streams in the output.
  1166. Several bitstream filters can be specified, separated by ",".
  1167. @item select
  1168. Select the streams that should be mapped to the slave output,
  1169. specified by a stream specifier. If not specified, this defaults to
  1170. all the input streams. You may use multiple stream specifiers
  1171. separated by commas (@code{,}), e.g. @code{a:0,v}.
  1172. @item onfail
  1173. Specify behaviour on output failure. This can be set to either @code{abort} (which is
  1174. the default) or @code{ignore}. @code{abort} will cause the whole process to fail in case
  1175. of a failure on this slave output. @code{ignore} will ignore the failure on this output,
  1176. so the other outputs will continue without being affected.
  1177. @end table
  1178. @subsection Examples
  1179. @itemize
  1180. @item
  1181. Encode something and both archive it in a Matroska file and stream it
  1182. as MPEG-TS over UDP (the streams need to be explicitly mapped):
  1183. @example
  1184. ffmpeg -i ... -c:v libx264 -c:a mp2 -f tee -map 0:v -map 0:a \
  1185.   "archive-20121107.mkv|[f=mpegts]udp://10.0.1.255:1234/"
  1186. @end example
  1187. @item
  1188. As above, but continue streaming even if output to the local file fails
  1189. (for example, if the local drive fills up):
  1190. @example
  1191. ffmpeg -i ... -c:v libx264 -c:a mp2 -f tee -map 0:v -map 0:a \
  1192.   "[onfail=ignore]archive-20121107.mkv|[f=mpegts]udp://10.0.1.255:1234/"
  1193. @end example
  1194. @item
  1195. Use @command{ffmpeg} to encode the input, and send the output
  1196. to three different destinations. The @code{dump_extra} bitstream
  1197. filter is used to add extradata information to all the output video
  1198. keyframes packets, as requested by the MPEG-TS format. The select
  1199. option is applied to @file{out.aac} in order to make it contain only
  1200. audio packets.
  1201. @example
  1202. ffmpeg -i ... -map 0 -flags +global_header -c:v libx264 -c:a aac -strict experimental \
  1203.   -f tee "[bsfs/v=dump_extra]out.ts|[movflags=+faststart]out.mp4|[select=a]out.aac"
  1204. @end example
  1205. @item
  1206. As above, but select only stream @code{a:1} for the audio output. Note
  1207. that a second level of escaping must be performed, as ":" is a special
  1208. character used to separate options.
  1209. @example
  1210. ffmpeg -i ... -map 0 -flags +global_header -c:v libx264 -c:a aac -strict experimental \
  1211.   -f tee "[bsfs/v=dump_extra]out.ts|[movflags=+faststart]out.mp4|[select=\'a:1\']out.aac"
  1212. @end example
  1213. @end itemize
  1214. Note: some codecs may need different options depending on the output format;
  1215. this cannot be detected automatically with the tee muxer. The main example
  1216. is the @option{global_header} flag.
  1217. @section webm_dash_manifest
  1218. WebM DASH Manifest muxer.
  1219. This muxer implements the WebM DASH Manifest specification to generate the DASH
  1220. manifest XML. It also supports manifest generation for DASH live streams.
  1221. For more information see:
  1222. @itemize @bullet
  1223. @item
  1224. WebM DASH Specification: @url{https://sites.google.com/a/webmproject.org/wiki/adaptive-streaming/webm-dash-specification}
  1225. @item
  1226. ISO DASH Specification: @url{http://standards.iso.org/ittf/PubliclyAvailableStandards/c065274_ISO_IEC_23009-1_2014.zip}
  1227. @end itemize
  1228. @subsection Options
  1229. This muxer supports the following options:
  1230. @table @option
  1231. @item adaptation_sets
  1232. This option has the following syntax: "id=x,streams=a,b,c id=y,streams=d,e" where x and y are the
  1233. unique identifiers of the adaptation sets and a,b,c,d and e are the indices of the corresponding
  1234. audio and video streams. Any number of adaptation sets can be added using this option.
  1235. @item live
  1236. Set this to 1 to create a live stream DASH Manifest. Default: 0.
  1237. @item chunk_start_index
  1238. Start index of the first chunk. This will go in the @samp{startNumber} attribute
  1239. of the @samp{SegmentTemplate} element in the manifest. Default: 0.
  1240. @item chunk_duration_ms
  1241. Duration of each chunk in milliseconds. This will go in the @samp{duration}
  1242. attribute of the @samp{SegmentTemplate} element in the manifest. Default: 1000.
  1243. @item utc_timing_url
  1244. URL of the page that will return the UTC timestamp in ISO format. This will go
  1245. in the @samp{value} attribute of the @samp{UTCTiming} element in the manifest.
  1246. Default: None.
  1247. @item time_shift_buffer_depth
  1248. Smallest time shifting buffer (in seconds) for which any Representation is
  1249. guaranteed to be available. This will go in the @samp{timeShiftBufferDepth}
  1250. attribute of the @samp{MPD} element. Default: 60.
  1251. @item minimum_update_period
  1252. Minimum update period (in seconds) of the manifest. This will go in the
  1253. @samp{minimumUpdatePeriod} attribute of the @samp{MPD} element. Default: 0.
  1254. @end table
  1255. @subsection Example
  1256. @example
  1257. ffmpeg -f webm_dash_manifest -i video1.webm \
  1258. -f webm_dash_manifest -i video2.webm \
  1259. -f webm_dash_manifest -i audio1.webm \
  1260. -f webm_dash_manifest -i audio2.webm \
  1261. -map 0 -map 1 -map 2 -map 3 \
  1262. -c copy \
  1263. -f webm_dash_manifest \
  1264. -adaptation_sets "id=0,streams=0,1 id=1,streams=2,3" \
  1265. manifest.xml
  1266. @end example
  1267. @section webm_chunk
  1268. WebM Live Chunk Muxer.
  1269. This muxer writes out WebM headers and chunks as separate files which can be
  1270. consumed by clients that support WebM Live streams via DASH.
  1271. @subsection Options
  1272. This muxer supports the following options:
  1273. @table @option
  1274. @item chunk_start_index
  1275. Index of the first chunk (defaults to 0).
  1276. @item header
  1277. Filename of the header where the initialization data will be written.
  1278. @item audio_chunk_duration
  1279. Duration of each audio chunk in milliseconds (defaults to 5000).
  1280. @end table
  1281. @subsection Example
  1282. @example
  1283. ffmpeg -f v4l2 -i /dev/video0 \
  1284. -f alsa -i hw:0 \
  1285. -map 0:0 \
  1286. -c:v libvpx-vp9 \
  1287. -s 640x360 -keyint_min 30 -g 30 \
  1288. -f webm_chunk \
  1289. -header webm_live_video_360.hdr \
  1290. -chunk_start_index 1 \
  1291. webm_live_video_360_%d.chk \
  1292. -map 1:0 \
  1293. -c:a libvorbis \
  1294. -b:a 128k \
  1295. -f webm_chunk \
  1296. -header webm_live_audio_128.hdr \
  1297. -chunk_start_index 1 \
  1298. -audio_chunk_duration 1000 \
  1299. webm_live_audio_128_%d.chk
  1300. @end example
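A live DASH manifest for these chunks can then be generated with the
@code{webm_dash_manifest} muxer described above (the @option{live} flag is
also passed to the @code{webm_dash_manifest} demuxer for each header
input). The following command is only a sketch: the chunk duration, time
shift buffer depth, update period and output file name are illustrative
values that must be adapted to the actual stream:
@example
ffmpeg -f webm_dash_manifest -live 1 -i webm_live_video_360.hdr \
-f webm_dash_manifest -live 1 -i webm_live_audio_128.hdr \
-c copy -map 0 -map 1 \
-f webm_dash_manifest -live 1 \
-adaptation_sets "id=0,streams=0 id=1,streams=1" \
-chunk_start_index 1 -chunk_duration_ms 1000 \
-time_shift_buffer_depth 7200 -minimum_update_period 60 \
webm_live_manifest.xml
@end example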
  1301. @c man end MUXERS