@chapter Muxers
@c man begin MUXERS

Muxers are configured elements in FFmpeg which allow writing
multimedia streams to a particular type of file.

When you configure your FFmpeg build, all the supported muxers
are enabled by default. You can list all available muxers using the
configure option @code{--list-muxers}.

You can disable all the muxers with the configure option
@code{--disable-muxers} and selectively enable / disable single muxers
with the options @code{--enable-muxer=@var{MUXER}} /
@code{--disable-muxer=@var{MUXER}}.

The option @code{-formats} of the ff* tools will display the list of
enabled muxers.

A description of some of the currently available muxers follows.

@anchor{aiff}
@section aiff

Audio Interchange File Format muxer.

@subsection Options

It accepts the following options:

@table @option
@item write_id3v2
Enable ID3v2 tags writing when set to 1. Default is 0 (disabled).

@item id3v2_version
Select ID3v2 version to write. Currently only version 3 and 4 (aka.
ID3v2.3 and ID3v2.4) are supported. The default is version 4.
@end table
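
For instance, a minimal sketch (the input and output file names are only
illustrative) combining the two options to write an AIFF file carrying an
ID3v2.3 tag:
@example
ffmpeg -i INPUT -write_id3v2 1 -id3v2_version 3 out.aif
@end example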

@anchor{crc}
@section crc

CRC (Cyclic Redundancy Check) testing format.

This muxer computes and prints the Adler-32 CRC of all the input audio
and video frames. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
CRC.

The output of the muxer consists of a single line of the form:
CRC=0x@var{CRC}, where @var{CRC} is a hexadecimal number 0-padded to
8 digits containing the CRC for all the decoded input frames.

See also the @ref{framecrc} muxer.

@subsection Examples

For example to compute the CRC of the input, and store it in the file
@file{out.crc}:
@example
ffmpeg -i INPUT -f crc out.crc
@end example

You can print the CRC to stdout with the command:
@example
ffmpeg -i INPUT -f crc -
@end example

You can select the output format of each frame with @command{ffmpeg} by
specifying the audio and video codec and format. For example to
compute the CRC of the input audio converted to PCM unsigned 8-bit
and the input video converted to MPEG-2 video, use the command:
@example
ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f crc -
@end example

@anchor{framecrc}
@section framecrc

Per-packet CRC (Cyclic Redundancy Check) testing format.

This muxer computes and prints the Adler-32 CRC for each audio
and video packet. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
CRC.

The output of the muxer consists of a line for each audio and video
packet of the form:
@example
@var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, 0x@var{CRC}
@end example

@var{CRC} is a hexadecimal number 0-padded to 8 digits containing the
CRC of the packet.

@subsection Examples

For example to compute the CRC of the audio and video frames in
@file{INPUT}, converted to raw audio and video packets, and store it
in the file @file{out.crc}:
@example
ffmpeg -i INPUT -f framecrc out.crc
@end example

To print the information to stdout, use the command:
@example
ffmpeg -i INPUT -f framecrc -
@end example

With @command{ffmpeg}, you can select the output format to which the
audio and video frames are encoded before computing the CRC for each
packet by specifying the audio and video codec. For example, to
compute the CRC of each decoded input audio frame converted to PCM
unsigned 8-bit and of each decoded input video frame converted to
MPEG-2 video, use the command:
@example
ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f framecrc -
@end example

See also the @ref{crc} muxer.

@anchor{framemd5}
@section framemd5

Per-packet MD5 testing format.

This muxer computes and prints the MD5 hash for each audio
and video packet. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
hash.

The output of the muxer consists of a line for each audio and video
packet of the form:
@example
@var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, @var{MD5}
@end example

@var{MD5} is a hexadecimal number representing the computed MD5 hash
for the packet.

@subsection Examples

For example to compute the MD5 of the audio and video frames in
@file{INPUT}, converted to raw audio and video packets, and store it
in the file @file{out.md5}:
@example
ffmpeg -i INPUT -f framemd5 out.md5
@end example

To print the information to stdout, use the command:
@example
ffmpeg -i INPUT -f framemd5 -
@end example

See also the @ref{md5} muxer.

@anchor{gif}
@section gif

Animated GIF muxer.

It accepts the following options:

@table @option
@item loop
Set the number of times to loop the output. Use @code{-1} for no loop, @code{0}
for looping indefinitely (default).

@item final_delay
Force the delay (expressed in centiseconds) after the last frame. Each frame
ends with a delay until the next frame. The default is @code{-1}, which is a
special value to tell the muxer to re-use the previous delay. In case of a
loop, you might want to customize this value to mark a pause for instance.
@end table

For example, to encode a gif looping 10 times, with a 5-second delay between
the loops:
@example
ffmpeg -i INPUT -loop 10 -final_delay 500 out.gif
@end example

Note 1: if you wish to extract the frames into separate GIF files, you need to
force the @ref{image2} muxer:
@example
ffmpeg -i INPUT -c:v gif -f image2 "out%d.gif"
@end example

Note 2: the GIF format has a very small time base: the delay between two frames
cannot be smaller than one centisecond.

@anchor{hls}
@section hls

Apple HTTP Live Streaming muxer that segments MPEG-TS according to
the HTTP Live Streaming (HLS) specification.

It creates a playlist file, and one or more segment files. The output filename
specifies the playlist filename.

By default, the muxer creates a file for each segment produced. These files
have the same name as the playlist, followed by a sequential number and a
.ts extension.

For example, to convert an input file with @command{ffmpeg}:
@example
ffmpeg -i in.nut out.m3u8
@end example
This example will produce the playlist, @file{out.m3u8}, and segment files:
@file{out0.ts}, @file{out1.ts}, @file{out2.ts}, etc.

See also the @ref{segment} muxer, which provides a more generic and
flexible implementation of a segmenter, and can be used to perform HLS
segmentation.

@subsection Options

This muxer supports the following options:

@table @option
@item hls_time @var{seconds}
Set the segment length in seconds. Default value is 2.

@item hls_list_size @var{size}
Set the maximum number of playlist entries. If set to 0 the list file
will contain all the segments. Default value is 5.

@item hls_ts_options @var{options_list}
Set output format options using a :-separated list of key=value
parameters. Values containing @code{:} special characters must be
escaped.

@item hls_wrap @var{wrap}
Set the number after which the segment filename number (the number
specified in each segment file) wraps. If set to 0 the number will
never wrap. Default value is 0.

This option is useful to avoid filling the disk with many segment
files, and limits the maximum number of segment files written to disk
to @var{wrap}.

@item start_number @var{number}
Start the playlist sequence number from @var{number}. Default value is
0.

@item hls_allow_cache @var{allowcache}
Explicitly set whether the client MAY (1) or MUST NOT (0) cache media segments.

@item hls_base_url @var{baseurl}
Append @var{baseurl} to every entry in the playlist.
Useful to generate playlists with absolute paths.

Note that the playlist sequence number must be unique for each segment
and it is not to be confused with the segment filename sequence number
which can be cyclic, for example if the @option{wrap} option is
specified.

@item hls_flags single_file
If this flag is set, the muxer will store all segments in a single MPEG-TS
file, and will use byte ranges in the playlist. HLS playlists generated
this way will have version number 4.
For example:
@example
ffmpeg -i in.nut -hls_flags single_file out.m3u8
@end example
Will produce the playlist, @file{out.m3u8}, and a single segment file,
@file{out.ts}.
@end table
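
As a further illustration, a minimal sketch (the input name and base URL are
only placeholders) that produces 4-second segments, keeps the last 10 playlist
entries and prefixes every entry with an absolute URL:
@example
ffmpeg -re -i in.nut -codec copy -hls_time 4 -hls_list_size 10 \
       -hls_base_url http://example.com/live/ out.m3u8
@end example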

@anchor{ico}
@section ico

ICO file muxer.

Microsoft's icon file format (ICO) has some strict limitations that should be noted:

@itemize
@item
Size cannot exceed 256 pixels in any dimension

@item
Only BMP and PNG images can be stored

@item
If a BMP image is used, it must be one of the following pixel formats:
@example
BMP Bit Depth   FFmpeg Pixel Format
1bit            pal8
4bit            pal8
8bit            pal8
16bit           rgb555le
24bit           bgr24
32bit           bgra
@end example

@item
If a BMP image is used, it must use the BITMAPINFOHEADER DIB header

@item
If a PNG image is used, it must use the rgba pixel format
@end itemize
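
For instance, a minimal sketch (the input file name is only illustrative) that
scales a PNG image down to 256x256 and lets the @file{.ico} suffix select this
muxer:
@example
ffmpeg -i logo.png -vf scale=256:256 out.ico
@end example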

@anchor{image2}
@section image2

Image file muxer.

The image file muxer writes video frames to image files.

The output filenames are specified by a pattern, which can be used to
produce a sequentially numbered series of files.
The pattern may contain the string "%d" or "%0@var{N}d", which
specifies the position of the characters representing a numbering in
the filenames. If the form "%0@var{N}d" is used, the string
representing the number in each filename is 0-padded to @var{N}
digits. The literal character '%' can be specified in the pattern with
the string "%%".

If the pattern contains "%d" or "%0@var{N}d", the first filename of
the file list specified will contain the number 1; all the following
numbers will be sequential.

The pattern may contain a suffix which is used to automatically
determine the format of the image files to write.

For example the pattern "img-%03d.bmp" will specify a sequence of
filenames of the form @file{img-001.bmp}, @file{img-002.bmp}, ...,
@file{img-010.bmp}, etc.
The pattern "img%%-%d.jpg" will specify a sequence of filenames of the
form @file{img%-1.jpg}, @file{img%-2.jpg}, ..., @file{img%-10.jpg},
etc.

@subsection Examples

The following example shows how to use @command{ffmpeg} for creating a
sequence of files @file{img-001.jpeg}, @file{img-002.jpeg}, ...,
taking one image every second from the input video:
@example
ffmpeg -i in.avi -vsync 1 -r 1 -f image2 'img-%03d.jpeg'
@end example

Note that with @command{ffmpeg}, if the format is not specified with the
@code{-f} option and the output filename specifies an image file
format, the image2 muxer is automatically selected, so the previous
command can be written as:
@example
ffmpeg -i in.avi -vsync 1 -r 1 'img-%03d.jpeg'
@end example

Note also that the pattern does not necessarily need to contain "%d" or
"%0@var{N}d"; for example, to create a single image file
@file{img.jpeg} from the input video you can employ the command:
@example
ffmpeg -i in.avi -f image2 -frames:v 1 img.jpeg
@end example

The @option{strftime} option allows you to expand the filename with
date and time information. Check the documentation of
the @code{strftime()} function for the syntax.

For example to generate image files from the @code{strftime()}
"%Y-%m-%d_%H-%M-%S" pattern, the following @command{ffmpeg} command
can be used:
@example
ffmpeg -f v4l2 -r 1 -i /dev/video0 -f image2 -strftime 1 "%Y-%m-%d_%H-%M-%S.jpg"
@end example

@subsection Options

@table @option
@item start_number
Start the sequence from the specified number. Default value is 1. Must
be a non-negative number.

@item update
If set to 1, the filename will always be interpreted as just a
filename, not a pattern, and the corresponding file will be continuously
overwritten with new images. Default value is 0.

@item strftime
If set to 1, expand the filename with date and time information from
@code{strftime()}. Default value is 0.
@end table
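
The @option{update} option can be used, for example, to keep a single file
refreshed with the most recent frame; a minimal sketch (file names are only
illustrative) writing one frame per second into the same file:
@example
ffmpeg -i in.avi -r 1 -update 1 preview.jpg
@end example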

The image muxer supports the .Y.U.V image file format. This format is
special in that each image frame consists of three files, one for
each of the YUV420P components. To read or write this image file format,
specify the name of the '.Y' file. The muxer will automatically open the
'.U' and '.V' files as required.
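
A minimal sketch of writing this format (the file names are only illustrative;
the muxer derives the '.U' and '.V' names automatically from the '.Y' name):
@example
ffmpeg -i in.avi -pix_fmt yuv420p -c:v rawvideo -frames:v 1 -f image2 frame.Y
@end example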

@section matroska

Matroska container muxer.

This muxer implements the matroska and webm container specs.

@subsection Metadata

The recognized metadata settings in this muxer are:

@table @option
@item title
Set title name provided to a single track.

@item language
Specify the language of the track in the Matroska languages form.

The language can be either the 3 letters bibliographic ISO-639-2 (ISO
639-2/B) form (like "fre" for French), or a language code mixed with a
country code for specialities in languages (like "fre-ca" for Canadian
French).

@item stereo_mode
Set stereo 3D video layout of two views in a single video track.

The following values are recognized:
@table @samp
@item mono
video is not stereo
@item left_right
Both views are arranged side by side, Left-eye view is on the left
@item bottom_top
Both views are arranged in top-bottom orientation, Left-eye view is at bottom
@item top_bottom
Both views are arranged in top-bottom orientation, Left-eye view is on top
@item checkerboard_rl
Each view is arranged in a checkerboard interleaved pattern, Right-eye view being first
@item checkerboard_lr
Each view is arranged in a checkerboard interleaved pattern, Left-eye view being first
@item row_interleaved_rl
Each view is constituted by a row based interleaving, Right-eye view is first row
@item row_interleaved_lr
Each view is constituted by a row based interleaving, Left-eye view is first row
@item col_interleaved_rl
Both views are arranged in a column based interleaving manner, Right-eye view is first column
@item col_interleaved_lr
Both views are arranged in a column based interleaving manner, Left-eye view is first column
@item anaglyph_cyan_red
All frames are in anaglyph format viewable through red-cyan filters
@item right_left
Both views are arranged side by side, Right-eye view is on the left
@item anaglyph_green_magenta
All frames are in anaglyph format viewable through green-magenta filters
@item block_lr
Both eyes laced in one Block, Left-eye view is first
@item block_rl
Both eyes laced in one Block, Right-eye view is first
@end table
@end table

For example a 3D WebM clip can be created using the following command line:
@example
ffmpeg -i sample_left_right_clip.mpg -an -c:v libvpx -metadata stereo_mode=left_right -y stereo_clip.webm
@end example

@subsection Options

This muxer supports the following options:

@table @option
@item reserve_index_space
By default, this muxer writes the index for seeking (called cues in Matroska
terms) at the end of the file, because it cannot know in advance how much space
to leave for the index at the beginning of the file. However for some use cases
-- e.g. streaming where seeking is possible but slow -- it is useful to put the
index at the beginning of the file.

If this option is set to a non-zero value, the muxer will reserve a given amount
of space in the file header and then try to write the cues there when the muxing
finishes. If the available space does not suffice, muxing will fail. A safe size
for most use cases should be about 50kB per hour of video.

Note that cues are only written if the output is seekable and this option will
have no effect if it is not.
@end table
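
As an illustration, a minimal sketch (the file names and the reserved size are
only assumptions, following the 50kB-per-hour rule of thumb for a roughly
two-hour input) that remuxes a file and places the cues at the beginning:
@example
ffmpeg -i in.avi -c copy -reserve_index_space 100000 out.mkv
@end example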

@anchor{md5}
@section md5

MD5 testing format.

This muxer computes and prints the MD5 hash of all the input audio
and video frames. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
hash.

The output of the muxer consists of a single line of the form:
MD5=@var{MD5}, where @var{MD5} is a hexadecimal number representing
the computed MD5 hash.

For example to compute the MD5 hash of the input converted to raw
audio and video, and store it in the file @file{out.md5}:
@example
ffmpeg -i INPUT -f md5 out.md5
@end example

You can print the MD5 to stdout with the command:
@example
ffmpeg -i INPUT -f md5 -
@end example

See also the @ref{framemd5} muxer.

@section mov, mp4, ismv

MOV/MP4/ISMV (Smooth Streaming) muxer.

The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4
file has all the metadata about all packets stored in one location
(written at the end of the file, it can be moved to the start for
better playback by adding @var{faststart} to the @var{movflags}, or
using the @command{qt-faststart} tool). A fragmented
file consists of a number of fragments, where packets and metadata
about these packets are stored together. Writing a fragmented
file has the advantage that the file is decodable even if the
writing is interrupted (while a normal MOV/MP4 is undecodable if
it is not properly finished), and it requires less memory when writing
very long files (since writing normal MOV/MP4 files stores info about
every single packet in memory until the file is closed). The downside
is that it is less compatible with other applications.

@subsection Options

Fragmentation is enabled by setting one of the AVOptions that define
how to cut the file into fragments:

@table @option
@item -moov_size @var{bytes}
Reserves space for the moov atom at the beginning of the file instead of placing the
moov atom at the end. If the space reserved is insufficient, muxing will fail.
@item -movflags frag_keyframe
Start a new fragment at each video keyframe.
@item -frag_duration @var{duration}
Create fragments that are @var{duration} microseconds long.
@item -frag_size @var{size}
Create fragments that contain up to @var{size} bytes of payload data.
@item -movflags frag_custom
Allow the caller to manually choose when to cut fragments, by
calling @code{av_write_frame(ctx, NULL)} to write a fragment with
the packets written so far. (This is only useful with other
applications integrating libavformat, not from @command{ffmpeg}.)
@item -min_frag_duration @var{duration}
Don't create fragments that are shorter than @var{duration} microseconds long.
@end table

If more than one condition is specified, fragments are cut when
one of the specified conditions is fulfilled. The exception to this is
@code{-min_frag_duration}, which has to be fulfilled for any of the other
conditions to apply.
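
For instance, a minimal sketch (file names are only illustrative) that remuxes
an existing file into a fragmented MP4, starting a new fragment at each video
keyframe or after at most 10 seconds, whichever comes first:
@example
ffmpeg -i in.mp4 -c copy -movflags frag_keyframe -frag_duration 10000000 frag.mp4
@end example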

Additionally, the way the output file is written can be adjusted
through a few other options:

@table @option
@item -movflags empty_moov
Write an initial moov atom directly at the start of the file, without
describing any samples in it. Generally, an mdat/moov pair is written
at the start of the file, as a normal MOV/MP4 file, containing only
a short portion of the file. With this option set, there is no initial
mdat atom, and the moov atom only describes the tracks but has
a zero duration.

Files written with this option set do not work in QuickTime.
This option is implicitly set when writing ismv (Smooth Streaming) files.
@item -movflags separate_moof
Write a separate moof (movie fragment) atom for each track. Normally,
packets for all tracks are written in a moof atom (which is slightly
more efficient), but with this option set, the muxer writes one moof/mdat
pair for each track, making it easier to separate tracks.
This option is implicitly set when writing ismv (Smooth Streaming) files.
@item -movflags faststart
Run a second pass moving the index (moov atom) to the beginning of the file.
This operation can take a while, and will not work in various situations such
as fragmented output, thus it is not enabled by default.
@item -movflags rtphint
Add RTP hinting tracks to the output file.
@item -movflags disable_chpl
Disable Nero chapter markers (chpl atom). Normally, both Nero chapters
and a QuickTime chapter track are written to the file. With this option
set, only the QuickTime chapter track will be written. Nero chapters can
cause failures when the file is reprocessed with certain tagging programs,
like mp3Tag 2.61a and iTunes 11.3; most likely other versions are affected
as well.
@end table

@subsection Example

Smooth Streaming content can be pushed in real time to a publishing
point on IIS with this muxer. Example:
@example
ffmpeg -re @var{<normal input/transcoding options>} -movflags isml+frag_keyframe -f ismv http://server/publishingpoint.isml/Streams(Encoder1)
@end example

@section mp3

The MP3 muxer writes a raw MP3 stream with an ID3v2 header at the beginning and
optionally an ID3v1 tag at the end. ID3v2.3 and ID3v2.4 are supported; the
@code{id3v2_version} option controls which one is used. Setting
@code{id3v2_version} to 0 will disable the ID3v2 header completely. The legacy
ID3v1 tag is not written by default, but may be enabled with the
@code{write_id3v1} option.

The muxer may also write a Xing frame at the beginning, which contains the
number of frames in the file. It is useful for computing the duration of VBR files.
The Xing frame is written if the output stream is seekable and if the
@code{write_xing} option is set to 1 (the default).

The muxer supports writing ID3v2 attached pictures (APIC frames). The pictures
are supplied to the muxer in the form of a video stream with a single packet.
There can be any number of those streams, each of which will correspond to a
single APIC frame.
The stream metadata tags @var{title} and @var{comment} map to APIC
@var{description} and @var{picture type} respectively. See
@url{http://id3.org/id3v2.4.0-frames} for allowed picture types.

Note that the APIC frames must be written at the beginning, so the muxer will
buffer the audio frames until it gets all the pictures. It is therefore advised
to provide the pictures as soon as possible to avoid excessive buffering.

Examples:

Write an mp3 with an ID3v2.3 header and an ID3v1 footer:
@example
ffmpeg -i INPUT -id3v2_version 3 -write_id3v1 1 out.mp3
@end example

To attach a picture to an mp3 file select both the audio and the picture stream
with @code{map}:
@example
ffmpeg -i input.mp3 -i cover.png -c copy -map 0 -map 1
-metadata:s:v title="Album cover" -metadata:s:v comment="Cover (Front)" out.mp3
@end example

Write a "clean" MP3 without any extra features:
@example
ffmpeg -i input.wav -write_xing 0 -id3v2_version 0 out.mp3
@end example

@section mpegts

MPEG transport stream muxer.

This muxer implements ISO 13818-1 and part of ETSI EN 300 468.

The recognized metadata settings in this muxer are @code{service_provider}
and @code{service_name}. If they are not set the default for
@code{service_provider} is "FFmpeg" and the default for
@code{service_name} is "Service01".

@subsection Options

The muxer options are:

@table @option
@item -mpegts_original_network_id @var{number}
Set the original_network_id (default 0x0001). This is the unique identifier
of a network in DVB. Its main use is in the unique identification of a
service through the path Original_Network_ID, Transport_Stream_ID.
@item -mpegts_transport_stream_id @var{number}
Set the transport_stream_id (default 0x0001). This identifies a
transponder in DVB.
@item -mpegts_service_id @var{number}
Set the service_id (default 0x0001) also known as program in DVB.
@item -mpegts_pmt_start_pid @var{number}
Set the first PID for PMT (default 0x1000, max 0x1f00).
@item -mpegts_start_pid @var{number}
Set the first PID for data packets (default 0x0100, max 0x0f00).
@item -mpegts_m2ts_mode @var{number}
Enable m2ts mode if set to 1. Default value is -1 which disables m2ts mode.
@item -muxrate @var{number}
Set a constant muxrate (default VBR).
@item -pcr_period @var{number}
Override the default PCR retransmission time (default 20ms), ignored
if variable muxrate is selected.
@item -pes_payload_size @var{number}
Set minimum PES packet payload in bytes.
@item -mpegts_flags @var{flags}
Set flags (see below).
@item -mpegts_copyts @var{number}
Preserve original timestamps, if value is set to 1. Default value is -1, which
results in shifting timestamps so that they start from 0.
@item -tables_version @var{number}
Set PAT, PMT and SDT version (default 0, valid values are from 0 to 31, inclusive).
This option allows updating stream structure so that a standard consumer may
detect the change. To do so, reopen the output AVFormatContext (in case of API
usage) or restart the @command{ffmpeg} instance, cyclically changing the
tables_version value:
@example
ffmpeg -i source1.ts -codec copy -f mpegts -tables_version 0 udp://1.1.1.1:1111
ffmpeg -i source2.ts -codec copy -f mpegts -tables_version 1 udp://1.1.1.1:1111
...
ffmpeg -i source3.ts -codec copy -f mpegts -tables_version 31 udp://1.1.1.1:1111
ffmpeg -i source1.ts -codec copy -f mpegts -tables_version 0 udp://1.1.1.1:1111
ffmpeg -i source2.ts -codec copy -f mpegts -tables_version 1 udp://1.1.1.1:1111
...
@end example
@end table

The option @option{mpegts_flags} may take a set of the following flags:

@table @option
@item resend_headers
Reemit PAT/PMT before writing the next packet.
@item latm
Use LATM packetization for AAC.
@end table

@subsection Example

@example
ffmpeg -i file.mpg -c copy \
     -mpegts_original_network_id 0x1122 \
     -mpegts_transport_stream_id 0x3344 \
     -mpegts_service_id 0x5566 \
     -mpegts_pmt_start_pid 0x1500 \
     -mpegts_start_pid 0x150 \
     -metadata service_provider="Some provider" \
     -metadata service_name="Some Channel" \
     -y out.ts
@end example

@section null

Null muxer.

This muxer does not generate any output file; it is mainly useful for
testing or benchmarking purposes.

For example to benchmark decoding with @command{ffmpeg} you can use the
command:
@example
ffmpeg -benchmark -i INPUT -f null out.null
@end example

Note that the above command does not read or write the @file{out.null}
file, but specifying the output file is required by the @command{ffmpeg}
syntax.

Alternatively you can write the command as:
@example
ffmpeg -benchmark -i INPUT -f null -
@end example

@section nut

@table @option
@item -syncpoints @var{flags}
Change the syncpoint usage in nut:
@table @option
@item @var{default} use the normal low-overhead seeking aids.
@item @var{none} do not use the syncpoints at all, reducing the overhead but making the stream non-seekable;
Use of this option is not recommended, as the resulting files are very damage
sensitive and seeking is not possible. Also in general the overhead from
syncpoints is negligible. Note, @code{-write_index 0} can be used to disable
all growing data tables, allowing muxing of endless streams with limited memory
and without these disadvantages.
@item @var{timestamped} extend the syncpoint with a wallclock field.
@end table
The @var{none} and @var{timestamped} flags are experimental.
@item -write_index @var{bool}
Write index at the end; the default is to write an index.
@end table

@example
ffmpeg -i INPUT -f_strict experimental -syncpoints none - | processor
@end example

@section ogg

Ogg container muxer.

@table @option
@item -page_duration @var{duration}
Preferred page duration, in microseconds. The muxer will attempt to create
pages that are approximately @var{duration} microseconds long. This allows the
user to compromise between seek granularity and container overhead. The default
is 1 second. A value of 0 will fill all segments, making pages as large as
possible. A value of 1 will effectively use 1 packet-per-page in most
situations, giving a small seek granularity at the cost of additional container
overhead.
@end table
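
For instance, a minimal sketch (the file names are only illustrative) that
encodes the audio to Vorbis and asks for pages roughly half a second long:
@example
ffmpeg -i in.wav -c:a libvorbis -page_duration 500000 out.ogg
@end example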

@anchor{segment}
@section segment, stream_segment, ssegment

Basic stream segmenter.

This muxer outputs streams to a number of separate files of nearly
fixed duration. Output filename pattern can be set in a fashion similar to
@ref{image2}.

@code{stream_segment} is a variant of the muxer used to write to
streaming output formats, i.e. which do not require global headers,
and is recommended for outputting e.g. to MPEG transport stream segments.
@code{ssegment} is a shorter alias for @code{stream_segment}.

Every segment starts with a keyframe of the selected reference stream,
which is set through the @option{reference_stream} option.

Note that if you want accurate splitting for a video file, you need to
make the input key frames correspond to the exact splitting times
expected by the segmenter, or the segment muxer will start the new
segment with the key frame found next after the specified start
time.

The segment muxer works best with a single constant frame rate video.

Optionally it can generate a list of the created segments, by setting
the option @var{segment_list}. The list type is specified by the
@var{segment_list_type} option. The entry filenames in the segment
list are set by default to the basename of the corresponding segment
files.

See also the @ref{hls} muxer, which provides a more specific
implementation for HLS segmentation.

@subsection Options

The segment muxer supports the following options:

@table @option
@item reference_stream @var{specifier}
Set the reference stream, as specified by the string @var{specifier}.
If @var{specifier} is set to @code{auto}, the reference is chosen
automatically. Otherwise it must be a stream specifier (see the ``Stream
specifiers'' chapter in the ffmpeg manual) which specifies the
reference stream. The default value is @code{auto}.

@item segment_format @var{format}
Override the inner container format; by default it is guessed by the filename
extension.

@item segment_format_options @var{options_list}
Set output format options using a :-separated list of key=value
parameters. Values containing the @code{:} special character must be
escaped.

@item segment_list @var{name}
Also generate a listfile named @var{name}. If not specified, no
listfile is generated.

@item segment_list_flags @var{flags}
Set flags affecting the segment list generation.

It currently supports the following flags:
@table @samp
@item cache
Allow caching (only affects M3U8 list files).

@item live
Allow live-friendly file generation.
@end table

@item segment_list_type @var{type}
Select the listing format.
@table @option
@item @var{flat} use a simple flat list of entries.
@item @var{hls} use a m3u8-like structure.
@end table

@item segment_list_size @var{size}
Update the list file so that it contains at most @var{size}
segments. If 0 the list file will contain all the segments. Default
value is 0.

@item segment_list_entry_prefix @var{prefix}
Prepend @var{prefix} to each entry. Useful to generate absolute paths.
By default no prefix is applied.

The following values are recognized for @option{segment_list_type}:
@table @samp
@item flat
Generate a flat list for the created segments, one segment per line.

@item csv, ext
Generate a list for the created segments, one segment per line,
each line matching the format (comma-separated values):
@example
@var{segment_filename},@var{segment_start_time},@var{segment_end_time}
@end example

@var{segment_filename} is the name of the output file generated by the
muxer according to the provided pattern. CSV escaping (according to
RFC4180) is applied if required.

@var{segment_start_time} and @var{segment_end_time} specify
the segment start and end time expressed in seconds.

A list file with the suffix @code{".csv"} or @code{".ext"} will
auto-select this format.

@samp{ext} is deprecated in favor of @samp{csv}.

@item ffconcat
Generate an ffconcat file for the created segments. The resulting file
can be read using the FFmpeg @ref{concat} demuxer.

A list file with the suffix @code{".ffcat"} or @code{".ffconcat"} will
auto-select this format.

@item m3u8
Generate an extended M3U8 file, version 3, compliant with
@url{http://tools.ietf.org/id/draft-pantos-http-live-streaming}.

A list file with the suffix @code{".m3u8"} will auto-select this format.
@end table

If not specified, the type is guessed from the list file name suffix.

@item segment_time @var{time}
Set segment duration to @var{time}; the value must be a duration
specification. Default value is "2". See also the
@option{segment_times} option.

Note that splitting may not be accurate, unless you force the
reference stream key-frames at the given time. See the introductory
notice and the examples below.

@item segment_atclocktime @var{1|0}
If set to "1" split at regular clock time intervals starting from 00:00
o'clock. The @var{time} value specified in @option{segment_time} is
used for setting the length of the splitting interval.

For example with @option{segment_time} set to "900" this makes it possible
to create files at 12:00 o'clock, 12:15, 12:30, etc.

Default value is "0".

@item segment_time_delta @var{delta}
Specify the accuracy time when selecting the start time for a
segment, expressed as a duration specification. Default value is "0".

When delta is specified a key-frame will start a new segment if its
PTS satisfies the relation:
@example
PTS >= start_time - time_delta
@end example

This option is useful when splitting video content, which is always
split at GOP boundaries, in case a key frame is found just before the
specified split time.

In particular, it may be used in combination with the @command{ffmpeg} option
@var{force_key_frames}. The key frame times specified by
@var{force_key_frames} may not be set accurately because of rounding
issues, with the consequence that a key frame time may end up set just
before the specified time. For constant frame rate videos a value of
1/(2*@var{frame_rate}) should address the worst case mismatch between
the specified time and the time set by @var{force_key_frames}.

@item segment_times @var{times}
Specify a list of split points. @var{times} contains a list of comma
separated duration specifications, in increasing order. See also
the @option{segment_time} option.

@item segment_frames @var{frames}
Specify a list of split video frame numbers. @var{frames} contains a
list of comma separated integer numbers, in increasing order.

This option specifies to start a new segment whenever a reference
stream key frame is found and the sequential number (starting from 0)
of the frame is greater or equal to the next value in the list.

@item segment_wrap @var{limit}
Wrap around segment index once it reaches @var{limit}.

@item segment_start_number @var{number}
Set the sequence number of the first segment. Defaults to @code{0}.

@item reset_timestamps @var{1|0}
Reset timestamps at the beginning of each segment, so that each segment
will start with near-zero timestamps. It is meant to ease the playback
of the generated segments. May not work with some combinations of
muxers/codecs. It is set to @code{0} by default.

@item initial_offset @var{offset}
Specify timestamp offset to apply to the output packet timestamps. The
argument must be a time duration specification, and defaults to 0.
@end table

@subsection Examples

@itemize
@item
Remux the content of file @file{in.mkv} to a list of segments
@file{out-000.nut}, @file{out-001.nut}, etc., and write the list of
generated segments to @file{out.list}:
@example
ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.list out%03d.nut
@end example

@item
Segment input and set output format options for the output segments:
@example
ffmpeg -i in.mkv -f segment -segment_time 10 -segment_format_options movflags=+faststart out%03d.mp4
@end example

@item
Segment the input file according to the split points specified by the
@var{segment_times} option:
@example
ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 out%03d.nut
@end example

@item
Use the @command{ffmpeg} @option{force_key_frames}
option to force key frames in the input at the specified location, together
with the segment option @option{segment_time_delta} to account for
possible rounding performed when setting key frame times.
@example
ffmpeg -i in.mkv -force_key_frames 1,2,3,5,8,13,21 -codec:v mpeg4 -codec:a pcm_s16le -map 0 \
-f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 -segment_time_delta 0.05 out%03d.nut
@end example

In order to force key frames on the input file, transcoding is
required.

@item
Segment the input file by splitting the input file according to the
frame numbers sequence specified with the @option{segment_frames} option:
@example
ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.csv -segment_frames 100,200,300,500,800 out%03d.nut
@end example

@item
Convert @file{in.mkv} to TS segments using the @code{libx264}
and @code{libfaac} encoders:
@example
ffmpeg -i in.mkv -map 0 -codec:v libx264 -codec:a libfaac -f ssegment -segment_list out.list out%03d.ts
@end example

@item
Segment the input file, and create an M3U8 live playlist (can be used
as live HLS source):
@example
ffmpeg -re -i in.mkv -codec copy -map 0 -f segment -segment_list playlist.m3u8 \
-segment_list_flags +live -segment_time 10 out%03d.mkv
@end example
@end itemize

@section smoothstreaming

The Smooth Streaming muxer generates a set of files (Manifest, chunks) suitable
for serving with a conventional web server.

@table @option
@item window_size
Specify the number of fragments kept in the manifest. Default 0 (keep all).

@item extra_window_size
Specify the number of fragments kept outside of the manifest before removing from disk. Default 5.

@item lookahead_count
Specify the number of lookahead fragments. Default 2.

@item min_frag_duration
Specify the minimum fragment duration (in microseconds). Default 5000000.

@item remove_at_exit
Specify whether to remove all fragments when finished. Default 0 (do not remove).
@end table
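
As an illustration, a minimal sketch (the input name and the output directory
@file{./publish} are only assumptions) that keeps a sliding window of ten
fragments in the manifest:
@example
ffmpeg -re -i in.mkv -c:v libx264 -c:a aac -strict experimental \
       -window_size 10 -f smoothstreaming ./publish
@end example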

@section tee

The tee muxer can be used to write the same data to several files or any
other kind of muxer. It can be used, for example, to both stream a video to
the network and save it to disk at the same time.

It is different from specifying several outputs to the @command{ffmpeg}
command-line tool because the audio and video data will be encoded only once
with the tee muxer; encoding can be a very expensive process. It is not
useful when using the libavformat API directly because it is then possible
to feed the same packets to several muxers directly.

The slave outputs are specified in the file name given to the muxer,
separated by '|'. If any of the slave names contains the '|' separator,
leading or trailing spaces or any special character, it must be
escaped (see @ref{quoting_and_escaping,,the "Quoting and escaping"
section in the ffmpeg-utils(1) manual,ffmpeg-utils}).

Muxer options can be specified for each slave by prepending them as a list of
@var{key}=@var{value} pairs separated by ':', between square brackets. If
the option values contain a special character or the ':' separator, they
must be escaped; note that this is a second level escaping.

The following special options are also recognized:
@table @option
@item f
Specify the format name. Useful if it cannot be guessed from the
output name suffix.

@item bsfs[/@var{spec}]
Specify a list of bitstream filters to apply to the specified
output.

It is possible to specify to which streams a given bitstream filter
applies, by appending a stream specifier to the option separated by
@code{/}. @var{spec} must be a stream specifier (see @ref{Format
stream specifiers}). If the stream specifier is not specified, the
bitstream filters will be applied to all streams in the output.

Several bitstream filters can be specified, separated by ",".

@item select
Select the streams that should be mapped to the slave output,
specified by a stream specifier. If not specified, this defaults to
all the input streams.
@end table

@subsection Examples

@itemize
@item
Encode something and both archive it in a WebM file and stream it
as MPEG-TS over UDP (the streams need to be explicitly mapped):
@example
ffmpeg -i ... -c:v libx264 -c:a mp2 -f tee -map 0:v -map 0:a
  "archive-20121107.mkv|[f=mpegts]udp://10.0.1.255:1234/"
@end example

@item
Use @command{ffmpeg} to encode the input, and send the output
to three different destinations. The @code{dump_extra} bitstream
filter is used to add extradata information to all the output video
keyframe packets, as requested by the MPEG-TS format. The select
option is applied to @file{out.aac} in order to make it contain only
audio packets.
@example
ffmpeg -i ... -map 0 -flags +global_header -c:v libx264 -c:a aac -strict experimental
  -f tee "[bsfs/v=dump_extra]out.ts|[movflags=+faststart]out.mp4|[select=a]out.aac"
@end example

@item
As above, but select only stream @code{a:1} for the audio output. Note
that a second level escaping must be performed, as ":" is a special
character used to separate options.
@example
ffmpeg -i ... -map 0 -flags +global_header -c:v libx264 -c:a aac -strict experimental
  -f tee "[bsfs/v=dump_extra]out.ts|[movflags=+faststart]out.mp4|[select=\'a:1\']out.aac"
@end example
@end itemize

Note: some codecs may need different options depending on the output format;
the auto-detection of this cannot work with the tee muxer. The main example
is the @option{global_header} flag.

@section webm_dash_manifest

WebM DASH Manifest muxer.

This muxer implements the WebM DASH Manifest specification to generate the DASH manifest XML.

@subsection Options

This muxer supports the following options:

@table @option
@item adaptation_sets
This option has the following syntax: "id=x,streams=a,b,c id=y,streams=d,e" where x and y are the
unique identifiers of the adaptation sets and a,b,c,d and e are the indices of the corresponding
audio and video streams. Any number of adaptation sets can be added using this option.
@end table

@subsection Example

@example
ffmpeg -f webm_dash_manifest -i video1.webm \
       -f webm_dash_manifest -i video2.webm \
       -f webm_dash_manifest -i audio1.webm \
       -f webm_dash_manifest -i audio2.webm \
       -map 0 -map 1 -map 2 -map 3 \
       -c copy \
       -f webm_dash_manifest \
       -adaptation_sets "id=0,streams=0,1 id=1,streams=2,3" \
       manifest.xml
@end example

@c man end MUXERS