@chapter Muxers
@c man begin MUXERS

Muxers are configured elements in FFmpeg which allow writing
multimedia streams to a particular type of file.

When you configure your FFmpeg build, all the supported muxers
are enabled by default. You can list all available muxers using the
configure option @code{--list-muxers}.

You can disable all the muxers with the configure option
@code{--disable-muxers} and selectively enable / disable single muxers
with the options @code{--enable-muxer=@var{MUXER}} /
@code{--disable-muxer=@var{MUXER}}.

The option @code{-formats} of the ff* tools will display the list of
enabled muxers.

A description of some of the currently available muxers follows.
@anchor{aiff}
@section aiff

Audio Interchange File Format muxer.

It accepts the following options:

@table @option
@item write_id3v2
Enable ID3v2 tags writing when set to 1. Default is 0 (disabled).

@item id3v2_version
Select the ID3v2 version to write. Currently only versions 3 and 4
(aka ID3v2.3 and ID3v2.4) are supported. The default is version 4.
@end table
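For example, to write an AIFF file carrying ID3v2.3 tags, the options
above can be passed on the @command{ffmpeg} command line (the input
filename @file{in.wav} is only illustrative):
@example
ffmpeg -i in.wav -write_id3v2 1 -id3v2_version 3 out.aiff
@end example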
@anchor{crc}
@section crc

CRC (Cyclic Redundancy Check) testing format.

This muxer computes and prints the Adler-32 CRC of all the input audio
and video frames. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
CRC.

The output of the muxer consists of a single line of the form:
CRC=0x@var{CRC}, where @var{CRC} is a hexadecimal number 0-padded to
8 digits containing the CRC for all the decoded input frames.

For example to compute the CRC of the input, and store it in the file
@file{out.crc}:
@example
ffmpeg -i INPUT -f crc out.crc
@end example

You can print the CRC to stdout with the command:
@example
ffmpeg -i INPUT -f crc -
@end example

You can select the output format of each frame with @command{ffmpeg} by
specifying the audio and video codec and format. For example to
compute the CRC of the input audio converted to PCM unsigned 8-bit
and the input video converted to MPEG-2 video, use the command:
@example
ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f crc -
@end example

See also the @ref{framecrc} muxer.
@anchor{framecrc}
@section framecrc

Per-packet CRC (Cyclic Redundancy Check) testing format.

This muxer computes and prints the Adler-32 CRC for each audio
and video packet. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
CRC.

The output of the muxer consists of a line for each audio and video
packet of the form:
@example
@var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, 0x@var{CRC}
@end example

@var{CRC} is a hexadecimal number 0-padded to 8 digits containing the
CRC of the packet.

For example to compute the CRC of the audio and video frames in
@file{INPUT}, converted to raw audio and video packets, and store it
in the file @file{out.crc}:
@example
ffmpeg -i INPUT -f framecrc out.crc
@end example

To print the information to stdout, use the command:
@example
ffmpeg -i INPUT -f framecrc -
@end example

With @command{ffmpeg}, you can select the output format to which the
audio and video frames are encoded before computing the CRC for each
packet by specifying the audio and video codec. For example, to
compute the CRC of each decoded input audio frame converted to PCM
unsigned 8-bit and of each decoded input video frame converted to
MPEG-2 video, use the command:
@example
ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f framecrc -
@end example

See also the @ref{crc} muxer.
@anchor{framemd5}
@section framemd5

Per-packet MD5 testing format.

This muxer computes and prints the MD5 hash for each audio
and video packet. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
hash.

The output of the muxer consists of a line for each audio and video
packet of the form:
@example
@var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, @var{MD5}
@end example

@var{MD5} is a hexadecimal number representing the computed MD5 hash
for the packet.

For example to compute the MD5 of the audio and video frames in
@file{INPUT}, converted to raw audio and video packets, and store it
in the file @file{out.md5}:
@example
ffmpeg -i INPUT -f framemd5 out.md5
@end example

To print the information to stdout, use the command:
@example
ffmpeg -i INPUT -f framemd5 -
@end example

See also the @ref{md5} muxer.
@anchor{gif}
@section gif

Animated GIF muxer.

It accepts the following options:

@table @option
@item loop
Set the number of times to loop the output. Use @code{-1} for no loop, @code{0}
for looping indefinitely (default).

@item final_delay
Force the delay (expressed in centiseconds) after the last frame. Each frame
ends with a delay until the next frame. The default is @code{-1}, which is a
special value to tell the muxer to re-use the previous delay. In case of a
loop, you might want to customize this value to mark a pause for instance.
@end table

For example, to encode a gif looping 10 times, with a 5-second delay between
the loops:
@example
ffmpeg -i INPUT -loop 10 -final_delay 500 out.gif
@end example

Note 1: if you wish to extract the frames into separate GIF files, you need to
force the @ref{image2} muxer:
@example
ffmpeg -i INPUT -c:v gif -f image2 "out%d.gif"
@end example

Note 2: the GIF format has a very small time base: the delay between two frames
cannot be smaller than one centisecond.
@anchor{hls}
@section hls

Apple HTTP Live Streaming muxer that segments MPEG-TS according to
the HTTP Live Streaming specification.

It creates a playlist file and numbered segment files. The output
filename specifies the playlist filename; the segment filenames
receive the same basename as the playlist, a sequential number and
a .ts extension.

@example
ffmpeg -i in.nut out.m3u8
@end example

@table @option
@item -hls_time @var{seconds}
Set the segment length in seconds.

@item -hls_list_size @var{size}
Set the maximum number of playlist entries.

@item -hls_wrap @var{wrap}
Set the number after which the index wraps.

@item -start_number @var{number}
Start the sequence from @var{number}.
@end table
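For example, to remux an input into 10-second segments while keeping
only the last 6 entries in the playlist (the input filename
@file{in.mp4} is only illustrative):
@example
ffmpeg -i in.mp4 -c copy -hls_time 10 -hls_list_size 6 out.m3u8
@end example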
@anchor{ico}
@section ico

ICO file muxer.

Microsoft's icon file format (ICO) has some strict limitations that should be noted:

@itemize
@item
Size cannot exceed 256 pixels in any dimension

@item
Only BMP and PNG images can be stored

@item
If a BMP image is used, it must be one of the following pixel formats:
@example
BMP Bit Depth   FFmpeg Pixel Format
1bit            pal8
4bit            pal8
8bit            pal8
16bit           rgb555le
24bit           bgr24
32bit           bgra
@end example

@item
If a BMP image is used, it must use the BITMAPINFOHEADER DIB header

@item
If a PNG image is used, it must use the rgba pixel format
@end itemize
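As a sketch of how these constraints can be met on the command line,
the following scales the input to an accepted size and forces a
PNG-coded entry in the rgba pixel format (the input filename is only
illustrative):
@example
ffmpeg -i in.png -vf scale=64:64,format=rgba -c:v png out.ico
@end example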
@anchor{image2}
@section image2

Image file muxer.

The image file muxer writes video frames to image files.

The output filenames are specified by a pattern, which can be used to
produce sequentially numbered series of files.
The pattern may contain the string "%d" or "%0@var{N}d", this string
specifies the position of the characters representing a numbering in
the filenames. If the form "%0@var{N}d" is used, the string
representing the number in each filename is 0-padded to @var{N}
digits. The literal character '%' can be specified in the pattern with
the string "%%".

If the pattern contains "%d" or "%0@var{N}d", the first filename of
the file list specified will contain the number 1, all the following
numbers will be sequential.

The pattern may contain a suffix which is used to automatically
determine the format of the image files to write.

For example the pattern "img-%03d.bmp" will specify a sequence of
filenames of the form @file{img-001.bmp}, @file{img-002.bmp}, ...,
@file{img-010.bmp}, etc.
The pattern "img%%-%d.jpg" will specify a sequence of filenames of the
form @file{img%-1.jpg}, @file{img%-2.jpg}, ..., @file{img%-10.jpg},
etc.

The following example shows how to use @command{ffmpeg} for creating a
sequence of files @file{img-001.jpeg}, @file{img-002.jpeg}, ...,
taking one image every second from the input video:
@example
ffmpeg -i in.avi -vsync 1 -r 1 -f image2 'img-%03d.jpeg'
@end example

Note that with @command{ffmpeg}, if the format is not specified with the
@code{-f} option and the output filename specifies an image file
format, the image2 muxer is automatically selected, so the previous
command can be written as:
@example
ffmpeg -i in.avi -vsync 1 -r 1 'img-%03d.jpeg'
@end example

Note also that the pattern does not necessarily have to contain "%d" or
"%0@var{N}d", for example to create a single image file
@file{img.jpeg} from the input video you can employ the command:
@example
ffmpeg -i in.avi -f image2 -frames:v 1 img.jpeg
@end example
@table @option
@item start_number @var{number}
Start the sequence from @var{number}. Default value is 1. Must be a
non-negative number.

@item update @var{number}
If @var{number} is nonzero, the filename will always be interpreted as just a
filename, not a pattern, and this file will be continuously overwritten with new
images.
@end table
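For example, the @option{update} option can be used to keep a single,
continuously refreshed snapshot of the input (the output filename
@file{preview.jpg} is only illustrative):
@example
ffmpeg -i INPUT -update 1 preview.jpg
@end example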
The image muxer supports the .Y.U.V image file format. This format is
special in that each image frame consists of three files, one for
each of the YUV420P components. To read or write this image file format,
specify the name of the '.Y' file. The muxer will automatically open the
'.U' and '.V' files as required.
@section matroska

Matroska container muxer.

This muxer implements the matroska and webm container specs.

The recognized metadata settings in this muxer are:

@table @option
@item title=@var{title name}
Name provided to a single track.

@item language=@var{language name}
Specifies the language of the track in the Matroska languages form.

@item stereo_mode=@var{mode}
Stereo 3D video layout of two views in a single video track:
@table @option
@item mono
video is not stereo
@item left_right
Both views are arranged side by side, Left-eye view is on the left
@item bottom_top
Both views are arranged in top-bottom orientation, Left-eye view is at bottom
@item top_bottom
Both views are arranged in top-bottom orientation, Left-eye view is on top
@item checkerboard_rl
Each view is arranged in a checkerboard interleaved pattern, Left-eye view being first
@item checkerboard_lr
Each view is arranged in a checkerboard interleaved pattern, Right-eye view being first
@item row_interleaved_rl
Each view is constituted by a row based interleaving, Right-eye view is first row
@item row_interleaved_lr
Each view is constituted by a row based interleaving, Left-eye view is first row
@item col_interleaved_rl
Both views are arranged in a column based interleaving manner, Right-eye view is first column
@item col_interleaved_lr
Both views are arranged in a column based interleaving manner, Left-eye view is first column
@item anaglyph_cyan_red
All frames are in anaglyph format viewable through red-cyan filters
@item right_left
Both views are arranged side by side, Right-eye view is on the left
@item anaglyph_green_magenta
All frames are in anaglyph format viewable through green-magenta filters
@item block_lr
Both eyes laced in one Block, Left-eye view is first
@item block_rl
Both eyes laced in one Block, Right-eye view is first
@end table
@end table
For example a 3D WebM clip can be created using the following command line:
@example
ffmpeg -i sample_left_right_clip.mpg -an -c:v libvpx -metadata stereo_mode=left_right -y stereo_clip.webm
@end example

This muxer supports the following options:

@table @option
@item reserve_index_space
By default, this muxer writes the index for seeking (called cues in Matroska
terms) at the end of the file, because it cannot know in advance how much space
to leave for the index at the beginning of the file. However for some use cases
-- e.g. streaming where seeking is possible but slow -- it is useful to put the
index at the beginning of the file.

If this option is set to a non-zero value, the muxer will reserve the given
amount of space in the file header and then try to write the cues there when
the muxing finishes. If the available space does not suffice, muxing will fail.
A safe size for most use cases should be about 50kB per hour of video.

Note that cues are only written if the output is seekable and this option will
have no effect if it is not.
@end table
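For example, to remux a file into Matroska with space reserved for the
cues at the start of the file (the 100000-byte reservation is only an
illustrative figure, sized following the 50kB-per-hour guideline):
@example
ffmpeg -i in.avi -c copy -reserve_index_space 100000 out.mkv
@end example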
@anchor{md5}
@section md5

MD5 testing format.

This muxer computes and prints the MD5 hash of all the input audio
and video frames. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
hash.

The output of the muxer consists of a single line of the form:
MD5=@var{MD5}, where @var{MD5} is a hexadecimal number representing
the computed MD5 hash.

For example to compute the MD5 hash of the input converted to raw
audio and video, and store it in the file @file{out.md5}:
@example
ffmpeg -i INPUT -f md5 out.md5
@end example

You can print the MD5 to stdout with the command:
@example
ffmpeg -i INPUT -f md5 -
@end example

See also the @ref{framemd5} muxer.
@section mov/mp4/ismv

MOV/MP4/ISMV (Smooth Streaming) muxer.

The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4
file has all the metadata about all packets stored in one location
(written at the end of the file, it can be moved to the start for
better playback by adding @var{faststart} to the @var{movflags}, or
using the @command{qt-faststart} tool). A fragmented
file consists of a number of fragments, where packets and metadata
about these packets are stored together. Writing a fragmented
file has the advantage that the file is decodable even if the
writing is interrupted (while a normal MOV/MP4 is undecodable if
it is not properly finished), and it requires less memory when writing
very long files (since writing normal MOV/MP4 files stores info about
every single packet in memory until the file is closed). The downside
is that it is less compatible with other applications.

Fragmentation is enabled by setting one of the AVOptions that define
how to cut the file into fragments:

@table @option
@item -moov_size @var{bytes}
Reserves space for the moov atom at the beginning of the file instead of placing the
moov atom at the end. If the space reserved is insufficient, muxing will fail.

@item -movflags frag_keyframe
Start a new fragment at each video keyframe.

@item -frag_duration @var{duration}
Create fragments that are @var{duration} microseconds long.

@item -frag_size @var{size}
Create fragments that contain up to @var{size} bytes of payload data.

@item -movflags frag_custom
Allow the caller to manually choose when to cut fragments, by
calling @code{av_write_frame(ctx, NULL)} to write a fragment with
the packets written so far. (This is only useful with other
applications integrating libavformat, not from @command{ffmpeg}.)

@item -min_frag_duration @var{duration}
Don't create fragments that are shorter than @var{duration} microseconds long.
@end table

If more than one condition is specified, fragments are cut when
one of the specified conditions is fulfilled. The exception to this is
@code{-min_frag_duration}, which has to be fulfilled for any of the other
conditions to apply.
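For example, to remux an input into a fragmented MP4, starting a new
fragment at each video keyframe (the @code{empty_moov} flag, described
below, is commonly combined with @code{frag_keyframe} for streaming
use; the filenames are only illustrative):
@example
ffmpeg -i in.mp4 -c copy -movflags frag_keyframe+empty_moov frag.mp4
@end example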
Additionally, the way the output file is written can be adjusted
through a few other options:

@table @option
@item -movflags empty_moov
Write an initial moov atom directly at the start of the file, without
describing any samples in it. Generally, an mdat/moov pair is written
at the start of the file, as a normal MOV/MP4 file, containing only
a short portion of the file. With this option set, there is no initial
mdat atom, and the moov atom only describes the tracks but has
a zero duration.

Files written with this option set do not work in QuickTime.
This option is implicitly set when writing ismv (Smooth Streaming) files.

@item -movflags separate_moof
Write a separate moof (movie fragment) atom for each track. Normally,
packets for all tracks are written in a moof atom (which is slightly
more efficient), but with this option set, the muxer writes one moof/mdat
pair for each track, making it easier to separate tracks.

This option is implicitly set when writing ismv (Smooth Streaming) files.

@item -movflags faststart
Run a second pass moving the index (moov atom) to the beginning of the file.
This operation can take a while, and will not work in various situations such
as fragmented output, thus it is not enabled by default.

@item -movflags rtphint
Add RTP hinting tracks to the output file.
@end table

Smooth Streaming content can be pushed in real time to a publishing
point on IIS with this muxer. Example:
@example
ffmpeg -re @var{<normal input/transcoding options>} -movflags isml+frag_keyframe -f ismv http://server/publishingpoint.isml/Streams(Encoder1)
@end example
@section mp3

The MP3 muxer writes a raw MP3 stream with an ID3v2 header at the beginning and
optionally an ID3v1 tag at the end. ID3v2.3 and ID3v2.4 are supported, the
@code{id3v2_version} option controls which one is used. The legacy ID3v1 tag is
not written by default, but may be enabled with the @code{write_id3v1} option.

For seekable output the muxer also writes a Xing frame at the beginning, which
contains the number of frames in the file. It is useful for computing the
duration of VBR files.

The muxer supports writing ID3v2 attached pictures (APIC frames). The pictures
are supplied to the muxer in the form of a video stream with a single packet.
There can be any number of those streams, each will correspond to a single
APIC frame. The stream metadata tags @var{title} and @var{comment} map to APIC
@var{description} and @var{picture type} respectively. See
@url{http://id3.org/id3v2.4.0-frames} for allowed picture types.

Note that the APIC frames must be written at the beginning, so the muxer will
buffer the audio frames until it gets all the pictures. It is therefore advised
to provide the pictures as soon as possible to avoid excessive buffering.

Examples:

Write an mp3 with an ID3v2.3 header and an ID3v1 footer:
@example
ffmpeg -i INPUT -id3v2_version 3 -write_id3v1 1 out.mp3
@end example

To attach a picture to an mp3 file select both the audio and the picture stream
with @code{map}:
@example
ffmpeg -i input.mp3 -i cover.png -c copy -map 0 -map 1 \
-metadata:s:v title="Album cover" -metadata:s:v comment="Cover (Front)" out.mp3
@end example
@section mpegts

MPEG transport stream muxer.

This muxer implements ISO 13818-1 and part of ETSI EN 300 468.

The muxer options are:

@table @option
@item -mpegts_original_network_id @var{number}
Set the original_network_id (default 0x0001). This is the unique identifier
of a network in DVB. Its main use is in the unique identification of a
service through the path Original_Network_ID, Transport_Stream_ID.

@item -mpegts_transport_stream_id @var{number}
Set the transport_stream_id (default 0x0001). This identifies a
transponder in DVB.

@item -mpegts_service_id @var{number}
Set the service_id (default 0x0001), also known as program in DVB.

@item -mpegts_pmt_start_pid @var{number}
Set the first PID for PMT (default 0x1000, max 0x1f00).

@item -mpegts_start_pid @var{number}
Set the first PID for data packets (default 0x0100, max 0x0f00).

@item -mpegts_m2ts_mode @var{number}
Enable m2ts mode if set to 1. Default value is -1 which disables m2ts mode.

@item -muxrate @var{number}
Set the mux rate.

@item -pes_payload_size @var{number}
Set the minimum PES packet payload in bytes.

@item -mpegts_flags @var{flags}
Set flags (see below).

@item -mpegts_copyts @var{number}
Preserve original timestamps, if value is set to 1. Default value is -1, which
results in shifting timestamps so that they start from 0.

@item -tables_version @var{number}
Set PAT, PMT and SDT version (default 0, valid values are from 0 to 31,
inclusive). This option allows updating the stream structure so that a standard
consumer may detect the change. To do so, reopen the output AVFormatContext (in
case of API usage) or restart the ffmpeg instance, cyclically changing the
tables_version value:
@example
ffmpeg -i source1.ts -codec copy -f mpegts -tables_version 0 udp://1.1.1.1:1111
ffmpeg -i source2.ts -codec copy -f mpegts -tables_version 1 udp://1.1.1.1:1111
...
ffmpeg -i source3.ts -codec copy -f mpegts -tables_version 31 udp://1.1.1.1:1111
ffmpeg -i source1.ts -codec copy -f mpegts -tables_version 0 udp://1.1.1.1:1111
ffmpeg -i source2.ts -codec copy -f mpegts -tables_version 1 udp://1.1.1.1:1111
...
@end example
@end table

The option mpegts_flags may take a set of the following flags:

@table @option
@item resend_headers
Reemit PAT/PMT before writing the next packet.

@item latm
Use LATM packetization for AAC.
@end table

The recognized metadata settings in the mpegts muxer are
@code{service_provider} and @code{service_name}. If they are not set the
default for @code{service_provider} is "FFmpeg" and the default for
@code{service_name} is "Service01".

@example
ffmpeg -i file.mpg -c copy \
     -mpegts_original_network_id 0x1122 \
     -mpegts_transport_stream_id 0x3344 \
     -mpegts_service_id 0x5566 \
     -mpegts_pmt_start_pid 0x1500 \
     -mpegts_start_pid 0x150 \
     -metadata service_provider="Some provider" \
     -metadata service_name="Some Channel" \
     -y out.ts
@end example
@section null

Null muxer.

This muxer does not generate any output file, it is mainly useful for
testing or benchmarking purposes.

For example to benchmark decoding with @command{ffmpeg} you can use the
command:
@example
ffmpeg -benchmark -i INPUT -f null out.null
@end example

Note that the above command does not read or write the @file{out.null}
file, but specifying the output file is required by the @command{ffmpeg}
syntax.

Alternatively you can write the command as:
@example
ffmpeg -benchmark -i INPUT -f null -
@end example
@section ogg

Ogg container muxer.

@table @option
@item -page_duration @var{duration}
Preferred page duration, in microseconds. The muxer will attempt to create
pages that are approximately @var{duration} microseconds long. This allows the
user to compromise between seek granularity and container overhead. The default
is 1 second. A value of 0 will fill all segments, making pages as large as
possible. A value of 1 will effectively use 1 packet-per-page in most
situations, giving a small seek granularity at the cost of additional container
overhead.
@end table
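For example, to get finer seek granularity at the cost of extra
container overhead, pages can be shortened to roughly 100 milliseconds
(assuming the libvorbis encoder is available; the filenames are only
illustrative):
@example
ffmpeg -i in.wav -c:a libvorbis -page_duration 100000 out.ogg
@end example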
@section segment, stream_segment, ssegment

Basic stream segmenter.

The segmenter muxer outputs streams to a number of separate files of nearly
fixed duration. The output filename pattern can be set in a fashion similar to
@ref{image2}.

@code{stream_segment} is a variant of the muxer used to write to
streaming output formats, i.e. which do not require global headers,
and is recommended for outputting e.g. to MPEG transport stream segments.
@code{ssegment} is a shorter alias for @code{stream_segment}.

Every segment starts with a keyframe of the selected reference stream,
which is set through the @option{reference_stream} option.

Note that if you want accurate splitting for a video file, you need to
make the input key frames correspond to the exact splitting times
expected by the segmenter, or the segment muxer will start the new
segment with the key frame found next after the specified start
time.

The segment muxer works best with a single constant frame rate video.

Optionally it can generate a list of the created segments, by setting
the option @var{segment_list}. The list type is specified by the
@var{segment_list_type} option.

The segment muxer supports the following options:

@table @option
@item reference_stream @var{specifier}
Set the reference stream, as specified by the string @var{specifier}.
If @var{specifier} is set to @code{auto}, the reference is chosen
automatically. Otherwise it must be a stream specifier (see the ``Stream
specifiers'' chapter in the ffmpeg manual) which specifies the
reference stream. The default value is @code{auto}.
@item segment_format @var{format}
Override the inner container format, by default it is guessed by the filename
extension.

@item segment_list @var{name}
Generate also a listfile named @var{name}. If not specified no
listfile is generated.

@item segment_list_flags @var{flags}
Set flags affecting the segment list generation.

It currently supports the following flags:
@table @samp
@item cache
Allow caching (only affects M3U8 list files).

@item live
Allow live-friendly file generation.
@end table

Default value is @code{cache}.

@item segment_list_size @var{size}
Update the list file so that it contains at most the last @var{size}
segments. If 0 the list file will contain all the segments. Default
value is 0.
@item segment_list_type @var{type}
Specify the format for the segment list file.

The following values are recognized:
@table @samp
@item flat
Generate a flat list for the created segments, one segment per line.

@item csv, ext
Generate a list for the created segments, one segment per line,
each line matching the format (comma-separated values):
@example
@var{segment_filename},@var{segment_start_time},@var{segment_end_time}
@end example

@var{segment_filename} is the name of the output file generated by the
muxer according to the provided pattern. CSV escaping (according to
RFC4180) is applied if required.

@var{segment_start_time} and @var{segment_end_time} specify
the segment start and end time expressed in seconds.

A list file with the suffix @code{".csv"} or @code{".ext"} will
auto-select this format.

@samp{ext} is deprecated in favor of @samp{csv}.

@item ffconcat
Generate an ffconcat file for the created segments. The resulting file
can be read using the FFmpeg @ref{concat} demuxer.

A list file with the suffix @code{".ffcat"} or @code{".ffconcat"} will
auto-select this format.

@item m3u8
Generate an extended M3U8 file, version 3, compliant with
@url{http://tools.ietf.org/id/draft-pantos-http-live-streaming}.

A list file with the suffix @code{".m3u8"} will auto-select this format.
@end table

If not specified the type is guessed from the list file name suffix.
@item segment_time @var{time}
Set the segment duration to @var{time}, the value must be a duration
specification. Default value is "2". See also the
@option{segment_times} option.

Note that splitting may not be accurate, unless you force the
reference stream key-frames at the given time. See the introductory
notice and the examples below.

@item segment_time_delta @var{delta}
Specify the accuracy time when selecting the start time for a
segment, expressed as a duration specification. Default value is "0".

When delta is specified a key-frame will start a new segment if its
PTS satisfies the relation:
@example
PTS >= start_time - time_delta
@end example

This option is useful when splitting video content, which is always
split at GOP boundaries, in case a key frame is found just before the
specified split time.

In particular it may be used in combination with the @command{ffmpeg} option
@option{force_key_frames}. The key frame times specified by
@option{force_key_frames} may not be set accurately because of rounding
issues, with the consequence that a key frame time may be set just
before the specified time. For constant frame rate videos a value of
1/(2*@var{frame_rate}) should address the worst case mismatch between
the specified time and the time set by @option{force_key_frames}.

@item segment_times @var{times}
Specify a list of split points. @var{times} contains a list of comma
separated duration specifications, in increasing order. See also
the @option{segment_time} option.

@item segment_frames @var{frames}
Specify a list of split video frame numbers. @var{frames} contains a
list of comma separated integer numbers, in increasing order.

This option specifies to start a new segment whenever a reference
stream key frame is found and the sequential number (starting from 0)
of the frame is greater or equal to the next value in the list.

@item segment_wrap @var{limit}
Wrap around the segment index once it reaches @var{limit}.

@item segment_start_number @var{number}
Set the sequence number of the first segment. Defaults to @code{0}.

@item reset_timestamps @var{1|0}
Reset timestamps at the beginning of each segment, so that each segment
will start with near-zero timestamps. It is meant to ease the playback
of the generated segments. May not work with some combinations of
muxers/codecs. It is set to @code{0} by default.

@item initial_offset @var{offset}
Specify a timestamp offset to apply to the output packet timestamps. The
argument must be a time duration specification, and defaults to 0.
@end table
@subsection Examples

@itemize
@item
To remux the content of file @file{in.mkv} to a list of segments
@file{out-000.nut}, @file{out-001.nut}, etc., and write the list of
generated segments to @file{out.list}:
@example
ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.list out%03d.nut
@end example

@item
As the example above, but segment the input file according to the split
points specified by the @option{segment_times} option:
@example
ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 out%03d.nut
@end example

@item
As the example above, but use the @command{ffmpeg} @option{force_key_frames}
option to force key frames in the input at the specified locations, together
with the segment option @option{segment_time_delta} to account for
possible rounding applied when setting key frame times.
@example
ffmpeg -i in.mkv -force_key_frames 1,2,3,5,8,13,21 -codec:v mpeg4 -codec:a pcm_s16le -map 0 \
-f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 -segment_time_delta 0.05 out%03d.nut
@end example

In order to force key frames on the input file, transcoding is
required.
@item
Segment the input file by splitting it according to the frame
number sequence specified with the @option{segment_frames} option:
@example
ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.csv -segment_frames 100,200,300,500,800 out%03d.nut
@end example

@item
To convert @file{in.mkv} to TS segments using the @code{libx264}
and @code{libfaac} encoders:
@example
ffmpeg -i in.mkv -map 0 -codec:v libx264 -codec:a libfaac -f ssegment -segment_list out.list out%03d.ts
@end example

@item
Segment the input file, and create an M3U8 live playlist (can be used
as a live HLS source):
@example
ffmpeg -re -i in.mkv -codec copy -map 0 -f segment -segment_list playlist.m3u8 \
-segment_list_flags +live -segment_time 10 out%03d.mkv
@end example
@end itemize
@section tee

The tee muxer can be used to write the same data to several files or any
other kind of muxer. It can be used, for example, to both stream a video to
the network and save it to disk at the same time.

It is different from specifying several outputs to the @command{ffmpeg}
command-line tool because with the tee muxer the audio and video data will
be encoded only once; encoding can be a very expensive process. It is not
useful when using the libavformat API directly because it is then possible
to feed the same packets to several muxers directly.

The slave outputs are specified in the file name given to the muxer,
separated by '|'. If any of the slave names contains the '|' separator,
leading or trailing spaces or any special character, it must be
escaped (see the ``Quoting and escaping'' section in the ffmpeg-utils
manual).

Muxer options can be specified for each slave by prepending them as a list of
@var{key}=@var{value} pairs separated by ':', between square brackets. If
the option values contain a special character or the ':' separator, they
must be escaped; note that this is a second level of escaping.

The following special options are also recognized:
@table @option
@item f
Specify the format name. Useful if it cannot be guessed from the
output name suffix.

@item bsfs[/@var{spec}]
Specify a list of bitstream filters to apply to the specified
output. It is possible to specify to which streams a given bitstream
filter applies, by appending a stream specifier to the option
separated by @code{/}. If the stream specifier is not specified, the
bitstream filters will be applied to all streams in the output.

Several bitstream filters can be specified, separated by ",".

@item select
Select the streams that should be mapped to the slave output,
specified by a stream specifier. If not specified, this defaults to
all the input streams.
@end table
Some examples follow.

@itemize
@item
Encode something and both archive it in a Matroska file and stream it
as MPEG-TS over UDP (the streams need to be explicitly mapped):
@example
ffmpeg -i ... -c:v libx264 -c:a mp2 -f tee -map 0:v -map 0:a \
  "archive-20121107.mkv|[f=mpegts]udp://10.0.1.255:1234/"
@end example

@item
Use @command{ffmpeg} to encode the input, and send the output
to three different destinations. The @code{dump_extra} bitstream
filter is used to add extradata information to all the output video
keyframe packets, as required by the MPEG-TS format. The @option{select}
option is applied to @file{out.aac} in order to make it contain only
audio packets.
@example
ffmpeg -i ... -map 0 -flags +global_header -c:v libx264 -c:a aac -strict experimental \
  -f tee "[bsfs/v=dump_extra]out.ts|[movflags=+faststart]out.mp4|[select=a]out.aac"
@end example
@end itemize

Note: some codecs may need different options depending on the output format;
the auto-detection of this cannot work with the tee muxer. The main example
is the @option{global_header} flag.

@c man end MUXERS