@chapter Muxers
@c man begin MUXERS

Muxers are configured elements in FFmpeg which allow writing
multimedia streams to a particular type of file.

When you configure your FFmpeg build, all the supported muxers
are enabled by default. You can list all available muxers using the
configure option @code{--list-muxers}.

You can disable all the muxers with the configure option
@code{--disable-muxers} and selectively enable / disable single muxers
with the options @code{--enable-muxer=@var{MUXER}} /
@code{--disable-muxer=@var{MUXER}}.

The option @code{-formats} of the ff* tools will display the list of
enabled muxers.

A description of some of the currently available muxers follows.

@anchor{crc}
@section crc

CRC (Cyclic Redundancy Check) testing format.

This muxer computes and prints the Adler-32 CRC of all the input audio
and video frames. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
CRC.

The output of the muxer consists of a single line of the form:
CRC=0x@var{CRC}, where @var{CRC} is a hexadecimal number 0-padded to
8 digits containing the CRC for all the decoded input frames.

For example to compute the CRC of the input, and store it in the file
@file{out.crc}:
@example
ffmpeg -i INPUT -f crc out.crc
@end example

You can print the CRC to stdout with the command:
@example
ffmpeg -i INPUT -f crc -
@end example

You can select the output format of each frame with @command{ffmpeg} by
specifying the audio and video codec and format. For example to
compute the CRC of the input audio converted to PCM unsigned 8-bit
and the input video converted to MPEG-2 video, use the command:
@example
ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f crc -
@end example

See also the @ref{framecrc} muxer.

@anchor{framecrc}
@section framecrc

Per-packet CRC (Cyclic Redundancy Check) testing format.

This muxer computes and prints the Adler-32 CRC for each audio
and video packet. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
CRC.

The output of the muxer consists of a line for each audio and video
packet of the form:
@example
@var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, 0x@var{CRC}
@end example

@var{CRC} is a hexadecimal number 0-padded to 8 digits containing the
CRC of the packet.

For example to compute the CRC of the audio and video frames in
@file{INPUT}, converted to raw audio and video packets, and store it
in the file @file{out.crc}:
@example
ffmpeg -i INPUT -f framecrc out.crc
@end example

To print the information to stdout, use the command:
@example
ffmpeg -i INPUT -f framecrc -
@end example

With @command{ffmpeg}, you can select the output format to which the
audio and video frames are encoded before computing the CRC for each
packet by specifying the audio and video codec. For example, to
compute the CRC of each decoded input audio frame converted to PCM
unsigned 8-bit and of each decoded input video frame converted to
MPEG-2 video, use the command:
@example
ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f framecrc -
@end example

See also the @ref{crc} muxer.

@anchor{framemd5}
@section framemd5

Per-packet MD5 testing format.

This muxer computes and prints the MD5 hash for each audio
and video packet. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
hash.

The output of the muxer consists of a line for each audio and video
packet of the form:
@example
@var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, @var{MD5}
@end example

@var{MD5} is a hexadecimal number representing the computed MD5 hash
for the packet.

For example to compute the MD5 of the audio and video frames in
@file{INPUT}, converted to raw audio and video packets, and store it
in the file @file{out.md5}:
@example
ffmpeg -i INPUT -f framemd5 out.md5
@end example

To print the information to stdout, use the command:
@example
ffmpeg -i INPUT -f framemd5 -
@end example

See also the @ref{md5} muxer.

@anchor{hls}
@section hls

Apple HTTP Live Streaming muxer that segments MPEG-TS according to
the HTTP Live Streaming specification.

It creates a playlist file and numbered segment files. The output
filename specifies the playlist filename; the segment filenames
receive the same basename as the playlist, a sequential number and
a .ts extension.

@example
ffmpeg -i in.nut out.m3u8
@end example

@table @option
@item -hls_time @var{seconds}
Set the segment length in seconds.
@item -hls_list_size @var{size}
Set the maximum number of playlist entries.
@item -hls_wrap @var{wrap}
Set the number after which the index wraps.
@item -start_number @var{number}
Start the sequence from @var{number}.
@end table
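
For example, assuming the input streams are already TS-compatible and can be
stream copied (a sketch; @file{in.nut} stands in for any input), the playlist
could be limited to the six most recent 10-second segments with:
@example
ffmpeg -i in.nut -c copy -hls_time 10 -hls_list_size 6 out.m3u8
@end example
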
@anchor{ico}
@section ico

ICO file muxer.

Microsoft's icon file format (ICO) has some strict limitations that should be noted:
@itemize
@item
Size cannot exceed 256 pixels in any dimension

@item
Only BMP and PNG images can be stored

@item
If a BMP image is used, it must be one of the following pixel formats:
@example
BMP Bit Depth   FFmpeg Pixel Format
1bit            pal8
4bit            pal8
8bit            pal8
16bit           rgb555le
24bit           bgr24
32bit           bgra
@end example

@item
If a BMP image is used, it must use the BITMAPINFOHEADER DIB header

@item
If a PNG image is used, it must use the rgba pixel format
@end itemize
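
As a sketch that respects these constraints (the file names are placeholders,
and the scale filter here ignores the input aspect ratio), a PNG-based icon
can be written by forcing the @code{rgba} pixel format and a size within the
256-pixel limit:
@example
ffmpeg -i in.png -vf scale=256:256 -c:v png -pix_fmt rgba out.ico
@end example
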
@anchor{image2}
@section image2

Image file muxer.

The image file muxer writes video frames to image files.

The output filenames are specified by a pattern, which can be used to
produce sequentially numbered series of files.
The pattern may contain the string "%d" or "%0@var{N}d", which
specifies the position of the characters representing a numbering in
the filenames. If the form "%0@var{N}d" is used, the string
representing the number in each filename is 0-padded to @var{N}
digits. The literal character '%' can be specified in the pattern with
the string "%%".

If the pattern contains "%d" or "%0@var{N}d", the first filename of
the file list specified will contain the number 1, and all the following
numbers will be sequential.

The pattern may contain a suffix which is used to automatically
determine the format of the image files to write.

For example the pattern "img-%03d.bmp" will specify a sequence of
filenames of the form @file{img-001.bmp}, @file{img-002.bmp}, ...,
@file{img-010.bmp}, etc.
The pattern "img%%-%d.jpg" will specify a sequence of filenames of the
form @file{img%-1.jpg}, @file{img%-2.jpg}, ..., @file{img%-10.jpg},
etc.

The following example shows how to use @command{ffmpeg} for creating a
sequence of files @file{img-001.jpeg}, @file{img-002.jpeg}, ...,
taking one image every second from the input video:
@example
ffmpeg -i in.avi -vsync 1 -r 1 -f image2 'img-%03d.jpeg'
@end example

Note that with @command{ffmpeg}, if the format is not specified with the
@code{-f} option and the output filename specifies an image file
format, the image2 muxer is automatically selected, so the previous
command can be written as:
@example
ffmpeg -i in.avi -vsync 1 -r 1 'img-%03d.jpeg'
@end example

Note also that the pattern does not necessarily have to contain "%d" or
"%0@var{N}d"; for example to create a single image file
@file{img.jpeg} from the input video you can employ the command:
@example
ffmpeg -i in.avi -f image2 -frames:v 1 img.jpeg
@end example

@table @option
@item start_number @var{number}
Start the sequence from @var{number}. Default value is 1. Must be a
positive number.
@item update @var{number}
If @var{number} is nonzero, the filename will always be interpreted as just a
filename, not a pattern, and this file will be continuously overwritten with new
images (see the example after this table).
@end table
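
For instance, the @option{update} option can be combined with the @code{fps}
filter to keep a single, continuously refreshed snapshot of the input (a
sketch; @file{thumb.jpg} is a placeholder name):
@example
ffmpeg -i INPUT -vf fps=1 -update 1 thumb.jpg
@end example
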
The image muxer supports the .Y.U.V image file format. This format is
special in that each image frame consists of three files, one for
each of the YUV420P components. To read or write this image file format,
specify the name of the '.Y' file. The muxer will automatically open the
'.U' and '.V' files as required.

@section matroska

Matroska container muxer.

This muxer implements the matroska and webm container specs.

The recognized metadata settings in this muxer are:

@table @option
@item title=@var{title name}
Name provided to a single track
@end table

@table @option
@item language=@var{language name}
Specifies the language of the track in the Matroska languages form
@end table

@table @option
@item stereo_mode=@var{mode}
Stereo 3D video layout of two views in a single video track
@table @option
@item mono
video is not stereo
@item left_right
Both views are arranged side by side, Left-eye view is on the left
@item bottom_top
Both views are arranged in top-bottom orientation, Left-eye view is at bottom
@item top_bottom
Both views are arranged in top-bottom orientation, Left-eye view is on top
@item checkerboard_rl
Each view is arranged in a checkerboard interleaved pattern, Left-eye view being first
@item checkerboard_lr
Each view is arranged in a checkerboard interleaved pattern, Right-eye view being first
@item row_interleaved_rl
Each view is constituted by a row based interleaving, Right-eye view is first row
@item row_interleaved_lr
Each view is constituted by a row based interleaving, Left-eye view is first row
@item col_interleaved_rl
Both views are arranged in a column based interleaving manner, Right-eye view is first column
@item col_interleaved_lr
Both views are arranged in a column based interleaving manner, Left-eye view is first column
@item anaglyph_cyan_red
All frames are in anaglyph format viewable through red-cyan filters
@item right_left
Both views are arranged side by side, Right-eye view is on the left
@item anaglyph_green_magenta
All frames are in anaglyph format viewable through green-magenta filters
@item block_lr
Both eyes laced in one Block, Left-eye view is first
@item block_rl
Both eyes laced in one Block, Right-eye view is first
@end table
@end table

For example a 3D WebM clip can be created using the following command line:
@example
ffmpeg -i sample_left_right_clip.mpg -an -c:v libvpx -metadata stereo_mode=left_right -y stereo_clip.webm
@end example

This muxer supports the following options:

@table @option
@item reserve_index_space
By default, this muxer writes the index for seeking (called cues in Matroska
terms) at the end of the file, because it cannot know in advance how much space
to leave for the index at the beginning of the file. However for some use cases
-- e.g. streaming where seeking is possible but slow -- it is useful to put the
index at the beginning of the file.

If this option is set to a non-zero value, the muxer will reserve a given amount
of space in the file header and then try to write the cues there when the muxing
finishes. If the available space does not suffice, muxing will fail. A safe size
for most use cases should be about 50kB per hour of video.

Note that cues are only written if the output is seekable and this option will
have no effect if it is not.
@end table
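
As a sketch, remuxing a roughly two-hour file while reserving about 100 kB
for the cues at the beginning (following the 50kB per hour guideline above;
the file names are placeholders) could look like this:
@example
ffmpeg -i in.mkv -c copy -reserve_index_space 100000 out.mkv
@end example
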
@anchor{md5}
@section md5

MD5 testing format.

This muxer computes and prints the MD5 hash of all the input audio
and video frames. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
hash.

The output of the muxer consists of a single line of the form:
MD5=@var{MD5}, where @var{MD5} is a hexadecimal number representing
the computed MD5 hash.

For example to compute the MD5 hash of the input converted to raw
audio and video, and store it in the file @file{out.md5}:
@example
ffmpeg -i INPUT -f md5 out.md5
@end example

You can print the MD5 to stdout with the command:
@example
ffmpeg -i INPUT -f md5 -
@end example

See also the @ref{framemd5} muxer.

@section MOV/MP4/ISMV

The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4
file has all the metadata about all packets stored in one location
(written at the end of the file, it can be moved to the start for
better playback by adding @var{faststart} to the @var{movflags}, or
using the @command{qt-faststart} tool). A fragmented
file consists of a number of fragments, where packets and metadata
about these packets are stored together. Writing a fragmented
file has the advantage that the file is decodable even if the
writing is interrupted (while a normal MOV/MP4 is undecodable if
it is not properly finished), and it requires less memory when writing
very long files (since writing normal MOV/MP4 files stores info about
every single packet in memory until the file is closed). The downside
is that it is less compatible with other applications.

Fragmentation is enabled by setting one of the AVOptions that define
how to cut the file into fragments:

@table @option
@item -moov_size @var{bytes}
Reserves space for the moov atom at the beginning of the file instead of placing the
moov atom at the end. If the space reserved is insufficient, muxing will fail.
@item -movflags frag_keyframe
Start a new fragment at each video keyframe.
@item -frag_duration @var{duration}
Create fragments that are @var{duration} microseconds long.
@item -frag_size @var{size}
Create fragments that contain up to @var{size} bytes of payload data.
@item -movflags frag_custom
Allow the caller to manually choose when to cut fragments, by
calling @code{av_write_frame(ctx, NULL)} to write a fragment with
the packets written so far. (This is only useful with other
applications integrating libavformat, not from @command{ffmpeg}.)
@item -min_frag_duration @var{duration}
Don't create fragments that are shorter than @var{duration} microseconds long.
@end table

If more than one condition is specified, fragments are cut when
one of the specified conditions is fulfilled. The exception to this is
@code{-min_frag_duration}, which has to be fulfilled for any of the other
conditions to apply.
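
For example, assuming the input streams can be copied into MP4 unchanged, a
fragmented file cut at video keyframes and additionally capped at roughly five
seconds per fragment could be written with (a sketch):
@example
ffmpeg -i INPUT -c copy -movflags frag_keyframe -frag_duration 5000000 frag.mp4
@end example
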

Additionally, the way the output file is written can be adjusted
through a few other options:

@table @option
@item -movflags empty_moov
Write an initial moov atom directly at the start of the file, without
describing any samples in it. Generally, an mdat/moov pair is written
at the start of the file, as a normal MOV/MP4 file, containing only
a short portion of the file. With this option set, there is no initial
mdat atom, and the moov atom only describes the tracks but has
a zero duration.

Files written with this option set do not work in QuickTime.
This option is implicitly set when writing ismv (Smooth Streaming) files.
@item -movflags separate_moof
Write a separate moof (movie fragment) atom for each track. Normally,
packets for all tracks are written in a moof atom (which is slightly
more efficient), but with this option set, the muxer writes one moof/mdat
pair for each track, making it easier to separate tracks.

This option is implicitly set when writing ismv (Smooth Streaming) files.
@item -movflags faststart
Run a second pass moving the moov atom to the beginning of the file. This
operation can take a while, and will not work in various situations such
as fragmented output, thus it is not enabled by default. See the example
after this table.
@item -movflags rtphint
Add RTP hinting tracks to the output file.
@end table
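
For instance, to make an already encoded MP4 start playing before it is fully
downloaded, the moov atom can be relocated in a stream copy pass (a sketch;
the file names are placeholders):
@example
ffmpeg -i in.mp4 -c copy -movflags +faststart out.mp4
@end example
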

Smooth Streaming content can be pushed in real time to a publishing
point on IIS with this muxer. Example:
@example
ffmpeg -re @var{<normal input/transcoding options>} -movflags isml+frag_keyframe -f ismv http://server/publishingpoint.isml/Streams(Encoder1)
@end example

@section mp3

The MP3 muxer writes a raw MP3 stream with an ID3v2 header at the beginning and
optionally an ID3v1 tag at the end. ID3v2.3 and ID3v2.4 are supported; the
@code{id3v2_version} option controls which one is used. The legacy ID3v1 tag is
not written by default, but may be enabled with the @code{write_id3v1} option.

For seekable output the muxer also writes a Xing frame at the beginning, which
contains the number of frames in the file. It is useful for computing the
duration of VBR files.

The muxer supports writing ID3v2 attached pictures (APIC frames). The pictures
are supplied to the muxer in the form of a video stream with a single packet.
There can be any number of those streams, each of which will correspond to a
single APIC frame.
The stream metadata tags @var{title} and @var{comment} map to APIC
@var{description} and @var{picture type} respectively. See
@url{http://id3.org/id3v2.4.0-frames} for allowed picture types.

Note that the APIC frames must be written at the beginning, so the muxer will
buffer the audio frames until it gets all the pictures. It is therefore advised
to provide the pictures as soon as possible to avoid excessive buffering.

Examples:

Write an mp3 with an ID3v2.3 header and an ID3v1 footer:
@example
ffmpeg -i INPUT -id3v2_version 3 -write_id3v1 1 out.mp3
@end example

To attach a picture to an mp3 file, select both the audio and the picture stream
with @code{map}:
@example
ffmpeg -i input.mp3 -i cover.png -c copy -map 0 -map 1 \
-metadata:s:v title="Album cover" -metadata:s:v comment="Cover (Front)" out.mp3
@end example

@section mpegts

MPEG transport stream muxer.

This muxer implements ISO 13818-1 and part of ETSI EN 300 468.

The muxer options are:

@table @option
@item -mpegts_original_network_id @var{number}
Set the original_network_id (default 0x0001). This is the unique identifier
of a network in DVB. Its main use is in the unique identification of a
service through the path Original_Network_ID, Transport_Stream_ID.
@item -mpegts_transport_stream_id @var{number}
Set the transport_stream_id (default 0x0001). This identifies a
transponder in DVB.
@item -mpegts_service_id @var{number}
Set the service_id (default 0x0001), also known as program in DVB.
@item -mpegts_pmt_start_pid @var{number}
Set the first PID for PMT (default 0x1000, max 0x1f00).
@item -mpegts_start_pid @var{number}
Set the first PID for data packets (default 0x0100, max 0x0f00).
@end table

The recognized metadata settings in the mpegts muxer are @code{service_provider}
and @code{service_name}. If they are not set, the default for
@code{service_provider} is "FFmpeg" and the default for
@code{service_name} is "Service01".

@example
ffmpeg -i file.mpg -c copy \
     -mpegts_original_network_id 0x1122 \
     -mpegts_transport_stream_id 0x3344 \
     -mpegts_service_id 0x5566 \
     -mpegts_pmt_start_pid 0x1500 \
     -mpegts_start_pid 0x150 \
     -metadata service_provider="Some provider" \
     -metadata service_name="Some Channel" \
     -y out.ts
@end example

@section null

Null muxer.

This muxer does not generate any output file, it is mainly useful for
testing or benchmarking purposes.

For example to benchmark decoding with @command{ffmpeg} you can use the
command:
@example
ffmpeg -benchmark -i INPUT -f null out.null
@end example

Note that the above command does not read or write the @file{out.null}
file, but specifying the output file is required by the @command{ffmpeg}
syntax.

Alternatively you can write the command as:
@example
ffmpeg -benchmark -i INPUT -f null -
@end example

@section ogg

Ogg container muxer.

@table @option
@item -page_duration @var{duration}
Preferred page duration, in microseconds. The muxer will attempt to create
pages that are approximately @var{duration} microseconds long. This allows the
user to compromise between seek granularity and container overhead. The default
is 1 second. A value of 0 will fill all segments, making pages as large as
possible. A value of 1 will effectively use 1 packet-per-page in most
situations, giving finer seek granularity at the cost of additional container
overhead.
@end table
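
As a sketch (assuming the @code{libvorbis} encoder is available in the build),
the following targets pages of about 100 milliseconds, trading some container
overhead for finer seeking:
@example
ffmpeg -i in.wav -c:a libvorbis -page_duration 100000 out.ogg
@end example
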

@section segment, stream_segment, ssegment

Basic stream segmenter.

The segmenter muxer outputs streams to a number of separate files of nearly
fixed duration. The output filename pattern can be set in a fashion similar to
@ref{image2}.

@code{stream_segment} is a variant of the muxer used to write to
streaming output formats, i.e. which do not require global headers,
and is recommended for outputting e.g. to MPEG transport stream segments.
@code{ssegment} is a shorter alias for @code{stream_segment}.

Every segment starts with a keyframe of the selected reference stream,
which is set through the @option{reference_stream} option.

Note that if you want accurate splitting for a video file, you need to
make the input key frames correspond to the exact splitting times
expected by the segmenter, or the segment muxer will start the new
segment with the next key frame found after the specified start
time.

The segment muxer works best with a single constant frame rate video.

Optionally it can generate a list of the created segments, by setting
the option @var{segment_list}. The list type is specified by the
@var{segment_list_type} option.

The segment muxer supports the following options:

@table @option
@item reference_stream @var{specifier}
Set the reference stream, as specified by the string @var{specifier}.
If @var{specifier} is set to @code{auto}, the reference is chosen
automatically. Otherwise it must be a stream specifier (see the ``Stream
specifiers'' chapter in the ffmpeg manual) which specifies the
reference stream. The default value is @code{auto}.
@item segment_format @var{format}
Override the inner container format, by default it is guessed by the filename
extension.
@item segment_list @var{name}
Also generate a list file named @var{name}. If not specified, no
list file is generated.
@item segment_list_flags @var{flags}
Set flags affecting the segment list generation.
It currently supports the following flags:
@table @samp
@item cache
Allow caching (only affects M3U8 list files).
@item live
Allow live-friendly file generation.
@end table
Default value is @code{cache}.

@item segment_list_size @var{size}
Update the list file so that it contains at most the last @var{size}
segments. If 0, the list file will contain all the segments. Default
value is 0.
@item segment_list_type @var{type}
Specify the format for the segment list file.
The following values are recognized:
@table @samp
@item flat
Generate a flat list for the created segments, one segment per line.
@item csv, ext
Generate a list for the created segments, one segment per line,
each line matching the format (comma-separated values):
@example
@var{segment_filename},@var{segment_start_time},@var{segment_end_time}
@end example

@var{segment_filename} is the name of the output file generated by the
muxer according to the provided pattern. CSV escaping (according to
RFC4180) is applied if required.
@var{segment_start_time} and @var{segment_end_time} specify
the segment start and end time expressed in seconds.

A list file with the suffix @code{".csv"} or @code{".ext"} will
auto-select this format.

@samp{ext} is deprecated in favor of @samp{csv}.
@item ffconcat
Generate an ffconcat file for the created segments. The resulting file
can be read using the FFmpeg @ref{concat} demuxer.

A list file with the suffix @code{".ffcat"} or @code{".ffconcat"} will
auto-select this format.
@item m3u8
Generate an extended M3U8 file, version 3, compliant with
@url{http://tools.ietf.org/id/draft-pantos-http-live-streaming}.

A list file with the suffix @code{".m3u8"} will auto-select this format.
@end table

If not specified, the type is guessed from the list file name suffix.
@item segment_time @var{time}
Set segment duration to @var{time}; the value must be a duration
specification. Default value is "2". See also the
@option{segment_times} option.

Note that splitting may not be accurate, unless you force the
reference stream key-frames at the given time. See the introductory
notice and the examples below.
@item segment_time_delta @var{delta}
Specify the accuracy time when selecting the start time for a
segment, expressed as a duration specification. Default value is "0".

When delta is specified a key-frame will start a new segment if its
PTS satisfies the relation:
@example
PTS >= start_time - time_delta
@end example

This option is useful when splitting video content, which is always
split at GOP boundaries, in case a key frame is found just before the
specified split time.

In particular it may be used in combination with the @command{ffmpeg} option
@option{force_key_frames}. The key frame times specified by
@option{force_key_frames} may not be set accurately because of rounding
issues, with the consequence that a key frame time may end up set just
before the specified time. For constant frame rate videos a value of
1/(2*@var{frame_rate}) should address the worst case mismatch between
the specified time and the time set by @option{force_key_frames}.
@item segment_times @var{times}
Specify a list of split points. @var{times} contains a list of comma
separated duration specifications, in increasing order. See also
the @option{segment_time} option.
@item segment_frames @var{frames}
Specify a list of split video frame numbers. @var{frames} contains a
list of comma separated integer numbers, in increasing order.

This option specifies that a new segment is started whenever a reference
stream key frame is found whose sequential number (starting from 0)
is greater than or equal to the next value in the list.
@item segment_wrap @var{limit}
Wrap around the segment index once it reaches @var{limit}.
@item segment_start_number @var{number}
Set the sequence number of the first segment. Defaults to @code{0}.
@item reset_timestamps @var{1|0}
Reset timestamps at the beginning of each segment, so that each segment
will start with near-zero timestamps. It is meant to ease the playback
of the generated segments. May not work with some combinations of
muxers/codecs. It is set to @code{0} by default.
@end table

@subsection Examples

@itemize
@item
Remux the content of the file @file{in.mkv} to a list of segments
@file{out-000.nut}, @file{out-001.nut}, etc., and write the list of
generated segments to @file{out.list}:
@example
ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.list out%03d.nut
@end example

@item
As in the example above, but segment the input file according to the split
points specified by the @var{segment_times} option:
@example
ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 out%03d.nut
@end example

@item
As in the example above, but use the @command{ffmpeg} @option{force_key_frames}
option to force key frames in the input at the specified location, together
with the segment option @option{segment_time_delta} to account for
possible rounding when setting key frame times:
@example
ffmpeg -i in.mkv -force_key_frames 1,2,3,5,8,13,21 -codec:v mpeg4 -codec:a pcm_s16le -map 0 \
-f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 -segment_time_delta 0.05 out%03d.nut
@end example

In order to force key frames on the input file, transcoding is
required.

@item
Segment the input file by splitting it according to the
frame number sequence specified with the @option{segment_frames} option:
@example
ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.csv -segment_frames 100,200,300,500,800 out%03d.nut
@end example

@item
Convert @file{in.mkv} to TS segments using the @code{libx264}
and @code{libfaac} encoders:
@example
ffmpeg -i in.mkv -map 0 -codec:v libx264 -codec:a libfaac -f ssegment -segment_list out.list out%03d.ts
@end example

@item
Segment the input file, and create an M3U8 live playlist (can be used
as live HLS source):
@example
ffmpeg -re -i in.mkv -codec copy -map 0 -f segment -segment_list playlist.m3u8 \
-segment_list_flags +live -segment_time 10 out%03d.mkv
@end example
@end itemize

@section tee

The tee muxer can be used to write the same data to several files or any
other kind of muxer. It can be used, for example, to both stream a video to
the network and save it to disk at the same time.

It is different from specifying several outputs to the @command{ffmpeg}
command-line tool because the audio and video data will be encoded only once
with the tee muxer; encoding can be a very expensive process. It is not
useful when using the libavformat API directly because it is then possible
to feed the same packets to several muxers directly.

The slave outputs are specified in the file name given to the muxer,
separated by '|'. If any of the slave names contains the '|' separator,
leading or trailing spaces or any special character, it must be
escaped (see the ``Quoting and escaping'' section in the ffmpeg-utils
manual).

Options can be specified for each slave by prepending them as a list of
@var{key}=@var{value} pairs separated by ':', between square brackets. If
the option values contain a special character or the ':' separator, they
must be escaped; note that this is a second level escaping.

Example: encode something and both archive it in a Matroska file and stream it
as MPEG-TS over UDP (the streams need to be explicitly mapped):
@example
ffmpeg -i ... -c:v libx264 -c:a mp2 -f tee -map 0:v -map 0:a \
  "archive-20121107.mkv|[f=mpegts]udp://10.0.1.255:1234/"
@end example

Note: some codecs may need different options depending on the output format;
the auto-detection of this cannot work with the tee muxer. The main example
is the @option{global_header} flag.

@c man end MUXERS