@chapter Muxers
@c man begin MUXERS

Muxers are configured elements in FFmpeg which allow writing
multimedia streams to a particular type of file.

When you configure your FFmpeg build, all the supported muxers
are enabled by default. You can list all available muxers using the
configure option @code{--list-muxers}.

You can disable all the muxers with the configure option
@code{--disable-muxers} and selectively enable / disable single muxers
with the options @code{--enable-muxer=@var{MUXER}} /
@code{--disable-muxer=@var{MUXER}}.

The option @code{-formats} of the ff* tools will display the list of
enabled muxers.
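
For example, to check which muxers are available, build FFmpeg with only a
couple of muxers enabled, and then verify the result (the muxer names below
are only an illustrative sketch):
@example
./configure --list-muxers
./configure --disable-muxers --enable-muxer=mp4 --enable-muxer=matroska
ffmpeg -formats
@end example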

A description of some of the currently available muxers follows.

@anchor{crc}
@section crc

CRC (Cyclic Redundancy Check) testing format.

This muxer computes and prints the Adler-32 CRC of all the input audio
and video frames. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
CRC.

The output of the muxer consists of a single line of the form:
CRC=0x@var{CRC}, where @var{CRC} is a hexadecimal number 0-padded to
8 digits containing the CRC for all the decoded input frames.

For example to compute the CRC of the input, and store it in the file
@file{out.crc}:
@example
ffmpeg -i INPUT -f crc out.crc
@end example

You can print the CRC to stdout with the command:
@example
ffmpeg -i INPUT -f crc -
@end example

You can select the output format of each frame with @command{ffmpeg} by
specifying the audio and video codec and format. For example to
compute the CRC of the input audio converted to PCM unsigned 8-bit
and the input video converted to MPEG-2 video, use the command:
@example
ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f crc -
@end example

See also the @ref{framecrc} muxer.

@anchor{framecrc}
@section framecrc

Per-frame CRC (Cyclic Redundancy Check) testing format.

This muxer computes and prints the Adler-32 CRC for each decoded audio
and video frame. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
CRC.

The output of the muxer consists of a line for each audio and video
frame of the form: @var{stream_index}, @var{frame_dts},
@var{frame_size}, 0x@var{CRC}, where @var{CRC} is a hexadecimal
number 0-padded to 8 digits containing the CRC of the decoded frame.

For example to compute the CRC of each decoded frame in the input, and
store it in the file @file{out.crc}:
@example
ffmpeg -i INPUT -f framecrc out.crc
@end example

You can print the CRC of each decoded frame to stdout with the command:
@example
ffmpeg -i INPUT -f framecrc -
@end example

You can select the output format of each frame with @command{ffmpeg} by
specifying the audio and video codec and format. For example, to
compute the CRC of each decoded input audio frame converted to PCM
unsigned 8-bit and of each decoded input video frame converted to
MPEG-2 video, use the command:
@example
ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f framecrc -
@end example

See also the @ref{crc} muxer.

@anchor{image2}
@section image2

Image file muxer.

The image file muxer writes video frames to image files.

The output filenames are specified by a pattern, which can be used to
produce sequentially numbered series of files.
The pattern may contain the string "%d" or "%0@var{N}d"; this string
specifies the position of the characters representing a numbering in
the filenames. If the form "%0@var{N}d" is used, the string
representing the number in each filename is 0-padded to @var{N}
digits. The literal character '%' can be specified in the pattern with
the string "%%".

If the pattern contains "%d" or "%0@var{N}d", the first filename of
the file list specified will contain the number 1; all the following
numbers will be sequential.

The pattern may contain a suffix which is used to automatically
determine the format of the image files to write.

For example the pattern "img-%03d.bmp" will specify a sequence of
filenames of the form @file{img-001.bmp}, @file{img-002.bmp}, ...,
@file{img-010.bmp}, etc.
The pattern "img%%-%d.jpg" will specify a sequence of filenames of the
form @file{img%-1.jpg}, @file{img%-2.jpg}, ..., @file{img%-10.jpg},
etc.

The following example shows how to use @command{ffmpeg} for creating a
sequence of files @file{img-001.jpeg}, @file{img-002.jpeg}, ...,
taking one image every second from the input video:
@example
ffmpeg -i in.avi -vsync 1 -r 1 -f image2 'img-%03d.jpeg'
@end example

Note that with @command{ffmpeg}, if the format is not specified with the
@code{-f} option and the output filename specifies an image file
format, the image2 muxer is automatically selected, so the previous
command can be written as:
@example
ffmpeg -i in.avi -vsync 1 -r 1 'img-%03d.jpeg'
@end example

Note also that the pattern need not contain "%d" or
"%0@var{N}d"; for example, to create a single image file
@file{img.jpeg} from the input video you can employ the command:
@example
ffmpeg -i in.avi -f image2 -frames:v 1 img.jpeg
@end example

The image muxer supports the .Y.U.V image file format. This format is
special in that each image frame consists of three files, one for
each of the YUV420P components. To read or write this image file format,
specify the name of the '.Y' file. The muxer will automatically open the
'.U' and '.V' files as required.
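
For example, a minimal sketch of writing a single frame to this format
(assuming the frame can be converted to the yuv420p pixel format; the file
names are illustrative):
@example
ffmpeg -i in.avi -pix_fmt yuv420p -frames:v 1 -f image2 img.Y
@end example
The companion files @file{img.U} and @file{img.V} are created automatically
alongside @file{img.Y}.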

@section MOV/MP4/ISMV

The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4
file has all the metadata about all packets stored in one location
(written at the end of the file, it can be moved to the start for
better playback using the @command{qt-faststart} tool). A fragmented
file consists of a number of fragments, where packets and metadata
about these packets are stored together. Writing a fragmented
file has the advantage that the file is decodable even if the
writing is interrupted (while a normal MOV/MP4 is undecodable if
it is not properly finished), and it requires less memory when writing
very long files (since writing normal MOV/MP4 files stores info about
every single packet in memory until the file is closed). The downside
is that it is less compatible with other applications.

Fragmentation is enabled by setting one of the AVOptions that define
how to cut the file into fragments:

@table @option
@item -moov_size @var{bytes}
Reserves space for the moov atom at the beginning of the file instead of placing the
moov atom at the end. If the space reserved is insufficient, muxing will fail.
@item -movflags frag_keyframe
Start a new fragment at each video keyframe.
@item -frag_duration @var{duration}
Create fragments that are @var{duration} microseconds long.
@item -frag_size @var{size}
Create fragments that contain up to @var{size} bytes of payload data.
@item -movflags frag_custom
Allow the caller to manually choose when to cut fragments, by
calling @code{av_write_frame(ctx, NULL)} to write a fragment with
the packets written so far. (This is only useful with other
applications integrating libavformat, not from @command{ffmpeg}.)
@item -min_frag_duration @var{duration}
Don't create fragments that are shorter than @var{duration} microseconds long.
@end table

If more than one condition is specified, fragments are cut when
one of the specified conditions is fulfilled. The exception to this is
@code{-min_frag_duration}, which has to be fulfilled for any of the other
conditions to apply.
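
For example, a sketch of remuxing an input into a fragmented MP4, cutting a
new fragment at each video keyframe or after 10 seconds of content, whichever
comes first (the input name and duration are illustrative):
@example
ffmpeg -i INPUT -c copy -movflags frag_keyframe -frag_duration 10000000 out.mp4
@end example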

Additionally, the way the output file is written can be adjusted
through a few other options:

@table @option
@item -movflags empty_moov
Write an initial moov atom directly at the start of the file, without
describing any samples in it. Generally, an mdat/moov pair is written
at the start of the file, as a normal MOV/MP4 file, containing only
a short portion of the file. With this option set, there is no initial
mdat atom, and the moov atom only describes the tracks but has
a zero duration.

Files written with this option set do not work in QuickTime.
This option is implicitly set when writing ismv (Smooth Streaming) files.
@item -movflags separate_moof
Write a separate moof (movie fragment) atom for each track. Normally,
packets for all tracks are written in a moof atom (which is slightly
more efficient), but with this option set, the muxer writes one moof/mdat
pair for each track, making it easier to separate tracks.

This option is implicitly set when writing ismv (Smooth Streaming) files.
@end table
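
As an illustrative sketch, combining @code{empty_moov} with
@code{frag_keyframe} writes a fully fragmented file whose initial moov atom
describes the tracks but no samples:
@example
ffmpeg -i INPUT -c copy -movflags empty_moov+frag_keyframe out.mp4
@end example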

Smooth Streaming content can be pushed in real time to a publishing
point on IIS with this muxer. Example:
@example
ffmpeg -re @var{<normal input/transcoding options>} -movflags isml+frag_keyframe -f ismv http://server/publishingpoint.isml/Streams(Encoder1)
@end example

@section mpegts

MPEG transport stream muxer.

This muxer implements ISO 13818-1 and part of ETSI EN 300 468.

The muxer options are:

@table @option
@item -mpegts_original_network_id @var{number}
Set the original_network_id (default 0x0001). This is the unique identifier
of a network in DVB. Its main use is in the unique identification of a
service through the path Original_Network_ID, Transport_Stream_ID.
@item -mpegts_transport_stream_id @var{number}
Set the transport_stream_id (default 0x0001). This identifies a
transponder in DVB.
@item -mpegts_service_id @var{number}
Set the service_id (default 0x0001), also known as program in DVB.
@item -mpegts_pmt_start_pid @var{number}
Set the first PID for PMT (default 0x1000, max 0x1f00).
@item -mpegts_start_pid @var{number}
Set the first PID for data packets (default 0x0100, max 0x0f00).
@end table

The recognized metadata settings in the mpegts muxer are @code{service_provider}
and @code{service_name}. If they are not set, the default for
@code{service_provider} is "FFmpeg" and the default for
@code{service_name} is "Service01".

@example
ffmpeg -i file.mpg -c copy \
     -mpegts_original_network_id 0x1122 \
     -mpegts_transport_stream_id 0x3344 \
     -mpegts_service_id 0x5566 \
     -mpegts_pmt_start_pid 0x1500 \
     -mpegts_start_pid 0x150 \
     -metadata service_provider="Some provider" \
     -metadata service_name="Some Channel" \
     -y out.ts
@end example

@section null

Null muxer.

This muxer does not generate any output file; it is mainly useful for
testing or benchmarking purposes.

For example to benchmark decoding with @command{ffmpeg} you can use the
command:
@example
ffmpeg -benchmark -i INPUT -f null out.null
@end example

Note that the above command does not read or write the @file{out.null}
file, but specifying the output file is required by the @command{ffmpeg}
syntax.

Alternatively you can write the command as:
@example
ffmpeg -benchmark -i INPUT -f null -
@end example

@section matroska

Matroska container muxer.

This muxer implements the matroska and webm container specs.

The recognized metadata settings in this muxer are:

@table @option

@item title=@var{title name}
Name provided to a single track
@end table

@table @option

@item language=@var{language name}
Specifies the language of the track in the Matroska languages form
@end table
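
For example, a sketch of setting the name and language of the first audio
track while copying the streams (the tag values here are illustrative):
@example
ffmpeg -i INPUT -c copy -metadata:s:a:0 title="Commentary" -metadata:s:a:0 language=eng out.mkv
@end example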

@table @option

@item stereo_mode=@var{mode}
Stereo 3D video layout of two views in a single video track
@table @option
@item mono
video is not stereo
@item left_right
Both views are arranged side by side, Left-eye view is on the left
@item bottom_top
Both views are arranged in top-bottom orientation, Left-eye view is at bottom
@item top_bottom
Both views are arranged in top-bottom orientation, Left-eye view is on top
@item checkerboard_rl
Each view is arranged in a checkerboard interleaved pattern, Right-eye view being first
@item checkerboard_lr
Each view is arranged in a checkerboard interleaved pattern, Left-eye view being first
@item row_interleaved_rl
Each view is constituted by a row based interleaving, Right-eye view is first row
@item row_interleaved_lr
Each view is constituted by a row based interleaving, Left-eye view is first row
@item col_interleaved_rl
Both views are arranged in a column based interleaving manner, Right-eye view is first column
@item col_interleaved_lr
Both views are arranged in a column based interleaving manner, Left-eye view is first column
@item anaglyph_cyan_red
All frames are in anaglyph format viewable through red-cyan filters
@item right_left
Both views are arranged side by side, Right-eye view is on the left
@item anaglyph_green_magenta
All frames are in anaglyph format viewable through green-magenta filters
@item block_lr
Both eyes laced in one Block, Left-eye view is first
@item block_rl
Both eyes laced in one Block, Right-eye view is first
@end table
@end table

For example a 3D WebM clip can be created using the following command line:
@example
ffmpeg -i sample_left_right_clip.mpg -an -c:v libvpx -metadata stereo_mode=left_right -y stereo_clip.webm
@end example

@section segment

Basic stream segmenter.

The segmenter muxer outputs streams to a number of separate files of nearly
fixed duration. The output filename pattern can be set in a fashion similar to
@ref{image2}.

Every segment starts with a video keyframe, if a video stream is present.
The segment muxer works best with a single constant frame rate video.

Optionally it can generate a flat list of the created segments, one segment
per line.

@table @option
@item segment_format @var{format}
Override the inner container format; by default it is guessed from the filename
extension.
@item segment_time @var{t}
Set segment duration to @var{t} seconds.
@item segment_list @var{name}
Also generate a listfile named @var{name}.
@item segment_list_size @var{size}
Overwrite the listfile once it reaches @var{size} entries.
@item segment_wrap @var{limit}
Wrap around segment index once it reaches @var{limit}.
@end table

@example
ffmpeg -i in.mkv -c copy -map 0 -f segment -segment_list out.list out%03d.nut
@end example
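
As a further sketch, the following command (the segment duration and limits
are arbitrary) wraps the segment index after five segments and rewrites the
list file once it reaches five entries, so only a rolling window of recent
10-second segments is kept on disk:
@example
ffmpeg -i in.mkv -c copy -map 0 -f segment -segment_time 10 -segment_wrap 5 -segment_list_size 5 -segment_list out.list out%03d.nut
@end example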

@section mp3

The MP3 muxer writes a raw MP3 stream with an ID3v2 header at the beginning and
optionally an ID3v1 tag at the end. ID3v2.3 and ID3v2.4 are supported; the
@code{id3v2_version} option controls which one is used. The legacy ID3v1 tag is
not written by default, but may be enabled with the @code{write_id3v1} option.

For seekable output the muxer also writes a Xing frame at the beginning, which
contains the number of frames in the file. It is useful for computing duration
of VBR files.

The muxer supports writing ID3v2 attached pictures (APIC frames). The pictures
are supplied to the muxer in the form of a video stream with a single packet. There
can be any number of those streams; each will correspond to a single APIC frame.
The stream metadata tags @var{title} and @var{comment} map to APIC
@var{description} and @var{picture type} respectively. See
@url{http://id3.org/id3v2.4.0-frames} for allowed picture types.

Note that the APIC frames must be written at the beginning, so the muxer will
buffer the audio frames until it gets all the pictures. It is therefore advised
to provide the pictures as soon as possible to avoid excessive buffering.

Examples:

Write an mp3 with an ID3v2.3 header and an ID3v1 footer:
@example
ffmpeg -i INPUT -id3v2_version 3 -write_id3v1 1 out.mp3
@end example

Attach a picture to an mp3:
@example
ffmpeg -i input.mp3 -i cover.png -c copy -metadata:s:v title="Album cover" \
-metadata:s:v comment="Cover (Front)" out.mp3
@end example

@c man end MUXERS