@chapter Input Devices
@c man begin INPUT DEVICES

Input devices are configured elements in FFmpeg which enable accessing
the data coming from a multimedia device attached to your system.

When you configure your FFmpeg build, all the supported input devices
are enabled by default. You can list all available ones using the
configure option "--list-indevs".

You can disable all the input devices using the configure option
"--disable-indevs", and selectively enable an input device using the
option "--enable-indev=@var{INDEV}", or you can disable a particular
input device using the option "--disable-indev=@var{INDEV}".

The option "-devices" of the ff* tools will display the list of
supported input devices.

A description of the currently available input devices follows.

@section alsa

ALSA (Advanced Linux Sound Architecture) input device.

To enable this input device during configuration you need libasound
installed on your system.

This device allows capturing from an ALSA device. The name of the
device to capture has to be an ALSA card identifier.

An ALSA identifier has the syntax:
@example
hw:@var{CARD}[,@var{DEV}[,@var{SUBDEV}]]
@end example
where the @var{DEV} and @var{SUBDEV} components are optional.

The three arguments (in order: @var{CARD}, @var{DEV}, @var{SUBDEV})
specify card number or identifier, device number and subdevice number
(-1 means any).

To see the list of cards currently recognized by your system check the
files @file{/proc/asound/cards} and @file{/proc/asound/devices}.

For example to capture with @command{ffmpeg} from an ALSA device with
card id 0, you may run the command:
@example
ffmpeg -f alsa -i hw:0 alsaout.wav
@end example

For more information see:
@url{http://www.alsa-project.org/alsa-doc/alsa-lib/pcm.html}

@subsection Options

@table @option
@item sample_rate
Set the sample rate in Hz. Default is 48000.
@item channels
Set the number of channels. Default is 2.
@end table
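
For example, to capture mono audio at 44100 Hz from card 0 (a sketch using the options above; whether the card supports these parameters depends on your hardware):
@example
ffmpeg -f alsa -sample_rate 44100 -channels 1 -i hw:0 alsaout.wav
@end example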
@section android_camera

Android camera input device.

This input device uses the Android Camera2 NDK API, which is
available on devices with API level 24+. The availability of
android_camera is autodetected during configuration.

This device allows capturing from all cameras on an Android device
which are integrated into the Camera2 NDK API.

The available cameras are enumerated internally and can be selected
with the @var{camera_index} parameter. The input file string is
discarded.

Generally the back facing camera has index 0 while the front facing
camera has index 1.

@subsection Options

@table @option
@item video_size
Set the video size given as a string such as 640x480 or hd720.
Falls back to the first available configuration reported by
Android if the requested video size is not available, or by default.
@item framerate
Set the video framerate.
Falls back to the first available configuration reported by
Android if the requested framerate is not available, or by default (-1).
@item camera_index
Set the index of the camera to use. Default is 0.
@item input_queue_size
Set the maximum number of frames to buffer. Default is 5.
@end table
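
A minimal capture sketch using the options above. The input file string is ignored, so any placeholder such as @code{dummy} works; whether the selected camera supports this size and rate is an assumption:
@example
ffmpeg -f android_camera -camera_index 1 -video_size 640x480 -framerate 30 -i dummy out.mp4
@end example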
@section avfoundation

AVFoundation input device.

AVFoundation is the currently recommended framework by Apple for stream grabbing on OSX >= 10.7 as well as on iOS.

The input filename has to be given in the following syntax:
@example
-i "[[VIDEO]:[AUDIO]]"
@end example
The first entry selects the video input while the second selects the audio input.
The stream has to be specified by the device name or the device index as shown by the device list.
Alternatively, the video and/or audio input device can be chosen by index using the
@option{-video_device_index <INDEX>} and/or @option{-audio_device_index <INDEX>} options,
overriding any device name or index given in the input filename.

All available devices can be enumerated by using @option{-list_devices true}, listing
all device names and corresponding indices.

There are two device name aliases:
@table @code
@item default
Select the AVFoundation default device of the corresponding type.
@item none
Do not record the corresponding media type.
This is equivalent to specifying an empty device name or index.
@end table
@subsection Options

AVFoundation supports the following options:

@table @option
@item -list_devices <TRUE|FALSE>
If set to true, a list of all available input devices is given showing all
device names and indices.
@item -video_device_index <INDEX>
Specify the video device by its index. Overrides anything given in the input filename.
@item -audio_device_index <INDEX>
Specify the audio device by its index. Overrides anything given in the input filename.
@item -pixel_format <FORMAT>
Request the video device to use a specific pixel format.
If the specified format is not supported, a list of available formats is given
and the first one in this list is used instead. Available pixel formats are:
@code{monob, rgb555be, rgb555le, rgb565be, rgb565le, rgb24, bgr24, 0rgb, bgr0, 0bgr, rgb0,
bgr48be, uyvy422, yuva444p, yuva444p16le, yuv444p, yuv422p16, yuv422p10, yuv444p10,
yuv420p, nv12, yuyv422, gray}
@item -framerate
Set the grabbing frame rate. Default is @code{ntsc}, corresponding to a
frame rate of @code{30000/1001}.
@item -video_size
Set the video frame size.
@item -capture_cursor
Capture the mouse pointer. Default is 0.
@item -capture_mouse_clicks
Capture the screen mouse clicks. Default is 0.
@end table
@subsection Examples

@itemize

@item
Print the list of AVFoundation supported devices and exit:
@example
$ ffmpeg -f avfoundation -list_devices true -i ""
@end example

@item
Record video from video device 0 and audio from audio device 0 into out.avi:
@example
$ ffmpeg -f avfoundation -i "0:0" out.avi
@end example

@item
Record video from video device 2 and audio from audio device 1 into out.avi:
@example
$ ffmpeg -f avfoundation -video_device_index 2 -i ":1" out.avi
@end example

@item
Record video from the system default video device using the pixel format bgr0 and do not record any audio into out.avi:
@example
$ ffmpeg -f avfoundation -pixel_format bgr0 -i "default:none" out.avi
@end example
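
@item
Record video only from the default video device at 1280x720 and 30 fps (a sketch combining the documented @option{-video_size} and @option{-framerate} options; whether the device supports this configuration is an assumption):
@example
$ ffmpeg -f avfoundation -framerate 30 -video_size 1280x720 -i "default:none" out.avi
@end example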
@end itemize

@section bktr

BSD video input device.

@subsection Options

@table @option
@item framerate
Set the frame rate.
@item video_size
Set the video frame size. Default is @code{vga}.
@item standard
Available values are:
@table @samp
@item pal
@item ntsc
@item secam
@item paln
@item palm
@item ntscj
@end table
@end table
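
For example, to capture a PAL signal with @command{ffmpeg} (a sketch; @file{/dev/bktr0} is assumed to be the typical bktr device node on your system):
@example
ffmpeg -f bktr -standard pal -framerate 25 -video_size vga -i /dev/bktr0 out.avi
@end example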
@section decklink

The decklink input device provides capture capabilities for Blackmagic
DeckLink devices.

To enable this input device, you need the Blackmagic DeckLink SDK and you
need to configure with the appropriate @code{--extra-cflags}
and @code{--extra-ldflags}.
On Windows, you need to run the IDL files through @command{widl}.

DeckLink is very picky about the formats it supports. Pixel format of the
input can be set with @option{raw_format}.
Framerate and video size must be determined for your device with
@command{-list_formats 1}. Audio sample rate is always 48 kHz and the number
of channels can be 2, 8 or 16. Note that all audio channels are bundled in one single
audio track.

@subsection Options

@table @option
@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}.
@item list_formats
If set to @option{true}, print a list of supported formats and exit.
Defaults to @option{false}.
@item format_code <FourCC>
This sets the input video format to the format given by the FourCC. To see
the supported values of your device(s) use @option{list_formats}.
Note that there is a FourCC @option{'pal '} that can also be used
as @option{pal} (3 letters).
Default behavior is autodetection of the input video format, if the hardware
supports it.
@item bm_v210
This is a deprecated option; use @option{raw_format} instead.
If set to @samp{1}, video is captured in 10 bit v210 instead
of uyvy422. Not all Blackmagic devices support this option.
@item raw_format
Set the pixel format of the captured video.
Available values are:
@table @samp
@item uyvy422
@item yuv422p10
@item argb
@item bgra
@item rgb10
@end table
@item teletext_lines
If set to nonzero, an additional teletext stream will be captured from the
vertical ancillary data. Both SD PAL (576i) and HD (1080i or 1080p)
sources are supported. In case of HD sources, OP47 packets are decoded.
This option is a bitmask of the SD PAL VBI lines captured, specifically lines 6
to 22, and lines 318 to 335. Line 6 is the LSB in the mask. Selected lines
which do not contain teletext information will be ignored. You can use the
special @option{all} constant to select all possible lines, or
@option{standard} to skip lines 6, 318 and 319, which are not compatible with
all receivers.
For SD sources, ffmpeg needs to be compiled with @code{--enable-libzvbi}. For
HD sources, on older (pre-4K) DeckLink card models you have to capture in 10
bit mode.
@item channels
Defines number of audio channels to capture. Must be @samp{2}, @samp{8} or @samp{16}.
Defaults to @samp{2}.
@item duplex_mode
Sets the decklink device duplex mode. Must be @samp{unset}, @samp{half} or @samp{full}.
Defaults to @samp{unset}.
@item timecode_format
Timecode type to include in the frame and video stream metadata. Must be
@samp{none}, @samp{rp188vitc}, @samp{rp188vitc2}, @samp{rp188ltc},
@samp{rp188any}, @samp{vitc}, @samp{vitc2}, or @samp{serial}. Defaults to
@samp{none} (not included).
@item video_input
Sets the video input source. Must be @samp{unset}, @samp{sdi}, @samp{hdmi},
@samp{optical_sdi}, @samp{component}, @samp{composite} or @samp{s_video}.
Defaults to @samp{unset}.
@item audio_input
Sets the audio input source. Must be @samp{unset}, @samp{embedded},
@samp{aes_ebu}, @samp{analog}, @samp{analog_xlr}, @samp{analog_rca} or
@samp{microphone}. Defaults to @samp{unset}.
@item video_pts
Sets the video packet timestamp source. Must be @samp{video}, @samp{audio},
@samp{reference}, @samp{wallclock} or @samp{abs_wallclock}.
Defaults to @samp{video}.
@item audio_pts
Sets the audio packet timestamp source. Must be @samp{video}, @samp{audio},
@samp{reference}, @samp{wallclock} or @samp{abs_wallclock}.
Defaults to @samp{audio}.
@item draw_bars
If set to @samp{true}, color bars are drawn in the event of a signal loss.
Defaults to @samp{true}.
@item queue_size
Sets maximum input buffer size in bytes. If the buffering reaches this value,
incoming frames will be dropped.
Defaults to @samp{1073741824}.
@item audio_depth
Sets the audio sample bit depth. Must be @samp{16} or @samp{32}.
Defaults to @samp{16}.
@item decklink_copyts
If set to @option{true}, timestamps are forwarded as they are without removing
the initial offset.
Defaults to @option{false}.
@end table
@subsection Examples

@itemize

@item
List input devices:
@example
ffmpeg -f decklink -list_devices 1 -i dummy
@end example

@item
List supported formats:
@example
ffmpeg -f decklink -list_formats 1 -i 'Intensity Pro'
@end example

@item
Capture video clip at 1080i50:
@example
ffmpeg -format_code Hi50 -f decklink -i 'Intensity Pro' -c:a copy -c:v copy output.avi
@end example

@item
Capture video clip at 1080i50 10 bit:
@example
ffmpeg -bm_v210 1 -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example

@item
Capture video clip at 1080i50 with 16 audio channels:
@example
ffmpeg -channels 16 -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example
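
@item
Capture from the HDMI input with embedded audio in 10-bit YUV (a sketch combining the documented @option{video_input}, @option{audio_input} and @option{raw_format} options; the device name and the presence of an HDMI input are assumptions):
@example
ffmpeg -video_input hdmi -audio_input embedded -raw_format yuv422p10 -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example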
@end itemize

@section dshow

Windows DirectShow input device.

DirectShow support is enabled when FFmpeg is built with the mingw-w64 project.
Currently only audio and video devices are supported.

Multiple devices may be opened as separate inputs, but they may also be
opened on the same input, which should improve synchronization between them.

The input name should be in the format:
@example
@var{TYPE}=@var{NAME}[:@var{TYPE}=@var{NAME}]
@end example
where @var{TYPE} can be either @var{audio} or @var{video},
and @var{NAME} is the device's name or alternative name.

@subsection Options

If no options are specified, the device's defaults are used.
If the device does not support the requested options, it will
fail to open.
@table @option
@item video_size
Set the video size in the captured video.
@item framerate
Set the frame rate in the captured video.
@item sample_rate
Set the sample rate (in Hz) of the captured audio.
@item sample_size
Set the sample size (in bits) of the captured audio.
@item channels
Set the number of channels in the captured audio.
@item list_devices
If set to @option{true}, print a list of devices and exit.
@item list_options
If set to @option{true}, print a list of selected device's options
and exit.
@item video_device_number
Set video device number for devices with the same name (starts at 0,
defaults to 0).
@item audio_device_number
Set audio device number for devices with the same name (starts at 0,
defaults to 0).
@item pixel_format
Select pixel format to be used by DirectShow. This may only be set when
the video codec is not set or set to rawvideo.
@item audio_buffer_size
Set audio device buffer size in milliseconds (which can directly
impact latency, depending on the device).
Defaults to using the audio device's
default buffer size (typically some multiple of 500ms).
Setting this value too low can degrade performance.
See also
@url{http://msdn.microsoft.com/en-us/library/windows/desktop/dd377582(v=vs.85).aspx}
@item video_pin_name
Select video capture pin to use by name or alternative name.
@item audio_pin_name
Select audio capture pin to use by name or alternative name.
@item crossbar_video_input_pin_number
Select video input pin number for crossbar device. This will be
routed to the crossbar device's Video Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.
@item crossbar_audio_input_pin_number
Select audio input pin number for crossbar device. This will be
routed to the crossbar device's Audio Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.
@item show_video_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change video filter properties
and configurations manually.
Note that for crossbar devices, adjusting values in this dialog
may be needed at times to toggle between PAL (25 fps) and NTSC (29.97)
input frame rates, sizes, interlacing, etc. Changing these values can
enable different scan rates/frame rates and avoid green bars at
the bottom, flickering scan lines, etc.
Note that with some devices, changing these properties can also affect future
invocations (sets new defaults) until system reboot occurs.
@item show_audio_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change audio filter properties
and configurations manually.
@item show_video_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens a video device.
@item show_audio_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens an audio device.
@item show_analog_tv_tuner_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV channels and frequencies.
@item show_analog_tv_tuner_audio_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV audio (like mono vs. stereo, Language A, B or C).
@item audio_device_load
Load an audio capture filter device from file instead of searching
it by name. It may load additional parameters too, if the filter
supports the serialization of its properties to a file.
To use this, an audio capture source has to be specified, but it can
be anything, even a fake one.
@item audio_device_save
Save the currently used audio capture filter device and its
parameters (if the filter supports it) to a file.
If a file with the same name exists it will be overwritten.
@item video_device_load
Load a video capture filter device from file instead of searching
it by name. It may load additional parameters too, if the filter
supports the serialization of its properties to a file.
To use this, a video capture source has to be specified, but it can
be anything, even a fake one.
@item video_device_save
Save the currently used video capture filter device and its
parameters (if the filter supports it) to a file.
If a file with the same name exists it will be overwritten.
@end table
@subsection Examples

@itemize

@item
Print the list of DirectShow supported devices and exit:
@example
$ ffmpeg -list_devices true -f dshow -i dummy
@end example

@item
Open video device @var{Camera}:
@example
$ ffmpeg -f dshow -i video="Camera"
@end example

@item
Open second video device with name @var{Camera}:
@example
$ ffmpeg -f dshow -video_device_number 1 -i video="Camera"
@end example

@item
Open video device @var{Camera} and audio device @var{Microphone}:
@example
$ ffmpeg -f dshow -i video="Camera":audio="Microphone"
@end example

@item
Print the list of supported options in selected device and exit:
@example
$ ffmpeg -list_options true -f dshow -i video="Camera"
@end example

@item
Specify pin names to capture by name or alternative name, specify alternative device name:
@example
$ ffmpeg -f dshow -audio_pin_name "Audio Out" -video_pin_name 2 -i video=video="@@device_pnp_\\?\pci#ven_1a0a&dev_6200&subsys_62021461&rev_01#4&e2c7dd6&0&00e1#@{65e8773d-8f56-11d0-a3b9-00a0c9223196@}\@{ca465100-deb0-4d59-818f-8c477184adf6@}":audio="Microphone"
@end example

@item
Configure a crossbar device, specifying crossbar pins, allow user to adjust video capture properties at startup:
@example
$ ffmpeg -f dshow -show_video_device_dialog true -crossbar_video_input_pin_number 0
  -crossbar_audio_input_pin_number 3 -i video="AVerMedia BDA Analog Capture":audio="AVerMedia BDA Analog Capture"
@end example
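
@item
Open an audio device with a smaller capture buffer to reduce latency (a sketch using the documented @option{audio_buffer_size} option, in milliseconds; the device name is a placeholder):
@example
$ ffmpeg -f dshow -audio_buffer_size 80 -i audio="Microphone" out.wav
@end example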
@end itemize

@section fbdev

Linux framebuffer input device.

The Linux framebuffer is a graphic hardware-independent abstraction
layer to show graphics on a computer monitor, typically on the
console. It is accessed through a file device node, usually
@file{/dev/fb0}.

For more detailed information read the file
Documentation/fb/framebuffer.txt included in the Linux source tree.

See also @url{http://linux-fbdev.sourceforge.net/}, and fbset(1).

To record from the framebuffer device @file{/dev/fb0} with
@command{ffmpeg}:
@example
ffmpeg -f fbdev -framerate 10 -i /dev/fb0 out.avi
@end example

You can take a single screenshot image with the command:
@example
ffmpeg -f fbdev -framerate 1 -i /dev/fb0 -frames:v 1 screenshot.jpeg
@end example

@subsection Options

@table @option
@item framerate
Set the frame rate. Default is 25.
@end table
@section gdigrab

Win32 GDI-based screen capture device.

This device allows you to capture a region of the display on Windows.

There are two options for the input filename:
@example
desktop
@end example
or
@example
title=@var{window_title}
@end example

The first option will capture the entire desktop, or a fixed region of the
desktop. The second option will instead capture the contents of a single
window, regardless of its position on the screen.

For example, to grab the entire desktop using @command{ffmpeg}:
@example
ffmpeg -f gdigrab -framerate 6 -i desktop out.mpg
@end example

Grab a 640x480 region at position @code{10,20}:
@example
ffmpeg -f gdigrab -framerate 6 -offset_x 10 -offset_y 20 -video_size vga -i desktop out.mpg
@end example

Grab the contents of the window named "Calculator":
@example
ffmpeg -f gdigrab -framerate 6 -i title=Calculator out.mpg
@end example

@subsection Options

@table @option
@item draw_mouse
Specify whether to draw the mouse pointer. Use the value @code{0} to
not draw the pointer. Default value is @code{1}.
@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.
@item show_region
Show grabbed region on screen.
If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.
Note that @var{show_region} is incompatible with grabbing the contents
of a single window.
For example:
@example
ffmpeg -f gdigrab -show_region 1 -framerate 6 -video_size cif -offset_x 10 -offset_y 20 -i desktop out.mpg
@end example
@item video_size
Set the video frame size. The default is to capture the full screen if @file{desktop} is selected, or the full window size if @file{title=@var{window_title}} is selected.
@item offset_x
When capturing a region with @var{video_size}, set the distance from the left edge of the screen or desktop.
Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned to the left of your primary monitor, you will need to use a negative @var{offset_x} value to move the region to that monitor.
@item offset_y
When capturing a region with @var{video_size}, set the distance from the top edge of the screen or desktop.
Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned above your primary monitor, you will need to use a negative @var{offset_y} value to move the region to that monitor.
@end table
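
For example, to grab a full-HD monitor positioned to the left of the primary monitor, as described under @var{offset_x} above (a sketch; the monitor geometry is an assumption):
@example
ffmpeg -f gdigrab -framerate 25 -offset_x -1920 -offset_y 0 -video_size 1920x1080 -i desktop out.mpg
@end example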
@section iec61883

FireWire DV/HDV input device using libiec61883.

To enable this input device, you need libiec61883, libraw1394 and
libavc1394 installed on your system. Use the configure option
@code{--enable-libiec61883} to compile with the device enabled.

The iec61883 capture device supports capturing from a video device
connected via IEEE1394 (FireWire), using libiec61883 and the new Linux
FireWire stack (juju). This is the default DV/HDV input method in Linux
Kernel 2.6.37 and later, since the old FireWire stack was removed.

Specify the FireWire port to be used as input file, or "auto"
to choose the first port connected.

@subsection Options

@table @option
@item dvtype
Override autodetection of DV/HDV. This should only be used if
autodetection does not work, or if usage of a different device type
should be prohibited. Treating a DV device as HDV (or vice versa) will
not work and result in undefined behavior.
The values @option{auto}, @option{dv} and @option{hdv} are supported.
@item dvbuffer
Set maximum size of buffer for incoming data, in frames. For DV, this
is an exact value. For HDV, it is not frame exact, since HDV does
not have a fixed frame size.
@item dvguid
Select the capture device by specifying its GUID. Capturing will only
be performed from the specified device and fails if no device with the
given GUID is found. This is useful to select the input if multiple
devices are connected at the same time.
Look at /sys/bus/firewire/devices to find out the GUIDs.
@end table

@subsection Examples

@itemize

@item
Grab and show the input of a FireWire DV/HDV device.
@example
ffplay -f iec61883 -i auto
@end example

@item
Grab and record the input of a FireWire DV/HDV device,
using a packet buffer of 100000 packets if the source is HDV.
@example
ffmpeg -f iec61883 -i auto -dvbuffer 100000 out.mpg
@end example
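
@item
Force the device type to HDV if autodetection does not work (a sketch using the documented @option{dvtype} option; only meaningful when the connected device really is HDV):
@example
ffplay -f iec61883 -i auto -dvtype hdv
@end example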
@end itemize

@section jack

JACK input device.

To enable this input device during configuration you need libjack
installed on your system.

A JACK input device creates one or more JACK writable clients, one for
each audio channel, with name @var{client_name}:input_@var{N}, where
@var{client_name} is the name provided by the application, and @var{N}
is a number which identifies the channel.
Each writable client will send the acquired data to the FFmpeg input
device.

Once you have created one or more JACK readable clients, you need to
connect them to one or more JACK writable clients.

To connect or disconnect JACK clients you can use the @command{jack_connect}
and @command{jack_disconnect} programs, or do it through a graphical interface,
for example with @command{qjackctl}.

To list the JACK clients and their properties you can invoke the command
@command{jack_lsp}.

The following example shows how to capture a JACK readable client
with @command{ffmpeg}.
@example
# Create a JACK writable client with name "ffmpeg".
$ ffmpeg -f jack -i ffmpeg -y out.wav

# Start the sample jack_metro readable client.
$ jack_metro -b 120 -d 0.2 -f 4000

# List the current JACK clients.
$ jack_lsp -c
system:capture_1
system:capture_2
system:playback_1
system:playback_2
ffmpeg:input_1
metro:120_bpm

# Connect metro to the ffmpeg writable client.
$ jack_connect metro:120_bpm ffmpeg:input_1
@end example

For more information read:
@url{http://jackaudio.org/}

@subsection Options

@table @option
@item channels
Set the number of channels. Default is 2.
@end table
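
For example, to create a 4-channel writable client named "ffmpeg" (a sketch; the resulting ports ffmpeg:input_1 through ffmpeg:input_4 still need to be connected as shown above):
@example
ffmpeg -f jack -channels 4 -i ffmpeg out.wav
@end example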
@section kmsgrab

KMS video input device.

Captures the KMS scanout framebuffer associated with a specified CRTC or plane as a
DRM object that can be passed to other hardware functions.

Requires either DRM master or CAP_SYS_ADMIN to run.

If you don't understand what all of that means, you probably don't want this. Look at
@option{x11grab} instead.

@subsection Options

@table @option
@item device
DRM device to capture on. Defaults to @option{/dev/dri/card0}.
@item format
Pixel format of the framebuffer. Defaults to @option{bgr0}.
@item format_modifier
Format modifier to signal on output frames. This is necessary to import correctly into
some APIs, but can't be autodetected. See the libdrm documentation for possible values.
@item crtc_id
KMS CRTC ID to define the capture source. The first active plane on the given CRTC
will be used.
@item plane_id
KMS plane ID to define the capture source. Defaults to the first active plane found if
neither @option{crtc_id} nor @option{plane_id} are specified.
@item framerate
Framerate to capture at. This is not synchronised to any page flipping or framebuffer
changes - it just defines the interval at which the framebuffer is sampled. Sampling
faster than the framebuffer update rate will generate independent frames with the same
content. Defaults to @code{30}.
@end table

@subsection Examples

@itemize

@item
Capture from the first active plane, download the result to normal frames and encode.
This will only work if the framebuffer is both linear and mappable - if not, the result
may be scrambled or fail to download.
@example
ffmpeg -f kmsgrab -i - -vf 'hwdownload,format=bgr0' output.mp4
@end example

@item
Capture from CRTC ID 42 at 60fps, map the result to VAAPI, convert to NV12 and encode as H.264.
@example
ffmpeg -crtc_id 42 -framerate 60 -f kmsgrab -i - -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1920:h=1080:format=nv12' -c:v h264_vaapi output.mp4
@end example
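
@item
Capture a specific plane on a second DRM device and download it to normal frames (a sketch using the documented @option{device} and @option{plane_id} options; the card path and plane ID are placeholders for your setup):
@example
ffmpeg -device /dev/dri/card1 -plane_id 71 -f kmsgrab -i - -vf 'hwdownload,format=bgr0' output.mp4
@end example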
@end itemize

@section lavfi

Libavfilter input virtual device.

This input device reads data from the open output pads of a libavfilter
filtergraph.

For each filtergraph open output, the input device will create a
corresponding stream which is mapped to the generated output. Currently
only video data is supported. The filtergraph is specified through the
option @option{graph}.

@subsection Options

@table @option
@item graph
Specify the filtergraph to use as input. Each video open output must be
labelled by a unique string of the form "out@var{N}", where @var{N} is a
number starting from 0 corresponding to the mapped input stream
generated by the device.
The first unlabelled output is automatically assigned to the "out0"
label, but all the others need to be specified explicitly.
The suffix "+subcc" can be appended to the output label to create an extra
stream with the closed captions packets attached to that output
(experimental; only for EIA-608 / CEA-708 for now).
The subcc streams are created after all the normal streams, in the order of
the corresponding stream.
For example, if there is "out19+subcc", "out7+subcc" and up to "out42", the
stream #43 is subcc for stream #7 and stream #44 is subcc for stream #19.
If not specified, defaults to the filename specified for the input
device.
@item graph_file
Set the filename of the filtergraph to be read and sent to the other
filters. Syntax of the filtergraph is the same as the one specified by
the option @var{graph}.
@item dumpgraph
Dump graph to stderr.
@end table

@subsection Examples

@itemize

@item
Create a color video stream and play it back with @command{ffplay}:
@example
ffplay -f lavfi -graph "color=c=pink [out0]" dummy
@end example

@item
As the previous example, but use filename for specifying the graph
description, and omit the "out0" label:
@example
ffplay -f lavfi color=c=pink
@end example

@item
Create three different video test filtered sources and play them:
@example
ffplay -f lavfi -graph "testsrc [out0]; testsrc,hflip [out1]; testsrc,negate [out2]" test3
@end example

@item
Read an audio stream from a file using the amovie source and play it
back with @command{ffplay}:
@example
ffplay -f lavfi "amovie=test.wav"
@end example

@item
Read an audio stream and a video stream and play it back with
@command{ffplay}:
@example
ffplay -f lavfi "movie=test.avi[out0];amovie=test.wav[out1]"
@end example

@item
Dump decoded frames to images and closed captions to a file (experimental):
@example
ffmpeg -f lavfi -i "movie=test.ts[out0+subcc]" -map v frame%08d.png -map s -c copy -f rawvideo subcc.bin
@end example
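
@item
Read the graph description from a file instead of the command line (a sketch using the documented @option{graph_file} option; @file{graph.txt} is a placeholder file containing e.g. "color=c=pink [out0]"):
@example
ffplay -f lavfi -graph_file graph.txt dummy
@end example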
@end itemize

@section libcdio

Audio-CD input device based on libcdio.

To enable this input device during configuration you need libcdio
installed on your system. It requires the configure option
@code{--enable-libcdio}.

This device allows playing and grabbing from an Audio-CD.

For example to copy with @command{ffmpeg} the entire Audio-CD in @file{/dev/sr0},
you may run the command:
@example
ffmpeg -f libcdio -i /dev/sr0 cd.wav
@end example

@subsection Options

@table @option
@item speed
Set drive reading speed. Default value is 0.
The speed is specified in CD-ROM speed units. The speed is set through
the libcdio @code{cdio_cddap_speed_set} function. On many CD-ROM
drives, specifying a value too large will result in using the fastest
speed.
@item paranoia_mode
Set paranoia recovery mode flags. It accepts one of the following values:
@table @samp
@item disable
@item verify
@item overlap
@item neverskip
@item full
@end table
Default value is @samp{disable}.
For more information about the available recovery modes, consult the
paranoia project documentation.
@end table
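
For example, to rip the disc at 4x speed with data verification enabled (a sketch combining the documented @option{speed} and @option{paranoia_mode} options):
@example
ffmpeg -f libcdio -speed 4 -paranoia_mode verify -i /dev/sr0 cd.wav
@end example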
@section libdc1394

IIDC1394 input device, based on libdc1394 and libraw1394.

Requires the configure option @code{--enable-libdc1394}.

@section libndi_newtek

The libndi_newtek input device provides capture capabilities for using NDI (Network
Device Interface, standard created by NewTek).

The input filename is an NDI source name, as reported by @option{-find_sources 1};
it has no specific syntax, but is human-readable.

To enable this input device, you need the NDI SDK and you
need to configure with the appropriate @code{--extra-cflags}
and @code{--extra-ldflags}.

@subsection Options

@table @option
@item find_sources
If set to @option{true}, print a list of found/available NDI sources and exit.
Defaults to @option{false}.
@item wait_sources
Override time to wait until the number of online sources has changed.
Defaults to @option{0.5}.
@item allow_video_fields
When this flag is @option{false}, all video that you receive will be progressive.
Defaults to @option{true}.
@end table

@subsection Examples

@itemize

@item
List input devices:
@example
ffmpeg -f libndi_newtek -find_sources 1 -i dummy
@end example

@item
Restream to NDI:
@example
ffmpeg -f libndi_newtek -i "DEV-5.INTERNAL.M1STEREO.TV (NDI_SOURCE_NAME_1)" -f libndi_newtek -y NDI_SOURCE_NAME_2
@end example

@end itemize
@section openal

The OpenAL input device provides audio capture on all systems with a
working OpenAL 1.1 implementation.

To enable this input device during configuration, you need OpenAL
headers and libraries installed on your system, and need to configure
FFmpeg with @code{--enable-openal}.

OpenAL headers and libraries should be provided as part of your OpenAL
implementation, or as an additional download (an SDK). Depending on your
installation you may need to specify additional flags via the
@code{--extra-cflags} and @code{--extra-ldflags} for allowing the build
system to locate the OpenAL headers and libraries.

An incomplete list of OpenAL implementations follows:

@table @strong
@item Creative
The official Windows implementation, providing hardware acceleration
with supported devices and software fallback.
See @url{http://openal.org/}.
@item OpenAL Soft
Portable, open source (LGPL) software implementation. Includes
backends for the most common sound APIs on the Windows, Linux,
Solaris, and BSD operating systems.
See @url{http://kcat.strangesoft.net/openal.html}.
@item Apple
OpenAL is part of Core Audio, the official Mac OS X Audio interface.
See @url{http://developer.apple.com/technologies/mac/audio-and-video.html}
@end table

This device allows one to capture from an audio input device handled
through OpenAL.

You need to specify the name of the device to capture in the provided
filename. If the empty string is provided, the device will
automatically select the default device. You can get the list of the
supported devices by using the option @var{list_devices}.

@subsection Options

@table @option
@item channels
Set the number of channels in the captured audio. Only the values
@option{1} (monaural) and @option{2} (stereo) are currently supported.
Defaults to @option{2}.
@item sample_size
Set the sample size (in bits) of the captured audio. Only the values
@option{8} and @option{16} are currently supported. Defaults to
@option{16}.
@item sample_rate
Set the sample rate (in Hz) of the captured audio.
Defaults to @option{44.1k}.
@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}.
@end table

@subsection Examples

Print the list of OpenAL supported devices and exit:
@example
$ ffmpeg -list_devices true -f openal -i dummy out.ogg
@end example

Capture from the OpenAL device @file{DR-BT101 via PulseAudio}:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out.ogg
@end example

Capture from the default device (note the empty string '' as filename):
@example
$ ffmpeg -f openal -i '' out.ogg
@end example

Capture from two devices simultaneously, writing to two different files,
within the same @command{ffmpeg} command:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out1.ogg -f openal -i 'ALSA Default' out2.ogg
@end example

Note: not all OpenAL implementations support multiple simultaneous capture -
try the latest OpenAL Soft if the above does not work.
@section oss

Open Sound System input device.

The filename to provide to the input device is the device node
representing the OSS input device, and is usually set to
@file{/dev/dsp}.

For example to grab from @file{/dev/dsp} using @command{ffmpeg} use the
command:
@example
ffmpeg -f oss -i /dev/dsp /tmp/oss.wav
@end example

For more information about OSS see:
@url{http://manuals.opensound.com/usersguide/dsp.html}

@subsection Options

@table @option
@item sample_rate
Set the sample rate in Hz. Default is 48000.
@item channels
Set the number of channels. Default is 2.
@end table
@section pulse

PulseAudio input device.

To enable this input device you need to configure FFmpeg with @code{--enable-libpulse}.

The filename to provide to the input device is a source device or the
string "default".

To list the PulseAudio source devices and their properties you can invoke
the command @command{pactl list sources}.

More information about PulseAudio can be found on @url{http://www.pulseaudio.org}.

@subsection Options

@table @option
@item server
Connect to a specific PulseAudio server, specified by an IP address.
Default server is used when not provided.
@item name
Specify the application name PulseAudio will use when showing active clients,
by default it is the @code{LIBAVFORMAT_IDENT} string.
@item stream_name
Specify the stream name PulseAudio will use when showing active streams,
by default it is "record".
@item sample_rate
Specify the samplerate in Hz, by default 48kHz is used.
@item channels
Specify the channels in use, by default 2 (stereo) is set.
@item frame_size
Specify the number of bytes per frame, by default it is set to 1024.
@item fragment_size
Specify the minimal buffering fragment in PulseAudio, it will affect the
audio latency. By default it is unset.
@item wallclock
Set the initial PTS using the current time. Default is 1.
@end table

@subsection Examples

Record a stream from default device:
@example
ffmpeg -f pulse -i default /tmp/pulse.wav
@end example
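
Record mono audio at 44100 Hz from the default source (a sketch using the documented @option{sample_rate} and @option{channels} options):
@example
ffmpeg -f pulse -sample_rate 44100 -channels 1 -i default /tmp/pulse.wav
@end example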
@section sndio

sndio input device.

To enable this input device during configuration you need libsndio
installed on your system.

The filename to provide to the input device is the device node
representing the sndio input device, and is usually set to
@file{/dev/audio0}.

For example to grab from @file{/dev/audio0} using @command{ffmpeg} use the
command:
@example
ffmpeg -f sndio -i /dev/audio0 /tmp/oss.wav
@end example

@subsection Options

@table @option
@item sample_rate
Set the sample rate in Hz. Default is 48000.
@item channels
Set the number of channels. Default is 2.
@end table
@section video4linux2, v4l2

Video4Linux2 input video device.

"v4l2" can be used as alias for "video4linux2".

If FFmpeg is built with v4l-utils support (by using the
@code{--enable-libv4l2} configure option), it is possible to use it with the
@code{-use_libv4l2} input device option.

The name of the device to grab is a file device node. Linux systems
usually create such nodes automatically when the device
(e.g. a USB webcam) is plugged into the system, and the node has a name of the
kind @file{/dev/video@var{N}}, where @var{N} is a number associated with
the device.

Video4Linux2 devices usually support a limited set of
@var{width}x@var{height} sizes and frame rates. You can check which are
supported using @command{-list_formats all} for Video4Linux2 devices.
Some devices, like TV cards, support one or more standards. It is possible
to list all the supported standards using @command{-list_standards all}.

The time base for the timestamps is 1 microsecond. Depending on the kernel
version and configuration, the timestamps may be derived from the real time
clock (origin at the Unix Epoch) or the monotonic clock (origin usually at
boot time, unaffected by NTP or manual changes to the clock). The
@option{-timestamps abs} or @option{-ts abs} option can be used to force
conversion into the real time clock.

Some usage examples of the video4linux2 device with @command{ffmpeg}
and @command{ffplay}:

@itemize

@item
List supported formats for a video4linux2 device:
@example
ffplay -f video4linux2 -list_formats all /dev/video0
@end example

@item
Grab and show the input of a video4linux2 device:
@example
ffplay -f video4linux2 -framerate 30 -video_size hd720 /dev/video0
@end example

@item
Grab and record the input of a video4linux2 device, leave the
frame rate and size as previously set:
@example
ffmpeg -f video4linux2 -input_format mjpeg -i /dev/video0 out.mpeg
@end example
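
@item
List the standards supported by a TV card, as described above under @option{list_standards} (assuming the card is at @file{/dev/video0}):
@example
ffplay -f video4linux2 -list_standards all /dev/video0
@end example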
@end itemize

For more information about Video4Linux, check @url{http://linuxtv.org/}.

@subsection Options

@table @option
@item standard
Set the standard. Must be the name of a supported standard. To get a
list of the supported standards, use the @option{list_standards}
option.
@item channel
Set the input channel number. Defaults to -1, which means using the
previously selected channel.
@item video_size
Set the video frame size. The argument must be a string in the form
@var{WIDTH}x@var{HEIGHT} or a valid size abbreviation.
@item pixel_format
Select the pixel format (only valid for raw video input).
@item input_format
Set the preferred pixel format (for raw video) or a codec name.
This option allows one to select the input format, when several are
available.
@item framerate
Set the preferred video frame rate.
@item list_formats
List available formats (supported pixel formats, codecs, and frame
sizes) and exit.
Available values are:
@table @samp
@item all
Show all available (compressed and non-compressed) formats.
@item raw
Show only raw video (non-compressed) formats.
@item compressed
Show only compressed formats.
@end table
@item list_standards
List supported standards and exit.
Available values are:
@table @samp
@item all
Show all supported standards.
@end table
@item timestamps, ts
Set type of timestamps for grabbed frames.
Available values are:
@table @samp
@item default
Use timestamps from the kernel.
@item abs
Use absolute timestamps (wall clock).
@item mono2abs
Force conversion from monotonic to absolute timestamps.
@end table
Default value is @code{default}.
@item use_libv4l2
Use libv4l2 (v4l-utils) conversion functions. Default is 0.
@end table
@section vfwcap

VfW (Video for Windows) capture input device.

The filename passed as input is the capture driver number, ranging from
0 to 9. You may use "list" as filename to print a list of drivers. Any
other filename will be interpreted as device number 0.

@subsection Options

@table @option
@item video_size
Set the video frame size.
@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.
@end table
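
For example, to print the list of available capture drivers and then record from driver 0 at 25 fps (a sketch; driver availability and supported sizes depend on your system):
@example
ffmpeg -f vfwcap -i list
ffmpeg -f vfwcap -framerate 25 -i 0 out.avi
@end example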
@section x11grab

X11 video input device.

To enable this input device during configuration you need libxcb
installed on your system. It will be automatically detected during
configuration.

This device allows one to capture a region of an X11 display.

The filename passed as input has the syntax:
@example
[@var{hostname}]:@var{display_number}.@var{screen_number}[+@var{x_offset},@var{y_offset}]
@end example

@var{hostname}:@var{display_number}.@var{screen_number} specifies the
X11 display name of the screen to grab from. @var{hostname} can be
omitted, and defaults to "localhost". The environment variable
@env{DISPLAY} contains the default display name.

@var{x_offset} and @var{y_offset} specify the offsets of the grabbed
area with respect to the top-left border of the X11 screen. They
default to 0.

Check the X11 documentation (e.g. @command{man X}) for more detailed
information.

Use the @command{xdpyinfo} program for getting basic information about
the properties of your X11 display (e.g. grep for "name" or
"dimensions").

For example to grab from @file{:0.0} using @command{ffmpeg}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

Grab at position @code{10,20}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

@subsection Options

@table @option
@item draw_mouse
Specify whether to draw the mouse pointer. A value of @code{0} specifies
not to draw the pointer. Default value is @code{1}.
@item follow_mouse
Make the grabbed area follow the mouse. The argument can be
@code{centered} or a number of pixels @var{PIXELS}.
When it is specified with "centered", the grabbing region follows the mouse
pointer and keeps the pointer at the center of the region; otherwise, the region
follows only when the mouse pointer comes within @var{PIXELS} (greater than
zero) of the edge of the region.
For example:
@example
ffmpeg -f x11grab -follow_mouse centered -framerate 25 -video_size cif -i :0.0 out.mpg
@end example
To follow only when the mouse pointer comes within 100 pixels of the edge:
@example
ffmpeg -f x11grab -follow_mouse 100 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example
@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.
@item show_region
Show grabbed region on screen.
If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.
@item region_border
Set the region border thickness if @option{-show_region 1} is used.
Range is 1 to 128 and default is 3 (XCB-based x11grab only).
For example:
@example
ffmpeg -f x11grab -show_region 1 -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example
With @var{follow_mouse}:
@example
ffmpeg -f x11grab -follow_mouse centered -show_region 1 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example
@item video_size
Set the video frame size. Default value is @code{vga}.
@item grab_x
@item grab_y
Set the grabbing region coordinates. They are expressed as offset from
the top left corner of the X11 window and correspond to the
@var{x_offset} and @var{y_offset} parameters in the device name. The
default value for both options is 0.
@end table
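
For example, the position @code{10,20} grab shown above can also be expressed with the @option{grab_x} and @option{grab_y} options instead of the offsets in the device name:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -grab_x 10 -grab_y 20 -i :0.0 out.mpg
@end example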
@c man end INPUT DEVICES