@chapter Input Devices
@c man begin INPUT DEVICES

Input devices are configured elements in FFmpeg which enable accessing
the data coming from a multimedia device attached to your system.

When you configure your FFmpeg build, all the supported input devices
are enabled by default. You can list all available ones using the
configure option "--list-indevs".

You can disable all the input devices using the configure option
"--disable-indevs", and selectively enable an input device using the
option "--enable-indev=@var{INDEV}", or you can disable a particular
input device using the option "--disable-indev=@var{INDEV}".
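
As an illustration of these options, a build that keeps only the ALSA
input device described below might be configured roughly like this
(a minimal sketch; adjust to whichever devices you actually need):
@example
./configure --disable-indevs --enable-indev=alsa
@end example
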
The option "-devices" of the ff* tools will display the list of
supported input devices.

A description of the currently available input devices follows.
@section alsa

ALSA (Advanced Linux Sound Architecture) input device.

To enable this input device during configuration you need libasound
installed on your system.

This device allows capturing from an ALSA device. The name of the
device to capture has to be an ALSA card identifier.

An ALSA identifier has the syntax:
@example
hw:@var{CARD}[,@var{DEV}[,@var{SUBDEV}]]
@end example

where the @var{DEV} and @var{SUBDEV} components are optional.

The three arguments (in order: @var{CARD}, @var{DEV}, @var{SUBDEV})
specify card number or identifier, device number and subdevice number
(-1 means any).

To see the list of cards currently recognized by your system check the
files @file{/proc/asound/cards} and @file{/proc/asound/devices}.

For example, to capture with @command{ffmpeg} from an ALSA device with
card id 0, you may run the command:
@example
ffmpeg -f alsa -i hw:0 alsaout.wav
@end example
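
Similarly, a specific device on a given card can be selected with the
full identifier syntax shown above (the card and device numbers here are
only an illustration; check @file{/proc/asound/devices} for the ones on
your system):
@example
ffmpeg -f alsa -i hw:1,0 alsaout.wav
@end example
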
For more information see:
@url{http://www.alsa-project.org/alsa-doc/alsa-lib/pcm.html}
@section avfoundation

AVFoundation input device.

AVFoundation is the currently recommended framework by Apple for stream
grabbing on OSX >= 10.7 as well as on iOS.
The older QTKit framework has been marked deprecated since OSX version 10.7.

The input filename has to be given in the following syntax:
@example
-i "[[VIDEO]:[AUDIO]]"
@end example

The first entry selects the video input while the second selects the audio input.
The stream has to be specified by the device name or the device index as shown by the device list.
Alternatively, the video and/or audio input device can be chosen by index using the
@option{-video_device_index <INDEX>} and/or @option{-audio_device_index <INDEX>}
option, overriding any device name or index given in the input filename.

All available devices can be enumerated by using @option{-list_devices true}, listing
all device names and corresponding indices.

There are two device name aliases:
@table @code

@item default
Select the AVFoundation default device of the corresponding type.

@item none
Do not record the corresponding media type.
This is equivalent to specifying an empty device name or index.

@end table

@subsection Options

AVFoundation supports the following options:

@table @option

@item -list_devices <TRUE|FALSE>
If set to true, a list of all available input devices is given showing all
device names and indices.

@item -video_device_index <INDEX>
Specify the video device by its index. Overrides anything given in the input filename.

@item -audio_device_index <INDEX>
Specify the audio device by its index. Overrides anything given in the input filename.

@item -pixel_format <FORMAT>
Request the video device to use a specific pixel format.
If the specified format is not supported, a list of available formats is given
and the first one in this list is used instead. Available pixel formats are:
@code{monob, rgb555be, rgb555le, rgb565be, rgb565le, rgb24, bgr24, 0rgb, bgr0, 0bgr, rgb0,
bgr48be, uyvy422, yuva444p, yuva444p16le, yuv444p, yuv422p16, yuv422p10, yuv444p10,
yuv420p, nv12, yuyv422, gray}

@end table

@subsection Examples

@itemize

@item
Print the list of AVFoundation supported devices and exit:
@example
$ ffmpeg -f avfoundation -list_devices true -i ""
@end example

@item
Record video from video device 0 and audio from audio device 0 into out.avi:
@example
$ ffmpeg -f avfoundation -i "0:0" out.avi
@end example

@item
Record video from video device 2 and audio from audio device 1 into out.avi:
@example
$ ffmpeg -f avfoundation -video_device_index 2 -i ":1" out.avi
@end example

@item
Record video from the system default video device using the pixel format bgr0 and do not record any audio into out.avi:
@example
$ ffmpeg -f avfoundation -pixel_format bgr0 -i "default:none" out.avi
@end example

@end itemize
@section bktr

BSD video input device.
@section decklink

The decklink input device provides capture capabilities for Blackmagic
DeckLink devices.

To enable this input device, you need the Blackmagic DeckLink SDK and you
need to configure with the appropriate @code{--extra-cflags}
and @code{--extra-ldflags}.
On Windows, you need to run the IDL files through @command{widl}.

DeckLink is very picky about the formats it supports. Pixel format is
uyvy422 or v210, framerate and video size must be determined for your device with
@command{-list_formats 1}. Audio sample rate is always 48 kHz and the number
of channels can be 2, 8 or 16.

@subsection Options

@table @option

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}.

@item list_formats
If set to @option{true}, print a list of supported formats and exit.
Defaults to @option{false}.

@item bm_v210
If set to @samp{1}, video is captured in 10 bit v210 instead
of uyvy422. Not all Blackmagic devices support this option.

@item bm_channels <CHANNELS>
Number of audio channels, can be 2, 8 or 16.

@item bm_audiodepth <BITDEPTH>
Audio bit depth, can be 16 or 32.

@end table

@subsection Examples

@itemize

@item
List input devices:
@example
ffmpeg -f decklink -list_devices 1 -i dummy
@end example

@item
List supported formats:
@example
ffmpeg -f decklink -list_formats 1 -i 'Intensity Pro'
@end example

@item
Capture video clip at 1080i50 (format 11):
@example
ffmpeg -f decklink -i 'Intensity Pro@@11' -acodec copy -vcodec copy output.avi
@end example

@item
Capture video clip at 1080i50 10 bit:
@example
ffmpeg -bm_v210 1 -f decklink -i 'UltraStudio Mini Recorder@@11' -acodec copy -vcodec copy output.avi
@end example

@item
Capture video clip at 720p50 with 32bit audio:
@example
ffmpeg -bm_audiodepth 32 -f decklink -i 'UltraStudio Mini Recorder@@14' -acodec copy -vcodec copy output.avi
@end example

@item
Capture video clip at 576i50 with 8 audio channels:
@example
ffmpeg -bm_channels 8 -f decklink -i 'UltraStudio Mini Recorder@@3' -acodec copy -vcodec copy output.avi
@end example

@end itemize
@section dshow

Windows DirectShow input device.

DirectShow support is enabled when FFmpeg is built with the mingw-w64 project.
Currently only audio and video devices are supported.

Multiple devices may be opened as separate inputs, but they may also be
opened on the same input, which should improve synchronization between them.

The input name should be in the format:
@example
@var{TYPE}=@var{NAME}[:@var{TYPE}=@var{NAME}]
@end example

where @var{TYPE} can be either @var{audio} or @var{video},
and @var{NAME} is the device's name or alternative name.

@subsection Options

If no options are specified, the device's defaults are used.
If the device does not support the requested options, it will
fail to open.

@table @option

@item video_size
Set the video size in the captured video.

@item framerate
Set the frame rate in the captured video.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.

@item sample_size
Set the sample size (in bits) of the captured audio.

@item channels
Set the number of channels in the captured audio.

@item list_devices
If set to @option{true}, print a list of devices and exit.

@item list_options
If set to @option{true}, print a list of selected device's options
and exit.

@item video_device_number
Set video device number for devices with the same name (starts at 0,
defaults to 0).

@item audio_device_number
Set audio device number for devices with the same name (starts at 0,
defaults to 0).

@item pixel_format
Select pixel format to be used by DirectShow. This may only be set when
the video codec is not set or set to rawvideo.

@item audio_buffer_size
Set audio device buffer size in milliseconds (which can directly
impact latency, depending on the device).
Defaults to using the audio device's
default buffer size (typically some multiple of 500ms).
Setting this value too low can degrade performance.
See also
@url{http://msdn.microsoft.com/en-us/library/windows/desktop/dd377582(v=vs.85).aspx}

@item video_pin_name
Select video capture pin to use by name or alternative name.

@item audio_pin_name
Select audio capture pin to use by name or alternative name.

@item crossbar_video_input_pin_number
Select video input pin number for crossbar device. This will be
routed to the crossbar device's Video Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.

@item crossbar_audio_input_pin_number
Select audio input pin number for crossbar device. This will be
routed to the crossbar device's Audio Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.

@item show_video_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change video filter properties
and configurations manually.
Note that for crossbar devices, adjusting values in this dialog
may be needed at times to toggle between PAL (25 fps) and NTSC (29.97 fps)
input frame rates, sizes, interlacing, etc. Changing these values can
enable different scan rates/frame rates and avoid green bars at
the bottom, flickering scan lines, etc.
Note that with some devices, changing these properties can also affect future
invocations (sets new defaults) until system reboot occurs.

@item show_audio_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change audio filter properties
and configurations manually.

@item show_video_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens a video device.

@item show_audio_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens an audio device.

@item show_analog_tv_tuner_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV channels and frequencies.

@item show_analog_tv_tuner_audio_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV audio (like mono vs. stereo, Language A, B or C).

@end table
@subsection Examples

@itemize

@item
Print the list of DirectShow supported devices and exit:
@example
$ ffmpeg -list_devices true -f dshow -i dummy
@end example

@item
Open video device @var{Camera}:
@example
$ ffmpeg -f dshow -i video="Camera"
@end example

@item
Open second video device with name @var{Camera}:
@example
$ ffmpeg -f dshow -video_device_number 1 -i video="Camera"
@end example

@item
Open video device @var{Camera} and audio device @var{Microphone}:
@example
$ ffmpeg -f dshow -i video="Camera":audio="Microphone"
@end example

@item
Print the list of supported options in selected device and exit:
@example
$ ffmpeg -list_options true -f dshow -i video="Camera"
@end example

@item
Specify pin names to capture by name or alternative name, specify alternative device name:
@example
$ ffmpeg -f dshow -audio_pin_name "Audio Out" -video_pin_name 2 -i video="@@device_pnp_\\?\pci#ven_1a0a&dev_6200&subsys_62021461&rev_01#4&e2c7dd6&0&00e1#@{65e8773d-8f56-11d0-a3b9-00a0c9223196@}\@{ca465100-deb0-4d59-818f-8c477184adf6@}":audio="Microphone"
@end example

@item
Configure a crossbar device, specifying crossbar pins, allow user to adjust video capture properties at startup:
@example
$ ffmpeg -f dshow -show_video_device_dialog true -crossbar_video_input_pin_number 0
  -crossbar_audio_input_pin_number 3 -i video="AVerMedia BDA Analog Capture":audio="AVerMedia BDA Analog Capture"
@end example
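
@item
Open a video and an audio device while also requesting a specific capture
format with the options described above (the device names and values here
are only an illustration; use @option{-list_options true} to see what your
own device accepts):
@example
$ ffmpeg -f dshow -video_size 640x480 -framerate 30 -sample_rate 44100 -sample_size 16 -channels 2 -i video="Camera":audio="Microphone" out.mkv
@end example
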
@end itemize

@section dv1394

Linux DV 1394 input device.

@section fbdev

Linux framebuffer input device.

The Linux framebuffer is a graphic hardware-independent abstraction
layer to show graphics on a computer monitor, typically on the
console. It is accessed through a file device node, usually
@file{/dev/fb0}.

For more detailed information read the file
@file{Documentation/fb/framebuffer.txt} included in the Linux source tree.

To record from the framebuffer device @file{/dev/fb0} with
@command{ffmpeg}:
@example
ffmpeg -f fbdev -r 10 -i /dev/fb0 out.avi
@end example

You can take a single screenshot image with the command:
@example
ffmpeg -f fbdev -frames:v 1 -r 1 -i /dev/fb0 screenshot.jpeg
@end example

See also @url{http://linux-fbdev.sourceforge.net/}, and fbset(1).
@section gdigrab

Win32 GDI-based screen capture device.

This device allows you to capture a region of the display on Windows.

There are two options for the input filename:
@example
desktop
@end example
or
@example
title=@var{window_title}
@end example

The first option will capture the entire desktop, or a fixed region of the
desktop. The second option will instead capture the contents of a single
window, regardless of its position on the screen.

For example, to grab the entire desktop using @command{ffmpeg}:
@example
ffmpeg -f gdigrab -framerate 6 -i desktop out.mpg
@end example

Grab a 640x480 region at position @code{10,20}:
@example
ffmpeg -f gdigrab -framerate 6 -offset_x 10 -offset_y 20 -video_size vga -i desktop out.mpg
@end example

Grab the contents of the window named "Calculator":
@example
ffmpeg -f gdigrab -framerate 6 -i title=Calculator out.mpg
@end example

@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. Use the value @code{0} to
not draw the pointer. Default value is @code{1}.

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

Note that @var{show_region} is incompatible with grabbing the contents
of a single window.

For example:
@example
ffmpeg -f gdigrab -show_region 1 -framerate 6 -video_size cif -offset_x 10 -offset_y 20 -i desktop out.mpg
@end example

@item video_size
Set the video frame size. The default is to capture the full screen if @file{desktop} is selected, or the full window size if @file{title=@var{window_title}} is selected.

@item offset_x
When capturing a region with @var{video_size}, set the distance from the left edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned to the left of your primary monitor, you will need to use a negative @var{offset_x} value to move the region to that monitor.

@item offset_y
When capturing a region with @var{video_size}, set the distance from the top edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned above your primary monitor, you will need to use a negative @var{offset_y} value to move the region to that monitor.

@end table
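
For instance, to grab a 1920x1080 region on a secondary monitor placed
directly to the left of the primary one, a negative offset as described
above could be used (the 1920-pixel offset is only an assumption about
that particular monitor layout):
@example
ffmpeg -f gdigrab -framerate 6 -offset_x -1920 -offset_y 0 -video_size 1920x1080 -i desktop out.mpg
@end example
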
@section iec61883

FireWire DV/HDV input device using libiec61883.

To enable this input device, you need libiec61883, libraw1394 and
libavc1394 installed on your system. Use the configure option
@code{--enable-libiec61883} to compile with the device enabled.

The iec61883 capture device supports capturing from a video device
connected via IEEE1394 (FireWire), using libiec61883 and the new Linux
FireWire stack (juju). This is the default DV/HDV input method in Linux
Kernel 2.6.37 and later, since the old FireWire stack was removed.

Specify the FireWire port to be used as input file, or "auto"
to choose the first port connected.

@subsection Options

@table @option

@item dvtype
Override autodetection of DV/HDV. This should only be used if auto
detection does not work, or if usage of a different device type
should be prohibited. Treating a DV device as HDV (or vice versa) will
not work and result in undefined behavior.
The values @option{auto}, @option{dv} and @option{hdv} are supported.

@item dvbuffer
Set maximum size of buffer for incoming data, in frames. For DV, this
is an exact value. For HDV, it is not frame exact, since HDV does
not have a fixed frame size.

@item dvguid
Select the capture device by specifying its GUID. Capturing will only
be performed from the specified device and fails if no device with the
given GUID is found. This is useful to select the input if multiple
devices are connected at the same time.
Look at @file{/sys/bus/firewire/devices} to find out the GUIDs.

@end table

@subsection Examples

@itemize

@item
Grab and show the input of a FireWire DV/HDV device.
@example
ffplay -f iec61883 -i auto
@end example

@item
Grab and record the input of a FireWire DV/HDV device,
using a packet buffer of 100000 packets if the source is HDV.
@example
ffmpeg -f iec61883 -i auto -dvbuffer 100000 out.mpg
@end example

@end itemize
@section jack

JACK input device.

To enable this input device during configuration you need libjack
installed on your system.

A JACK input device creates one or more JACK writable clients, one for
each audio channel, with name @var{client_name}:input_@var{N}, where
@var{client_name} is the name provided by the application, and @var{N}
is a number which identifies the channel.
Each writable client will send the acquired data to the FFmpeg input
device.

Once you have created one or more JACK readable clients, you need to
connect them to one or more JACK writable clients.

To connect or disconnect JACK clients you can use the @command{jack_connect}
and @command{jack_disconnect} programs, or do it through a graphical interface,
for example with @command{qjackctl}.

To list the JACK clients and their properties you can invoke the command
@command{jack_lsp}.

The following example shows how to capture a JACK readable client
with @command{ffmpeg}.
@example
# Create a JACK writable client with name "ffmpeg".
$ ffmpeg -f jack -i ffmpeg -y out.wav

# Start the sample jack_metro readable client.
$ jack_metro -b 120 -d 0.2 -f 4000

# List the current JACK clients.
$ jack_lsp -c
system:capture_1
system:capture_2
system:playback_1
system:playback_2
ffmpeg:input_1
metro:120_bpm

# Connect metro to the ffmpeg writable client.
$ jack_connect metro:120_bpm ffmpeg:input_1
@end example

For more information read:
@url{http://jackaudio.org/}
@section lavfi

Libavfilter input virtual device.

This input device reads data from the open output pads of a libavfilter
filtergraph.

For each filtergraph open output, the input device will create a
corresponding stream which is mapped to the generated output. Currently
only video data is supported. The filtergraph is specified through the
option @option{graph}.

@subsection Options

@table @option

@item graph
Specify the filtergraph to use as input. Each video open output must be
labelled by a unique string of the form "out@var{N}", where @var{N} is a
number starting from 0 corresponding to the mapped input stream
generated by the device.
The first unlabelled output is automatically assigned to the "out0"
label, but all the others need to be specified explicitly.

The suffix "+subcc" can be appended to the output label to create an extra
stream with the closed captions packets attached to that output
(experimental; only for EIA-608 / CEA-708 for now).
The subcc streams are created after all the normal streams, in the order of
the corresponding stream.
For example, if there is "out19+subcc", "out7+subcc" and up to "out42", the
stream #43 is subcc for stream #7 and stream #44 is subcc for stream #19.

If not specified, it defaults to the filename specified for the input
device.

@item graph_file
Set the filename of the filtergraph to be read and sent to the other
filters. Syntax of the filtergraph is the same as the one specified by
the option @var{graph}.

@end table

@subsection Examples

@itemize

@item
Create a color video stream and play it back with @command{ffplay}:
@example
ffplay -f lavfi -graph "color=c=pink [out0]" dummy
@end example

@item
As in the previous example, but use a filename for specifying the graph
description, and omit the "out0" label:
@example
ffplay -f lavfi color=c=pink
@end example

@item
Create three different video test filtered sources and play them:
@example
ffplay -f lavfi -graph "testsrc [out0]; testsrc,hflip [out1]; testsrc,negate [out2]" test3
@end example

@item
Read an audio stream from a file using the amovie source and play it
back with @command{ffplay}:
@example
ffplay -f lavfi "amovie=test.wav"
@end example

@item
Read an audio stream and a video stream and play it back with
@command{ffplay}:
@example
ffplay -f lavfi "movie=test.avi[out0];amovie=test.wav[out1]"
@end example

@item
Dump decoded frames to images and closed captions to a file (experimental):
@example
ffmpeg -f lavfi -i "movie=test.ts[out0+subcc]" -map v frame%08d.png -map s -c copy -f rawvideo subcc.bin
@end example
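
@item
Read the graph description from a file via the @option{graph_file} option
described above (here @file{graph.txt} is a hypothetical file containing,
for example, the single line @code{color=c=pink [out0]}):
@example
ffplay -f lavfi -graph_file graph.txt dummy
@end example
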
@end itemize

@section libcdio

Audio-CD input device based on libcdio.

To enable this input device during configuration you need libcdio
installed on your system. It requires the configure option
@code{--enable-libcdio}.

This device allows playing and grabbing from an Audio-CD.

For example, to copy the entire Audio-CD in @file{/dev/sr0} with
@command{ffmpeg}, you may run the command:
@example
ffmpeg -f libcdio -i /dev/sr0 cd.wav
@end example

@subsection Options

@table @option

@item speed
Set drive reading speed. Default value is 0.

The speed is specified in CD-ROM speed units. The speed is set through
the libcdio @code{cdio_cddap_speed_set} function. On many CD-ROM
drives, specifying a value too large will result in using the fastest
speed.

@item paranoia_mode
Set paranoia recovery mode flags. It accepts one of the following values:

@table @samp
@item disable
@item verify
@item overlap
@item neverskip
@item full
@end table

Default value is @samp{disable}.

For more information about the available recovery modes, consult the
paranoia project documentation.

@end table
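
As an illustration of these options, a rip with full error recovery and a
fixed drive speed might look like this (the values are chosen only as an
example):
@example
ffmpeg -f libcdio -speed 4 -paranoia_mode full -i /dev/sr0 cd.wav
@end example
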
@section libdc1394

IIDC1394 input device, based on libdc1394 and libraw1394.

Requires the configure option @code{--enable-libdc1394}.

@section openal

The OpenAL input device provides audio capture on all systems with a
working OpenAL 1.1 implementation.

To enable this input device during configuration, you need OpenAL
headers and libraries installed on your system, and need to configure
FFmpeg with @code{--enable-openal}.

OpenAL headers and libraries should be provided as part of your OpenAL
implementation, or as an additional download (an SDK). Depending on your
installation you may need to specify additional flags via the
@code{--extra-cflags} and @code{--extra-ldflags} for allowing the build
system to locate the OpenAL headers and libraries.

An incomplete list of OpenAL implementations follows:

@table @strong

@item Creative
The official Windows implementation, providing hardware acceleration
with supported devices and software fallback.
See @url{http://openal.org/}.

@item OpenAL Soft
Portable, open source (LGPL) software implementation. Includes
backends for the most common sound APIs on the Windows, Linux,
Solaris, and BSD operating systems.
See @url{http://kcat.strangesoft.net/openal.html}.

@item Apple
OpenAL is part of Core Audio, the official Mac OS X Audio interface.
See @url{http://developer.apple.com/technologies/mac/audio-and-video.html}

@end table

This device allows one to capture from an audio input device handled
through OpenAL.

You need to specify the name of the device to capture in the provided
filename. If the empty string is provided, the device will
automatically select the default device. You can get the list of the
supported devices by using the option @var{list_devices}.

@subsection Options

@table @option

@item channels
Set the number of channels in the captured audio. Only the values
@option{1} (monaural) and @option{2} (stereo) are currently supported.
Defaults to @option{2}.

@item sample_size
Set the sample size (in bits) of the captured audio. Only the values
@option{8} and @option{16} are currently supported. Defaults to
@option{16}.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.
Defaults to @option{44.1k}.

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}.

@end table

@subsection Examples

Print the list of OpenAL supported devices and exit:
@example
$ ffmpeg -list_devices true -f openal -i dummy out.ogg
@end example

Capture from the OpenAL device @file{DR-BT101 via PulseAudio}:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out.ogg
@end example

Capture from the default device (note the empty string '' as filename):
@example
$ ffmpeg -f openal -i '' out.ogg
@end example

Capture from two devices simultaneously, writing to two different files,
within the same @command{ffmpeg} command:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out1.ogg -f openal -i 'ALSA Default' out2.ogg
@end example

Note: not all OpenAL implementations support multiple simultaneous capture -
try the latest OpenAL Soft if the above does not work.
@section oss

Open Sound System input device.

The filename to provide to the input device is the device node
representing the OSS input device, and is usually set to
@file{/dev/dsp}.

For example, to grab from @file{/dev/dsp} using @command{ffmpeg} use the
command:
@example
ffmpeg -f oss -i /dev/dsp /tmp/oss.wav
@end example

For more information about OSS see:
@url{http://manuals.opensound.com/usersguide/dsp.html}
@section pulse

PulseAudio input device.

To enable this input device you need to configure FFmpeg with @code{--enable-libpulse}.

The filename to provide to the input device is a source device or the
string "default".

To list the PulseAudio source devices and their properties you can invoke
the command @command{pactl list sources}.

More information about PulseAudio can be found on @url{http://www.pulseaudio.org}.

@subsection Options

@table @option

@item server
Connect to a specific PulseAudio server, specified by an IP address.
Default server is used when not provided.

@item name
Specify the application name PulseAudio will use when showing active clients,
by default it is the @code{LIBAVFORMAT_IDENT} string.

@item stream_name
Specify the stream name PulseAudio will use when showing active streams,
by default it is "record".

@item sample_rate
Specify the sample rate in Hz, by default 48kHz is used.

@item channels
Specify the channels in use, by default 2 (stereo) is set.

@item frame_size
Specify the number of bytes per frame, by default it is set to 1024.

@item fragment_size
Specify the minimal buffering fragment in PulseAudio, it will affect the
audio latency. By default it is unset.

@end table

@subsection Examples

Record a stream from default device:
@example
ffmpeg -f pulse -i default /tmp/pulse.wav
@end example
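
Record from a specific source while overriding the sample rate and channel
count described above (the source name below is purely hypothetical; use
@command{pactl list sources} to find the names on your system):
@example
ffmpeg -f pulse -sample_rate 44100 -channels 1 -i alsa_input.pci-0000_00_1b.0.analog-stereo /tmp/pulse.wav
@end example
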
@section qtkit

QTKit input device.

The filename passed as input is parsed to contain either a device name or index.
The device index can also be given by using -video_device_index.
A given device index will override any given device name.
If the desired device consists of numbers only, use -video_device_index to identify it.

The default device will be chosen if an empty string or the device name "default" is given.
The available devices can be enumerated by using -list_devices.

@example
ffmpeg -f qtkit -i "0" out.mpg
@end example

@example
ffmpeg -f qtkit -video_device_index 0 -i "" out.mpg
@end example

@example
ffmpeg -f qtkit -i "default" out.mpg
@end example

@example
ffmpeg -f qtkit -list_devices true -i ""
@end example

@section sndio

sndio input device.

To enable this input device during configuration you need libsndio
installed on your system.

The filename to provide to the input device is the device node
representing the sndio input device, and is usually set to
@file{/dev/audio0}.

For example, to grab from @file{/dev/audio0} using @command{ffmpeg} use the
command:
@example
ffmpeg -f sndio -i /dev/audio0 /tmp/oss.wav
@end example
@section video4linux2, v4l2

Video4Linux2 input video device.

"v4l2" can be used as an alias for "video4linux2".

If FFmpeg is built with v4l-utils support (by using the
@code{--enable-libv4l2} configure option), it is possible to use it with the
@code{-use_libv4l2} input device option.

The name of the device to grab is a file device node. Usually Linux
systems tend to automatically create such nodes when the device
(e.g. a USB webcam) is plugged into the system, and the node has a name
of the kind @file{/dev/video@var{N}}, where @var{N} is a number associated
with the device.

Video4Linux2 devices usually support a limited set of
@var{width}x@var{height} sizes and frame rates. You can check which are
supported using @command{-list_formats all} for Video4Linux2 devices.
Some devices, like TV cards, support one or more standards. It is possible
to list all the supported standards using @command{-list_standards all}.

The time base for the timestamps is 1 microsecond. Depending on the kernel
version and configuration, the timestamps may be derived from the real time
clock (origin at the Unix Epoch) or the monotonic clock (origin usually at
boot time, unaffected by NTP or manual changes to the clock). The
@option{-timestamps abs} or @option{-ts abs} option can be used to force
conversion into the real time clock.

Some usage examples of the video4linux2 device with @command{ffmpeg}
and @command{ffplay}:

@itemize

@item
List supported formats for a video4linux2 device:
@example
ffplay -f video4linux2 -list_formats all /dev/video0
@end example

@item
Grab and show the input of a video4linux2 device:
@example
ffplay -f video4linux2 -framerate 30 -video_size hd720 /dev/video0
@end example

@item
Grab and record the input of a video4linux2 device, leave the
frame rate and size as previously set:
@example
ffmpeg -f video4linux2 -input_format mjpeg -i /dev/video0 out.mpeg
@end example

@end itemize

For more information about Video4Linux, check @url{http://linuxtv.org/}.

@subsection Options

@table @option

@item standard
Set the standard. Must be the name of a supported standard. To get a
list of the supported standards, use the @option{list_standards}
option.

@item channel
Set the input channel number. Defaults to -1, which means using the
previously selected channel.

@item video_size
Set the video frame size. The argument must be a string in the form
@var{WIDTH}x@var{HEIGHT} or a valid size abbreviation.

@item pixel_format
Select the pixel format (only valid for raw video input).

@item input_format
Set the preferred pixel format (for raw video) or a codec name.
This option allows one to select the input format, when several are
available.

@item framerate
Set the preferred video frame rate.

@item list_formats
List available formats (supported pixel formats, codecs, and frame
sizes) and exit.

Available values are:
@table @samp
@item all
Show all available (compressed and non-compressed) formats.

@item raw
Show only raw video (non-compressed) formats.

@item compressed
Show only compressed formats.
@end table

@item list_standards
List supported standards and exit.

Available values are:
@table @samp
@item all
Show all supported standards.
@end table

@item timestamps, ts
Set type of timestamps for grabbed frames.

Available values are:
@table @samp
@item default
Use timestamps from the kernel.

@item abs
Use absolute timestamps (wall clock).

@item mono2abs
Force conversion from monotonic to absolute timestamps.
@end table

Default value is @code{default}.

@end table
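
For example, to record from such a device while forcing wall-clock
timestamps as described above (device node and output name are just
placeholders):
@example
ffmpeg -f video4linux2 -ts abs -i /dev/video0 out.mkv
@end example
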
@section vfwcap

VfW (Video for Windows) capture input device.

The filename passed as input is the capture driver number, ranging from
0 to 9. You may use "list" as filename to print a list of drivers. Any
other filename will be interpreted as device number 0.
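
For example, capturing from the first VfW driver might look like this
(a minimal sketch based on the numbering scheme above):
@example
ffmpeg -f vfwcap -i 0 out.avi
@end example
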
@section x11grab

X11 video input device.

To enable this input device during configuration you need libxcb
installed on your system. It will be automatically detected during
configuration.

Alternatively, the configure option @option{--enable-x11grab} exists
for legacy Xlib users.

This device allows one to capture a region of an X11 display.

The filename passed as input has the syntax:
@example
[@var{hostname}]:@var{display_number}.@var{screen_number}[+@var{x_offset},@var{y_offset}]
@end example

@var{hostname}:@var{display_number}.@var{screen_number} specifies the
X11 display name of the screen to grab from. @var{hostname} can be
omitted, and defaults to "localhost". The environment variable
@env{DISPLAY} contains the default display name.

@var{x_offset} and @var{y_offset} specify the offsets of the grabbed
area with respect to the top-left border of the X11 screen. They
default to 0.

Check the X11 documentation (e.g. @command{man X}) for more detailed
information.

Use the @command{xdpyinfo} program for getting basic information about
the properties of your X11 display (e.g. grep for "name" or
"dimensions").

For example, to grab from @file{:0.0} using @command{ffmpeg}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

Grab at position @code{10,20}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. A value of @code{0} specifies
not to draw the pointer. Default value is @code{1}.

@item follow_mouse
Make the grabbed area follow the mouse. The argument can be
@code{centered} or a number of pixels @var{PIXELS}.

When it is specified with "centered", the grabbing region follows the mouse
pointer and keeps the pointer at the center of the region; otherwise, the region
follows only when the mouse pointer reaches within @var{PIXELS} (greater than
zero) of the edge of the region.

For example:
@example
ffmpeg -f x11grab -follow_mouse centered -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

To follow only when the mouse pointer reaches within 100 pixels of the edge:
@example
ffmpeg -f x11grab -follow_mouse 100 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

@item region_border
Set the region border thickness if @option{-show_region 1} is used.
Range is 1 to 128 and default is 3 (XCB-based x11grab only).

For example:
@example
ffmpeg -f x11grab -show_region 1 -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

With @var{follow_mouse}:
@example
ffmpeg -f x11grab -follow_mouse centered -show_region 1 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item video_size
Set the video frame size. Default value is @code{vga}.

@item use_shm
Use the MIT-SHM extension for shared memory. Default value is @code{1}.
It may be necessary to disable it for remote displays (legacy x11grab
only).

@end table

@subsection @var{grab_x} @var{grab_y} AVOption

The syntax is:
@example
-grab_x @var{x_offset} -grab_y @var{y_offset}
@end example

Set the grabbing region coordinates. They are expressed as offsets from the top left
corner of the X11 window. The default value is 0.
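
As a concrete illustration (assuming it is equivalent to the
@code{:0.0+10,20} form shown earlier):
@example
ffmpeg -f x11grab -grab_x 10 -grab_y 20 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example
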
@c man end INPUT DEVICES