@chapter Input Devices
@c man begin INPUT DEVICES

Input devices are configured elements in FFmpeg which allow accessing
the data coming from a multimedia device attached to your system.

When you configure your FFmpeg build, all the supported input devices
are enabled by default. You can list all available ones using the
configure option "--list-indevs".

You can disable all the input devices using the configure option
"--disable-indevs", and selectively enable an input device using the
option "--enable-indev=@var{INDEV}", or you can disable a particular
input device using the option "--disable-indev=@var{INDEV}".

The option "-devices" of the ff* tools will display the list of
supported input devices.

A description of the currently available input devices follows.

@section alsa

ALSA (Advanced Linux Sound Architecture) input device.

To enable this input device during configuration you need libasound
installed on your system.

This device allows capturing from an ALSA device. The name of the
device to capture has to be an ALSA card identifier.

An ALSA identifier has the syntax:
@example
hw:@var{CARD}[,@var{DEV}[,@var{SUBDEV}]]
@end example

where the @var{DEV} and @var{SUBDEV} components are optional.

The three arguments (in order: @var{CARD}, @var{DEV}, @var{SUBDEV})
specify card number or identifier, device number and subdevice number
(-1 means any).

To see the list of cards currently recognized by your system check the
files @file{/proc/asound/cards} and @file{/proc/asound/devices}.

For example to capture with @command{ffmpeg} from an ALSA device with
card id 0, you may run the command:
@example
ffmpeg -f alsa -i hw:0 alsaout.wav
@end example
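
If the optional @var{DEV} component is given, capture is restricted to
that device on the card; a minimal sketch, assuming device 0 exists on
card 0:
@example
ffmpeg -f alsa -i hw:0,0 alsaout.wav
@end example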

For more information see:
@url{http://www.alsa-project.org/alsa-doc/alsa-lib/pcm.html}

@section avfoundation

AVFoundation input device.

AVFoundation is the currently recommended framework by Apple for stream
grabbing on OSX >= 10.7 as well as on iOS.
The older QTKit framework has been marked deprecated since OSX version 10.7.

The input filename has to be given in the following syntax:
@example
-i "[[VIDEO]:[AUDIO]]"
@end example
The first entry selects the video input while the second selects the audio input.
The stream has to be specified by the device name or the device index as shown by the device list.
Alternatively, the video and/or audio input device can be chosen by index using the
@option{-video_device_index <INDEX>} and/or @option{-audio_device_index <INDEX>} options,
overriding any device name or index given in the input filename.

All available devices can be enumerated by using @option{-list_devices true}, listing
all device names and corresponding indices.

There are two device name aliases:
@table @code

@item default
Select the AVFoundation default device of the corresponding type.

@item none
Do not record the corresponding media type.
This is equivalent to specifying an empty device name or index.

@end table

@subsection Options

AVFoundation supports the following options:

@table @option

@item -list_devices <TRUE|FALSE>
If set to true, a list of all available input devices is given showing all
device names and indices.

@item -video_device_index <INDEX>
Specify the video device by its index. Overrides anything given in the input filename.

@item -audio_device_index <INDEX>
Specify the audio device by its index. Overrides anything given in the input filename.

@item -pixel_format <FORMAT>
Request the video device to use a specific pixel format.
If the specified format is not supported, a list of available formats is given
and the first one in this list is used instead. Available pixel formats are:
@code{monob, rgb555be, rgb555le, rgb565be, rgb565le, rgb24, bgr24, 0rgb, bgr0, 0bgr, rgb0,
bgr48be, uyvy422, yuva444p, yuva444p16le, yuv444p, yuv422p16, yuv422p10, yuv444p10,
yuv420p, nv12, yuyv422, gray}

@end table

@subsection Examples

@itemize

@item
Print the list of AVFoundation supported devices and exit:
@example
$ ffmpeg -f avfoundation -list_devices true -i ""
@end example

@item
Record video from video device 0 and audio from audio device 0 into out.avi:
@example
$ ffmpeg -f avfoundation -i "0:0" out.avi
@end example

@item
Record video from video device 2 and audio from audio device 1 into out.avi:
@example
$ ffmpeg -f avfoundation -video_device_index 2 -i ":1" out.avi
@end example

@item
Record video from the system default video device using the pixel format bgr0 and do not record any audio into out.avi:
@example
$ ffmpeg -f avfoundation -pixel_format bgr0 -i "default:none" out.avi
@end example

@end itemize
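
Device names can also be used directly in the input filename; a minimal
sketch (the device names below are placeholders, take the real ones from
the @option{-list_devices true} output):
@example
$ ffmpeg -f avfoundation -i "FaceTime HD Camera:Built-in Microphone" out.avi
@end example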

@section bktr

BSD video input device.

@section dshow

Windows DirectShow input device.

DirectShow support is enabled when FFmpeg is built with the mingw-w64 project.
Currently only audio and video devices are supported.

Multiple devices may be opened as separate inputs, but they may also be
opened on the same input, which should improve synchronization between them.

The input name should be in the format:
@example
@var{TYPE}=@var{NAME}[:@var{TYPE}=@var{NAME}]
@end example
where @var{TYPE} can be either @var{audio} or @var{video},
and @var{NAME} is the device's name.

@subsection Options

If no options are specified, the device's defaults are used.
If the device does not support the requested options, it will
fail to open.

@table @option

@item video_size
Set the video size in the captured video.

@item framerate
Set the frame rate in the captured video.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.

@item sample_size
Set the sample size (in bits) of the captured audio.

@item channels
Set the number of channels in the captured audio.

@item list_devices
If set to @option{true}, print a list of devices and exit.

@item list_options
If set to @option{true}, print a list of the selected device's options
and exit.

@item video_device_number
Set video device number for devices with the same name (starts at 0,
defaults to 0).

@item audio_device_number
Set audio device number for devices with the same name (starts at 0,
defaults to 0).

@item pixel_format
Select pixel format to be used by DirectShow. This may only be set when
the video codec is not set or set to rawvideo.

@item audio_buffer_size
Set audio device buffer size in milliseconds (which can directly
impact latency, depending on the device).
Defaults to using the audio device's
default buffer size (typically some multiple of 500ms).
Setting this value too low can degrade performance.
See also
@url{http://msdn.microsoft.com/en-us/library/windows/desktop/dd377582(v=vs.85).aspx}

@end table

@subsection Examples

@itemize

@item
Print the list of DirectShow supported devices and exit:
@example
$ ffmpeg -list_devices true -f dshow -i dummy
@end example

@item
Open video device @var{Camera}:
@example
$ ffmpeg -f dshow -i video="Camera"
@end example

@item
Open second video device with name @var{Camera}:
@example
$ ffmpeg -f dshow -video_device_number 1 -i video="Camera"
@end example

@item
Open video device @var{Camera} and audio device @var{Microphone}:
@example
$ ffmpeg -f dshow -i video="Camera":audio="Microphone"
@end example

@item
Print the list of supported options of the selected device and exit:
@example
$ ffmpeg -list_options true -f dshow -i video="Camera"
@end example

@end itemize
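
The capture options described above can be combined on the command line;
a minimal sketch (the device name and the exact supported values are
assumptions, check them with @option{-list_options true} first):
@example
$ ffmpeg -f dshow -video_size 640x480 -framerate 30 -pixel_format yuyv422 -i video="Camera" out.avi
@end example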

@section dv1394

Linux DV 1394 input device.

@section fbdev

Linux framebuffer input device.

The Linux framebuffer is a graphic hardware-independent abstraction
layer to show graphics on a computer monitor, typically on the
console. It is accessed through a file device node, usually
@file{/dev/fb0}.

For more detailed information read the file
Documentation/fb/framebuffer.txt included in the Linux source tree.

To record from the framebuffer device @file{/dev/fb0} with
@command{ffmpeg}:
@example
ffmpeg -f fbdev -r 10 -i /dev/fb0 out.avi
@end example

You can take a single screenshot image with the command:
@example
ffmpeg -f fbdev -frames:v 1 -r 1 -i /dev/fb0 screenshot.jpeg
@end example

See also @url{http://linux-fbdev.sourceforge.net/}, and fbset(1).

@section gdigrab

Win32 GDI-based screen capture device.

This device allows you to capture a region of the display on Windows.

There are two options for the input filename:
@example
desktop
@end example
or
@example
title=@var{window_title}
@end example

The first option will capture the entire desktop, or a fixed region of the
desktop. The second option will instead capture the contents of a single
window, regardless of its position on the screen.

For example, to grab the entire desktop using @command{ffmpeg}:
@example
ffmpeg -f gdigrab -framerate 6 -i desktop out.mpg
@end example

Grab a 640x480 region at position @code{10,20}:
@example
ffmpeg -f gdigrab -framerate 6 -offset_x 10 -offset_y 20 -video_size vga -i desktop out.mpg
@end example

Grab the contents of the window named "Calculator":
@example
ffmpeg -f gdigrab -framerate 6 -i title=Calculator out.mpg
@end example

@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. Use the value @code{0} to
not draw the pointer. Default value is @code{1}.

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

Note that @var{show_region} is incompatible with grabbing the contents
of a single window.

For example:
@example
ffmpeg -f gdigrab -show_region 1 -framerate 6 -video_size cif -offset_x 10 -offset_y 20 -i desktop out.mpg
@end example

@item video_size
Set the video frame size. The default is to capture the full screen if @file{desktop} is selected, or the full window size if @file{title=@var{window_title}} is selected.

@item offset_x
When capturing a region with @var{video_size}, set the distance from the left edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned to the left of your primary monitor, you will need to use a negative @var{offset_x} value to move the region to that monitor.

@item offset_y
When capturing a region with @var{video_size}, set the distance from the top edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned above your primary monitor, you will need to use a negative @var{offset_y} value to move the region to that monitor.

@end table
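
For example, a minimal sketch grabbing a full-HD secondary monitor
positioned to the left of the primary one (the 1920x1080 geometry is an
assumption, adjust it to your layout):
@example
ffmpeg -f gdigrab -framerate 6 -offset_x -1920 -offset_y 0 -video_size 1920x1080 -i desktop out.mpg
@end example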

@section iec61883

FireWire DV/HDV input device using libiec61883.

To enable this input device, you need libiec61883, libraw1394 and
libavc1394 installed on your system. Use the configure option
@code{--enable-libiec61883} to compile with the device enabled.

The iec61883 capture device supports capturing from a video device
connected via IEEE1394 (FireWire), using libiec61883 and the new Linux
FireWire stack (juju). This is the default DV/HDV input method in Linux
Kernel 2.6.37 and later, since the old FireWire stack was removed.

Specify the FireWire port to be used as input file, or "auto"
to choose the first port connected.

@subsection Options

@table @option

@item dvtype
Override autodetection of DV/HDV. This should only be used if auto
detection does not work, or if usage of a different device type
should be prohibited. Treating a DV device as HDV (or vice versa) will
not work and will result in undefined behavior.
The values @option{auto}, @option{dv} and @option{hdv} are supported.

@item dvbuffer
Set maximum size of buffer for incoming data, in frames. For DV, this
is an exact value. For HDV, it is not frame exact, since HDV does
not have a fixed frame size.

@item dvguid
Select the capture device by specifying its GUID. Capturing will only
be performed from the specified device and fails if no device with the
given GUID is found. This is useful to select the input if multiple
devices are connected at the same time.
Look at /sys/bus/firewire/devices to find out the GUIDs.

@end table

@subsection Examples

@itemize

@item
Grab and show the input of a FireWire DV/HDV device.
@example
ffplay -f iec61883 -i auto
@end example

@item
Grab and record the input of a FireWire DV/HDV device,
using a packet buffer of 100000 packets if the source is HDV.
@example
ffmpeg -f iec61883 -i auto -dvbuffer 100000 out.mpg
@end example

@end itemize

@section jack

JACK input device.

To enable this input device during configuration you need libjack
installed on your system.

A JACK input device creates one or more JACK writable clients, one for
each audio channel, with name @var{client_name}:input_@var{N}, where
@var{client_name} is the name provided by the application, and @var{N}
is a number which identifies the channel.
Each writable client will send the acquired data to the FFmpeg input
device.

Once you have created one or more JACK readable clients, you need to
connect them to one or more JACK writable clients.

To connect or disconnect JACK clients you can use the @command{jack_connect}
and @command{jack_disconnect} programs, or do it through a graphical interface,
for example with @command{qjackctl}.

To list the JACK clients and their properties you can invoke the command
@command{jack_lsp}.

The following example shows how to capture a JACK readable client
with @command{ffmpeg}.
@example
# Create a JACK writable client with name "ffmpeg".
$ ffmpeg -f jack -i ffmpeg -y out.wav

# Start the sample jack_metro readable client.
$ jack_metro -b 120 -d 0.2 -f 4000

# List the current JACK clients.
$ jack_lsp -c
system:capture_1
system:capture_2
system:playback_1
system:playback_2
ffmpeg:input_1
metro:120_bpm

# Connect metro to the ffmpeg writable client.
$ jack_connect metro:120_bpm ffmpeg:input_1
@end example

For more information read:
@url{http://jackaudio.org/}

@section lavfi

Libavfilter input virtual device.

This input device reads data from the open output pads of a libavfilter
filtergraph.

For each filtergraph open output, the input device will create a
corresponding stream which is mapped to the generated output. Currently
only video data is supported. The filtergraph is specified through the
option @option{graph}.

@subsection Options

@table @option

@item graph
Specify the filtergraph to use as input. Each video open output must be
labelled by a unique string of the form "out@var{N}", where @var{N} is a
number starting from 0 corresponding to the mapped input stream
generated by the device.
The first unlabelled output is automatically assigned to the "out0"
label, but all the others need to be specified explicitly.

If not specified, it defaults to the filename specified for the input
device.

@item graph_file
Set the filename of the filtergraph to be read and sent to the other
filters. Syntax of the filtergraph is the same as the one specified by
the option @var{graph}.

@end table

@subsection Examples

@itemize

@item
Create a color video stream and play it back with @command{ffplay}:
@example
ffplay -f lavfi -graph "color=c=pink [out0]" dummy
@end example

@item
As the previous example, but use filename for specifying the graph
description, and omit the "out0" label:
@example
ffplay -f lavfi color=c=pink
@end example

@item
Create three different video test filtered sources and play them:
@example
ffplay -f lavfi -graph "testsrc [out0]; testsrc,hflip [out1]; testsrc,negate [out2]" test3
@end example

@item
Read an audio stream from a file using the amovie source and play it
back with @command{ffplay}:
@example
ffplay -f lavfi "amovie=test.wav"
@end example

@item
Read an audio stream and a video stream and play it back with
@command{ffplay}:
@example
ffplay -f lavfi "movie=test.avi[out0];amovie=test.wav[out1]"
@end example

@end itemize
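
The @option{graph_file} option reads the same graph syntax from a file;
a minimal sketch (@file{graph.txt} is a hypothetical file containing,
for instance, "color=c=pink [out0]"):
@example
ffplay -f lavfi -graph_file graph.txt dummy
@end example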

@section libcdio

Audio-CD input device based on libcdio.

To enable this input device during configuration you need libcdio
installed on your system. It also requires the configure option
@code{--enable-libcdio}.

This device allows playing and grabbing from an Audio-CD.

For example to copy with @command{ffmpeg} the entire Audio-CD in /dev/sr0,
you may run the command:
@example
ffmpeg -f libcdio -i /dev/sr0 cd.wav
@end example

@section libdc1394

IIDC1394 input device, based on libdc1394 and libraw1394.

Requires the configure option @code{--enable-libdc1394}.

@section openal

The OpenAL input device provides audio capture on all systems with a
working OpenAL 1.1 implementation.

To enable this input device during configuration, you need OpenAL
headers and libraries installed on your system, and need to configure
FFmpeg with @code{--enable-openal}.

OpenAL headers and libraries should be provided as part of your OpenAL
implementation, or as an additional download (an SDK). Depending on your
installation you may need to specify additional flags via
@code{--extra-cflags} and @code{--extra-ldflags} to allow the build
system to locate the OpenAL headers and libraries.

An incomplete list of OpenAL implementations follows:

@table @strong
@item Creative
The official Windows implementation, providing hardware acceleration
with supported devices and software fallback.
See @url{http://openal.org/}.
@item OpenAL Soft
Portable, open source (LGPL) software implementation. Includes
backends for the most common sound APIs on the Windows, Linux,
Solaris, and BSD operating systems.
See @url{http://kcat.strangesoft.net/openal.html}.
@item Apple
OpenAL is part of Core Audio, the official Mac OS X audio interface.
See @url{http://developer.apple.com/technologies/mac/audio-and-video.html}.
@end table

This device allows one to capture from an audio input device handled
through OpenAL.

You need to specify the name of the device to capture in the provided
filename. If the empty string is provided, the device will
automatically select the default device. You can get the list of the
supported devices by using the option @var{list_devices}.

@subsection Options

@table @option

@item channels
Set the number of channels in the captured audio. Only the values
@option{1} (monaural) and @option{2} (stereo) are currently supported.
Defaults to @option{2}.

@item sample_size
Set the sample size (in bits) of the captured audio. Only the values
@option{8} and @option{16} are currently supported. Defaults to
@option{16}.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.
Defaults to @option{44.1k}.

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}.

@end table

@subsection Examples

Print the list of OpenAL supported devices and exit:
@example
$ ffmpeg -list_devices true -f openal -i dummy out.ogg
@end example

Capture from the OpenAL device @file{DR-BT101 via PulseAudio}:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out.ogg
@end example

Capture from the default device (note the empty string '' as filename):
@example
$ ffmpeg -f openal -i '' out.ogg
@end example

Capture from two devices simultaneously, writing to two different files,
within the same @command{ffmpeg} command:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out1.ogg -f openal -i 'ALSA Default' out2.ogg
@end example
Note: not all OpenAL implementations support multiple simultaneous capture -
try the latest OpenAL Soft if the above does not work.
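
The capture parameters above can also be set explicitly; a minimal
sketch capturing mono 16 kHz audio from the default device (the values
are only an illustration):
@example
$ ffmpeg -f openal -channels 1 -sample_rate 16000 -sample_size 16 -i '' out.wav
@end example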

@section oss

Open Sound System input device.

The filename to provide to the input device is the device node
representing the OSS input device, and is usually set to
@file{/dev/dsp}.

For example to grab from @file{/dev/dsp} using @command{ffmpeg} use the
command:
@example
ffmpeg -f oss -i /dev/dsp /tmp/oss.wav
@end example

For more information about OSS see:
@url{http://manuals.opensound.com/usersguide/dsp.html}

@section pulse

PulseAudio input device.

To enable this input device you need to configure FFmpeg with @code{--enable-libpulse}.

The filename to provide to the input device is a source device or the
string "default".

To list the PulseAudio source devices and their properties you can invoke
the command @command{pactl list sources}.

More information about PulseAudio can be found on @url{http://www.pulseaudio.org}.

@subsection Options

@table @option

@item server
Connect to a specific PulseAudio server, specified by an IP address.
Default server is used when not provided.

@item name
Specify the application name PulseAudio will use when showing active clients,
by default it is the @code{LIBAVFORMAT_IDENT} string.

@item stream_name
Specify the stream name PulseAudio will use when showing active streams,
by default it is "record".

@item sample_rate
Specify the sample rate in Hz, by default 48kHz is used.

@item channels
Specify the channels in use, by default 2 (stereo) is set.

@item frame_size
Specify the number of bytes per frame, by default it is set to 1024.

@item fragment_size
Specify the minimal buffering fragment in PulseAudio, it will affect the
audio latency. By default it is unset.

@end table

@subsection Examples

Record a stream from the default device:
@example
ffmpeg -f pulse -i default /tmp/pulse.wav
@end example
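
A sketch recording mono audio at 44.1 kHz from the default source, using
the options above (the values are only an illustration):
@example
ffmpeg -f pulse -sample_rate 44100 -channels 1 -i default /tmp/pulse_mono.wav
@end example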

@section qtkit

QTKit input device.

The filename passed as input is parsed to contain either a device name or index.
The device index can also be given by using -video_device_index.
A given device index will override any given device name.
If the desired device consists of numbers only, use -video_device_index to identify it.
The default device will be chosen if an empty string or the device name "default" is given.
The available devices can be enumerated by using -list_devices.

@example
ffmpeg -f qtkit -i "0" out.mpg
@end example

@example
ffmpeg -f qtkit -video_device_index 0 -i "" out.mpg
@end example

@example
ffmpeg -f qtkit -i "default" out.mpg
@end example

@example
ffmpeg -f qtkit -list_devices true -i ""
@end example

@section sndio

sndio input device.

To enable this input device during configuration you need libsndio
installed on your system.

The filename to provide to the input device is the device node
representing the sndio input device, and is usually set to
@file{/dev/audio0}.

For example to grab from @file{/dev/audio0} using @command{ffmpeg} use the
command:
@example
ffmpeg -f sndio -i /dev/audio0 /tmp/oss.wav
@end example

@section video4linux2, v4l2

Video4Linux2 input video device.

"v4l2" can be used as alias for "video4linux2".

If FFmpeg is built with v4l-utils support (by using the
@code{--enable-libv4l2} configure option), it is possible to use it with the
@code{-use_libv4l2} input device option.

The name of the device to grab is a file device node. Linux systems
usually create such nodes automatically when the device (e.g. a USB
webcam) is plugged into the system; the node has a name of the kind
@file{/dev/video@var{N}}, where @var{N} is a number associated with
the device.

Video4Linux2 devices usually support a limited set of
@var{width}x@var{height} sizes and frame rates. You can check which are
supported using @option{-list_formats all} for Video4Linux2 devices.
Some devices, like TV cards, support one or more standards. It is possible
to list all the supported standards using @option{-list_standards all}.

The time base for the timestamps is 1 microsecond. Depending on the kernel
version and configuration, the timestamps may be derived from the real time
clock (origin at the Unix Epoch) or the monotonic clock (origin usually at
boot time, unaffected by NTP or manual changes to the clock). The
@option{-timestamps abs} or @option{-ts abs} option can be used to force
conversion into the real time clock.
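
For example, a minimal sketch forcing wall-clock timestamps while
recording (the device node @file{/dev/video0} is an assumption):
@example
ffmpeg -f v4l2 -ts abs -i /dev/video0 out.mkv
@end example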

Some usage examples of the video4linux2 device with @command{ffmpeg}
and @command{ffplay}:

@itemize
@item
Grab and show the input of a video4linux2 device:
@example
ffplay -f video4linux2 -framerate 30 -video_size hd720 /dev/video0
@end example

@item
Grab and record the input of a video4linux2 device, leave the
frame rate and size as previously set:
@example
ffmpeg -f video4linux2 -input_format mjpeg -i /dev/video0 out.mpeg
@end example
@end itemize
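
The formats supported by a device can be queried in the same way; a
minimal sketch (the device node is an assumption):
@example
ffmpeg -f video4linux2 -list_formats all -i /dev/video0
@end example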

For more information about Video4Linux, check @url{http://linuxtv.org/}.

@subsection Options

@table @option

@item standard
Set the standard. Must be the name of a supported standard. To get a
list of the supported standards, use the @option{list_standards}
option.

@item channel
Set the input channel number. Defaults to -1, which means using the
previously selected channel.

@item video_size
Set the video frame size. The argument must be a string in the form
@var{WIDTH}x@var{HEIGHT} or a valid size abbreviation.

@item pixel_format
Select the pixel format (only valid for raw video input).

@item input_format
Set the preferred pixel format (for raw video) or a codec name.
This option allows one to select the input format, when several are
available.

@item framerate
Set the preferred video frame rate.

@item list_formats
List available formats (supported pixel formats, codecs, and frame
sizes) and exit.

Available values are:
@table @samp
@item all
Show all available (compressed and non-compressed) formats.

@item raw
Show only raw video (non-compressed) formats.

@item compressed
Show only compressed formats.
@end table

@item list_standards
List supported standards and exit.

Available values are:
@table @samp
@item all
Show all supported standards.
@end table

@item timestamps, ts
Set type of timestamps for grabbed frames.

Available values are:
@table @samp
@item default
Use timestamps from the kernel.

@item abs
Use absolute timestamps (wall clock).

@item mono2abs
Force conversion from monotonic to absolute timestamps.
@end table

Default value is @code{default}.
@end table

@section vfwcap

VfW (Video for Windows) capture input device.

The filename passed as input is the capture driver number, ranging from
0 to 9. You may use "list" as filename to print a list of drivers. Any
other filename will be interpreted as device number 0.
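
For example, a sketch listing the available drivers and then capturing
from driver number 0:
@example
ffmpeg -f vfwcap -i list
ffmpeg -f vfwcap -i 0 out.avi
@end example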

@section x11grab

X11 video input device.

Depends on X11, Xext, and Xfixes. Requires the configure option
@code{--enable-x11grab}.

This device allows one to capture a region of an X11 display.

The filename passed as input has the syntax:
@example
[@var{hostname}]:@var{display_number}.@var{screen_number}[+@var{x_offset},@var{y_offset}]
@end example

@var{hostname}:@var{display_number}.@var{screen_number} specifies the
X11 display name of the screen to grab from. @var{hostname} can be
omitted, and defaults to "localhost". The environment variable
@env{DISPLAY} contains the default display name.

@var{x_offset} and @var{y_offset} specify the offsets of the grabbed
area with respect to the top-left border of the X11 screen. They
default to 0.

Check the X11 documentation (e.g. man X) for more detailed information.

Use the @command{xdpyinfo} program for getting basic information about the
properties of your X11 display (e.g. grep for "name" or "dimensions").

For example to grab from @file{:0.0} using @command{ffmpeg}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

Grab at position @code{10,20}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. A value of @code{0} specifies
not to draw the pointer. Default value is @code{1}.

@item follow_mouse
Make the grabbed area follow the mouse. The argument can be
@code{centered} or a number of pixels @var{PIXELS}.

When it is specified with "centered", the grabbing region follows the mouse
pointer and keeps the pointer at the center of the region; otherwise, the region
follows only when the mouse pointer reaches within @var{PIXELS} (greater than
zero) of the edge of the region.

For example:
@example
ffmpeg -f x11grab -follow_mouse centered -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

To follow only when the mouse pointer reaches within 100 pixels of the edge:
@example
ffmpeg -f x11grab -follow_mouse 100 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

For example:
@example
ffmpeg -f x11grab -show_region 1 -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

With @var{follow_mouse}:
@example
ffmpeg -f x11grab -follow_mouse centered -show_region 1 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item video_size
Set the video frame size. Default value is @code{vga}.

@item use_shm
Use the MIT-SHM extension for shared memory. Default value is @code{1}.
It may be necessary to disable it for remote displays.
@end table
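
A sketch grabbing from a remote display with the shared-memory extension
disabled (the host name @code{remotehost} is a placeholder):
@example
ffmpeg -f x11grab -use_shm 0 -framerate 25 -video_size cif -i remotehost:0.0 out.mpg
@end example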

@section decklink

The decklink input device provides capture capabilities for Blackmagic
DeckLink devices.

To enable this input device, you need the Blackmagic DeckLink SDK and you
need to configure with the appropriate @code{--extra-cflags}
and @code{--extra-ldflags}.
On Windows, you need to run the IDL files through @command{widl}.

DeckLink is very picky about the formats it supports. Pixel format is always
uyvy422, framerate and video size must be determined for your device with
@option{-list_formats 1}. Audio sample rate is always 48 kHz and the number
of channels currently is limited to 2 (stereo).

@subsection Options

@table @option

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}.

@item list_formats
If set to @option{true}, print a list of supported formats and exit.
Defaults to @option{false}.

@end table

@subsection Examples

@itemize

@item
List input devices:
@example
ffmpeg -f decklink -list_devices 1 -i dummy
@end example

@item
List supported formats:
@example
ffmpeg -f decklink -list_formats 1 -i 'Intensity Pro'
@end example

@item
Capture video clip at 1080i50 (format 11):
@example
ffmpeg -f decklink -i 'Intensity Pro@@11' -acodec copy -vcodec copy output.avi
@end example

@end itemize

@c man end INPUT DEVICES