@chapter Input Devices
@c man begin INPUT DEVICES

Input devices are configured elements in FFmpeg which allow you to access
the data coming from a multimedia device attached to your system.

When you configure your FFmpeg build, all the supported input devices
are enabled by default. You can list all available ones using the
configure option "--list-indevs".

You can disable all the input devices using the configure option
"--disable-indevs", and selectively enable an input device using the
option "--enable-indev=@var{INDEV}", or you can disable a particular
input device using the option "--disable-indev=@var{INDEV}".

The option "-devices" of the ff* tools will display the list of
supported input devices.
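For example, to build with only the ALSA input device enabled and then
verify which devices the resulting @command{ffmpeg} binary supports, you
might run something like the following (a minimal sketch; combine it
with whatever other configure options your build needs):
@example
./configure --disable-indevs --enable-indev=alsa
ffmpeg -devices
@end example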
A description of the currently available input devices follows.

@section alsa

ALSA (Advanced Linux Sound Architecture) input device.

To enable this input device during configuration you need libasound
installed on your system.

This device allows capturing from an ALSA device. The name of the
device to capture has to be an ALSA card identifier.

An ALSA identifier has the syntax:
@example
hw:@var{CARD}[,@var{DEV}[,@var{SUBDEV}]]
@end example

where the @var{DEV} and @var{SUBDEV} components are optional.

The three arguments (in order: @var{CARD}, @var{DEV}, @var{SUBDEV})
specify card number or identifier, device number and subdevice number
(-1 means any).

To see the list of cards currently recognized by your system check the
files @file{/proc/asound/cards} and @file{/proc/asound/devices}.

For example to capture with @command{ffmpeg} from an ALSA device with
card id 0, you may run the command:
@example
ffmpeg -f alsa -i hw:0 alsaout.wav
@end example
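Similarly, a specific device on a card can be selected by including the
@var{DEV} component of the identifier. For instance, to capture from
device 0 of card 0 (assuming such a device exists on your system):
@example
ffmpeg -f alsa -i hw:0,0 alsaout.wav
@end example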
For more information see:
@url{http://www.alsa-project.org/alsa-doc/alsa-lib/pcm.html}

@section avfoundation

AVFoundation input device.

AVFoundation is the currently recommended framework by Apple for stream
grabbing on OSX >= 10.7 as well as on iOS.
The older QTKit framework has been marked deprecated since OSX version 10.7.

The filename passed as input is parsed to contain either a device name or index.
The device index can also be given by using -video_device_index.
A given device index will override any given device name.
If the desired device consists of numbers only, use -video_device_index to identify it.
The default device will be chosen if an empty string or the device name "default" is given.
The available devices can be enumerated by using -list_devices.

@example
ffmpeg -f avfoundation -i "0" out.mpg
@end example

@example
ffmpeg -f avfoundation -video_device_index 0 -i "" out.mpg
@end example

@example
ffmpeg -f avfoundation -i "default" out.mpg
@end example

@example
ffmpeg -f avfoundation -list_devices true -i ""
@end example
@section bktr

BSD video input device.

@section dshow

Windows DirectShow input device.

DirectShow support is enabled when FFmpeg is built with the mingw-w64 project.
Currently only audio and video devices are supported.

Multiple devices may be opened as separate inputs, but they may also be
opened on the same input, which should improve synchronization between them.

The input name should be in the format:
@example
@var{TYPE}=@var{NAME}[:@var{TYPE}=@var{NAME}]
@end example

where @var{TYPE} can be either @var{audio} or @var{video},
and @var{NAME} is the device's name.

@subsection Options

If no options are specified, the device's defaults are used.
If the device does not support the requested options, it will
fail to open.

@table @option

@item video_size
Set the video size in the captured video.

@item framerate
Set the frame rate in the captured video.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.

@item sample_size
Set the sample size (in bits) of the captured audio.

@item channels
Set the number of channels in the captured audio.

@item list_devices
If set to @option{true}, print a list of devices and exit.

@item list_options
If set to @option{true}, print a list of selected device's options
and exit.

@item video_device_number
Set video device number for devices with the same name (starts at 0,
defaults to 0).

@item audio_device_number
Set audio device number for devices with the same name (starts at 0,
defaults to 0).

@item pixel_format
Select pixel format to be used by DirectShow. This may only be set when
the video codec is not set or set to rawvideo.

@item audio_buffer_size
Set audio device buffer size in milliseconds (which can directly
impact latency, depending on the device).
Defaults to using the audio device's
default buffer size (typically some multiple of 500ms).
Setting this value too low can degrade performance.
See also
@url{http://msdn.microsoft.com/en-us/library/windows/desktop/dd377582(v=vs.85).aspx}

@end table
@subsection Examples

@itemize

@item
Print the list of DirectShow supported devices and exit:
@example
$ ffmpeg -list_devices true -f dshow -i dummy
@end example

@item
Open video device @var{Camera}:
@example
$ ffmpeg -f dshow -i video="Camera"
@end example

@item
Open second video device with name @var{Camera}:
@example
$ ffmpeg -f dshow -video_device_number 1 -i video="Camera"
@end example

@item
Open video device @var{Camera} and audio device @var{Microphone}:
@example
$ ffmpeg -f dshow -i video="Camera":audio="Microphone"
@end example

@item
Print the list of supported options in selected device and exit:
@example
$ ffmpeg -list_options true -f dshow -i video="Camera"
@end example

@end itemize
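The options described in the table above can be combined with these
examples. As a sketch only (the device names "Camera" and "Microphone"
and the chosen values are placeholders, and the device must actually
support the requested modes or it will fail to open), a capture with an
explicit video size, frame rate, audio sample rate and audio buffer size
could look like:
@example
$ ffmpeg -f dshow -video_size 640x480 -framerate 30 -sample_rate 44100 -audio_buffer_size 80 -i video="Camera":audio="Microphone" out.mkv
@end example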
@section dv1394

Linux DV 1394 input device.

@section fbdev

Linux framebuffer input device.

The Linux framebuffer is a graphic hardware-independent abstraction
layer to show graphics on a computer monitor, typically on the
console. It is accessed through a file device node, usually
@file{/dev/fb0}.

For more detailed information read the file
Documentation/fb/framebuffer.txt included in the Linux source tree.

To record from the framebuffer device @file{/dev/fb0} with
@command{ffmpeg}:
@example
ffmpeg -f fbdev -r 10 -i /dev/fb0 out.avi
@end example

You can take a single screenshot image with the command:
@example
ffmpeg -f fbdev -frames:v 1 -r 1 -i /dev/fb0 screenshot.jpeg
@end example

See also @url{http://linux-fbdev.sourceforge.net/}, and fbset(1).

@section gdigrab

Win32 GDI-based screen capture device.

This device allows you to capture a region of the display on Windows.

There are two options for the input filename:
@example
desktop
@end example
or
@example
title=@var{window_title}
@end example

The first option will capture the entire desktop, or a fixed region of the
desktop. The second option will instead capture the contents of a single
window, regardless of its position on the screen.

For example, to grab the entire desktop using @command{ffmpeg}:
@example
ffmpeg -f gdigrab -framerate 6 -i desktop out.mpg
@end example

Grab a 640x480 region at position @code{10,20}:
@example
ffmpeg -f gdigrab -framerate 6 -offset_x 10 -offset_y 20 -video_size vga -i desktop out.mpg
@end example

Grab the contents of the window named "Calculator":
@example
ffmpeg -f gdigrab -framerate 6 -i title=Calculator out.mpg
@end example

@subsection Options

@table @option
@item draw_mouse
Specify whether to draw the mouse pointer. Use the value @code{0} to
not draw the pointer. Default value is @code{1}.

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

Note that @var{show_region} is incompatible with grabbing the contents
of a single window.

For example:
@example
ffmpeg -f gdigrab -show_region 1 -framerate 6 -video_size cif -offset_x 10 -offset_y 20 -i desktop out.mpg
@end example

@item video_size
Set the video frame size. The default is to capture the full screen if
@file{desktop} is selected, or the full window size if
@file{title=@var{window_title}} is selected.

@item offset_x
When capturing a region with @var{video_size}, set the distance from
the left edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the
primary monitor on Windows. If you have a monitor positioned to the
left of your primary monitor, you will need to use a negative
@var{offset_x} value to move the region to that monitor.

@item offset_y
When capturing a region with @var{video_size}, set the distance from
the top edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the
primary monitor on Windows. If you have a monitor positioned above your
primary monitor, you will need to use a negative @var{offset_y} value
to move the region to that monitor.

@end table
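For instance, to grab a 640x480 region from a secondary monitor placed
to the left of the primary one, a negative horizontal offset can be
used (a sketch only; the actual offset depends on your monitor layout
and resolutions):
@example
ffmpeg -f gdigrab -framerate 6 -offset_x -1920 -offset_y 0 -video_size 640x480 -i desktop out.mpg
@end example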
@section iec61883

FireWire DV/HDV input device using libiec61883.

To enable this input device, you need libiec61883, libraw1394 and
libavc1394 installed on your system. Use the configure option
@code{--enable-libiec61883} to compile with the device enabled.

The iec61883 capture device supports capturing from a video device
connected via IEEE1394 (FireWire), using libiec61883 and the new Linux
FireWire stack (juju). This is the default DV/HDV input method in Linux
Kernel 2.6.37 and later, since the old FireWire stack was removed.

Specify the FireWire port to be used as input file, or "auto"
to choose the first port connected.

@subsection Options

@table @option

@item dvtype
Override autodetection of DV/HDV. This should only be used if auto
detection does not work, or if usage of a different device type
should be prohibited. Treating a DV device as HDV (or vice versa) will
not work and result in undefined behavior.
The values @option{auto}, @option{dv} and @option{hdv} are supported.

@item dvbuffer
Set maximum size of buffer for incoming data, in frames. For DV, this
is an exact value. For HDV, it is not frame exact, since HDV does
not have a fixed frame size.

@item dvguid
Select the capture device by specifying its GUID. Capturing will only
be performed from the specified device and fails if no device with the
given GUID is found. This is useful to select the input if multiple
devices are connected at the same time.
Look at /sys/bus/firewire/devices to find out the GUIDs.

@end table

@subsection Examples

@itemize

@item
Grab and show the input of a FireWire DV/HDV device.
@example
ffplay -f iec61883 -i auto
@end example

@item
Grab and record the input of a FireWire DV/HDV device,
using a packet buffer of 100000 packets if the source is HDV.
@example
ffmpeg -f iec61883 -i auto -dvbuffer 100000 out.mpg
@end example

@end itemize
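To restrict capture to one specific device when several are connected,
the @option{dvguid} option described above can be used. The GUID below
is only a placeholder; take the real value from
/sys/bus/firewire/devices on your system:
@example
ffmpeg -f iec61883 -dvguid 0010dff0004a3b2c -i auto out.mpg
@end example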
@section jack

JACK input device.

To enable this input device during configuration you need libjack
installed on your system.

A JACK input device creates one or more JACK writable clients, one for
each audio channel, with name @var{client_name}:input_@var{N}, where
@var{client_name} is the name provided by the application, and @var{N}
is a number which identifies the channel.
Each writable client will send the acquired data to the FFmpeg input
device.

Once you have created one or more JACK readable clients, you need to
connect them to one or more JACK writable clients.

To connect or disconnect JACK clients you can use the @command{jack_connect}
and @command{jack_disconnect} programs, or do it through a graphical interface,
for example with @command{qjackctl}.

To list the JACK clients and their properties you can invoke the command
@command{jack_lsp}.

The following example shows how to capture a JACK readable client
with @command{ffmpeg}.
@example
# Create a JACK writable client with name "ffmpeg".
$ ffmpeg -f jack -i ffmpeg -y out.wav

# Start the sample jack_metro readable client.
$ jack_metro -b 120 -d 0.2 -f 4000

# List the current JACK clients.
$ jack_lsp -c
system:capture_1
system:capture_2
system:playback_1
system:playback_2
ffmpeg:input_1
metro:120_bpm

# Connect metro to the ffmpeg writable client.
$ jack_connect metro:120_bpm ffmpeg:input_1
@end example

For more information read:
@url{http://jackaudio.org/}
@section lavfi

Libavfilter input virtual device.

This input device reads data from the open output pads of a libavfilter
filtergraph.

For each filtergraph open output, the input device will create a
corresponding stream which is mapped to the generated output. Currently
only video data is supported. The filtergraph is specified through the
option @option{graph}.

@subsection Options

@table @option

@item graph
Specify the filtergraph to use as input. Each video open output must be
labelled by a unique string of the form "out@var{N}", where @var{N} is a
number starting from 0 corresponding to the mapped input stream
generated by the device.
The first unlabelled output is automatically assigned to the "out0"
label, but all the others need to be specified explicitly.

If not specified, it defaults to the filename specified for the input
device.

@item graph_file
Set the filename of the filtergraph to be read and sent to the other
filters. Syntax of the filtergraph is the same as the one specified by
the option @var{graph}.

@end table

@subsection Examples

@itemize
@item
Create a color video stream and play it back with @command{ffplay}:
@example
ffplay -f lavfi -graph "color=c=pink [out0]" dummy
@end example

@item
As in the previous example, but use a filename for specifying the graph
description, and omit the "out0" label:
@example
ffplay -f lavfi color=c=pink
@end example

@item
Create three different video test filtered sources and play them:
@example
ffplay -f lavfi -graph "testsrc [out0]; testsrc,hflip [out1]; testsrc,negate [out2]" test3
@end example

@item
Read an audio stream from a file using the amovie source and play it
back with @command{ffplay}:
@example
ffplay -f lavfi "amovie=test.wav"
@end example

@item
Read an audio stream and a video stream and play it back with
@command{ffplay}:
@example
ffplay -f lavfi "movie=test.avi[out0];amovie=test.wav[out1]"
@end example

@end itemize
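As a further sketch of the @option{graph_file} option described above
(the filename @file{graph.txt} is only a placeholder), the graph
description can also be read from a file instead of being passed on the
command line:
@example
# graph.txt contains, for example: color=c=pink [out0]
ffplay -f lavfi -graph_file graph.txt dummy
@end example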
@section libdc1394

IIDC1394 input device, based on libdc1394 and libraw1394.

@section openal

The OpenAL input device provides audio capture on all systems with a
working OpenAL 1.1 implementation.

To enable this input device during configuration, you need OpenAL
headers and libraries installed on your system, and need to configure
FFmpeg with @code{--enable-openal}.

OpenAL headers and libraries should be provided as part of your OpenAL
implementation, or as an additional download (an SDK). Depending on your
installation you may need to specify additional flags via
@code{--extra-cflags} and @code{--extra-ldflags} to allow the build
system to locate the OpenAL headers and libraries.

An incomplete list of OpenAL implementations follows:

@table @strong
@item Creative
The official Windows implementation, providing hardware acceleration
with supported devices and software fallback.
See @url{http://openal.org/}.
@item OpenAL Soft
Portable, open source (LGPL) software implementation. Includes
backends for the most common sound APIs on the Windows, Linux,
Solaris, and BSD operating systems.
See @url{http://kcat.strangesoft.net/openal.html}.
@item Apple
OpenAL is part of Core Audio, the official Mac OS X Audio interface.
See @url{http://developer.apple.com/technologies/mac/audio-and-video.html}
@end table

This device allows one to capture from an audio input device handled
through OpenAL.

You need to specify the name of the device to capture in the provided
filename. If the empty string is provided, the device will
automatically select the default device. You can get the list of the
supported devices by using the option @var{list_devices}.

@subsection Options

@table @option

@item channels
Set the number of channels in the captured audio. Only the values
@option{1} (monaural) and @option{2} (stereo) are currently supported.
Defaults to @option{2}.

@item sample_size
Set the sample size (in bits) of the captured audio. Only the values
@option{8} and @option{16} are currently supported. Defaults to
@option{16}.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.
Defaults to @option{44.1k}.

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}.

@end table

@subsection Examples

Print the list of OpenAL supported devices and exit:
@example
$ ffmpeg -list_devices true -f openal -i dummy out.ogg
@end example

Capture from the OpenAL device @file{DR-BT101 via PulseAudio}:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out.ogg
@end example

Capture from the default device (note the empty string '' as filename):
@example
$ ffmpeg -f openal -i '' out.ogg
@end example

Capture from two devices simultaneously, writing to two different files,
within the same @command{ffmpeg} command:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out1.ogg -f openal -i 'ALSA Default' out2.ogg
@end example

Note: not all OpenAL implementations support multiple simultaneous capture -
try the latest OpenAL Soft if the above does not work.
@section oss

Open Sound System input device.

The filename to provide to the input device is the device node
representing the OSS input device, and is usually set to
@file{/dev/dsp}.

For example to grab from @file{/dev/dsp} using @command{ffmpeg} use the
command:
@example
ffmpeg -f oss -i /dev/dsp /tmp/oss.wav
@end example

For more information about OSS see:
@url{http://manuals.opensound.com/usersguide/dsp.html}

@section pulse

PulseAudio input device.

To enable this input device you need to configure FFmpeg with @code{--enable-libpulse}.

The filename to provide to the input device is a source device or the
string "default".

To list the PulseAudio source devices and their properties you can invoke
the command @command{pactl list sources}.

More information about PulseAudio can be found on @url{http://www.pulseaudio.org}.

@subsection Options

@table @option

@item server
Connect to a specific PulseAudio server, specified by an IP address.
Default server is used when not provided.

@item name
Specify the application name PulseAudio will use when showing active clients,
by default it is the @code{LIBAVFORMAT_IDENT} string.

@item stream_name
Specify the stream name PulseAudio will use when showing active streams,
by default it is "record".

@item sample_rate
Specify the sample rate in Hz, by default 48kHz is used.

@item channels
Specify the channels in use, by default 2 (stereo) is set.

@item frame_size
Specify the number of bytes per frame, by default it is set to 1024.

@item fragment_size
Specify the minimal buffering fragment in PulseAudio, it will affect the
audio latency. By default it is unset.

@end table

@subsection Examples

Record a stream from the default device:
@example
ffmpeg -f pulse -i default /tmp/pulse.wav
@end example
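The options listed above can be combined with a source name. As a
sketch, assuming a mono recording at 44.1 kHz from the default source
is wanted (pick a real source name from @command{pactl list sources}
if you do not want the default one):
@example
ffmpeg -f pulse -sample_rate 44100 -channels 1 -i default /tmp/pulse_mono.wav
@end example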
@section qtkit

QTKit input device.

The filename passed as input is parsed to contain either a device name or index.
The device index can also be given by using -video_device_index.
A given device index will override any given device name.
If the desired device consists of numbers only, use -video_device_index to identify it.
The default device will be chosen if an empty string or the device name "default" is given.
The available devices can be enumerated by using -list_devices.

@example
ffmpeg -f qtkit -i "0" out.mpg
@end example

@example
ffmpeg -f qtkit -video_device_index 0 -i "" out.mpg
@end example

@example
ffmpeg -f qtkit -i "default" out.mpg
@end example

@example
ffmpeg -f qtkit -list_devices true -i ""
@end example

@section sndio

sndio input device.

To enable this input device during configuration you need libsndio
installed on your system.

The filename to provide to the input device is the device node
representing the sndio input device, and is usually set to
@file{/dev/audio0}.

For example to grab from @file{/dev/audio0} using @command{ffmpeg} use the
command:
@example
ffmpeg -f sndio -i /dev/audio0 /tmp/sndio.wav
@end example
@section video4linux2, v4l2

Video4Linux2 input video device.

"v4l2" can be used as an alias for "video4linux2".

If FFmpeg is built with v4l-utils support (by using the
@code{--enable-libv4l2} configure option), it is possible to use it with the
@code{-use_libv4l2} input device option.

The name of the device to grab is a file device node. Usually Linux
systems tend to automatically create such nodes when the device
(e.g. a USB webcam) is plugged into the system, and the node has a name
of the kind @file{/dev/video@var{N}}, where @var{N} is a number associated
to the device.

Video4Linux2 devices usually support a limited set of
@var{width}x@var{height} sizes and frame rates. You can check which are
supported using @command{-list_formats all} for Video4Linux2 devices.
Some devices, like TV cards, support one or more standards. It is possible
to list all the supported standards using @command{-list_standards all}.

The time base for the timestamps is 1 microsecond. Depending on the kernel
version and configuration, the timestamps may be derived from the real time
clock (origin at the Unix Epoch) or the monotonic clock (origin usually at
boot time, unaffected by NTP or manual changes to the clock). The
@option{-timestamps abs} or @option{-ts abs} option can be used to force
conversion into the real time clock.

Some usage examples of the video4linux2 device with @command{ffmpeg}
and @command{ffplay}:

@itemize
@item
Grab and show the input of a video4linux2 device:
@example
ffplay -f video4linux2 -framerate 30 -video_size hd720 /dev/video0
@end example

@item
Grab and record the input of a video4linux2 device, leave the
frame rate and size as previously set:
@example
ffmpeg -f video4linux2 -input_format mjpeg -i /dev/video0 out.mpeg
@end example

@end itemize

For more information about Video4Linux, check @url{http://linuxtv.org/}.
@subsection Options

@table @option
@item standard
Set the standard. Must be the name of a supported standard. To get a
list of the supported standards, use the @option{list_standards}
option.

@item channel
Set the input channel number. Defaults to -1, which means using the
previously selected channel.

@item video_size
Set the video frame size. The argument must be a string in the form
@var{WIDTH}x@var{HEIGHT} or a valid size abbreviation.

@item pixel_format
Select the pixel format (only valid for raw video input).

@item input_format
Set the preferred pixel format (for raw video) or a codec name.
This option allows one to select the input format, when several are
available.

@item framerate
Set the preferred video frame rate.

@item list_formats
List available formats (supported pixel formats, codecs, and frame
sizes) and exit.

Available values are:
@table @samp
@item all
Show all available (compressed and non-compressed) formats.

@item raw
Show only raw video (non-compressed) formats.

@item compressed
Show only compressed formats.
@end table

@item list_standards
List supported standards and exit.

Available values are:
@table @samp
@item all
Show all supported standards.
@end table

@item timestamps, ts
Set type of timestamps for grabbed frames.

Available values are:
@table @samp
@item default
Use timestamps from the kernel.

@item abs
Use absolute timestamps (wall clock).

@item mono2abs
Force conversion from monotonic to absolute timestamps.
@end table

Default value is @code{default}.
@end table
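For instance, to see which pixel formats, codecs and frame sizes a
webcam exposes before choosing capture settings (the device node
@file{/dev/video0} is only an example):
@example
ffmpeg -f video4linux2 -list_formats all -i /dev/video0
@end example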
@section vfwcap

VfW (Video for Windows) capture input device.

The filename passed as input is the capture driver number, ranging from
0 to 9. You may use "list" as filename to print a list of drivers. Any
other filename will be interpreted as device number 0.
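For example (a sketch based on the description above; the driver number
and output name are placeholders), print the list of installed capture
drivers and then record from driver 0:
@example
ffmpeg -f vfwcap -i list
ffmpeg -f vfwcap -i 0 out.avi
@end example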
@section x11grab

X11 video input device.

This device allows one to capture a region of an X11 display.

The filename passed as input has the syntax:
@example
[@var{hostname}]:@var{display_number}.@var{screen_number}[+@var{x_offset},@var{y_offset}]
@end example

@var{hostname}:@var{display_number}.@var{screen_number} specifies the
X11 display name of the screen to grab from. @var{hostname} can be
omitted, and defaults to "localhost". The environment variable
@env{DISPLAY} contains the default display name.

@var{x_offset} and @var{y_offset} specify the offsets of the grabbed
area with respect to the top-left border of the X11 screen. They
default to 0.

Check the X11 documentation (e.g. man X) for more detailed information.

Use the @command{xdpyinfo} program for getting basic information about the
properties of your X11 display (e.g. grep for "name" or "dimensions").

For example to grab from @file{:0.0} using @command{ffmpeg}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

Grab at position @code{10,20}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

@subsection Options

@table @option
@item draw_mouse
Specify whether to draw the mouse pointer. A value of @code{0} specifies
not to draw the pointer. Default value is @code{1}.

@item follow_mouse
Make the grabbed area follow the mouse. The argument can be
@code{centered} or a number of pixels @var{PIXELS}.

When it is specified with "centered", the grabbing region follows the mouse
pointer and keeps the pointer at the center of the region; otherwise, the
region follows only when the mouse pointer reaches within @var{PIXELS}
(greater than zero) of the edge of the region.

For example:
@example
ffmpeg -f x11grab -follow_mouse centered -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

To follow only when the mouse pointer reaches within 100 pixels of the edge:
@example
ffmpeg -f x11grab -follow_mouse 100 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

For example:
@example
ffmpeg -f x11grab -show_region 1 -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

With @var{follow_mouse}:
@example
ffmpeg -f x11grab -follow_mouse centered -show_region 1 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item video_size
Set the video frame size. Default value is @code{vga}.
@end table

@c man end INPUT DEVICES