@chapter Input Devices
@c man begin INPUT DEVICES

Input devices are configured elements in FFmpeg which enable accessing
the data coming from a multimedia device attached to your system.

When you configure your FFmpeg build, all the supported input devices
are enabled by default. You can list all available ones using the
configure option "--list-indevs".

You can disable all the input devices using the configure option
"--disable-indevs", and selectively enable an input device using the
option "--enable-indev=@var{INDEV}", or you can disable a particular
input device using the option "--disable-indev=@var{INDEV}".

The option "-devices" of the ff* tools will display the list of
supported input devices.

A description of the currently available input devices follows.
@section alsa

ALSA (Advanced Linux Sound Architecture) input device.

To enable this input device during configuration you need libasound
installed on your system.

This device allows capturing from an ALSA device. The name of the
device to capture has to be an ALSA card identifier.

An ALSA identifier has the syntax:
@example
hw:@var{CARD}[,@var{DEV}[,@var{SUBDEV}]]
@end example

where the @var{DEV} and @var{SUBDEV} components are optional.

The three arguments (in order: @var{CARD}, @var{DEV}, @var{SUBDEV})
specify card number or identifier, device number and subdevice number
(-1 means any).

To see the list of cards currently recognized by your system check the
files @file{/proc/asound/cards} and @file{/proc/asound/devices}.

For example to capture with @command{ffmpeg} from an ALSA device with
card id 0, you may run the command:
@example
ffmpeg -f alsa -i hw:0 alsaout.wav
@end example

For more information see:
@url{http://www.alsa-project.org/alsa-doc/alsa-lib/pcm.html}

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table
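The options above can be combined on the command line; for example, a mono
capture at 44100 Hz from card 0 might look like this (a sketch, assuming card
0 exists on your system):
@example
ffmpeg -f alsa -sample_rate 44100 -channels 1 -i hw:0 out.wav
@end example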
@section android_camera

Android camera input device.

This input device uses the Android Camera2 NDK API which is
available on devices with API level 24+. The availability of
android_camera is autodetected during configuration.

This device allows capturing from all cameras on an Android device
which are integrated into the Camera2 NDK API.

The available cameras are enumerated internally and can be selected
with the @var{camera_index} parameter. The input file string is
discarded.

Generally the back facing camera has index 0 while the front facing
camera has index 1.

@subsection Options

@table @option

@item video_size
Set the video size given as a string such as 640x480 or hd720.
Falls back to the first available configuration reported by
Android if the requested video size is not available, or by default.

@item framerate
Set the video framerate.
Falls back to the first available configuration reported by
Android if the requested framerate is not available, or by default (-1).

@item camera_index
Set the index of the camera to use. Default is 0.

@item input_queue_size
Set the maximum number of frames to buffer. Default is 5.

@end table
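Since the input file string is discarded, any placeholder can be given as the
input name. For example, capturing from the front facing camera at hd720 might
look like this (a sketch combining the documented options; the input name
"dummy" is an arbitrary placeholder):
@example
ffmpeg -f android_camera -camera_index 1 -video_size hd720 -framerate 30 -i dummy out.mp4
@end example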
@section avfoundation

AVFoundation input device.

AVFoundation is the currently recommended framework by Apple for stream grabbing on OSX >= 10.7 as well as on iOS.

The input filename has to be given in the following syntax:
@example
-i "[[VIDEO]:[AUDIO]]"
@end example
The first entry selects the video input while the latter selects the audio input.
The stream has to be specified by the device name or the device index as shown by the device list.
Alternatively, the video and/or audio input device can be chosen by index using the
@option{-video_device_index <INDEX>} and/or @option{-audio_device_index <INDEX>}
options, overriding any device name or index given in the input filename.

All available devices can be enumerated by using @option{-list_devices true}, listing
all device names and corresponding indices.

There are two device name aliases:
@table @code

@item default
Select the AVFoundation default device of the corresponding type.

@item none
Do not record the corresponding media type.
This is equivalent to specifying an empty device name or index.

@end table

@subsection Options

AVFoundation supports the following options:

@table @option

@item -list_devices <TRUE|FALSE>
If set to true, a list of all available input devices is given showing all
device names and indices.

@item -video_device_index <INDEX>
Specify the video device by its index. Overrides anything given in the input filename.

@item -audio_device_index <INDEX>
Specify the audio device by its index. Overrides anything given in the input filename.

@item -pixel_format <FORMAT>
Request the video device to use a specific pixel format.
If the specified format is not supported, a list of available formats is given
and the first one in this list is used instead. Available pixel formats are:
@code{monob, rgb555be, rgb555le, rgb565be, rgb565le, rgb24, bgr24, 0rgb, bgr0, 0bgr, rgb0,
bgr48be, uyvy422, yuva444p, yuva444p16le, yuv444p, yuv422p16, yuv422p10, yuv444p10,
yuv420p, nv12, yuyv422, gray}

@item -framerate
Set the grabbing frame rate. Default is @code{ntsc}, corresponding to a
frame rate of @code{30000/1001}.

@item -video_size
Set the video frame size.

@item -capture_cursor
Capture the mouse pointer. Default is 0.

@item -capture_mouse_clicks
Capture the screen mouse clicks. Default is 0.

@item -capture_raw_data
Capture the raw device data. Default is 0.
Using this option may result in receiving the underlying data delivered to the AVFoundation framework. E.g. for muxed devices that send raw DV data to the framework (like tape-based camcorders), setting this option to false results in extracted video frames captured in the designated pixel format only. Setting this option to true results in receiving the raw DV stream untouched.

@end table

@subsection Examples

@itemize

@item
Print the list of AVFoundation supported devices and exit:
@example
$ ffmpeg -f avfoundation -list_devices true -i ""
@end example

@item
Record video from video device 0 and audio from audio device 0 into out.avi:
@example
$ ffmpeg -f avfoundation -i "0:0" out.avi
@end example

@item
Record video from video device 2 and audio from audio device 1 into out.avi:
@example
$ ffmpeg -f avfoundation -video_device_index 2 -i ":1" out.avi
@end example

@item
Record video from the system default video device using the pixel format bgr0 and do not record any audio into out.avi:
@example
$ ffmpeg -f avfoundation -pixel_format bgr0 -i "default:none" out.avi
@end example

@item
Record raw DV data from a suitable input device and write the output into out.dv:
@example
$ ffmpeg -f avfoundation -capture_raw_data true -i "zr100:none" out.dv
@end example

@end itemize
@section bktr

BSD video input device.

@subsection Options

@table @option

@item framerate
Set the frame rate.

@item video_size
Set the video frame size. Default is @code{vga}.

@item standard
Available values are:
@table @samp
@item pal
@item ntsc
@item secam
@item paln
@item palm
@item ntscj
@end table

@end table
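As a sketch of how the options fit together (the device node path is an
assumption; on BSD systems the bktr driver typically exposes a node such as
@file{/dev/bktr0}):
@example
ffmpeg -f bktr -standard ntsc -video_size vga -framerate 29.97 -i /dev/bktr0 out.avi
@end example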
  178. @section decklink
  179. The decklink input device provides capture capabilities for Blackmagic
  180. DeckLink devices.
  181. To enable this input device, you need the Blackmagic DeckLink SDK and you
  182. need to configure with the appropriate @code{--extra-cflags}
  183. and @code{--extra-ldflags}.
  184. On Windows, you need to run the IDL files through @command{widl}.
  185. DeckLink is very picky about the formats it supports. Pixel format of the
  186. input can be set with @option{raw_format}.
  187. Framerate and video size must be determined for your device with
  188. @command{-list_formats 1}. Audio sample rate is always 48 kHz and the number
  189. of channels can be 2, 8 or 16. Note that all audio channels are bundled in one single
  190. audio track.
  191. @subsection Options
  192. @table @option
  193. @item list_devices
  194. If set to @option{true}, print a list of devices and exit.
  195. Defaults to @option{false}. Alternatively you can use the @code{-sources}
  196. option of ffmpeg to list the available input devices.
  197. @item list_formats
  198. If set to @option{true}, print a list of supported formats and exit.
  199. Defaults to @option{false}.
  200. @item format_code <FourCC>
  201. This sets the input video format to the format given by the FourCC. To see
  202. the supported values of your device(s) use @option{list_formats}.
  203. Note that there is a FourCC @option{'pal '} that can also be used
  204. as @option{pal} (3 letters).
  205. Default behavior is autodetection of the input video format, if the hardware
  206. supports it.
  207. @item bm_v210
  208. This is a deprecated option, you can use @option{raw_format} instead.
  209. If set to @samp{1}, video is captured in 10 bit v210 instead
  210. of uyvy422. Not all Blackmagic devices support this option.
  211. @item raw_format
  212. Set the pixel format of the captured video.
  213. Available values are:
  214. @table @samp
  215. @item uyvy422
  216. @item yuv422p10
  217. @item argb
  218. @item bgra
  219. @item rgb10
  220. @end table
  221. @item teletext_lines
  222. If set to nonzero, an additional teletext stream will be captured from the
  223. vertical ancillary data. Both SD PAL (576i) and HD (1080i or 1080p)
  224. sources are supported. In case of HD sources, OP47 packets are decoded.
  225. This option is a bitmask of the SD PAL VBI lines captured, specifically lines 6
  226. to 22, and lines 318 to 335. Line 6 is the LSB in the mask. Selected lines
  227. which do not contain teletext information will be ignored. You can use the
  228. special @option{all} constant to select all possible lines, or
  229. @option{standard} to skip lines 6, 318 and 319, which are not compatible with
  230. all receivers.
  231. For SD sources, ffmpeg needs to be compiled with @code{--enable-libzvbi}. For
  232. HD sources, on older (pre-4K) DeckLink card models you have to capture in 10
  233. bit mode.
  234. @item channels
  235. Defines number of audio channels to capture. Must be @samp{2}, @samp{8} or @samp{16}.
  236. Defaults to @samp{2}.
  237. @item duplex_mode
  238. Sets the decklink device duplex mode. Must be @samp{unset}, @samp{half} or @samp{full}.
  239. Defaults to @samp{unset}.
  240. @item timecode_format
  241. Timecode type to include in the frame and video stream metadata. Must be
  242. @samp{none}, @samp{rp188vitc}, @samp{rp188vitc2}, @samp{rp188ltc},
  243. @samp{rp188any}, @samp{vitc}, @samp{vitc2}, or @samp{serial}. Defaults to
  244. @samp{none} (not included).
  245. @item video_input
  246. Sets the video input source. Must be @samp{unset}, @samp{sdi}, @samp{hdmi},
  247. @samp{optical_sdi}, @samp{component}, @samp{composite} or @samp{s_video}.
  248. Defaults to @samp{unset}.
  249. @item audio_input
  250. Sets the audio input source. Must be @samp{unset}, @samp{embedded},
  251. @samp{aes_ebu}, @samp{analog}, @samp{analog_xlr}, @samp{analog_rca} or
  252. @samp{microphone}. Defaults to @samp{unset}.
  253. @item video_pts
  254. Sets the video packet timestamp source. Must be @samp{video}, @samp{audio},
  255. @samp{reference}, @samp{wallclock} or @samp{abs_wallclock}.
  256. Defaults to @samp{video}.
  257. @item audio_pts
  258. Sets the audio packet timestamp source. Must be @samp{video}, @samp{audio},
  259. @samp{reference}, @samp{wallclock} or @samp{abs_wallclock}.
  260. Defaults to @samp{audio}.
  261. @item draw_bars
  262. If set to @samp{true}, color bars are drawn in the event of a signal loss.
  263. Defaults to @samp{true}.
  264. @item queue_size
  265. Sets maximum input buffer size in bytes. If the buffering reaches this value,
  266. incoming frames will be dropped.
  267. Defaults to @samp{1073741824}.
  268. @item audio_depth
  269. Sets the audio sample bit depth. Must be @samp{16} or @samp{32}.
  270. Defaults to @samp{16}.
  271. @item decklink_copyts
  272. If set to @option{true}, timestamps are forwarded as they are without removing
  273. the initial offset.
  274. Defaults to @option{false}.
  275. @item timestamp_align
  276. Capture start time alignment in seconds. If set to nonzero, input frames are
  277. dropped till the system timestamp aligns with configured value.
  278. Alignment difference of up to one frame duration is tolerated.
  279. This is useful for maintaining input synchronization across N different
  280. hardware devices deployed for 'N-way' redundancy. The system time of different
  281. hardware devices should be synchronized with protocols such as NTP or PTP,
  282. before using this option.
  283. Note that this method is not foolproof. In some border cases input
  284. synchronization may not happen due to thread scheduling jitters in the OS.
  285. Either sync could go wrong by 1 frame or in a rarer case
  286. @option{timestamp_align} seconds.
  287. Defaults to @samp{0}.
  288. @item wait_for_tc (@emph{bool})
  289. Drop frames till a frame with timecode is received. Sometimes serial timecode
  290. isn't received with the first input frame. If that happens, the stored stream
  291. timecode will be inaccurate. If this option is set to @option{true}, input frames
  292. are dropped till a frame with timecode is received.
  293. Option @var{timecode_format} must be specified.
  294. Defaults to @option{false}.
  295. @end table
@subsection Examples

@itemize

@item
List input devices:
@example
ffmpeg -f decklink -list_devices 1 -i dummy
@end example

@item
List supported formats:
@example
ffmpeg -f decklink -list_formats 1 -i 'Intensity Pro'
@end example

@item
Capture video clip at 1080i50:
@example
ffmpeg -format_code Hi50 -f decklink -i 'Intensity Pro' -c:a copy -c:v copy output.avi
@end example

@item
Capture video clip at 1080i50 10 bit:
@example
ffmpeg -bm_v210 1 -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example

@item
Capture video clip at 1080i50 with 16 audio channels:
@example
ffmpeg -channels 16 -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example

@end itemize
@section dshow

Windows DirectShow input device.

DirectShow support is enabled when FFmpeg is built with the mingw-w64 project.
Currently only audio and video devices are supported.

Multiple devices may be opened as separate inputs, but they may also be
opened on the same input, which should improve synchronization between them.

The input name should be in the format:
@example
@var{TYPE}=@var{NAME}[:@var{TYPE}=@var{NAME}]
@end example
where @var{TYPE} can be either @var{audio} or @var{video},
and @var{NAME} is the device's name or alternative name.

@subsection Options

If no options are specified, the device's defaults are used.
If the device does not support the requested options, it will
fail to open.

@table @option

@item video_size
Set the video size in the captured video.

@item framerate
Set the frame rate in the captured video.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.

@item sample_size
Set the sample size (in bits) of the captured audio.

@item channels
Set the number of channels in the captured audio.

@item list_devices
If set to @option{true}, print a list of devices and exit.

@item list_options
If set to @option{true}, print a list of the selected device's options
and exit.

@item video_device_number
Set video device number for devices with the same name (starts at 0,
defaults to 0).

@item audio_device_number
Set audio device number for devices with the same name (starts at 0,
defaults to 0).

@item pixel_format
Select pixel format to be used by DirectShow. This may only be set when
the video codec is not set or set to rawvideo.

@item audio_buffer_size
Set audio device buffer size in milliseconds (which can directly
impact latency, depending on the device).
Defaults to using the audio device's
default buffer size (typically some multiple of 500ms).
Setting this value too low can degrade performance.
See also
@url{http://msdn.microsoft.com/en-us/library/windows/desktop/dd377582(v=vs.85).aspx}

@item video_pin_name
Select video capture pin to use by name or alternative name.

@item audio_pin_name
Select audio capture pin to use by name or alternative name.

@item crossbar_video_input_pin_number
Select video input pin number for crossbar device. This will be
routed to the crossbar device's Video Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.

@item crossbar_audio_input_pin_number
Select audio input pin number for crossbar device. This will be
routed to the crossbar device's Audio Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.

@item show_video_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change video filter properties
and configurations manually.
Note that for crossbar devices, adjusting values in this dialog
may be needed at times to toggle between PAL (25 fps) and NTSC (29.97)
input frame rates, sizes, interlacing, etc. Changing these values can
enable different scan rates/frame rates and avoid green bars at
the bottom, flickering scan lines, etc.
Note that with some devices, changing these properties can also affect future
invocations (sets new defaults) until system reboot occurs.

@item show_audio_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change audio filter properties
and configurations manually.

@item show_video_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens a video device.

@item show_audio_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens an audio device.

@item show_analog_tv_tuner_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV channels and frequencies.

@item show_analog_tv_tuner_audio_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV audio (like mono vs. stereo, Language A, B or C).

@item audio_device_load
Load an audio capture filter device from a file instead of searching
for it by name. It may load additional parameters too, if the filter
supports the serialization of its properties.
To use this an audio capture source has to be specified, but it can
be anything, even a fake one.

@item audio_device_save
Save the currently used audio capture filter device and its
parameters (if the filter supports it) to a file.
If a file with the same name exists it will be overwritten.

@item video_device_load
Load a video capture filter device from a file instead of searching
for it by name. It may load additional parameters too, if the filter
supports the serialization of its properties.
To use this a video capture source has to be specified, but it can
be anything, even a fake one.

@item video_device_save
Save the currently used video capture filter device and its
parameters (if the filter supports it) to a file.
If a file with the same name exists it will be overwritten.

@end table
@subsection Examples

@itemize

@item
Print the list of DirectShow supported devices and exit:
@example
$ ffmpeg -list_devices true -f dshow -i dummy
@end example

@item
Open video device @var{Camera}:
@example
$ ffmpeg -f dshow -i video="Camera"
@end example

@item
Open second video device with name @var{Camera}:
@example
$ ffmpeg -f dshow -video_device_number 1 -i video="Camera"
@end example

@item
Open video device @var{Camera} and audio device @var{Microphone}:
@example
$ ffmpeg -f dshow -i video="Camera":audio="Microphone"
@end example

@item
Print the list of supported options in selected device and exit:
@example
$ ffmpeg -list_options true -f dshow -i video="Camera"
@end example

@item
Specify pin names to capture by name or alternative name, specify alternative device name:
@example
$ ffmpeg -f dshow -audio_pin_name "Audio Out" -video_pin_name 2 -i video=video="@@device_pnp_\\?\pci#ven_1a0a&dev_6200&subsys_62021461&rev_01#4&e2c7dd6&0&00e1#@{65e8773d-8f56-11d0-a3b9-00a0c9223196@}\@{ca465100-deb0-4d59-818f-8c477184adf6@}":audio="Microphone"
@end example

@item
Configure a crossbar device, specifying crossbar pins, allow user to adjust video capture properties at startup:
@example
$ ffmpeg -f dshow -show_video_device_dialog true -crossbar_video_input_pin_number 0
-crossbar_audio_input_pin_number 3 -i video="AVerMedia BDA Analog Capture":audio="AVerMedia BDA Analog Capture"
@end example

@end itemize
@section fbdev

Linux framebuffer input device.

The Linux framebuffer is a graphic hardware-independent abstraction
layer to show graphics on a computer monitor, typically on the
console. It is accessed through a file device node, usually
@file{/dev/fb0}.

For more detailed information read the file
Documentation/fb/framebuffer.txt included in the Linux source tree.

See also @url{http://linux-fbdev.sourceforge.net/}, and fbset(1).

To record from the framebuffer device @file{/dev/fb0} with
@command{ffmpeg}:
@example
ffmpeg -f fbdev -framerate 10 -i /dev/fb0 out.avi
@end example

You can take a single screenshot image with the command:
@example
ffmpeg -f fbdev -framerate 1 -i /dev/fb0 -frames:v 1 screenshot.jpeg
@end example

@subsection Options

@table @option

@item framerate
Set the frame rate. Default is 25.

@end table
@section gdigrab

Win32 GDI-based screen capture device.

This device allows you to capture a region of the display on Windows.

There are two options for the input filename:
@example
desktop
@end example
or
@example
title=@var{window_title}
@end example

The first option will capture the entire desktop, or a fixed region of the
desktop. The second option will instead capture the contents of a single
window, regardless of its position on the screen.

For example, to grab the entire desktop using @command{ffmpeg}:
@example
ffmpeg -f gdigrab -framerate 6 -i desktop out.mpg
@end example

Grab a 640x480 region at position @code{10,20}:
@example
ffmpeg -f gdigrab -framerate 6 -offset_x 10 -offset_y 20 -video_size vga -i desktop out.mpg
@end example

Grab the contents of the window named "Calculator":
@example
ffmpeg -f gdigrab -framerate 6 -i title=Calculator out.mpg
@end example

@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. Use the value @code{0} to
not draw the pointer. Default value is @code{1}.

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

Note that @var{show_region} is incompatible with grabbing the contents
of a single window.

For example:
@example
ffmpeg -f gdigrab -show_region 1 -framerate 6 -video_size cif -offset_x 10 -offset_y 20 -i desktop out.mpg
@end example

@item video_size
Set the video frame size. The default is to capture the full screen if @file{desktop} is selected, or the full window size if @file{title=@var{window_title}} is selected.

@item offset_x
When capturing a region with @var{video_size}, set the distance from the left edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned to the left of your primary monitor, you will need to use a negative @var{offset_x} value to move the region to that monitor.

@item offset_y
When capturing a region with @var{video_size}, set the distance from the top edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned above your primary monitor, you will need to use a negative @var{offset_y} value to move the region to that monitor.

@end table
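As a sketch of the negative-offset case described above (the offsets assume a
1920x1080 secondary monitor positioned directly to the left of the primary
one):
@example
ffmpeg -f gdigrab -framerate 6 -offset_x -1920 -offset_y 0 -video_size 1920x1080 -i desktop out.mpg
@end example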
@section iec61883

FireWire DV/HDV input device using libiec61883.

To enable this input device, you need libiec61883, libraw1394 and
libavc1394 installed on your system. Use the configure option
@code{--enable-libiec61883} to compile with the device enabled.

The iec61883 capture device supports capturing from a video device
connected via IEEE1394 (FireWire), using libiec61883 and the new Linux
FireWire stack (juju). This is the default DV/HDV input method in Linux
Kernel 2.6.37 and later, since the old FireWire stack was removed.

Specify the FireWire port to be used as input file, or "auto"
to choose the first port connected.

@subsection Options

@table @option

@item dvtype
Override autodetection of DV/HDV. This should only be used if auto
detection does not work, or if usage of a different device type
should be prohibited. Treating a DV device as HDV (or vice versa) will
not work and result in undefined behavior.
The values @option{auto}, @option{dv} and @option{hdv} are supported.

@item dvbuffer
Set the maximum size of the buffer for incoming data, in frames. For DV, this
is an exact value. For HDV, it is not frame exact, since HDV does
not have a fixed frame size.

@item dvguid
Select the capture device by specifying its GUID. Capturing will only
be performed from the specified device and fails if no device with the
given GUID is found. This is useful to select the input if multiple
devices are connected at the same time.
Look at @file{/sys/bus/firewire/devices} to find out the GUIDs.

@end table

@subsection Examples

@itemize

@item
Grab and show the input of a FireWire DV/HDV device.
@example
ffplay -f iec61883 -i auto
@end example

@item
Grab and record the input of a FireWire DV/HDV device,
using a packet buffer of 100000 packets if the source is HDV.
@example
ffmpeg -f iec61883 -i auto -dvbuffer 100000 out.mpg
@end example

@end itemize
@section jack

JACK input device.

To enable this input device during configuration you need libjack
installed on your system.

A JACK input device creates one or more JACK writable clients, one for
each audio channel, with name @var{client_name}:input_@var{N}, where
@var{client_name} is the name provided by the application, and @var{N}
is a number which identifies the channel.
Each writable client will send the acquired data to the FFmpeg input
device.

Once you have created one or more JACK readable clients, you need to
connect them to one or more JACK writable clients.

To connect or disconnect JACK clients you can use the @command{jack_connect}
and @command{jack_disconnect} programs, or do it through a graphical interface,
for example with @command{qjackctl}.

To list the JACK clients and their properties you can invoke the command
@command{jack_lsp}.

The following example shows how to capture a JACK readable client
with @command{ffmpeg}.
@example
# Create a JACK writable client with name "ffmpeg".
$ ffmpeg -f jack -i ffmpeg -y out.wav

# Start the sample jack_metro readable client.
$ jack_metro -b 120 -d 0.2 -f 4000

# List the current JACK clients.
$ jack_lsp -c
system:capture_1
system:capture_2
system:playback_1
system:playback_2
ffmpeg:input_1
metro:120_bpm

# Connect metro to the ffmpeg writable client.
$ jack_connect metro:120_bpm ffmpeg:input_1
@end example

For more information read:
@url{http://jackaudio.org/}

@subsection Options

@table @option

@item channels
Set the number of channels. Default is 2.

@end table
  641. @section kmsgrab
  642. KMS video input device.
  643. Captures the KMS scanout framebuffer associated with a specified CRTC or plane as a
  644. DRM object that can be passed to other hardware functions.
  645. Requires either DRM master or CAP_SYS_ADMIN to run.
  646. If you don't understand what all of that means, you probably don't want this. Look at
  647. @option{x11grab} instead.
@subsection Options

@table @option

@item device
DRM device to capture on. Defaults to @option{/dev/dri/card0}.

@item format
Pixel format of the framebuffer. Defaults to @option{bgr0}.

@item format_modifier
Format modifier to signal on output frames. This is necessary to import correctly into
some APIs, but can't be autodetected. See the libdrm documentation for possible values.

@item crtc_id
KMS CRTC ID to define the capture source. The first active plane on the given CRTC
will be used.

@item plane_id
KMS plane ID to define the capture source. Defaults to the first active plane found if
neither @option{crtc_id} nor @option{plane_id} are specified.

@item framerate
Framerate to capture at. This is not synchronised to any page flipping or framebuffer
changes - it just defines the interval at which the framebuffer is sampled. Sampling
faster than the framebuffer update rate will generate independent frames with the same
content. Defaults to @code{30}.

@end table
@subsection Examples

@itemize

@item
Capture from the first active plane, download the result to normal frames and encode.
This will only work if the framebuffer is both linear and mappable - if not, the result
may be scrambled or fail to download.
@example
ffmpeg -f kmsgrab -i - -vf 'hwdownload,format=bgr0' output.mp4
@end example

@item
Capture from CRTC ID 42 at 60fps, map the result to VAAPI, convert to NV12 and encode as H.264.
@example
ffmpeg -crtc_id 42 -framerate 60 -f kmsgrab -i - -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1920:h=1080:format=nv12' -c:v h264_vaapi output.mp4
@end example

@item
To capture only part of a plane the output can be cropped - this can be used to capture
a single window, as long as it has a known absolute position and size. For example, to
capture and encode the middle quarter of a 1920x1080 plane:
@example
ffmpeg -f kmsgrab -i - -vf 'hwmap=derive_device=vaapi,crop=960:540:480:270,scale_vaapi=960:540:nv12' -c:v h264_vaapi output.mp4
@end example

@end itemize
@section lavfi

Libavfilter input virtual device.

This input device reads data from the open output pads of a libavfilter
filtergraph.

For each filtergraph open output, the input device will create a
corresponding stream which is mapped to the generated output. Currently
only video data is supported. The filtergraph is specified through the
option @option{graph}.
@subsection Options

@table @option

@item graph
Specify the filtergraph to use as input. Each video open output must be
labelled by a unique string of the form "out@var{N}", where @var{N} is a
number starting from 0 corresponding to the mapped input stream
generated by the device.
The first unlabelled output is automatically assigned to the "out0"
label, but all the others need to be specified explicitly.

The suffix "+subcc" can be appended to the output label to create an extra
stream with the closed captions packets attached to that output
(experimental; only for EIA-608 / CEA-708 for now).
The subcc streams are created after all the normal streams, in the order of
the corresponding stream.
For example, if there is "out19+subcc", "out7+subcc" and up to "out42", the
stream #43 is subcc for stream #7 and stream #44 is subcc for stream #19.

If not specified, it defaults to the filename specified for the input
device.
@item graph_file
Set the filename of the filtergraph to be read and sent to the other
filters. Syntax of the filtergraph is the same as the one specified by
the option @var{graph}.
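For example, assuming a file @file{graph.txt} (a hypothetical name)
contains a graph description such as @code{color=c=pink [out0]}, the graph
can be read from that file instead of the command line:
@example
ffplay -f lavfi -graph_file graph.txt dummy
@end example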
@item dumpgraph
Dump graph to stderr.

@end table
@subsection Examples

@itemize
@item
Create a color video stream and play it back with @command{ffplay}:
@example
ffplay -f lavfi -graph "color=c=pink [out0]" dummy
@end example

@item
As the previous example, but use filename for specifying the graph
description, and omit the "out0" label:
@example
ffplay -f lavfi color=c=pink
@end example

@item
Create three different video test filtered sources and play them:
@example
ffplay -f lavfi -graph "testsrc [out0]; testsrc,hflip [out1]; testsrc,negate [out2]" test3
@end example

@item
Read an audio stream from a file using the amovie source and play it
back with @command{ffplay}:
@example
ffplay -f lavfi "amovie=test.wav"
@end example

@item
Read an audio stream and a video stream and play it back with
@command{ffplay}:
@example
ffplay -f lavfi "movie=test.avi[out0];amovie=test.wav[out1]"
@end example

@item
Dump decoded frames to images and closed captions to a file (experimental):
@example
ffmpeg -f lavfi -i "movie=test.ts[out0+subcc]" -map v frame%08d.png -map s -c copy -f rawvideo subcc.bin
@end example

@end itemize
@section libcdio

Audio-CD input device based on libcdio.

To enable this input device during configuration you need libcdio
installed on your system. It requires the configure option
@code{--enable-libcdio}.

This device allows playing and grabbing from an Audio-CD.

For example to copy with @command{ffmpeg} the entire Audio-CD in @file{/dev/sr0},
you may run the command:
@example
ffmpeg -f libcdio -i /dev/sr0 cd.wav
@end example
@subsection Options

@table @option

@item speed
Set drive reading speed. Default value is 0.

The speed is specified in CD-ROM speed units. The speed is set through
the libcdio @code{cdio_cddap_speed_set} function. On many CD-ROM
drives, specifying a value too large will result in using the fastest
speed.
@item paranoia_mode
Set paranoia recovery mode flags. It accepts one of the following values:

@table @samp
@item disable
@item verify
@item overlap
@item neverskip
@item full
@end table

Default value is @samp{disable}.

For more information about the available recovery modes, consult the
paranoia project documentation.

@end table
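The options above combine with the earlier grabbing example; a sketch that
limits the drive to 4x speed and enables full paranoia recovery:
@example
ffmpeg -f libcdio -speed 4 -paranoia_mode full -i /dev/sr0 cd.wav
@end example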
@section libdc1394

IIDC1394 input device, based on libdc1394 and libraw1394.

Requires the configure option @code{--enable-libdc1394}.

@subsection Options

@table @option

@item framerate
Set the frame rate. Default is @code{ntsc}, corresponding to a frame
rate of @code{30000/1001}.

@item pixel_format
Select the pixel format. Default is @code{uyvy422}.

@item video_size
Set the video size given as a string such as @code{640x480} or @code{hd720}.
Default is @code{qvga}.

@end table
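A sketch combining the options above. The input path is system-dependent
and the @file{/dev/video1394/0} node used here is an assumption, not a
fixed path; consult your libdc1394 setup for the actual device:
@example
ffmpeg -f libdc1394 -video_size qvga -framerate 30 -pixel_format uyvy422 -i /dev/video1394/0 out.mpg
@end example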
@section openal

The OpenAL input device provides audio capture on all systems with a
working OpenAL 1.1 implementation.

To enable this input device during configuration, you need OpenAL
headers and libraries installed on your system, and need to configure
FFmpeg with @code{--enable-openal}.

OpenAL headers and libraries should be provided as part of your OpenAL
implementation, or as an additional download (an SDK). Depending on your
installation you may need to specify additional flags via the
@code{--extra-cflags} and @code{--extra-ldflags} for allowing the build
system to locate the OpenAL headers and libraries.

An incomplete list of OpenAL implementations follows:

@table @strong
@item Creative
The official Windows implementation, providing hardware acceleration
with supported devices and software fallback.
See @url{http://openal.org/}.
@item OpenAL Soft
Portable, open source (LGPL) software implementation. Includes
backends for the most common sound APIs on the Windows, Linux,
Solaris, and BSD operating systems.
See @url{http://kcat.strangesoft.net/openal.html}.
@item Apple
OpenAL is part of Core Audio, the official Mac OS X Audio interface.
See @url{http://developer.apple.com/technologies/mac/audio-and-video.html}
@end table
This device allows one to capture from an audio input device handled
through OpenAL.

You need to specify the name of the device to capture in the provided
filename. If the empty string is provided, the device will
automatically select the default device. You can get the list of the
supported devices by using the option @var{list_devices}.
@subsection Options

@table @option

@item channels
Set the number of channels in the captured audio. Only the values
@option{1} (monaural) and @option{2} (stereo) are currently supported.
Defaults to @option{2}.

@item sample_size
Set the sample size (in bits) of the captured audio. Only the values
@option{8} and @option{16} are currently supported. Defaults to
@option{16}.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.
Defaults to @option{44.1k}.

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}.

@end table
@subsection Examples

Print the list of OpenAL supported devices and exit:
@example
$ ffmpeg -list_devices true -f openal -i dummy out.ogg
@end example

Capture from the OpenAL device @file{DR-BT101 via PulseAudio}:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out.ogg
@end example

Capture from the default device (note the empty string '' as filename):
@example
$ ffmpeg -f openal -i '' out.ogg
@end example

Capture from two devices simultaneously, writing to two different files,
within the same @command{ffmpeg} command:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out1.ogg -f openal -i 'ALSA Default' out2.ogg
@end example

Note: not all OpenAL implementations support multiple simultaneous capture -
try the latest OpenAL Soft if the above does not work.
@section oss

Open Sound System input device.

The filename to provide to the input device is the device node
representing the OSS input device, and is usually set to
@file{/dev/dsp}.

For example to grab from @file{/dev/dsp} using @command{ffmpeg} use the
command:
@example
ffmpeg -f oss -i /dev/dsp /tmp/oss.wav
@end example

For more information about OSS see:
@url{http://manuals.opensound.com/usersguide/dsp.html}

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table
@section pulse

PulseAudio input device.

To enable this input device you need to configure FFmpeg with @code{--enable-libpulse}.

The filename to provide to the input device is a source device or the
string "default".

To list the PulseAudio source devices and their properties you can invoke
the command @command{pactl list sources}.

More information about PulseAudio can be found on @url{http://www.pulseaudio.org}.
@subsection Options

@table @option

@item server
Connect to a specific PulseAudio server, specified by an IP address.
Default server is used when not provided.

@item name
Specify the application name PulseAudio will use when showing active clients,
by default it is the @code{LIBAVFORMAT_IDENT} string.

@item stream_name
Specify the stream name PulseAudio will use when showing active streams,
by default it is "record".

@item sample_rate
Specify the sample rate in Hz, by default 48kHz is used.

@item channels
Specify the channels in use, by default 2 (stereo) is set.

@item frame_size
Specify the number of bytes per frame, by default it is set to 1024.

@item fragment_size
Specify the minimal buffering fragment in PulseAudio, it will affect the
audio latency. By default it is unset.

@item wallclock
Set the initial PTS using the current time. Default is 1.

@end table
@subsection Examples

Record a stream from default device:
@example
ffmpeg -f pulse -i default /tmp/pulse.wav
@end example
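The defaults above can be overridden on the command line; a sketch using
only the documented options that records mono audio at 44100 Hz from the
default source:
@example
ffmpeg -f pulse -sample_rate 44100 -channels 1 -i default /tmp/pulse-mono.wav
@end example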
@section sndio

sndio input device.

To enable this input device during configuration you need libsndio
installed on your system.

The filename to provide to the input device is the device node
representing the sndio input device, and is usually set to
@file{/dev/audio0}.

For example to grab from @file{/dev/audio0} using @command{ffmpeg} use the
command:
@example
ffmpeg -f sndio -i /dev/audio0 /tmp/sndio.wav
@end example

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table
@section video4linux2, v4l2

Video4Linux2 input video device.

"v4l2" can be used as alias for "video4linux2".

If FFmpeg is built with v4l-utils support (by using the
@code{--enable-libv4l2} configure option), it is possible to use it with the
@code{-use_libv4l2} input device option.

The name of the device to grab is a file device node. Linux systems
usually create such nodes automatically when the device (e.g. a USB
webcam) is plugged into the system; the node has a name of the kind
@file{/dev/video@var{N}}, where @var{N} is a number associated with the
device.

Video4Linux2 devices usually support a limited set of
@var{width}x@var{height} sizes and frame rates. You can check which are
supported using @command{-list_formats all} for Video4Linux2 devices.
Some devices, like TV cards, support one or more standards. It is possible
to list all the supported standards using @command{-list_standards all}.

The time base for the timestamps is 1 microsecond. Depending on the kernel
version and configuration, the timestamps may be derived from the real time
clock (origin at the Unix Epoch) or the monotonic clock (origin usually at
boot time, unaffected by NTP or manual changes to the clock). The
@option{-timestamps abs} or @option{-ts abs} option can be used to force
conversion into the real time clock.
Some usage examples of the video4linux2 device with @command{ffmpeg}
and @command{ffplay}:

@itemize
@item
List supported formats for a video4linux2 device:
@example
ffplay -f video4linux2 -list_formats all /dev/video0
@end example

@item
Grab and show the input of a video4linux2 device:
@example
ffplay -f video4linux2 -framerate 30 -video_size hd720 /dev/video0
@end example

@item
Grab and record the input of a video4linux2 device, leave the
frame rate and size as previously set:
@example
ffmpeg -f video4linux2 -input_format mjpeg -i /dev/video0 out.mpeg
@end example

@end itemize

For more information about Video4Linux, check @url{http://linuxtv.org/}.

@subsection Options

@table @option
@item standard
Set the standard. Must be the name of a supported standard. To get a
list of the supported standards, use the @option{list_standards}
option.

@item channel
Set the input channel number. Defaults to -1, which means using the
previously selected channel.

@item video_size
Set the video frame size. The argument must be a string in the form
@var{WIDTH}x@var{HEIGHT} or a valid size abbreviation.

@item pixel_format
Select the pixel format (only valid for raw video input).

@item input_format
Set the preferred pixel format (for raw video) or a codec name.
This option allows one to select the input format, when several are
available.

@item framerate
Set the preferred video frame rate.

@item list_formats
List available formats (supported pixel formats, codecs, and frame
sizes) and exit.

Available values are:
@table @samp
@item all
Show all available (compressed and non-compressed) formats.

@item raw
Show only raw video (non-compressed) formats.

@item compressed
Show only compressed formats.
@end table

@item list_standards
List supported standards and exit.

Available values are:
@table @samp
@item all
Show all supported standards.
@end table

@item timestamps, ts
Set type of timestamps for grabbed frames.

Available values are:
@table @samp
@item default
Use timestamps from the kernel.

@item abs
Use absolute timestamps (wall clock).

@item mono2abs
Force conversion from monotonic to absolute timestamps.
@end table

Default value is @code{default}.

@item use_libv4l2
Use libv4l2 (v4l-utils) conversion functions. Default is 0.

@end table
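For instance, to record while forcing wall-clock timestamps via the
@option{ts} alias documented above (the output filename is arbitrary):
@example
ffmpeg -f video4linux2 -ts abs -i /dev/video0 out.mkv
@end example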
@section vfwcap

VfW (Video for Windows) capture input device.

The filename passed as input is the capture driver number, ranging from
0 to 9. You may use "list" as filename to print a list of drivers. Any
other filename will be interpreted as device number 0.

@subsection Options

@table @option

@item video_size
Set the video frame size.

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@end table
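Following the filename convention above, a sketch that first prints the
installed capture drivers and then grabs from driver number 0:
@example
# Print the list of VfW capture drivers.
ffmpeg -f vfwcap -i list

# Capture from driver 0.
ffmpeg -f vfwcap -framerate 25 -video_size vga -i 0 out.avi
@end example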
@section x11grab

X11 video input device.

To enable this input device during configuration you need libxcb
installed on your system. It will be automatically detected during
configuration.

This device allows one to capture a region of an X11 display.

The filename passed as input has the syntax:
@example
[@var{hostname}]:@var{display_number}.@var{screen_number}[+@var{x_offset},@var{y_offset}]
@end example

@var{hostname}:@var{display_number}.@var{screen_number} specifies the
X11 display name of the screen to grab from. @var{hostname} can be
omitted, and defaults to "localhost". The environment variable
@env{DISPLAY} contains the default display name.

@var{x_offset} and @var{y_offset} specify the offsets of the grabbed
area with respect to the top-left border of the X11 screen. They
default to 0.

Check the X11 documentation (e.g. @command{man X}) for more detailed
information.

Use the @command{xdpyinfo} program for getting basic information about
the properties of your X11 display (e.g. grep for "name" or
"dimensions").
For example to grab from @file{:0.0} using @command{ffmpeg}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

Grab at position @code{10,20}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example
@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. A value of @code{0} specifies
not to draw the pointer. Default value is @code{1}.

@item follow_mouse
Make the grabbed area follow the mouse. The argument can be
@code{centered} or a number of pixels @var{PIXELS}.

When it is specified with "centered", the grabbing region follows the mouse
pointer and keeps the pointer at the center of the region; otherwise, the region
follows only when the mouse pointer reaches within @var{PIXELS} (greater than
zero) of the edge of the region.

For example:
@example
ffmpeg -f x11grab -follow_mouse centered -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

To follow only when the mouse pointer reaches within 100 pixels of the edge:
@example
ffmpeg -f x11grab -follow_mouse 100 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example
@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

@item region_border
Set the region border thickness if @option{-show_region 1} is used.
Range is 1 to 128 and default is 3 (XCB-based x11grab only).

For example:
@example
ffmpeg -f x11grab -show_region 1 -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

With @var{follow_mouse}:
@example
ffmpeg -f x11grab -follow_mouse centered -show_region 1 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item video_size
Set the video frame size. Default value is @code{vga}.

@item grab_x
@item grab_y
Set the grabbing region coordinates. They are expressed as offset from
the top left corner of the X11 window and correspond to the
@var{x_offset} and @var{y_offset} parameters in the device name. The
default value for both options is 0.

@end table
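Since @option{grab_x} and @option{grab_y} mirror the @var{x_offset} and
@var{y_offset} parameters of the device name, a grab at position
@code{10,20} can equivalently be expressed with options only:
@example
ffmpeg -f x11grab -grab_x 10 -grab_y 20 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example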
@c man end INPUT DEVICES