@chapter Input Devices
@c man begin INPUT DEVICES

Input devices are configured elements in FFmpeg which enable accessing
the data coming from a multimedia device attached to your system.

When you configure your FFmpeg build, all the supported input devices
are enabled by default. You can list all available ones using the
configure option "--list-indevs".

You can disable all the input devices using the configure option
"--disable-indevs", and selectively enable an input device using the
option "--enable-indev=@var{INDEV}", or you can disable a particular
input device using the option "--disable-indev=@var{INDEV}".

The option "-devices" of the ff* tools will display the list of
supported input devices.

A description of the currently available input devices follows.
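For instance, the configure options above can be combined to build FFmpeg with only a single input device enabled (the choice of @code{alsa} here is just an example):
@example
./configure --disable-indevs --enable-indev=alsa
@end example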
@section alsa

ALSA (Advanced Linux Sound Architecture) input device.

To enable this input device during configuration you need libasound
installed on your system.

This device allows capturing from an ALSA device. The name of the
device to capture has to be an ALSA card identifier.

An ALSA identifier has the syntax:
@example
hw:@var{CARD}[,@var{DEV}[,@var{SUBDEV}]]
@end example

where the @var{DEV} and @var{SUBDEV} components are optional.

The three arguments (in order: @var{CARD}, @var{DEV}, @var{SUBDEV})
specify card number or identifier, device number and subdevice number
(-1 means any).

To see the list of cards currently recognized by your system check the
files @file{/proc/asound/cards} and @file{/proc/asound/devices}.

For example to capture with @command{ffmpeg} from an ALSA device with
card id 0, you may run the command:
@example
ffmpeg -f alsa -i hw:0 alsaout.wav
@end example

For more information see:
@url{http://www.alsa-project.org/alsa-doc/alsa-lib/pcm.html}

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table
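The options above can be combined with the capture command. For example, to record mono audio at 44.1 kHz from card 0 (a sketch; the card id on your system may differ):
@example
ffmpeg -f alsa -sample_rate 44100 -channels 1 -i hw:0 alsaout.wav
@end example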
@section android_camera

Android camera input device.

This input device uses the Android Camera2 NDK API which is
available on devices with API level 24+. The availability of
android_camera is autodetected during configuration.

This device allows capturing from all cameras on an Android device
which are integrated into the Camera2 NDK API.

The available cameras are enumerated internally and can be selected
with the @var{camera_index} parameter. The input file string is
discarded.

Generally the back facing camera has index 0 while the front facing
camera has index 1.

@subsection Options

@table @option

@item video_size
Set the video size given as a string such as 640x480 or hd720.
Falls back to the first available configuration reported by
Android if the requested video size is not available, or by default.

@item framerate
Set the video framerate.
Falls back to the first available configuration reported by
Android if the requested framerate is not available, or by default (-1).

@item camera_index
Set the index of the camera to use. Default is 0.

@item input_queue_size
Set the maximum number of frames to buffer. Default is 5.

@end table
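A capture invocation might look as follows (a sketch: since the input file string is discarded, the argument after @option{-i} is arbitrary, and the front facing camera is assumed to be at index 1):
@example
ffmpeg -f android_camera -camera_index 1 -video_size hd720 -framerate 30 -i discarded out.mp4
@end example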
@section avfoundation

AVFoundation input device.

AVFoundation is the currently recommended framework by Apple for stream grabbing on OSX >= 10.7 as well as on iOS.

The input filename has to be given in the following syntax:
@example
-i "[[VIDEO]:[AUDIO]]"
@end example
The first entry selects the video input while the second selects the audio input.
The stream has to be specified by the device name or the device index as shown by the device list.
Alternatively, the video and/or audio input device can be chosen by index using the
@option{-video_device_index <INDEX>} and/or @option{-audio_device_index <INDEX>} options,
overriding any device name or index given in the input filename.

All available devices can be enumerated by using @option{-list_devices true}, listing
all device names and corresponding indices.

There are two device name aliases:
@table @code

@item default
Select the AVFoundation default device of the corresponding type.

@item none
Do not record the corresponding media type.
This is equivalent to specifying an empty device name or index.

@end table

@subsection Options

AVFoundation supports the following options:

@table @option

@item -list_devices <TRUE|FALSE>
If set to true, a list of all available input devices is given showing all
device names and indices.

@item -video_device_index <INDEX>
Specify the video device by its index. Overrides anything given in the input filename.

@item -audio_device_index <INDEX>
Specify the audio device by its index. Overrides anything given in the input filename.

@item -pixel_format <FORMAT>
Request the video device to use a specific pixel format.
If the specified format is not supported, a list of available formats is given
and the first one in this list is used instead. Available pixel formats are:
@code{monob, rgb555be, rgb555le, rgb565be, rgb565le, rgb24, bgr24, 0rgb, bgr0, 0bgr, rgb0,
bgr48be, uyvy422, yuva444p, yuva444p16le, yuv444p, yuv422p16, yuv422p10, yuv444p10,
yuv420p, nv12, yuyv422, gray}

@item -framerate
Set the grabbing frame rate. Default is @code{ntsc}, corresponding to a
frame rate of @code{30000/1001}.

@item -video_size
Set the video frame size.

@item -capture_cursor
Capture the mouse pointer. Default is 0.

@item -capture_mouse_clicks
Capture the screen mouse clicks. Default is 0.

@item -capture_raw_data
Capture the raw device data. Default is 0.
Using this option may result in receiving the underlying data delivered to the AVFoundation framework. E.g. for muxed devices that send raw DV data to the framework (like tape-based camcorders), setting this option to false results in extracted video frames captured in the designated pixel format only. Setting this option to true results in receiving the raw DV stream untouched.

@end table
@subsection Examples

@itemize

@item
Print the list of AVFoundation supported devices and exit:
@example
$ ffmpeg -f avfoundation -list_devices true -i ""
@end example

@item
Record video from video device 0 and audio from audio device 0 into out.avi:
@example
$ ffmpeg -f avfoundation -i "0:0" out.avi
@end example

@item
Record video from video device 2 and audio from audio device 1 into out.avi:
@example
$ ffmpeg -f avfoundation -video_device_index 2 -i ":1" out.avi
@end example

@item
Record video from the system default video device using the pixel format bgr0 and do not record any audio into out.avi:
@example
$ ffmpeg -f avfoundation -pixel_format bgr0 -i "default:none" out.avi
@end example

@item
Record raw DV data from a suitable input device and write the output into out.dv:
@example
$ ffmpeg -f avfoundation -capture_raw_data true -i "zr100:none" out.dv
@end example

@end itemize
@section bktr

BSD video input device.

@subsection Options

@table @option

@item framerate
Set the frame rate.

@item video_size
Set the video frame size. Default is @code{vga}.

@item standard
Available values are:
@table @samp
@item pal
@item ntsc
@item secam
@item paln
@item palm
@item ntscj
@end table

@end table
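A capture command might look like the following (a sketch only; the device node @file{/dev/bktr0} and the chosen standard are assumptions that depend on your system):
@example
ffmpeg -f bktr -standard pal -framerate 25 -video_size vga -i /dev/bktr0 out.avi
@end example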
@section decklink

The decklink input device provides capture capabilities for Blackmagic
DeckLink devices.

To enable this input device, you need the Blackmagic DeckLink SDK and you
need to configure with the appropriate @code{--extra-cflags}
and @code{--extra-ldflags}.
On Windows, you need to run the IDL files through @command{widl}.

DeckLink is very picky about the formats it supports. The pixel format of the
input can be set with @option{raw_format}.
Framerate and video size must be determined for your device with
@command{-list_formats 1}. The audio sample rate is always 48 kHz and the number
of channels can be 2, 8 or 16. Note that all audio channels are bundled in one single
audio track.

@subsection Options

@table @option

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}. This option is deprecated; please use the
@code{-sources} option of ffmpeg to list the available input devices.

@item list_formats
If set to @option{true}, print a list of supported formats and exit.
Defaults to @option{false}.

@item format_code <FourCC>
This sets the input video format to the format given by the FourCC. To see
the supported values of your device(s) use @option{list_formats}.
Note that there is a FourCC @option{'pal '} that can also be used
as @option{pal} (3 letters).
Default behavior is autodetection of the input video format, if the hardware
supports it.

@item raw_format
Set the pixel format of the captured video.
Available values are:
@table @samp
@item uyvy422
@item yuv422p10
@item argb
@item bgra
@item rgb10
@end table

@item teletext_lines
If set to nonzero, an additional teletext stream will be captured from the
vertical ancillary data. Both SD PAL (576i) and HD (1080i or 1080p)
sources are supported. In case of HD sources, OP47 packets are decoded.

This option is a bitmask of the SD PAL VBI lines captured, specifically lines 6
to 22, and lines 318 to 335. Line 6 is the LSB in the mask. Selected lines
which do not contain teletext information will be ignored. You can use the
special @option{all} constant to select all possible lines, or
@option{standard} to skip lines 6, 318 and 319, which are not compatible with
all receivers.

For SD sources, ffmpeg needs to be compiled with @code{--enable-libzvbi}. For
HD sources, on older (pre-4K) DeckLink card models you have to capture in 10
bit mode.

@item channels
Defines the number of audio channels to capture. Must be @samp{2}, @samp{8} or @samp{16}.
Defaults to @samp{2}.

@item duplex_mode
Sets the decklink device duplex mode. Must be @samp{unset}, @samp{half} or @samp{full}.
Defaults to @samp{unset}.

@item timecode_format
Timecode type to include in the frame and video stream metadata. Must be
@samp{none}, @samp{rp188vitc}, @samp{rp188vitc2}, @samp{rp188ltc},
@samp{rp188any}, @samp{vitc}, @samp{vitc2}, or @samp{serial}. Defaults to
@samp{none} (not included).

@item video_input
Sets the video input source. Must be @samp{unset}, @samp{sdi}, @samp{hdmi},
@samp{optical_sdi}, @samp{component}, @samp{composite} or @samp{s_video}.
Defaults to @samp{unset}.

@item audio_input
Sets the audio input source. Must be @samp{unset}, @samp{embedded},
@samp{aes_ebu}, @samp{analog}, @samp{analog_xlr}, @samp{analog_rca} or
@samp{microphone}. Defaults to @samp{unset}.

@item video_pts
Sets the video packet timestamp source. Must be @samp{video}, @samp{audio},
@samp{reference}, @samp{wallclock} or @samp{abs_wallclock}.
Defaults to @samp{video}.

@item audio_pts
Sets the audio packet timestamp source. Must be @samp{video}, @samp{audio},
@samp{reference}, @samp{wallclock} or @samp{abs_wallclock}.
Defaults to @samp{audio}.

@item draw_bars
If set to @samp{true}, color bars are drawn in the event of a signal loss.
Defaults to @samp{true}.

@item queue_size
Sets the maximum input buffer size in bytes. If the buffering reaches this value,
incoming frames will be dropped.
Defaults to @samp{1073741824}.

@item audio_depth
Sets the audio sample bit depth. Must be @samp{16} or @samp{32}.
Defaults to @samp{16}.

@item decklink_copyts
If set to @option{true}, timestamps are forwarded as they are, without removing
the initial offset.
Defaults to @option{false}.

@item timestamp_align
Capture start time alignment in seconds. If set to nonzero, input frames are
dropped until the system timestamp aligns with the configured value.
An alignment difference of up to one frame duration is tolerated.
This is useful for maintaining input synchronization across N different
hardware devices deployed for 'N-way' redundancy. The system time of the different
hardware devices should be synchronized with protocols such as NTP or PTP
before using this option.
Note that this method is not foolproof. In some border cases input
synchronization may not happen due to thread scheduling jitters in the OS.
The sync could be off by 1 frame or, in rarer cases, by
@option{timestamp_align} seconds.
Defaults to @samp{0}.

@item wait_for_tc (@emph{bool})
Drop frames until a frame with timecode is received. Sometimes serial timecode
isn't received with the first input frame. If that happens, the stored stream
timecode will be inaccurate. If this option is set to @option{true}, input frames
are dropped until a frame with timecode is received.
The option @var{timecode_format} must be specified.
Defaults to @option{false}.

@item enable_klv (@emph{bool})
If set to @option{true}, extracts KLV data from VANC and outputs KLV packets.
KLV VANC packets are joined based on MID and PSC fields and aggregated into
one KLV packet.
Defaults to @option{false}.

@end table
@subsection Examples

@itemize

@item
List input devices:
@example
ffmpeg -sources decklink
@end example

@item
List supported formats:
@example
ffmpeg -f decklink -list_formats 1 -i 'Intensity Pro'
@end example

@item
Capture video clip at 1080i50:
@example
ffmpeg -format_code Hi50 -f decklink -i 'Intensity Pro' -c:a copy -c:v copy output.avi
@end example

@item
Capture video clip at 1080i50 10 bit:
@example
ffmpeg -raw_format yuv422p10 -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example

@item
Capture video clip at 1080i50 with 16 audio channels:
@example
ffmpeg -channels 16 -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example
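@item
Capture from the HDMI input of a DeckLink device using the @option{video_input} option (the device name @samp{UltraStudio Mini Recorder} is an assumption; use @code{ffmpeg -sources decklink} to find yours):
@example
ffmpeg -video_input hdmi -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example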
@end itemize
@section dshow

Windows DirectShow input device.

DirectShow support is enabled when FFmpeg is built with the mingw-w64 project.
Currently only audio and video devices are supported.

Multiple devices may be opened as separate inputs, but they may also be
opened on the same input, which should improve synchronization between them.

The input name should be in the format:
@example
@var{TYPE}=@var{NAME}[:@var{TYPE}=@var{NAME}]
@end example

where @var{TYPE} can be either @var{audio} or @var{video},
and @var{NAME} is the device's name or alternative name.

@subsection Options

If no options are specified, the device's defaults are used.
If the device does not support the requested options, it will
fail to open.
@table @option

@item video_size
Set the video size in the captured video.

@item framerate
Set the frame rate in the captured video.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.

@item sample_size
Set the sample size (in bits) of the captured audio.

@item channels
Set the number of channels in the captured audio.

@item list_devices
If set to @option{true}, print a list of devices and exit.

@item list_options
If set to @option{true}, print a list of the selected device's options
and exit.

@item video_device_number
Set video device number for devices with the same name (starts at 0,
defaults to 0).

@item audio_device_number
Set audio device number for devices with the same name (starts at 0,
defaults to 0).

@item pixel_format
Select pixel format to be used by DirectShow. This may only be set when
the video codec is not set or set to rawvideo.

@item audio_buffer_size
Set audio device buffer size in milliseconds (which can directly
impact latency, depending on the device).
Defaults to using the audio device's
default buffer size (typically some multiple of 500ms).
Setting this value too low can degrade performance.
See also
@url{http://msdn.microsoft.com/en-us/library/windows/desktop/dd377582(v=vs.85).aspx}

@item video_pin_name
Select video capture pin to use by name or alternative name.

@item audio_pin_name
Select audio capture pin to use by name or alternative name.

@item crossbar_video_input_pin_number
Select video input pin number for crossbar device. This will be
routed to the crossbar device's Video Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.

@item crossbar_audio_input_pin_number
Select audio input pin number for crossbar device. This will be
routed to the crossbar device's Audio Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.

@item show_video_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change video filter properties
and configurations manually.
Note that for crossbar devices, adjusting values in this dialog
may be needed at times to toggle between PAL (25 fps) and NTSC (29.97)
input frame rates, sizes, interlacing, etc. Changing these values can
enable different scan rates/frame rates and avoid green bars at
the bottom, flickering scan lines, etc.
Note that with some devices, changing these properties can also affect future
invocations (sets new defaults) until system reboot occurs.

@item show_audio_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change audio filter properties
and configurations manually.

@item show_video_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens a video device.

@item show_audio_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens an audio device.

@item show_analog_tv_tuner_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV channels and frequencies.

@item show_analog_tv_tuner_audio_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV audio (like mono vs. stereo, Language A, B or C).

@item audio_device_load
Load an audio capture filter device from a file instead of searching
for it by name. It may load additional parameters too, if the filter
supports the serialization of its properties.
To use this, an audio capture source has to be specified, but it can
be anything, even a fake one.

@item audio_device_save
Save the currently used audio capture filter device and its
parameters (if the filter supports it) to a file.
If a file with the same name exists it will be overwritten.

@item video_device_load
Load a video capture filter device from a file instead of searching
for it by name. It may load additional parameters too, if the filter
supports the serialization of its properties.
To use this, a video capture source has to be specified, but it can
be anything, even a fake one.

@item video_device_save
Save the currently used video capture filter device and its
parameters (if the filter supports it) to a file.
If a file with the same name exists it will be overwritten.

@end table
@subsection Examples

@itemize

@item
Print the list of DirectShow supported devices and exit:
@example
$ ffmpeg -list_devices true -f dshow -i dummy
@end example

@item
Open video device @var{Camera}:
@example
$ ffmpeg -f dshow -i video="Camera"
@end example

@item
Open second video device with name @var{Camera}:
@example
$ ffmpeg -f dshow -video_device_number 1 -i video="Camera"
@end example

@item
Open video device @var{Camera} and audio device @var{Microphone}:
@example
$ ffmpeg -f dshow -i video="Camera":audio="Microphone"
@end example

@item
Print the list of supported options of the selected device and exit:
@example
$ ffmpeg -list_options true -f dshow -i video="Camera"
@end example

@item
Specify pin names to capture by name or alternative name, specify alternative device name:
@example
$ ffmpeg -f dshow -audio_pin_name "Audio Out" -video_pin_name 2 -i video="@@device_pnp_\\?\pci#ven_1a0a&dev_6200&subsys_62021461&rev_01#4&e2c7dd6&0&00e1#@{65e8773d-8f56-11d0-a3b9-00a0c9223196@}\@{ca465100-deb0-4d59-818f-8c477184adf6@}":audio="Microphone"
@end example

@item
Configure a crossbar device, specifying crossbar pins, allow user to adjust video capture properties at startup:
@example
$ ffmpeg -f dshow -show_video_device_dialog true -crossbar_video_input_pin_number 0 \
-crossbar_audio_input_pin_number 3 -i video="AVerMedia BDA Analog Capture":audio="AVerMedia BDA Analog Capture"
@end example

@end itemize
@section fbdev

Linux framebuffer input device.

The Linux framebuffer is a graphic hardware-independent abstraction
layer to show graphics on a computer monitor, typically on the
console. It is accessed through a file device node, usually
@file{/dev/fb0}.

For more detailed information read the file
Documentation/fb/framebuffer.txt included in the Linux source tree.

See also @url{http://linux-fbdev.sourceforge.net/}, and fbset(1).

To record from the framebuffer device @file{/dev/fb0} with
@command{ffmpeg}:
@example
ffmpeg -f fbdev -framerate 10 -i /dev/fb0 out.avi
@end example

You can take a single screenshot image with the command:
@example
ffmpeg -f fbdev -framerate 1 -i /dev/fb0 -frames:v 1 screenshot.jpeg
@end example

@subsection Options

@table @option

@item framerate
Set the frame rate. Default is 25.

@end table
@section gdigrab

Win32 GDI-based screen capture device.

This device allows you to capture a region of the display on Windows.

There are two options for the input filename:
@example
desktop
@end example
or
@example
title=@var{window_title}
@end example

The first option will capture the entire desktop, or a fixed region of the
desktop. The second option will instead capture the contents of a single
window, regardless of its position on the screen.

For example, to grab the entire desktop using @command{ffmpeg}:
@example
ffmpeg -f gdigrab -framerate 6 -i desktop out.mpg
@end example

Grab a 640x480 region at position @code{10,20}:
@example
ffmpeg -f gdigrab -framerate 6 -offset_x 10 -offset_y 20 -video_size vga -i desktop out.mpg
@end example

Grab the contents of the window named "Calculator":
@example
ffmpeg -f gdigrab -framerate 6 -i title=Calculator out.mpg
@end example

@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. Use the value @code{0} to
not draw the pointer. Default value is @code{1}.

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

Note that @var{show_region} is incompatible with grabbing the contents
of a single window.

For example:
@example
ffmpeg -f gdigrab -show_region 1 -framerate 6 -video_size cif -offset_x 10 -offset_y 20 -i desktop out.mpg
@end example

@item video_size
Set the video frame size. The default is to capture the full screen if @file{desktop} is selected, or the full window size if @file{title=@var{window_title}} is selected.

@item offset_x
When capturing a region with @var{video_size}, set the distance from the left edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned to the left of your primary monitor, you will need to use a negative @var{offset_x} value to move the region to that monitor.

@item offset_y
When capturing a region with @var{video_size}, set the distance from the top edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned above your primary monitor, you will need to use a negative @var{offset_y} value to move the region to that monitor.

@end table
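To illustrate the negative-offset behavior described above, a region on a secondary monitor positioned to the left of the primary one could be grabbed as follows (the monitor size of 1280x720 and its placement are assumptions):
@example
ffmpeg -f gdigrab -framerate 6 -offset_x -1280 -offset_y 0 -video_size 1280x720 -i desktop out.mpg
@end example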
@section iec61883

FireWire DV/HDV input device using libiec61883.

To enable this input device, you need libiec61883, libraw1394 and
libavc1394 installed on your system. Use the configure option
@code{--enable-libiec61883} to compile with the device enabled.

The iec61883 capture device supports capturing from a video device
connected via IEEE1394 (FireWire), using libiec61883 and the new Linux
FireWire stack (juju). This is the default DV/HDV input method in Linux
Kernel 2.6.37 and later, since the old FireWire stack was removed.

Specify the FireWire port to be used as input file, or "auto"
to choose the first port connected.

@subsection Options

@table @option

@item dvtype
Override autodetection of DV/HDV. This should only be used if auto
detection does not work, or if usage of a different device type
should be prohibited. Treating a DV device as HDV (or vice versa) will
not work and results in undefined behavior.
The values @option{auto}, @option{dv} and @option{hdv} are supported.

@item dvbuffer
Set the maximum size of the buffer for incoming data, in frames. For DV, this
is an exact value. For HDV, it is not frame exact, since HDV does
not have a fixed frame size.

@item dvguid
Select the capture device by specifying its GUID. Capturing will only
be performed from the specified device and fails if no device with the
given GUID is found. This is useful to select the input if multiple
devices are connected at the same time.
Look at @file{/sys/bus/firewire/devices} to find out the GUIDs.

@end table

@subsection Examples

@itemize

@item
Grab and show the input of a FireWire DV/HDV device.
@example
ffplay -f iec61883 -i auto
@end example

@item
Grab and record the input of a FireWire DV/HDV device,
using a packet buffer of 100000 packets if the source is HDV.
@example
ffmpeg -f iec61883 -i auto -dvbuffer 100000 out.mpg
@end example
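@item
Grab from a specific device selected by GUID (the GUID shown is a placeholder; look up the real value in @file{/sys/bus/firewire/devices}):
@example
ffmpeg -f iec61883 -i auto -dvguid 0x0010dc0000012345 out.mpg
@end example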
@end itemize
@section jack

JACK input device.

To enable this input device during configuration you need libjack
installed on your system.

A JACK input device creates one or more JACK writable clients, one for
each audio channel, with name @var{client_name}:input_@var{N}, where
@var{client_name} is the name provided by the application, and @var{N}
is a number which identifies the channel.
Each writable client will send the acquired data to the FFmpeg input
device.

Once you have created one or more JACK readable clients, you need to
connect them to one or more JACK writable clients.

To connect or disconnect JACK clients you can use the @command{jack_connect}
and @command{jack_disconnect} programs, or do it through a graphical interface,
for example with @command{qjackctl}.

To list the JACK clients and their properties you can invoke the command
@command{jack_lsp}.

The following example shows how to capture a JACK readable client
with @command{ffmpeg}.
@example
# Create a JACK writable client with name "ffmpeg".
$ ffmpeg -f jack -i ffmpeg -y out.wav

# Start the sample jack_metro readable client.
$ jack_metro -b 120 -d 0.2 -f 4000

# List the current JACK clients.
$ jack_lsp -c
system:capture_1
system:capture_2
system:playback_1
system:playback_2
ffmpeg:input_1
metro:120_bpm

# Connect metro to the ffmpeg writable client.
$ jack_connect metro:120_bpm ffmpeg:input_1
@end example

For more information read:
@url{http://jackaudio.org/}

@subsection Options

@table @option

@item channels
Set the number of channels. Default is 2.

@end table
@section kmsgrab

KMS video input device.

Captures the KMS scanout framebuffer associated with a specified CRTC or plane as a
DRM object that can be passed to other hardware functions.

Requires either DRM master or CAP_SYS_ADMIN to run.

If you don't understand what all of that means, you probably don't want this. Look at
@option{x11grab} instead.

@subsection Options

@table @option

@item device
DRM device to capture on. Defaults to @option{/dev/dri/card0}.

@item format
Pixel format of the framebuffer. Defaults to @option{bgr0}.

@item format_modifier
Format modifier to signal on output frames. This is necessary to import correctly into
some APIs, but can't be autodetected. See the libdrm documentation for possible values.

@item crtc_id
KMS CRTC ID to define the capture source. The first active plane on the given CRTC
will be used.

@item plane_id
KMS plane ID to define the capture source. Defaults to the first active plane found if
neither @option{crtc_id} nor @option{plane_id} are specified.

@item framerate
Framerate to capture at. This is not synchronised to any page flipping or framebuffer
changes - it just defines the interval at which the framebuffer is sampled. Sampling
faster than the framebuffer update rate will generate independent frames with the same
content. Defaults to @code{30}.

@end table

@subsection Examples

@itemize

@item
Capture from the first active plane, download the result to normal frames and encode.
This will only work if the framebuffer is both linear and mappable - if not, the result
may be scrambled or fail to download.
@example
ffmpeg -f kmsgrab -i - -vf 'hwdownload,format=bgr0' output.mp4
@end example

@item
Capture from CRTC ID 42 at 60fps, map the result to VAAPI, convert to NV12 and encode as H.264.
@example
ffmpeg -crtc_id 42 -framerate 60 -f kmsgrab -i - -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1920:h=1080:format=nv12' -c:v h264_vaapi output.mp4
@end example

@item
To capture only part of a plane the output can be cropped - this can be used to capture
a single window, as long as it has a known absolute position and size. For example, to
capture and encode the middle quarter of a 1920x1080 plane:
@example
ffmpeg -f kmsgrab -i - -vf 'hwmap=derive_device=vaapi,crop=960:540:480:270,scale_vaapi=960:540:nv12' -c:v h264_vaapi output.mp4
@end example

@end itemize
@section lavfi

Libavfilter input virtual device.

This input device reads data from the open output pads of a libavfilter
filtergraph.

For each filtergraph open output, the input device will create a
corresponding stream which is mapped to the generated output. Currently
only video data is supported. The filtergraph is specified through the
option @option{graph}.

@subsection Options

@table @option

@item graph
Specify the filtergraph to use as input. Each video open output must be
labelled by a unique string of the form "out@var{N}", where @var{N} is a
number starting from 0 corresponding to the mapped input stream
generated by the device.
The first unlabelled output is automatically assigned to the "out0"
label, but all the others need to be specified explicitly.

The suffix "+subcc" can be appended to the output label to create an extra
stream with the closed captions packets attached to that output
(experimental; only for EIA-608 / CEA-708 for now).
The subcc streams are created after all the normal streams, in the order of
the corresponding stream.
For example, if there is "out19+subcc", "out7+subcc" and up to "out42", the
stream #43 is subcc for stream #7 and stream #44 is subcc for stream #19.

If not specified, it defaults to the filename specified for the input
device.

@item graph_file
Set the filename of the filtergraph to be read and sent to the other
filters. Syntax of the filtergraph is the same as the one specified by
the option @var{graph}.

@item dumpgraph
Dump graph to stderr.

@end table

@subsection Examples

@itemize
@item
Create a color video stream and play it back with @command{ffplay}:
@example
ffplay -f lavfi -graph "color=c=pink [out0]" dummy
@end example

@item
As the previous example, but use filename for specifying the graph
description, and omit the "out0" label:
@example
ffplay -f lavfi color=c=pink
@end example

@item
Create three different video test filtered sources and play them:
@example
ffplay -f lavfi -graph "testsrc [out0]; testsrc,hflip [out1]; testsrc,negate [out2]" test3
@end example

@item
Read an audio stream from a file using the amovie source and play it
back with @command{ffplay}:
@example
ffplay -f lavfi "amovie=test.wav"
@end example

@item
Read an audio stream and a video stream and play it back with
@command{ffplay}:
@example
ffplay -f lavfi "movie=test.avi[out0];amovie=test.wav[out1]"
@end example

@item
Dump decoded frames to images and closed captions to a file (experimental):
@example
ffmpeg -f lavfi -i "movie=test.ts[out0+subcc]" -map v frame%08d.png -map s -c copy -f rawvideo subcc.bin
@end example

@end itemize
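The @option{graph_file} option reads the same graph syntax from a file.
A sketch, assuming a file @file{graph.txt} containing a valid graph
description such as @code{color=c=pink [out0]} (the filename is an
illustrative assumption):
@example
ffplay -f lavfi -graph_file graph.txt dummy
@end example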
@section libcdio

Audio-CD input device based on libcdio.

To enable this input device during configuration you need libcdio
installed on your system. It requires the configure option
@code{--enable-libcdio}.

This device allows playing and grabbing from an Audio-CD.

For example to copy with @command{ffmpeg} the entire Audio-CD in @file{/dev/sr0},
you may run the command:
@example
ffmpeg -f libcdio -i /dev/sr0 cd.wav
@end example

@subsection Options

@table @option

@item speed
Set drive reading speed. Default value is 0.

The speed is specified in CD-ROM speed units. The speed is set through
the libcdio @code{cdio_cddap_speed_set} function. On many CD-ROM
drives, specifying a value too large will result in using the fastest
speed.

@item paranoia_mode
Set paranoia recovery mode flags. It accepts one of the following values:

@table @samp
@item disable
@item verify
@item overlap
@item neverskip
@item full
@end table

Default value is @samp{disable}.

For more information about the available recovery modes, consult the
paranoia project documentation.

@end table
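As a sketch, the options above can be combined on the command line; for
example, to rip at 4x speed with full paranoia recovery (assuming the
drive is @file{/dev/sr0}):
@example
ffmpeg -f libcdio -speed 4 -paranoia_mode full -i /dev/sr0 cd.wav
@end example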
@section libdc1394

IIDC1394 input device, based on libdc1394 and libraw1394.

Requires the configure option @code{--enable-libdc1394}.

@subsection Options

@table @option

@item framerate
Set the frame rate. Default is @code{ntsc}, corresponding to a frame
rate of @code{30000/1001}.

@item pixel_format
Select the pixel format. Default is @code{uyvy422}.

@item video_size
Set the video size given as a string such as @code{640x480} or @code{hd720}.
Default is @code{qvga}.

@end table
@section openal

The OpenAL input device provides audio capture on all systems with a
working OpenAL 1.1 implementation.

To enable this input device during configuration, you need OpenAL
headers and libraries installed on your system, and need to configure
FFmpeg with @code{--enable-openal}.

OpenAL headers and libraries should be provided as part of your OpenAL
implementation, or as an additional download (an SDK). Depending on your
installation you may need to specify additional flags via
@code{--extra-cflags} and @code{--extra-ldflags} to allow the build
system to locate the OpenAL headers and libraries.

An incomplete list of OpenAL implementations follows:

@table @strong
@item Creative
The official Windows implementation, providing hardware acceleration
with supported devices and software fallback.
See @url{http://openal.org/}.
@item OpenAL Soft
Portable, open source (LGPL) software implementation. Includes
backends for the most common sound APIs on the Windows, Linux,
Solaris, and BSD operating systems.
See @url{http://kcat.strangesoft.net/openal.html}.
@item Apple
OpenAL is part of Core Audio, the official Mac OS X Audio interface.
See @url{http://developer.apple.com/technologies/mac/audio-and-video.html}
@end table

This device allows one to capture from an audio input device handled
through OpenAL.

You need to specify the name of the device to capture in the provided
filename. If the empty string is provided, the device will
automatically select the default device. You can get the list of the
supported devices by using the option @var{list_devices}.

@subsection Options

@table @option

@item channels
Set the number of channels in the captured audio. Only the values
@option{1} (monaural) and @option{2} (stereo) are currently supported.
Defaults to @option{2}.

@item sample_size
Set the sample size (in bits) of the captured audio. Only the values
@option{8} and @option{16} are currently supported. Defaults to
@option{16}.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.
Defaults to @option{44.1k}.

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}.

@end table

@subsection Examples

Print the list of OpenAL supported devices and exit:
@example
$ ffmpeg -list_devices true -f openal -i dummy out.ogg
@end example

Capture from the OpenAL device @file{DR-BT101 via PulseAudio}:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out.ogg
@end example

Capture from the default device (note the empty string '' as filename):
@example
$ ffmpeg -f openal -i '' out.ogg
@end example

Capture from two devices simultaneously, writing to two different files,
within the same @command{ffmpeg} command:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out1.ogg -f openal -i 'ALSA Default' out2.ogg
@end example
Note: not all OpenAL implementations support multiple simultaneous capture -
try the latest OpenAL Soft if the above does not work.
@section oss

Open Sound System input device.

The filename to provide to the input device is the device node
representing the OSS input device, and is usually set to
@file{/dev/dsp}.

For example to grab from @file{/dev/dsp} using @command{ffmpeg} use the
command:
@example
ffmpeg -f oss -i /dev/dsp /tmp/oss.wav
@end example

For more information about OSS see:
@url{http://manuals.opensound.com/usersguide/dsp.html}

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table
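As a sketch, both options can be set inline to capture mono audio at
44.1 kHz (assuming @file{/dev/dsp} is the capture node):
@example
ffmpeg -f oss -sample_rate 44100 -channels 1 -i /dev/dsp mono.wav
@end example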
@section pulse

PulseAudio input device.

To enable this input device you need to configure FFmpeg with @code{--enable-libpulse}.

The filename to provide to the input device is a source device or the
string "default".

To list the PulseAudio source devices and their properties you can invoke
the command @command{pactl list sources}.

More information about PulseAudio can be found on @url{http://www.pulseaudio.org}.

@subsection Options

@table @option

@item server
Connect to a specific PulseAudio server, specified by an IP address.
Default server is used when not provided.

@item name
Specify the application name PulseAudio will use when showing active clients,
by default it is the @code{LIBAVFORMAT_IDENT} string.

@item stream_name
Specify the stream name PulseAudio will use when showing active streams,
by default it is "record".

@item sample_rate
Specify the samplerate in Hz, by default 48kHz is used.

@item channels
Specify the channels in use, by default 2 (stereo) is set.

@item frame_size
Specify the number of bytes per frame, by default it is set to 1024.

@item fragment_size
Specify the minimal buffering fragment in PulseAudio, it will affect the
audio latency. By default it is unset.

@item wallclock
Set the initial PTS using the current time. Default is 1.

@end table

@subsection Examples

Record a stream from the default device:
@example
ffmpeg -f pulse -i default /tmp/pulse.wav
@end example
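A sketch recording mono audio at 44.1 kHz from a remote server (the
address 192.168.1.5 is an illustrative assumption):
@example
ffmpeg -f pulse -server 192.168.1.5 -sample_rate 44100 -channels 1 -i default remote.wav
@end example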
@section sndio

sndio input device.

To enable this input device during configuration you need libsndio
installed on your system.

The filename to provide to the input device is the device node
representing the sndio input device, and is usually set to
@file{/dev/audio0}.

For example to grab from @file{/dev/audio0} using @command{ffmpeg} use the
command:
@example
ffmpeg -f sndio -i /dev/audio0 /tmp/sndio.wav
@end example

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table
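As a sketch, the two options can be set inline, analogously to the oss
device (assuming @file{/dev/audio0} is the capture node):
@example
ffmpeg -f sndio -sample_rate 44100 -channels 1 -i /dev/audio0 mono.wav
@end example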
@section video4linux2, v4l2

Video4Linux2 input video device.

"v4l2" can be used as alias for "video4linux2".

If FFmpeg is built with v4l-utils support (by using the
@code{--enable-libv4l2} configure option), it is possible to use it with the
@code{-use_libv4l2} input device option.

The name of the device to grab is a file device node. Linux systems
usually create such nodes automatically when the device
(e.g. a USB webcam) is plugged into the system; the node has a name of the
kind @file{/dev/video@var{N}}, where @var{N} is a number associated to
the device.

Video4Linux2 devices usually support a limited set of
@var{width}x@var{height} sizes and frame rates. You can check which are
supported using @command{-list_formats all} for Video4Linux2 devices.
Some devices, like TV cards, support one or more standards. It is possible
to list all the supported standards using @command{-list_standards all}.

The time base for the timestamps is 1 microsecond. Depending on the kernel
version and configuration, the timestamps may be derived from the real time
clock (origin at the Unix Epoch) or the monotonic clock (origin usually at
boot time, unaffected by NTP or manual changes to the clock). The
@option{-timestamps abs} or @option{-ts abs} option can be used to force
conversion into the real time clock.

Some usage examples of the video4linux2 device with @command{ffmpeg}
and @command{ffplay}:

@itemize
@item
List supported formats for a video4linux2 device:
@example
ffplay -f video4linux2 -list_formats all /dev/video0
@end example

@item
Grab and show the input of a video4linux2 device:
@example
ffplay -f video4linux2 -framerate 30 -video_size hd720 /dev/video0
@end example

@item
Grab and record the input of a video4linux2 device, leaving the
frame rate and size as previously set:
@example
ffmpeg -f video4linux2 -input_format mjpeg -i /dev/video0 out.mpeg
@end example
@end itemize

For more information about Video4Linux, check @url{http://linuxtv.org/}.

@subsection Options

@table @option

@item standard
Set the standard. Must be the name of a supported standard. To get a
list of the supported standards, use the @option{list_standards}
option.

@item channel
Set the input channel number. Defaults to -1, which means using the
previously selected channel.

@item video_size
Set the video frame size. The argument must be a string in the form
@var{WIDTH}x@var{HEIGHT} or a valid size abbreviation.

@item pixel_format
Select the pixel format (only valid for raw video input).

@item input_format
Set the preferred pixel format (for raw video) or a codec name.
This option allows one to select the input format, when several are
available.

@item framerate
Set the preferred video frame rate.

@item list_formats
List available formats (supported pixel formats, codecs, and frame
sizes) and exit.

Available values are:
@table @samp
@item all
Show all available (compressed and non-compressed) formats.

@item raw
Show only raw video (non-compressed) formats.

@item compressed
Show only compressed formats.
@end table

@item list_standards
List supported standards and exit.

Available values are:
@table @samp
@item all
Show all supported standards.
@end table

@item timestamps, ts
Set type of timestamps for grabbed frames.

Available values are:
@table @samp
@item default
Use timestamps from the kernel.

@item abs
Use absolute timestamps (wall clock).

@item mono2abs
Force conversion from monotonic to absolute timestamps.
@end table

Default value is @code{default}.

@item use_libv4l2
Use libv4l2 (v4l-utils) conversion functions. Default is 0.

@end table
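As a sketch, several of the options above can be combined; for example,
grabbing raw YUYV video at 25 fps with wall-clock timestamps (this
assumes the device supports that pixel format, size, and rate - check
with @option{list_formats} first):
@example
ffmpeg -f v4l2 -ts abs -input_format yuyv422 -framerate 25 -video_size 640x480 -i /dev/video0 out.mkv
@end example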
@section vfwcap

VfW (Video for Windows) capture input device.

The filename passed as input is the capture driver number, ranging from
0 to 9. You may use "list" as filename to print a list of drivers. Any
other filename will be interpreted as device number 0.

@subsection Options

@table @option

@item video_size
Set the video frame size.

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@end table
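As a sketch, capturing from driver number 0 at a fixed size and rate
(assuming the driver supports that mode):
@example
ffmpeg -f vfwcap -video_size 640x480 -framerate 25 -i 0 out.mpg
@end example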
@section x11grab

X11 video input device.

To enable this input device during configuration you need libxcb
installed on your system. It will be automatically detected during
configuration.

This device allows one to capture a region of an X11 display.

The filename passed as input has the syntax:
@example
[@var{hostname}]:@var{display_number}.@var{screen_number}[+@var{x_offset},@var{y_offset}]
@end example

@var{hostname}:@var{display_number}.@var{screen_number} specifies the
X11 display name of the screen to grab from. @var{hostname} can be
omitted, and defaults to "localhost". The environment variable
@env{DISPLAY} contains the default display name.

@var{x_offset} and @var{y_offset} specify the offsets of the grabbed
area with respect to the top-left border of the X11 screen. They
default to 0.

Check the X11 documentation (e.g. @command{man X}) for more detailed
information.

Use the @command{xdpyinfo} program for getting basic information about
the properties of your X11 display (e.g. grep for "name" or
"dimensions").

For example to grab from @file{:0.0} using @command{ffmpeg}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

Grab at position @code{10,20}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. A value of @code{0} specifies
not to draw the pointer. Default value is @code{1}.

@item follow_mouse
Make the grabbed area follow the mouse. The argument can be
@code{centered} or a number of pixels @var{PIXELS}.

When it is specified with "centered", the grabbing region follows the mouse
pointer and keeps the pointer at the center of the region; otherwise, the region
follows only when the mouse pointer comes within @var{PIXELS} (greater than
zero) of the edge of the region.

For example:
@example
ffmpeg -f x11grab -follow_mouse centered -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

To follow only when the mouse pointer comes within 100 pixels of the edge:
@example
ffmpeg -f x11grab -follow_mouse 100 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

@item region_border
Set the region border thickness if @option{-show_region 1} is used.
Range is 1 to 128 and default is 3 (XCB-based x11grab only).

For example:
@example
ffmpeg -f x11grab -show_region 1 -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

With @var{follow_mouse}:
@example
ffmpeg -f x11grab -follow_mouse centered -show_region 1 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item video_size
Set the video frame size. Default is the full desktop.

@item grab_x
@item grab_y
Set the grabbing region coordinates. They are expressed as offset from
the top left corner of the X11 window and correspond to the
@var{x_offset} and @var{y_offset} parameters in the device name. The
default value for both options is 0.

@end table
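As a sketch, @option{grab_x} and @option{grab_y} can be used instead of
encoding the offsets in the device name, equivalent to grabbing from
@file{:0.0+10,20}:
@example
ffmpeg -f x11grab -grab_x 10 -grab_y 20 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example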
@c man end INPUT DEVICES