@chapter Input Devices
@c man begin INPUT DEVICES

Input devices are configured elements in FFmpeg which enable accessing
the data coming from a multimedia device attached to your system.

When you configure your FFmpeg build, all the supported input devices
are enabled by default. You can list all available ones using the
configure option "--list-indevs".

You can disable all the input devices using the configure option
"--disable-indevs", and selectively enable an input device using the
option "--enable-indev=@var{INDEV}", or you can disable a particular
input device using the option "--disable-indev=@var{INDEV}".

The option "-devices" of the ff* tools will display the list of
supported input devices.

A description of the currently available input devices follows.
@section alsa

ALSA (Advanced Linux Sound Architecture) input device.

To enable this input device during configuration you need libasound
installed on your system.

This device allows capturing from an ALSA device. The name of the
device to capture has to be an ALSA card identifier.

An ALSA identifier has the syntax:
@example
hw:@var{CARD}[,@var{DEV}[,@var{SUBDEV}]]
@end example

where the @var{DEV} and @var{SUBDEV} components are optional.

The three arguments (in order: @var{CARD}, @var{DEV}, @var{SUBDEV})
specify card number or identifier, device number and subdevice number
(-1 means any).

To see the list of cards currently recognized by your system check the
files @file{/proc/asound/cards} and @file{/proc/asound/devices}.

For example to capture with @command{ffmpeg} from an ALSA device with
card id 0, you may run the command:
@example
ffmpeg -f alsa -i hw:0 alsaout.wav
@end example

For more information see:
@url{http://www.alsa-project.org/alsa-doc/alsa-lib/pcm.html}

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table
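The options above can be combined in a single capture command. A minimal
sketch, assuming a card with id 0 that supports mono capture at 44.1 kHz:
@example
ffmpeg -f alsa -sample_rate 44100 -channels 1 -i hw:0 alsaout.wav
@end example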
@section android_camera

Android camera input device.

This input device uses the Android Camera2 NDK API which is
available on devices with API level 24+. The availability of
android_camera is autodetected during configuration.

This device allows capturing from all cameras on an Android device,
which are integrated into the Camera2 NDK API.

The available cameras are enumerated internally and can be selected
with the @var{camera_index} parameter. The input file string is
discarded.

Generally the back facing camera has index 0 while the front facing
camera has index 1.

@subsection Options

@table @option

@item video_size
Set the video size given as a string such as 640x480 or hd720.
Falls back to the first available configuration reported by
Android if the requested video size is not available, or by default.

@item framerate
Set the video framerate.
Falls back to the first available configuration reported by
Android if the requested framerate is not available, or by default (-1).

@item camera_index
Set the index of the camera to use. Default is 0.

@item input_queue_size
Set the maximum number of frames to buffer. Default is 5.

@end table
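For instance, to capture from the front facing camera at hd720, a sketch
(since the input file string is discarded, any placeholder after
@option{-i} works):
@example
ffmpeg -f android_camera -camera_index 1 -video_size hd720 -framerate 30 -i dummy out.mp4
@end example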
@section avfoundation

AVFoundation input device.

AVFoundation is the currently recommended framework by Apple for stream grabbing on OSX >= 10.7 as well as on iOS.

The input filename has to be given in the following syntax:
@example
-i "[[VIDEO]:[AUDIO]]"
@end example

The first entry selects the video input while the latter selects the audio input.
The stream has to be specified by the device name or the device index as shown by the device list.
Alternatively, the video and/or audio input device can be chosen by index using
@option{-video_device_index <INDEX>} and/or @option{-audio_device_index <INDEX>},
overriding any device name or index given in the input filename.

All available devices can be enumerated by using @option{-list_devices true}, listing
all device names and corresponding indices.

There are two device name aliases:
@table @code

@item default
Select the AVFoundation default device of the corresponding type.

@item none
Do not record the corresponding media type.
This is equivalent to specifying an empty device name or index.

@end table

@subsection Options

AVFoundation supports the following options:

@table @option

@item -list_devices <TRUE|FALSE>
If set to true, a list of all available input devices is given showing all
device names and indices.

@item -video_device_index <INDEX>
Specify the video device by its index. Overrides anything given in the input filename.

@item -audio_device_index <INDEX>
Specify the audio device by its index. Overrides anything given in the input filename.

@item -pixel_format <FORMAT>
Request the video device to use a specific pixel format.
If the specified format is not supported, a list of available formats is given
and the first one in this list is used instead. Available pixel formats are:
@code{monob, rgb555be, rgb555le, rgb565be, rgb565le, rgb24, bgr24, 0rgb, bgr0, 0bgr, rgb0,
bgr48be, uyvy422, yuva444p, yuva444p16le, yuv444p, yuv422p16, yuv422p10, yuv444p10,
yuv420p, nv12, yuyv422, gray}

@item -framerate
Set the grabbing frame rate. Default is @code{ntsc}, corresponding to a
frame rate of @code{30000/1001}.

@item -video_size
Set the video frame size.

@item -capture_cursor
Capture the mouse pointer. Default is 0.

@item -capture_mouse_clicks
Capture the screen mouse clicks. Default is 0.

@item -capture_raw_data
Capture the raw device data. Default is 0.
Using this option may result in receiving the underlying data delivered to the AVFoundation framework. E.g. for muxed devices that send raw DV data to the framework (like tape-based camcorders), setting this option to false results in extracted video frames captured in the designated pixel format only. Setting this option to true results in receiving the raw DV stream untouched.
@end table

@subsection Examples

@itemize

@item
Print the list of AVFoundation supported devices and exit:
@example
$ ffmpeg -f avfoundation -list_devices true -i ""
@end example

@item
Record video from video device 0 and audio from audio device 0 into out.avi:
@example
$ ffmpeg -f avfoundation -i "0:0" out.avi
@end example

@item
Record video from video device 2 and audio from audio device 1 into out.avi:
@example
$ ffmpeg -f avfoundation -video_device_index 2 -i ":1" out.avi
@end example

@item
Record video from the system default video device using the pixel format bgr0 and do not record any audio into out.avi:
@example
$ ffmpeg -f avfoundation -pixel_format bgr0 -i "default:none" out.avi
@end example

@item
Record raw DV data from a suitable input device and write the output into out.dv:
@example
$ ffmpeg -f avfoundation -capture_raw_data true -i "zr100:none" out.dv
@end example

@end itemize
@section bktr

BSD video input device.

@subsection Options

@table @option

@item framerate
Set the frame rate.

@item video_size
Set the video frame size. Default is @code{vga}.

@item standard
Available values are:
@table @samp
@item pal
@item ntsc
@item secam
@item paln
@item palm
@item ntscj
@end table

@end table
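As a sketch, capturing NTSC video at the default VGA size with
@command{ffmpeg} could look as follows (@file{/dev/bktr0} is assumed to
be the bktr device node on your BSD system):
@example
ffmpeg -f bktr -standard ntsc -framerate 29.97 -i /dev/bktr0 out.avi
@end example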
@section decklink

The decklink input device provides capture capabilities for Blackmagic
DeckLink devices.

To enable this input device, you need the Blackmagic DeckLink SDK and you
need to configure with the appropriate @code{--extra-cflags}
and @code{--extra-ldflags}.
On Windows, you need to run the IDL files through @command{widl}.

DeckLink is very picky about the formats it supports. Pixel format of the
input can be set with @option{raw_format}.
Framerate and video size must be determined for your device with
@command{-list_formats 1}. Audio sample rate is always 48 kHz and the number
of channels can be 2, 8 or 16. Note that all audio channels are bundled in one single
audio track.

@subsection Options

@table @option

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}. This option is deprecated, please use the
@code{-sources} option of ffmpeg to list the available input devices.

@item list_formats
If set to @option{true}, print a list of supported formats and exit.
Defaults to @option{false}.

@item format_code <FourCC>
This sets the input video format to the format given by the FourCC. To see
the supported values of your device(s) use @option{list_formats}.
Note that there is a FourCC @option{'pal '} that can also be used
as @option{pal} (3 letters).
Default behavior is autodetection of the input video format, if the hardware
supports it.

@item raw_format
Set the pixel format of the captured video.
Available values are:
@table @samp
@item uyvy422
@item yuv422p10
@item argb
@item bgra
@item rgb10
@end table

@item teletext_lines
If set to nonzero, an additional teletext stream will be captured from the
vertical ancillary data. Both SD PAL (576i) and HD (1080i or 1080p)
sources are supported. In case of HD sources, OP47 packets are decoded.

This option is a bitmask of the SD PAL VBI lines captured, specifically lines 6
to 22, and lines 318 to 335. Line 6 is the LSB in the mask. Selected lines
which do not contain teletext information will be ignored. You can use the
special @option{all} constant to select all possible lines, or
@option{standard} to skip lines 6, 318 and 319, which are not compatible with
all receivers.

For SD sources, ffmpeg needs to be compiled with @code{--enable-libzvbi}. For
HD sources, on older (pre-4K) DeckLink card models you have to capture in 10
bit mode.

@item channels
Defines number of audio channels to capture. Must be @samp{2}, @samp{8} or @samp{16}.
Defaults to @samp{2}.

@item duplex_mode
Sets the decklink device duplex mode. Must be @samp{unset}, @samp{half} or @samp{full}.
Defaults to @samp{unset}.

@item timecode_format
Timecode type to include in the frame and video stream metadata. Must be
@samp{none}, @samp{rp188vitc}, @samp{rp188vitc2}, @samp{rp188ltc},
@samp{rp188any}, @samp{vitc}, @samp{vitc2}, or @samp{serial}. Defaults to
@samp{none} (not included).

@item video_input
Sets the video input source. Must be @samp{unset}, @samp{sdi}, @samp{hdmi},
@samp{optical_sdi}, @samp{component}, @samp{composite} or @samp{s_video}.
Defaults to @samp{unset}.

@item audio_input
Sets the audio input source. Must be @samp{unset}, @samp{embedded},
@samp{aes_ebu}, @samp{analog}, @samp{analog_xlr}, @samp{analog_rca} or
@samp{microphone}. Defaults to @samp{unset}.

@item video_pts
Sets the video packet timestamp source. Must be @samp{video}, @samp{audio},
@samp{reference}, @samp{wallclock} or @samp{abs_wallclock}.
Defaults to @samp{video}.

@item audio_pts
Sets the audio packet timestamp source. Must be @samp{video}, @samp{audio},
@samp{reference}, @samp{wallclock} or @samp{abs_wallclock}.
Defaults to @samp{audio}.

@item draw_bars
If set to @samp{true}, color bars are drawn in the event of a signal loss.
Defaults to @samp{true}.

@item queue_size
Sets maximum input buffer size in bytes. If the buffering reaches this value,
incoming frames will be dropped.
Defaults to @samp{1073741824}.

@item audio_depth
Sets the audio sample bit depth. Must be @samp{16} or @samp{32}.
Defaults to @samp{16}.

@item decklink_copyts
If set to @option{true}, timestamps are forwarded as they are without removing
the initial offset.
Defaults to @option{false}.

@item timestamp_align
Capture start time alignment in seconds. If set to nonzero, input frames are
dropped till the system timestamp aligns with configured value.
Alignment difference of up to one frame duration is tolerated.
This is useful for maintaining input synchronization across N different
hardware devices deployed for 'N-way' redundancy. The system time of different
hardware devices should be synchronized with protocols such as NTP or PTP,
before using this option.
Note that this method is not foolproof. In some border cases input
synchronization may not happen due to thread scheduling jitters in the OS.
Either sync could go wrong by 1 frame or in a rarer case
@option{timestamp_align} seconds.
Defaults to @samp{0}.

@item wait_for_tc (@emph{bool})
Drop frames till a frame with timecode is received. Sometimes serial timecode
isn't received with the first input frame. If that happens, the stored stream
timecode will be inaccurate. If this option is set to @option{true}, input frames
are dropped till a frame with timecode is received.
Option @var{timecode_format} must be specified.
Defaults to @option{false}.

@item enable_klv (@emph{bool})
If set to @option{true}, extracts KLV data from VANC and outputs KLV packets.
KLV VANC packets are joined based on MID and PSC fields and aggregated into
one KLV packet.
Defaults to @option{false}.

@end table
@subsection Examples

@itemize

@item
List input devices:
@example
ffmpeg -sources decklink
@end example

@item
List supported formats:
@example
ffmpeg -f decklink -list_formats 1 -i 'Intensity Pro'
@end example

@item
Capture video clip at 1080i50:
@example
ffmpeg -format_code Hi50 -f decklink -i 'Intensity Pro' -c:a copy -c:v copy output.avi
@end example

@item
Capture video clip at 1080i50 10 bit:
@example
ffmpeg -raw_format yuv422p10 -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example

@item
Capture video clip at 1080i50 with 16 audio channels:
@example
ffmpeg -channels 16 -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example

@end itemize
@section dshow

Windows DirectShow input device.

DirectShow support is enabled when FFmpeg is built with the mingw-w64 project.
Currently only audio and video devices are supported.

Multiple devices may be opened as separate inputs, but they may also be
opened on the same input, which should improve synchronism between them.

The input name should be in the format:
@example
@var{TYPE}=@var{NAME}[:@var{TYPE}=@var{NAME}]
@end example

where @var{TYPE} can be either @var{audio} or @var{video},
and @var{NAME} is the device's name or alternative name.

@subsection Options

If no options are specified, the device's defaults are used.
If the device does not support the requested options, it will
fail to open.

@table @option

@item video_size
Set the video size in the captured video.

@item framerate
Set the frame rate in the captured video.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.

@item sample_size
Set the sample size (in bits) of the captured audio.

@item channels
Set the number of channels in the captured audio.

@item list_devices
If set to @option{true}, print a list of devices and exit.

@item list_options
If set to @option{true}, print a list of selected device's options
and exit.

@item video_device_number
Set video device number for devices with the same name (starts at 0,
defaults to 0).

@item audio_device_number
Set audio device number for devices with the same name (starts at 0,
defaults to 0).

@item pixel_format
Select pixel format to be used by DirectShow. This may only be set when
the video codec is not set or set to rawvideo.

@item audio_buffer_size
Set audio device buffer size in milliseconds (which can directly
impact latency, depending on the device).
Defaults to using the audio device's
default buffer size (typically some multiple of 500ms).
Setting this value too low can degrade performance.
See also
@url{http://msdn.microsoft.com/en-us/library/windows/desktop/dd377582(v=vs.85).aspx}

@item video_pin_name
Select video capture pin to use by name or alternative name.

@item audio_pin_name
Select audio capture pin to use by name or alternative name.

@item crossbar_video_input_pin_number
Select video input pin number for crossbar device. This will be
routed to the crossbar device's Video Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.

@item crossbar_audio_input_pin_number
Select audio input pin number for crossbar device. This will be
routed to the crossbar device's Audio Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.

@item show_video_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change video filter properties
and configurations manually.
Note that for crossbar devices, adjusting values in this dialog
may be needed at times to toggle between PAL (25 fps) and NTSC (29.97)
input frame rates, sizes, interlacing, etc. Changing these values can
enable different scan rates/frame rates and avoiding green bars at
the bottom, flickering scan lines, etc.
Note that with some devices, changing these properties can also affect future
invocations (sets new defaults) until system reboot occurs.

@item show_audio_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change audio filter properties
and configurations manually.

@item show_video_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens a video device.

@item show_audio_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens an audio device.

@item show_analog_tv_tuner_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV channels and frequencies.

@item show_analog_tv_tuner_audio_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV audio (like mono vs. stereo, Language A, B or C).

@item audio_device_load
Load an audio capture filter device from file instead of searching
it by name. It may load additional parameters too, if the filter
supports the serialization of its properties to a file.
To use this an audio capture source has to be specified, but it can
be anything, even a fake one.

@item audio_device_save
Save the currently used audio capture filter device and its
parameters (if the filter supports it) to a file.
If a file with the same name exists it will be overwritten.

@item video_device_load
Load a video capture filter device from file instead of searching
it by name. It may load additional parameters too, if the filter
supports the serialization of its properties to a file.
To use this a video capture source has to be specified, but it can
be anything, even a fake one.

@item video_device_save
Save the currently used video capture filter device and its
parameters (if the filter supports it) to a file.
If a file with the same name exists it will be overwritten.

@end table
@subsection Examples

@itemize

@item
Print the list of DirectShow supported devices and exit:
@example
$ ffmpeg -list_devices true -f dshow -i dummy
@end example

@item
Open video device @var{Camera}:
@example
$ ffmpeg -f dshow -i video="Camera"
@end example

@item
Open second video device with name @var{Camera}:
@example
$ ffmpeg -f dshow -video_device_number 1 -i video="Camera"
@end example

@item
Open video device @var{Camera} and audio device @var{Microphone}:
@example
$ ffmpeg -f dshow -i video="Camera":audio="Microphone"
@end example

@item
Print the list of supported options in selected device and exit:
@example
$ ffmpeg -list_options true -f dshow -i video="Camera"
@end example

@item
Specify pin names to capture by name or alternative name, specify alternative device name:
@example
$ ffmpeg -f dshow -audio_pin_name "Audio Out" -video_pin_name 2 -i video=video="@@device_pnp_\\?\pci#ven_1a0a&dev_6200&subsys_62021461&rev_01#4&e2c7dd6&0&00e1#@{65e8773d-8f56-11d0-a3b9-00a0c9223196@}\@{ca465100-deb0-4d59-818f-8c477184adf6@}":audio="Microphone"
@end example

@item
Configure a crossbar device, specifying crossbar pins, allow user to adjust video capture properties at startup:
@example
$ ffmpeg -f dshow -show_video_device_dialog true -crossbar_video_input_pin_number 0
     -crossbar_audio_input_pin_number 3 -i video="AVerMedia BDA Analog Capture":audio="AVerMedia BDA Analog Capture"
@end example

@end itemize
@section fbdev

Linux framebuffer input device.

The Linux framebuffer is a graphic hardware-independent abstraction
layer to show graphics on a computer monitor, typically on the
console. It is accessed through a file device node, usually
@file{/dev/fb0}.

For more detailed information read the file
Documentation/fb/framebuffer.txt included in the Linux source tree.

See also @url{http://linux-fbdev.sourceforge.net/}, and fbset(1).

To record from the framebuffer device @file{/dev/fb0} with
@command{ffmpeg}:
@example
ffmpeg -f fbdev -framerate 10 -i /dev/fb0 out.avi
@end example

You can take a single screenshot image with the command:
@example
ffmpeg -f fbdev -framerate 1 -i /dev/fb0 -frames:v 1 screenshot.jpeg
@end example

@subsection Options

@table @option

@item framerate
Set the frame rate. Default is 25.

@end table
@section gdigrab

Win32 GDI-based screen capture device.

This device allows you to capture a region of the display on Windows.

There are two options for the input filename:
@example
desktop
@end example
or
@example
title=@var{window_title}
@end example

The first option will capture the entire desktop, or a fixed region of the
desktop. The second option will instead capture the contents of a single
window, regardless of its position on the screen.

For example, to grab the entire desktop using @command{ffmpeg}:
@example
ffmpeg -f gdigrab -framerate 6 -i desktop out.mpg
@end example

Grab a 640x480 region at position @code{10,20}:
@example
ffmpeg -f gdigrab -framerate 6 -offset_x 10 -offset_y 20 -video_size vga -i desktop out.mpg
@end example

Grab the contents of the window named "Calculator":
@example
ffmpeg -f gdigrab -framerate 6 -i title=Calculator out.mpg
@end example

@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. Use the value @code{0} to
not draw the pointer. Default value is @code{1}.

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

Note that @var{show_region} is incompatible with grabbing the contents
of a single window.

For example:
@example
ffmpeg -f gdigrab -show_region 1 -framerate 6 -video_size cif -offset_x 10 -offset_y 20 -i desktop out.mpg
@end example

@item video_size
Set the video frame size. The default is to capture the full screen if @file{desktop} is selected, or the full window size if @file{title=@var{window_title}} is selected.

@item offset_x
When capturing a region with @var{video_size}, set the distance from the left edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned to the left of your primary monitor, you will need to use a negative @var{offset_x} value to move the region to that monitor.

@item offset_y
When capturing a region with @var{video_size}, set the distance from the top edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned above your primary monitor, you will need to use a negative @var{offset_y} value to move the region to that monitor.

@end table
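Putting the offset rules above together, capturing a monitor to the left
of the primary one requires a negative @var{offset_x}. A sketch, assuming
a 1920x1080 monitor placed immediately to the left of the primary
display:
@example
ffmpeg -f gdigrab -framerate 6 -offset_x -1920 -offset_y 0 -video_size 1920x1080 -i desktop out.mpg
@end example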
@section iec61883

FireWire DV/HDV input device using libiec61883.

To enable this input device, you need libiec61883, libraw1394 and
libavc1394 installed on your system. Use the configure option
@code{--enable-libiec61883} to compile with the device enabled.

The iec61883 capture device supports capturing from a video device
connected via IEEE1394 (FireWire), using libiec61883 and the new Linux
FireWire stack (juju). This is the default DV/HDV input method in Linux
Kernel 2.6.37 and later, since the old FireWire stack was removed.

Specify the FireWire port to be used as input file, or "auto"
to choose the first port connected.

@subsection Options

@table @option

@item dvtype
Override autodetection of DV/HDV. This should only be used if auto
detection does not work, or if usage of a different device type
should be prohibited. Treating a DV device as HDV (or vice versa) will
not work and result in undefined behavior.
The values @option{auto}, @option{dv} and @option{hdv} are supported.

@item dvbuffer
Set maximum size of buffer for incoming data, in frames. For DV, this
is an exact value. For HDV, it is not frame exact, since HDV does
not have a fixed frame size.

@item dvguid
Select the capture device by specifying its GUID. Capturing will only
be performed from the specified device and fails if no device with the
given GUID is found. This is useful to select the input if multiple
devices are connected at the same time.
Look at /sys/bus/firewire/devices to find out the GUIDs.

@end table

@subsection Examples

@itemize

@item
Grab and show the input of a FireWire DV/HDV device.
@example
ffplay -f iec61883 -i auto
@end example

@item
Grab and record the input of a FireWire DV/HDV device,
using a packet buffer of 100000 packets if the source is HDV.
@example
ffmpeg -f iec61883 -i auto -dvbuffer 100000 out.mpg
@end example

@end itemize
@section jack

JACK input device.

To enable this input device during configuration you need libjack
installed on your system.

A JACK input device creates one or more JACK writable clients, one for
each audio channel, with name @var{client_name}:input_@var{N}, where
@var{client_name} is the name provided by the application, and @var{N}
is a number which identifies the channel.
Each writable client will send the acquired data to the FFmpeg input
device.

Once you have created one or more JACK readable clients, you need to
connect them to one or more JACK writable clients.

To connect or disconnect JACK clients you can use the @command{jack_connect}
and @command{jack_disconnect} programs, or do it through a graphical interface,
for example with @command{qjackctl}.

To list the JACK clients and their properties you can invoke the command
@command{jack_lsp}.

The following example shows how to capture a JACK readable client
with @command{ffmpeg}.
@example
# Create a JACK writable client with name "ffmpeg".
$ ffmpeg -f jack -i ffmpeg -y out.wav

# Start the sample jack_metro readable client.
$ jack_metro -b 120 -d 0.2 -f 4000

# List the current JACK clients.
$ jack_lsp -c
system:capture_1
system:capture_2
system:playback_1
system:playback_2
ffmpeg:input_1
metro:120_bpm

# Connect metro to the ffmpeg writable client.
$ jack_connect metro:120_bpm ffmpeg:input_1
@end example

For more information read:
@url{http://jackaudio.org/}

@subsection Options

@table @option

@item channels
Set the number of channels. Default is 2.

@end table
@section kmsgrab

KMS video input device.

Captures the KMS scanout framebuffer associated with a specified CRTC or plane as a
DRM object that can be passed to other hardware functions.

Requires either DRM master or CAP_SYS_ADMIN to run.

If you don't understand what all of that means, you probably don't want this. Look at
@option{x11grab} instead.

@subsection Options

@table @option

@item device
DRM device to capture on. Defaults to @option{/dev/dri/card0}.

@item format
Pixel format of the framebuffer. This can be autodetected if you are running Linux 5.7
or later, but needs to be provided for earlier versions. Defaults to @option{bgr0},
which is the most common format used by the Linux console and Xorg X server.

@item format_modifier
Format modifier to signal on output frames. This is necessary to import correctly into
some APIs. It can be autodetected if you are running Linux 5.7 or later, but will need
to be provided explicitly when needed in earlier versions. See the libdrm documentation
for possible values.

@item crtc_id
KMS CRTC ID to define the capture source. The first active plane on the given CRTC
will be used.

@item plane_id
KMS plane ID to define the capture source. Defaults to the first active plane found if
neither @option{crtc_id} nor @option{plane_id} are specified.

@item framerate
Framerate to capture at. This is not synchronised to any page flipping or framebuffer
changes - it just defines the interval at which the framebuffer is sampled. Sampling
faster than the framebuffer update rate will generate independent frames with the same
content. Defaults to @code{30}.

@end table

@subsection Examples

@itemize

@item
Capture from the first active plane, download the result to normal frames and encode.
This will only work if the framebuffer is both linear and mappable - if not, the result
may be scrambled or fail to download.
@example
ffmpeg -f kmsgrab -i - -vf 'hwdownload,format=bgr0' output.mp4
@end example

@item
Capture from CRTC ID 42 at 60fps, map the result to VAAPI, convert to NV12 and encode as H.264.
@example
ffmpeg -crtc_id 42 -framerate 60 -f kmsgrab -i - -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1920:h=1080:format=nv12' -c:v h264_vaapi output.mp4
@end example

@item
To capture only part of a plane the output can be cropped - this can be used to capture
a single window, as long as it has a known absolute position and size. For example, to
capture and encode the middle quarter of a 1920x1080 plane:
@example
ffmpeg -f kmsgrab -i - -vf 'hwmap=derive_device=vaapi,crop=960:540:480:270,scale_vaapi=960:540:nv12' -c:v h264_vaapi output.mp4
@end example
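
@item
On kernels before 5.7 the framebuffer format cannot be autodetected, so
it must be supplied explicitly (a sketch; @code{bgr0} is assumed to
match the actual framebuffer format):
@example
ffmpeg -f kmsgrab -format bgr0 -framerate 30 -i - -vf 'hwdownload,format=bgr0' output.mp4
@end example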

@end itemize
@section lavfi

Libavfilter input virtual device.

This input device reads data from the open output pads of a libavfilter
filtergraph.

For each filtergraph open output, the input device will create a
corresponding stream which is mapped to the generated output. The
filtergraph is specified through the option @option{graph}.

@subsection Options

@table @option

@item graph
Specify the filtergraph to use as input. Each video open output must be
labelled by a unique string of the form "out@var{N}", where @var{N} is a
number starting from 0 corresponding to the mapped input stream
generated by the device.
The first unlabelled output is automatically assigned to the "out0"
label, but all the others need to be specified explicitly.

The suffix "+subcc" can be appended to the output label to create an extra
stream with the closed captions packets attached to that output
(experimental; only for EIA-608 / CEA-708 for now).
The subcc streams are created after all the normal streams, in the order of
the corresponding stream.
For example, if there is "out19+subcc", "out7+subcc" and up to "out42", the
stream #43 is subcc for stream #7 and stream #44 is subcc for stream #19.

If not specified defaults to the filename specified for the input
device.

@item graph_file
Set the filename of the filtergraph to be read and sent to the other
filters. Syntax of the filtergraph is the same as the one specified by
the option @var{graph}.

@item dumpgraph
Dump graph to stderr.

@end table

@subsection Examples

@itemize
@item
Create a color video stream and play it back with @command{ffplay}:
@example
ffplay -f lavfi -graph "color=c=pink [out0]" dummy
@end example

@item
As in the previous example, but use the filename for specifying the graph
description, and omit the "out0" label:
@example
ffplay -f lavfi color=c=pink
@end example

@item
Create three different video test filtered sources and play them:
@example
ffplay -f lavfi -graph "testsrc [out0]; testsrc,hflip [out1]; testsrc,negate [out2]" test3
@end example

@item
Read an audio stream from a file using the amovie source and play it
back with @command{ffplay}:
@example
ffplay -f lavfi "amovie=test.wav"
@end example

@item
Read an audio stream and a video stream and play it back with
@command{ffplay}:
@example
ffplay -f lavfi "movie=test.avi[out0];amovie=test.wav[out1]"
@end example

@item
Dump decoded frames to images and closed captions to a file (experimental):
@example
ffmpeg -f lavfi -i "movie=test.ts[out0+subcc]" -map v frame%08d.png -map s -c copy -f rawvideo subcc.bin
@end example
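
@item
Read the graph description from a file with the @option{graph_file}
option instead of passing it on the command line (the filename
@file{graph.txt} is arbitrary):
@example
echo "color=c=pink [out0]" > graph.txt
ffplay -f lavfi -graph_file graph.txt dummy
@end example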

@end itemize

@section libcdio

Audio-CD input device based on libcdio.

To enable this input device during configuration you need libcdio
installed on your system. It requires the configure option
@code{--enable-libcdio}.

This device allows playing and grabbing from an Audio-CD.

For example to copy with @command{ffmpeg} the entire Audio-CD in @file{/dev/sr0},
you may run the command:
@example
ffmpeg -f libcdio -i /dev/sr0 cd.wav
@end example

@subsection Options
@table @option

@item speed
Set drive reading speed. Default value is 0.

The speed is specified in CD-ROM speed units. The speed is set through
the libcdio @code{cdio_cddap_speed_set} function. On many CD-ROM
drives, specifying a value too large will result in using the fastest
speed.

@item paranoia_mode
Set paranoia recovery mode flags. It accepts one of the following values:

@table @samp
@item disable
@item verify
@item overlap
@item neverskip
@item full
@end table

Default value is @samp{disable}.

For more information about the available recovery modes, consult the
paranoia project documentation.

@end table
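
The two options can be combined; for example, to rip the disc at 4x
speed with full paranoia recovery enabled (a sketch, assuming the drive
honours the requested speed):
@example
ffmpeg -f libcdio -speed 4 -paranoia_mode full -i /dev/sr0 cd.wav
@end example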

@section libdc1394

IIDC1394 input device, based on libdc1394 and libraw1394.

Requires the configure option @code{--enable-libdc1394}.

@subsection Options

@table @option

@item framerate
Set the frame rate. Default is @code{ntsc}, corresponding to a frame
rate of @code{30000/1001}.

@item pixel_format
Select the pixel format. Default is @code{uyvy422}.

@item video_size
Set the video size given as a string such as @code{640x480} or @code{hd720}.
Default is @code{qvga}.

@end table

@section openal

The OpenAL input device provides audio capture on all systems with a
working OpenAL 1.1 implementation.

To enable this input device during configuration, you need OpenAL
headers and libraries installed on your system, and need to configure
FFmpeg with @code{--enable-openal}.

OpenAL headers and libraries should be provided as part of your OpenAL
implementation, or as an additional download (an SDK). Depending on your
installation you may need to specify additional flags via the
@code{--extra-cflags} and @code{--extra-ldflags} for allowing the build
system to locate the OpenAL headers and libraries.

An incomplete list of OpenAL implementations follows:

@table @strong
@item Creative
The official Windows implementation, providing hardware acceleration
with supported devices and software fallback.
See @url{http://openal.org/}.
@item OpenAL Soft
Portable, open source (LGPL) software implementation. Includes
backends for the most common sound APIs on the Windows, Linux,
Solaris, and BSD operating systems.
See @url{http://kcat.strangesoft.net/openal.html}.
@item Apple
OpenAL is part of Core Audio, the official Mac OS X Audio interface.
See @url{http://developer.apple.com/technologies/mac/audio-and-video.html}
@end table

This device allows one to capture from an audio input device handled
through OpenAL.

You need to specify the name of the device to capture in the provided
filename. If the empty string is provided, the device will
automatically select the default device. You can get the list of the
supported devices by using the option @var{list_devices}.

@subsection Options

@table @option

@item channels
Set the number of channels in the captured audio. Only the values
@option{1} (monaural) and @option{2} (stereo) are currently supported.
Defaults to @option{2}.

@item sample_size
Set the sample size (in bits) of the captured audio. Only the values
@option{8} and @option{16} are currently supported. Defaults to
@option{16}.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.
Defaults to @option{44.1k}.

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}.

@end table

@subsection Examples

Print the list of OpenAL supported devices and exit:
@example
$ ffmpeg -list_devices true -f openal -i dummy out.ogg
@end example

Capture from the OpenAL device @file{DR-BT101 via PulseAudio}:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out.ogg
@end example

Capture from the default device (note the empty string '' as filename):
@example
$ ffmpeg -f openal -i '' out.ogg
@end example

Capture from two devices simultaneously, writing to two different files,
within the same @command{ffmpeg} command:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out1.ogg -f openal -i 'ALSA Default' out2.ogg
@end example

Note: not all OpenAL implementations support multiple simultaneous capture -
try the latest OpenAL Soft if the above does not work.
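
The capture parameters can also be set explicitly; for example, to
record 8-bit mono audio at 22050 Hz from the default device (the values
are purely illustrative):
@example
$ ffmpeg -f openal -channels 1 -sample_size 8 -sample_rate 22050 -i '' out.wav
@end example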

@section oss

Open Sound System input device.

The filename to provide to the input device is the device node
representing the OSS input device, and is usually set to
@file{/dev/dsp}.

For example to grab from @file{/dev/dsp} using @command{ffmpeg} use the
command:
@example
ffmpeg -f oss -i /dev/dsp /tmp/oss.wav
@end example

For more information about OSS see:
@url{http://manuals.opensound.com/usersguide/dsp.html}

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table
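
For example, to grab mono audio at 44100 Hz instead of the defaults
(values chosen for illustration):
@example
ffmpeg -f oss -sample_rate 44100 -channels 1 -i /dev/dsp /tmp/oss.wav
@end example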

@section pulse

PulseAudio input device.

To enable this input device you need to configure FFmpeg with @code{--enable-libpulse}.

The filename to provide to the input device is a source device or the
string "default".

To list the PulseAudio source devices and their properties you can invoke
the command @command{pactl list sources}.

More information about PulseAudio can be found on @url{http://www.pulseaudio.org}.

@subsection Options

@table @option

@item server
Connect to a specific PulseAudio server, specified by an IP address.
Default server is used when not provided.

@item name
Specify the application name PulseAudio will use when showing active clients,
by default it is the @code{LIBAVFORMAT_IDENT} string.

@item stream_name
Specify the stream name PulseAudio will use when showing active streams,
by default it is "record".

@item sample_rate
Specify the sample rate in Hz, by default 48kHz is used.

@item channels
Specify the channels in use, by default 2 (stereo) is set.

@item frame_size
Specify the number of bytes per frame, by default it is set to 1024.

@item fragment_size
Specify the minimal buffering fragment in PulseAudio, it will affect the
audio latency. By default it is unset.

@item wallclock
Set the initial PTS using the current time. Default is 1.

@end table

@subsection Examples

Record a stream from default device:
@example
ffmpeg -f pulse -i default /tmp/pulse.wav
@end example
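
To record from a specific source rather than the default, pass its name
as reported by @command{pactl list sources} (the source name below is
hypothetical):
@example
ffmpeg -f pulse -i alsa_input.pci-0000_00_1b.0.analog-stereo /tmp/pulse.wav
@end example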

@section sndio

sndio input device.

To enable this input device during configuration you need libsndio
installed on your system.

The filename to provide to the input device is the device node
representing the sndio input device, and is usually set to
@file{/dev/audio0}.

For example to grab from @file{/dev/audio0} using @command{ffmpeg} use the
command:
@example
ffmpeg -f sndio -i /dev/audio0 /tmp/sndio.wav
@end example

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table

@section video4linux2, v4l2

Video4Linux2 input video device.

"v4l2" can be used as alias for "video4linux2".

If FFmpeg is built with v4l-utils support (by using the
@code{--enable-libv4l2} configure option), it is possible to use it with the
@code{-use_libv4l2} input device option.

The name of the device to grab is a file device node. Usually Linux
systems automatically create such nodes when the device
(e.g. a USB webcam) is plugged into the system; the node has a name of the
kind @file{/dev/video@var{N}}, where @var{N} is a number associated to
the device.

Video4Linux2 devices usually support a limited set of
@var{width}x@var{height} sizes and frame rates. You can check which are
supported using @command{-list_formats all} for Video4Linux2 devices.
Some devices, like TV cards, support one or more standards. It is possible
to list all the supported standards using @command{-list_standards all}.

The time base for the timestamps is 1 microsecond. Depending on the kernel
version and configuration, the timestamps may be derived from the real time
clock (origin at the Unix Epoch) or the monotonic clock (origin usually at
boot time, unaffected by NTP or manual changes to the clock). The
@option{-timestamps abs} or @option{-ts abs} option can be used to force
conversion into the real time clock.

Some usage examples of the video4linux2 device with @command{ffmpeg}
and @command{ffplay}:

@itemize
@item
List supported formats for a video4linux2 device:
@example
ffplay -f video4linux2 -list_formats all /dev/video0
@end example

@item
Grab and show the input of a video4linux2 device:
@example
ffplay -f video4linux2 -framerate 30 -video_size hd720 /dev/video0
@end example

@item
Grab and record the input of a video4linux2 device, leave the
frame rate and size as previously set:
@example
ffmpeg -f video4linux2 -input_format mjpeg -i /dev/video0 out.mpeg
@end example
@end itemize

For more information about Video4Linux, check @url{http://linuxtv.org/}.

@subsection Options

@table @option

@item standard
Set the standard. Must be the name of a supported standard. To get a
list of the supported standards, use the @option{list_standards}
option.

@item channel
Set the input channel number. Defaults to -1, which means using the
previously selected channel.

@item video_size
Set the video frame size. The argument must be a string in the form
@var{WIDTH}x@var{HEIGHT} or a valid size abbreviation.

@item pixel_format
Select the pixel format (only valid for raw video input).

@item input_format
Set the preferred pixel format (for raw video) or a codec name.
This option allows one to select the input format, when several are
available.

@item framerate
Set the preferred video frame rate.

@item list_formats
List available formats (supported pixel formats, codecs, and frame
sizes) and exit.

Available values are:
@table @samp
@item all
Show all available (compressed and non-compressed) formats.

@item raw
Show only raw video (non-compressed) formats.

@item compressed
Show only compressed formats.
@end table

@item list_standards
List supported standards and exit.

Available values are:
@table @samp
@item all
Show all supported standards.
@end table

@item timestamps, ts
Set type of timestamps for grabbed frames.

Available values are:
@table @samp
@item default
Use timestamps from the kernel.

@item abs
Use absolute timestamps (wall clock).

@item mono2abs
Force conversion from monotonic to absolute timestamps.
@end table

Default value is @code{default}.

@item use_libv4l2
Use libv4l2 (v4l-utils) conversion functions. Default is 0.

@end table
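
For example, to request a particular size and frame rate and force
absolute (wall clock) timestamps while grabbing (a sketch; the device
must support the requested mode):
@example
ffmpeg -f video4linux2 -ts abs -framerate 25 -video_size 640x480 -i /dev/video0 out.mpg
@end example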

@section vfwcap

VfW (Video for Windows) capture input device.

The filename passed as input is the capture driver number, ranging from
0 to 9. You may use "list" as filename to print a list of drivers. Any
other filename will be interpreted as device number 0.

@subsection Options

@table @option

@item video_size
Set the video frame size.

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@end table
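
For example, to list the available capture drivers and then record from
driver number 0 (size and rate are illustrative):
@example
ffmpeg -f vfwcap -i list
ffmpeg -f vfwcap -framerate 30 -video_size 640x480 -i 0 out.avi
@end example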

@section x11grab

X11 video input device.

To enable this input device during configuration you need libxcb
installed on your system. It will be automatically detected during
configuration.

This device allows one to capture a region of an X11 display.

The filename passed as input has the syntax:
@example
[@var{hostname}]:@var{display_number}.@var{screen_number}[+@var{x_offset},@var{y_offset}]
@end example

@var{hostname}:@var{display_number}.@var{screen_number} specifies the
X11 display name of the screen to grab from. @var{hostname} can be
omitted, and defaults to "localhost". The environment variable
@env{DISPLAY} contains the default display name.

@var{x_offset} and @var{y_offset} specify the offsets of the grabbed
area with respect to the top-left border of the X11 screen. They
default to 0.

Check the X11 documentation (e.g. @command{man X}) for more detailed
information.

Use the @command{xdpyinfo} program for getting basic information about
the properties of your X11 display (e.g. grep for "name" or
"dimensions").

For example to grab from @file{:0.0} using @command{ffmpeg}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

Grab at position @code{10,20}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. A value of @code{0} specifies
not to draw the pointer. Default value is @code{1}.

@item follow_mouse
Make the grabbed area follow the mouse. The argument can be
@code{centered} or a number of pixels @var{PIXELS}.
When it is specified with "centered", the grabbing region follows the mouse
pointer and keeps the pointer at the center of the region; otherwise, the region
follows only when the mouse pointer comes within @var{PIXELS} (greater than
zero) of the edge of the region.

For example:
@example
ffmpeg -f x11grab -follow_mouse centered -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

To follow only when the mouse pointer comes within 100 pixels of the edge:
@example
ffmpeg -f x11grab -follow_mouse 100 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

@item region_border
Set the region border thickness if @option{-show_region 1} is used.
Range is 1 to 128 and default is 3 (XCB-based x11grab only).

For example:
@example
ffmpeg -f x11grab -show_region 1 -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

With @var{follow_mouse}:
@example
ffmpeg -f x11grab -follow_mouse centered -show_region 1 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item video_size
Set the video frame size. Default is the full desktop.

@item grab_x
@item grab_y
Set the grabbing region coordinates. They are expressed as offset from
the top left corner of the X11 window and correspond to the
@var{x_offset} and @var{y_offset} parameters in the device name. The
default value for both options is 0.

@end table
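
For example, @option{grab_x} and @option{grab_y} can be used in place of
the offset in the device name; the following is equivalent to grabbing
from @file{:0.0+10,20}:
@example
ffmpeg -f x11grab -grab_x 10 -grab_y 20 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example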

@c man end INPUT DEVICES