@chapter Input Devices
@c man begin INPUT DEVICES

Input devices are configured elements in FFmpeg which enable accessing
the data coming from a multimedia device attached to your system.

When you configure your FFmpeg build, all the supported input devices
are enabled by default. You can list all available ones using the
configure option "--list-indevs".

You can disable all the input devices using the configure option
"--disable-indevs", and selectively enable an input device using the
option "--enable-indev=@var{INDEV}", or you can disable a particular
input device using the option "--disable-indev=@var{INDEV}".

The option "-devices" of the ff* tools will display the list of
supported input devices.

A description of the currently available input devices follows.

@section alsa

ALSA (Advanced Linux Sound Architecture) input device.

To enable this input device during configuration you need libasound
installed on your system.

This device allows capturing from an ALSA device. The name of the
device to capture has to be an ALSA card identifier.

An ALSA identifier has the syntax:
@example
hw:@var{CARD}[,@var{DEV}[,@var{SUBDEV}]]
@end example

where the @var{DEV} and @var{SUBDEV} components are optional.

The three arguments (in order: @var{CARD}, @var{DEV}, @var{SUBDEV})
specify card number or identifier, device number and subdevice number
(-1 means any).

To see the list of cards currently recognized by your system check the
files @file{/proc/asound/cards} and @file{/proc/asound/devices}.

For example to capture with @command{ffmpeg} from an ALSA device with
card id 0, you may run the command:
@example
ffmpeg -f alsa -i hw:0 alsaout.wav
@end example

For more information see:
@url{http://www.alsa-project.org/alsa-doc/alsa-lib/pcm.html}

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table
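
For example, to capture mono audio at 44100 Hz from card 0 using the
options above (a sketch; adjust the card identifier for your setup):
@example
ffmpeg -f alsa -sample_rate 44100 -channels 1 -i hw:0 alsaout.wav
@end example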

@section android_camera

Android camera input device.

This input device uses the Android Camera2 NDK API which is
available on devices with API level 24+. The availability of
android_camera is autodetected during configuration.

This device allows capturing from all cameras on an Android device,
which are integrated into the Camera2 NDK API.

The available cameras are enumerated internally and can be selected
with the @var{camera_index} parameter. The input file string is
discarded.

Generally the back facing camera has index 0 while the front facing
camera has index 1.

@subsection Options

@table @option

@item video_size
Set the video size given as a string such as 640x480 or hd720.
Falls back to the first available configuration reported by
Android if the requested video size is not available, or by default.

@item framerate
Set the video framerate.
Falls back to the first available configuration reported by
Android if the requested framerate is not available, or by default (-1).

@item camera_index
Set the index of the camera to use. Default is 0.

@item input_queue_size
Set the maximum number of frames to buffer. Default is 5.

@end table
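
For example, to record from the front facing camera with
@command{ffmpeg} (a sketch using the options above; the input file
string is discarded, so any placeholder such as "dummy" works):
@example
ffmpeg -f android_camera -camera_index 1 -video_size hd720 -framerate 30 -i dummy out.mp4
@end example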

@section avfoundation

AVFoundation input device.

AVFoundation is the currently recommended framework by Apple for stream
grabbing on OS X >= 10.7 as well as on iOS.

The input filename has to be given in the following syntax:
@example
-i "[[VIDEO]:[AUDIO]]"
@end example
The first entry selects the video input while the second selects the audio input.
The stream has to be specified by the device name or the device index as shown by the device list.
Alternatively, the video and/or audio input device can be chosen by index using the
@option{-video_device_index <INDEX>} and/or @option{-audio_device_index <INDEX>} option,
overriding any device name or index given in the input filename.

All available devices can be enumerated by using @option{-list_devices true}, listing
all device names and corresponding indices.

There are two device name aliases:
@table @code

@item default
Select the AVFoundation default device of the corresponding type.

@item none
Do not record the corresponding media type.
This is equivalent to specifying an empty device name or index.

@end table

@subsection Options

AVFoundation supports the following options:

@table @option

@item -list_devices <TRUE|FALSE>
If set to true, a list of all available input devices is given showing all
device names and indices.

@item -video_device_index <INDEX>
Specify the video device by its index. Overrides anything given in the input filename.

@item -audio_device_index <INDEX>
Specify the audio device by its index. Overrides anything given in the input filename.

@item -pixel_format <FORMAT>
Request the video device to use a specific pixel format.
If the specified format is not supported, a list of available formats is given
and the first one in this list is used instead. Available pixel formats are:
@code{monob, rgb555be, rgb555le, rgb565be, rgb565le, rgb24, bgr24, 0rgb, bgr0, 0bgr, rgb0,
bgr48be, uyvy422, yuva444p, yuva444p16le, yuv444p, yuv422p16, yuv422p10, yuv444p10,
yuv420p, nv12, yuyv422, gray}

@item -framerate
Set the grabbing frame rate. Default is @code{ntsc}, corresponding to a
frame rate of @code{30000/1001}.

@item -video_size
Set the video frame size.

@item -capture_cursor
Capture the mouse pointer. Default is 0.

@item -capture_mouse_clicks
Capture the screen mouse clicks. Default is 0.

@end table

@subsection Examples

@itemize

@item
Print the list of AVFoundation supported devices and exit:
@example
$ ffmpeg -f avfoundation -list_devices true -i ""
@end example

@item
Record video from video device 0 and audio from audio device 0 into out.avi:
@example
$ ffmpeg -f avfoundation -i "0:0" out.avi
@end example

@item
Record video from video device 2 and audio from audio device 1 into out.avi:
@example
$ ffmpeg -f avfoundation -video_device_index 2 -i ":1" out.avi
@end example

@item
Record video from the system default video device using the pixel format bgr0 and do not record any audio into out.avi:
@example
$ ffmpeg -f avfoundation -pixel_format bgr0 -i "default:none" out.avi
@end example

@end itemize

@section bktr

BSD video input device.

@subsection Options

@table @option

@item framerate
Set the frame rate.

@item video_size
Set the video frame size. Default is @code{vga}.

@item standard
Available values are:
@table @samp
@item pal
@item ntsc
@item secam
@item paln
@item palm
@item ntscj
@end table

@end table
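
For example, to capture an NTSC signal with @command{ffmpeg} (a sketch
using the options above; the capture device node, assumed here to be
@file{/dev/bktr0}, depends on your system):
@example
ffmpeg -f bktr -standard ntsc -framerate 30000/1001 -video_size vga -i /dev/bktr0 out.avi
@end example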

@section decklink

The decklink input device provides capture capabilities for Blackmagic
DeckLink devices.

To enable this input device, you need the Blackmagic DeckLink SDK and you
need to configure with the appropriate @code{--extra-cflags}
and @code{--extra-ldflags}.
On Windows, you need to run the IDL files through @command{widl}.

DeckLink is very picky about the formats it supports. Pixel format of the
input can be set with @option{raw_format}.
Framerate and video size must be determined for your device with
@command{-list_formats 1}. Audio sample rate is always 48 kHz and the number
of channels can be 2, 8 or 16. Note that all audio channels are bundled in one single
audio track.

@subsection Options

@table @option

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}. Alternatively you can use the @code{-sources}
option of ffmpeg to list the available input devices.

@item list_formats
If set to @option{true}, print a list of supported formats and exit.
Defaults to @option{false}.

@item format_code <FourCC>
This sets the input video format to the format given by the FourCC. To see
the supported values of your device(s) use @option{list_formats}.
Note that there is a FourCC @option{'pal '} that can also be used
as @option{pal} (3 letters).
Default behavior is autodetection of the input video format, if the hardware
supports it.

@item bm_v210
This is a deprecated option, you can use @option{raw_format} instead.
If set to @samp{1}, video is captured in 10 bit v210 instead
of uyvy422. Not all Blackmagic devices support this option.

@item raw_format
Set the pixel format of the captured video.
Available values are:
@table @samp
@item uyvy422
@item yuv422p10
@item argb
@item bgra
@item rgb10
@end table

@item teletext_lines
If set to nonzero, an additional teletext stream will be captured from the
vertical ancillary data. Both SD PAL (576i) and HD (1080i or 1080p)
sources are supported. In case of HD sources, OP47 packets are decoded.

This option is a bitmask of the SD PAL VBI lines captured, specifically lines 6
to 22, and lines 318 to 335. Line 6 is the LSB in the mask. Selected lines
which do not contain teletext information will be ignored. You can use the
special @option{all} constant to select all possible lines, or
@option{standard} to skip lines 6, 318 and 319, which are not compatible with
all receivers.

For SD sources, ffmpeg needs to be compiled with @code{--enable-libzvbi}. For
HD sources, on older (pre-4K) DeckLink card models you have to capture in 10
bit mode.

@item channels
Defines number of audio channels to capture. Must be @samp{2}, @samp{8} or @samp{16}.
Defaults to @samp{2}.

@item duplex_mode
Sets the decklink device duplex mode. Must be @samp{unset}, @samp{half} or @samp{full}.
Defaults to @samp{unset}.

@item timecode_format
Timecode type to include in the frame and video stream metadata. Must be
@samp{none}, @samp{rp188vitc}, @samp{rp188vitc2}, @samp{rp188ltc},
@samp{rp188any}, @samp{vitc}, @samp{vitc2}, or @samp{serial}. Defaults to
@samp{none} (not included).

@item video_input
Sets the video input source. Must be @samp{unset}, @samp{sdi}, @samp{hdmi},
@samp{optical_sdi}, @samp{component}, @samp{composite} or @samp{s_video}.
Defaults to @samp{unset}.

@item audio_input
Sets the audio input source. Must be @samp{unset}, @samp{embedded},
@samp{aes_ebu}, @samp{analog}, @samp{analog_xlr}, @samp{analog_rca} or
@samp{microphone}. Defaults to @samp{unset}.

@item video_pts
Sets the video packet timestamp source. Must be @samp{video}, @samp{audio},
@samp{reference}, @samp{wallclock} or @samp{abs_wallclock}.
Defaults to @samp{video}.

@item audio_pts
Sets the audio packet timestamp source. Must be @samp{video}, @samp{audio},
@samp{reference}, @samp{wallclock} or @samp{abs_wallclock}.
Defaults to @samp{audio}.

@item draw_bars
If set to @samp{true}, color bars are drawn in the event of a signal loss.
Defaults to @samp{true}.

@item queue_size
Sets maximum input buffer size in bytes. If the buffering reaches this value,
incoming frames will be dropped.
Defaults to @samp{1073741824}.

@item audio_depth
Sets the audio sample bit depth. Must be @samp{16} or @samp{32}.
Defaults to @samp{16}.

@item decklink_copyts
If set to @option{true}, timestamps are forwarded as they are without removing
the initial offset.
Defaults to @option{false}.

@end table

@subsection Examples

@itemize

@item
List input devices:
@example
ffmpeg -f decklink -list_devices 1 -i dummy
@end example

@item
List supported formats:
@example
ffmpeg -f decklink -list_formats 1 -i 'Intensity Pro'
@end example

@item
Capture video clip at 1080i50:
@example
ffmpeg -format_code Hi50 -f decklink -i 'Intensity Pro' -c:a copy -c:v copy output.avi
@end example

@item
Capture video clip at 1080i50 10 bit:
@example
ffmpeg -bm_v210 1 -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example

@item
Capture video clip at 1080i50 with 16 audio channels:
@example
ffmpeg -channels 16 -format_code Hi50 -f decklink -i 'UltraStudio Mini Recorder' -c:a copy -c:v copy output.avi
@end example

@end itemize

@section dshow

Windows DirectShow input device.

DirectShow support is enabled when FFmpeg is built with the mingw-w64 project.
Currently only audio and video devices are supported.

Multiple devices may be opened as separate inputs, but they may also be
opened on the same input, which should improve synchronism between them.

The input name should be in the format:
@example
@var{TYPE}=@var{NAME}[:@var{TYPE}=@var{NAME}]
@end example
where @var{TYPE} can be either @var{audio} or @var{video},
and @var{NAME} is the device's name or alternative name.

@subsection Options

If no options are specified, the device's defaults are used.
If the device does not support the requested options, it will
fail to open.

@table @option

@item video_size
Set the video size in the captured video.

@item framerate
Set the frame rate in the captured video.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.

@item sample_size
Set the sample size (in bits) of the captured audio.

@item channels
Set the number of channels in the captured audio.

@item list_devices
If set to @option{true}, print a list of devices and exit.

@item list_options
If set to @option{true}, print a list of selected device's options
and exit.

@item video_device_number
Set video device number for devices with the same name (starts at 0,
defaults to 0).

@item audio_device_number
Set audio device number for devices with the same name (starts at 0,
defaults to 0).

@item pixel_format
Select pixel format to be used by DirectShow. This may only be set when
the video codec is not set or set to rawvideo.

@item audio_buffer_size
Set audio device buffer size in milliseconds (which can directly
impact latency, depending on the device).
Defaults to using the audio device's
default buffer size (typically some multiple of 500ms).
Setting this value too low can degrade performance.
See also
@url{http://msdn.microsoft.com/en-us/library/windows/desktop/dd377582(v=vs.85).aspx}

@item video_pin_name
Select video capture pin to use by name or alternative name.

@item audio_pin_name
Select audio capture pin to use by name or alternative name.

@item crossbar_video_input_pin_number
Select video input pin number for crossbar device. This will be
routed to the crossbar device's Video Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.

@item crossbar_audio_input_pin_number
Select audio input pin number for crossbar device. This will be
routed to the crossbar device's Audio Decoder output pin.
Note that changing this value can affect future invocations
(sets a new default) until system reboot occurs.

@item show_video_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change video filter properties
and configurations manually.
Note that for crossbar devices, adjusting values in this dialog
may be needed at times to toggle between PAL (25 fps) and NTSC (29.97)
input frame rates, sizes, interlacing, etc. Changing these values can
enable different scan rates/frame rates and avoiding green bars at
the bottom, flickering scan lines, etc.
Note that with some devices, changing these properties can also affect future
invocations (sets new defaults) until system reboot occurs.

@item show_audio_device_dialog
If set to @option{true}, before capture starts, popup a display dialog
to the end user, allowing them to change audio filter properties
and configurations manually.

@item show_video_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens a video device.

@item show_audio_crossbar_connection_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify crossbar pin routings, when it opens an audio device.

@item show_analog_tv_tuner_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV channels and frequencies.

@item show_analog_tv_tuner_audio_dialog
If set to @option{true}, before capture starts, popup a display
dialog to the end user, allowing them to manually
modify TV audio (like mono vs. stereo, Language A, B or C).

@item audio_device_load
Load an audio capture filter device from file instead of searching
it by name. It may load additional parameters too, if the filter
supports the serialization of its properties.
To use this, an audio capture source has to be specified, but it can
be anything, even a fake one.

@item audio_device_save
Save the currently used audio capture filter device and its
parameters (if the filter supports it) to a file.
If a file with the same name exists it will be overwritten.

@item video_device_load
Load a video capture filter device from file instead of searching
it by name. It may load additional parameters too, if the filter
supports the serialization of its properties.
To use this, a video capture source has to be specified, but it can
be anything, even a fake one.

@item video_device_save
Save the currently used video capture filter device and its
parameters (if the filter supports it) to a file.
If a file with the same name exists it will be overwritten.

@end table

@subsection Examples

@itemize

@item
Print the list of DirectShow supported devices and exit:
@example
$ ffmpeg -list_devices true -f dshow -i dummy
@end example

@item
Open video device @var{Camera}:
@example
$ ffmpeg -f dshow -i video="Camera"
@end example

@item
Open second video device with name @var{Camera}:
@example
$ ffmpeg -f dshow -video_device_number 1 -i video="Camera"
@end example

@item
Open video device @var{Camera} and audio device @var{Microphone}:
@example
$ ffmpeg -f dshow -i video="Camera":audio="Microphone"
@end example

@item
Print the list of supported options in selected device and exit:
@example
$ ffmpeg -list_options true -f dshow -i video="Camera"
@end example

@item
Specify pin names to capture by name or alternative name, specify alternative device name:
@example
$ ffmpeg -f dshow -audio_pin_name "Audio Out" -video_pin_name 2 -i video=video="@@device_pnp_\\?\pci#ven_1a0a&dev_6200&subsys_62021461&rev_01#4&e2c7dd6&0&00e1#@{65e8773d-8f56-11d0-a3b9-00a0c9223196@}\@{ca465100-deb0-4d59-818f-8c477184adf6@}":audio="Microphone"
@end example

@item
Configure a crossbar device, specifying crossbar pins, allow user to adjust video capture properties at startup:
@example
$ ffmpeg -f dshow -show_video_device_dialog true -crossbar_video_input_pin_number 0
  -crossbar_audio_input_pin_number 3 -i video="AVerMedia BDA Analog Capture":audio="AVerMedia BDA Analog Capture"
@end example

@end itemize

@section fbdev

Linux framebuffer input device.

The Linux framebuffer is a graphic hardware-independent abstraction
layer to show graphics on a computer monitor, typically on the
console. It is accessed through a file device node, usually
@file{/dev/fb0}.

For more detailed information read the file
Documentation/fb/framebuffer.txt included in the Linux source tree.

See also @url{http://linux-fbdev.sourceforge.net/}, and fbset(1).

To record from the framebuffer device @file{/dev/fb0} with
@command{ffmpeg}:
@example
ffmpeg -f fbdev -framerate 10 -i /dev/fb0 out.avi
@end example

You can take a single screenshot image with the command:
@example
ffmpeg -f fbdev -framerate 1 -i /dev/fb0 -frames:v 1 screenshot.jpeg
@end example

@subsection Options

@table @option

@item framerate
Set the frame rate. Default is 25.

@end table

@section gdigrab

Win32 GDI-based screen capture device.

This device allows you to capture a region of the display on Windows.

There are two options for the input filename:
@example
desktop
@end example
or
@example
title=@var{window_title}
@end example

The first option will capture the entire desktop, or a fixed region of the
desktop. The second option will instead capture the contents of a single
window, regardless of its position on the screen.

For example, to grab the entire desktop using @command{ffmpeg}:
@example
ffmpeg -f gdigrab -framerate 6 -i desktop out.mpg
@end example

Grab a 640x480 region at position @code{10,20}:
@example
ffmpeg -f gdigrab -framerate 6 -offset_x 10 -offset_y 20 -video_size vga -i desktop out.mpg
@end example

Grab the contents of the window named "Calculator":
@example
ffmpeg -f gdigrab -framerate 6 -i title=Calculator out.mpg
@end example

@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. Use the value @code{0} to
not draw the pointer. Default value is @code{1}.

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

Note that @var{show_region} is incompatible with grabbing the contents
of a single window.

For example:
@example
ffmpeg -f gdigrab -show_region 1 -framerate 6 -video_size cif -offset_x 10 -offset_y 20 -i desktop out.mpg
@end example

@item video_size
Set the video frame size. The default is to capture the full screen if @file{desktop} is selected, or the full window size if @file{title=@var{window_title}} is selected.

@item offset_x
When capturing a region with @var{video_size}, set the distance from the left edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned to the left of your primary monitor, you will need to use a negative @var{offset_x} value to move the region to that monitor.

@item offset_y
When capturing a region with @var{video_size}, set the distance from the top edge of the screen or desktop.

Note that the offset calculation is from the top left corner of the primary monitor on Windows. If you have a monitor positioned above your primary monitor, you will need to use a negative @var{offset_y} value to move the region to that monitor.

@end table

@section iec61883

FireWire DV/HDV input device using libiec61883.

To enable this input device, you need libiec61883, libraw1394 and
libavc1394 installed on your system. Use the configure option
@code{--enable-libiec61883} to compile with the device enabled.

The iec61883 capture device supports capturing from a video device
connected via IEEE1394 (FireWire), using libiec61883 and the new Linux
FireWire stack (juju). This is the default DV/HDV input method in Linux
Kernel 2.6.37 and later, since the old FireWire stack was removed.

Specify the FireWire port to be used as input file, or "auto"
to choose the first port connected.

@subsection Options

@table @option

@item dvtype
Override autodetection of DV/HDV. This should only be used if auto
detection does not work, or if usage of a different device type
should be prohibited. Treating a DV device as HDV (or vice versa) will
not work and results in undefined behavior.
The values @option{auto}, @option{dv} and @option{hdv} are supported.

@item dvbuffer
Set maximum size of buffer for incoming data, in frames. For DV, this
is an exact value. For HDV, it is not frame exact, since HDV does
not have a fixed frame size.

@item dvguid
Select the capture device by specifying its GUID. Capturing will only
be performed from the specified device and fails if no device with the
given GUID is found. This is useful to select the input if multiple
devices are connected at the same time.
Look at /sys/bus/firewire/devices to find out the GUIDs.

@end table

@subsection Examples

@itemize

@item
Grab and show the input of a FireWire DV/HDV device.
@example
ffplay -f iec61883 -i auto
@end example

@item
Grab and record the input of a FireWire DV/HDV device,
using a packet buffer of 100000 packets if the source is HDV.
@example
ffmpeg -f iec61883 -i auto -dvbuffer 100000 out.mpg
@end example

@end itemize

@section jack

JACK input device.

To enable this input device during configuration you need libjack
installed on your system.

A JACK input device creates one or more JACK writable clients, one for
each audio channel, with name @var{client_name}:input_@var{N}, where
@var{client_name} is the name provided by the application, and @var{N}
is a number which identifies the channel.
Each writable client will send the acquired data to the FFmpeg input
device.

Once you have created one or more JACK readable clients, you need to
connect them to one or more JACK writable clients.

To connect or disconnect JACK clients you can use the @command{jack_connect}
and @command{jack_disconnect} programs, or do it through a graphical interface,
for example with @command{qjackctl}.

To list the JACK clients and their properties you can invoke the command
@command{jack_lsp}.

The following example shows how to capture a JACK readable client
with @command{ffmpeg}.
@example
# Create a JACK writable client with name "ffmpeg".
$ ffmpeg -f jack -i ffmpeg -y out.wav

# Start the sample jack_metro readable client.
$ jack_metro -b 120 -d 0.2 -f 4000

# List the current JACK clients.
$ jack_lsp -c
system:capture_1
system:capture_2
system:playback_1
system:playback_2
ffmpeg:input_1
metro:120_bpm

# Connect metro to the ffmpeg writable client.
$ jack_connect metro:120_bpm ffmpeg:input_1
@end example

For more information read:
@url{http://jackaudio.org/}

@subsection Options

@table @option

@item channels
Set the number of channels. Default is 2.

@end table

@section kmsgrab

KMS video input device.

Captures the KMS scanout framebuffer associated with a specified CRTC or plane as a
DRM object that can be passed to other hardware functions.

Requires either DRM master or CAP_SYS_ADMIN to run.

If you don't understand what all of that means, you probably don't want this. Look at
@option{x11grab} instead.

@subsection Options

@table @option

@item device
DRM device to capture on. Defaults to @option{/dev/dri/card0}.

@item format
Pixel format of the framebuffer. Defaults to @option{bgr0}.

@item format_modifier
Format modifier to signal on output frames. This is necessary to import correctly into
some APIs, but can't be autodetected. See the libdrm documentation for possible values.

@item crtc_id
KMS CRTC ID to define the capture source. The first active plane on the given CRTC
will be used.

@item plane_id
KMS plane ID to define the capture source. Defaults to the first active plane found if
neither @option{crtc_id} nor @option{plane_id} are specified.

@item framerate
Framerate to capture at. This is not synchronised to any page flipping or framebuffer
changes - it just defines the interval at which the framebuffer is sampled. Sampling
faster than the framebuffer update rate will generate independent frames with the same
content. Defaults to @code{30}.

@end table

@subsection Examples

@itemize

@item
Capture from the first active plane, download the result to normal frames and encode.
This will only work if the framebuffer is both linear and mappable - if not, the result
may be scrambled or fail to download.
@example
ffmpeg -f kmsgrab -i - -vf 'hwdownload,format=bgr0' output.mp4
@end example

@item
Capture from CRTC ID 42 at 60fps, map the result to VAAPI, convert to NV12 and encode as H.264.
@example
ffmpeg -crtc_id 42 -framerate 60 -f kmsgrab -i - -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1920:h=1080:format=nv12' -c:v h264_vaapi output.mp4
@end example

@end itemize

@section lavfi

Libavfilter input virtual device.

This input device reads data from the open output pads of a libavfilter
filtergraph.

For each filtergraph open output, the input device will create a
corresponding stream which is mapped to the generated output. Currently
only video data is supported. The filtergraph is specified through the
option @option{graph}.

@subsection Options

@table @option

@item graph
Specify the filtergraph to use as input. Each video open output must be
labelled by a unique string of the form "out@var{N}", where @var{N} is a
number starting from 0 corresponding to the mapped input stream
generated by the device.
The first unlabelled output is automatically assigned to the "out0"
label, but all the others need to be specified explicitly.

The suffix "+subcc" can be appended to the output label to create an extra
stream with the closed captions packets attached to that output
(experimental; only for EIA-608 / CEA-708 for now).
The subcc streams are created after all the normal streams, in the order of
the corresponding stream.
For example, if there is "out19+subcc", "out7+subcc" and up to "out42", the
stream #43 is subcc for stream #7 and stream #44 is subcc for stream #19.

If not specified, it defaults to the filename specified for the input
device.

@item graph_file
Set the filename of the filtergraph to be read and sent to the other
filters. Syntax of the filtergraph is the same as the one specified by
the option @var{graph}.

@item dumpgraph
Dump graph to stderr.

@end table

@subsection Examples

@itemize

@item
Create a color video stream and play it back with @command{ffplay}:
@example
ffplay -f lavfi -graph "color=c=pink [out0]" dummy
@end example

@item
As the previous example, but use filename for specifying the graph
description, and omit the "out0" label:
@example
ffplay -f lavfi color=c=pink
@end example

@item
Create three different video test filtered sources and play them:
@example
ffplay -f lavfi -graph "testsrc [out0]; testsrc,hflip [out1]; testsrc,negate [out2]" test3
@end example

@item
Read an audio stream from a file using the amovie source and play it
back with @command{ffplay}:
@example
ffplay -f lavfi "amovie=test.wav"
@end example

@item
Read an audio stream and a video stream and play it back with
@command{ffplay}:
@example
ffplay -f lavfi "movie=test.avi[out0];amovie=test.wav[out1]"
@end example

@item
Dump decoded frames to images and closed captions to a file (experimental):
@example
ffmpeg -f lavfi -i "movie=test.ts[out0+subcc]" -map v frame%08d.png -map s -c copy -f rawvideo subcc.bin
@end example

@end itemize

@section libcdio

Audio-CD input device based on libcdio.

To enable this input device during configuration you need libcdio
installed on your system. It requires the configure option
@code{--enable-libcdio}.

This device allows playing and grabbing from an Audio-CD.

For example to copy with @command{ffmpeg} the entire Audio-CD in @file{/dev/sr0},
you may run the command:
@example
ffmpeg -f libcdio -i /dev/sr0 cd.wav
@end example

@subsection Options

@table @option

@item speed
Set drive reading speed. Default value is 0.
The speed is specified in CD-ROM speed units. The speed is set through
the libcdio @code{cdio_cddap_speed_set} function. On many CD-ROM
drives, specifying a value too large will result in using the fastest
speed.

@item paranoia_mode
Set paranoia recovery mode flags. It accepts one of the following values:
@table @samp
@item disable
@item verify
@item overlap
@item neverskip
@item full
@end table
Default value is @samp{disable}.

For more information about the available recovery modes, consult the
paranoia project documentation.

@end table
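
For example, to rip the disc with full paranoia recovery enabled (a
sketch combining the example above with the @option{paranoia_mode}
option):
@example
ffmpeg -f libcdio -paranoia_mode full -i /dev/sr0 cd.wav
@end example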

@section libdc1394

IIDC1394 input device, based on libdc1394 and libraw1394.

Requires the configure option @code{--enable-libdc1394}.

@section libndi_newtek

The libndi_newtek input device provides capture capabilities for NDI (Network
Device Interface, a standard created by NewTek).

The input filename is an NDI source name, as reported by passing
@option{-find_sources 1} on the command line; it has no specific syntax
but is human-readable.

To enable this input device, you need the NDI SDK and you
need to configure with the appropriate @code{--extra-cflags}
and @code{--extra-ldflags}.

@subsection Options

@table @option

@item find_sources
If set to @option{true}, print a list of found/available NDI sources and exit.
Defaults to @option{false}.

@item wait_sources
Override time to wait until the number of online sources have changed.
Defaults to @option{0.5}.

@item allow_video_fields
When this flag is @option{false}, all video that you receive will be progressive.
Defaults to @option{true}.

@end table

@subsection Examples

@itemize

@item
List input devices:
@example
ffmpeg -f libndi_newtek -find_sources 1 -i dummy
@end example

@item
Restream to NDI:
@example
ffmpeg -f libndi_newtek -i "DEV-5.INTERNAL.M1STEREO.TV (NDI_SOURCE_NAME_1)" -f libndi_newtek -y NDI_SOURCE_NAME_2
@end example

@end itemize

@section openal

The OpenAL input device provides audio capture on all systems with a
working OpenAL 1.1 implementation.

To enable this input device during configuration, you need OpenAL
headers and libraries installed on your system, and need to configure
FFmpeg with @code{--enable-openal}.

OpenAL headers and libraries should be provided as part of your OpenAL
implementation, or as an additional download (an SDK). Depending on your
installation you may need to specify additional flags via the
@code{--extra-cflags} and @code{--extra-ldflags} for allowing the build
system to locate the OpenAL headers and libraries.

An incomplete list of OpenAL implementations follows:

@table @strong

@item Creative
The official Windows implementation, providing hardware acceleration
with supported devices and software fallback.
See @url{http://openal.org/}.

@item OpenAL Soft
Portable, open source (LGPL) software implementation. Includes
backends for the most common sound APIs on the Windows, Linux,
Solaris, and BSD operating systems.
See @url{http://kcat.strangesoft.net/openal.html}.

@item Apple
OpenAL is part of Core Audio, the official Mac OS X Audio interface.
See @url{http://developer.apple.com/technologies/mac/audio-and-video.html}

@end table

This device allows one to capture from an audio input device handled
through OpenAL.

You need to specify the name of the device to capture in the provided
filename. If the empty string is provided, the device will
automatically select the default device. You can get the list of the
supported devices by using the option @var{list_devices}.

@subsection Options

@table @option

@item channels
Set the number of channels in the captured audio. Only the values
@option{1} (monaural) and @option{2} (stereo) are currently supported.
Defaults to @option{2}.

@item sample_size
Set the sample size (in bits) of the captured audio. Only the values
@option{8} and @option{16} are currently supported. Defaults to
@option{16}.

@item sample_rate
Set the sample rate (in Hz) of the captured audio.
Defaults to @option{44.1k}.

@item list_devices
If set to @option{true}, print a list of devices and exit.
Defaults to @option{false}.

@end table

@subsection Examples

Print the list of OpenAL supported devices and exit:
@example
$ ffmpeg -list_devices true -f openal -i dummy out.ogg
@end example

Capture from the OpenAL device @file{DR-BT101 via PulseAudio}:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out.ogg
@end example

Capture from the default device (note the empty string '' as filename):
@example
$ ffmpeg -f openal -i '' out.ogg
@end example

Capture from two devices simultaneously, writing to two different files,
within the same @command{ffmpeg} command:
@example
$ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out1.ogg -f openal -i 'ALSA Default' out2.ogg
@end example

Note: not all OpenAL implementations support multiple simultaneous capture -
try the latest OpenAL Soft if the above does not work.

@section oss

Open Sound System input device.

The filename to provide to the input device is the device node
representing the OSS input device, and is usually set to
@file{/dev/dsp}.

For example to grab from @file{/dev/dsp} using @command{ffmpeg} use the
command:
@example
ffmpeg -f oss -i /dev/dsp /tmp/oss.wav
@end example

For more information about OSS see:
@url{http://manuals.opensound.com/usersguide/dsp.html}

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table

@section pulse

PulseAudio input device.

To enable this input device you need to configure FFmpeg with @code{--enable-libpulse}.

The filename to provide to the input device is a source device or the
string "default".

To list the PulseAudio source devices and their properties you can invoke
the command @command{pactl list sources}.

More information about PulseAudio can be found on @url{http://www.pulseaudio.org}.

@subsection Options

@table @option

@item server
Connect to a specific PulseAudio server, specified by an IP address.
Default server is used when not provided.

@item name
Specify the application name PulseAudio will use when showing active clients,
by default it is the @code{LIBAVFORMAT_IDENT} string.

@item stream_name
Specify the stream name PulseAudio will use when showing active streams,
by default it is "record".

@item sample_rate
Specify the sample rate in Hz, by default 48 kHz is used.

@item channels
Specify the channels in use, by default 2 (stereo) is set.

@item frame_size
Specify the number of bytes per frame, by default it is set to 1024.

@item fragment_size
Specify the minimal buffering fragment in PulseAudio, it will affect the
audio latency. By default it is unset.

@item wallclock
Set the initial PTS using the current time. Default is 1.

@end table

@subsection Examples

Record a stream from default device:
@example
ffmpeg -f pulse -i default /tmp/pulse.wav
@end example
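
Record from the default device using a specific sample rate and channel
count (a sketch using the options documented above):
@example
ffmpeg -f pulse -sample_rate 44100 -channels 1 -i default /tmp/pulse.wav
@end example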

@section sndio

sndio input device.

To enable this input device during configuration you need libsndio
installed on your system.

The filename to provide to the input device is the device node
representing the sndio input device, and is usually set to
@file{/dev/audio0}.

For example to grab from @file{/dev/audio0} using @command{ffmpeg} use the
command:
@example
ffmpeg -f sndio -i /dev/audio0 /tmp/sndio.wav
@end example

@subsection Options

@table @option

@item sample_rate
Set the sample rate in Hz. Default is 48000.

@item channels
Set the number of channels. Default is 2.

@end table

@section video4linux2, v4l2

Video4Linux2 input video device.

"v4l2" can be used as alias for "video4linux2".

If FFmpeg is built with v4l-utils support (by using the
@code{--enable-libv4l2} configure option), it is possible to use it with the
@code{-use_libv4l2} input device option.

The name of the device to grab is a file device node. Usually Linux
systems tend to automatically create such nodes when the device
(e.g. a USB webcam) is plugged into the system, and the node has a name of
the kind @file{/dev/video@var{N}}, where @var{N} is a number associated with
the device.

Video4Linux2 devices usually support a limited set of
@var{width}x@var{height} sizes and frame rates. You can check which are
supported using @command{-list_formats all} for Video4Linux2 devices.
Some devices, like TV cards, support one or more standards. It is possible
to list all the supported standards using @command{-list_standards all}.

The time base for the timestamps is 1 microsecond. Depending on the kernel
version and configuration, the timestamps may be derived from the real time
clock (origin at the Unix Epoch) or the monotonic clock (origin usually at
boot time, unaffected by NTP or manual changes to the clock). The
@option{-timestamps abs} or @option{-ts abs} option can be used to force
conversion into the real time clock.

Some usage examples of the video4linux2 device with @command{ffmpeg}
and @command{ffplay}:

@itemize

@item
List supported formats for a video4linux2 device:
@example
ffplay -f video4linux2 -list_formats all /dev/video0
@end example

@item
Grab and show the input of a video4linux2 device:
@example
ffplay -f video4linux2 -framerate 30 -video_size hd720 /dev/video0
@end example

@item
Grab and record the input of a video4linux2 device, leave the
frame rate and size as previously set:
@example
ffmpeg -f video4linux2 -input_format mjpeg -i /dev/video0 out.mpeg
@end example

@end itemize

For more information about Video4Linux, check @url{http://linuxtv.org/}.

@subsection Options

@table @option

@item standard
Set the standard. Must be the name of a supported standard. To get a
list of the supported standards, use the @option{list_standards}
option.

@item channel
Set the input channel number. Defaults to -1, which means using the
previously selected channel.

@item video_size
Set the video frame size. The argument must be a string in the form
@var{WIDTH}x@var{HEIGHT} or a valid size abbreviation.

@item pixel_format
Select the pixel format (only valid for raw video input).

@item input_format
Set the preferred pixel format (for raw video) or a codec name.
This option allows one to select the input format, when several are
available.

@item framerate
Set the preferred video frame rate.

@item list_formats
List available formats (supported pixel formats, codecs, and frame
sizes) and exit.

Available values are:
@table @samp

@item all
Show all available (compressed and non-compressed) formats.

@item raw
Show only raw video (non-compressed) formats.

@item compressed
Show only compressed formats.

@end table

@item list_standards
List supported standards and exit.

Available values are:
@table @samp

@item all
Show all supported standards.

@end table

@item timestamps, ts
Set type of timestamps for grabbed frames.

Available values are:
@table @samp

@item default
Use timestamps from the kernel.

@item abs
Use absolute timestamps (wall clock).

@item mono2abs
Force conversion from monotonic to absolute timestamps.

@end table

Default value is @code{default}.

@item use_libv4l2
Use libv4l2 (v4l-utils) conversion functions. Default is 0.

@end table

@section vfwcap

VfW (Video for Windows) capture input device.

The filename passed as input is the capture driver number, ranging from
0 to 9. You may use "list" as filename to print a list of drivers. Any
other filename will be interpreted as device number 0.

@subsection Options

@table @option

@item video_size
Set the video frame size.

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@end table
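
For example, to print the list of drivers and then capture from driver
number 0 with @command{ffmpeg} (a sketch using the options above):
@example
ffmpeg -f vfwcap -i list
ffmpeg -f vfwcap -framerate 25 -video_size vga -i 0 out.avi
@end example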

@section x11grab

X11 video input device.

To enable this input device during configuration you need libxcb
installed on your system. It will be automatically detected during
configuration.

This device allows one to capture a region of an X11 display.

The filename passed as input has the syntax:
@example
[@var{hostname}]:@var{display_number}.@var{screen_number}[+@var{x_offset},@var{y_offset}]
@end example

@var{hostname}:@var{display_number}.@var{screen_number} specifies the
X11 display name of the screen to grab from. @var{hostname} can be
omitted, and defaults to "localhost". The environment variable
@env{DISPLAY} contains the default display name.

@var{x_offset} and @var{y_offset} specify the offsets of the grabbed
area with respect to the top-left border of the X11 screen. They
default to 0.

Check the X11 documentation (e.g. @command{man X}) for more detailed
information.

Use the @command{xdpyinfo} program for getting basic information about
the properties of your X11 display (e.g. grep for "name" or
"dimensions").

For example to grab from @file{:0.0} using @command{ffmpeg}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

Grab at position @code{10,20}:
@example
ffmpeg -f x11grab -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

@subsection Options

@table @option

@item draw_mouse
Specify whether to draw the mouse pointer. A value of @code{0} specifies
not to draw the pointer. Default value is @code{1}.

@item follow_mouse
Make the grabbed area follow the mouse. The argument can be
@code{centered} or a number of pixels @var{PIXELS}.

When it is specified with "centered", the grabbing region follows the mouse
pointer and keeps the pointer at the center of the region; otherwise, the region
follows only when the mouse pointer reaches within @var{PIXELS} (greater than
zero) of the edge of the region.

For example:
@example
ffmpeg -f x11grab -follow_mouse centered -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

To follow only when the mouse pointer reaches within 100 pixels of the edge:
@example
ffmpeg -f x11grab -follow_mouse 100 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item framerate
Set the grabbing frame rate. Default value is @code{ntsc},
corresponding to a frame rate of @code{30000/1001}.

@item show_region
Show grabbed region on screen.

If @var{show_region} is specified with @code{1}, then the grabbing
region will be indicated on screen. With this option, it is easy to
know what is being grabbed if only a portion of the screen is grabbed.

@item region_border
Set the region border thickness if @option{-show_region 1} is used.
Range is 1 to 128 and default is 3 (XCB-based x11grab only).

For example:
@example
ffmpeg -f x11grab -show_region 1 -framerate 25 -video_size cif -i :0.0+10,20 out.mpg
@end example

With @var{follow_mouse}:
@example
ffmpeg -f x11grab -follow_mouse centered -show_region 1 -framerate 25 -video_size cif -i :0.0 out.mpg
@end example

@item video_size
Set the video frame size. Default value is @code{vga}.

@item grab_x
@item grab_y
Set the grabbing region coordinates. They are expressed as offset from
the top left corner of the X11 window and correspond to the
@var{x_offset} and @var{y_offset} parameters in the device name. The
default value for both options is 0.

@end table

@c man end INPUT DEVICES