DART-MX8M CSI
Testing our MIPI cameras
Before running any of the pipelines below, make sure you have the camera(s) connected. The i.MX family of processors uses GStreamer as its multimedia framework. For more information, please refer to https://gstreamer.freedesktop.org/
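You can optionally verify that GStreamer and its v4l2 plugin are available on the target before continuing (a quick sanity check; the reported versions depend on your image):
# gst-inspect-1.0 --version
# gst-inspect-1.0 v4l2src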
Camera Probe
Run the following command to check whether the camera(s) have been detected:
root@imx8m-var-dart:~# v4l2-ctl --list-devices
The output should look similar to the following:
i.MX6S_CSI (platform:30a90000.csi1_bridge):
	/dev/video0

i.MX6S_CSI (platform:30b80000.csi2_bridge):
	/dev/video1
As you can see above, there are two cameras, accessible as /dev/video0 and /dev/video1.
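You can also list the frame sizes and pixel formats each camera supports, for example for /dev/video0 (the exact list depends on the connected camera sensor):
# v4l2-ctl -d /dev/video0 --list-formats-ext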
Testing Camera Preview on Display
In the examples below, /dev/videoX is either /dev/video0 or /dev/video1.
- 480P 640x480@30fps:
# gst-launch-1.0 v4l2src device=/dev/videoX ! video/x-raw,width=640,height=480 ! kmssink
- NTSC 720x480@30fps:
# gst-launch-1.0 v4l2src device=/dev/videoX ! video/x-raw,width=720,height=480 ! kmssink
- 720P 1280x720@30fps:
# gst-launch-1.0 v4l2src device=/dev/videoX ! video/x-raw,width=1280,height=720 ! kmssink
- 1080P 1920x1080@30fps:
# gst-launch-1.0 v4l2src device=/dev/videoX ! video/x-raw,width=1920,height=1080 ! kmssink
- QSXGA 2592x1944@15fps:
# gst-launch-1.0 v4l2src device=/dev/videoX ! video/x-raw,width=2592,height=1944 ! kmssink
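If you want to request a specific frame rate explicitly, you can add a framerate field to the caps filter, as in the sketch below (the camera must support the requested rate at the chosen resolution):
# gst-launch-1.0 v4l2src device=/dev/videoX ! video/x-raw,width=1280,height=720,framerate=30/1 ! kmssink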
Testing Camera JPEG Snapshot
To capture a JPEG snapshot, run one of the pipelines below, depending on the required picture resolution.
- 480P 640x480:
# gst-launch-1.0 v4l2src device=/dev/videoX num-buffers=1 ! video/x-raw,width=640,height=480 ! jpegenc ! filesink location=/tmp/test.jpg
- NTSC 720x480:
# gst-launch-1.0 v4l2src device=/dev/videoX num-buffers=1 ! video/x-raw,width=720,height=480 ! jpegenc ! filesink location=/tmp/test.jpg
- 720P 1280x720:
# gst-launch-1.0 v4l2src device=/dev/videoX num-buffers=1 ! video/x-raw,width=1280,height=720 ! jpegenc ! filesink location=/tmp/test.jpg
- 1080P 1920x1080:
# gst-launch-1.0 v4l2src device=/dev/videoX num-buffers=1 ! video/x-raw,width=1920,height=1080 ! jpegenc ! filesink location=/tmp/test.jpg
- QSXGA 2592x1944:
# gst-launch-1.0 v4l2src device=/dev/videoX num-buffers=1 ! video/x-raw,width=2592,height=1944 ! jpegenc ! filesink location=/tmp/test.jpg
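To verify a snapshot directly on the display, you can decode it and show it with kmssink, as in the sketch below (assuming the file was saved to /tmp/test.jpg as in the examples above; press Ctrl+C to stop):
# gst-launch-1.0 filesrc location=/tmp/test.jpg ! jpegdec ! imagefreeze ! videoconvert ! kmssink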
Using both cameras simultaneously
The following pipeline is an example of using both cameras simultaneously.
In this example, a short stream from each camera is MJPEG-encoded and saved to an AVI file.
# gst-launch-1.0 v4l2src device=/dev/video0 num-buffers=100 ! video/x-raw,width=1920,height=1080 ! jpegenc ! avimux ! filesink location=./test0.avi \
v4l2src device=/dev/video1 num-buffers=100 ! video/x-raw,width=1920,height=1080 ! jpegenc ! avimux ! filesink location=./test1.avi
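To play back one of the recorded files on the display, a pipeline along the following lines can be used (a sketch, assuming test0.avi was recorded as above):
# gst-launch-1.0 filesrc location=./test0.avi ! avidemux ! jpegdec ! videoconvert ! kmssink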