
Hello, we have an issue displaying a camera stream with QML multimedia. The result is not smooth, while with Qt widgets it works fine. A GStreamer pipeline run from the command line works well too.

CCour.1
Associate II

The hardware is an STM32MP157C SoC which provides a Vivante GCnano GPU. Everything is compiled with our specific Yocto layer along with the OpenSTLinux 2.0 release, so the Qt release is 5.14.2 and the GStreamer release is 1.16. The camera output is VGA in UYVY format and the screen resolution is 480x800. Everything was tested in a Weston session or with EGLFS, with no difference.

GStreamer pipeline from command line is:

gst-launch-1.0 -v v4l2src ! "video/x-raw, width=640, height=480, framerate=(fraction)30/1, format=(string)UYVY" ! videoconvert ! videoscale ! video/x-raw,width=480,height=800 ! kmssink driver-name=stm connector-id=32 force-modesetting=true

The result is smooth.

QML applications are started by default with this environment:

QT_QPA_EGLFS_ALWAYS_SET_MODE=1 QT_QPA_EGLFS_INTEGRATION=eglfs_kms QT_QPA_PLATFORM=eglfs QT_QPA_EGLFS_KMS_ATOMIC=1

(because EGLFS is our reference).

We have this issue with the official examples: declarative-camera (or the qmlvideofx example) displays around 7-10 fps, while the camera widgets example reaches around 25-30 fps. We tried to analyze the problem and test some workarounds without understanding what is going on, and it does not seem normal to have good results with Qt widgets but not with QML.

We also tried to play several video files (H264 in VGA or 720p) without issues with the official examples for both methods, QML and Qt widgets. The problem is not related to video playback but to the camera stream only. And according to our monitoring with top and sysfs (for the GPU), memory, GPU and CPU are not overused.

We tried to get more logs, and according to the render loop debug output for the QML example, many frames take 100-250 ms to be generated, which seems to be the issue.

First, we tried in our application to use Camera -> VideoOutput directly. The Camera item is also configured with these instructions:

   // Request UYVY VGA output at 10-30 fps from the camera viewfinder
   QCameraViewfinderSettings settings = camera->viewfinderSettings();

   settings.setPixelFormat(QVideoFrame::Format_UYVY);
   settings.setResolution(640, 480);
   settings.setMaximumFrameRate(30);
   settings.setMinimumFrameRate(10);
   camera->setViewfinderSettings(settings);
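
For reference, the QML side of this first approach is essentially a Camera item feeding a VideoOutput; here is a minimal sketch (assuming the QtMultimedia 5 QML module; the ids are illustrative, and we assume the C++ viewfinder settings above are applied to the underlying QCamera of this item):

   import QtQuick 2.12
   import QtMultimedia 5.12

   Item {
       Camera {
           id: camera
           // the QCameraViewfinderSettings shown above are assumed to be
           // applied to this camera's underlying QCamera
       }
       VideoOutput {
           anchors.fill: parent
           source: camera   // render the viewfinder frames
       }
   }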

Then we tried using MediaPlayer instead, to be able to use the custom pipeline feature:

source: "gst-pipeline: v4l2src ! video/x-raw,width=640,height=480 ! videoconvert ! qtvideosink"

but it does not work any better.
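
For reference, the MediaPlayer variant we tested is essentially the following sketch (assuming the QtMultimedia 5 QML module; the pipeline string is the one quoted above):

   import QtQuick 2.12
   import QtMultimedia 5.12

   Item {
       MediaPlayer {
           id: player
           autoPlay: true
           // custom GStreamer pipeline: qtvideosink hands the frames back to Qt
           source: "gst-pipeline: v4l2src ! video/x-raw,width=640,height=480 ! videoconvert ! qtvideosink"
       }
       VideoOutput {
           anchors.fill: parent
           source: player
       }
   }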

Finally, we tried to use qmlglsink in connection with the GstGLVideoItem item. It works on a PC, but on the target it triggers a segmentation fault in the gcoTEXTURE_GetMipMap function (part of the OpenGL library provided by the GCnano firmware) according to the gdb backtrace. We don't have access to the source code, so it is difficult to identify the issue and how to fix it.

Setting the Qt::AA_ShareOpenGLContexts attribute before starting the QML engine does not change the results.
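
For reference, the QML side of the qmlglsink approach looks roughly like the sketch below (assuming the import name registered by GStreamer's qmlgl plugin; the C++/GStreamer side that points qmlglsink's widget property at this item is not shown):

   import QtQuick 2.12
   import org.freedesktop.gstreamer.GLVideoItem 1.0

   Item {
       GstGLVideoItem {
           // looked up by objectName from C++ and bound to qmlglsink's widget property
           objectName: "videoItem"
           anchors.fill: parent
       }
   }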

On the official examples we also tried several workarounds:

QT_QUICK_NO_TEXTURE_VIDEOFRAMES environment variable set to 1: no change for the QML examples.

QT_GSTREAMER_USE_OPENGL_PLUGIN environment variable set to 1: the camera generates an error and cannot stream (probably due to a required conversion). We also tried to play a video with the qmlvideofx example with this variable set, and the video is not smooth, probably due to uploads/downloads from/to the GPU. So it is probably not relevant to continue in this direction.

We tried to change the QSG_RENDER_LOOP environment variable; by default it seems threaded is used, but the other values don't improve the situation.

Playing with QT_GSTREAMER_WINDOW_VIDEOSINK or QT_GSTREAMER_WIDGET_VIDEOSINK also seemed useless in our case.

So, the hardware allows us to do what we want, according to the Qt widgets and GStreamer results. Having such a big difference between QML and Qt widgets for this kind of task looks like a bug, and we don't see how to fix it in the QtMultimedia code. It is also not normal that qmlglsink triggers a crash inside the GPU firmware's OpenGL library.

Do you have any suggestions?

6 REPLIES
Olivier GALLIEN
ST Employee

Hi @CCour.1​ ,

Sorry for the late reply on this topic.

Some improvements have been made around Qt integration in our latest delivery V2.1, and we have published this wiki page:

https://wiki.st.com/stm32mpu/wiki/How_to_build_and_use_an_SDK_for_QT

Could you please have a look and give it a try?

Then let us know the status so we can resume the investigation if needed.

Hope it helps,

Olivier

Olivier GALLIEN
In order to give better visibility on the answered topics, please click on 'Accept as Solution' on the reply which solved your issue or answered your question.
Olivier GALLIEN
ST Employee

Hi @CCour.1​ ,

I asked an expert to look at your case. His feedback is below:

"The issue may be a bad combination of camera output color format vs gstreamer convert & sink vs qt “stuffs�?…

Maybe you can try to adjust the camera output color format, thanks to the wiki page https://wiki.st.com/stm32mpu/wiki/V4L2_camera_overview#Set_the_pixel_format-2C_resolution_and_framerate . For instance, if the camera output color format is YUV coplanar and the pipeline needs to overlay information (fps) on top of the buffers, then there are probably software color conversions, which would be easy to spot as a high CPU load.

The issue may also be a badly programmed camera output resolution (too big); please check the GStreamer pipeline inside the application.

Note: it would be nice to have more debug input, with the following information captured while the use case is running:

Display framerate: https://wiki.st.com/stm32mpu/wiki/How_to_monitor_the_display_framerate

gpu load: https://wiki.st.com/stm32mpu/wiki/How_to_monitor_the_GCNANO_GPU_load

perf top: https://wiki.st.com/stm32mpu/wiki/Perf

cpu load with mpstat: https://wiki.st.com/stm32mpu/wiki/Sysstat_tool_suite#Using_mpstat

And just to be sure, could you please ask for the confirmation that:

  1. the issue is only with the ov5640 camera (i.e. no issue with video playback? This is my understanding of "The problem is not related to video playback but to the camera stream only")?
  2. there is no issue with the default Wayland setup and the following command for a fullscreen preview (here)

"

Hope this helps you progress.

Waiting for your feedback.

Olivier

Olivier GALLIEN
In order to give better visibility on the answered topics, please click on 'Accept as Solution' on the reply which solved your issue or answered your question.

I'm working with an ov7690 camera and my custom driver, which does not yet support the RGB colorspace well, so I can't perform this test for the moment.

So to answer your questions:

  1. Yes, video playback works fine. I tested different video files:
    1. 640x480 at 935 kb/s, 25 FPS, AVC format in MP4 container
    2. 560x320 at 466 kb/s, 30 FPS, AVC format in MP4 container
    3. 640x360 at 731 kb/s, 30 FPS, AVC format in MP4 container
    4. 848x464 at 1 082 kb/s, 30 FPS, AVC format in MP4 container
  2. Yes, displaying a direct GStreamer pipeline fullscreen under Wayland works fine. But it also works for the Qt widgets demo app (only the QML app has this issue!)

In attachments are the logs that you wanted. The command line for each application is in the logs. Just to sum up:

  1. GStreamer: smooth,
  2. camera (Qt widgets camera demo): smooth,
  3. declarative-camera (Qt QML camera demo): shows the issue.

Apparently I can't upload several files in one message, so here is a new message for each new log file.


Olivier GALLIEN
ST Employee

Hi @CCour.1​ ,

Here is a first analysis (from the expert):

1) gstreamer.log

In your use case, the color conversion from UYVY to (probably) AR24 is performed in software by GStreamer thanks to the videoconvert plugin (the "orcexec.jvJcla" task is linked to the ARM Neon SIMD code of the orc GStreamer package, which is used by the videoconvert plugin).

The display framerate is "low", but I think it is acceptable in your use case as you mentioned that it is "smooth".

The GPU is not used at all because this use case uses the GStreamer kmssink, which is direct access to the DRM display chain, i.e. without using the GPU or any GPU-based composition framework.

2) camera-widget.log

In this use case, it seems that there are 2 software color conversions: the first one is from the camera output format to ABGR (task "libQt5Gui.so.5.14.1"), maybe performed by Qt; the second is from ABGR to ARGB and is performed by the GPU driver… but in software, which is strange (task "_UploadABGRtoARGB"). Maybe the texture input format is not properly aligned in memory, or something else, but the GPU needs to do a software conversion before using it…

3) declarative-camera.log

The display fps looks similar to the 2 other configurations (~7 fps). The GPU is used, but not a lot, and the CPU load is low, so the system is probably spending its time waiting for something…

- Do you have details regarding the underlying GStreamer pipeline used by the Qt QML API?

- Maybe the V4L2 camera handling in QML is not working or working badly (buffer copies, scaling...); could you please check the Qt QML version?

- In gstreamer.log, you were using cropping to be able to see the camera preview content on the kmssink; here in QML, are there any particular scaling or cropping configurations?

- It would be nice to check whether dmabuf camera output buffers are produced, thanks to these 2 wiki links:

https://wiki.st.com/stm32mpu/wiki/V4L2_camera_overview#How_to_trace

https://wiki.st.com/stm32mpu/wiki/V4L2_camera_overview#How_to_debug

- Could you please provide the dmesg information during this use case?

Moreover:

- When you wrote "Yes, video playback works fine", were you testing videos with Qt QML (i.e. close to the Qt QML camera demo) or in a different manner?

Olivier

Olivier GALLIEN
In order to give better visibility on the answered topics, please click on 'Accept as Solution' on the reply which solved your issue or answered your question.