UVC Video encoder (zephyr,videoenc) + H264 support + STM32 STM32N6 conf files #97425
Conversation
Moving assignee to Video subsystem maintainer, as it seems more appropriate to the content of the RFC.
Adding a DNM flag for PR dependencies.
Thanks for this submission. It makes sense to me as it is, and AFAICT the FIXMEs are for planning future infrastructure of the video area, not for this particular video sample.
samples/subsys/usb/uvc/src/main.c
Outdated
ret = video_stream_start(video_dev, VIDEO_BUF_TYPE_OUTPUT);
if (ret != 0) {
	LOG_ERR("Failed to start %s", video_dev->name);
	return ret;
}
ret = video_stream_start(uvc_src_dev, VIDEO_BUF_TYPE_INPUT);
if (ret != 0) {
	LOG_ERR("Failed to start %s", uvc_src_dev->name);
	return ret;
}
#endif
How about using videoenc_dev for both here? Then if a videoscaler_dev is introduced, this becomes another #if DT_HAS_CHOSEN(zephyr_videoscaler) that uses the associated videoscaler_dev for both input/output?
You have to find a better way to handle it; using another #if DT_HAS_CHOSEN() is not acceptable to me.
reworked
samples/subsys/usb/uvc/src/main.c
Outdated
#if DT_HAS_CHOSEN(zephyr_videoenc)
	vbuf_enc_in = &(struct video_buffer){.type = VIDEO_BUF_TYPE_OUTPUT};

	if (video_dequeue(video_dev, &vbuf_enc_in, K_NO_WAIT) == 0) {
This works well for this sample, but this is where things will start to become difficult.
One way to solve this may be to have a configuration table at the top:
video_dst = video_dev;
prev = &video_dst;
#if DT_HAS_CHOSEN(zephyr_videoisp)
videoisp_src = *prev;
*prev = videoisp_dev;
prev = &videoisp_dst;
#endif
#if DT_HAS_CHOSEN(zephyr_videoscaler)
videoscaler_src = *prev;
*prev = videoscaler_dev;
prev = &videoscaler_dst;
#endif
#if DT_HAS_CHOSEN(zephyr_videoenc)
videoenc_src = *prev;
*prev = videoenc_dev;
prev = &videoenc_dst;
#endif
videousb_src = *prev;
*prev = videousb_dev;
Then in each block, it is possible to just use _src and _dst.
Of course a component-based system is better, but this would be a different sample for libMP. :]
This is just for discussion and not necessarily to implement in this PR, though, as there is only one M2M device.
Force-pushed from 75eec12 to 3edf0b3
I think I caught 2 small issues, and found a few things that you might be curious about but not causing issues.
Thanks, getting close!
CONFIG_VIDEO_BUFFER_POOL_NUM_MAX=2
CONFIG_VIDEO_BUFFER_POOL_SZ_MAX=24576
CONFIG_VIDEO_LOG_LEVEL_WRN=y
CONFIG_VIDEO_ENCODER_H264=y
I see the only significant differences to prj.conf are CONFIG_VIDEO_ENCODER_H264=y and CONFIG_VIDEO_ENCODER_JPEG=y; if so, then documenting them and passing them on the build command line is the right way.
Also please add
sample.subsys.usb.uvc.encoder.h264:
  depends_on:
    - usbd
  tags: usb video
  extra_configs:
    - CONFIG_VIDEO_ENCODER_H264=y
  extra_args:
    - EXTRA_DTC_OVERLAY_FILE="app_h264enc.overlay"
  filter: dt_chosen_enabled("zephyr,camera")
  integration_platforms:
    - fancy_stm32n6_board
to the samples/subsys/usb/uvc/sample.yaml. And a similar entry for the jpeg.
Tests via sample.yaml added. I haven't updated the documentation yet; I will add a commit for that.
Force-pushed from 2c36247 to c36e8ce
Force-pushed from c36e8ce to dd47e3c
Force-pushed from dd47e3c to da92d84
/* Rough estimate for the worst case */
fmt->pitch = 0;
fmt->size = fmt->width * fmt->height * 2;
Exactly the same code as in the previous case; could this be a fallthrough?
I originally put this separately since, with a better estimate, it will most probably end up different from JPEG. But yes, since it is currently the same, maybe I could just merge the two cases.
Yes, it's way too much for H.264 (equivalent to a raw uncompressed image frame). I think for H.264 it depends on a number of parameters like profile, bitrate, etc., which cannot be passed to this function (where we have only width / height).
Addition of a helper function which takes care of dequeuing from a source video device and queueing into a sink video device. Signed-off-by: Alain Volmat <[email protected]>
Replace the video_dequeue / video_enqueue buffer exchange code with the video_transfer_buffer helper function. Signed-off-by: Alain Volmat <[email protected]>
In preparation for the introduction of video encoder support, add an indirection for handling the buffers of the UVC source device. Currently this is only video_dev; however, it can also be an encoder device once an encoder is introduced between the video capture device and the UVC device. Signed-off-by: Alain Volmat <[email protected]>
Allow creating a pipeline as follows: camera receiver -> encoder -> uvc. If the chosen zephyr,videoenc is available, the sample will pipe the camera receiver to the encoder and then to the UVC device, instead of the camera receiver directly to the UVC. The current implementation has several points hardcoded for the time being: 1. the intermediate pixel format between the camera receiver and encoder is set to NV12. This shouldn't be hardcoded and should instead be discovered as a commonly supported format of the encoder / video dev. 2. it is assumed that the encoder device does NOT perform any resolution change and that the encoder output resolution is directly based on the camera receiver resolution. Thanks to this, the UVC exposed formats are the encoder output pixel format & camera receiver resolutions. Signed-off-by: Alain Volmat <[email protected]>
Add a rough estimate of the worst-case H264 output size. Signed-off-by: Alain Volmat <[email protected]>
This commit prepares the introduction of UVC Frame Based support by using struct uvc_frame_descriptor as the parameter of most of the UVC functions. struct uvc_frame_descriptor contains the fields common to all supported frame types; depending on the DescriptorSubtype, the pointer is then cast to the correct struct definition. Signed-off-by: Alain Volmat <[email protected]>
The frame-based descriptors differ from the frame descriptors in that there is no dwMaxVideoFrameBufferSize field. To handle that, add a new uvc_frame_based_discrete_descriptor structure to be used to fill proper information into the frame descriptor. In addition, a new format descriptor is also added for frame-based transfers. Signed-off-by: Alain Volmat <[email protected]>
Add proper checks of the return values of video_enqueue / video_dequeue. Signed-off-by: Alain Volmat <[email protected]>
Add overlay files to enable usage of the encoder in the UVC sample. This works with platforms defining the node labels zephyr_jpegenc or zephyr_h264enc. The mode can be selected by using -DFILE_SUFFIX="jpegenc" or -DFILE_SUFFIX="h264enc" when building the sample, while also adding -DCONFIG_VIDEO_ENCODER_JPEG or -DCONFIG_VIDEO_ENCODER_H264 on the command line. Signed-off-by: Alain Volmat <[email protected]>
Add zephyr_h264enc and zephyr_jpegenc labels on nodes in order to be able to use the VENC and JPEG codecs from samples. Signed-off-by: Alain Volmat <[email protected]>
Add entries in sample.yaml enabling the h264enc / jpegenc UVC-based tests on the stm32n6570_dk/stm32n657xx/sb platform. Signed-off-by: Alain Volmat <[email protected]>
Force-pushed from da92d84 to 1ab716d
Had to push a new version to correct an issue in the
Thanks for finding a simple solution that works for all currently supported hardware!
This looks like nothing, but it saves a lot of variable definitions in the samples 👍
Allow creating a pipeline as follows:
camera receiver -> encoder -> uvc
I am posting this PR while there are still some points to improve (cf. the hardcoded items detailed below) in order to get some first feedback. Since this depends on the UVC PR (PR #93192) and the DCMIPP UVC PR (PR #94562), there are lots of commits in this PR. However, only the LAST 8 COMMITS are relevant for this PR.
If the chosen zephyr,videoenc is available, the sample will pipe the camera receiver to the encoder and then the UVC device instead of directly the camera receiver to the UVC.
In order to make the change as simple as possible, the source device of the UVC device is renamed uvc_src_dev (instead of video_dev previously) since, depending on the configuration, the UVC source might be either video_dev or the encoder device.
Current implementation has several points hardcoded for the time being:
1. the intermediate pixel format between the camera receiver and encoder is set to NV12. This is temporary until a proper analysis of the video_dev caps and encoder caps is done, allowing selection of a format common to the two devices.
2. it is assumed that the encoder device does NOT perform any resolution change and that the encoder output resolution is directly based on the camera receiver resolution. Thanks to this, the UVC exposed formats are the encoder output pixel format & camera receiver resolutions.
This has been tested using the STM32N6-DK and the JPEG codec, leading to the following pipe:
IMX335 -> CSI/DCMIPP -> JPEG -> UVC
compiled via:
And also VENC codec, leading to the following pipe:
IMX335 -> CSI/DCMIPP -> VENC -> UVC
compiled via: