
Query about the timestamp of image frame #795

Open
lenardxu opened this issue Sep 21, 2022 · 4 comments

@lenardxu

I am working on a project that synchronizes image and IMU readings. To make the sync more precise, I'd like to ask how the timestamp is assigned to an image frame acquired, say, via the depthai.DataOutputQueue.get() method. Is it defined as the instant when exposure finishes and the image is (fully) read out from the sensor (see your documentation: https://docs.luxonis.com/projects/hardware/en/latest/pages/guides/sync_frames.html?#software-soft-sync)? Or is it the instant when exposure begins and image capture starts, as is common in related papers (see Section "B. Sensor Synchronization and Data Acquisition" of this paper: https://ieeexplore.ieee.org/document/6906892)?
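
For context, here is roughly how I grab both streams on the host (a minimal sketch of my setup; the IMU sensor type and rates are placeholders, not something specified elsewhere in this issue):

```python
import depthai as dai

# Minimal pipeline: one color stream and one IMU stream, both sent to the host.
pipeline = dai.Pipeline()

cam = pipeline.create(dai.node.ColorCamera)
cam.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
xoutRgb = pipeline.create(dai.node.XLinkOut)
xoutRgb.setStreamName("rgb")
cam.video.link(xoutRgb.input)

imu = pipeline.create(dai.node.IMU)
imu.enableIMUSensor(dai.IMUSensor.ACCELEROMETER_RAW, 400)  # placeholder rate
imu.setBatchReportThreshold(1)
imu.setMaxBatchReports(10)
xoutImu = pipeline.create(dai.node.XLinkOut)
xoutImu.setStreamName("imu")
imu.out.link(xoutImu.input)

with dai.Device(pipeline) as device:
    qRgb = device.getOutputQueue("rgb", maxSize=4, blocking=False)
    qImu = device.getOutputQueue("imu", maxSize=50, blocking=False)
    while True:
        frame = qRgb.get()    # dai.ImgFrame -- what event does its timestamp mark?
        print("frame", frame.getSequenceNum(), frame.getTimestamp())
        imuData = qImu.get()  # dai.IMUData
        for packet in imuData.packets:
            print("accel", packet.acceleroMeter.getTimestamp())
```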

@themarpe
Collaborator

@lenardxu

The image frame timestamp is currently assigned when MIPI streaming starts, which is immediately (a few microseconds) after the exposure ends (full-frame exposure done for global shutter, first-row exposure done for rolling shutter).
This has the advantage that the frame interval is fixed from frame to frame in continuous streaming mode, regardless of varying exposure time, because the exposure is end-aligned within the frame period: https://docs.luxonis.com/projects/hardware/en/latest/pages/guides/sync_frames.html?#synchronizng-frames-externally
Also, for hardware-synced multiple cameras, the FSIN/FSYNC signal is aligned with the end of exposure / MIPI SoF.

Quoting @alex-luxonis ^
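
(As an illustrative aside, not part of the quote above: the fixed frame interval can be observed on the host by printing deltas between consecutive frame timestamps, e.g. using an output queue named "rgb" as in the earlier sketch in this thread.)

```python
# Illustrative check: with end-aligned exposure, the delta between consecutive
# frame timestamps should stay ~constant (1/FPS) even while auto exposure varies.
prev = None
while True:
    ts = qRgb.get().getTimestamp()   # qRgb: the "rgb" output queue from above
    if prev is not None:
        print("frame interval: %.3f ms" % ((ts - prev).total_seconds() * 1e3))
    prev = ts
```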

@alex-luxonis
Collaborator

@lenardxu The timestamp is assigned at the MIPI SoF (start of frame) event, when the sensor starts streaming the frame. For global shutter sensors, this follows immediately after the exposure of the whole frame has finished, so we can say the assigned timestamp is aligned with the end of the exposure window (within a margin of a few microseconds). This is marked on the timing diagram from our docs (that you linked above):
[timing diagram from the sync_frames docs, with the MIPI SoF / end-of-exposure point marked]

For rolling shutter sensors (such as our high-resolution RGB sensors) it is slightly different: MIPI SoF follows after the first row of the image has been fully exposed and is being streamed, while the following rows are still exposing or may not have started exposing yet (depending on the exposure time).
With the current configs, the MIPI readout time for the IMX378, depending on the configured resolution, is:

  • THE_1080_P (default resolution) - 16.52ms
  • THE_4_K - 23.58ms
  • THE_12_MP - 33.04ms
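
For illustration only (a simplified model that assumes rows finish exposing at a constant rate across the readout time above; not an exact description of the sensor), the exposure midpoint of a given row could be approximated from the frame timestamp like this:

```python
from datetime import timedelta

def row_exposure_midpoint(frame_ts, readout, exposure, row, num_rows):
    # frame_ts : frame.getTimestamp() == MIPI SoF == end of exposure of the first row
    # readout  : MIPI readout time for the configured resolution (see list above)
    # exposure : exposure time of this frame
    # Assumes rows finish exposing at a constant rate over the readout window.
    row_end_of_exposure = frame_ts + readout * (row / (num_rows - 1))
    return row_end_of_exposure - exposure / 2

# Example: middle row of a 1080p frame with an 8 ms exposure (placeholder values)
mid = row_exposure_midpoint(frame_ts=timedelta(seconds=12.345),
                            readout=timedelta(milliseconds=16.52),
                            exposure=timedelta(milliseconds=8),
                            row=540, num_rows=1080)
```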

@lenardxu
Author

@themarpe, @alex-luxonis First off, my hearty thanks for your comprehensive clarification of the timestamping method you have adopted, which is a clever approach.

I'd also like to double-check my understanding of your timestamp assignment, since it differs from the common convention in which the timestamp is defined at the start of image capture and exposure, and I am not familiar with the concept of MIPI SoF. Based on your explanation and the marked plot, taking the global shutter case for convenience: the timestamp is always set at the end of the corresponding frame interval, where the exposure that is end-aligned within it also finishes. For example, for frame_0 the assigned timestamp is effectively the instant at which frame_1 is about to be captured.

Request for an update/extension to another timestamping method

Finally, I'd like to request an update/extension from your current timestamping method to one that assigns the mid-exposure time as the frame timestamp. The motivation is better sync quality (whether for hardware or software sync); see Section "B. Sensor Synchronization and Data Acquisition" and the experiment section of this paper (https://ieeexplore.ieee.org/document/6906892). According to their results, switching to aligning the IMU trigger with the timestamp (mid-exposure time) of the triggered image frame improves synchronization greatly. From your description I understand this would be hard, since the fixed interval between frames would be broken by a simple switch to mid-exposure time, but I still think it deserves your attention.
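
In the meantime, as a host-side workaround I plan to try shifting the timestamp back by half the exposure time myself (a sketch only; it assumes ImgFrame.getExposureTime() is available in the installed depthai version and, for simplicity, a global shutter sensor where the timestamp marks the end of exposure of the whole frame):

```python
# Approximate the mid-exposure instant on the host by shifting the device
# timestamp (end of exposure / MIPI SoF) back by half the exposure time.
frame = qRgb.get()
exposure = frame.getExposureTime()                # datetime.timedelta
ts_mid_exposure = frame.getTimestamp() - exposure / 2
```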

@lenardxu
Author

@themarpe, @alex-luxonis There is still one question about exposure time. Given your current timestamping method and auto exposure by default, I'd like to try using a fixed exposure in my software synchronization program (image frame as the visual cue, IMU measurements as the triggered stream) to see whether aligning the mid-exposure time with the IMU timestamps improves sync quality. However, I don't know what length of fixed exposure (say, within 0-33.3 ms) is reasonable, so I'd like to ask for your opinion.
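
For reference, this is how I intend to fix the exposure while experimenting (a sketch; the 8000 µs / ISO 400 values are placeholders I would tune, not values suggested in this thread):

```python
import depthai as dai

pipeline = dai.Pipeline()
cam = pipeline.create(dai.node.ColorCamera)
# Disable auto exposure by fixing the exposure time (microseconds) and ISO sensitivity.
cam.initialControl.setManualExposure(8000, 400)  # placeholder values to tune
```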
