When should an image be acquired? Should acquisition run continuously, capture a single frame at an exact moment, start continuously from a given instant, or stop at a given instant? These questions arise from the needs of each application, and answering them correctly is crucial to a successful implementation. Several methods are available to trigger an image acquisition, offering different levels of control over acquisition timing. Timing control among multiple cameras can be accomplished by synchronizing the cameras to each other or to an image acquisition device; this synchronization must also follow the event to be captured.
Cameras can generally run in free-running mode or in triggered acquisition mode. A single camera is triggered to synchronize it with the event to be captured. In an acquisition chain, the camera and the acquisition device are separate components, so either one can be triggered. Triggering only the acquisition device while the camera runs in free-running mode creates a variable delay between the time the device receives the trigger and the time the image is acquired, because the device must finish acquiring the current frame before it can acquire the triggered frame.
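The variable delay described above is easy to quantify: a free-running camera starts frames on its own schedule, so the wait after a trigger depends on where within the current frame the trigger lands. The following is a minimal sketch (not a vendor API; the timings are assumed numbers for illustration):

```python
# A free-running camera outputs a frame every `frame_period` seconds.
# If only the acquisition device is triggered, it must wait for the
# current frame to finish, so the trigger-to-image delay varies
# between zero and one full frame period.

def free_running_trigger_delay(trigger_time: float, frame_period: float) -> float:
    """Time the grabber waits after the trigger before a full frame starts."""
    elapsed_in_frame = trigger_time % frame_period
    if elapsed_in_frame == 0.0:
        return 0.0
    return frame_period - elapsed_in_frame

# A 30 fps camera has a ~33.3 ms frame period; the delay depends on
# where within the frame the trigger arrives.
period = 1.0 / 30.0
print(free_running_trigger_delay(0.010, period))  # trigger lands mid-frame
```

Triggering the camera itself (rather than only the grabber) removes this uncertainty, which is why camera triggering is preferred when timing matters.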
In the past, with cameras that output standard TV formats, asynchronous reset allowed the camera to output a frame immediately upon receiving a trigger. Without asynchronous reset capability, the image acquisition device waits for the camera to finish outputting the current frame before acquiring the next full frame.
Triggering Modes of a Line Scan Camera:
A line scan camera can also be triggered to acquire a line, or a variable number of lines, asynchronously. To synchronize with the motion of the object, a trigger from a motor encoder, or some other signal of varying frequency, starts each line acquisition. In many line scan applications the exact size of the object is not known in advance. Encoder signals are therefore used to generate a virtual frame trigger: the frame size (the number of lines that form a frame) may vary, so the acquired image has a "variable height" and the object is guaranteed to fit in one image, in contrast to area scan cameras, which may capture only part of an object because of a fixed image size. Line and virtual frame triggers together allow lines and frames of variable size to be acquired according to the object's speed of motion and the required size.
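The point of encoder-driven line triggering is that each line corresponds to a fixed distance of travel rather than a fixed time, so the image resolution along the motion axis stays constant at any speed. A small sketch of that arithmetic, with assumed example numbers (the encoder, roller, and pixel values are not from any specific product):

```python
# Deriving the encoder divider that makes a line scan camera acquire
# one line per fixed distance of object travel, regardless of speed.

def encoder_divider(encoder_ppr: int, roller_circumference_mm: float,
                    pixel_size_mm: float) -> float:
    """Encoder pulses corresponding to one scan line (one pixel of travel)."""
    pulses_per_mm = encoder_ppr / roller_circumference_mm
    return pulses_per_mm * pixel_size_mm

# 5000-pulse encoder on a 100 mm circumference roller, with 0.1 mm
# object pixels: trigger one line every `div` encoder pulses.
div = encoder_divider(5000, 100.0, 0.1)
print(div)  # 5.0 pulses per line
```

If the object slows down, the encoder pulses slow down with it and the line rate follows automatically, which is exactly why encoder triggering is preferred over a fixed internal line clock for variable-speed transport.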
In the past, special genlock boards or boxes were generally used to synchronize multiple cameras.
Genlocking is the term for sharing timing information from one camera or source, usually referred to as the master, with one or more additional cameras, usually referred to as slaves. The image acquisition device locks onto the timing signal of the master camera and then digitizes any of the slave camera signals using the same timing information.
There are also situations where multiple cameras are connected to multiple image acquisition devices (say, frame grabbers, in the same host system or in separate host systems). Many frame grabbers provide board-sync connection ports to link one board to another, and the same master/slave concept is applied for acquisition. With newer boards and cameras, data-forwarding techniques are used to reach higher acquisition speeds while remaining synchronized.
Ethernet cameras rely on precise timing and synchronization for accurate data transmission. The Precision Time Protocol (PTP) synchronizes the clocks of IP-connected devices to sub-microsecond accuracy, preserving data integrity and enabling real-time control in Ethernet cameras.
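At the heart of PTP is a two-way timestamp exchange from which a slave device estimates its clock offset from the master, assuming a symmetric network path. A sketch of that arithmetic follows; the timestamps are made-up numbers, not from a real device:

```python
# IEEE-1588/PTP offset estimation from one Sync / Delay_Req exchange.
# t1: master sends Sync; t2: slave receives it;
# t3: slave sends Delay_Req; t4: master receives it.

def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """Slave clock offset from master and one-way path delay,
    assuming the network path is symmetric."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

# Example: slave clock runs 150 us ahead of master, path delay 50 us.
t1 = 0.000000
t2 = 0.000200   # t1 + 50 us path delay + 150 us offset
t3 = 0.000300
t4 = 0.000200   # t3 - 150 us offset + 50 us path delay
offset, delay = ptp_offset_and_delay(t1, t2, t3, t4)
print(offset, delay)  # ~150 us offset, ~50 us path delay
```

The slave then steers its clock by the estimated offset; repeating the exchange continuously is what keeps devices aligned to sub-microsecond accuracy.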
Synchronization is critical in multi-camera arrays, especially in volumetric capture and other 3D reconstruction applications. Multiple cameras connected to multiple digital video recorders (DVRs) can be precisely shutter-synchronized as well as time-synchronized, so that frames recorded together by all cameras carry the same timestamps for correlating the images. IEEE 1588 Precision Time Protocol (PTP) is supported through the 10GbE interface: multiple DVRs can be synchronized to a PTP grandmaster clock, or one DVR can operate as master and distribute PTP time to the other DVRs itself. Other synchronization options include a SMPTE LTC input, a SMPTE tri-level sync input, and GPIO pins for synchronization between the recorders as well as with external devices.
In security applications, synchronization is typically based on the system clock and time of the capturing device. Depending on the model, these cameras can also be shutter-synchronized through additional inputs.
False synchronization is also a subject of research and requires effective troubleshooting mechanisms to detect.
In stereo vision systems and depth estimation, synchronization between the two captured images is crucial for accurate disparity and depth estimation. A key validation step for any stereo system used on dynamic scenes is to physically quantify the delay between the two cameras introduced by the electronic hardware and the underlying software. One cost-effective way of measuring the delay between two nominally "synchronized" cameras is to use the camera pair to capture an object moving at a known velocity.
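The known-velocity check above reduces to simple arithmetic: any apparent shift of the moving target between the two "simultaneous" frames, beyond the expected stereo disparity, is the product of velocity and inter-camera delay. A sketch with assumed example numbers (the speed, magnification, and shift are illustrative only):

```python
# Convert the apparent position difference of a target moving at a
# known velocity into the inter-camera delay of a stereo pair.

def inter_camera_delay_s(position_diff_px: float, px_per_mm: float,
                         velocity_mm_s: float) -> float:
    """Delay implied by the target's residual position difference
    between the two nominally simultaneous frames."""
    position_diff_mm = position_diff_px / px_per_mm
    return position_diff_mm / velocity_mm_s

# Target moving at 500 mm/s appears shifted 4 px (at 10 px/mm)
# between the left and right frames beyond the expected disparity:
print(inter_camera_delay_s(4.0, 10.0, 500.0))  # ~0.8 ms
```

A delay of this magnitude translates directly into depth error for moving objects, which is why quantifying it is treated as a validation step rather than an optional extra.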
When reconstructing a 3D object, it is difficult to obtain accurate 3D geometric information with a single camera; capturing detailed geometry inevitably requires more cameras. These cameras, however, need to be synchronized so that frames are captured simultaneously. If cameras are incorrectly synchronized, many artifacts appear in the reconstructed 3D object. One research paper suggests a synchronization method that handles an arbitrary number of cameras by adjusting the number of hosts to support stable data transmission. The method establishes a master-slave architecture to synchronize the system clocks of the hosts. While synchronizing the system clocks, the delays introduced by the synchronization process itself can be estimated, so that the difference between the system clocks is minimized. With the host clocks synchronized, cameras connected to different hosts can then be synchronized based on the timestamps of the data received by the hosts.
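Once the host clocks agree to within a known residual offset, the final step described above is to correlate frames across hosts by timestamp. A minimal sketch of that pairing step, with made-up timestamps (the offset and tolerance values are assumptions for illustration):

```python
# Pair frames from cameras on two different hosts by nearest
# offset-corrected timestamp, once the host clocks are synchronized.

def pair_frames(ts_a, ts_b, offset_b, tolerance):
    """Match each timestamp in ts_a with the closest timestamp in ts_b
    after removing host B's residual clock offset; keep only pairs
    closer than `tolerance` seconds."""
    pairs = []
    for ta in ts_a:
        tb = min(ts_b, key=lambda t: abs((t - offset_b) - ta))
        if abs((tb - offset_b) - ta) <= tolerance:
            pairs.append((ta, tb))
    return pairs

# Host B's clock reads 2 ms ahead of host A; cameras run near 30 fps.
a = [0.0000, 0.0333, 0.0667]
b = [0.0021, 0.0354, 0.0689]
print(pair_frames(a, b, offset_b=0.002, tolerance=0.005))
```

Frames that find no partner within the tolerance are dropped rather than mismatched, which is what prevents the reconstruction artifacts the paragraph above describes.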
In vehicle-mounted cameras providing a 360-degree view, synchronization of the multiple cameras is again crucial: it minimizes blind spots and reduces false alarms. Synchronized multi-camera systems can also provide real-time information that enables these systems to make informed decisions.
Apart from synchronizing the camera with external events or with other cameras, motorized or otherwise controllable lenses must also be synchronized to achieve consistent images in critical applications. Many high-speed cameras, and high-speed events that unfold in milliseconds, need synchronized strobe lights to capture meaningful, project-specific images; even standard machine vision applications may demand this.
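For strobe synchronization, the practical constraint is that the light pulse must fall entirely inside the camera's exposure window measured from the common trigger. A simple timing check, with assumed numbers and no specific hardware in mind:

```python
# Check whether a strobe pulse falls fully within the camera's
# exposure window; all times in seconds from the common trigger.

def strobe_inside_exposure(exposure_start: float, exposure_len: float,
                           strobe_delay: float, strobe_len: float) -> bool:
    """True if the strobe pulse is fully contained in the exposure window."""
    strobe_start = strobe_delay
    strobe_end = strobe_delay + strobe_len
    return (strobe_start >= exposure_start and
            strobe_end <= exposure_start + exposure_len)

# 100 us exposure opening 20 us after the trigger; 10 us strobe at 50 us:
print(strobe_inside_exposure(20e-6, 100e-6, 50e-6, 10e-6))  # True
```

In high-speed work the strobe, not the shutter, effectively freezes the motion, so getting this window alignment right is what makes the captured image "meaningful" in the sense used above.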
Disclaimer: The above article is only for informative purposes. The contents are self-written or taken from different sites available. If anyone finds the content or usage objectionable, they can mail us at [email protected] for corrective measures. Readers may recheck the contents for correctness.
Article created by: S Sukumar – Director Online Solutions (Imaging) Pvt. Ltd., Chennai India on 17 January 2024.
We'll be glad to help you! Please contact our Sales Team for more information.