For some reason, a few people have started asking me about the TBS ZeroZero camera, with its claim of sub-1ms latency. I am very skeptical of this claim, and here's why.
Everyone I've seen claiming to show sub-1ms latency is using the "point a camera at a stopwatch" method of measuring latency. In this method, you film a stopwatch app on your smartphone with the camera, then compare the time shown on the phone with the time shown in the camera's output; the difference between the two is supposedly the latency.
The problem with this is that the video feed coming out of the camera is either 30 fps (NTSC) or 25 fps (PAL). The display only updates once per frame, so the finest resolution this method can achieve is one frame period: 1/30th of a second (about 33 ms) for NTSC or 1/25th (40 ms) for PAL. You simply cannot measure with more precision than that if you are basing your measurement on the camera's output alone.
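To make the quantization concrete, here's a small sketch (my own illustration, not anyone's actual test setup). It models the output feed as updating only on frame boundaries: any change that happens between two frames first becomes visible at the next boundary, so wildly different latencies produce the same screenshot.

```python
import math

FRAME_MS = 1000 / 30  # ~33.3 ms between frames on a 30 fps (NTSC) feed

def displayed_at(true_latency_ms, led_on_ms=0.0):
    """Time the change first appears on screen: the next frame
    boundary at or after (LED on + true camera latency)."""
    event = led_on_ms + true_latency_ms
    return math.ceil(event / FRAME_MS) * FRAME_MS

# A 1 ms camera and a 30 ms camera light up the very same output
# frame, so a single stopwatch screenshot cannot tell them apart.
fast = displayed_at(1.0)
slow = displayed_at(30.0)
```

Both calls return the same frame boundary (~33.3 ms), which is exactly why a screenshot can never substantiate a sub-1ms claim.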
So anybody who shows you a screenshot of a stopwatch and says they are measuring camera latency to a precision of less than 33 ms or 40 ms has immediately lost all credibility. They don't know what they're talking about.
Here's a video I made about that, and some other common sources of inaccuracy in camera latency measurement.
The best way of measuring camera latency is not the one I show in that video (although I am a bit proud of it) but the one Oscar Liang and RCSchim use: you flash an LED in front of the camera, use a photodiode to detect when the screen lights up, and measure the time between those two events with an oscilloscope or an Arduino. Each individual measurement is still quantized by the camera's output framerate, but you can take many measurements and average the quantization error out, as Oscar and RCSchim do.
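Here's a rough simulation of why averaging works (my own sketch, with a made-up 12 ms latency; it is not modelled on anyone's actual rig). Because the LED fires at a random phase relative to the camera's frame clock, each measurement overshoots the true latency by a random fraction of a frame period. That overshoot averages out to half a frame, so subtracting half a frame from the mean recovers the true latency to well under a frame's worth of precision.

```python
import math
import random

FRAME_MS = 1000 / 30  # frame period of a 30 fps output feed

def measured_latency(true_latency_ms, phase_ms):
    """One LED/photodiode measurement: the screen only changes on a
    frame boundary, so the reading overshoots the true latency."""
    event = phase_ms + true_latency_ms
    display = math.ceil(event / FRAME_MS) * FRAME_MS
    return display - phase_ms

random.seed(0)
true_latency = 12.0  # hypothetical camera latency in ms
samples = [measured_latency(true_latency, random.uniform(0, FRAME_MS))
           for _ in range(10_000)]
mean = sum(samples) / len(samples)

# A single sample is only frame-accurate, but the mean converges to
# (true latency + half a frame period), so subtract that known bias:
estimate = mean - FRAME_MS / 2
```

With enough samples the estimate lands within a fraction of a millisecond of the true value, which is how repeated LED measurements beat the 33 ms floor that a single screenshot is stuck with.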