Hacker News new | past | comments | ask | show | jobs | submit login
Show HN: Is It Snappy? – Measure latency with your iPhone's 240 Hz camera (isitsnappy.com)
293 points by chadaustin on April 3, 2017 | hide | past | favorite | 84 comments



Tangentially relevant, but I've been part of several communities that have discovered a lot of interesting things about some video games using high-speed cameras.

Super Smash Bros. Melee: a game where input lag really matters (and the reason we still play on CRTs) not only has several frames of input lag, but the lag isn't even a constant number of frames: it ranges from 2 to 5 frames.

Tetris: The Grand Master series: another game where input lag has a huge impact not only has several frames of input lag (people have resorted to all sorts of ways to reduce it, including AHK scripts that constantly move the mouse on Windows XP), but the first two games of the series don't even run at 60 Hz. TGM1 runs at a slightly lower rate, which isn't really significant, but TGM2+ runs at 62.68 Hz, which is quite significant and makes some of the challenges a tad harder.

Both of these communities took the measurements a few steps further than what can be done with isitsnappy. They connected LEDs to the buttons so it was easier to know precisely in which frame the button was pressed.

Someone in the Melee community also placed photoresistors close to the screen and used an oscilloscope to know exactly when the brightness changed.
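The LED trick maps nicely onto simple frame math. A hypothetical sketch of the idea (function names and thresholds are mine, not from either community's actual tooling): find the first frame in which the LED is lit and the first frame in which the screen changes, then convert the gap to milliseconds.

```python
def first_frame_above(samples, threshold):
    """Index of the first frame whose brightness exceeds threshold, or None."""
    for i, v in enumerate(samples):
        if v > threshold:
            return i
    return None

def lag_ms(led_samples, screen_samples, fps=240, threshold=128):
    """Lag between LED lighting up and screen changing, in milliseconds."""
    press = first_frame_above(led_samples, threshold)
    change = first_frame_above(screen_samples, threshold)
    if press is None or change is None:
        return None
    return (change - press) * 1000.0 / fps

# e.g. LED lights at frame 10, screen changes at frame 34 -> 100 ms at 240 fps
```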

Not exactly the most relevant anecdotes, but I felt like sharing.


This is so great for Melee, though. The LED tool was amazing and well done, but this puts the measuring tools right in the hands of people! This is an excellent idea that will let everyone measure the lag on their own screens with just a phone! I know you weren't saying anything to the contrary, but I just wanted to express my excitement!


Thanks for sharing though! I love anecdotes like this :)


Nice. That reminds me, Carmack tweeted that he could send an IP packet to Europe faster than he could send a pixel to the screen[1], and this is unfortunately true on modern hardware. It's nice that phones are putting the kind of high-speed cameras that can measure this latency into the hands of consumers. Maybe this will allow gamers to put some pressure on hardware manufacturers to reduce the latency that they add to the systems.

I tried to measure the latency of my own system a little while back. I used a digital camera to record video at 240 fps and measured the time it took for a button press on a DS4 connected over Bluetooth to be reflected on a Mega Drive emulator running the 240p test suite. I can't remember the exact latency, but I think it was around 80ms, which is okay, though there is definitely room for improvement.

[1]: https://twitter.com/id_aa_carmack/status/193480622533120001
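For anyone wanting to reproduce that kind of measurement, the frame math is simple (the ~19-frame figure below is just my back-of-the-envelope for an 80 ms result):

```python
# At 240 fps, each captured frame spans ~4.17 ms, so a latency measured
# as N frames between button press and screen change is N * (1000/240) ms,
# give or take one frame of quantization error.

CAPTURE_FPS = 240

def frames_to_ms(n_frames, fps=CAPTURE_FPS):
    """Convert a frame count from high-speed video into milliseconds."""
    return n_frames * 1000.0 / fps

# An ~80 ms result corresponds to about 19 frames at 240 fps.
```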


Carmack also wrote a Super User answer [1] with a lot more background on that tweet.

[1] https://superuser.com/questions/419070/transatlantic-ping-fa...


> This one’s bizarre: the onboard keyboards on both the 2015 Macbook Pro and Macbook Air have worse latency than an external keyboard. Plugging in an external wireless keyboard improves response latency by 40 ms. What are you doing, Apple?

I noticed this myself when I upgraded to a wireless Magic Keyboard a few weeks ago. I hadn't noticed any difference between the two keyboards before, and assumed that the new one felt snappier due to its shorter key travel.


At least with my Magic Trackpad 2, when plugged into USB on my MacBook Pro, it communicates over USB instead of Bluetooth. So it's worth using USB to cut out some Bluetooth lag.


I was probably not clear, but I'm experiencing less lag over Bluetooth than with the internal keyboard on my MacBook.


Apple is very particular about their human interface experience, IMHO. This could be a case of Apple deciding for people how much lag they want, and being wrong.


I find it hard to believe lag could be a good thing in any way for an input device.


It's not about what people want, it's about the speed of USB polling. Both the internal keyboard and touchpad are on the USB bus in MacBooks; polling USB more slowly saves battery power (less frequent interrupts/CPU wakeups).


With the Magic Mouse "intentionally" misplaced charging port I wouldn't be surprised!


I wonder if this lag changes if you increase the 'key repeat' time; any time I'm on a Mac with this setting at the default value, the system feels slow to me.


" The response latency of the real world is zero. When you pick up an object, your fingers and the object move in lockstep."

This is a misleading and mostly false statement.

The human consciousness requires time for discrete data to be combined by the brain into what we perceive.

It is NOT instantaneous in real life. There are multiple forms of biological latency, from the maximum speed of electrical impulse propagation through neurons to the "compile time" the brain needs to assemble and modify information into a cohesive output for internal human consumption.

It may seem pointless to point this out, but many sources indicate that the delay between real life and perception is ~80 milliseconds, roughly a tenth of a second.

These are speeds highly relevant to the discussion of software lag, as a program running at 60 frames per second updates several times during the ~80 ms of "brain lag".
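To put numbers on that (my arithmetic, using the ~80 ms figure above):

```python
# How many 60 Hz frames elapse during ~80 ms of perceptual lag?
BRAIN_LAG_MS = 80.0
FRAME_MS = 1000.0 / 60.0                       # ~16.7 ms per frame at 60 Hz
frames_during_lag = BRAIN_LAG_MS / FRAME_MS    # ~4.8 frames
```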

(Oh and the reason why you can overcome the 1/10th of a second lag and have your hand move in lockstep with reality is a variety of compensation mechanisms, like proprioception, which allow you to estimate where your hand will be in relation to reality and achieve it successfully).

https://blogs.scientificamerican.com/observations/time-on-th...


Not misleading or false at all. The latency of the real world is 0. Our internal perception latency might be non-zero, but that's on us, not the "real world". When you pile on additional latency between button press and screen feedback, the internal biological latency doesn't cover for it. It adds to it.


The latency of the real world is still limited by the speed of light, as far as I can tell.


Which, in our real world, is 0.000… Picking up a cup, pressing a button, opening a window. The speed of light is irrelevant at these scales.


Actually, by the speed of sound. When you move an object, the pressure wave that travels along it and makes it move with you propagates at the speed of sound in that material. Still pretty fast at human scales, though!


True, but of course any latency introduced by a program is in addition to that brain lag. Even though the program can do work while the brain is processing, to feel snappy it must let the brain begin that processing as soon as possible, i.e. respond as close as possible to instantaneously.


It doesn't matter if it's technically instantaneous, the point is it is our reference point for "instantaneous".


My use for the timing capabilities of the 240Hz camera has been in measuring the shutter speeds of antique film cameras, which tend to get slower with age. So I find myself using a 240Hz camera from 2016 to calibrate and use a film camera from 1941.
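The frame math behind this, for the curious (a sketch of my approach, assuming you count the video frames during which the shutter appears open):

```python
CAPTURE_FPS = 240

def measured_shutter_seconds(open_frames, fps=CAPTURE_FPS):
    """Actual exposure time, given the number of frames the shutter was open."""
    return open_frames / fps

# A healthy 1/60 s shutter should be open for 4 frames at 240 fps;
# an aged one that stays open for 6 frames is actually firing at 1/40 s.
```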


Great work! I've already measured 4 of my gaming systems. The NDS outperformed them all - Rhythm Paradise reacts in 33ms from tap to image change B-)

I can imagine screenshot test results from your app becoming a standard way to show the latency of a certain game setup in internet forums. Ideally, an easily sharable result would have to show several things all in one image:

- 1 frame before you pressed the button

- the frame when you pressed the button

- 1 frame before something changed on the screen

- the frame when something changed on the screen

- plus timing data, obviously.

Since changes are sometimes quite small (like when Mario jumps but he's kind of in the background because you have the joypad in the same shot), one would have to be able to zoom into a part of the screen for each of the 4 images. And maybe mark a part of the screen with a circle.

If you could do these changes, you've got a certified hit on your hands! Keep up the great work!


Ubuntu latency is better than Windows 10 on this hardware?! Way to go!

Better check your video card driver's settings. For NVIDIA on Windows, you can for example alter the number of pre-rendered frames, which IIRC defaults to at least 3. Setting that to 0 and turning off a bunch of other things listed in the settings, decent NVIDIA cards on decent hardware can get down to a latency of 2 frames, which is probably the bare minimum. At 60 Hz, with a screen that doesn't add additional latency, that is 33.3 ms (between the software issuing a render call and the actual output on the screen).

At work we measure latency using an oscilloscope, a photodiode pointed at the (top left of the) screen, and some input event, e.g. from a fast digital output card (i.e. less than a ms to get an output high from software). In software we set an output high in the piece of code of interest, then just measure the time between that rising edge and the photodiode firing. Using a camera is a neat, though somewhat more manual, process.
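In software form, the scope measurement boils down to finding two rising edges. A hypothetical sketch (the trace names and the 2.5 V threshold are my own assumptions): locate the first sample at or above threshold in each trace and convert the sample gap to milliseconds.

```python
def rising_edge_index(trace, threshold):
    """Index of the first sample at or above threshold, or None."""
    for i, v in enumerate(trace):
        if v >= threshold:
            return i
    return None

def edge_to_edge_ms(output_trace, photodiode_trace, sample_rate_hz, threshold=2.5):
    """Time between the digital output going high and the photodiode firing."""
    t0 = rising_edge_index(output_trace, threshold)
    t1 = rising_edge_index(photodiode_trace, threshold)
    if t0 is None or t1 is None:
        return None
    return (t1 - t0) * 1000.0 / sample_rate_hz
```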


On the Windows 10 performance, I did a single test and found something mildly interesting.

I tested in Notepad and have the "Hide pointer while typing" setting enabled, and found the first thing that happens on screen is the pointer becoming hidden. That happens after 50 ms; the character shows up in Notepad 16.7 ms later.

This was measured with a 120 FPS camera and an old 60 Hz monitor.


This is likely because the GPU has dedicated hardware for drawing the cursor on top of the current framebuffer, which can be turned off more quickly than the character can be drawn into the framebuffer.


Indeed it does. When using f.lux the color temperature of the cursor remains unchanged due to this.


You can also observe it in many cases while dragging a window - the composited window's position will lag behind the position of the cursor.


DWM also adds at least one frame of latency (usually two I think?) at all times. If you bypass composition (through exclusive mode fullscreen) that additional compositor latency goes away. Higher refresh rate would also drop that latency, naturally.


Love these high speed cameras on phones. I used my Pixel's 240 fps camera to expose a 120 Hz flicker on subpar LED bulbs: https://www.youtube.com/watch?v=QbenId_F2RQ


Believe it or not, I can see those with my eyes and when I tell people the light flickers, they usually don't believe me so I show it to them on my phone.


Only slightly related, but your phone camera is also useful for checking if your television remote is working.

The otherwise invisible IR LED in the remote lights up when viewed through a phone camera as you press buttons on the remote.

Handy if you suspect the remote is broken, but aren't sure.


At least on recent iPhones, the front-facing camera does not have IR filtering, but the back one does.

Source: I use the front camera to debug one of our products, which uses IR LEDs as a flash to check for obstructions.


I thought most phones had IR filters these days and that this wouldn't work? Am I incorrect here?


They have basic IR "filters" which are used to reduce IR light, but they don't totally block it, which is why you can see a purple light coming from TV remotes and such [0]

0: http://www.wikihow.com/images/6/60/Check-if-a-Remote-Control...


I just checked mine now, a new Moto Z Play so pretty recent, and it's much less noticeable than it used to be. However, going into a dark spot so nothing else could be seen, the LED in the remote flashed a slight purple. Definitely enough to check that it's working, but you need a bit more cover around it to be able to see it. In the past, they used to flash like bright bulbs.


The cell phone camera is a great way to test whether a multimode fiber optic cable is connected at the far end, too.


Our current benchmarks tell us what's happening after the OS sends the keystroke to the text editor and before we hand rendering off to the OS. This tool has the potential to tell us what's happening outside those bounds.

I would like to see a visualization of the recorded sound, so that I can set the "input" frame to exactly when I hit the keyboard. I'll even pay to have this accurately aligned.

Good job so far!


Last night I was going to use my video camera to measure the startup latency of a vibration motor by comparing it to a "Power On" LED. So this HN post is great timing.


This is awesome! I work in the cash register software development world, and our customers are sometimes quite picky about the latency of our systems as a whole (i.e. the time between a scan of an item happening, indicated by the scanner beeping and lighting an LED, and the output being visible on the display). I found myself measuring this total latency using an iPhone (at that time a 4, without the high-speed camera, but nevertheless sufficient to see whether we were below the critical limit of, at that time, 200 ms), and it worked out very well, although I had to load all the videos into external software to count the frames accurately because the phone's own players weren't sufficient and there were no apps like this.
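The manual frame counting could probably be automated along these lines (a hypothetical sketch, not the software I actually used): diff consecutive frames of the region showing the display and flag the first noticeable change.

```python
def mean_abs_diff(frame_a, frame_b):
    """Mean absolute pixel difference between two equal-length frames."""
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)

def first_change_frame(frames, start=0, threshold=10.0):
    """Index of the first frame that differs noticeably from its predecessor."""
    for i in range(max(start, 1), len(frames)):
        if mean_abs_diff(frames[i - 1], frames[i]) > threshold:
            return i
    return None
```

Subtracting the frame index where the scanner's LED lights from the index this returns, then dividing by the capture frame rate, gives the end-to-end latency.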

I will definitely give this tool a try, it could vastly improve future measurements using the same technique :D


Interestingly, your 200 ms is close to the 150 ms that telephony standards regard as the maximum one-way delay for good speech quality. Seems to be rooted in our physiology.


  > The response latency of the real world is zero
No, it's actually the speed of light.


This article mentions latencies on the order of 50 to 100 ms. For reference, the time light takes to travel 3 m/10 ft is 0.00001 ms. For the problem being considered (latency in user interfaces), it IS insignificant, and rounds to zero.
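The arithmetic, for anyone checking:

```python
C = 299_792_458.0          # speed of light in vacuum, m/s
t_ms = 3.0 / C * 1000.0    # time for light to cross 3 m, in milliseconds
# ~0.00001 ms, i.e. about seven orders of magnitude below a 100 ms UI latency
```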


Not sure what you're measuring with 3 m/10 ft, or what would travel that distance at exactly the speed of light. For reference, signals travel at about 2/3 c through fiber, and a bit faster than that through copper.

I was simply pointing out that even interaction of atoms to proximity is propagated with a speed of light, and should not be called zero. Nothing to do with the rest of the article or the excellent app idea.


No, actually latency is a time interval (duration) whereas the speed of light is, well, a speed, and so not directly comparable. But I think we both know what the author meant by the original statement.


If you bring one atom close to a second atom, the second atom will react to the proximity of the first in the time it takes light to travel from the first atom to the second. Atoms never really touch, but the point is that there is always latency.


I think it's actually much more.

I ridiculed a friend when he said he could see a stroboscopic effect from wheels in normal daylight.

But it is actually a real thing which I've experienced myself.

https://en.wikipedia.org/wiki/Wagon-wheel_effect#Truly_conti...


I'm no physics expert, but doesn't quantum entanglement [1] contradict that?

--

[1] https://en.wikipedia.org/wiki/Quantum_entanglement


No because there's no information traveling between the two particles. The particle that gets measured 'first' still has a random state. Also forcing one of the particles into a particular state to try to send a message breaks entanglement.


I thought "instantaneous" would essentially be on the order of 100-200 ms given human reaction times, but apparently we can visually detect latencies down to the 10-20 ms scale.


Microsoft Research showed that anything above 1ms latency is noticeable on touchscreens: https://www.youtube.com/watch?v=vOvQCPLkPt4


Reaction time includes a behavioral response like pressing a button after a stimulus.

https://en.wikipedia.org/wiki/Reaction_time

You also have to consider the speed of the neural pathway.

https://en.wikipedia.org/wiki/Nerve_conduction_velocity


I noticed this earlier today as I intentionally smashed a small piece of glass (for a project) - the smashing and scattering of pieces happened in less than one second, but my subconscious managed to accurately track both the number (~5) and direction of the significant pieces that went flying. That was quite surprising to me.


End-to-end latency is definitely something that deserves attention. It'd be interesting to modify a mouse and keyboard by adding LEDs in series with the switches. That way you'd get a good visual indicator of when the event happened, without any added latency. Combine that with a 144 Hz monitor and an actual high-speed camera (I hear good things about the Chronos camera), and you could do very accurate software latency measurements.

This MSR video from few years back is also pretty cool wrt touch interface latency: https://www.youtube.com/watch?v=vOvQCPLkPt4


Re: the gotchas, it's possible that this is due to how the app is architected in combination with how the phone and OS are architected. The input rate of 3D Touch, for example, is tightly coupled to the display output frame rate, which, in turn, is tightly coupled to the things you do on the main thread. I don't know off-hand if the camera input is coupled to anything in this way, but it's something to look into, and Apple's documentation should indicate this sort of behavior.


Very cool use of standard tech!


Yeah, definitely! Just yesterday I was asking myself how I could measure the latency of my Bluetooth controller connected to my RetroPie setup. The latency seems quite high and I find myself struggling to play Super Mario World properly. Now I can tell whether it's a technical issue or more of a 'getting old' problem.

Thanks for this project, it's really cool.


This is great! Measuring latency is the first step to fixing it, and it's a step most never take. I've previously worked on a method of measuring latency in an automated way without cameras, which may also be of interest: https://google.github.io/latency-benchmark/


Unrelated to the article but coincidental post about the iPhone camera's speed on Reddit yesterday:

https://www.reddit.com/r/apple/comments/63117n/the_speed_of_...


> It is interesting that Atom’s latency gets worse by two frames when Haskell syntax highlighting is enabled.

Heh, that's not so surprising to me. There are several vim plugins for Haskell, and one of them had some very slow features that made the latency go into the "many seconds" range :D

The Macbook keyboards one is ridiculous though.


This is likely related to the Macbook keyboards being connected via SPI internally rather than USB. Maybe a lower polling rate than USB or a higher latency where the interconnect occurs.

This was mentioned on HN[0] late last year regarding 2016 Macbook compatibility.

[0] https://news.ycombinator.com/item?id=12924051


What a crazy cool idea!

Question: what is the standard human physiologic "click" time?

How long does it take to make a click? A mouse click takes longer than a screen tap, no?

Sometimes one will hold a mouse click if they are unsure.

What might be a better way is to record a scenario and measure response time on the replay?

Cool idea regardless


No such thing as a standard human. The racing field has had studies into reaction time for a good while now. F1 drivers are close to 150 ms; people on the street can take as long as 1.5 seconds to react.


> Question: what is the standard human physiologic "click" time?

According to Disney in the 1960's, it takes from 3/4 to a full second[1]. :)

1: https://www.youtube.com/watch?v=dPHGI_5BaSQ&t=570



Oh, you could use this for treadmill calibration:

http://fellrnr.com/wiki/Treadmill_Calibration


Surprised that the author is quoting editor latencies to 0.1 ms when the precision of the camera is only ~4 ms. Are they really repeating these tests 40+ times?
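Rough math on the repetition question, assuming the per-frame quantization error is uniform over one frame time (a modeling assumption on my part): the error's standard deviation is frame_ms / sqrt(12), and averaging N runs shrinks the standard error by sqrt(N).

```python
import math

frame_ms = 1000.0 / 240.0                  # ~4.17 ms per frame
sigma = frame_ms / math.sqrt(12.0)         # ~1.2 ms per single measurement
target = 0.1                               # desired precision in ms
n_runs = math.ceil((sigma / target) ** 2)  # repetitions for 0.1 ms precision
```

Under this model it's closer to ~145 runs than 40, so either the tests are repeated a lot or the extra digit isn't really meaningful.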


It would be nice to also see the latency of cmd.exe and of bash on the XPS 13, and also the latency inside BIOS/UEFI menus.


Very curious to see the numbers for Sublime Text using the same methodology, as Sublime is often praised for its speed.


Awesome! This is also very visible in touch based interfaces, especially in scrolling. I did a video with high speed camera to show how laggy real interfaces are https://www.youtube.com/watch?v=8EhJo2OPR44


I'm building a similar app for measuring UI latencies, but as a desktop app (for regression checks on CI). Your app seems much more universal and platform-agnostic, really cool idea!


Could something like this work with the pixel's camera?


Would love this for Android! Hell, I'd happily buy it!


Are there any Android smartphones with 240Hz cams?


Pixel and Nexus 5x, for a start.


The Galaxy S7 has a 240 Hz camera.


I don't know if it's accurate but it also works on iPad Air 2 which supposedly only does 120 FPS.

The app crashes when attempting to delete captures though.


Yikes! I'll see if I can reproduce. Thanks for the bug report!


Fix is in the App Store! Update should come soon.


Used a similar trick with an old Casio ages ago. Maybe it was an EX-10? Not sure. I think it did 120 fps at 480p, in like 2010.


The next level would be a desktop app that syncs up with your phone and flashes the screen at a precise time while you record video. Then monitor lag could be tested without expensive hardware.

But maybe UNIX time isn't precise enough for that sort of thing? I actually don't know.


Thanks for this! I've been looking for something just like it!!


I used a similar trick when evaluating mice and monitors for gaming. I only had 60 fps, but used some tricks (a separate LCD display showing a microsecond timer in the frame, some other timing hacks) to discover that my trusty old CRT outperformed even the best flat panel in latency and black level. I still use it for gaming, and am scrounging for another for when it finally goes. Other discoveries: the PS/2 keyboard polling rate can be varied (although the correct way seems broken in Linux), and even with a higher polling rate on USB, the PS/2 interrupts were handled faster. The mouse ended up better over USB due to throughput issues with serial. Take care to find an uncorrected sensor output to avoid microjitter and variable response time. Only a few sensors do this, so finding mice by sensor is how I approached it.


A lot of Quake players still prefer CRT for this reason. I remember when I had a 170Hz CRT with 1000Hz USB rate back when it was more difficult to achieve and it was incredible. What mice did you evaluate and decide on?


60 Hz on a CRT always looked bad too; it was a last resort. I could really see the flicker at 60 Hz on my old Trinitron. 75 Hz was pretty OK, but it still looked smoother at 85 Hz, which is what I usually had it on. Above that I couldn't visually see any more improvement.



