Is your computer too slow to keep up with your typing? (tnhh.net)
14 points by jimmies on July 4, 2021 | 7 comments



Macs seem to have some weird input latency that I don't notice on Linux or Windows. Letters sometimes don't appear until a full two seconds after I type them on my MacBook Air if I have an external monitor connected.


I noticed a snappier response to my keystrokes (particularly in Emacs) when I switched recently from a Mac mini to a NUC running Linux (keeping the keyboard the same).


This type of thing leads to some of the most annoying UX regressions with "as you type" autocompletes I've ever seen. It's one of the reasons I don't understand as-you-type suggestions. Either the person knows what they want and you should let them finish typing, or they don't, and there are way better ways to let them signal that they want a suggestion.

One of my earliest QA moments was seeing how some of the women in the office could type so fast that our autocomplete widget would spaz out.


As someone with some experience in the area, can you explain why these as-you-type systems don’t simply wait until the user pauses to do their thing? Even better, allow users to set that threshold themselves?


Embarrassingly, I know that at the time it was simply a case of us using GWT (Google Web Toolkit/GXT) to transpile the frontend, and it was one of those things where we didn't have the time to go plumbing the depths of the toolkit to rewrite it.

This was before mobile was even a big thing, which kind of makes it even more heinous to me that so many frameworks repeat the same bloody mistake over and over again.

The reason it gets so horrible is easy to see when you think about it. The widget is probably implemented with event listeners for some combo of keyDown and keyUp, or worse, just keyUp, and then it waits some amount of time for the next keystroke (an interval that varies by person, and is generally not user-configurable as you pointed out, so you are guaranteed to commit the UX sin of forcing the user to learn the system instead of letting the user tell the system how to work with them). When that timer expires, you kick off your (really_expensive_network_call), which not only may take a while to get back the data you need to finish up, but then needs all the gymnastics to get it rendered.

But you already started typing again damnit.

What now?

Do I implement event handlers everywhere to abort what I'm doing?

Make my UI smart enough to change the query of my network call in flight?

Maybe build an index client-side that persists between sessions, so at first it's janky but gets faster over time? How often do I clean it out? Can I even store that safely client-side?
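For what it's worth, the "abort what I'm doing" option is a lot less painful with today's browser APIs than it was back in the GWT days. A minimal sketch in TypeScript against the standard fetch/AbortController APIs (the /api/suggest endpoint and the 300 ms delay are made-up placeholders):

    const SUGGEST_URL = "/api/suggest"; // hypothetical endpoint
    const DEBOUNCE_MS = 300;            // the knob users should really get to set

    let timer: number | undefined;
    let inFlight: AbortController | undefined;

    function onInput(query: string, render: (items: string[]) => void): void {
      window.clearTimeout(timer); // restart the quiet-period timer on every keystroke
      inFlight?.abort();          // the user typed again, so any in-flight request is stale

      timer = window.setTimeout(async () => {
        inFlight = new AbortController();
        try {
          const res = await fetch(`${SUGGEST_URL}?q=${encodeURIComponent(query)}`, {
            signal: inFlight.signal,
          });
          render(await res.json());
        } catch (err) {
          if ((err as Error).name !== "AbortError") throw err; // aborts just mean "kept typing"
        }
      }, DEBOUNCE_MS);
    }

Still a band-aid, though: it only cleans up after the guess, it doesn't make the guess any smarter.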

Nowadays, it's even less likely anyone would want to rewrite these basic UI conventions, because there are so many places you'd have to implement them, and with the advent of mobile, people tend to hunt and peck more on touchscreens, which makes touch typists an even more niche audience than they were before.

I always felt the best way to do it was to go old-school. Don't "help" the user by second-guessing what they are doing. That's just forcing one more flavor of UI madness on them that they have to randomly hunt and peck around to learn. I prefer clear-cut signaling for assistive things. You side-step all that complexity of what to do if people start typing again, because odds are that if they signalled for you to help them out, they'll naturally wait for you to do so, or you give them a "nevermind" sequence to abort and get back to typing.

My "user brain" treats my interactions with computers like we're having a conversation, so it gets flustered when asking for one thing ends up kicking off a pile of other stuff. As programmers, it's easy to get into the habit of doing things behind the scenes for the sake of "ooooh, magic", right up until it breaks and pisses off users, because their way of interfacing with the computer clashes with my way of hiding complexity from them.
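To make "clear-cut signaling" concrete, here's roughly the shape I mean, sketched against plain DOM key events (the Ctrl+Space binding, the #search element, and the /api/suggest endpoint are all placeholders, not anything standard):

    const input = document.querySelector<HTMLInputElement>("#search")!;

    // stand-ins for whatever suggestion widget you're actually driving
    const showSuggestions = (items: string[]) => console.log("suggest:", items);
    const hideSuggestions = () => console.log("dismissed");

    async function fetchSuggestions(q: string): Promise<string[]> {
      const res = await fetch(`/api/suggest?q=${encodeURIComponent(q)}`);
      return res.json();
    }

    input.addEventListener("keydown", async (e: KeyboardEvent) => {
      if (e.ctrlKey && e.code === "Space") {
        e.preventDefault();
        showSuggestions(await fetchSuggestions(input.value)); // the user asked, so they'll wait
      } else if (e.code === "Escape") {
        hideSuggestions(); // the "nevermind" sequence: get out of the way, back to typing
      }
    });

The point is that the expensive call only ever happens because the user asked for it, so there's nothing to race against.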

The other interesting piece a lot of people don't understand is that how processors handle input changed drastically between PS/2 and USB.

Back in the PS/2 days, input preempted processing. You'd generate interrupts that would halt what the processor was doing to handle whatever you just did. Now, it buffers. USB is polling-based, so depending on how much you're doing and how often your processor samples those buffers, there is always going to be some lag in handling things. It changes the calculus a bit, because generally speaking there is so much your processor can get done in between those polls that most people don't even think about the fact that it all adds up (rough numbers on the polling piece below).

At least, until you're on an anemic i5 at 1.4 GHz, with high memory pressure and a bunch of network calls in flight, spontaneously generated by apps written in Electron or something similarly multi-platform. Then you sit there staring at the system you just studiously tapped a long string of text into: a third of it was lost, your cursor jumped up a line or two and dumped the next third there, and you're still tapping your finger waiting on... there's the last third (thanks OS X, Safari, a WebRTC client, a VM, a couple of dev tools, and Slack).
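To put a rough number on just the USB polling piece of that (the 8 ms interval is a commonly cited full-speed HID default; actual rates vary by device):

    // Added delay from USB polling alone, before the OS, compositor, or app ever see the event.
    function pollingLatencyMs(pollRateHz: number) {
      const intervalMs = 1000 / pollRateHz;
      return { worstCase: intervalMs, average: intervalMs / 2 };
    }

    console.log(pollingLatencyMs(125));  // typical keyboard: up to ~8 ms, ~4 ms on average
    console.log(pollingLatencyMs(1000)); // 1 kHz "gaming" keyboard: up to ~1 ms

A few milliseconds from polling is nothing on its own; it's just one more term in a sum that also includes the compositor, the event loop of whatever has focus, and everything else in the stack.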

It's a very subjective thing, and to be honest, I almost wonder if the best thing for UI/UX work is to use the weakest machine possible. It's generally only there that you actually gain an appreciation for the non-trivial cost of getting character data onto the screen, and just how bad a taste it can leave in a user's mouth when they could do better if your program would just let them.

So yeah... Don't know if that rant answered your question...

In summary, give your users tools to tell the computer how to fit in the stuff you as a programmer built for them. This lets them pick and choose the functionality that lets them get stuff done, it makes the error handling easier, and when things break, your user actually has a chance of unsticking themselves by walking back functionality they turned on.
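Even something as small as exposing that pause threshold (and an off switch) as a real setting goes a long way. A tiny sketch, with all the names made up:

    interface AssistSettings {
      autoSuggest: boolean;   // false means "only when I explicitly ask"
      suggestDelayMs: number; // how long a pause counts as "done typing"
    }

    const DEFAULTS: AssistSettings = { autoSuggest: true, suggestDelayMs: 300 };

    function loadSettings(): AssistSettings {
      const saved = localStorage.getItem("assistSettings"); // hypothetical storage key
      return saved ? { ...DEFAULTS, ...JSON.parse(saved) } : DEFAULTS;
    }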

Give the user a way to ask you for help. Don't infer it just because they have processor cycles to spare and you think you're UX hot shit.

Beware of the illusion that everything in computing is free. It isn't. Moore's Law just made it seem that way.


I remember back in the day (like the '90s) there was significant, noticeable latency on Macs when typing. Today, if I have any problem with my hardware keeping up with my typing, it's probably radio-related. My Logitech MX Keys will stall, buffer, and then barf out a crap ton of letters, sometimes more than just my backlog. Not sure what that's all about, but I understand others have had similar problems with this keyboard. It's not frequent enough to make me give up an otherwise great keyboard, and it's worse over Bluetooth than over Logitech's wireless protocol.


I didn't read the whole thing, but of course this can happen and for a multitude of reasons.



