Hacker News

GPU-powered terminal emulator? This performs almost 100x worse than macOS Terminal.app. Cat-ing a long file takes 100x longer to draw on the screen! Isn’t it supposed to be faster if it’s “GPU-powered”?



"Some people have asked why kitty does not perform better than terminal XXX in the test of sinking large amounts of data, such as catting a large text file. The answer is because this is not a goal for kitty. kitty deliberately throttles input parsing and output rendering to minimize resource usage while still being able to sink output faster than any real world program can produce it. Reducing CPU usage, and hence battery drain while achieving instant response times and smooth scrolling to a human eye is a far more important goal."

From the performance page: https://sw.kovidgoyal.net/kitty/performance.html
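The throttling strategy the docs describe can be sketched roughly as follows. This is only an illustration of the idea, not kitty's actual internals: all the function names (`parse`, `render`, `throttled_sink`) are hypothetical stand-ins. Input is parsed into the in-memory screen model as fast as it arrives, but the expensive repaint happens at most once per frame:

```python
import time

def throttled_sink(chunks, parse, render, fps=60, clock=time.monotonic):
    """Feed chunks of terminal output through `parse` as fast as they
    arrive, but call `render` at most `fps` times per second.
    Illustrative only; not kitty's real API."""
    frame_interval = 1.0 / fps
    last_draw = -frame_interval   # force an initial repaint
    draws = 0
    for chunk in chunks:
        parse(chunk)              # cheap: updates the in-memory grid only
        now = clock()
        if now - last_draw >= frame_interval:
            render()              # expensive: one repaint per frame
            last_draw = now
            draws += 1
    render()                      # final repaint so the screen is current
    return draws + 1
```

The point is that catting a huge file costs many `parse` calls but only a handful of `render` calls, which is why raw sink throughput says little about perceived responsiveness.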


That... doesn't make any sense.


I think it does if you consider "speed" as meaning response latency and perceived speed, not data throughput. From what I've read here so far, it feels fast while not killing your battery with bulk cat-ing of text. That's my take on it, anyway. Just now going to download and try it out...

EDIT/Whinging: Welp, scratch that, kitty requires a macOS version one higher than what Apple will allow me to install. And while it is an older MBP from 2010, at least it's fast and reliable. AND it has the multitude of ports that I like. And the MagSafe.

I'm sure I'm venting into an echo chamber, but here goes. Why won't Apple simply provide me (an option to buy) a modern, solid, mid-level 15" rMBP for under $2000? I bought mine for $1700, and it came with a $200 iPod as a gift (sold on eBay).

Give me that: an rMBP with all the ports, plus the new USB-C. They could leave out some of the stuff that's pricey.

Yeah, give me a new version of what I have, one that will last at least another 8 years (fully supported by macOS releases), and price it under $2000. That I would buy. When I replace this one, I'm just going to have to buy something running Linux or Windows (probably both, realistically), since I've been priced out of the product I would normally have purchased and recommended.


It's simple: take, for example, scrolling a file in less. Most modern terminals are fast enough that you can do it at the key-repeat rate of your computer. The difference with kitty is that the scrolling feels smoother and uses less CPU for the same task, thereby saving battery and pleasing your eyes.


I thought macOS was already GPU accelerated with Quartz? And wouldn't Windows be doing something similar by now? The CPU is certainly not writing pixels directly out to VESA buffers in 2018, right?


Most Cocoa apps use the CPU backend of Core Graphics, which doesn't use the GPU for vector graphics rendering. (CG is usually what "Quartz" refers to, though the brand is so overloaded at this point that it's hard to make any general statements about it.) Cocoa apps do frequently use Core Animation for compositing surfaces together on GPU, though.

Most of what terminals have to do is blitting of prerendered text bitmaps, which is relatively slow on CPU and does benefit from the much faster memory bandwidth of the GPU. Core Graphics generally does not use the GPU for text blitting in most Mac apps, so having a custom renderer can help here. Font rasterization on the Mac does not use the GPU either, but the glyph cache hit rate for a terminal emulator is so high that it ends up pretty much irrelevant.
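The high glyph-cache hit rate mentioned above is the key observation: a terminal draws a tiny alphabet of characters over and over, so the CPU rasterizer is hit only once per distinct glyph. A minimal sketch of such a cache (where `rasterize` is a hypothetical stand-in for the platform font rasterizer, not a real API):

```python
class GlyphCache:
    """Illustrative glyph cache: rasterize each (font, codepoint) pair
    once, then reuse the cached bitmap on every subsequent cell draw."""

    def __init__(self, rasterize):
        self._rasterize = rasterize   # slow CPU path, called on misses only
        self._cache = {}
        self.misses = 0

    def get(self, font, codepoint):
        key = (font, codepoint)
        bitmap = self._cache.get(key)
        if bitmap is None:
            self.misses += 1          # CPU rasterization happens here
            bitmap = self._cache[key] = self._rasterize(font, codepoint)
        return bitmap                 # hit: just a dict lookup
```

Because a screenful of text reuses the same few dozen glyphs, almost every lookup is a hit, which is why CPU-side rasterization cost ends up pretty much irrelevant while blitting bandwidth dominates.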

On Windows, Microsoft ships multiple rendering stacks for legacy compatibility. Most terminals are old Win32 programs that use classic GDI, which as far as I know is partially accelerated but mostly CPU (and implemented in the kernel!). Direct2D, the newer API, does use the GPU for blitting text. Like macOS, Windows still does all font rasterization on CPU. In GDI, font rasterization is done on CPU in the kernel (!) (except on Windows 10, where the kernel calls out to a userspace fontdrvhost.exe). In Direct2D, font rasterization is done on CPU in userspace.


As a software developer, I get upset when people use old versions of my software and talk about how it isn’t modern.

This also makes me wary of statements that criticize how e.g. Windows does it wrong, “except in the current three-year-old version”.


What is the benefit of using a GPU-powered terminal on my integrated 6000, rather than my iTerm2?


iTerm2 also supports GPU acceleration, FYI.


Only in beta, or did they release it?




