Rendering like it's 1996 – Baby's first pixel (marioslab.io)
84 points by homarp on Jan 16, 2023 | 21 comments



There is a bit of irony in how many steps it took to render a few pixels on the screen. I was a teen in 1990 and remember this being one or two lines in Turbo Pascal for DOS.


Your modern OS and graphics hardware are way more complex. In DOS you wrote something that took exclusive access to all the PC hardware and didn't have to multitask or share it with any other apps or code (more or less). You could write directly to hardware memory with no intermediate layers arbitrating access like today. The hardware itself was extremely simple, just a buffer of bytes that represented pixel colors, unlike modern GPUs, which are entirely distinct computers themselves that you have to pass messages to and cooperate with just to render basic pixels.

Check out a modern library like raylib; it makes blasting out pixels on modern hardware easy: https://www.raylib.com/
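For the curious, here's roughly what that looks like (a minimal sketch, assuming raylib 4.x is installed and linked with -lraylib; the window size and colors are arbitrary):

  #include "raylib.h"

  int main(void)
  {
      /* open a window and cap the frame rate */
      InitWindow(320, 200, "baby's first pixel");
      SetTargetFPS(60);

      while (!WindowShouldClose())        /* loop until ESC or the close button */
      {
          BeginDrawing();
          ClearBackground(BLACK);
          DrawPixel(160, 100, RAYWHITE);  /* plot a single pixel in the middle */
          EndDrawing();
      }

      CloseWindow();
      return 0;
  }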


I'm not an expert, but can't modern GPUs just take a "texture" in shared memory that is e.g. 640x480 and render it to some window? Having your code write directly to that memory wouldn't interfere with anything else.

Would obviously require some setup, but that functionality could come standard with the OS instead of requiring countless libraries.
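Pretty much; with SDL2, for example, you allocate your own pixel buffer and hand it to a streaming texture every frame. A rough sketch (assumes SDL2 is installed and linked with -lSDL2; sizes and pixel format are arbitrary):

  #include <SDL2/SDL.h>
  #include <stdint.h>

  static uint32_t framebuffer[640 * 480];   /* your own "mode 13h"-style buffer */

  int main(void)
  {
      SDL_Init(SDL_INIT_VIDEO);
      SDL_Window *win = SDL_CreateWindow("pixels", SDL_WINDOWPOS_CENTERED,
                                         SDL_WINDOWPOS_CENTERED, 640, 480, 0);
      SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_PRESENTVSYNC);
      SDL_Texture *tex = SDL_CreateTexture(ren, SDL_PIXELFORMAT_ARGB8888,
                                           SDL_TEXTUREACCESS_STREAMING, 640, 480);

      int running = 1;
      while (running) {
          SDL_Event ev;
          while (SDL_PollEvent(&ev))
              if (ev.type == SDL_QUIT) running = 0;

          framebuffer[100 * 640 + 320] = 0xFFFFFFFF;   /* white pixel at (320,100) */

          SDL_UpdateTexture(tex, NULL, framebuffer, 640 * sizeof(uint32_t));
          SDL_RenderCopy(ren, tex, NULL, NULL);
          SDL_RenderPresent(ren);
      }
      SDL_Quit();
      return 0;
  }

The texture update and present calls are where the copy to the GPU and the cooperation with the compositor happen behind the scenes.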

Also half the article is about setting up a build system.


The standard “hello Wayland” project[1] is basically that, using an interface called wl_shm. You still need to do some IPC in order to arrange for the environment to do that for you, obviously, so a library called libwayland-client handles the necessary Unix socket handshake, plus some more of that to tell the system you’re done with the buffer and want to submit it. It would be possible to isolate the first part into a reusable executable that performs it and then execs you, leaving a well-known mmapable fd for the buffer, but the second part you’d still need to do yourself. (Wayland isn’t willing to just use a pixel buffer you’re still writing to—indeed I don’t think discrete GPUs can do that, as opposed to requiring an explicit transfer to video memory.)

[1] https://github.com/emersion/hello-wayland


That's how it works. It's called a swapchain, and it's usually more than one texture, to avoid artifacts.


Setting up a correct swapchain, with correct frame timing and no microstutter, across Windows, Linux and macOS is a very non-trivial task though. This was no issue on 8-bit home computers, where it's basically a handful of lines of assembly code to install a vsync interrupt service routine.

(modern GPUs gained performance at the cost of latency, but a lot of the complications are also just pointless over-engineering on the software side)


First set mode 13h (320x200, 256 colors):

  asm
    mov ax,0013h
    int 10h
  end;
...then plot a pixel with palette index c at coordinate (x,y) -- mem[] is a built-in array of bytes representing memory, and segment 0xA000 is where the VGA framebuffer starts:

  mem[$a000:x+(y*320)] := c;

So yes, it's a one-liner, assuming you've already set the correct video mode!
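The DOS-era C equivalent is barely longer (a sketch assuming a 16-bit compiler like Turbo C, since dos.h's MK_FP and int86 don't exist on modern toolchains):

  #include <dos.h>
  #include <conio.h>

  int main(void)
  {
      unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
      union REGS r;

      r.x.ax = 0x0013;              /* int 10h: set mode 13h (320x200, 256 colors) */
      int86(0x10, &r, &r);

      vga[100 * 320 + 160] = 15;    /* plot palette index 15 at (160,100) */

      getch();                      /* wait for a key */
      r.x.ax = 0x0003;              /* back to text mode */
      int86(0x10, &r, &r);
      return 0;
  }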


I remember printing out something called (I think) Ralf Brown's Interrupt List on the university line printer, because I had just learned how to embed asm into Turbo C and wanted graphics faster than BGI could do[1].

[1] The BGI library was insanely slow, because, IIRC, it used BIOS interrupts to draw. Could be wrong, I was very young then.



Oh my, the nostalgia...

I remember pecking out Wu anti-aliased line drawing routines in asm in DOS Debug from out of the Black Book on a PCjr. Those were the days. (Just kidding. These are the days.)

https://en.wikipedia.org/wiki/Xiaolin_Wu%27s_line_algorithm

https://en.wikipedia.org/wiki/Dos_debug

https://www.jagregory.com/abrash-black-book/

https://en.wikipedia.org/wiki/IBM_PCjr


Yeah, I remember messing around with Mode X and SVGA video modes, then writing code to plot lines, circles, Wu pixels, anti-aliased lines and all that stuff.

Years later at different jobs I was able to use the accumulation method from Bresenham's line drawing algorithm to do some neat integer-based graph drawing and scaling.
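For anyone who hasn't seen the trick: it's the same integer error accumulator Bresenham uses, just applied to scaling instead of line slope. A hypothetical sketch (names made up) that stretches a row of samples to a new width with no floating point:

  #include <stdio.h>

  /* Nearest-neighbour rescale of src_len samples into dst_len slots.
     Add the source length to the accumulator each output step; whenever it
     overflows the destination length, advance the source index. */
  static void scale_row(const int *src, int src_len, int *dst, int dst_len)
  {
      int si = 0, err = 0;
      for (int di = 0; di < dst_len; di++) {
          dst[di] = src[si];
          err += src_len;
          while (err >= dst_len && si < src_len - 1) {
              err -= dst_len;
              si++;
          }
      }
  }

  int main(void)
  {
      int src[] = {10, 20, 30, 40};
      int dst[10];
      scale_row(src, 4, dst, 10);
      for (int i = 0; i < 10; i++)
          printf("%d ", dst[i]);   /* prints: 10 10 10 20 20 30 30 30 40 40 */
      printf("\n");
      return 0;
  }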


“How to just put pixels on the screen?” is a common enough question on r/GraphicsProgramming that I made a little gist that uses SDL:

https://gist.github.com/CoryBloyd/6725bb78323bb1157ff8d4175d...


Heh, coming from 8-bit BASIC I remember how complicated it seemed to set some pixels in Turbo Pascal for DOS :)


If you don't want a build system and are willing to use a higher-level language, it's still pretty easy. This automatically builds the same demo for the web, something you can do without.


CMake is already obtuse; I can't imagine how the author copes with a "CMakePresets.json" on top.

That said, most of this seems to be Emscripten-related kludge... it seems kinda backward to use a portable language like C and then compile to target just one platform (the web).


Apart from the CMake setup itself, CMake presets also simplify integration with IDEs (like, in this case, VSCode) - AFAIK that was actually the main motivation behind the feature.
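For reference, a preset file doesn't have to be huge. A hypothetical minimal CMakePresets.json for an Emscripten build (the preset name is made up, and the toolchain path assumes the usual emsdk layout):

  {
    "version": 3,
    "configurePresets": [
      {
        "name": "emscripten-debug",
        "generator": "Ninja",
        "binaryDir": "${sourceDir}/build/${presetName}",
        "toolchainFile": "$env{EMSDK}/upstream/emscripten/cmake/Modules/Platform/Emscripten.cmake",
        "cacheVariables": { "CMAKE_BUILD_TYPE": "Debug" }
      }
    ]
  }

Then `cmake --preset emscripten-debug` configures the build, and an IDE that understands presets just lists it in a dropdown.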


There's nothing that makes you feel old more quickly than seeing someone trying to rebuild the sort of technology you implemented in the era that they were actually modern.


War story: ca. 1980, before PC meant IBM, there were many teams rolling their own personal computers. On one team I'm aware of, the hardware people attempted to play a practical joke on the software people by swapping the leads on the CRT yokes every once in a while.

It didn't pay off though: the software people quickly added a flag to their driver, and just absent-mindedly flipped it whenever they came in and found their screen displaying upside down, then carried on working as if nothing untoward had happened.


Is there no way to just pop open a web page and draw a pixel on it? It seems like JavaScript and a canvas would be much better suited to this.

Nowadays, I would not recommend that a beginner attempt to build their own pixmap rendering program, and I definitely wouldn't recommend C for it. The infinite black mire of "Open a Window" would absolutely kill whatever enthusiasm a beginner has.

There are lots of 2D game engines. Pick any one that works in the language you want and proceed henceforth.


Alternatively, watch the first couple of videos of Handmade Hero.




