It always baffles me when people who make things on GitHub don't show how they look, especially something as visually prominent as a video game or a similarly visual product. I'm always extremely eager to show how my projects function and look, even if they are command-line programs!
I was just thinking a few days ago that it is surprising Flappy Bird wasn't invented much earlier. In terms of complexity it seems to be somewhere between Pong and Asteroids. Games like Pac-Man and Street Fighter are much more complex. Yet Flappy Bird came out in 2013. Wikipedia puts "Helicopter Game"[0] from 2003 as the first game with Flappy Bird's game mechanic.[1] That still seems a bit late to me. Am I missing something?
I like these challenges, where people rebuild stuff low level and “line efficient” (not saying this is a good idea in general). This makes me curious and I can learn a lot from these examples.
I was a bit disappointed, though, to see that this code uses SDL for sprites and more. That's not 1000 lines anymore imo. Still, the code was an interesting read.
AFAIK scene demos which compete for minimal size are allowed to use system libraries like D3D on Windows. From that perspective, using SDL is ok, since it can be considered a system library at least on Linux (to work around a ton of little compatibility warts between Linux distros).
It's impossible to do graphics in pure C, so what are the alternatives? Whatever you do at whatever level of abstraction/portability (SDL, OpenGL, Metal, Vulkan, Unity, ...), you're leveraging zillions of lines of other people's code.
Arguably the only way you could do it in pure C (with undefined behavior) is in environments that let you write straight to graphics framebuffers without any abstraction layer, and even then you can't do keyboard or mouse input in C without library and/or kernel support.
> It's impossible to do graphics in pure C, so what are the alternatives
Plenty of other comments have already disputed this. It's a reasonable mistake to make, especially if your experience is with more recent technologies and languages.
All the same, it reminds me of this perennial quote from a man who really couldn't just use C for everything:
> On two occasions I have been asked, — "Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?" In one case a member of the Upper, and in the other a member of the Lower, House put this question. I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
- Passages from the Life of a Philosopher (1864), ch. 5 "Difference Engine No. 1"
Although I don't start new projects in C very often, I'm acutely aware how it's not turtles all the way down (as the logophiles believe), but rather C. With some infrequent exceptions the whole tower of abstraction was written in C. SDL is written in C, the compiler? C, the Linux kernel is written in C, the graphics driver was written in C, the GPU firmware was C. It might be unfeasible for you to get by without writing these yourself, but with enough C and somewhere to sit you can move the earth.
(Of course all projects have a smattering of other languages and with great effort you can bootstrap from Forth or handwrite assembly or whatever, but you can do it all in C, and very likely that's what happened.)
> Although I don't start new projects in C very often, I'm acutely aware how it's not turtles all the way down (as the logophiles believe), but rather C. With some infrequent exceptions the whole tower of abstraction was written in C. SDL is written in C, the compiler? C, the Linux kernel is written in C, the graphics driver was written in C, the GPU firmware was C. It might be unfeasible for you to get by without writing these yourself, but with enough C and somewhere to sit you can move the earth.
This is a Linux-only situation. A significant fraction of systems code in Windows and macOS is written in fairly modern C++, including core system libraries and the kernel. A project that emerged out of Windows' C++ effort is the Windows Implementation Library[1]. I'm also certain Windows and macOS graphics drivers are written in C++.
Yes indeed, if the project were targeting a simpler machine without an operating system and windowing system in the way, then doing the whole thing - graphics presentation and all - in a thousand lines of C would be perfectly feasible.
> I'm acutely aware how it's not turtles all the way down (as the logophiles believe)
It uses a sprite sheet png file. I am curious how you would target a simpler machine without a bunch of code to display those. I am having trouble picturing this.
>> It's impossible to do graphics in pure C, so what are the alternatives
> I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question. [...], but you can do it all in C
If you trace the actual context of the conversation from ggp (doubtfuluser - "SDL is not 1000 lines anymore") to gp (shric - "pure C") ... it is clear that "pure C" was just an imprecise shorthand for the more verbose "pure ISO Standard C with no extra external libraries such as SDL". (I.e. https://www.iso-9899.info/wiki/The_Standard ... has <stdio.h> and console printf() built in, but no graphics and audio primitives.)
But people just quickly type things out that seem obvious in a particular discussion e.g. "pure C" ... but can't foresee 4d chess moves ahead in how others will misinterpret phrases like that, and then the pedantic correction guns come out blazing.
<pedantic>But SDL _is_ "pure C".</pedantic> Yes, yes... I know. But one can just read gp's use of "pure C" with charity to see what he was trying to communicate.
Since FILE I/O is part of standard C, I guess you could do a plan9-style interface where you fopen("framebuffer"); and fwrite() means writing a pixel value to the screen.
Since I/O is always implementation-defined anyway, it wouldn't be any less portable than doing printf("some string") and expecting to see a string somewhere.
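For what it's worth, here's a minimal sketch of that idea in nothing but ISO C stdio, on the big assumption that the platform exposes a framebuffer as a plain file -- the "/dev/fb0" path, the 640x480 resolution, and the 4-bytes-per-pixel BGRX layout are all assumptions, not portable facts:

    #include <stdio.h>

    int main(void) {
        /* Assumed device geometry: 640x480, 32 bpp, rows packed
           back to back (i.e. stride == width * 4). */
        const long W = 640, H = 480;
        FILE *fb = fopen("/dev/fb0", "wb");
        if (!fb) { perror("fopen"); return 1; }

        for (long y = 0; y < H; ++y) {
            for (long x = 0; x < W; ++x) {
                unsigned char px[4] = { 0, 0, 255, 0 };  /* BGRX: red */
                fwrite(px, 1, sizeof px, fb);
            }
        }
        fclose(fb);
        return 0;
    }

Only the file name ties it to a particular system; the code itself is strictly standard C.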
Perhaps you're responding to the wrong person? I didn't think or write that GGP didn't know any of this. I was (clearly, I thought) considering the idea that someone might take such a position.
I wouldn't even mention this, except that you've expressed an interest in avoiding misinterpretation.
The idiot there is Babbage -- the politician, better trained in general thinking, clearly observed that thinking-things can correct mistakes against a background of common understanding.
e.g., "Q. Who is the King of France?", "A. France has no king, did you mean who was the last King of France?"
But that's an input too (cf. contingent vs. necessary truth). If the Analytical Engine were controlled by an Orléaniste it would surely declare the balding[0] Count of Paris, Jean Carl Pierre Marie d'Orléans, to be king[1].
Clearly raised to a higher peak by taking the one word of a statement easiest to misread for the sake of outrage, highlighting it, whilst ignoring the substance of the comment.
Babbage was insulting politicians who had, in the instance of his insult, a better grasp on the nature of intelligence than he had. It is a foolish quote to repeat, and reeks of the same smug obliviousness in which it was said.
If Babbage was fit to be so smug, he is fit to be called an idiot for having been so foolish.
"Peak HN" is presumably when you can cite an oblivious smarter-than-thou Engineer at his most inastute, but not call one an idiot.
> If Babbage was fit to be so smug, he is fit to be called an idiot
Please become familiar with the British sense of humor and the context of the writing before taking it at face value. If that is too much work, at least be prepared to give the writer the benefit of the doubt. Babbage was writing for a specific audience and with a specific intent, and I would suggest that you are probably misunderstanding him if you are inferring smugness from an isolated quote.
Yes, that brings back the memory of working through books by Andre LaMothe and implementing little games in DOS with C and a little bit of Assembler. I believe there was a very primitive graphics library included in Borland C, but it was not that useful for this task.
Andre LaMothe showed me the wonder of alternative graphics memory layout like Mode 13h and Mode Z. Though it confused teenage me at the time why you would have four memory segments each dealing with (offset %4) bytes in a line-oriented graphics buffer, it was magical when it worked and was one of the first times I had to let the code just work and move on.
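For anyone who never hit this: the confusion comes from the VGA's planar memory in unchained modes, where a pixel's plane is x % 4 and you select it through the Sequencer's Map Mask register before the write. A small sketch in the Borland-era dialect (outportb, MK_FP, and the far keyword are Turbo C extensions, not standard C):

    #include <dos.h>

    /* Write one pixel in an unchained 320-pixel-wide planar mode. */
    static void putpixel_planar(int x, int y, unsigned char color) {
        unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
        outportb(0x3C4, 0x02);           /* Sequencer index: Map Mask  */
        outportb(0x3C5, 1 << (x & 3));   /* enable only plane x % 4    */
        vga[y * 80 + (x >> 2)] = color;  /* 320 / 4 = 80 bytes per row */
    }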
Double buffering was an understandable neat trick and one that would have taken me a bit longer to discover on my own. He also had a custom controller wired through the printer port and some C+ASM for interfacing with it.
I had three of his books, I think the Black Art book was constantly on my desk in the 90s.
You call a routine stored in the VGA ROM (interrupt 0x10) to set up the mode, then do some port I/O to configure VGA registers, and then access VGA memory directly. No "system libraries" from DOS are involved as such (they are needed for things like filesystem access, allocating memory, dealing with command-line arguments, and returning to the system, though).
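For the curious, the whole dance fits in a few lines -- a sketch for a 16-bit real-mode DOS compiler such as Borland Turbo C (int86, MK_FP, and far are compiler extensions, not ISO C):

    #include <dos.h>
    #include <conio.h>

    static void set_mode(unsigned char mode) {
        union REGS r;
        r.h.ah = 0x00;               /* BIOS video service: set mode */
        r.h.al = mode;
        int86(0x10, &r, &r);         /* routine lives in the VGA ROM */
    }

    int main(void) {
        unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
        int x, y;
        set_mode(0x13);              /* 320x200, 256 colors, linear  */
        for (y = 0; y < 200; ++y)
            for (x = 0; x < 320; ++x)
                vga[y * 320 + x] = (unsigned char)(x ^ y);
        getch();                     /* wait for a key               */
        set_mode(0x03);              /* back to 80x25 text mode      */
        return 0;
    }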
Borland's dev tools came with "BGI" (the Borland Graphics Interface), but that's not necessary and wasn't really used for many games -- it provides abstract high-level drawing routines, like lines, circles, etc., that can be made to work on different graphics devices (CGA, VGA, ...). Most games skipped it in favor of direct graphics card access.
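For contrast, the BGI route looked roughly like this (the "C:\\TC\\BGI" driver path is an assumption -- it's wherever the *.BGI files were installed):

    #include <graphics.h>
    #include <conio.h>

    int main(void) {
        int driver = DETECT, mode;
        initgraph(&driver, &mode, "C:\\TC\\BGI");  /* load a .BGI driver */
        line(0, 0, getmaxx(), getmaxy());          /* device-independent */
        circle(getmaxx() / 2, getmaxy() / 2, 50);
        getch();
        closegraph();
        return 0;
    }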
On Linux, you may mmap /dev/fb0 to access the front buffer like you would have on old computers.
Not all kernel configs and GPU devices allow that. You can test if fb0 is usable with cat /dev/random > /dev/fb0 to see if the screen is filled with garbage (you may also need root privilege).
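The proper version of that test in C is a short mmap, using the fb ioctls to learn the geometry instead of guessing -- a Linux-only sketch, and it still assumes a 32 bpp XRGB pixel layout:

    #include <fcntl.h>
    #include <linux/fb.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void) {
        int fd = open("/dev/fb0", O_RDWR);
        if (fd < 0) { perror("open /dev/fb0"); return 1; }

        struct fb_var_screeninfo vi;
        struct fb_fix_screeninfo fi;
        if (ioctl(fd, FBIOGET_VSCREENINFO, &vi) < 0 ||
            ioctl(fd, FBIOGET_FSCREENINFO, &fi) < 0) {
            perror("ioctl"); return 1;
        }

        size_t len = (size_t)fi.line_length * vi.yres;
        uint8_t *fb = mmap(NULL, len, PROT_READ | PROT_WRITE,
                           MAP_SHARED, fd, 0);
        if (fb == MAP_FAILED) { perror("mmap"); return 1; }

        /* Horizontal red gradient, assuming 32 bpp XRGB pixels. */
        for (uint32_t y = 0; y < vi.yres; ++y) {
            uint32_t *row = (uint32_t *)(fb + y * fi.line_length);
            for (uint32_t x = 0; x < vi.xres; ++x)
                row[x] = (x * 255u / vi.xres) << 16;
        }

        munmap(fb, len);
        close(fd);
        return 0;
    }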
You could also implement a Wayland or X11 client by yourself. Those libraries are also implemented in C after all. No need for external libraries to write to some sockets.
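To illustrate that it really is just bytes on a socket, here's a hedged sketch that performs the X11 connection setup by hand for display :0 (the socket path is an assumption, and it sends no auth cookie, so a server requiring MIT-MAGIC-COOKIE-1 will answer "Failed" -- the handshake itself still works):

    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <sys/un.h>
    #include <unistd.h>

    int main(void) {
        int fd = socket(AF_UNIX, SOCK_STREAM, 0);
        struct sockaddr_un addr = { .sun_family = AF_UNIX };
        strcpy(addr.sun_path, "/tmp/.X11-unix/X0");   /* display :0 */
        if (fd < 0 || connect(fd, (struct sockaddr *)&addr, sizeof addr) < 0) {
            perror("connect"); return 1;
        }

        /* Connection setup request: little-endian ('l'), protocol 11.0,
           zero-length authorization name and data. */
        unsigned char setup[12] = { 'l', 0, 11, 0, 0, 0, 0, 0, 0, 0, 0, 0 };
        write(fd, setup, sizeof setup);

        unsigned char reply[8];   /* status byte + version + length */
        if (read(fd, reply, sizeof reply) == sizeof reply)
            printf("server says: %s\n",
                   reply[0] == 1 ? "Success" : reply[0] == 0 ? "Failed"
                                                             : "Authenticate");
        close(fd);
        return 0;
    }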
This is not actually that difficult if you just want the basics. I have a project that implements just the x11 key & mouse events, and the shared memory extension, in 400 lines of Rust. It was worth it for me since it eliminated the dependency on libx11 and libc -- something like 800,000 lines of code across those two libraries. (Determined by a basic cloc on each library's source. Actually compiled code for a specific architecture would probably be less than that, but still orders of magnitude more than 400.)
I probably wouldn't bother with implementing X11 from scratch for a game, as even simply fullscreening it would likely require some diverging code paths to actually work everywhere, but Wayland should be a breeze. Having worked on SDL's Wayland backend, I'd say that most of the difficulty came from impedance mismatches with preexisting SDL APIs. If you design your thing from scratch, all you really need to deal with are the protocol bits - which you could mostly hardcode, automatically getting rid of most of libwayland's complexity that deals with protocol extensions and their XML descriptions.
Too many people have replied to my post saying I'm wrong in some way, so rather than replying the same thing to almost every single comment, I'm going to reply to myself.
My original response was to the person claiming that it was basically cheating to use SDL. You need abstraction, so there is nothing wrong with having SDL in the way.
Someone said that it's C all the way down. It's not. The Linux kernel is not 100% C and it cannot possibly be.
There are no C facilities to:
- Read a keystroke and/or mouse movement (required for Flappy Bird) directly. C only supports buffered reads from streams like stdin; see the sketch after this list.
- Generate graphics (more on this below).
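As an illustration of the first point, here's roughly what "read one keystroke" takes on a POSIX system -- tcgetattr/tcsetattr and the raw read() are exactly the kind of platform facilities the C standard itself does not provide:

    #include <stdio.h>
    #include <termios.h>
    #include <unistd.h>

    int main(void) {
        struct termios old, raw;
        tcgetattr(STDIN_FILENO, &old);
        raw = old;
        raw.c_lflag &= ~(ICANON | ECHO);   /* no line buffering, no echo */
        tcsetattr(STDIN_FILENO, TCSANOW, &raw);

        char c;
        if (read(STDIN_FILENO, &c, 1) == 1)
            printf("got key: %d\n", c);

        tcsetattr(STDIN_FILENO, TCSANOW, &old);  /* restore the terminal */
        return 0;
    }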
As I mentioned in the original post, some operating systems let you write straight to graphics memory.
While you can technically do this in C with a pointer to an address, you're still going to have to issue e.g. a BIOS call to set the mode to something other than the default text mode. But we can let that slide.
There is absolutely no way to do "C all the way down" when reading and writing I/O ports, executing interrupts, etc., which are required for I/O beyond just the buffered stdout and stdin streams that C provides.
Linux of course provides things in /dev, but they're also not C all the way down.
If there were a hypothetical computer that supported all graphics and other I/O via reading and writing to/from streams, then sure, but I don't think that computer exists.
I know very well what C can and cannot do, I've been using it for 40 years.
You can't directly access the GPU hardware on modern systems (at least not without a massive re-engineering effort like the Asahi Linux GPU driver for Apple hardware), so any 'minimal graphics API' still goes through a massive amount of wrapper code in the operating system and GPU driver.
Also, on older home computer systems the hardware was essentially the rendering- and audio-engine. Those systems were designed from the ground up to be convenient to program by directly writing hardware registers from assembly code (e.g. you didn't have to draw a sprite yourself, you just wrote a hardware register with a pointer to the sprite's image data in memory, and then wrote another hardware register to update the current sprite position and the video hardware did the rest). On modern hardware, providing such an interface is the job of drivers.
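A hedged sketch of what that looked like, in cc65-style C for the Commodore 64 (it assumes the sprite bitmap has already been copied to address 0x0340, i.e. block 13, and that the screen sits at the default 0x0400):

    #include <stdint.h>

    #define POKE(addr, val) (*(volatile uint8_t *)(addr) = (uint8_t)(val))

    int main(void) {
        POKE(0x07F8, 13);    /* sprite 0 pointer: data at 13 * 64 = 0x0340 */
        POKE(0xD015, 0x01);  /* VIC-II: enable sprite 0                    */
        POKE(0xD000, 160);   /* sprite 0 X position                        */
        POKE(0xD001, 100);   /* sprite 0 Y position                        */
        for (;;) { }         /* no render loop: the chip keeps drawing it  */
        return 0;
    }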
Flappy Bird is a simple game; 1000 lines sounds like a normal amount (assuming there is a graphics scaffold, which in this case appears to be SDL), and it indicates that the code is not compressed, code-golf style.
Indeed. The linked project is quite verbose in fact. Just checked my old ncurses clone in Python and it had 280 lines. JS clone based on Pixi had less than 200. It wouldn't be much more if translated directly to C.