> They probably did not pick that number at random. 224 is a number evenly divisible by 16 (224/16 = 14) which means it plays nicely with the graphic rendering pipeline tilemaps.
This was something that took a bit to figure out, but made so much sense to me after I had been playing around with trying to learn game programming when I was a kid.
CGA/EGA/VGA all had popular 320x200 modes.
The NES was 256x224, as was the SNES (although it did have higher resolution modes), and that was really a TV limitation.
Meanwhile, Pac-man was 288x224 in the arcade.
So none of the Pacman clones on the PC would ever look 'right', and even the Pacman games on the NES that were made by Namco didn't look right either. There were always hacks like giant characters because the tiles for the map were smaller, or you'd get a scrolling world (Gameboy, Tengen versions), other kinds of distortion, non-original maps...it was all just weird and frustrating when you're trying to play the 'arcade' game at home.
But after learning the details of the machines, how sprites worked (and then coming to the conclusion that they just didn't have any other choice), was such a huge 'a-ha!' moment for me. Let's not even get into the fact that pixels aren't square on those resolutions on the PC.
And then it became almost an instant reaction when I'd see a Pacman port or clone, and try to figure out what size the world was, what size the tiles were, what size the sprites were....
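For anyone who wants to see the tile math behind that frustration, here's a rough back-of-the-envelope sketch in Python. It assumes the usual 8x8 tiles and treats the arcade picture as 224 wide by 288 tall once the monitor is rotated into the cabinet's vertical orientation:

```python
# Back-of-the-envelope tile math, assuming 8x8 pixel tiles throughout.
TILE = 8

def tile_grid(width, height):
    return width // TILE, height // TILE

arcade  = tile_grid(224, 288)   # rotated Pac-Man monitor: 28 x 36 tiles
nes     = tile_grid(256, 224)   # NES picture inside typical overscan: 32 x 28
pc_mode = tile_grid(320, 200)   # CGA/EGA/VGA 320x200: 40 x 25

print("arcade maze screen:", arcade)
print("NES safe area:     ", nes)
print("PC 320x200:        ", pc_mode)
# The arcade screen is 36 tile rows tall; the home systems only show 28 (or 25)
# rows -- hence the shrunken tiles, scrolling playfields and redrawn mazes.
```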
The NES was actually 256x240, using all 240 lines of an NTSC field, but many TVs would cut off part of the picture on the top and bottom -- thus, 256x224 was the "usable" space that was safe on most TVs. But this overscan was inconsistent from one TV to another, and modern TVs and emulators will usually show you all 240 lines.
The SNES's vertical resolution was configurable to either 224 or 240 lines, as the article mentions. Most games stuck with 224, as the longer vertical blanking interval gives you more time to transfer graphics to the PPU.
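To put rough numbers on that "more time" claim: an NTSC SNES frame is about 262 scanlines no matter how many are displayed, so every active line you give up becomes vertical blanking. A quick sketch of the budget (line counts only; it ignores the few lines of overscan/pre-render bookkeeping):

```python
# Rough vertical-blanking budget per NTSC frame (~262 scanlines total).
TOTAL_LINES = 262

budgets = {}
for active in (240, 224):
    vblank = TOTAL_LINES - active
    budgets[active] = vblank
    print(f"{active} active lines -> {vblank} blanking lines "
          f"({vblank / TOTAL_LINES:.0%} of the frame)")

print(f"224-line mode gives ~{budgets[224] / budgets[240] - 1:.0%} more "
      f"vblank time for DMA into the PPU")
```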
> 256x224 was the "usable" space that was safe on most TVs
Adding further complication, although most arcade cabinet games also used 15 kHz CRTs similar to de-cased televisions, since all the cabinets were being assembled by the manufacturer using CRTs they specified, designers could take some liberties with varying the resolution, frame rate, scan lines and scanning frequency of the video signal generated by their game's hardware circuit board. Being analog devices, CRTs of this era were generally tolerant of such variations within specified ranges since they had to sync up with inputs from disparate over-the-air television channels, cable boxes, VCRs or even live cameras. This allowed arcade hardware designers to optimize their circuits either for slightly better resolution and frame rates or alternatively reduce them somewhat in cases where their hardware wasn't quite able to generate enough pixels in real-time. For example, the Mortal Kombat 1, 2 and 3 cabinets displayed video at 54 Hz (instead of the NTSC-standard 59.94 Hz), enabling higher horizontal resolution. They could also optionally choose to use the entire overscan safe area for active game pixels since they knew the CRT's width and height adjustments could be dialed on the cabinet manufacturing line to ensure the entire active picture area was visible inside the bezels - whereas few consumer TVs exposed all of these adjustments externally to users.
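The Mortal Kombat trade-off falls straight out of the arithmetic: with a fixed pixel clock and line count, every hertz shaved off the refresh rate frees up clocks that can be spent widening each scanline. A toy illustration in Python (the 54 Hz and 59.94 Hz figures are from the paragraph above; the dot clock and line count are made-up round numbers for illustration, not Midway's actual values):

```python
# How lowering the refresh rate buys horizontal resolution at a fixed dot clock.
DOT_CLOCK = 8_000_000        # pixels per second -- hypothetical, for illustration
LINES_PER_FRAME = 262        # typical 15 kHz line count -- also illustrative

def clocks_per_line(refresh_hz):
    return DOT_CLOCK / refresh_hz / LINES_PER_FRAME

for hz in (59.94, 54.0):
    print(f"{hz:5.2f} Hz -> ~{clocks_per_line(hz):.0f} pixel clocks per scanline")

print(f"gain: ~{59.94 / 54.0 - 1:.0%} more clocks per line at 54 Hz")
```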
All this subtle variation in classic arcade hardware makes clock-for-clock, line-for-line accurate emulation especially challenging. Fortunately, the emulation community has solved this thorny set of problems with a special version of MAME called GroovyMAME which is designed specifically to generate precisely accurate emulated output signals so these emulated classic arcade games can be perfectly displayed on analog CRTs. This requires using one of the many PC graphics cards with native analog RGB output, which was most graphics cards made up to 2015 - but sadly none since. GroovyMAME works with specially modified Windows graphics card drivers to generate correct signal ranges from the analog output hardware of most off-the-shelf, pre-2015 Radeon and NVidia cards - which are still widely available on eBay ($10-$50).
For arcade preservationists and retro gaming purists, the resulting output to a CRT is sublime perfection, identical to the original cabinet hardware circuit boards, many of which are now dead or dying. This enables creating an emulation arcade cabinet either using a period-correct CRT matching the traits of a certain series of original cabinets, or alternatively, using a special tri-sync or quad-sync analog RGB CRT which is able to display a wide variety of signals in the 15 kHz, 25 kHz, 31 kHz and 38 kHz ranges. This is what I have in my dedicated analog CRT cabinet, and using GroovyMAME along with a 2015 Radeon GPU it can precisely emulate 99+% of raster CRT arcade cabinets released from 1975 up to the early 2000s, accurately (and automatically) recreating hundreds of different native resolutions, frame rates, pixel aspect ratios and scanning frequencies on my cabinet's 25-inch CRT. For more info on GroovyMAME and accurate CRT emulation, visit this forum: http://forum.arcadecontrols.com/index.php/board,52.0.html.
Fascinating post, and innovation to get this all emulated with more recent hardware. Thank you for sharing.
What is the future for RGB-output video cards looking like? Are there more specialised cards still in production?
And are these tri-/quad-sync analog CRTs still manufactured?
The feeling of CRTs and contemporaneous hardware provokes almost overwhelming nostalgia for me, and I feel like modern television hardware is only just beginning to catch up with respect to UI responsiveness and reliability, for instance changing channel & volume, and playback functions like pausing, fast-forwarding and rewinding videos.
> What is the future for RGB-output video cards looking like? Are there more specialised cards still in production?
Sadly, no graphics card manufacturer still makes cards with native analog RGB output and, AFAIK there haven't been any since ~2015. There may be cards which have analog output but it's not natively generated with variable analog timing (dot clocks etc). Instead it's created as a native digital signal and then converted to analog, at which point it's no better than adding an analog converter externally to an HDMI or Displayport output connector (this is pointless and not worth doing).
On the positive side, there are a ton of used graphics cards with native analog output available on eBay for dirt cheap (or free if you have PC hobbyist friends or access to a typical corporate or edu recycle pile). Arcade cabinet games which output to CRT monitors stopped being made by around 2005 and game consoles which hooked to CRT TVs ended with the sixth generation (PS2, Gamecube, Dreamcast). This is good news because emulating the vast majority of arcade cabinet and console games up through the early 2000s doesn't require a fast GPU or CPU so using an older GPU with native analog output does everything you need (and saves a lot of money).
The last, best GPU made with native analog output was the Radeon R9 380x launched at the end of 2015. I have this card in my arcade cabinet emulation system (plugged into a 2014 HP ProDesk 600 G1 with i5-4590 Haswell CPU (~$70 used on eBay)). This PC is more than fast enough to perfectly emulate everything relevant to CRT gaming and the 380x GPU is substantial overkill. Being the last analog output card, the 380x is overpriced on eBay at >$50 but I only got it because I have a quite rare Wells Gardner D9200 quad-sync industrial CRT made specifically for arcade cabinets and that monitor is fairly unique because it can scan up to 38 kHz (800x600 resolution). No games originally designed for CRTs use resolutions that high so it's only relevant for some PS2 games, and only then if I use non-authentic, 2x upscaling or HD texture packs in the emulator. So, I might theoretically, occasionally actually need the otherwise uselessly excess power of the 380x. If you're not me, just use almost any Radeon graphics card from 2012-2014 which can be had for ~$10-$15 to drive your CRT with GroovyMAME. Card compatibility list: https://emulation.gametechwiki.com/index.php/GroovyMAME GroovyMAME forum: https://forum.arcadecontrols.com/index.php/board,52.0.html
> And are these tri-/quad-sync analog CRTs still manufactured?
All CRT manufacturing stopped around 2010. I was fortunate to buy my industrial-grade, quad-sync CRT new directly from the manufacturer in 2009. However, there are a lot of used CRT TVs locally available from Craigslist and thrift stores, many of them for free or close to it. Higher quality CRTs like the Sony PVM and BVM series made for video production studios and broadcasters are now collectables selling for astronomical prices. However, high-quality consumer TVs from the late 90s and early 2000s, like Sony WEGA and any of dozens of models based on the well-regarded Sony BA-5 chassis, can be had in good condition for fairly reasonable prices. Many of these can also be modded to accept direct analog RGB input in addition to composite or S-Video, elevating their quality significantly higher (Modding Guide: https://sector.sunthar.com/guides/crt-rgb-mod/sony-ba-5.html). With the exploding interest in CRT retro gaming (for example: https://www.reddit.com/r/crtgaming), I'm surprised no one has yet restarted CRT manufacturing, but CRTs are pretty complex beasts, essentially a kaiju-scale vacuum tube with arcane analog driver circuitry bolted on.
> I feel like modern television hardware is only just beginning to catch up
To be fair, with expanded color spaces, higher contrast, wide color gamuts (HDR10 etc), high-nits, faster gray-to-gray response times, black frame insertion and VRR, the latest, most expensive digital flat screen tech is getting closer in many ways. I can imagine it maybe getting there in the future but, unfortunately, the hardest part may be actually finding a modern television without ads, apps, online updates and DLC bloat.
Although I'm a retro purist and will never part with my beloved CRT-based emulation cabinet, I know not everyone is quite as obsessed or may not have space for such a system. So, it's important to also share that in recent years modern GPU-based pixel shader CRT emulation has gotten impressively closer to emulating analog CRTs, including shadow masks, analog glow, glass warping and even ray-traced bezel reflections. If you can't play on a real CRT, I encourage everyone to at least play games which were originally created for CRTs via CRT emulation. It's easy to do and retro pixel art looks so much better when presented as originally intended. See this image comparison: https://x.com/CRTpixels/status/1408451743214616587. Without CRT scanlines and phosphor glow, the art looks terrible and is just completely wrong. Check out Retroarch's shader community (https://forums.libretro.com/c/retroarch-additions/retroarch-...) and ReShade.
I only discovered that a lot of games output at weird refresh rates when I was putting together a MiSTer. My VRR TV handles most of the weird refresh rates and resolutions, but not all (in particular not Bad Dudes vs. the Dragon Ninja).
I didn't know MK 1, 2 and 3 run at 54 Hz! MiSTer doesn't have support for the MK boards, so I've only played them via emulation on a PC; this means they have been running too fast! (I think)
One thing: my PC has an Nvidia 3070. If I tell RetroArch to output at the original refresh rate (which my VRR TV should be able to handle), I'll get the correct refresh rate?
I think what you have been talking about is that post-2015, analogue output on graphics cards isn't natively generated, so it's as bad as an HDMI to analogue adapter. Digital output to a VRR display is completely separate.
> this means they have been running too fast! (I think)
Not necessarily. There are settings in MAME which provide some options on how to address frame rate mismatches. I think they all have various trade-offs like dropping, doubling or blending frames but I'm not current on what they are since all my serious retro gaming is on my CRT-based arcade cabinet :-). In theory at least, a modern GPU's ability to synthesize motion interpolated frames should allow fairly decent frame rate matching, even without VRR.
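To make the mismatch concrete: pushing a 54 Hz game out at a display's ~60 Hz with no compensation runs it about 11% fast, while holding the correct game speed on a fixed-rate display means a handful of frames per second get shown twice (the judder those MAME options are trading off against). Quick arithmetic:

```python
# A 54 Hz game on a ~59.94 Hz fixed-rate display: speed error vs. repeated frames.
game_hz, display_hz = 54.0, 59.94

speedup = display_hz / game_hz - 1
print(f"one game frame per display refresh: {speedup:.1%} too fast")

repeats_per_second = display_hz - game_hz
print(f"at correct speed instead: ~{repeats_per_second:.1f} repeated frames/second")
```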
> if I tell retro arch to output at the original refresh rate (which my vrr TV should be able to handle) I'll get the correct refresh rate?
Yes. VRR is basically intended to do with a digital display what an analog CRT has always done: vary the display's refresh rate to match the source clock. However, I'll add a small caveat here. VRR is relatively new, and advanced digital display features newly added to revisions of existing consumer video standards have a tendency to go through some teething pains as various device and display manufacturers figure out the nuances. I've only played around a little bit with VRR but haven't done any serious validation myself. Until it's more mature, I wouldn't assume correctness or compatibility of any recent addition to HDMI 2.1 (looking at you, Source-based Tone Mapping!). So... trust but verify :-)
Also, since you mentioned Retroarch, here's a ProTip: Retroarch is admittedly convenient but for serious retro gaming I generally recommend using the original emulators directly, especially if you're striving for emulation accuracy and display correctness. MAME's interface is definitely more clunky and it's probably possible to achieve identical results with Retroarch but as a wrapper, it adds another layer of abstraction and potential for gremlins. There's also the potential for cores to not be up to date and the RA authors do change some things and make certain trade-offs to integrate disparate cores into their architecture. For CRT users I also don't know if there's even a GroovyMAME core for RetroArch.
> Retroarch is admittedly convenient but for serious retro gaming I generally recommend using the original emulators directly, especially if you're striving for emulation accuracy and display correctness.
This comes with the caveat that sometimes RetroArch's frontend is better than the standalone emulator's frontend -- RetroArch's graphics and input is quite mature and configurable on all platforms, and I've definitely had problems with bugs or latency in some less-maintained standalone emulators that aren't a problem when running through RetroArch. But yeah, agreed otherwise -- RetroArch is another layer between you and the emulator core that doesn't always do what you want or expose the options you need.
Yeah, on RetroArch and specific emulators, you are right. I've been messing around with emulators since about 1997 and it's only in recent years I've been using RetroArch; I miss the days of ZSNES and Genecyst. I actually think RetroArch's UI is pretty awful. Also, I don't get why their website reads like an infomercial. For example, they really want to make a point that FPGA emulation is no match for RetroArch, which is silly; they both have advantages and drawbacks. Maybe they make money off RetroArch, not sure.
The layer-of-abstraction point you make is spot on. I've been using my Steam Deck a lot; I'm using EmuDeck, which installs EmulationStation, which installs RetroArch. Configuration is scattered everywhere.
I haven't mucked about much with individual emulators in a while, so I'm not sure if they support run-ahead latency reduction features; that's the one big thing I like in RetroArch.
Edit: my main issue currently is figuring out what settings I should be using for particular cores/emulators. The Steam Deck screen isn't VRR, but it does allow refresh limiting, so that is its own set of problems. Similarly, I think I'm using the right settings for my PC VRR setup, but I'm never certain.
Actually, I spend more time fiddling with settings vs. playing games!
The Mega Drive also uses a 320 wide mode for most games, the "width" of an analogue TV picture is somewhat arbitrary and based on things like available bandwidths / sample rates and so on, so it's a bit flexible depending on system design.
I love this comment, because I always saw the different ports as stretched compared to arcade versions, and I honestly had no idea why. That includes even old ports to Atari 8-bit computers, and the various versions, like Ms. Pac-Man and Super Pac-Man.
This SNES video analysis one is incredible. I've always had all of this stuff running around in my head for how to explain how weirdly cool video generation for NTSC is, and you have done an incredible job finding a way to do so.
There is yet another reason for the weird frame and horizontal scan rates. When NTSC was originally introduced as a broadcast standard over a single RF-modulated signal, the sound carrier was embedded in that signal as well. [1] Actually, I just found that Wikipedia does a good job of describing this on the NTSC page [2]:
> When a transmitter broadcasts an NTSC signal, it amplitude-modulates a radio-frequency carrier with the NTSC signal just described, while it frequency-modulates a carrier 4.5 MHz higher with the audio signal. If non-linear distortion happens to the broadcast signal, the 3.579545 MHz color carrier may beat with the sound carrier to produce a dot pattern on the screen. To make the resulting pattern less noticeable, designers adjusted the original 15,750 Hz scanline rate down by a factor of 1.001 (0.1%) to match the audio carrier frequency divided by the factor 286, resulting in a field rate of approximately 59.94 Hz.
So yes, yet another difficulty with NTSC -- sound actually splattered visual noise on the screen as well!
A bit unrelated, but the links to your books on your website are no longer working. I tried to contact you via email about this issue. Is this intentional, or will you fix them?
It was originally 60 fields per second (30 interlaced frames per second), on black and white TVs.
The highest frequency generated in a black and white TV was the horizontal scan rate, which was a multiple of the frame rate. With the addition of the NTSC color signal, which used a 3.579545 MHz carrier wave, the highest frequency generated in the TV became much higher. To keep the hardware simple, all lower frequencies were still divisors of the highest frequency, now the color carrier wave. For the field rate, it came out to 59.94 fields per second.
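For anyone who wants to check the arithmetic behind these two comments, the whole NTSC color timing family hangs off the 4.5 MHz sound inter-carrier: the line rate was set to 4.5 MHz / 286, the color subcarrier is 455/2 times the line rate, and the field rate drops out at ~59.94 Hz. A quick verification in Python:

```python
# NTSC color timing derived from the 4.5 MHz sound inter-carrier.
SOUND_CARRIER = 4_500_000                  # Hz
LINE_RATE     = SOUND_CARRIER / 286        # ~15,734.266 Hz (was 15,750 for B&W)
COLOR_CARRIER = LINE_RATE * 455 / 2        # ~3,579,545.45 Hz
FIELD_RATE    = LINE_RATE / 262.5          # 262.5 lines per interlaced field

print(f"line rate:        {LINE_RATE:,.3f} Hz")
print(f"color subcarrier: {COLOR_CARRIER:,.2f} Hz")
print(f"field rate:       {FIELD_RATE:.4f} Hz")        # ~59.94
print(f"B&W/color ratio:  {15_750 / LINE_RATE:.4f}")   # the 1.001 factor
```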
> Trivia: Besides the annoying black band, the game code was also rarely revised to account for the VSYNC which occurred at 50.00697891Hz instead of 60.098Hz. This resulted in game running 17% slower than intended. European gaming was a real dumpster fire. But luckily without the internet we did not know about it.
This one hits home. Although my examples are not specific to the Super Nintendo, it reminded me of the first time I played/watched Sonic the Hedgehog on the Mega Drive (Genesis).
I wasn't impressed with the game. It looked clunky and just felt slower compared to the Master System version. It wasn't until the rise of YouTube that I realised the difference in speed between NTSC and PAL is huge. It's not just the speed of the game, but the music. It sounds horrible on PAL!
Don't get me wrong - I knew about PAL during the 16-bit era, and the need for the "black box", but I didn't realise how much of a difference it made. I am sure the console magazines at the time would say the difference was minor in most games. One of the exceptions (honestly) was Doom on the SNES. The NTSC version had a bigger screen.
I remember being good at Punch-Out when I was a kid on the NES. I could beat Mr. Dream (or Mike Tyson) in the first round. Of course, I was playing the PAL version. If there was some kind of competition in the USA, I would have been destroyed in the first round! I would have been convinced I was framed!
Super Metroid is a game where the developers did make revisions so that the game would play at the correct speed on PAL consoles. But those tweaks still end up making material differences in gameplay, especially in a speedrunning context, because the slight differences in physics constants and animation timings can make a difference when exploiting glitches or race conditions. For example, the slower framerate makes it easier to clip through things, because Samus and her projectiles move more pixels in one frame, so this gate glitch can only be done on PAL: https://www.youtube.com/watch?v=RvyIwtO_qgM
Also, the developers properly adjusted Samus's physics constants and animation timings for the new framerate, but they didn't adjust enemies, cutscenes, or other aspects of the game environment. So Samus moves at the same speed as on NTSC, but the rest of the world moves slower. This means that on PAL you can grab Bombs and escape the room just in time before the door locks, skipping the miniboss fight: https://www.youtube.com/watch?v=R3t8TIIj7IM On the NTSC version, that same skip requires a complicated setup and several dozen frame perfect inputs in a row, and only one person has ever managed to pull it off: https://www.youtube.com/watch?v=jcKUMk5g8Wk
Here's a comparison of the fastest tool-assisted speedruns between NTSC (left) and PAL (right): https://www.youtube.com/watch?v=KD_-thqcB5s Both runs take the same route up until the very end; the NTSC version is faster in almost every single room, but PAL ends up finishing first because the arbitrary-code-execution setups are very different. The NTSC run has to do a very slow sequence of pausing and unpausing to move through a door without activating it, in order to get out of bounds and trigger memory corruption. Whereas on the PAL version, we're able to exploit a race condition in the game's animation system to achieve ACE fully inbounds. The race is between a spike's knockback timer and Samus's landing animation; because Samus's timings were revised for PAL but the spike's were not, the timing works out a little differently and the race ends up being exploitable in this context on PAL but not on NTSC.
I absolutely love Super Metroid, but only played it briefly on a Super Nintendo in the 1990's. I've played the NTSC version on emulators for years, and definitely came across timing issues with memory corruption when pausing through a door, but it wasn't even intentional. I always thought developing for different architectures was the only issue developers had to deal with (beyond faster or slower CPU/GPU/RAM), but had no idea that even something like the SNES had issues between NTSC and PAL. This whole comment section has been amazing as someone who loves these classic games, and also enjoys watching speedruns. Gaming got me into software development, even though I never got into that side of development, but your comment and articles like this remind me why development is so interesting (and difficult for the strangest reasons!).
My memory might be a little off but in Australia, because of magazines like 'Hyper' we were made aware of the timing differences but there wasn't anything we could do about it. Thanks for the heads up guys! We could have lived in ignorant bliss!
So when the Dreamcast came along, it was the first to offer games that you could switch between 50Hz and 60Hz but only if your TV could handle it. It also meant that with a lot of games that didn't account for this, you could make things easier by switching back to 50Hz. I recall Crazy taxi being much easier at 50Hz.
Wow, that sounds like it might explain why my timing is always off playing SNES games on emulators. I grew up playing the PAL versions.
I always put it down to lag on modern TVs.
It depends on the system and the emulator. The file format most commonly used for NES ROMs does not include region information, so most NES emulators require the user to manually select NTSC or PAL timing to match their ROMs. I think SNES emulators should automatically use the correct framerate though, as the SNES ROM header does include a region field.
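As a concrete illustration of that header field: the SNES internal header carries a destination-code byte that an emulator can read to pick 50 Hz or 60 Hz timing. A minimal sketch, assuming a plain LoROM image with no 512-byte copier header (so the internal header sits at 0x7FC0 and the destination code at 0x7FD9), and treating the codes commonly assigned to PAL territories as PAL; real emulators do quite a bit more validation than this:

```python
# Sketch: guess NTSC vs PAL timing from a SNES ROM's destination-code byte.
# Assumes an un-headered LoROM image; HiROM images keep the header at 0xFFC0.
DEST_CODE_OFFSET = 0x7FC0 + 0x19          # 0x7FD9 in the internal header
PAL_CODES = set(range(0x02, 0x0D))        # codes commonly used for PAL regions

def guess_timing(rom_path: str) -> str:
    with open(rom_path, "rb") as f:
        f.seek(DEST_CODE_OFFSET)
        dest_code = f.read(1)[0]
    return "PAL (50 Hz)" if dest_code in PAL_CODES else "NTSC (60 Hz)"

# Example usage (hypothetical file name):
# print(guess_timing("super_metroid.sfc"))
```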
The NES/SNES minis always use the NTSC versions of the game, regardless of region.
No, it was a common issue on PS1/PS2 era 3d consoles as well. It wasn't until we got digital connections like HDMI etc that the PAL/NTSC timing issues went away completely. The disconnect between rendering and logic was not so common back then even in 3d titles, there are many games that just run slower in PAL etc.
> Its not just the speed of the game, but the Music.
I get that the game speed depended on the framerate, but playing music at a frequency reduced by 17% would have sounded really horrible; I don't think they would have gotten away with it. But then again, what do I know... The only system I know a bit about is the Amiga, which had dedicated sound hardware, so I'm pretty sure it was not tied to the video frequency; no idea about other systems.
At the time, I would not have figured anything different... other than that I just didn't understand the hype around the Mega Drive (Genesis):
- The 3-button game controller is horrible, and the D-PAD is worse!
- The main mascot game (Sonic the Hedgehog) was clunky with poor music
Of course, if I had experienced the Genesis on an NTSC then my initial view of the machine might have been totally different!
Overall many games I played in the UK - no matter the console... I was happy because I had nothing to compare it to. It's just Sonic the Hedgehog, from memory, was the main one that just felt off. I would not have thought it was a PAL vs NTSC thing.
You're right, the aspect ratio internally is 8:7. Emulators like Snes9x default to this. This is effectively rendered on a CRT display at 4:3 due to how CRTs work and stretch perfectly square pixels into rectangles. At least that is my understanding.
Doesn't this miss the part where the 256x224 (8:7) output resolution gets stretched into a ~4:3 (actually 64:49) image?
The SNES has a dot rate of ~5.37 MHz which is slower than the square pixel rate defined by the ATSC standards of ~6.13 MHz. It's exactly 8/7 slower, so pixels are stretched horizontally by 8/7, causing the 8:7 resolution to be stretched to (8/7)*(8/7)=64/49, which is close to 64:48 = 4:3.
> Result in an aspect ratio close to 4:3. This would mean 224*(4/3) = 298 visible dots.
If you consider what I mentioned, the factor would be (4/3)/(8/7) = 7/6, so they would have to choose something closer to 224*(7/6) = 261.33... visible dots. Which is much closer to what they chose with 256.
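Spelling that math out end to end (this just re-checks the figures from the two comments above; the exact fractions used for the two clocks are the commonly quoted ones, 315/88 * 6 / 4 MHz for the SNES dot clock and 135/22 MHz for the ~6.13 MHz square-pixel rate):

```python
from fractions import Fraction

MHZ = 1_000_000
snes_dot   = Fraction(315, 88) * 6 / 4 * MHZ   # ~5.3693 MHz SNES dot clock
square_dot = Fraction(135, 22) * MHZ           # ~6.1364 MHz square-pixel rate

stretch = square_dot / snes_dot
print(stretch)                                  # 8/7: how much wider pixels display

storage_ar = Fraction(256, 224)                 # 8:7 framebuffer
print(storage_ar * stretch)                     # 64/49, just shy of 4:3

# The article's "visible dots" estimate, corrected for the pixel stretch:
print(float(224 * Fraction(4, 3) / stretch))    # ~261.33
```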
Oh my! The only way to get even worse video quality (and extra noise) was to add a two-wire UHF screw terminal adapter between the RF switch box and the TV. So of course that's what I had as kid on my 1970s knock-off Pong game console (we couldn't afford the real Atari). Later in life, video engineer me could laugh about this. I'm just glad that at the time kid me had no idea how awful it was. :-)
The 8:7 artwork aspect ratio is visible in a few SFC/SNES ports to other platforms, like ROCKMANX3 / Mega Man X3. The PSX/Saturn/PC versions of that game retain the original art unstretched and instead add stage-appropriate pillarbars to pad the 8:7 out to 4:3. Very distracting to play since I'm used to the original ver. Check out some screenshots of the Saturn ver here and you can see it — everything is slightly too skinny: https://segaretro.org/Mega_Man_X3
How much of the SNES resolution is hard-coded in the console hardware, versus being something the cartridge could drive? Could a cartridge that didn't need to load sprites (e.g. because it had its own coprocessor), and had its own onboard clock, theoretically drive more than 256 horizontal pixels per line?
So with a coprocessor you can render your own frames and put them in memory where the next line’s tiles are going to be pulled from. That is what the SuperFX did, I think.
But in the end you’re still stuck with the limitations of the PPU actually drawing it in pixels and number of colors and such.
Are there any emulators that accurately simulate the CRT appearance instead of just drawing the pixels straight to the window? This could be done performantly with a GPU shader. (I recall some emulators having an aesthetic scanline effect you can enable, but that's not the same thing.)
Wow. I had no clue that an input device actually had any sort of control over the drawing of a CRT. I thought you just throw out signal at a certain timing and that’s it.
Also this article was wonderful in the way that it didn’t waste a word. Very concise.
I'm real. I just wanted to make a point about "59.94Hz is such a weird number. Isn't the power grid running at 30Hz and TVs used to double it?" They don't double it; the grid was in fact running at 60 Hz or 50 Hz depending on where you lived.
"Please don't post insinuations about astroturfing, shilling, bots, brigading, foreign agents and the like. It degrades discussion and is usually mistaken. If you're worried about abuse, email hn@ycombinator.com and we'll look at the data."
"Don't sit too close, move back a bit". Any truth whether sitting too close makes for bad eye sight? Wondering of any effects (e.g. leakage) that goes past the target screen area.
It's not a dedicated CPU instruction, but rather a write to an MMIO register with a certain bit set. And many games will follow patterns like computing the register value during game logic, saving it in RAM, and copying the value from RAM to the register later during vertical blanking, or they might have a code path to enable hi-res mode that is present but not used if the code came from a library or engine. So in general, the problem is undecidable; and limited to SNES games, probably not very realistic.
The most reliable thing to do is probably to play through the entire game in an emulator (maybe from a TAS movie) and record whether it actually enables high-res mode or not.
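If someone did want to automate the "play it and watch" approach, the easy version is to post-process a log of PPU register writes from the emulator and flag any write that selects a hi-res background mode. A rough sketch along those lines; the trace format here is made up for illustration, and it only checks the BGMODE register ($2105, whose low three bits select the mode, modes 5 and 6 being the 512-pixel-wide ones), so it ignores pseudo-hi-res enabled through $2133:

```python
# Scan an emulator write trace for hi-res background mode selections.
# Hypothetical trace format: one "address value" pair per line, in hex,
# e.g. "2105 05".
BGMODE = 0x2105             # SNES PPU BGMODE register
HIRES_MODES = {5, 6}        # 512-pixel-wide background modes

def uses_hires(trace_path: str) -> bool:
    with open(trace_path) as f:
        for line in f:
            parts = line.split()
            if len(parts) < 2:
                continue
            addr, value = int(parts[0], 16), int(parts[1], 16)
            if addr == BGMODE and (value & 0x07) in HIRES_MODES:
                return True
    return False

# Example usage (hypothetical log produced while playing back a TAS movie):
# print(uses_hires("ppu_writes.log"))
```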
Older TVs obviously did not have SCART, and it took a while after SCART was created in 1976 for it to become widespread. Also, a lot of cheaper TVs had only RF in, and a lot of smaller TVs had RF only or RF + composite input only, even in Europe.
Growing up in the UK, everyone I knew used it for VCRs, DVD players and digiboxes. It always made a notable improvement over the RCA jacks, and I longed to get the SCART cable for my PS2 (never did). Famously bulky and stubborn with their wires joining the connectors at an aggressive 45° angle; I never had one go bad on me.
I've read that it could even do HDTV in theory because it had YPbPr lines, but this was never seriously attempted and remained rare in practice.
As an American I never knew about it until it was mentioned one day on a forum on the internet I was reading.
Other than the crazy size of the cable, seems like quite a big improvement over our random assortment of cables we went through over the years with composite -> s-video -> component.
It remains fascinating to me that even today, if we boarded flights to each other's countries, there are still many small things that would need explaining.
I would first be shocked to see the smaller two-pronged electrical plugs in the flesh. Then I might be in trouble with the law if I walk across a road. And finally lose my mind over the fact that asking for 'tea' gives me something else entirely!
The fact that parts of Europe have a "freedom to roam" (if I remember the term correctly) is just insane to an American. How could you _not_ be trespassing?
Here in Belgium almost everyone used SCART cables for game consoles, VCRs and DVD players. We had 2 SCART inputs but 3 SCART devices: PSX, VCR and DVD. So we had to switch from time to time. There were even SCART switches, but they all sucked.
At least with a switch you wouldn't get an electric shock while plugging in the SCART cable (which was a major issue for me, and not just static electricity).
In Sweden SCART was the default - your VHS deck or satellite TV box would come with a SCART cable. I don't think our 1995 TV even had composite or S-video ports, you had two SCART and RF[0]. For connecting up PCs or video cameras, we used a SCART to composite adapter plug (with a little switch to choose if it was in or out).
Author is French; the French word for "carve" could translate to something like "sculpting/tailoring" in English, maybe that is what he meant? Tailoring the video system to the proper settings/resolution, or something?
Not a native English speaker, but I liked "carve" better since to me it has more of an implication of having to stay within the bounds of the available medium, with the connotation of this being a more artistic than purely scientific craft.