My background is I wrote emulators for 24 systems and counting (higan and bsnes), and I want to try my hand at technical writing, in hopes of encouraging more people to get involved with emulation. I'll hopefully get better over time and with more feedback.
I'll be writing more about video emulation in the near future. I would like to cover techniques for interframe blending (the flickering shadow), composite artifacts (the waterfalls in Sonic the Hedgehog), color bleed (translucency in Kirby's Dream Land 3), screen curvature, scanlines, color subcarriers (and how shifting it causes shakiness you see in composite video), interlacing, phosphor decay, phosphor glow, aperture grilles, non-1:1 RGB pixel layouts (Game Gear LCD), etc.
I'm intending to write about all things emulation there, in fact. I want to cover input latency reduction/mitigation, removing audio aliasing above the Nyquist limit using FIR and IIR low-pass filters, designing thread schedulers and priority queues, providing mirrored and organized PDF datasheets for all the ICs used in retro gaming systems, etc.
Basically, I want to cover all the stuff you don't usually find in "how to write an emulator" tutorials: all the polish that takes a new emulator to a well-rounded one.
It's not ready yet (the site is brand new), but I'll have an RSS feed at byuu.net/feed in a couple of days if anyone's interested in this sort of content.
I grew up coding demo effects on the C64 and the Amiga.
I always loved how the vblank synced ultra-smooth graphics slide over the screen while the CRT had a sort of magical "glow" to the picture.
I sit here by a nice iMac but often feel like that magical glow just isn't there, even though the picture is 1000 times better than on the old television I used.
Is there some kind of effect that can be applied to achieve this glow on a good LCD screen?
Or maybe it's just nostalgia for childhood memories lying to me?
Phosphor glow is one of my favorite effects. One of these days I'd really love to have a software-mode C++ filter for that.
It can be done through filtering, but it's really hard to make it look good. A person named Psiga made this mock-up for Secret of Mana in Photoshop that has always astounded me: https://sites.google.com/site/psigamp3/PhosphorSimTest1.jpg
Unfortunately, no one's been able to figure out exactly how it was made to replicate it in software. But something like it would really add a lot.
We have pinged them in the past about it, actually ^^;; It's been a mini-obsession of mine since I found it, hahah. They regrettably don't remember how it was made.
Ah bummer. I would have thought they'd have kept the PSD lying around somewhere like I do and it would've shed some light. Maybe someday someone will figure out something that looks right!
I would like that, thank you! A few people have tried over the years, and we even contacted the person who made the original image, but he didn't remember how he did it ^-^;
You could probably use machine learning for fine-tuning the parameters of whatever fast approximation comes out of that though.
(having a plausible first-principles model and then automatically tuning the knobs until the error with a ground truth is minimised is a form of machine learning, right?)
Pretty sure a cross correlation of the source and resulting image will reveal most of the transformation's recipe, aside from a few details like the fine grid lines added in. If I have time tomorrow I might take a crack at it.
Watch a CRT draw in slow motion. You get an ultra-bright flash, then an alarmingly quick phosphor-glow falloff.
You'd need, basically, a high-refresh-rate HDR monitor with an actual sub-ms response time, so you can super-bright blink the character, essentially black-framing it.
The rest of the actual softness of the glow can be emulated conventionally with shaders (there are some super swanky terminals and emulators that have that part already done).
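For the falloff part, a very rough software sketch is to keep the previous output frame around, let it decay toward black, and take the brighter of the decayed value and the new frame. Real phosphor decay isn't a single exponential and differs per phosphor colour, and the decay constant below is a made-up assumption:

    // Crude phosphor-decay sketch: blend the new frame with an exponentially
    // decayed copy of the previous output. Values are linear-light floats in 0..1;
    // the decay factor is an illustrative guess, not a measured phosphor constant.
    #include <algorithm>
    #include <vector>

    void applyPhosphorDecay(std::vector<float>& persistence,   // previous output, reused as state
                            const std::vector<float>& frame,   // new frame from the emulated core
                            float decay = 0.65f) {
      persistence.resize(frame.size(), 0.0f);
      for (size_t i = 0; i < frame.size(); i++) {
        persistence[i] = std::max(frame[i], persistence[i] * decay);
      }
    }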
Although this flicker would indeed be necessary for a 100% accurate reproduction, I'm not sure that is really the thing that current CRT shaders are missing for feeling "just right". After all, the linked example image feels very convincing and yet is static.
Agreed, I love this kind of look. I emulated a curved screen, phosphor glow, and scanlines somewhat in my Unreal Engine-based Z80 emulators, although I think the look could still be improved further. This is fairly easy to do with UE4 shaders and geometry for the curved screen; a bit harder to do all from scratch in a standalone program, I would think.
Thank you for taking the time to write your expertise down. Even if the article doesn't get much attention right now, the value of writing down knowledge is often understated.
For someone who has been so adamant about cycle-accurate emulation, I was surprised to see such ad hoc color transformations being used. Has anyone in the emulation community tried to accurately estimate the end-to-end response of these systems? Something along the lines of using a colorimeter with a reference monitor (like a Sony PVM CRT) and a homebrew ROM that generates test patterns.
There are certainly limits to idealism. This is currently the best we have for the listed systems, I'm afraid. Hopefully someone reading about this will take an interest in trying some of the things you mentioned and help improve the situation.
As mentioned, the systems that need this most don't even have frontlit displays, so a colorimeter would not work on them. There's also the question of what the correct contrast setting should be on the analog adjustment wheels these systems had. Ideally you'd want to capture multiple contrast settings, and then try to devise an underlying adjustment algorithm from all the data sets to approximate the full analog range of positions.
I'm not into emulation at all, but I wonder at how low a level the passive LCD panels were controlled by software in the game or system ROM.
The wikipedia page on Walsh functions says:
>They are also used in passive LCD panels as X and Y binary driving waveforms where the autocorrelation between X and Y can be made minimal for pixels that are off.
Not sure how applicable that is on which of the systems you are considering. Sorry for the rabbit hole ;)
I recently started playing around with RetroArch, and found it quite hard to make sense of all the shaders it comes with. :) Looking forward to the coming articles to get a better understanding of the parts that make up the pipeline from raw RGB pixels to something that looks like what I remember playing on my TV as a kid.
One question about the article: do emulators generally have the color correction/gamma adjustments built in? If so, is there typically a flag to turn it off, so it can be done in shaders instead?
I would imagine most emulators do not include colour-correction or gamma adjustments for RGB-native consoles, unless the "native" colours are particularly ugly.
For example, the original NES does not work with RGB, but with analogue NTSC signals, so an emulator has to do something to make the output appear on an RGB screen, and that implicitly involves colour-correction.
The SNES and Genesis do work with RGB, and the output is good enough to look reasonable on modern monitors, so most emulators would probably leave it as-is.
The Game Boy Advance works with RGB, but the built-in screen on earlier models was (as the article describes) low contrast, and games were made garish to compensate. Later models of GBA used better screens, and the GBA Player for the Gamecube output straight RGB to an ordinary television, so later games often used more sensible palettes, or even provided a palette option in the main menu. Thus, GBA emulators probably provide their own colour-correction, and almost certainly make it optional.
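As a rough illustration of what such an optional colour-correction pass tends to do, here's a hedged sketch: bleed a little of each channel into the others to tame the garish palettes, then apply a net-darkening gamma to mimic the dim LCD. The mixing weights and gamma values are illustrative assumptions, not the constants any particular emulator uses:

    // Hypothetical GBA colour-correction sketch. Inputs/outputs are linear 0..1.
    // All constants are illustrative assumptions, not values from a real emulator.
    #include <algorithm>
    #include <cmath>

    void correctGbaColor(float& r, float& g, float& b) {
      // Desaturate slightly by bleeding channels into each other.
      float r2 = 0.80f * r + 0.10f * g + 0.10f * b;
      float g2 = 0.10f * r + 0.80f * g + 0.10f * b;
      float b2 = 0.10f * r + 0.10f * g + 0.80f * b;
      // Assume the LCD's effective gamma was higher than a PC monitor's ~2.2,
      // so the net correction darkens mid-tones.
      const float lcdGamma = 3.0f, outGamma = 2.2f;
      r = std::pow(std::clamp(r2, 0.0f, 1.0f), lcdGamma / outGamma);
      g = std::pow(std::clamp(g2, 0.0f, 1.0f), lcdGamma / outGamma);
      b = std::pow(std::clamp(b2, 0.0f, 1.0f), lcdGamma / outGamma);
    }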
> the original NES does not work with RGB, but with analogue NTSC signals
I mean, the video hardware (video DAC? rasterizer?) in a NES is for NTSC/PAL, but that doesn't mean that NES palettes are specified in a YIQ colorspace or anything. The palette data is RGB; you don't have to do anything special to fill a NES-emulator framebuffer beyond what you'd be doing to render a GIF. (The gamma transforms mentioned in the article can be done during blitting with a LUT, but they could also just be done by applying a real gamma-curve transform to the framebuffer texture during screen compositing.)
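For the LUT route, a minimal sketch (the gamma value is a placeholder) is to precompute a 256-entry table once and index into it per pixel while blitting:

    // Sketch: precompute a 256-entry gamma LUT once, then apply it per pixel while blitting.
    #include <array>
    #include <cmath>
    #include <cstdint>

    std::array<uint8_t, 256> makeGammaLUT(double gamma) {
      std::array<uint8_t, 256> lut{};
      for (int i = 0; i < 256; i++) {
        lut[i] = uint8_t(std::lround(255.0 * std::pow(i / 255.0, gamma)));
      }
      return lut;
    }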
> Unlike many other game consoles, the NES does not generate RGB or YUV and then encode that to composite. Instead, it generates NTSC video directly in the composite domain, which leads to interesting artifacts.
So far as NES software is concerned, it just tells the NES to use particular palette entries, and the NES hardware is responsible for coming up with an NTSC signal to send to the television. Because NES software doesn't care about specific RGB values, and because converting NTSC-to-RGB is hard, most emulators use an RGB palette for output, but fans have created a lot of alternative palettes over the years[2], and none of them are "correct".
Presumably whatever palette-entry-to-RGB encoding Nintendo's own emulators (e.g. NES VC, NES Online, the NES classic) use, should be considered at least somewhat "canonical", no?
And I don't mean because Nintendo has any special "auteur" control over what NES games "should" look like (they never expressed such control, since they never shipped a NES with a screen!)
Rather, I mean that the palette maps Nintendo uses in their emulators are probably the same maps Nintendo used, in the other direction, to do the original conversion of the RGB palettes of their PC art-assets, into NES palettes to wire into the hardware. I.e., if Mario's red coveralls in Donkey Kong (a Famicom launch title, so likely ported during hardware development) renders as #800000 on Nintendo's emulators, that's probably the same RGB color that was in the PCX file's palette that informed the tuning of the NTSC rasterizer's output for that particular palette entry.
Nintendo's RGB palettes have typically been what many would call "crap". The NES Classic probably got closest; Wii VC was way too dark. None of them are terribly accurate.
I don't think they used PCX along the way at all. At one point, they'd digitize graphics by using LEDs to scan filled-in graph paper, one tile at a time. I think the closest Nintendo ever got to actually having an official RGB palette for the NES in the old days was the RGB PPU palette (which was very, very different from the colors output by the composite PPU).
The NES PPU is fairly well understood, and palettes can actually be calculated based on that (as opposed to some of the user-created palettes done with visual analysis). All of Nintendo's RGB palettes (AFAIK) are known and dumped.
i love the work you've done on your emulators and your writing--i look forward to the new site and more good content!
if i may offer a suggestion for getting more people involved, i would say please give serious consideration to a UI paradigm rework. i would love to contribute to higan but i find its interface completely impenetrable! in older versions, i was able to sorta figure out how to use icarus to import a ROM after a while, but i tried again a few months back and i couldn't for the life of me get anything to run.
you are a treasure to the emulation community so keep on doing what you love however you want to do it. i just wish i could figure out how to use higan!
I'm currently in the middle of a higan GUI rewrite, and I'd be most happy to get some feedback! It's really difficult because we can't simplify things in ways that would break edge cases on other systems.
Big thanks to you as well as everyone else working on preserving this particular slice of our cultural heritage. I hope that your work will be appreciated more, esp. by game developers and policymakers, as time goes on.
Good stuff!
I like having both options: using the colours as specified, which the displays of the time couldn't show properly, or, on the other hand, getting as close as possible to the original experience.
As I understood it, those were not the two options discussed in the article.
Instead it was: use the color values specified in the code, ignoring that screens have changed and that they won't look the same at all, or account for that and make it look right.
The goal is really emulation, making it look like the original. Making games look better than the original is a harder job; that's more like remastering the game to take advantage of newer hardware, like what's done in rereleases for new systems.
There are two kinds of emulation, though. There's emulation of experience, and then there's what an AV engineer would call a "monitor": a reproduction that shows you the data the system puts out, as-is, without any "client-side" transforms, even if most end-user equipment does them. Flat response curves and all that. In emulation, monitor-type emulators are often associated with debuggers, or a component of them.
Monitor-type emulators are more helpful in a platform SDK, as you can view the output of them through different end-user equipment to determine how you want to "master" your game's assets. Given that byuu's emulators aim to be cycle-accurate and bug-for-bug compatible with their consoles, they're pretty great for developing homebrew against, so I'd expect at least the option to have a "monitor" mode for video (and audio!) output.
Kind of curious what you'd use the monitor-type for in this case. For example, the Game Boy one: you'd want to be able to see the janky, oversaturated colors that are only there to make the actual screen display slightly better? Or maybe that one is just a bad example?
No, that's a perfect example. You'd want to be able to see those colors, to be sure that those are the colors your program is actually attempting to render. When you see the "right" jank, that means you've mastered everything correctly and it'll look right when you write it to a cart and test it on real hardware. It's a way to eyeball the intermediate layer, to know when things are going wrong at that layer, without having to interpret the problem "through" the filters that apply after it.
byuu, thanks for doing more writing. I have enjoyed following you on Twitter and look forward to reading more in depth work from you. Will definitely sign up for the RSS feed.
Writing emulators seems like a really hard thing to do.
What's in it for you? Is it the technical challenge, or something else that people on the outside might not know about?
I was always into RPGs as a kid, and when I found out the US missed out on tons of them, I got into reverse engineering the games and fan translating them. I worked on Dragon Quest 5, Der Langrisser, Mother 3, etc.
Around 2004, I learned most of my code didn't run on real hardware, but ran on emulators. I found out why (the main reason: writing to video memory while the screen is drawing), and submitted patches to the SNES emulators of the time to work like real SNES hardware, but they were rejected because too many fan translations already relied on the old mistakes.
No one back then seemed to care about how accurate emulators were, so I set about writing my own SNES emulator with the goal of making it as perfect as possible, and it kind of spiraled out of control from there. Within a few years we were decapping chips to extract coprocessor firmware, I had to emulate the Game Boy for the Super Game Boy cartridge, the GBA CPU for this one Japanese chess game that had an ARM CPU in it, etc.
I guess I just really like the challenge, and never stopped adding more systems over time. The more systems I emulate, the more often I find I already have most or all of the chips used in a new system emulated. E.g., ColecoVision took me an hour because I already had a Z80 core, a TMS9918 video core, and an SN76489 audio core. I can definitely see how MAME turned into what it is today.
These days, the emulation is the easy part, and organizing all of this code (around 600,000 lines and counting), and getting all of these really unique pieces of hardware to all work in the same GUI, has become the challenging part. I have this really elaborate tree-view of the emulated systems to support zany things like the Genesis "Tower of Power", a complex scheduler that tracks timeslices in attoseconds, practically a full professional DSP stack for mixing multiple audio streams, Reed-Solomon decoding for CD systems, etc. I'm always learning new stuff, and there's always more to improve. I worry that I won't be able to wrap up higan in my lifetime.
There's not a lot of money in emulation, at most I've been offered $2500 for commercial licenses, but showing my hobby work to my employers landed me my last two jobs in software engineering, the latter of which is in Tokyo, Japan. Emulators literally got me halfway around the world. And I even got to work with the developers on Stephen Hawking's voice machine software at one point.
There's been some downsides, and I had a lot of maturing to do over the years, but on the whole, I wouldn't trade this hobby for anything.
> the GBA CPU for this one Japanese chess game that had an ARM CPU in it
I was curious about it, thus I wanted to read more about it and evidently, the source of Wikipedia is your website, but it seems like you took it offline and Web Archive doesn't have it either.
If I recall correctly, the article in question was just "we have now dumped the firmware from all known SNES co-processor chips".
The first SNES co-processor chip, the DSP-1, was based on a weird NEC DSP architecture, and if I recall correctly the part-number was figured out from markings on the chip die, and digging through NEC spec-sheets looking for things that approximately matched the chip's known capabilities. The instruction set and encoding was puzzled out by hand.
Luckily, DSP-2, DSP-3, and DSP-4 all used the same NEC DSP core, just with different firmware, so the same emulator could be used for all of them.
The ST010 and ST011 used a slightly different NEC DSP core, so they required a little more work, but after handling the DSP-1 they weren't too difficult.
The ST018 was incredibly daunting to begin with, since its firmware was much larger than all the other co-processors, and there were no identifying marks on the CPU die and no product sheets to dig up. As a last ditch effort, somebody just opened up the firmware in a hex editor and tried to figure out the instruction encoding from first principles... and eventually they said "that looks familiar", and sure enough it turned out to be the most popular CPU architecture on the planet.
There was also the Cx4, where segher had to reverse engineer the entire instruction set because it's a totally custom, undocumented ISA based on the Hitachi HG51BS architecture.
Specifically, I think the giveaway was the 4-bit condition codes at the beginning of ARM instructions. The code for always executing an instruction is 0b1110, and seeing almost all 32-bit words start with the same non-zero nibble is rather distinctive.
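To make that concrete, here's a rough sketch (my own illustration, not anything from the actual reverse-engineering effort) of the kind of check that makes the giveaway jump out: count how many 32-bit words in a dump carry the 0b1110 "always" condition in their top nibble.

    // Sketch of a heuristic: what fraction of 32-bit words look like ARM
    // instructions with the "always" (0b1110) condition code in bits 31-28?
    #include <cstdint>
    #include <vector>

    double armAlwaysFraction(const std::vector<uint8_t>& dump) {
      size_t words = dump.size() / 4, hits = 0;
      for (size_t i = 0; i < words; i++) {
        // Assume little-endian words, so the condition nibble sits in the last byte.
        uint32_t w = dump[i * 4] | dump[i * 4 + 1] << 8 | dump[i * 4 + 2] << 16
                   | uint32_t(dump[i * 4 + 3]) << 24;
        if ((w >> 28) == 0xE) hits++;
      }
      return words ? double(hits) / double(words) : 0.0;
    }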
That's not a bad idea. When I was restoring older articles on my new site's CMS Markdown format, I stopped at around 2016, but I could go back to older ones.
Writing an emulator was on my list of “wizardly things beyond my reach” (along with compilers), but I highly recommend burning down that list.
While creating a complete and shareable emulator is a big undertaking, making a simple one for fun is surprisingly not too bad. You start with the CPU emulation, and there are tons of comprehensive tests, so you just fix those until you’re done.
You should expect to do a lot of reading though, and you’ll become very familiar with how the system you’re emulating works :-)
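For anyone curious what "start with the CPU" looks like in practice, here's a deliberately tiny sketch of the fetch/decode/execute loop most cores grow out of; the opcodes are illustrative (loosely 6502-flavoured), not a complete or accurate core:

    // Minimal fetch/decode/execute sketch; opcodes and timings are illustrative only.
    #include <array>
    #include <cstdint>

    struct TinyCPU {
      std::array<uint8_t, 0x10000> memory{};
      uint16_t pc = 0;
      uint8_t a = 0;
      uint64_t cycles = 0;

      uint8_t fetch() { return memory[pc++]; }

      void step() {
        uint8_t opcode = fetch();
        switch (opcode) {
          case 0xEA: /* NOP */      cycles += 2; break;
          case 0xA9: /* LDA #imm */ a = fetch(); cycles += 2; break;
          default:   /* unimplemented: the test suites tell you what to add next */ cycles += 2; break;
        }
      }
    };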
Also on this list: write your own toy programming language, even if it's just an interpreter that builds ASTs. Nothing broadened my view of computer programming more than that.
I think for many people emulators are a sweet spot between technical challenge and potential usefulness. You can actually improve the state of the art, it's fun and challenging, and people will actually be thankful!
What led you to repeat bits in the 9->24 example instead of rescale the values? i.e. '(n * 255) / 0b111'? Repeating bits seems like it could potentially give an uneven curve (though not significantly so), and I would have expected the DAC to roughly do the equivalent of the divide.
Historically as a graphics programmer I always see rescaling, potentially with a response curve like sRGB on the back end.
Accurate screen emulation is important for many retro games, and not just because of color issues. For instance the GameBoy didn't support transparency, so some clever devs would make sprites blink very fast to make them appear transparent. Since the GameBoy screen had a fairly slow response time, the ghosting would effectively blend the sprite with the background. If you emulate the game on a modern low-latency monitor you just get a flickering sprite. See for instance the claw's shadow in this gameplay video for Link's Awakening: https://youtu.be/UQlP9sHf5Ho?t=825 (although this video is way worse than what you'd see in an emulator, because the framerate of the video doesn't line up with the GB's ~60fps display, so it doesn't even flicker all the time).
Basically you interlace two images at a high enough frequency and let the analog artifacts blend them. Once again, playing that on a modern LCD screen without post-processing ruins the effect.
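A minimal sketch of the post-processing fix (the interframe blending mentioned upthread) is to average the current frame with the previous one, which recovers the pseudo-transparency at the cost of some ghosting on everything else; the 50/50 weight is just an assumption:

    // Minimal interframe-blending sketch: average the current frame with the
    // previous one to approximate slow LCD response. Values are linear 0..1.
    #include <vector>

    void blendFrames(std::vector<float>& previous,        // previous frame, updated in place as state
                     const std::vector<float>& current,
                     std::vector<float>& output,
                     float weight = 0.5f) {               // 0.5 = plain two-frame average (an assumption)
      previous.resize(current.size(), 0.0f);
      output.resize(current.size());
      for (size_t i = 0; i < current.size(); i++) {
        output[i] = weight * current[i] + (1.0f - weight) * previous[i];
        previous[i] = current[i];
      }
    }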
Has anyone tried to simply point a camera at an old screen, say a CRT, take a photo of one pixel at a time at a few different levels of brightness, then add all the photos together to render arbitrary images? As far as I can imagine, it should capture the behavior of colors and the fuzziness between pixels very well. You could even set up a small still life around the monitor, and get accurate reflections and ambient light in the room.
Stuff like the smoothed out flickering that you mention would still need to be emulated, of course.
It's actually a great idea, assuming that background light levels are properly taken care of and the camera is well calibrated to a linear colorspace. Unfortunately "pixels" don't really blend linearly when transferred over a composite cable and there are some games that use NTSC artifacts to achieve certain colors.
Brightness is a non-local effect. i.e. if you change one pixel, the rendering of the adjacent pixels can change.
Also, the rendering is different depending on how the signal made it into the CRT; in particular, composite video does not preserve certain things about the image, and in some cases developers took advantage of this.
Most famously, the Genesis had a poor composite encoder, which was taken advantage of by many games; Sonic 3 in particular looks bad even on a CRT with a real Genesis if you use RGB or component output.
Even color is a non-local effect. Turning two colored pixels on side-by-side on an Apple II makes them white. I can give you an unbounded amount of detail why :-)
There are shaders to render such 'artifacts' programmatically at about a thousandth of the cost of what you describe. It could potentially be more accurate in the way you describe it, but then what kind of blending mode do you use?
I think you would first find the two closest brightness levels for each pixels, and blend appropriately between them - then simply use additive blending to composite all the pixels together. A naive approach would require compositing as many images as the number of pixels in the input signal, which seems extremely expensive, but in practice I’m sure you could optimize it quite a bit, as most of the area of each photo would be almost completely black.
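A sketch of that compositing, assuming you have linear-light captures of each screen pixel at each brightness step, all registered to the same output resolution (the data layout here is hypothetical):

    // Hypothetical sketch: composite a simulated CRT image from per-pixel photographs.
    // photos[pixel][level] holds a linear-light capture of one screen pixel lit at one
    // of `levels` brightness steps; every capture has `outputSize` samples.
    #include <algorithm>
    #include <cmath>
    #include <vector>

    using Image = std::vector<float>;  // linear-light, one float per output sample

    Image composite(const std::vector<std::vector<Image>>& photos,
                    const std::vector<float>& input,   // per-pixel brightness, 0..1
                    size_t outputSize, int levels) {
      Image out(outputSize, 0.0f);
      for (size_t p = 0; p < input.size(); p++) {
        float v = input[p] * float(levels - 1);
        int lo = int(std::floor(v));
        int hi = std::min(lo + 1, levels - 1);
        float t = v - float(lo);
        const Image& a = photos[p][lo];
        const Image& b = photos[p][hi];
        for (size_t i = 0; i < outputSize; i++)
          out[i] += (1 - t) * a[i] + t * b[i];   // additive blend in linear light
      }
      return out;
    }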
If you can arrange lighting perfectly in your CRT studio, you could take only the difference against the all-off state. You could extract the pixel islands' positions and composite only these smaller chunks afterward.
It could work out, but just not sure if it's worth all the hassle. It seems akin to the way you would do a stop-motion video, I suppose.
You would need to be in a pitch black room with only one pixel at a time lighting the scene. If you want additional light sources in the room they could be photographed and blended in with the same technique.
I think you would need to get the reflections in the same pass. It’s basically ray tracing in real life: emit light from a bunch of points (screen pixels), “calculate” how they bounce through the room (ie adjust the brightness of the corresponding photo), and combine them all into a single image.
> The first detail is that most computer monitors run in 24-bit color mode, which provides 8-bits of color detail for the red, green, and blue channels. But most older game systems do not specify colors in that precision.
> For instance, the Sega Genesis encodes 9-bit colors, giving 3-bits per channel.
> The solution to this is that the source bits should repeat to fill in all of the target bits.
This is an interesting and efficient approach. Interpolating a 3-bit number (0-7) into an 8-bit one (0-255) can be done by dividing the first one by 7 (to normalize it) and then multiplying it by 255 (to stretch it to the full range). The operation order can be changed to avoid having floating-point numbers smaller than 1 in the first step. So, we basically need to multiply each number by 255/7 ≈ 36.4.
So these multiplications give exactly the same results as repeating the bits in byuu's article, and the bit operations are much cheaper. I have some intuition on how it works (we're increasing each "part" of the number by the same factor), but not a math explanation.
But please keep in mind that today's compilers are (most of the time) very smart. Hacker's Delight is somewhat outdated today. There's no need anymore for writing "x << 2" instead of "x * 4".
I'd say it's about intent. Depending on the context, "x * 4" is less clear than the bitshift (For example, when packing multiple values into an integer)
The point isn't "always do multiplication" but to do what makes logical sense. Don't pick your operators for performance, pick them for readability. The compiler will handle the performance aspect for you.
Absolutely, couldn't agree more. I just wanted to clarify why the bit-shifting approach was equivalent to the previous comment's multiplication by 36.4.
I think I get what you're aiming at in your explanation, but you're not being quite explicit enough about how those shifts map to multiplications, and having to assume a world where 1 >> 1 is 0.5 is... consistent, but counterintuitive.
But I think what you're aiming to say is:
r << 5 | r << 2 | r >> 1 amounts (because r is constrained to three bits, so the shifted copies never overlap) to being the same as
(r << 5) + (r << 2) + (r >> 1)
And r << 5 is r * 32, r << 2 is r * 4, and r >> 1 is floor(r / 2), so the whole thing is roughly r * 36.5, which is why it lands so close to multiplying by 255/7 ≈ 36.4. (Repeating the three bits a third time into a 9-bit value is exactly r * 73 = r * 511/7, a perfect normalization to 9 bits; dropping the lowest bit to get back to 8 bits is where the floor(r / 2) comes from.)
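A quick way to convince yourself the two approaches agree for the 3-bit case is to print both; assuming the rescale rounds to nearest, every value matches the bit-repetition result:

    // Compare bit repetition against rounded rescaling for 3-bit -> 8-bit expansion.
    #include <cmath>
    #include <cstdint>
    #include <cstdio>

    int main() {
      for (int n = 0; n < 8; n++) {
        uint8_t repeated = uint8_t(n << 5 | n << 2 | n >> 1);         // repeat the 3 source bits
        uint8_t rescaled = uint8_t(std::lround(n * 255.0 / 7.0));     // normalize, then stretch
        std::printf("%d -> repeat %3d, rescale %3d\n", n, repeated, rescaled);
      }
    }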
The described bit operations are only suitable when the input range is a power of two, whereas your described math is suitable for almost all systems. That part of the article would do better to explain that you simply want to map your highest value to 0xFF and your lowest to 0x00, mapping linearly, rather than talking about mapping bits. (It also doesn't mention HDR or 10-bit output, both of which could improve visual quality in CRT emulation.)
Also, by implicitly doing the mapping to 8-bit first and the gamma stuff later, some precision is lost, but this shouldn't be a concern until you reach around five input bits per component.
There's definitely room to improve the article. I'm intending it to be more of a Wiki-style site with additions over time.
Indeed, internally my emulator calculates 16 bits per channel, performs gamma, saturation, and luminance adjustment, color emulation, etc., and then reduces that to the target display. I have support for 30-bit monitors, but so far that only works on Xorg. I even considered using floating point internally for colors, but felt that was way too much overkill in the end.
In fact, you probably should use normalization when converting arbitrary levels. A floating point multiply is almost certainly cheaper than a while loop. But hey, why not? Just don't use it for any real-time stuff unless you build a color palette cache.
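For arbitrary (non-power-of-two) level counts, that palette cache is just a one-time table of the normalized values; a minimal sketch, assuming a plain linear mapping to 8 bits:

    // Build a one-time lookup table mapping 0..maxLevel inputs to 0..255 outputs,
    // so the per-pixel cost is a single array read instead of a multiply/divide.
    #include <cmath>
    #include <cstdint>
    #include <vector>

    std::vector<uint8_t> buildLevelTable(int maxLevel) {
      std::vector<uint8_t> table(maxLevel + 1);
      for (int n = 0; n <= maxLevel; n++) {
        table[n] = uint8_t(std::lround(n * 255.0 / maxLevel));
      }
      return table;
    }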
Ok, I gotta say: none of the shaders or scanlines I've tried before ever came close to what I'm seeing in those videos; to the point I'd think "uh looks nice but that's not how I remember from the 90s".
Now that CRT Royale... Color me impressed! Gonna try it as soon as I get access to a PC.
I've heard people say for CRT Royale to be truly effective you need to use it on a 4K TV. It apparently needs the additional resolution to truly shine.
Having said that, it already looks really impressive to me on a 1080p TV.
The Sega Master System has 6 bit-per-pixel color, 2 bits for each of red, green and blue. You would expect values 00, 01, 10 and 11 for each color to represent 0%, 33%, 66% and 100% voltage on the corresponding pin of the Video Display Processor. However, I once connected an SMS to an oscilloscope and found voltage levels much less evenly-spaced than this - and they were quite different for different console models.
Even if the voltages had been evenly spaced, would it have been correct to use evenly-spaced values (0, 85, 170 and 255) for each of r, g and b in an emulator? I believe this would be correct, assuming that the gamma curve of a modern monitor matches that of a 1980s/90s TV - which should be the case for a monitor calibrated to sRGB?
Brightness doesn’t correlate linearly to voltage level. sRGB is representative of the final output, so the brightness would correspond to 0, 33, 66, 100, and the voltages would be the inverse application of the gamma to those values.
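Worked through for the 2-bit case under a plain power-law assumption (gamma 2.2, ignoring the sRGB curve's linear toe), evenly spaced brightness targets do imply distinctly non-even drive voltages:

    // Under a simple gamma-2.2 model, what drive voltage (as a fraction of max)
    // would be needed to hit evenly spaced brightness levels 0, 1/3, 2/3, 1?
    #include <cmath>
    #include <cstdio>

    int main() {
      const double gamma = 2.2;  // assumption: plain power law, not the exact sRGB curve
      for (int code = 0; code <= 3; code++) {
        double brightness = code / 3.0;
        double voltage = std::pow(brightness, 1.0 / gamma);
        std::printf("code %d: brightness %.2f -> voltage %.2f of max\n", code, brightness, voltage);
      }
    }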
It would be interesting to add three photographs to each example: Original retro system display, un-corrected on modern display, corrected on modern display.
In the US you can't even give away a CRT these days, you have to pay someone to take them. It might be hard to locate one depending on where you are, but once located maybe you could offer to take it off their hands?
One thing I've been thinking about lately with emulation: why can't we replicate and use the original hardware more easily?
There's lots of knockoff console replicas out there, so it seems like the chips can still be manufactured. Wouldn't it be fairly cheap to build a board/chip that contained either the original chip designs from classic computers/consoles, or hardware-level emulated chips? Seems like you could stuff several of those retro chips onto a single expansion board. Then emulation software could tap into this "emulation expansion board" to make use of the real chips. Is this all crazy talk or does any of that make some sense?
There's lots of knockoff console replicas out there, but most of them are software emulation running on the moral equivalent of a Raspberry Pi, and the rest are often quite inaccurate approximations of the original hardware, with many incompatibilities with published games.
Without access to the original designs, hardware emulation is in the same position as software emulation: you need to do a ton of research and development. A company called Analogue[1] makes FPGA recreations of a few classic consoles, but they're pretty expensive, and took a lot of effort to design. In fact, byuu (the author of the OP) was consulted fairly heavily during the design of their SNES reproduction.
Basically, if you have to do a lot of research anyway, you might as well go with the product that has zero marginal production cost (i.e. software) and save yourself a bunch of manufacturing expenses for the same result.
Ah, that's good to know! That's a fun fact about byuu helping with the FPGA SNES that they built as well, really interesting! Thanks for the knowledge share!
I've always wondered if you could just stick a colourimeter on a GBA SP and generate an ICC profile for it, then use that profile to adapt the colours to your display. It probably wouldn't be that easy, but there are colour management tools and libraries available that are designed for adapting colours between devices, so if you could somehow obtain a device profile for the GBA LCD, it might be possible to use it to produce very accurate colour emulation.
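For what it's worth, the plumbing side of that is straightforward with a colour-management library like LittleCMS, assuming you had somehow obtained a measured profile for the GBA panel (the profile file name below is hypothetical); the hard part is producing that profile in the first place. A minimal sketch:

    // Sketch: transform an RGB frame through a (hypothetical) measured GBA display
    // profile into sRGB using LittleCMS. Error handling omitted for brevity.
    #include <lcms2.h>
    #include <cstdint>
    #include <vector>

    std::vector<uint8_t> toSRGB(const std::vector<uint8_t>& rgbFrame, size_t pixelCount) {
      cmsHPROFILE gba  = cmsOpenProfileFromFile("gba-sp-frontlit.icc", "r");  // hypothetical profile
      cmsHPROFILE srgb = cmsCreate_sRGBProfile();
      cmsHTRANSFORM xform = cmsCreateTransform(gba, TYPE_RGB_8, srgb, TYPE_RGB_8,
                                               INTENT_PERCEPTUAL, 0);
      std::vector<uint8_t> out(rgbFrame.size());
      cmsDoTransform(xform, rgbFrame.data(), out.data(), pixelCount);
      cmsDeleteTransform(xform);
      cmsCloseProfile(gba);
      cmsCloseProfile(srgb);
      return out;
    }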
As I understand it, colorimeters expect the display to output a particular colour or shade and measure it precisely. You probably couldn't run colorimeter software on a GBA, but you could write homebrew that lets the user manually cycle to the correct patterns at the right times.
Some issues would be:
- There are at least three models of GBA display: the original front-lit display, the GBA SP back/front-lit display, and the later GBA SP backlit-only display, and they probably all have different profiles
- It's probably not possible to use a colorimeter on the original front-lit GBA display, because with a colorimeter clamped on top of it, it would always look black
- Colour-correction software that works with ICC profiles is typically built for print/still image work, so it expects a single high-resolution image with high-precision output, not a 60fps stream of very-low-resolution images. Maybe it's fast enough to run in real-time, who knows?
I think relaying the image with a lens onto the colorimeter should work, assuming you modify the game so the hardware displays a full-screen equivalent of all possible variations of a single pixel. It might be necessary to switch to a color cube with 3 photodiodes, and perhaps read it in with 3 DC-coupled audio channels, or alternatively use an Arduino's ADC (I know...) but with a proper antialiasing filter capacitor for the sample rate used.
Yeah, writing homebrew to get the GBA screen to display whatever a screen displays under a colourimeter, and getting that to work with the actual colourimeter might not be so easy. I think getting a device profile for the frontlit SP would be the most interesting, since I've heard it produces colours more like the original GBA than the backlit one, but maybe a colourimeter designed for backlit PC monitors would have trouble with it. I don't think performance would be an issue though. mpv can do colour management in real time by generating a LUT from the ICC profile and applying it to each frame with GPU shaders.
Most colorimeters also have a projector mode for measuring projection screens. You can just point the colorimeter at the GBA screen at an angle and shine a high-quality light source at it.
That's cool. I didn't know that (I've never used one before.) Maybe it would make more sense to do the original GBA then. The SP frontlight has a bit of a blueish tinge to it.
Are you sure the Genesis didn't align things on 8 bit boundaries? Either way, it might be worth mentioning current systems that use 10-bit (30-bit) color.
The N64 is notable for using ECC memory modules with ECC disabled, where the CPU sees 8 bits per byte and the GPU uses all 9 bits per byte for some functions.
It is actually still 8-bit aligned with padding - we are talking about a system with a 68000 and Z80 here. The way the graphics systems of the era accomplished color depths like this was generally to do one of two things:
1. Draw them in "planar" mode (multiple bitmap layers of a more convenient depth like 1-bit, then composited together into the full image) vs. the "chunky" mode we are familiar with (all color information packed into each pixel of the bitmap). Planar graphics are much less convenient for ordinary drawing tasks, but they allow for a tradeoff between memory usage and fidelity, and they ease a few kinds of raster effects if you use your palettes carefully. This is what the Amiga OCS had, and VGA in many of the custom modes. (There's a small planar-to-chunky sketch after this list.)
2. Allocate the colors to sprite and tile hardware with limited palettes, so that you can achieve "n colors visible" by carefully combining all the different elements. Each element might be able to sample part of the 9-bit space with a palette of 4 or 16. In the case of the Genesis VDP your limit is ~64 colors visible of 512, with the special Shadow/Highlight mode allowing more. This is not as big of a limitation as it sounds like since the low resolution graphics of the time weren't suited for detailed color values: 4 is enough to do a shadow/midtone/highlight, and 16 allows for substantial color ramps with multiple hues.
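To make the planar/chunky distinction in point 1 concrete, here's a small sketch converting four Amiga-style bitplanes into one-byte-per-pixel palette indices. The layout assumptions (MSB-first bits, plane 0 as the least significant index bit) are illustrative, not tied to any particular hardware's VRAM format:

    // Sketch: convert 4 bitplanes (Amiga OCS-style planar graphics) into "chunky"
    // pixels, one 4-bit palette index per output byte. The MSB of each plane byte
    // is the leftmost pixel; plane 0 supplies the least significant index bit.
    #include <cstdint>
    #include <vector>

    std::vector<uint8_t> planarToChunky(const uint8_t* planes[4], int widthBytes) {
      std::vector<uint8_t> chunky(size_t(widthBytes) * 8);
      for (int byteIndex = 0; byteIndex < widthBytes; byteIndex++) {
        for (int bit = 0; bit < 8; bit++) {
          uint8_t index = 0;
          for (int plane = 0; plane < 4; plane++) {
            index |= uint8_t(((planes[plane][byteIndex] >> (7 - bit)) & 1) << plane);
          }
          chunky[size_t(byteIndex) * 8 + bit] = index;  // 0..15 palette index
        }
      }
      return chunky;
    }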