
https://w3c.github.io/ambient-light/#dom-ambientlightsensorr...

Unfortunately, it looks like their definition of brightness is a single floating point number (the illuminance, in lux), and that's the only thing the API exposes. Hopefully future versions of the standard will be updated to better match the real world.
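For reference, here's a minimal sketch of what that API surface gives you today, assuming a browser that implements the Generic Sensor API (the type declarations and the 50 lux threshold are my own placeholders):

    // One illuminance value in lux per reading; that's the whole payload.
    const sensor = new AmbientLightSensor({ frequency: 1 });

    sensor.addEventListener("reading", () => {
      const lux = sensor.illuminance ?? 0;
      // About the best a site can do with a single number: a coarse threshold.
      document.documentElement.dataset.theme = lux < 50 ? "dark" : "light";
    });

    sensor.addEventListener("error", (e) => console.error(e));
    sensor.start();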

What we perceive as brightness is (a) the number of photons of each wavelength that (b) intersect a given point in space and (c) are traveling in a given direction. Obviously, not all of this information can be encoded in digital form. There's simply too much data to encode, let alone measure accurately. But reducing it to a single 1D number renders the data almost useless.

No website will be able to achieve good results if the dynamic color scheme is based on a single number.

That's also why the WebUSB spec is exciting. I don't have to wait for this spec to be drafted, or for browsers to support it, or to live with any shortcomings it ships with. I'm free to create.
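As a sketch of what that freedom looks like: the WebUSB calls below (requestDevice, open, claimInterface, transferIn) are the real API, but the vendor ID, interface, endpoint, and payload format all belong to a hypothetical custom sensor board.

    async function readCustomSensor(): Promise<DataView | undefined> {
      const device = await navigator.usb.requestDevice({
        filters: [{ vendorId: 0x1234 }], // placeholder vendor ID
      });
      await device.open();
      await device.selectConfiguration(1);
      await device.claimInterface(0);
      // Read 64 bytes from IN endpoint 1 -- e.g. a block of per-channel
      // spectral samples, in whatever format the firmware defines.
      const result = await device.transferIn(1, 64);
      return result.data;
    }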




What kind of physical sensor are you proposing that would give you data like that? I think most ambient light sensors in laptops are just LDRs (light-dependent resistors), which are only capable of giving you a 1D number.


I propose that this is a capital-H Hard Problem, and that new technology is needed to address it. I further propose that this technology exists, but is not yet mainstream. The fact that monitors can be calibrated to great accuracy is proof that the tech exists.

There is an opportunity here, but to understand this space, you have to dive into it. There are some excellent books on the subject of colorimetry, but a quick bootstrapped understanding might look like:

1. Your eyes are designed to fool you.

2. How your eyes fool you is determined entirely by the photons that enter them.

3. Each of these photons has a wavelength. The more of them at a given wavelength, the more your perception shifts.

4. What you perceive as color is a combination of wavelengths. But -- critically -- these combinations affect each other. It is not true that more photons at a given wavelength simply means you perceive that wavelength more intensely. They cause a shift in your perception, but that shift is not necessarily a simple increase in brightness.

These are first principles, and it's a very brief sketch of the problem. But everything else follows from them. What ambient light sensors currently do, how the APIs are designed, and so on are all secondary. The problem is both as simple and as difficult as outlined above.
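To put points 3 and 4 in arithmetic terms: going from a measured spectrum to anything perceptual means integrating the spectral power distribution against the CIE color matching functions; the weighting, not the raw photon count, is what matters. A rough sketch, assuming you supply the published CIE 1931 tables sampled at the same wavelengths as your measurement (and ignoring the absolute normalization constant):

    interface SampledCurve {
      wavelengthsNm: number[]; // e.g. 380, 385, ..., 780
      values: number[];        // spectral power at each wavelength
    }

    function spectrumToXYZ(
      spd: SampledCurve,
      cmf: { x: number[]; y: number[]; z: number[] }, // CIE x-bar, y-bar, z-bar at the same wavelengths
      stepNm: number                                   // sampling interval, e.g. 5
    ): { X: number; Y: number; Z: number } {
      let X = 0, Y = 0, Z = 0;
      for (let i = 0; i < spd.values.length; i++) {
        const p = spd.values[i];
        // The same power at different wavelengths contributes very differently:
        // 555 nm dominates Y (luminance), 450 nm barely moves it.
        X += p * cmf.x[i] * stepNm;
        Y += p * cmf.y[i] * stepNm;
        Z += p * cmf.z[i] * stepNm;
      }
      return { X, Y, Z }; // Y is the closest thing here to a single "brightness"
    }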

Now, you can say that this isn't worth tackling, or that the current systems are good enough, and so on. But you'd be missing out on quite an experience. Seeing the type of results you can get from a perfectly calibrated environment is really mindblowing. And the interesting part is, it's impossible for me to describe these results in text, or by showing you a photo, or a video, for the same reason you can't describe an Oculus experience. You have to be there.

Personally, I find colorimetry one of the most intellectually fun and gratifying areas of science.


I think a sensor with 3 sub-sensors that respond at roughly the same wavelengths as our cones, each with its response curve matched to human perception, would be good enough, no? Maybe with a clever algorithm to reduce the 3 values to a single perceived brightness value?

I feel like something along those lines must already exist for a number of uses.
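For what it's worth, if the three channels were already calibrated to known primaries (say linear Rec. 709 / sRGB, which is the hard part and a big assumption), the reduction step is just a fixed weighted sum:

    // Relative luminance from three linear channels; the weights are the
    // standard Rec. 709 coefficients. relativeLuminance(1, 1, 1) === 1
    // for reference white.
    function relativeLuminance(r: number, g: number, b: number): number {
      return 0.2126 * r + 0.7152 * g + 0.0722 * b;
    }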


If only it were that simple. Consider the end goal: to write a program that causes your monitor to invoke a certain visual experience in the viewer. It's not simply "To show blue" or "To show a certain shade of purple." Those are all meaningless terms without context. Their results are entirely relative, right? If you put some purple next to some blue, whether or not it looks good depends on the background color (called the "surround").

The trouble is, "whether or not it looks good" is also determined by your environment. Some people have crappy monitors, some people have perfect monitors, sometimes it's nighttime, sometimes your room is being lit by the early morning sun. When you look at a screen, all of these factors combine and leave you with the impression that something looks good or looks bad.

It gets worse. When you look at a monitor, what's behind your monitor is usually the most important thing. I.e. is the wall in front of you white, or green? Is it dark, or lit by the sun? That's going to affect how you see what's on the monitor. What's behind you is irrelevant, because you can't see it! It doesn't matter at all if the wall behind you is white or black, except insofar as it affects the colors of the wall in front of you. So not only do you need to account for all of the factors outlined above, but the damn thing needs to be aimed properly. I'm pretty sure that the right answer will look something like a sensor that mounts to the back of a laptop screen. But it also needs a sensor pointed at your face, and another sensor pointed straight at your monitor. Only at that point do you begin to have enough information to start writing a program that can make correct decisions.

Fiendishly difficult, no? Quite fun in any case.
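To make the shape of that program concrete, here is a rough sketch of the inputs it would need to consume. Every name here is hypothetical, and the decision logic is a placeholder for what would really be an appearance-model calculation:

    interface Reading { X: number; Y: number; Z: number } // CIE XYZ

    interface ViewingEnvironment {
      behindScreen: Reading; // the wall you actually see around the monitor
      atViewer: Reading;     // light arriving at the viewer's position
      atScreen: Reading;     // what the monitor itself is emitting
    }

    // Placeholder decision: pick a target white luminance relative to the
    // surround. Arbitrary numbers, in cd/m^2.
    function targetWhiteLuminance(env: ViewingEnvironment): number {
      return Math.max(80, env.behindScreen.Y * 1.5);
    }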


Well-calibrated AR glasses seem to be what you'd want to target.



