Very interesting stuff. What I find mind-blowing is that this is just the tip of the iceberg. This is only treating the eye as if it were a standalone imaging sensor. I wish there were a way we could understand how other humans and animals actually view and perceive the world in an embodied way. I would imagine that each of these animals processes its visual information in unique ways, perhaps even more distinctive than their eyes are.
This is very far from understanding how cats actually perceive the world, but it's possible to extract crude images [0] from a cat's lateral geniculate nucleus (the part of the visual pathway between the eye and the visual cortex). The LGN likely does not perform much (if any) role in high level visual cognition (that's reserved for the visual cortex), but it does perform a lot of image preprocessing.
It might just be pareidolia, but IMO that preprocessing results in the human face looking strikingly catlike.
True - birds process vision very differently from mammals. They're tetrachromats, some species have multiple focal points, polarized vision, things like the woodcock's 360° field of view, etc., which are major differences in how they process information. But on top of that, their visual acuity is far greater, and their speed at processing input (needed for flight) makes it hard to even imagine what vision is like for them.
It's fascinating, isn't it - the way we interpret things truly is a whole greater than the sum of its parts. I don't think it would be possible to truly understand how any other creature perceives the world: our brains form a composite perception from all our different senses, both those that pick up external stimuli and the way those stimuli are then interpreted internally through brain/nerve structure, body composition and internal chemistry. I think it would be impossible to ever truly experience the world as anything except yourself, since there would always be information loss without exactly replicating the physical composition of another entity (human or otherwise). Life really is a pretty wondrous thing.
It wouldn't just be visual perception, it would also apply to metaphorical abstractions.
I have a theory that the reason we developed Euclidean plane geometry is because our native experience of space at the time was mostly 2D with some limited 3D extensions.
If we still lived in trees our core geometric model would likely be different. And birds almost certainly have a very different model.
All of the base experiential invariants that we built physics from - temperature, distance, speed, discrete numbers, shape, colour, etc. - are extensions and abstractions of our particular sensory and feature extraction systems. They're the perceptions we treat as fundamental.
Other species would share some, but not all, of them, and would have others unfamiliar to us. Their version of physics and math would prioritise different perceptions and so look very different.
I still don't understand why when I look at a mirror my left and right is flipped but my up and down isn't. If I hold my left arm out, the person in the mirror holds out his right, but up and down do not get the same treatment. Why?
EDIT: I understand that it's front and back that actually get reversed, but that still doesn't help me understand why we think left and right are reversed but don't think up and down are reversed.
It actually has nothing to do with the mirror. It’s because when you rotate the image to align its ‘front’ direction to your body’s, you naturally perform this rotation about the vertical axis. It is this rotation which has the effect of swapping left and right.
(It may also help here to think of a mirror on the ceiling. Consider: does this also swap left/right, or does it swap up/down instead?)
However I orient myself and the mirror, it feels like there is something special about my left and right according to my up and down. Why would I naturally rotate the object about the vertical axis to align it?
Easy way to consider it: when you look at another human being, you innately understand that their right is your left, so you're rotating your perception of their directions around the vertical axis. The mirror isn't actually flipping anything - the top reflects back from the top, the right reflects back from the right. The "flip" is your brain treating the image as a person rotated 180°, not as the front of a person viewed from behind.
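The decomposition above can be checked with a few lines of linear algebra (a sketch; the axis convention x = left/right, y = up/down, z = front/back is my own choice):

```python
import numpy as np

# Axes: x = left/right, y = up/down, z = front/back.
mirror = np.diag([1, 1, -1])            # a wall mirror flips front/back only
rot180_vertical = np.diag([-1, 1, -1])  # 180° rotation about the vertical (y) axis

# Mentally turning the mirror image around to face the same way as you
# composes the two; what remains is a pure left/right flip:
combined = rot180_vertical @ mirror
print(np.diag(combined))  # [-1  1  1]: left/right swapped, up/down intact
```

The same arithmetic shows why a ceiling mirror feels like it swaps up/down: there the natural "turn to face it" rotation is about a horizontal axis instead.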
Afaik, eyes capture only a tiny bit of the image - the rest is a mental model. Whatever partial data eyes capture, the mind maps to known patterns - face features and the like. So animals may indeed see the same pixels, but perceive something wildly different.
If we ever get that kind of brain-reading technology, so to speak, it could also finally settle the aggravating and mind-bending question of whether your blue is the same as my blue. Wouldn't it be wild if our interpretations of colors all varied?
If you're interested in stuff like this, I highly, highly recommend Ed Yong's new book, An Immense World. There are huge parts of animals' sensory landscapes that we don't perceive, and he does a good job of letting you see the world through their eyes/smell it through their noses/etc. He also writes about the sometimes-wacky experiments scientists have done to tease out how animals use their senses.
I’ve found it interesting that the range of visible light for humans has the highest frequency roughly double that of the lowest frequency. Working by analogy to sound, we essentially see one “octave” of light and if we perceived frequencies into infrared or ultraviolet, they would perceptually look like variants of the color at 2× or ½× the frequency (which suddenly makes the adjacency of violet and red on the color wheel make sense). I kind of wonder whether anyone has ever made a camera that doubles or halves as appropriate light frequencies beyond the visible spectrum to make them visible. It would be interesting to see what the world looks like extending the range in either direction.
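The one-octave claim checks out numerically (a quick sketch using the commonly cited 380–750 nm visible range; the exact endpoints vary by source):

```python
c = 299_792_458  # speed of light, m/s

# Commonly cited limits of human visible light (wavelengths in metres)
red_wavelength = 750e-9
violet_wavelength = 380e-9

f_red = c / red_wavelength        # ~400 THz
f_violet = c / violet_wavelength  # ~789 THz

print(f_violet / f_red)  # ~1.97, i.e. just under one "octave"
```

By contrast, human hearing spans roughly 20 Hz to 20 kHz - about ten octaves - which is part of why the sound analogy is so striking.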
Fascinating! Though I doubt the jumping spider's view would have such a striking depth of field as in the picture.
I'd guess all vision systems (sensors and processing) are optimized for primary needs, be those food search, mating, shelter, or predator evasion?
As for dogs, I wonder if for them "vision" is more than just the optical pathway. It may be augmented with olfactory detail that "colors" the visuals according to their priorities. After all, color is a very notional concept - even for humans it's more than just wavelengths.
Wouldn't the animal do the same thing humans do with saccadic eye movements, and integrate a higher-quality image inside the brain from the low-res visual input?
I mean sure, you can't resolve finer than you see, but you can scan-and-pan and the brain does a wonderful job of faking out. Really, really good model, from really really tiny receptors.
they basically set up cameras where the various animals' eyes would be. they also added some filters in order to mimic animal sight. you would then get to see the feed via a display.
here's a video i've found on youtube that shows what i mean:
There is also variation within a species, like humans, where you get not just the different forms of color-blindness, but also tetrachromacy where you can see more colors:
That's a simplification, I think. Many animals have good night vision, meaning they see in infrared. That's probably also why they avoid approaching fires - those are too bright for them.
Natural night isn't brighter in the near infrared than in the human visual range, and IIRC not much can see in the far-infrared where body heat becomes relevant.