You can think of it as information within other information within still other information...
Simple 2D example (for programmers) -- you have a binary string, which can be looked at as a series of 1's and 0's, or as a series of 8-bit bytes.
Well, if we look at it as a series of bytes (think of that as the first "holographic" dimension, because the bytes don't really exist -- all they really are is repeated groupings of 8 bits; CPUs and programs and memory might work with data of that length and "see" them, but in essence the string is just 1's and 0's).
So that's the first "holographic" dimension... bytes. But now, inside of that string are substrings -- discrete runs of shorter information. Let's think of those substrings as "holographic" dimension 2.
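Since this is aimed at programmers, here's a minimal Python sketch of those first two "dimensions" -- the same raw bit string viewed as bytes and as substrings. The string and the choice of "runs of identical bits" as the substrings are just my illustrative assumptions:

```python
import itertools

# The underlying data: nothing but 1's and 0's ("hi" in ASCII, as it happens).
bits = "0110100001101001"

# "Holographic" dimension 1: the same bits, grouped into 8-bit "bytes".
# The bytes don't exist in the string itself -- only in how we read it.
bytes_view = [bits[i:i + 8] for i in range(0, len(bits), 8)]
print(bytes_view)   # ['01101000', '01101001']

# "Holographic" dimension 2: substrings -- here, discrete runs of
# repeated bits (one arbitrary way to carve out substrings).
runs = ["".join(group) for _, group in itertools.groupby(bits)]
print(runs)
```

Same sixteen characters every time; only the grouping we impose on them changes.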
From here, there could be even higher "holographic" dimensions -- say we observe only certain substrings, selected according to some mathematical pattern f(x).
Well, you can think of f(x) -- and the data it produces by reading specific substrings in a specific order -- as living in a higher dimension; a "higher-dimensional" "observer", if you will...
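To make that concrete, here's a toy "observer" in Python. The particular f(x) (reading 2-bit chunks back to front) is purely my own made-up example of a reading pattern:

```python
# The underlying 1's and 0's, and its substrings (2-bit chunks here).
data = "00011011"
chunks = [data[i:i + 2] for i in range(0, len(data), 2)]  # ['00','01','10','11']

def f(x):
    # A hypothetical "observer" pattern: visit the chunks in reverse order.
    return chunks[len(chunks) - 1 - x]

# The observer's output exists only at its own level -- it's a new string
# that appears nowhere in the original data as a contiguous run.
observed = "".join(f(x) for x in range(len(chunks)))
print(observed)   # '11100100'
```

The underlying string never changes; the "higher-dimensional" data is entirely a product of how f(x) walks it.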
Whenever you see the word "black hole" or "hologram" -- replace that with the word "information", and think about it from that perspective... usually there's something there...
(It's also equally-and-oppositely possible that I'm a crackpot and don't know what I'm talking about -- take this explanation with the proverbial grain of salt... <g>)
I beg to differ on the proposition that you're a crackpot. That seems to me a perfectly mathematical way of thinking.
If you're a crackpot, it's because all mathematicians are.