Look at animal domestication (neoteny in domestic animals, for example): it is a manipulative process too. People do use those techniques on each other, at mass scale, via communication/advertising/marketing. Yet there are regulations and red lines, and people regularly get thrown in jail.
The Theory of Bounded Rationality tells us we are all stupid given the right problem. It reminds us that problems exist that will run us into limits, not just of intelligence but of time, resources, skill, communication, attention, energy, information, etc.
It also recommends what to do in such cases: either pick a simpler problem where the limits matter less, or accept that your solution to the complex problem will have issues.
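That second recommendation is basically Herbert Simon's "satisficing": stop at the first option that clears a good-enough bar instead of exhaustively optimizing. A minimal sketch (the options, scores, and threshold here are made up for illustration):

```python
def satisfice(options, score, good_enough):
    """Bounded-rationality search: return the first option whose score
    clears the aspiration level, instead of scoring every option."""
    for opt in options:
        if score(opt) >= good_enough:
            return opt
    return None  # limits hit everywhere: accept that any pick has issues

# Hypothetical commute choices with made-up quality scores.
routes = [("scenic", 0.4), ("highway", 0.8), ("backroads", 0.7)]
pick = satisfice(routes, score=lambda r: r[1], good_enough=0.75)
# stops at ("highway", 0.8) without ever evaluating "backroads"
```

The point is that the search cost is bounded by where the first acceptable option sits, not by the size of the whole option space.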
> The Theory of Bounded Rationality tells us we are all stupid given the right problem.
I’d say one of OP’s main messages is that this is a misunderstanding or misuse of the word. We’re not stupid when we are aware of our limitations. Stupidity is more of a character flaw than a lack of intelligence. Perhaps you could say it’s a failure to even try to be rational.
I loved his book Rainbows End as a kid. So many different concepts that blew my mind.
Even without talking about AI, we are already struggling with levels of complexity in tech, and with unpredictable consequences that no one really has any control over.
Michael Crichton's books touch on that stuff but are all doom and gloom. Vinge's Rainbows End, at least, felt much more hopeful.
I was talking to a VFX supervisor recently, and he was saying: look at the end credits on any movie (even mid-budget ones) and you see hundreds to thousands of people involved. The tech roles outnumber the artistic/creative roles 20 to 1. That's related to the rate of change in tech. A big gap opens up between that and the rate at which artists evolve.
The artists are supposed to be in charge and provide direction and vision, but the tools are evolving faster than they can think. For now, though, the tools are dumb. AI changes that.
These are rare environments (like R&D labs) where the Explore Exploit tradeoff tilts in favor of Explorers. In the rest of the landscape, org survival depends on Exploit. It's why we produce so many inequalities. Survival has always depended more on Exploit.
Vinge's Rainbows End shows AI/AGI nudging the tradeoff towards Explore.
Honestly, considering the state of the world and how things are shaping up, it's a hilariously obvious pipe dream to imagine such a system would be some omnipotent, hyper-competent, super-god-like being.
It’s more likely just going to post ragebait and dumb TikTok videos while producing just enough at its ‘job’ to fool people into thinking it’s doing a good job.
Yup, things look bleak, but it's not a static world. For everything that happens there is a reaction. It builds with time. But finding the right reaction also takes time. This is the Explore part of the tradeoff. AI will be applied there too, not just on the Exploit front.
What you are alluding to is media/social media's current architecture and how it captures and steals people's attention. That's totally on the Exploit end of the tradeoff. And it's easy stuff to do. It doesn't take time.
If you read the news after the fall of France to the Nazis (it fell within a month), what do you think people's opinion was?
People were thinking about peace negotiations with Hitler and assuming the Germans couldn't be beaten. It took a whole lot of Time to realize things could tilt in a different direction.
I’m talking about evolutionary functions, and how much more likely evolution is to prefer something that has fun and just looks like it’s doing something over something that is actually doing it.
Aka manipulation vs actual hard work.
Do you have any concrete proposals, besides ‘it will get better’?
Actual competency is hard. Faking it is usually way easier.
You could ask FDR and Churchill that after the fall of France, and what they said wouldn't be too useful, because it took them almost 3 years before they openly said victory = the end of the Nazis and nothing else.
So don't just sweep the fact that things take Time under the carpet. That's not healthy; it's like looking at tree shoots in the ground and asking why they don't look like a tree yet.
Finding gold in an unexplored jungle takes much longer than extracting gold from an existing mine. This is the Explore Exploit tradeoff. Exploit is easy, so more people do it. Explore is hard, and takes more time. If AI shifts the balance toward Explore, the story changes.
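The gold-mine picture is the standard multi-armed-bandit framing of Explore vs Exploit. A toy epsilon-greedy sketch (the "mine" payoffs, noise level, and epsilon values are made up for illustration):

```python
import random

def epsilon_greedy(rewards, epsilon, steps, seed=0):
    """Toy epsilon-greedy bandit: with probability epsilon try a random
    arm (Explore), otherwise work the best arm seen so far (Exploit)."""
    rng = random.Random(seed)
    counts = [0] * len(rewards)
    totals = [0.0] * len(rewards)
    gained = 0.0
    for _ in range(steps):
        if rng.random() < epsilon or not any(counts):
            arm = rng.randrange(len(rewards))  # Explore: prospect somewhere new
        else:
            # Exploit: dig in the mine with the best observed average payoff
            arm = max(range(len(rewards)),
                      key=lambda a: totals[a] / counts[a] if counts[a] else 0.0)
        payoff = rewards[arm] + rng.gauss(0, 0.1)  # noisy yield from the chosen mine
        counts[arm] += 1
        totals[arm] += payoff
        gained += payoff
    return gained

# Two known mediocre "mines" and one rich, unexplored one.
arms = [0.3, 0.4, 0.9]
pure_exploit = epsilon_greedy(arms, epsilon=0.0, steps=1000)  # can lock onto a mediocre mine
some_explore = epsilon_greedy(arms, epsilon=0.1, steps=1000)  # usually finds the rich one
```

With epsilon at 0 the agent commits to whatever it stumbled on first; spending a small fraction of its time exploring lets it discover the richer arm and come out ahead on average.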
If you want to talk about Explore in media/attention (mis)allocation, you can already see green shoots appearing in the ground. There are multiple things going on in parallel.
First, there is a realization that attention is finite and doesn't grow, while content keeps exploding. That's unsustainable, to the point that the UN has published a report about the attention economy. This doesn't happen without people reacting and going into Explore mode for solutions.
They are already talking about how to shift these algorithms/architectures from being based on units of time spent consuming (Exploit) to value derived from time spent.
That means giving people feedback on how their time is divided between consumption (entertainment) and value, then allowing them to create schedules. This is what you now see appearing as digital-wellbeing tech.
There are now time-based economic models where the platform doesn't just assume time spent is free, but treats it as something the platform needs to pay for. People are experimenting with rewards and micropayments. All of these are examples of Explore mode being activated.
There is also a realization that content discovery on centralized platforms like YouTube, TikTok, and Instagram causes homogeneity in what everyone upvotes. So you see people reacting and decentralizing to protect and preserve niches. AI (a curator of curators) will play a big role in finding the niches that fit your needs.
I'll just end with this: people are also realizing there is a huge misallocation-of-Ambition/Drive problem. Anthony Bourdain said life is good in every show of his, and then killed himself. Shaq says he has 40 cars but doesn't know why. Since media (society's attention allocator) has tied success to wealth/status accumulation, conspicuous consumption, luxury, leisure, etc., people end up in these kinds of traps. So now we are seeing reactions, especially around climate change/sustainability, saying that ambition and energy have to be shown other paths. There are a lot of changes in advertising and media companies around this. All of these are Explore-mode functions.
I am saying AI has the capacity to shift the natural balance in the Explore Exploit tradeoff. The human limitations we run into (listed above) might allow, say, 10% Explore to 90% Exploit on most problems. We might not be able to do more than that. Bacteria, for example, can adapt (Explore) much, much faster than us, because they don't get their genes just from their parents but from anyone, via horizontal gene transfer. AI is similar, which could mean we suddenly start seeing a whole lot more Explore possible/happening than we are used to. That would reduce the need for (and amount of) Exploit baked into everything we do.
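The horizontal-gene-transfer point can be illustrated with a toy search: several agents hunting for one "gold" site, either propagating a hit the moment anyone finds it (HGT-style sharing) or each having to find it alone. The space size and agent count are arbitrary:

```python
import random

def search_time(n_agents, share, space=1000, seed=0):
    """Toy model: n_agents sample random candidates looking for one
    'gold' site. With share=True, a hit by any agent ends the search
    for everyone (discoveries spread, like horizontal gene transfer);
    otherwise each agent has to stumble on it independently."""
    rng = random.Random(seed)
    gold = rng.randrange(space)
    if share:
        steps = 0
        while True:
            steps += 1
            # One guess per agent per step; any hit is instantly shared.
            if any(rng.randrange(space) == gold for _ in range(n_agents)):
                return steps
    # Isolated agents: report the average time each one needs on its own.
    times = []
    for _ in range(n_agents):
        steps = 1
        while rng.randrange(space) != gold:
            steps += 1
        times.append(steps)
    return sum(times) / len(times)
```

With sharing, expected search time shrinks roughly in proportion to the number of agents; in isolation it doesn't shrink at all. That is the sense in which lateral transfer of discoveries makes a population's Explore dramatically cheaper.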