Agency is overrated. The AI does not have to be an agent. It really just needs a degenerate form of 2): a selection process. Any kind of bias in that selection creates goals, not the other way around. The only truly goal-free thinking system is a random number generator; everything else has goals, you just don't know what they are.
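To make the "bias is all you need" point concrete, here's a toy sketch (mine, not from anything above - all names invented): a random generator plus a slightly biased keep/discard step, and the population drifts in a definite direction even though no goal is ever written down anywhere.

    import random

    def select(population, bias):
        # Keep the half that scores higher on some arbitrary trait; this bias
        # is the only thing separating the system from pure noise.
        return sorted(population, key=bias, reverse=True)[: len(population) // 2]

    def step(population, bias):
        survivors = select(population, bias)
        # Refill with noisy copies: random "thinking", biased keeping.
        children = [x + random.gauss(0, 1) for x in survivors]
        return survivors + children

    population = [random.gauss(0, 1) for _ in range(100)]
    for _ in range(50):
        population = step(population, bias=lambda x: x)  # bias: "more is better"

    # The mean climbs steadily: the system "wants" larger x, though nothing
    # in the code ever names that as a goal.
    print(sum(population) / len(population))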
See also: evolution - the OG case of a strong optimizer that is not an agent. Arguably, the "goals" of evolution are the null case, the most fundamental ones. And if your environment is human civilization, it's easy to see that money and compute are as fundamental as calories, so even a near-random process should be able to fixate on them too.
An RNG is a thinking system in the same sense that never freeing memory is a form of garbage collection - known as a "null garbage collector", and genuinely useful in the relevant fields of study. The RNG is the identity function of thinking systems - it defines a degenerate thinking system that does not think.
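If it helps, the same degenerate-but-valid pattern in code (again a toy of mine, names made up): the null collector satisfies the GC interface by doing nothing, and the RNG satisfies the "thinking system" interface the same way.

    import random

    class NullCollector:
        """The null GC: allocation works, collection never frees anything."""
        def __init__(self):
            self.heap = []
        def alloc(self, obj):
            self.heap.append(obj)  # hold on to everything forever
            return obj
        def collect(self):
            return 0               # reports zero objects reclaimed - correctly

    def null_thinker(observation):
        """The degenerate thinking system: output is independent of input."""
        return random.random()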
See also: https://en.wikipedia.org/wiki/The_purpose_of_a_system_is_wha...