A lot of fusion researchers think we really ought to consider economics when we choose what reactor designs to pursue. It'd be a shame to spend a lot of time and money on an approach that can't compete in the market.
It would also be a shame to spend a lot of time and money on an approach that could compete in the market if it worked, but never works at all.
Maybe it's my programming bias at work here, but it seems like getting anything to work should come before trying to optimize it to be cheaper than anything else out there. Until you have a working system to experiment with, how can you even be sure where the true costs will lie?
It's not that hard to project the costs of a reactor. Tokamaks, for example, have to be very large to reach net power, and the chamber has to be surrounded by superconducting magnets with heavy neutron shielding. That's going to be expensive.
Whether the plasma will behave the way we want is a tougher question, but roughly how much a device will cost to build is relatively easy to answer.
There's another benefit to cheaper approaches: you can afford more experiments, and you can probably run them faster. Taking a couple of decades and $50 billion to build one experimental reactor is not necessarily the fastest route to getting something to work. ITER is the ultimate waterfall project.
A lot of the cost will be ongoing operating cost, and much of that probably can't be predicted without operating experience. Of the expenses involved in running a fission plant (even under a hypothetically generous regulatory regime), how many could have been accurately predicted in, say, 1930?