For solar cells, it was (according to Wikipedia) 1954 before the first practical PV cell was built. So that's already a late start. From there it seems that a lot of technologies had to develop in parallel, from larger silicon wafers to advances in understanding how to work with thin-film materials.
Meanwhile, in the 1950s and 1960s nuclear power was going to be the Next Big Thing, and it was understandable that low-efficiency PV tech was of interest only for things like satellites. IMHO, the top-down energy policy managers who couldn't fulfill the "too cheap to meter" promises of nuclear power wouldn't have been up to the task of accelerating PV technology either.
Solar cells are not like gas lasers, which could have been built in a neon-sign shop in the 1930s if the science needed to steer the technology had been in place. They are more like practical neural nets. Sure, they could have been built in the 1960s... if only GPUs weren't so darned hard to come by.
(And as someone else suggested, any nation that began the 20th century by pouring the resources into PV cells and EVs that we put into fossil fuels wouldn't have lived to see the 21st.)
Similar story for lithium batteries. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7215417/ seems to be a decent survey of their evolutionary timeline. They consist of multiple discrete technologies. R&D that goes into a good cathode has to be done separately from work on anodes. Same for electrolytes and separators. Looking at the bibliography in that article, batteries don't seem like the sort of field where progress could have been accelerated just by throwing money at it, and I think that's true for PV tech as well.
The alternate timeline I was suggesting didn't start in 1900, but 1973, with the oil embargo.
It seems to me that by the 1970s all the groundwork was there to start a solar panel revolution and that the major reasons it didn't happen were political and social, not technical.
Keep in mind that I haven't done any research on this subject yet, I just thought of it a few days ago and I'd love to hear input from anyone who is much more informed on the subject than I am.
My father-in-law sold solar in the '70s. His take is that it failed for the same reason it fails today: for most consumers in most environments, it makes no economic sense whatsoever, even with tax incentives. The short total lifespan, rapid aging of the cells, expensive maintenance, and the risk of roof damage over time add up to costs that typically aren't advertised but are obvious to most consumers. That's aside from the unsightly appearance of the cells.
In environments with more expensive grid power and higher temperatures, they do appear to be an economically sound choice for the consumer, e.g. Phoenix, AZ.
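To put the economics in concrete terms, here's a rough simple-payback sketch in Python. Every number in it is an illustrative placeholder, not data from his experience; swap in local figures before drawing any conclusion:

```python
# Rough simple-payback sketch for rooftop solar. All numbers below are
# illustrative assumptions; plug in local figures to get a real answer.
system_cost = 15_000.0     # installed cost after incentives, USD (assumed)
annual_kwh = 10_000.0      # first-year array output, kWh (assumed)
price_per_kwh = 0.15       # retail electricity price, USD/kWh (assumed)
degradation = 0.005        # output lost per year, ~0.5%/yr (assumed)
lifespan_years = 25        # assumed panel service life

savings, year = 0.0, 0
while savings < system_cost and year < lifespan_years:
    year += 1
    # Year-N output decays geometrically from the first-year figure.
    savings += annual_kwh * (1 - degradation) ** (year - 1) * price_per_kwh

if savings >= system_cost:
    print(f"Break-even in year {year}")
else:
    print(f"No payback within {lifespan_years} years")
```

With these particular placeholders it breaks even around year 11; a cheaper grid or a shorter panel life pushes that past the system's lifespan, which is the scenario he's describing.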
>His take is that it failed for the same reason it fails today.
If your dad thinks solar is failing today, I'm not sure he's a very reliable source of information. Solar has been growing exponentially since the mid 2000s and shows no signs of slowing down: the average year-over-year growth in solar capacity since 2016 is about 26%, a doubling every three years. The most common error in forecasting solar capacity is underestimating future growth. Every IEA prediction for solar growth over the last 10 years has been significantly higher than the previous year's prediction, and still a dramatic underestimate of the actual installed capacity. For a decade, experts in this space have been predicting that the solar exponential will level off next year, and in all likelihood they will keep doing so for another decade.
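The doubling-every-three-years figure is just compound-growth arithmetic on that 26% rate (taking the rate as given):

```python
import math

annual_growth = 0.26  # average YoY capacity growth since 2016, as cited above

# Doubling time for compound growth: solve (1 + r)^t = 2 for t.
doubling_years = math.log(2) / math.log(1 + annual_growth)
print(f"Doubling time at {annual_growth:.0%}/yr: {doubling_years:.1f} years")

# Sustained for a decade, the same rate compounds to roughly 10x.
print(f"10-year multiple: {(1 + annual_growth) ** 10:.1f}x")
```

That's about 3.0 years per doubling, or roughly 10x per decade if the rate holds.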
I bought my first set of solar panels 33 years ago. Those panels are still on the roof, have had zero maintenance, and are still happily running my small fridge.
The battery story is somewhat different: my first set of batteries finally died a few years ago. Likewise, I've probably replaced three or four inverters in that time.
Either way, my off-grid solar power system has repaid itself many times over.