Assuming you are grid-tied, the reduction in electric bills is predicated on net metering, where the utility company gives you (ideally) 1-for-1 credit for the electricity you produce. In California, this was called NEM 1.0 and NEM 2.0. Every few years the California utilities lobby the Public Utilities Commission (PUC) to chip away at the NEM program. This year the PUC voted unanimously to cut net energy metering, and any new solar owner will fall under the NEM 3.0 rules. Net energy metering is essentially dead in California. On top of that, customers now have to pay new fixed fees. This will disincentivize solar-only installations. If you want fair compensation for what you generate, you'll have to store it and pay yourself.
Net metering just doesn't make sense though. Everyone who wants to rely on the grid for power availability ought to pay for grid upkeep costs. Net metering would make sense if it came on top of a baseline "connectivity fee" that covered grid maintenance, but that's not the current policy, would likely be politically infeasible, and would hurt rooftop solar customers financially anyway.
Subsidizing solar is a perfectly good policy choice, but net metering is ultimately an unsustainable way to implement it.
That's how my distributed generation contract with our Rural Electric Cooperative works. We pay $50/month for the connection, but the actual use/generation is 1-for-1 net metering. The generation we can connect is limited to 105% of our average annual consumption, so over the course of a year, on average, we zero out energy costs. If we want to connect more distributed generation than that, the tariff shifts: we get wholesale for what we sell and pay retail for what we use.
We pay $50 per month real money for the service, regardless of the net metering. If we are net producers in a given month, we pay nothing additional for power and get a credit (in kWh) that rolls forward to the next month. If we are net users, after applying any rolled-forward credits, we pay for the net usage in addition to the $50. Once a year, any rolled-forward credits not used are paid out to us at what amounts to wholesale rates, so we start over with a zero credit balance.
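The billing logic is simple enough to sketch, roughly, in code. All the numbers here (fee, rates, usage) are invented for illustration; only the structure matches the tariff described:

```python
# Rough sketch of the co-op billing described above: a fixed monthly fee,
# 1-for-1 kWh netting with credits rolled forward, and an annual payout
# of leftover credits at a wholesale rate. All numbers are illustrative.

FIXED_FEE = 50.00   # $/month connection charge
RETAIL = 0.12       # $/kWh charged for net usage
WHOLESALE = 0.04    # $/kWh paid for leftover credits at year end

def run_year(monthly_use_kwh, monthly_gen_kwh):
    credit = 0.0        # rolled-forward kWh credit balance
    total_billed = 0.0
    for use, gen in zip(monthly_use_kwh, monthly_gen_kwh):
        net = use - gen - credit
        if net > 0:
            bill = FIXED_FEE + net * RETAIL   # net user this month
            credit = 0.0
        else:
            bill = FIXED_FEE                  # net producer: bank the excess
            credit = -net
        total_billed += bill
    # Annual true-up: leftover credits are paid out at wholesale.
    total_billed -= credit * WHOLESALE
    return total_billed

# Six high-generation months followed by six high-usage months.
use = [500] * 6 + [900] * 6
gen = [800] * 6 + [300] * 6
print(round(run_year(use, gen), 2))
```

Note how the rolled-forward credits let summer surplus cancel winter usage at the full retail rate, which is exactly the 1-for-1 deal being discussed.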
Net metering means utilities sell the power your system produces at full price to your neighbor. They pocket the avoided cost of producing and delivering that power (and during price spikes the cost of fuel can exceed the value of the power produced).
Net metering does create stranded assets if everyone does it. But not everyone does. Further, the idea that the utility gains NO benefit from net metering is wrong.
The corresponding view that ratepayers are subsidizing solar by paying additional grid costs is wrong. Solar reduces grid demand, which reduces required generation capacity. Capacity is very expensive.
Your solar panels tend to produce power at times when power is cheap - because there is so much solar power.
When you do net metering, the utility loses money on top: it has to sell the power that you produce on the spot market, but the price it gets for it is less than what it has to pay you.
Solar also produces power when demand is at its absolute peak. It varies by location, but prices are still cheapest at night the vast majority of the time.
There are many ways to do net metering, but regulations are set up to maintain profitable local utilities. Utilities hate it because they make less money by selling less electricity, not because it's ever going to drive them to unprofitability.
I think the demand peak is in the evening. Whether that overlaps with daylight depends on the season. I suppose in areas where people use a lot of AC, peak demand might be earlier in the day, but I haven't seen any figures showing that. I have seen the evening peak.
It really depends on what you mean by peak demand. From the grid’s perspective a home with rooftop solar has zero demand during peak solar production.
However, that's ignoring that the home actually uses electricity during that period; its demand simply isn't something the utility can make money from.
Similarly, transmission losses increase as temperatures rise which requires more production to offset those losses. From the perspective of a natural gas power plant that load is just as real as actual customer demands because utilities need to pay for it.
Maybe I'm wrong here, but everyone seems to be considering only home use. During the day there's higher demand from factories and industrial processes, even if home use is at its lowest.
Hmm? Why guess? Each provider has public sites with graphs of demand and generation. Here's the CalISO one: https://www.caiso.com/TodaysOutlook/Pages/default.aspx showing a funny camel shape today of an 8AM morning peak (heating) and then a 5:30 PM peak. What we can't see is the demand that would be placed on the grid around solar noon, if it weren't being serviced directly by panels.
Industrial and commercial customers often engage in Demand/Response programs though (primarily for building HVAC), so if there's a likely residential spike, they can shed some of their load.
> Your solar panels tend to produce power at times when power is cheap - because there is so much solar power.
But that's not true in how PG&E prices power. The very highest rates start when the day is just past warmest and sunniest (3pm) when there's tons of solar power available to feed into the grid. The cheapest rates start at midnight when solar contribution is obviously zero.
Net metering doesn't make sense for a simple reason:
- when you use power, the grid is forced to produce it for you at that specific moment.
- on the other hand, solar panels produce energy at random times, regardless of whether it's needed or not.
Net metering forces symmetric pricing into a fundamentally asymmetric situation, which is not scalable.
You could think of similar asymmetries in other contexts where the unfairness is obvious. Imagine making a deal with a restaurant for them to deliver pizzas to you whenever you order them, and in exchange you'll give them back the same number of pizzas at a time of your choosing (or at random times). This is obviously not a fair deal for the restaurant.
Your analogy introduces factors that aren't present in the system; I don't think you need one here. Solar panels aren't exactly random, for one; pizzas aren't fungible, for two; and the grid is already composed of many producers and consumers, for three.
Energy as a service isn't fungible either. You cannot trade energy at one point in time for energy at another point in time and expect both to have identical value. There are many energy speculators whose trade is determining when prices will be low or high, and with good weather predictions it's not that difficult to make an accurate guess about the price a few days ahead.
Imagine if you will there’s a peak demand for pizzas (dunno maybe Super Bowl?) and the pizza company is struggling to deliver enough pizzas. You come in and say hey Mr Pizza Company I’ll help out with that demand and make pizza for you, and so will my other neighbors and we’ll help you supply pizza. Mr Pizza Company takes your pizza and sells it but only pays you a fraction of what they charged their customers. And on top of that charges you for making pizza for them. This is what you get in return. Except you’re making pizza everyday and giving it to the pizza company so they can sell your pizza out in the market first before they sell theirs. The greatest trick played.
Example from Boulder: big houses near open space at the "end of the line" as far as the dendritic distribution grid is concerned install solar. On a cold sunny day those houses produce more than they need and more than the network leading to them was sized to deliver. Who pays to upgrade the transformers and switchgear upgrades? Should it be subsidized by all ratepayers?
Utilities pay much lower wholesale costs than what they sell the power for. Why should they be forced to buy your power at an inflated rate when the generator will sell to them for 90% less?
Agreed, at some point the grid will need to spend on storage to enable more solar users. So either you need a fee to pay for that, or you push the storage burden onto the end users. Make that cost apparent and force people to deal with it, and we will find creative ways to do so (like more EVs that can act as storage, as the Ford Lightning can).
If only people were buying hundreds of thousands of batteries a year. Ideally mobile, on wheels or something. that they could plug in while working, and suck up all this extra grid power...
More likely, at some point it will be uneconomical to run PV without attached storage. As an added bonus, you'll be able to pull cheap power from the grid overnight in the winter if you have that kind of tariff available.
Just be glad you're not in the UK. I pay 35p/kWh for what I draw and get only 5p/kWh for excess I sell to the grid through the Smart Export Guarantee. The price I pay has doubled in two years while the export rate has hardly moved.
I joked with my neighbour that we should wire up our homes and sell my excess to him at mid point and we would both be better off.
> I joked with my neighbour that we should wire up our homes and sell my excess to him at mid point and we would both be better off.
I've been looking at this, but not as a joke, and you run into some fairly interesting problems when you start paralleling after the meter. Long story short: don't do this unless you have a lot of electrical engineering chops, or you might end up creating a liability in case of a leak to ground or an overcurrent situation. This needs some special handling. The best way would be to just create an extra circuit and use a waterproof extension cord to power it; that way all GFPs see their real current and not a fraction of it, and you can't accidentally feed into a distribution panel at more current than the breakers see, which is an excellent way to start a fire.
He doesn't, but few on the street do. I could hack something together over the summer that provides some charge on my driveway when I am a net producer. Not sure if there are any legal/tax issues with the setup, though. An interesting community project.
It might even be a greater option to have a community battery booth (a bank of LFP or the upcoming sodium-ion batteries). Everyone throws their excess energy into it, and it can be a charging station for e-bikes, EVs, etc.
In Maine, CMP charges $11 a month for a "connection" to the grid and your first 50 kWh each month. Is that not the case in California or other net metering markets?
Yes it is, there's a connection fee of approximately $10/month for that same reason. Net metering is a totally unrelated topic. What's being argued about is the rate. Until now, rooftop solar customers get a retail-level payback for excess production sent to the grid. Under the proposal, this payback would be at the wholesale price, which is significantly lower.
You can still totally offset your own actual consumption at the retail rate, but for any generation beyond your actual consumption for the month, you only get paid cash at the wholesale rate?
Without knowing anything about this or any context... that does not seem unreasonable to me.
Note that OP didn't even need to go net negative for it to still totally be a great investment. He only offset a portion of his actual usage. So net negative paid only at the wholesale rate seems like it should still be economically workable and leave plenty of incentive. It wouldn't have affected OP at all, and for those who do go net negative and whom it does affect, it seems like it shouldn't tip the scales.
> You can still totally offset your own actual consumption at retail rate
AFAIK, most home solar is not set up this way. You don't use the energy from your solar panels. The solar panels feed directly into the grid, and your usage comes directly out of the grid.
So if you use 100 kWh and generate 100 kWh, then you will pay for 100 kWh at the retail rate, and be paid for 100 kWh at the wholesale rate.
Hmm, I've never heard of such a setup, actually. Usually these kinds of generators are called "behind-the-meter" because they are literally behind the electric meter, so any generation you make is directly used by your house, and any excess crosses the meter. At least this web page (from a utility in NE-ISO) indicates that solar generation is still behind the meter even in a net generation setting: https://www.bostonsolar.us/solar-blog-resource-center/blog/w.... It could vary from place to place, though.
I just got solar panels installed (in Berkeley, CA). I can confirm that this setup (you feed into the grid, and then get supply back from the grid) was the primary option suggested (it was possible to request and receive a self-contained "behind-the-meter" system, but was considered unusual)
You've actually done this recently, so probably you are right, but I'm still doubtful. Are there two separate unidirectional meters involved? And the solar panels aren't hooked into your electrical panel or in house electrical system in any way? What you are describing isn't impossible, but it's an odd default, and would seem to be a terrible deal for homeowners now that net metering is being phased out. While it's usually true (especially in California) that you can't keep your house powered while the grid is down, I don't think I've ever heard of a case where all locally generated power was sent 100% to the grid. I've only heard of this being done in cases where the same owner has generation happening at a different place on a separate electric account. Can you confirm?
Ah, I see what you mean now. Sure, I can't absolutely confirm that, from a physical point of view, all locally-generated power is sent out and comingled with grid power. Rather, from an _administrative_ perspective, it's the case that I don't have any "exclusive claim" on the power that I generate, and my house's power is not independent of grid outages. It's quite possible - likely, I agree! - that the power that I generate is locally consumed; but, from an administrative and practical perspective (esp. lack of independence from the grid), the system "behaves as if" that is not the case.
So - yes, you're probably right!
> would seem to be a terrible deal for homeowners now that net metering is being phased out
Whether you're right or not from a physical perspective, this is absolutely the case from a numerical perspective - which is why I'm extremely glad that I got mine installed in the grace period during which NEM 2.0 pricing is still in effect despite NEM 3.0 being approved (and becoming active in April).
Hm, while I understand what you mean about where the actual energy goes, my understanding of how the costs are accounted is that there isn't any separate metering for "electricity out" vs "electricity in".
The solar output actually runs your meter backwards, so it does reduce the amount the meter shows you owing, effectively crediting you at the retail rate. There is no metering present that makes it possible to charge you for the 100 kWh you used at one rate but pay you for the 100 kWh you generated at a different rate. The electricity you generate runs the meter backwards and reduces what you are charged for consumption, which monetarily means you are "being paid" for it at the retail rate.
But if your meter goes negative in a given time period, they can tell at the meter that you actually generated more than you consumed, and could pay you out at a different rate for that. (Or, as OP notes, in some places they allow you to offset your use, but don't actually pay out for net negative at all.)
This is why the metering is called "net", it's your "net" use, consumption minus generation. I believe that's how it works for my friends here in Baltimore who have solar panels.
Are there other places in the USA where they install more sophisticated metering capable of separately metering out and in, and charge different rates for it?
Or, anyway, since we're talking about California, if anyone wants to clarify how it actually is going to work in CA, rather than speaking in hypotheticals based on our theoretical understanding of how things work, that would be even better!
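In the meantime, here's a toy model of the two settlement schemes being debated (both rates are invented; the point is only the structural difference between a single net-reading meter and hypothetical separate metering of generation and consumption):

```python
# Toy comparison for a period with 100 kWh used and 100 kWh generated.
# RETAIL and WHOLESALE are invented rates, not any utility's actual tariff.

RETAIL = 0.30      # $/kWh
WHOLESALE = 0.05   # $/kWh

def net_metered(use_kwh, gen_kwh):
    """One meter, runs backwards: only the net crosses the meter.
    Positive net is billed at retail; negative net is paid at wholesale."""
    net = use_kwh - gen_kwh
    return net * RETAIL if net >= 0 else net * WHOLESALE  # negative = credit

def gross_metered(use_kwh, gen_kwh):
    """Hypothetical separate meters: all consumption billed at retail,
    all generation credited at wholesale."""
    return use_kwh * RETAIL - gen_kwh * WHOLESALE

print(net_metered(100, 100))    # net metering: the two 100s cancel
print(gross_metered(100, 100))  # gross metering: you still owe the spread
```

Under net metering the 100 kWh in and 100 kWh out cancel to a zero bill; under gross metering you'd still owe the retail/wholesale spread on the full 100 kWh.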
With PG&E, we have networked meters that report fine-grained usage in order to support the newly standard time-of-use rate plan. It is not an average over a metering period like a month but able to resolve usage at different hours of the day. Even my bill shows peak and off-peak usage as daily bar graphs, and I think I can login to see more fine-grained data at the power company website if I wanted to.
But I agree with the message several posts up. The old net metering was a subsidy and I understand it going away. It makes sense to me at a technical level that you would be billed your fine-grained net usage, so if your solar supports your daytime load then you didn't consume power, but you can't expect daytime excess to cancel your nighttime grid load 1:1, unless you install local batteries to time-shift your own power.
The old net metering did that. People generate excess during the day and consume power at night from the grid and expect it to all cancel out.
For home solar power, does your PG&E metering meter "power generated" separately from "power consumed"? I'm not sure if that's what you're saying or not.
Do you have details on how home solar is or will be metered in California, with regard to credits for power generated, and if it will be different depending on whether it's "net negative" or not?
I don't have solar. But, I read up on the solar rules changes because I can imagine getting solar later. We're not in a hurry because it will need to be a bigger "electrification" project to make it worthwhile.
With the right kind of grid-tie, you should be able to consume your own solar on-premises and have effectively no grid consumption charges during productive hours. If you had a net excess to supply back to the grid, that net would be credited at the applicable wholesale rate. If your solar was insufficient for your load, your net load on the grid would be billed at the applicable retail rate.
Think of all of this as "instantaneous" net metering. If you want to time shift your production and consumption, you need your own on-premises battery and accept your own charge/discharge efficiency losses. Your 1 kWh of excess solar will end up giving you less than 1 kWh of overnight power due to losses in the converters and battery chemistry. The old net metering rules allowed people to act like the grid was their perfectly efficient battery and consume 1 kWh of nighttime load for 1 kWh of daytime solar excess.
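Those losses are easy to quantify. Assuming, say, 96% efficiency for each converter stage and 95% round-trip efficiency for the battery chemistry (illustrative figures, not measurements of any particular system):

```python
# How much overnight energy 1 kWh of excess solar becomes after a battery
# round trip. All efficiency figures are illustrative assumptions.

inverter_in = 0.96    # charge-path converter efficiency (assumed)
battery_rt = 0.95     # battery chemistry round-trip efficiency (assumed)
inverter_out = 0.96   # discharge-path converter efficiency (assumed)

delivered = 1.0 * inverter_in * battery_rt * inverter_out
print(f"{delivered:.3f} kWh delivered per 1 kWh of excess solar stored")
```

So under these assumptions each kWh of daytime excess yields somewhere under 0.9 kWh at night, whereas the old net metering rules treated it as a full 1 kWh.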
Another proposed rules change was rejected. It would have placed an additional fixed grid connection fee only on solar homes. It was an attempt to add a connection fee for homes that might hit net zero consumption, without restructuring the rest of the non-solar rate plans where the infrastructure costs are tacked onto the marginal energy price.
The other thing to understand about PG&E billing is that some of the new time-of-use rate plans have a "baseline allowance". In a recent bill, we were charged $0.37 or $0.39 for 1 off-peak or peak kWh. But there is an offset of $0.09 per kWh for the first 281 kWh used in the month. This effectively gives you tiered pricing without calling it out as such. I don't know why they did this instead of publishing tiered, time-of-use rate schedules.
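If I'm reading the bill right, the charge works out like this. The rates are the ones quoted above; the usage split and the assumption that the credit applies to total kWh up to the allowance are my guesses at how it's computed:

```python
# De facto tiered pricing via a "baseline allowance": full time-of-use
# rate on every kWh, minus a $0.09/kWh credit on the first 281 kWh.
# The 300/100 usage split below is hypothetical.

OFF_PEAK = 0.37         # $/kWh
PEAK = 0.39             # $/kWh
BASELINE_KWH = 281      # monthly baseline allowance
BASELINE_CREDIT = 0.09  # $/kWh offset within the allowance

def monthly_energy_charge(off_peak_kwh, peak_kwh):
    gross = off_peak_kwh * OFF_PEAK + peak_kwh * PEAK
    credit = min(off_peak_kwh + peak_kwh, BASELINE_KWH) * BASELINE_CREDIT
    return gross - credit

# A 400 kWh month: 300 off-peak, 100 peak (made-up split).
print(round(monthly_energy_charge(300, 100), 2))
```

The credit caps out at 281 kWh, so the marginal price jumps by $0.09/kWh once you pass the allowance, which is exactly a second tier in disguise.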
Most net metering does have a base connection charge. Mine is ~$35/mo; even when I have produced more energy in a month than we consumed, my bill was still about $40 after base fees, taxes, etc.
Isn’t there a base charge though? In my home town it’s around 25 dollars a month just to be hooked up whether you use anything or not.
What would seem logical to me would be a base charge; they net meter you to zero, but then you sell excess power at the wholesale rate. If that wipes out your connection charge, you've still given them the value that they would have paid for the power you made.
Maybe the base charge would have to be higher for net metered houses, it might have some subsidy built into it based on expectations of selling power to the client.
The thing is that grid upkeep costs increase when people are selling power back. Who should have to pay for those increased fixed costs? Net metering means everyone pays.
In the Northeast, at least, there are separate line items for supply and distribution. If your supply goes negative, you still pay the distribution fee... and if you supplied more than the distribution cost, why wouldn't you receive money? Unless the energy company is for some reason not charging properly for distribution, everything seems already baked in.
But you do pay for grid upkeep costs; grid availability is part of the connection fee, otherwise what would you connect to if not the grid? Even if you consume zero kWh in a month you will still pay.
No, you don't need net metering to make it viable.
In a typical grid-tied system, the setup is such that any demand is fulfilled first by solar capacity, and second from the grid. If your array can fulfill 100% (or more) of your demand, then you pull nothing from the grid.
If you don't have a net metering arrangement and you regularly produce more than 100% of your local demand, then your options are to simply throw away the excess, or use a battery system to store it for later use.
If you have a net metering agreement, then the grid acts like a giant battery, which may or may not have input and output fees (think of it very roughly like Amazon S3 pricing). Ideally you have a 1:1 arrangement, where you can feed energy into the grid for free, and pull back up to whatever you supplied at parity pricing (eg: for free). But, there is no reason a utility company has to offer that arrangement, they could charge you some fee to supply energy if they wanted, and they could charge you some fee to pull energy back.
For the most part, the most cost-effective solar systems will supply about 75-80% of your average monthly demand. It's almost never a good investment to overbuild your solar system on the premise of making money off of net metering, as the rate you are paid for production is essentially a wholesale price, making you a micro power plant, which is not a very profitable business. Covering the bulk of your energy usage with your own solar supply, and then pulling a minimal amount of power from the grid to fill the gaps, will tend to keep you at the best supply pricing tier.
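A quick sketch of why overbuilding pays poorly: each kWh you consume yourself avoids the retail rate, while each exported kWh earns only a wholesale-ish rate. The rates and annual figures below are invented for illustration:

```python
# Self-consumption vs. export economics for a grid-tied array.
# RETAIL and EXPORT rates and the kWh figures are illustrative only.

RETAIL = 0.30   # $/kWh avoided by consuming your own solar
EXPORT = 0.05   # $/kWh paid for excess fed to the grid

def annual_value(produced_kwh, self_consumed_kwh):
    exported = produced_kwh - self_consumed_kwh
    return self_consumed_kwh * RETAIL + exported * EXPORT

# A system sized near demand, whose output is almost all self-consumed,
# vs. an overbuilt one whose extra output is mostly exported.
right_sized = annual_value(8000, 7500)
overbuilt = annual_value(12000, 8000)
print(right_sized, overbuilt)
# Marginal value of the extra 4,000 kWh of capacity:
print((overbuilt - right_sized) / 4000)
```

Under these assumptions the overbuilt array's extra output is worth only around $0.08/kWh at the margin, far below the ~$0.28/kWh average the right-sized array earns, which is the "micro power plant isn't a profitable business" point in numbers.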
>… the most cost effective solar systems will supply about 75-80% of your average monthly demand
I am not so sure that is true. Seemingly a large percentage of new installation costs are fixed (paperwork, labor, inverters, etc) with only a smaller portion for the panels themselves. For a minor incremental cost, you can fully provision your house.
Over the last 8 quarters their average profit margin is a whopping 3.25%. If you think that’s just too high I’m not sure what to tell you.
An alternate way to look at the CA rule change is that CA needs more installed storage to shore up the shitty grid, and incentivizing people to install solar alone via net metering isn’t a great idea anymore. Remember that peak power demand is right around and just after sunset.
> Over the last 8 quarters their average profit margin is a whopping 3.25%
This is the trade-off you take when you become a quasi-government company. You get a low, but nearly guaranteed profit. Executives still get their bonuses, employees get paid, etc.
If they want more variable rates of return then they can drop the public protections and go full private like any other regular company.
> Over the last 8 quarters their average profit margin is a whopping 3.25%. If you think that’s just too high I’m not sure what to tell you.
Profit margin is after deducting all expenses. A lot of corrupt spending can be included in expenses to reduce the profit margin at will while still raking in the cash. So yes, that's absurdly high given how they operate.
Because net metering is forcing the utility to purchase power at the retail rate. It doesn't make any sense. It isn't a profit margin issue. There is a massive infrastructure and expense to deliver wholesale power to an individual customer and it has to be recouped. Either you sell back your power at some percent of wholesale or solar installs need to get a corresponding increased delivery fee.
What we need is a fundamental change in electricity billing. The value in a grid connection is in its reliability, availability and demand response (you can demand anything from 1 Amp to 200 Amps at any time, with no notice). The current usage-only billing model is broken.
What we should do is pay a large flat monthly fee for the grid connection, then a lower rate for generation/usage, rather than bundling the delivery costs in. This is how many other utilities (gas, water, etc.) are billed, and it makes much more sense. This would also line up better for people with solar panels: those who choose to keep a grid connection would pay in line with the value they get from the utility, or could choose to go off-grid if they didn't want to pay for the availability of the grid.
I think you’re missing a different piece of the puzzle here which is that if I pay $10 for the first gallon of water and next to nothing for the next 500, I have an incentive to use more water and not less. I’ve paid for it anyway.
When you’re trying to solve public policy problems especially where conservation is concerned the notion of math gets complicated very quickly.
Additionally, the worst strains on the grid are really measured by peak loads. So if you change billing in a way that makes peak energy less expensive, and thereby disincentivize people from (a) reducing usage at peak times or (b) installing solar panels/batteries to mitigate their peak usage, it might actually make the grid less reliable.
Water is different in that we really only have usage concerns about it; AFAIK there is never any issue with water pressure dropping at peak times (perhaps thanks to water towers). With electricity the concerns are mostly about peak usage, and not so much overall usage.
Yes, that's what you see on the commercial side, where demand charges based on your highest peak usage in a month are common. However, they are unintuitive and make it easy to generate big bills from small amounts of power used, so I agree with the general idea that they're not a good fit for residential users.
You could institute something like time-of-day-based fees based on average demand for residential users without having the large unexpected bill problem.
I'm told that industry gets charged not just by how much current they draw but by how much they distort the grid's power waveforms. In fact, there are experimental designs out there for power-correction facilities that delay the waveforms by, for instance, 90% of a wavelength (so everything lines up again) and convert the delta to DC to store in batteries. The battery power can be used for their own purposes or to reinject in low-voltage situations.
I was watching a video where they talked about how they dropped their electric bill by changing how the machine they were demonstrating started up. Instead of going from off to full on, they set it to ramp up slowly enough not to push them into the higher peak pricing bracket. Electricity usage was the same overall, but the bill was less.
Perhaps, but if we want to disincentivize usage as a policy matter, we can just raise the usage charge above "next to nothing". For power, generation costs in California run from $150/MWh to -$150/MWh:
https://www.caiso.com/TodaysOutlook/Pages/prices.aspx
$0.15/kWh seems like enough to begin disincentivizing usage, and that can certainly be pushed up more.
So we’re going to reduce the hourly rate and charge a base fee, and then increase the hourly rate back to the old value?
So you’re asking for a regressive rate in electricity.
If your solution to a public policy issue is “why don’t we just” then you don’t understand the issue. I’m just pointing out externalities that you guys aren’t thinking about when shooting the shit about upcharging low income families and allowing the rich to get richer. You are contributing to the problems we complain about.
Certainly such a model is regressive, but it is in line with the actual costs of service. In my opinion, it's better to have a pricing model that reflects the actual costs of service, plus an explicit subsidy to repair the regressivity issue, such as the Consumers Affordable Resource for Energy (CARE) program in California.
Using usage as a proxy for value received from the grid worked fine when there was a monopoly provider, but with distributed solar being a thing, we have to move off that model, or we end up with an even more regressive system. As I noted before, we use these schemes for water, gas, and other utilities, so we have methods for addressing their natural regressiveness.
Alternately, if you feel (and this is reasonable) that power should be a ubiquitous service provided to everyone, take over the grid, pay for maintenance and development of the grid with tax dollars, and bill only for generation/usage after a reasonable base allocation.
Theoretically, you could add a third type of charge related to the externalities associated with your consumption. I'm not sure exactly how the accounting would work, but maybe an escalating fee based on the amount you consume. My utility actually does this with electricity, where the rate goes up after your first 1000 kWh. Obviously this could get more elaborate.
Note: the most recent English document still describes this as "planned", but it has been approved and has gone into effect with the start of 2023.
The difference, though, is that the state subsidizes the "cheaper" rate. When a customer has consumed the subsidized amount of electricity or gas, they pay the full market price.
In the Netherlands you pay a fixed price for being connected. The amount varies with the size of the connection. Usually 1 phase 40A is as expensive as 3 phase 25A. Anything above that easily triples in price. Aside from this, you still have net metering and you pay for what you consume. No "usage bundle" is included in the connection fee.
Also, the infrastructure companies and the companies that provide the actual energy are separate, but the energy supplier bills on behalf of the infrastructure provider. The infrastructure provider is semi-privatised while the energy supplier is fully privatised.
It kind of works, except that, like everyone else, we need to upgrade the power grids to deal with increased consumption, and we haven't invested in that.
> Usually 1 phase 40A is as expensive as 3 phase 25A. Anything above that easily triples in price.
That's... not a lot of power. O_o Is it common to get so little power?
I have 200 amps going to my house, but the master breaker on my panel is 160 amps. My car charger alone is 48 amps. When I had an electric stove, that was on a 40 amp breaker.
We run at 230V, although these days with solar panels all over the neighbourhood the voltage is slowly going up to 240.
Only recently, with the introduction of climate goals and the ditching of gas, are houses getting heavier connections. Electric ovens are typically 16A, although induction cooking stoves are getting 32A or 3 phases.
The 22kW car charger is by far the most power hungry device and needs 3x32A.
> What we should do is pay a large flat monthly fee for the grid connection, then a lower rate for generation/usage, rather than bundling the delivery costs in.
This is absolutely the worst way to bill. It encourages wasteful use and hits hardest the poorest people, who can't afford the flat fee.
Why do power companies even buy back solar power in the first place? It seems like it's not in their interest to help their customers compete with them.
I do agree there should be a connection fee, like an Internet connection, that covers fixed costs. Watts cost more than bytes, so a usage fee makes sense too.
> Why do power companies even buy back solar power in the first place? It seems like it's not in their interest to help their customers compete with them.
Good question! The more power you generate locally, the less you need to transmit along expensive infrastructure, so it is actually in the interest of utilities to utilize and pay for all of the generation capacity they can as close to the consumer as possible. You can't get any closer to the consumer than paying their neighbor to power their house!
> The more power you generate locally, the less you need to transmit along expensive infrastructure,
This is true, but I don't think the savings on infrastructure scales linearly with the distributed power generated. For example, if you generate 1 MW of distributed power, that reduces the load on wires, transformers and towers across the grid, and may even mean you can eventually use cheaper gear at your next replacement, but it's not going to eliminate the baseline costs (towers, wires, transformer sites, etc.) of serving any particular customer or region.
Eventually you will be able to break large grids into smaller ones and still be as reliable as previously, thus eliminating longer runs of transmission lines and reducing power over the lines you cannot eliminate, which will reduce the sizes and costs of the lines, but not eliminate them. Someday, I would hope that you could tie small neighborhoods together as the grid cells, with very little interconnections needed between them.
I work in the industry, and my vision for the future is less about realtime power transmission and more about pooling energy in smaller grid cells, predicting the usage of the pooled energy in the future, and moving energy to the pools which will need more energy sometime in the future at a trickle rate based on predictions. The ideal situation is to reduce energy transfer to zero over time by distributing the energy generation densely amongst the most dense users of the energy. I don't think the industry is ready for such a step-change in thought yet, but someday electricity will be treated more like water than like it is today.
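A toy sketch of that pooling idea (my own illustration, not an industry algorithm): each grid cell holds local storage, and energy is trickled only toward cells whose forecast demand exceeds local generation plus what's already stored, over deliberately thin interconnects.

```python
def plan_trickle(cells, horizon_hours=24, line_limit_kw=50.0):
    """kWh to trickle into each grid cell over the planning horizon.

    cells: dict name -> (stored_kwh, forecast_use_kwh, forecast_gen_kwh)
    Transfers are capped by a small line rating: the whole point is thin,
    cheap interconnects instead of peak-sized ones.
    """
    plans = {}
    for name, (stored, use, gen) in cells.items():
        deficit = max(0.0, use - gen - stored)
        plans[name] = min(deficit, line_limit_kw * horizon_hours)
    return plans

cells = {
    "north": (120.0, 300.0, 250.0),  # storage covers the shortfall
    "south": (10.0, 400.0, 150.0),   # needs 240 kWh trickled in
}
print(plan_trickle(cells))  # {'north': 0.0, 'south': 240.0}
```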
Water works because there are water towers that buffer usage. That works until a thaw after a big freeze drains all of the towers. If you can store electricity then you can increase the reliability and move the storage pools around as needed depending on the usage.
> Why do power companies even buy back solar power in the first place?
Increasing capacity is really expensive.
It’s cheaper for them to do stuff like buy every customer a more efficient refrigerator or insulate their attic than build a new power plant so they go for schemes like that.
It's cheaper to shave peaks than almost anything. A distribution company can spend a normal month's worth of electricity costs in a few hours at system peaks. Unfortunately people aren't willing to be warm or cold for a bit at peaks to save the utility money. I don't blame them, since they will get almost nothing out of it.
They are a regulated monopoly, as they have been granted rights by the public to use rights of way. So what they can and cannot do is determined by politics, not just what they think of.
First you pay for the electricity itself, which can be from anyone, as at the level of the country in question electricity is entirely fungible.
Next is the connection or standing fee. Makes sense: eventually you have to pay for the transformer and cabling to your home, even if in some locations this is rather high.
And the final piece is taxes plus a transfer fee paid per kWh. Again, you have to pay for the local grid, and charging by use makes some sense.
There are two prices that vary with use and one that doesn't, and reasonably so. Maintaining existing lines has a cost.
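The three components described above can be sketched as a toy bill calculation (all rates here are made-up placeholders):

```python
def monthly_bill(kwh_used: float,
                 energy_rate: float = 0.12,        # paid to the supplier
                 standing_fee: float = 15.0,       # fixed connection fee
                 grid_and_tax_rate: float = 0.06   # per-kWh transfer + taxes
                 ) -> float:
    """Two per-kWh components plus one fixed fee, as described above."""
    return kwh_used * (energy_rate + grid_and_tax_rate) + standing_fee

print(round(monthly_bill(500), 2))  # 105.0 for a 500 kWh month
```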
We need something that will incent home solar + storage because:
1) it increases resilience in disaster scenarios and in a lot of high-usage scenarios. I especially think back to Puerto Rico getting hit by the hurricane, or the Texas ice storms, or things like that. Having a lot of homes semi-independent of the grid would help people mitigate impacts, because you can rely on neighbors for certain things while the grid is down.
2) BEVs are going to introduce a huge additional burden on the grid. But if we couple it with a lot of home solar installations, it will reduce the impact on the grid, and the grid can focus on more industrial tasks like highway supercharging for semis and those big challenges.
3) aside from the solar panels, home solar means distributed storage for the grid too.
Net metering is flawed, but there's a baby-with-the-bathwater argument here. Let's introduce better incentives before chucking the flawed ones that are working, and I don't see that happening.
And it's hard to price the incentives. Home solar panels are going to change a lot with forthcoming perovskites, and home storage is going to drastically change with production Sodium Ion, high density LFP/LFMP in the next few years, and Sulfur beyond that.
What I think is ridiculous is the pricing of home solar+storage versus what the grid buyers get. If home solar programs could have their panels bulk-purchased along with grid solar purchases, and the grid providers then had to get them installed on homes, it would swap the grid providers from opposing home solar to being incented to find people willing to allow the installation.
This is all rather bizarre. If PG&E charges 30 cents/kWh for a marginal kWh delivered, then either something is quite wrong with the pricing model or it really does cost some respectable fraction of 30 cents/kWh to get a marginal kWh through their system into their distribution network where the customer is. If it's the latter, then reduction of 1 kWh delivered should save them the same amount of money.
I think the real issue is that the pricing is almost completely divorced from the actual cost structure, so the whole rate system is a kludge that gives everyone the wrong incentives.
> If it's the latter, then reduction of 1 kWh delivered should save them the same amount of money.
The "net metering" policy being changed isn't regarding "saving the grid the money by reducing the amount delivered to a customer." If you don't use the power from them, you don't pay for it. But this is about the utility paying the customer the retail rate for excess generation - aka paying retail rate for something that would offset a cost they'd otherwise pay wholesale rate to generate. This is about the current (not the new) system: https://news.energysage.com/net-metering-2-0-in-california-e...
It's not clear to me at all that it makes sense to compel the utility to purchase your excess power at the retail rate, if you're generating more than you can use. It also seems to give weird incentives to size and spend on your home setup.
Reduction of 1 kWh delivered does not save them the same amount of money, because the cost of connection is dominated by the 1st kW delivered. Scaling delivered power does have some cost, but the majority of the cost is in being able to deliver power to a particular area in the first place and in maintaining the lines providing any power.
Yes and no. If I draw an extra 10 kWh one random hour and export 10 kWh back at some other random hour, this costs the utility essentially nothing. The transformer is already there, and other customers near me will consume it possibly without even involving the transformer.
(I say “essentially” because there is some tiny loss in the system, but that’s maybe 2%, not 70%.)
Absolutely false. Wholesale electricity costs vary wildly with time of day. You’re saying the grid should be forced to act as an unlimited battery for free and that just doesn’t make sense.
The issue is that you have to size the hardware for a multiple of the power draw you expect, and the cost doesn't go down because people use less temporarily. You also have to do all the vegetation management, IT, and billing regardless of the number of kWh used, even if a solar install is net negative. They might draw today and it has to be available.
> They might draw today and it has to be available.
This is true for non-solar customers too. It may be more true for solar, but unexpectedly high residential loads happen for all kinds of reasons. (A big one is heat on a very cold day, and that tends to be a correlated load.)
Another option is to split the fee into a 'connection fee', 'transmission fee' and a 'power fee'.
Have the power fee set realtime by a market. The price might go to zero if there is excess generation.
Have the transmission fee capped by a regulator - and charge it for any kilowatt hours that enter the power network.
Allow third parties to arbitrage and offer 'fixed rate' deals, so consumers aren't at the whims of an electricity market with occasional sky high prices.
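A sketch of that split with illustrative numbers (the transmission fee is modeled as a flat per-kWh charge at the regulator's cap, and the market price is allowed to hit zero in hours of excess generation):

```python
def daily_bill(kwh_by_hour, market_price_by_hour,
               connection_fee=0.66, transmission_cap=0.05):
    """Connection fee + capped transmission per kWh + market-priced energy."""
    energy = sum(kwh * p for kwh, p in zip(kwh_by_hour, market_price_by_hour))
    transmission = sum(kwh_by_hour) * transmission_cap
    return connection_fee + transmission + energy

usage  = [2.0, 1.0, 3.0]      # kWh drawn in three sample hours
prices = [0.10, 0.00, 0.40]   # zero when there is excess generation
print(round(daily_bill(usage, prices), 2))  # 2.36
```

A third-party fixed-rate retailer would effectively replace the `market_price_by_hour` term with a constant and absorb the spikes themselves.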
Finland uses this model, though the capping of transmission fees is a bit iffy.
And lately the fixed rates have been quite variable, going from 4 cents to 40 cents. Action by European countries caused some uncertainty, so timing has a great effect.
This is sort of how it works in Sweden and probably a few other countries.
Bill is split into market and (local) grid parts. Many consumers have now moved the market parts to realtime pricing, since fixed-price arbitraging has become so expensive.
You have valid points. But it does make sense if you are trying to 1) incentivize solar and 2) keep the pricing super simple for the average Joe customer.
> Because net metering is forcing the utility to purchase power at the retail rate.
No, it doesn't.
I have solar+battery with net metering. I buy from the electric company at ~11.5c/kWh and sell our excess to them at ~2.7c/kWh for a difference of ~4.3x
As the sole entity that you can sell to - short of regulators being involved - they set the rate. And yes, there's already a monthly connection fee.
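To make the buy/sell asymmetry concrete, here's a sketch comparing a month under those rates with what true 1-for-1 net metering would produce (the import/export quantities are made up):

```python
BUY, SELL = 0.115, 0.027  # rates from the comment above, $/kWh

def split_rate_bill(import_kwh, export_kwh):
    """Pay retail for imports, get the much lower rate for exports."""
    return import_kwh * BUY - export_kwh * SELL

def net_metered_bill(import_kwh, export_kwh):
    """True net metering: pay for (import - export) at one rate."""
    return (import_kwh - export_kwh) * BUY

imp, exp = 600, 500
print(round(split_rate_bill(imp, exp), 2))  # 55.5
print(round(net_metered_bill(imp, exp), 2))  # 11.5
```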
That is not "net" metering, then. "Net" metering is having an import and an export accumulator meter and charging the customer for the "net" value in the equation "net = import - export". Anything else is marketing bullshit from utility industry organizations.
The term is sort of self-defining. Of course it's not surprising that some people just ignore that it means something specific as a phrase. In my state, they don't try to call different rates for demand and generation "net metering".
That is supposed to be what net metering is, but electric utilities captured the regulators and changed the definition to ensure they could cheat the consumer out of the value of their bought and paid for solar panels.
PG&E is designated as a utility by law. This means that their profit margin is fixed (at 9? percent). Basically this is the discount at which they should buy the power from you, which should be fine.
When the company is a utility with a built in monopoly, with legal access to shared resources like land for transmission cables, and is the easiest path to executing policy as important as decarbonization?
Yes, if it helps the implementation of the policy.
Our entire power system lacks proper carbon taxation. So I won't cry over minor things like power companies making a bit less money when the effect is a major avenue to mitigating carbon emissions.
Arguably incenting the grid to switch over from coal/natural gas to wind/solar/storage is more important, but lets try multiple approaches and not stop what is working.
When said company is government-subsidized by said customer, who is also a taxpayer, yes. These companies use tax dollars for infrastructure investment and maintenance, at least here in CA.
Those investments theoretically benefit all of their customers.
The question is whether specific customers should benefit more than others, when the activity they are doing probably adds costs for the others.
The "supply" value of the electricity being generated is going to vary widely; it's not at all obvious that fixing it at the average cost of supplying residential electricity is going to benefit everyone buying the electricity (which, as the volume of electricity involved becomes larger, sort of becomes a requirement).
Consumer rooftop solar is the most expensive way ever thought of to provide electrical power. Money is fungible. A dollar tied up in building a one-off consumer solar installation is exactly a dollar unavailable for building out utility grade solar which can be done at a fraction of the cost.
With net metering, wealthier home owners are essentially paid the retail rate for the electricity they sell to the grid, which causes higher electricity bills for those who can't afford to put panels on their roof - sort of a reverse Robin Hood scheme. I recall reading that with net metering, non-solar households in CA pay an estimated extra $115 to $245 per year to cover the subsidies given to their wealthier neighbors.
> Money is fungible. A dollar tied up in building a one-off consumer solar installation is exactly a dollar unavailable for building out utility grade solar which can be done at a fraction of the cost.
This makes the false assumption that those “fungible” dollars are somehow earmarked for solar and the homeowners wouldn’t just spend them on some other, non-solar, goods.
I might not have been clear enough. I am saying that it is incredibly wasteful for the government to directly subsidize consumer solar at installation and then later force less wealthy consumers to subsidize their wealthier neighbors. If the government wants to encourage the development of solar power, those subsidies should have gone to encouraging grid-based solar, which costs a small fraction of what one-off consumer panel installations cost. A dollar tied up in building a one-off consumer solar installation is exactly a dollar unavailable for building out utility-grade solar.
It's a nice incentive, but you shouldn't need to be paid the retail rate for solar when selling it back to the utility. Especially when the retail rate is from PG&E and contains a lot of non-paying-for-electricity things like covering past liabilities for wildfires.
And you can perfectly well size your solar appropriately so that you aren't net-positive pushing electricity back to the grid at all (or only a little).
This incentivizes storage and time-shifting usage to solar peak times, which seem to be needed in California. Does anyone know if there are pricing signals to help with that? For example, if I (as a power-plant operator or whatever) found a way to sell electricity right as the sun is going down, am I rewarded for that?
As an individual with a solar install, I think this also encourages people to use the energy they produce directly, which also seems beneficial. For example, if your house is well-insulated, you can probably pre-cool it a few degrees while your solar is producing so that you don't need as much cooling during the peak times. If you can be home during the day time, you could set your car to only charge during peak times.
> This incentivizes storage and time-shifting usage to solar peak times, which seem to be needed in California. Does anyone know if there are pricing signals to help with that? For example, if I (as a power-plant operator or whatever) found a way to sell electricity right as the sun is going down, am I rewarded for that?
NEM3.0 incentivizes exactly that. It prices exported electricity at the avoided cost rate. During the late summer and early fall, between 7-9PM that can be more than $1/kWh, or 3x the average NEM2.0 rate [1]. Of course to maximize that rate you need to store energy and sell it back at the optimal time.
The average rate with NEM3.0, however, is much lower.
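A rough sketch of that time-shifting incentive (the ~$1/kWh evening rate is the NEM3.0 figure mentioned above; the midday export value and battery parameters are my assumptions):

```python
def export_value(solar_kwh, midday_rate=0.05, peak_rate=1.00,
                 battery_kwh=10.0, round_trip_eff=0.9):
    """Compare exporting at solar noon vs shifting through a battery
    to the high avoided-cost window (e.g. 7-9PM in late summer)."""
    immediate = solar_kwh * midday_rate
    shifted_out = min(solar_kwh, battery_kwh) * round_trip_eff
    spill = max(0.0, solar_kwh - battery_kwh) * midday_rate
    return immediate, shifted_out * peak_rate + spill

imm, shifted = export_value(12.0)
print(round(imm, 2), round(shifted, 2))  # 0.6 9.1
```

Under these assumptions, shifting 12 kWh of midday surplus through a 10 kWh battery is worth roughly 15x exporting it immediately, which is exactly the behavior the tariff is trying to reward.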
Net metering is a regressive tax on the poor, because when rich people get solar panels and sell back to the grid at retail prices, the utility has to raise rates on everyone else to pay their large fixed costs. It may have been a good idea at one point when residential solar was so rare as to be effectively nonexistent, but now it has to be abandoned.
In my state, generation is still pretty "regulated" and the cost of generation is a separate line item on my bill, technically from a different company, from distribution. There's no reason at all I shouldn't be paid the same price per kWh for generation.
That depends what "generation" includes. It probably also includes the transmission and stepping up and all of the equipment to deliver it to the distributor. If you are grid tied, you are still using that system as a backup at minimum and should contribute to its upkeep. Therefore you should be paid accordingly. I don't think there is much objection to paying for solar excess at the bulk rate.
If there’s no net metering, aren’t they incentivizing people to be completely disconnected from the grid, or at least not incentivizing them to be connected? (Which is bad for the grid.) Why would I fuck around with the utility at all if I’m installing enough solar/battery to power my needs?
> Why would I fuck around with the utility at all if I’m installing enough solar/battery to power my needs?
What will you do when it has been raining/cloudy for weeks straight like it has recently in California? For this period, wind has dramatically outproduced solar [1]. Even in the summer/fall, when there was severe wildfire smoke in the area, my solar+battery barely produced anything.
Sure, you can set up a propane generator (very expensive at the size needed for a full house backup), but then you are dependent on the propane-distribution grid. The dirty secret of "no-sacrifice off-grid" lifestyles is how much they are dependent on propane for heating and electricity.
Also, the grid's uptime is going to be better than your home battery setup's uptime, both because it is constantly maintained and because it has access to a greater diversity of energy sources (wind, nuclear, fossil, large-scale storage) that are firmer than anything you can attach to your house.
The fact is that modern living standards (cars, TVs, comfortable temperatures) require a much larger amount of energy and energy-availability than most people can produce on their properties with solar. The additional energy has to be imported somehow, either from the electric grid or as liquid or gaseous fuels.
"Why would I fuck around with the utility at all if I’m installing enough solar/battery to power my needs?"
This is exactly my sentiment and is the reason that we have made ourselves independent from the grid.[1]
However the math gets very difficult if you need to charge an electric car ... it's a tremendous amount of electricity to keep one, not to mention two, cars topped up after daily use.
The extra panel and battery requirements for cars are such that it really does make sense to be grid-tied and arrange your charging for off-peak hours, etc. That is what makes this so frustrating for typical homeowners: there's an obvious and sensible symbiosis here that would work well for most people.
[1] That, and the fact that I hate PG&E with the burning fire of 1000 suns.
If the grid is smaller, then it is good for the grid because it reduces capacity and complexity requirements. Also, if the net human experience is better, the grids can be as bad as whatever enables that.
Was the California grid always private or was it run by the state government at one point? To me it makes no sense to privatize services where no competition is possible. You just end up with perverse incentives to do stuff like the above.
The vast majority of utilities in the United States, by number of customers served, are investor-owned; publicly owned and co-operative utilities cover only 50m people.[1]
Theoretically, the grid is supervised by the California Public Utilities Commission, which has wide latitude to set standards and regulations for PG&E and others. The Public Utility Code is the highest law in the state of California[2], so IMHO a large part of the blame for PG&E's failures falls on CPUC's failed oversight[3][4]
In the US, a lot of the electrical grids were corporations (e.g., Con Edison on the East Coast is from Thomas Edison's company). I'd never looked it up before, but the predecessor to PG&E in San Francisco was the San Francisco Gas Company [1].
Of course, in 1896 an Edison company merged with the derivative of the original San Francisco Gas, to become San Francisco Gas and Electric. By 1905, that merged with something else to become PG&E.
A few data points as the picture is murky and mixed:
The wider "grid" is managed by CAISO - the independent system operator.
LA's utility is municipally run, in contrast to the big 3 IOUs in the state - PGE, SDGE, SCE.
SDGE had a 50 year exclusive franchise agreement with the city of San Diego that recently renewed under slightly different terms. Notably, and I'm paraphrasing: the city has the option after 10 years to terminate under certain conditions. [1]
Enron lobbied for, contributed to, and profited off of utility deregulation over 20 years ago. [2]
You need a battery to efficiently use your solar. Why should PG&E be forced to buy at an inflated retail rate when they can get wholesale electricity for 70% less?
I worked in the solar industry for years and this is a very good summary of the residential install process from the customer side.
I’ll also second one of the main points of the author: do NOT get a solar lease. Either finance it through a solar loan or pay cash. A good solar company will not push you into a lease. If they try, talk to someone else.
Also should be noted to get multiple estimates, even though that's kind of common sense. I had them vary wildly, where the most expensive one was literally twice as much as the cheapest one.
I got three estimates. Huge variety in price. All three companies sent someone out to survey the house. Two were very friendly salesmen who took photos of the roof and told me how much I would save, but couldn't answer technical questions. The third company sent out an engineer, who came into the house and talked me through where the inverter would be best situated, how the batteries worked, the benefits of a diverter etc. Third company instilled the most confidence and they were also the cheapest. Went with them.
Why are the leases so problematic? If you don't have the funds nor credit score to finance it, why not just rent out the roof space so to speak? Any return you get from it is better than nothing.
Solar leases are the problematic ones (where you pay a percent of the energy you generated). It sounds like a good deal, but the rates increase yearly and the compounded cost doesn't end up being worth it. The author was cash-flow positive on a fixed-rate lease immediately and it doesn't come with a contract with a 3rd-party provider, although I guess that depends on how much sunlight you have.
We got a house in San Diego near the bottom of the last market (2012) and I was a medical resident at the time with two school age children. The previous owner had installed electric everything (even had to get a larger trunk line run and breaker panel). So our electric bill was a little nuts, randomly hitting tier 3 pricing and landing $350+ electric bills. For a household with negligible marginal income, that was unpleasant. We had 0 capital to play with but switching to solar was a no brainer. Even the lease is cheaper than paying PG&E.
Would I do it that way again? Probably not. But I have more wiggle room now.
> hitting tier 3 pricing and landing $350+ electric bills
I just wanna chime in and add: PG&E got rid of the third tier (unsure when) and, in the Bay Area, we're looking at around $0.35-$0.40/kWh at the upper end of the rate schedule (plus an explosion in natural gas prices on the West Coast). $350+ electric bills are pretty much now the norm for anyone who wants to heat their house up much past 60°F.
My understanding is the leases are a challenge when you want to sell the house. New buyers would have to also agree to the lease's terms or you have to pay to break the lease and remove them.
I wonder what causes this high electricity use. Not because it's any of my business, it's just genuine curiosity.
He never reveals actual use, only savings. Still, at $0.25 per kWh charged and an average monthly saving of $210, we can work out that about 840 kWh/month was saved, which amounts to roughly 10,080 kWh per year. Actual use may be (far) higher, but we don't know.
These savings alone are triple my total electricity consumption (in the Netherlands) in a fairly similar weather pattern. And I'm not exactly frugal with electricity: big TV, washer, dryer, powerful oven, multiple PCs, and even a few small power-hungry electric heaters (due to very high gas prices).
I'm not exactly projecting a personal situation. Our government has established that median/typical use of a dutch household is 2,900 kwh/year in order to qualify for the new price ceiling, a temporary measure to keep energy affordable.
So here we're talking about 3 times more than an above average consumer (me), and what might be a factor of 4-6 more than our median. Yet the person is unlikely to live in a mansion as he can't afford to pay for the panels in cash. I don't mean that in a judgemental way.
Large scale permanent electrical heating/cooling in every room? EV car? What would cause such enormous usage?
Again, I don't ask this to justify usage, only to understand it.
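For what it's worth, the back-of-envelope above in code, using the savings and rate as stated:

```python
monthly_savings_usd = 210.0
rate_usd_per_kwh = 0.25

# kWh of grid consumption offset per month and per year
kwh_per_month = monthly_savings_usd / rate_usd_per_kwh
kwh_per_year = kwh_per_month * 12

print(kwh_per_month, kwh_per_year)  # 840.0 10080.0
```

That ~10,000 kWh/year is offset alone, so total consumption is at least that, i.e. several times the Dutch median of 2,900 kWh/year.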
We're a family of four in the Midwest (US), and we average about 40 kWh/day, roughly 14,000 kWh annually. The only gas stuff we have is our hot water heater and furnace. Our A/C gets a workout in the summer when it is pretty warm (at least this year). I do about 8 loads of laundry a week, and I'm sure that my dryer is a big user of electricity. We have a lot of computers in the house between the four of us; three desktops and four laptops. I work from home, so a pretty consistent use for at least 2 of them. I don't think the house lights or smartphones are worth even tallying. I bet A/C is a good 50% of our usage, and that's focused largely between June and September.
The biggest culprits in energy consumption are heating appliances. To save some bucks I usually set my thermostat 1 or 2 degrees under my ideal temperature and wear warmer clothes, and I never use a dryer; I don't mind having clothes hanging on a rack in a corner of my flat.
I have a house from 1909, with relatively poor insulation, by modern standards. We heat the house with a heat pump and a 100 m deep borehole. The heating of the house (180 m2 and a cellar) is the main energy consumption. The total was 22 000 kWh last year (including about 3 500 kWh for an electric car). An old house with electric heating and cooling could well be what he has.
When we moved in the electric consumption per year was way higher, 35 000 kWh. We have gone through and made the house more effective, better white goods, windows, better ventilation etc. We also sold about 2 000 kWh of excess production from our solar panels.
"An old house with electric heating and cooling could well be what he has."
Definitely could be the case. Your situation is interesting in that it is opposite to the order of transitioning towards a heat pump over here. Perhaps a heat pump was your starting point as there was no gas connection?
Over here we start with gas heating (almost every house) after which we try to transition towards a heat pump. They say you should first insulate your house to the maximum possible before even considering a heat pump. Fail to do this and the heat pump won't even be able to produce enough. This calculation is of course locally specific, unlikely to apply to you.
In fact, many houses will never be able to transition and will end up with a hybrid pump as the maximum achievable.
Before I owned it, our house had an old oil/wood burner. And our heat pump doesn’t need help from an electric heater until it goes below -15C or so. But then it is a 20 year old heat pump.
Relating to our solar panel install (refer another comment of mine in this thread), we went through with one of those plug-in power meters and identified spot and average power draw for about 80 different devices, then spreadsheeted those data up to try to get a feel for where our power was being consumed -- as often times we would hit 40-50 kWh/day consumption.
Predictably (in retrospect) the biggest culprit by far was the hot water system. While we couldn't conveniently plug a meter in-line, it was fortunately wired up on its own phase (we're on 3-phase), and with the solar system install came a smart meter, which then provided some hour-by-hour insights.
A timer in the circuit-breaker box for the HWS, along with an element change (replacing the 2.4 kW element with a 1.8 kW one), massively reduced consumption there -- I expect very many people have inefficient, and/or poorly configured, and/or poorly serviced electric hot water systems that are chewing through vast quantities of power.
TFA seems pretty savvy, so you'd hope this kind of low-hanging fruit isn't their problem, but OTOH you're right that those figures are very high, and you'd expect someone looking in this much detail at their power profile would be off-peaking all the high-draw systems they could.
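A minimal version of that plug-in-meter audit as a script (the device list and figures here are invented for illustration):

```python
readings = {
    # device: (average watts while on, hours on per day)
    "hot water system": (2400, 4.0),
    "dryer":            (2000, 1.0),
    "desktop PC":       (200, 8.0),
    "fridge":           (40, 24.0),   # duty-cycled average draw
}

# kWh/day per device, biggest culprits first
audit = sorted(((name, w * h / 1000.0) for name, (w, h) in readings.items()),
               key=lambda item: item[1], reverse=True)
for name, kwh in audit:
    print(f"{name:18s} {kwh:5.2f} kWh/day")
```

Even with made-up numbers, the shape matches the comment above: the hot water system dwarfs everything else, so it's the first place to put a timer or smaller element.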
That's clever. We also have a central smart meter in the house. It has a standard port (I believe it's called "P1") that allows you to read out usage. There's devices on the market that plug into this port, connect to your WiFi and then make the data available in a mobile app. It reports total power usage in real time and gas usage with a delay of 15 mins.
I would recommend anyone do such a tour of the house. You're very likely to learn a lot and be in for a shock here and there. Totally agree with you that most people are far from micro-managing energy; there are big wins to be had with little sacrifices in convenience.
And yes, that was the background of my question: If I'd save 210$ in electricity per month, my very first question would be why I even use that much in the first place. Could be valid reasons for it, of course.
The average New Englander uses between 567 kWh/month and 711 kWh/month (different values for different states - areas farther south generally use more).
The answer is probably that you live 5-10 degrees north of him and don’t use as much air conditioning.
Electric heating and cooling (instead of gas, which while still energy doesn’t go in the same bucket) and probably a larger house.
His usage is high by the standards of the region, so it probably is also something like an electric car or crypto mining.
(For what it’s worth, a swamp is far harder to cool than a desert - five of the top six states in the US by energy usage are on the Gulf coast. The average Floridian household uses ~13700 kWh/year whereas New Mexico is ~8000 kWh/year. On the other hand, the household average in Canada is higher than in the US, so…)
That sounds plausible and I had not even considered the possibility of multiple electric cars.
It frankly worries me. Heat pumps and EVs are going to spike consumption dramatically, where even smallish price and taxation movements will have an outsized impact. Not to mention the grid being able to handle it at all, as most people can't ever generate anything close to that level.
Can somebody explain where the return on investment comes from in residential solar? It just strikes me that if there was easy money to be had on solar installations, we'd just get tons of utility scale solar installs (which would likely by more capital efficient) and my utility bill would over time converge on the expected total utility bill for a home with residential solar. Is it just because we're in that transitional period where utility scale solar _is_ being installed, but in the mean time residential solar install internalizes the savings to your own residence? Or is it because most of the testimonials are hiding the ball a bit (only net savings reported due to buyer's remorse, etc.)? Or is it because government subsidies for residential solar are distorting the true capital cost, and thus profitability, of solar at scale?
To be fair, there is a lot of grid-scale solar (and wind) being built, largely because they are the cheapest way to add generation capacity right now. But they are not a simple pluggable replacement for gas, coal, or nuclear, because they deliver power to the grid when the weather dictates power is available, not when utility demand dictates it is needed. Utilities struggle once the fraction of their power supplied by these "interruptible sources" gets over 1/4 or so of their total generation base.

As far as I know, the only regional-scale grids that have gone completely to renewables are Tasmania and Iceland. South Australia is not far behind, and Scotland not far behind them. But each has special circumstances that make it an easier lift. Tasmania and Iceland have abundant hydro and geothermal, which are ideal complements to interruptible renewables because they can be quickly throttled up and down as the interruptibles wax and wane. South Australia has highly reliable sunshine and wind, and so has been able to bridge the stochasticity of these with relatively manageably sized batteries (and a couple of flywheel installations to provide very short-term, momentum-based frequency control). Scotland has highly reliable wind and is headed in the same direction as South Australia.
The main limitation AIUI is grid interconnections, i.e. there are a lot of projects in the backlog that can't be built until the utility approves and prepares for it.
> The lists of projects in this process are known as “interconnection queues”. The amount of new electric capacity in these queues is growing dramatically, with over 1,400 gigawatts (GW) of total generation and storage capacity now seeking connection to the grid (over 90% of which is for zero-carbon resources like solar, wind, and battery storage).
Residential solar reduces retail power costs @ $0.30/kWh, whereas grid-scale solar supplies wholesale power @ $0.02/kWh. (Representative figures from California.) NEM (net energy metering) amplifies this effect.
If you look at a system-cost ROI analysis, the differences are that grid-scale requires more transmission lines and provides fewer jobs (which is a key part of the political economics); otherwise, residential solar comes out looking very bad.
Explicit subsidies -- which pay a part of residential solar capital costs -- are a much smaller force.
> Utility-scale solar power generation in the U.S. is surging. This year, almost half the 46.1 gigawatts (GW) of generating capacity added to the grid will be solar, according to the U.S. Energy Information Administration, and solar has contributed more than 30% of all new capacity in five of the last six years.
Residential solar has none of the siting complexity, needs no new transmission infrastructure, takes different skills to install, and lets relatively wealthy people move fast.
Utility-scale solar is far more efficient than rooftop solar. With rooftop, the panels face whatever direction the roof does. But utility solar fields have panels that rotate to face the sun.
Since power generation depends on the cosine of the angle between the sun and the panel, angle matters a LOT. Efficiency falls off fast if your angle is bad.
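The cosine effect is easy to quantify. A minimal sketch (plain Python; the specific misalignment angles are just illustrative):

```python
import math

def output_fraction(misalignment_deg):
    """Fraction of peak output for a panel misaligned from the sun by the given angle."""
    return math.cos(math.radians(misalignment_deg))

# A 30-degree misalignment costs ~13% of output; 60 degrees costs half.
print(round(output_fraction(30), 3))  # ~0.866
print(round(output_fraction(60), 3))  # 0.5
```

This ignores secondary effects (atmospheric path length, reflection losses at grazing angles), which make the real falloff even steeper.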
This should be at least somewhat offset by the fact that you generate power approximately where it is consumed, so the grid doesn’t have to grow as much. Turns out it isn’t so simple due to peak demand vs peak production etc but still. I wouldn’t have a problem with plastering PV on half of Nevada.
Most solar panels are angled to the south (in the northern hemisphere) to capture the most sun. I read somewhere that fixed installations are better because:
1) The money spent on panel rotators can be invested into having more solar panels.
2) Fixed solar panels are significantly more storm- and hurricane-proof. One bad storm with rotating panels and your investment principal is gone.
3) Maintenance costs increase when these moving parts eventually break. Fixed panels have no moving parts.
> Can somebody explain where the return on investment comes from in residential solar?
During the run-up on housing prices over the last couple of years, there was a belief that an owned solar system would raise your house's sale price in much the same way that a kitchen or bathroom renovation would.
Couple that with the supply and labor constraints, which made it hard to get the panels, battery, and installers you wanted on a timely basis - that raised the perceived home value yet again, because adding your own solar later wasn't seen as easily doable.
I don't know that the ROI was ever great, but with both of those factors gone today, it's worse than it ever was. (Source: shopped for homes in 2020-2021, bought in 2022, tried to get solar installed, ROI made no sense.)
In California, there's no choice for new housing: new single- and multi-family houses must have panels that cover 100% of energy usage for the year (or as much as the roof surface area allows).
> The unsubsidized all-in price for buying and installing the panels was $25,525.43... after counting this $6,636.61 tax credit, the subsidized price was $18,888.82
Looks like a 7 year return on investment after subsidies, 10 years without.
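Those payback figures check out against the quoted prices. A quick sketch (Python; the ~$2,677/year savings figure is an estimate consistent with numbers elsewhere in the thread, not something from the article):

```python
def payback_years(install_cost, annual_savings):
    """Years of constant annual savings needed to recoup the install cost."""
    return install_cost / annual_savings

annual_savings = 2677  # estimated yearly bill reduction (assumption)
print(round(payback_years(18888.82, annual_savings), 1))  # ~7.1 years, subsidized
print(round(payback_years(25525.43, annual_savings), 1))  # ~9.5 years, unsubsidized
```

Of course this assumes flat electricity prices; rate hikes shorten the payback period.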
I enjoyed working with Solar Wholesale, and they got me off to a good start. But there is a pretty significant markup even when considering they're giving you a "wholesale" DIY kit. I was quoted ~$35k pre-incentive for my project.
Also, be sure to do your own roof measurements when you look over their proposals. For my project, they used fairly inaccurate aerial/sat photos, which got the shape of my roof entirely wrong. Had I accepted their offer, I'd have ~33% too many panels / racking that wouldn't fit on the roof.
I ended up sourcing all the materials myself. I got much nicer inverters (1:1 Enphase IQ8+) and PV modules (455W LG bifacial) than what they were offering (2:1 APSystems inverters, 355W Bluesun panels).
The real kicker: I had enough budget left over to have a professional solar installer install all the panels for me, so I'm not the one that has to get on my steep roof, plus I have a warranty on the workmanship of the installation. Pre-incentive, I'll have spent about $24k on my self-sourced version.
At some point I'll write up a blog post about the whole process - parts selection and sourcing, finding an installer, permitting, etc. It's really not that hard.
Unfortunately, where I live (Belgium), this kind of option disqualifies you from any subsidy. They always require that the install be done by a pro with some certificate (even though DIYing electrical work is actually legal here).
I understand that there are reasons for this, but as someone with the qualifications but not the certification, I hate that I have to pay someone else to do things like this.
Different states have different rules but in some states you can just put a new breaker in your box that connects the solar output to the rest of your house and a mechanical interlock (literally a stamped piece of metal) that prevents both the main breaker and solar breaker being closed at the same time.
For automatic switchover you need an automatic transfer switch (ATS), such as the Enphase Enpower. Such devices contain circuitry to automatically manage disconnecting your home from the grid, so you 1) maintain power in the home (at least on the critical circuits you decide on), and 2) don't island your home (so linemen don't die).
I would love to DIY but that website is a complete turn off for me.
In my opinion, a good DIY solar company site would have transparent pricing and the ability to buy components. I understand there are variations in installations and pricing may differ, but they should have a pricing catalog.
What's wrong with the website? You submit your address, within a short period of time a human looks at your rooftop using aerial photography, and sends you a quote.
I used them in the past, and they were fine to work with. I still ended up sourcing all my own equipment (saved an additional 50% from what they were quoting me, for a better system), but they're not secretive about what they'll send you.
I don't usually see it factored in as a cost, but if a roof is replaced, or a leak forms under the panels (or under a panel), then the cost of removing and re-installing the panel(s) surely isn't free. That cost should also be included.
Cleaning panels also needs to happen periodically. That cost can be in the thousands, especially if your roof is hard to access (e.g. high off the ground).
There are so many ways a rooftop solar installation investment can turn against you. If regulations change, your net metering goes away. If electricity prices drop, your ROI goes negative.
Rooftop solar is a fun enthusiast project, but it's a lousy use of solar panels. Community solar or utility solar projects are far better for homeowners. And for the environment.
The only reason people do it is that there are artificial incentives; it's not actually efficient in reality.
I have rooftop solar - and eventually I will need a new roof - which will require taking the solar panels off and then mounting them back on again - labor costs. Additionally the inverter will fail at a certain point and that is not cheap. Given some ballpark estimates, I do feel like it could simply be a wash financially.
How old’s your roof? Just wait until it’s time for new panels. You’ll want new panels after a couple decades and change for the significantly improved performance. Panel work will be routine for roofers in the 2030s and 2040s. If you installed them on an old roof, you made a mistake, and your municipality probably shouldn’t have approved the work in the first place.
Your replacement inverter should only cost $1-3k in present money, depending on its size. You probably have a warranty on it for the first half of your panels’ life.
You also don’t need to pay for an expensive extensive cleaning unless something goes really wrong or you live under an HOA that is hostile to solar.
By far, the biggest factor in ROI is your generation relative to your initial installation cost minus tax credits and SRECs.
It's usually recommended to install solar panels on a relatively new roof (< 5 years), so that both the roof and solar panel lifecycles align. Since the panels act as protection of sorts, it also extends the life of the roof, and you can easily have both installations serve you for ~30 years.
There is typically a ten year warranty on the installation that covers leaks. An installer checks for roof age as part of qualification so you don't put panels on a roof that will age out before the panel warranty (typically 20 years, sometimes 25.)
Roof leaks that are accidental are covered by insurance, and the work to remove or replace the panels is part of that. Typically you adjust your home insurance coverage by the value of the panels when you install them to make sure everything's covered.
That just leaves the chance that the installer work will be good enough to make it ten years but not twenty, and the insurance won't cover the damage. I don't know the odds of that. It seems like it would be infrequent, but what numbers do you use?
I don’t use any numbers. Just wanted to point out there is slightly more to it than the upfront cost. As with anything there is an associated maintenance cost that is variable depending on circumstance.
To be clear I think solar panels can be a great idea, but it’s misleading to present only a single upfront cost when determining the years to break even
I understand. It's not bad to keep in mind (I made a spreadsheet with anticipated expenses before I got my panels), but I think it can be more misleading to dwell on small risks than not to if it keeps someone from buying an asset with a good lifetime ROI for most people.
7-10 years to break even. If the panels last 15 years you'll see a net return of $21,416 on investment of $18,888. If they last 25 years, that net return grows to $48,186. In either case it works out to an annual rate of return of 5%, assuming electricity prices stay the same (BIG ASSUMPTION) and also assuming the weather doesn't change to get cloudier or more violent (also a big assumption).
Is net return all that matters? Seems like with increasing heat waves, power outages etc, being able to generate and store your own electricity has value from a resiliency standpoint.
Note that merely having solar panels on your house doesn't automatically let you run your house off of the panels in the event that the grid goes down. Firstly, if you're connected to the grid, then a single house trying to push power into a downed grid would be like trying to refill the ocean with a super soaker. Even with lots of houses doing it, the frequency would be wildly unstable at best and cause damage to equipment. Secondly, it poses dangers for lineworkers who are probably trying to resolve the outage, by putting surprise voltage into wires.
It's certainly possible to have a solar installation that does let you disconnect from the grid, but it requires extra hardware (including a battery backup).
Exactly right. Typical solar installations keep you on the grid. And when the grid goes out, so do you.
To become grid-independent adds substantial expense. A battery adds another $10-20k, along with whatever costs are associated with the grid transfer switching.
Also, being off-grid is actually illegal in some places.
Utilities and governments tend to have close relationships.
Governments want utilities to continue working at reasonable prices, so they regulate utilities heavily. In exchange, utilities get protection against competition. This is true in many countries.
Specifically around solar, there's a concept called the "utility death spiral," where more and more people opt out of the grid, making the grid unmaintainably expensive for those who remain. This is the dream that solar companies sell their investors.
Power companies would of course want protection from the utility death spiral, and can often get it.
Disconnection isn't always illegal. Another form this can take is "all houses have to pay a grid fee, whether or not they are connected".
I'm not sure what kinds of hardware you guys have seen or been marketed. But from my pov, these are all pretty standard features of most hybrid inverters and come out of the box. The only extra expense to not be grid tied is needing a battery and maybe a tiny amount of extra electrical wiring. It wasn't even a question that popped up during my build.
Then again, we regularly have huge rolling blackouts here so we need the inverter to be a giant UPS for the house and be able to use solar and grid at the same time.
What's the math if we see annual electricity price increases? we've seen 2 hikes of 10-20% in the last 18 months here in Florida. Making me consider solar sooner rather than later.
If the savings is $2677 on year 8 (first year of profits after paying off the initial investment) and then increases 10% every year after that, the net return is $33675. This works out to an annual return of 7%, a considerable improvement! The same calculation with annual rate increases of 20% yields a net return of $53002 and an annual return of 9%.
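The effect of rate escalation can be sketched with a simple compounding loop (Python; the $2,677 base savings and growth rates are from the discussion above, the 25-year horizon is an assumption, and this simplified model escalates from year one rather than year eight, so it won't reproduce the exact figures quoted):

```python
def net_return(install_cost, first_year_savings, annual_growth, years):
    """Total savings minus install cost, with savings compounding yearly."""
    total, savings = 0.0, first_year_savings
    for _ in range(years):
        total += savings
        savings *= 1 + annual_growth
    return total - install_cost

# Flat prices vs. 10% and 20% annual rate hikes over a 25-year panel life.
for growth in (0.0, 0.10, 0.20):
    print(f"{growth:.0%}: ${net_return(18888.82, 2677, growth, 25):,.0f}")
```

Even modest sustained rate increases dominate the outcome, since the compounding applies over decades.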
There's also the question of whether electricity produced by panels will match peak demand, or be utilized at all. A scenario where solar is overbuilt, so the power it generates has a near-zero price for a large part of the day, could well happen.
I agree it's a big assumption that electricity prices will stay the same; my assumption is that electricity price increases will significantly outpace inflation.
Sounds great to me. Especially considering you aren't paying any taxes on your returns when they come in the form of deferred costs. That's equivalent to a pre-tax return in the mid-teens. You won't find that in most asset classes these days, and definitely not with the risk profile of solar.
AND considering that replacing only the panels has to be far cheaper than the initial installation, if they do drop too much in 25 years. So long as your roof is still okay, I suppose.
My parents neighborhood is coming up on 25 years old and they are having to replace the layers under the roof tiles. The tiles themselves will probably last a thousand years but the under layer has a lifetime. They pull off the tiles, replace the waterproof layer and put the tiles back on — apparently takes just a couple days and they get better nails so the tiles don’t fall off during the windy monsoons.
I actually wonder how solar panels reduce the rate at which a roof deteriorates. The panels themselves should bear the brunt of the light and rain that the shingles below them would otherwise endure, but on the other hand the panels don't cover the entire roof, so maybe at best you'd need to replace less of the roof but on the same schedule.
Yeah, that makes sense to me. I don't know the specifics of shingle deterioration, but just having them physically shielded from light and precipitation has got to drastically improve their lifespan. That's a good point about only part of the roof being protected though.
The president of the roofer's lobby has entered the chat!
Composition roofs need to be replaced when they are worn out and not before that. Good materials competently installed could last 50 years, more in mild climates. If you have the kind of weather that will wear down a roof in 10 years you probably cannot have solar panels.
Is there a reason that corrugated iron, concrete tile or similar aren’t used?
They last near indefinitely if looked after, and a hell of a long time if not looked after.
This is standard now. My REC Alpha Pure Black 400 panels are warrantied for 92% at 25 years.
The degradation is about 0.25%/yr. Assuming no exponential drop-off it would take >50 years to hit 80%.
Essentially no one has had to replace their panels due to degradation losses as of yet (since the vast majority of panels have been installed in the past ~30 years).
The idea that PV panels become trash in a decade is a lie promulgated by oil companies. According to NREL, reputable panels degrade as little as 0.1% per year, which means they will still be above 90% of rated output after 100 years. The trashiest panels NREL has tested degrade at 1.1% per year which would put them at 75% rated capacity in 25 years.
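Compound degradation is easy to check against those claims. A sketch (Python; the per-year rates are the NREL figures quoted above):

```python
def remaining_capacity(annual_degradation, years):
    """Fraction of rated output left after compounding yearly degradation."""
    return (1 - annual_degradation) ** years

print(round(remaining_capacity(0.001, 100), 3))  # best panels: ~0.905 after a century
print(round(remaining_capacity(0.011, 25), 3))   # worst tested: ~0.758 after 25 years
```

This assumes a constant compounding rate, which matches how NREL typically reports degradation.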
Generally you can find the degradation curve in their datasheet, and yes, 80% after 25 years sounds possible. IIRC they fade faster with time, so it might be 50% after 50, but ... that's still pretty good.
I don't see anything in the NREL data that suggests panel degradation accelerates. Most of the panels in their data have higher initial degradation, then less in subsequent years. This LG panel seems to be getting stronger as it ages: https://www.nrel.gov/docs/fy22osti/81172.pdf#page=28
>Looks like a 7 year return on investment after subsidies, 10 years without.
Price of electricity more than tripled in some EU countries in the last few years. Prices going up and OP reducing consumption will work towards reducing ROI period.
That is technically true, but why is no one asking how and why his electricity bill is $223 at minimum - implied to be higher, since the bill was only reduced by that amount.
Also, “subsidies” are really just a quaint term for “made the working and lower middle class that can’t pay for such exorbitant luxuries pay for it through taxes and inflation”.
I’m always a bit confused if seemingly otherwise smart people simply don’t understand something so basic, or if they simply just ignore and don’t want to acknowledge it as if that changes the fact.
Guess who else gets subsidies; someone who buys goods out of the truck of a car … wow, such a great deal.
Would anyone like to subsidize my next restaurant visit by paying for my entree? No? But making someone unknown working class person pay for my $25,000 solar install through inflation is ok?
How about we all just pay for the things we want instead of making others pay for it? It's immoral and evil, whether you call it subsidies or something else.
>Would anyone like to subsidize my next restaurant visit by paying for my entree? No? But making someone unknown working class person pay for my $25,000 solar install through inflation is ok?
It's amazing how ignorant people can be to the world around them. Let's assume you live in the US, since the article is dealing with those subsidies.
The food you're eating - c. 20% of US farm income is subsidies [0]
The employees of the restaurant - probably, at least one of them is on a government support program to augment their wages [1]
The car you drove to the restaurant in - the US Federal government subsidized your gas (ever wonder why US gas is cheaper than Canada/EU?) [2]
So maybe instead of spending your time looking down your nose and 'not understanding why people don't understand economics,' why don't you do research on the world around you?
>The car you drove to the restaurant in - the US Federal government subsidized your gas (ever wonder why US gas is cheaper than Canada/EU?)
US gas is cheaper than elsewhere in the developed world because a) the US is self-sufficient in terms of supply and, more importantly, b) US fuel taxes are less.
If you actually read the EESI paper you cited, you see that the "subsidies", whether direct or indirect, are actually varying accounting treatments; basically, the same amount of taxes are collected but over a longer period of time. In any case, even if you were to remove all the direct "subsidies", the paper does not claim that tax revenue would rise more than about $40 billion over about a decade. $4 billion a year is a pittance for a federal government that collected $4.9 trillion in 2022.
Oh fun! I love when people Dunning-Kruger themselves on accounting (I literally just sat up straight in my chair!)
Let's go through your argument (would you believe I did both read and understand the document I linked!?), but before we do that, let's look at some more traditional subsidies that O&G gets in America: [0]
The GAO has reported extensively that taxpayers have not received a fair rate of return due to outdated fiscal terms. For example, Federal onshore oil and gas royalty rates are consistently lower than on State-issued leases and Federal offshore leases (see Tables 1 and 2); in fact, onshore royalty rates have never been raised. Likewise, bonding levels have not been raised for 60 years, and minimum bids and rents have been the same for over 30 years. If a lease is not sold competitively at auction, for two years it can be sold non-competitively for a modest administrative fee, with no bonus bid required. These noncompetitive leases are frequently less diligently developed than competitively issued leases. From 2013 to 2019, average revenues from competitive leases were nearly three times greater than revenues from noncompetitive leases.
Underpriced use of public land sure sounds like a subsidy to me!
Ok back to your comment, let's cherry-pick some arguments you made then get into accounting.
>US gas is cheaper than elsewhere in the developed world because a) the US is self-sufficient in terms of supply
Oh, I was unaware that the cheap gas phenomenon started in 2008 when we started approaching energy independence. Thanks Obama, I guess.
>b) US fuel taxes are less.
You're getting dangerously close to agreeing with me on the subsidy point, but I know we won't agree on the politics of pricing externalities, so I'll just move on.
> the paper does not claim that tax revenue would rise more than about $40 billion over about a decade. $4 billion a year is a pittance for a federal government that collected $4.9 trillion in 2022.
It's an amazing logical fallacy to say "one number is smaller than another unrelated number, so the smaller number is unimportant," but even ignoring that, its still a subsidy and that's my entire point. Subsidies big and small are everywhere and this is one of them. I nowhere made an argument that O&G subsidies are going to bankrupt the US, just wanted to make OP aware of the fact that their gas is subsidized.
OK, now on to my favorite topic: why GAAP and cashflow accounting are different and why that actually matters, especially in CAPEX-driven balance sheet businesses.
1) The Intangible Drilling Costs Deduction - You are 100% wrong here. Depreciation and Amortization schedules exist for a reason, it's not just made up to keep EY busy footing 3-statement models. Let's run with this hypothetical: a business looks to build a well when prices are $100/barrel. In that first year of pumping, they successfully discover that the well is wet and they pay way less tax than they otherwise would because they got to amortize everything all at once. Now in year 2, that wet well is still producing but oil prices fall and it no longer makes sense to keep pumping. So now they have a known wet well (a balance sheet asset that they can restart at any time) and all the retained earnings from year 1 that the government never gets to claw back.
Compare this to a world without this subsidy where those expenses are amortized on expected useful life of the well. In this case, not only does the driller have incentive to keep producing even if prices fall, they absorb some of the pricing risk that the US government currently takes on.
If the US government intentionally absorbing pricing risk (arguably free insurance for O&G companies) is not a subsidy to you, then again, we just disagree.
2) Percentage Depletion - In contrast, percentage depletion allows firms to deduct a set percentage from their taxable income. Because percentage depletion is not based on capital costs, total deductions can exceed capital costs. << Enough said
3) Foreign tax - Instead of claiming royalty payments as deductions, oil and gas companies are able to treat them as fully deductible foreign income tax. << Again, maybe you're not following the language here, but this is not "basically, the same amount of taxes are collected but over a longer period of time" it's "less taxes vs. other industries" (if you actually don't follow, tax paid to foreign governments isn't treated as a normal expense, it's usually covered in treaty agreements and is treated as tax already paid)
So yeah, not only are these real honest to goodness subsidies, they amount to billions of dollars a year!
>Underpriced use of public land sure sounds like a subsidy to me!
I have no problem with raising lease rates for Federal land to market rates.
>>US gas is cheaper than elsewhere in the developed world because a) the US is self-sufficient in terms of supply
>Oh, I was unaware that the cheap gas phenomenon started in 2008 when we started approaching energy independence. Thanks Obama, I guess.
Oh, come now. It is a fact that, both historically and today, part (not all, but part) of the reason why US gas prices are lower than in the rest of the developed world is because the US is relatively self-sufficient. (And before you bring up Canada, Canada significantly lacks domestic refining capability, as well as ability to deliver its own gas to the eastern half of the country.)
2008 to now isn't the first time the US reached energy independence; the US had this status into the 1960s, and if it really needed to it could have always reached this, especially when including Canadian supply. The Gulf War was fought to maintain oil supply to Europe, not to the US.
>>b) US fuel taxes are less.
>You're getting dangerously close to agreeing with me on the subsidy point, but I know we won't agree on the politics of pricing externalities, so I'll just move on.
You and I both know that when people here and on Reddit claim that "the US subsidizes gas and that's why it's so cheap", 99% of the time it's meant to convey the claim "US gas companies get zillions in handouts from the government" (in the sort of bags with dollar signs that Mayor Quimby receives his bribes in), as opposed to "gas is taxed less in the US than elsewhere" (much less "US gas isn't appropriately pricing in externalities"), and 99% of the time that's the message that's taken away by the reader.
We indeed would not agree on the politics of pricing externalities (more precisely, whether such counts as "subsidies"). Your statement, however, implies that other countries' gas taxes are higher because they are more appropriately pricing said externalities. We both know that Canada or Belgium or Portugal's gas taxes are not higher than in the US because their governments have duly, nobly, and wisely calculated the impact of climate change and have set the tax rates accordingly. (Maybe Norway.) Said taxes are higher because their governments believe they are acceptable to the public, and are spent accordingly as part of general funds as opposed to being all (or even part) sent to a "global warming lockbox", or somesuch.
>> the paper does not claim that tax revenue would rise more than about $40 billion over about a decade. $4 billion a year is a pittance for a federal government that collected $4.9 trillion in 2022.
>It's an amazing logical fallacy to say "one number is smaller than another unrelated number, so the smaller number is unimportant," but even ignoring that, its still a subsidy and that's my entire point.
First, both the degree and kind matter. Unless you rush to correct everyone who says that public schools/toll-less highways/police services are "free" with "Ackshually, they aren't free", you also agree.
Second, EIA says (<https://www.eia.gov/tools/faqs/faq.php?id=23&t=10>) that in 2021 135 billion gallons of gas were consumed in the US. We'll simplistically say that every cent of the $4 billion a year in "subsidies", which the EESI is presumably citing as a worst-case figure, can be assigned to gas. So each gallon is being "subsidized" by about $0.34. Not nothing (and, again, this is a worst-case figure), but relatively small versus the massive swing we saw in 2022 in the price per gallon (<https://www.cnn.com/2022/12/29/energy/oil-gas-prices-2022/in...>). More importantly, said amount is absolutely not the explanation for the difference in price per gallon/liter between the US and other developed countries, either.
EDIT to above: I, of course, shamefully erred in basic arithmetic. 135 billion gallons of gas and $4 billion in "subsidies" means ~$0.03 per gallon, not $0.34. That only reinforces my point by a factor of ten, of course, but still quite embarrassing.
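The corrected arithmetic, using the figures quoted above:

```python
subsidy_total = 4e9   # ~$4B/year in "subsidies" (worst-case figure from the paper)
gallons = 135e9       # 2021 US gasoline consumption, per EIA
per_gallon = subsidy_total / gallons
print(round(per_gallon, 3))  # ~$0.03 per gallon, not $0.34
```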
>I have no problem with raising lease rates for Federal land to market rates.
I don't care what your opinion is. Again, just pointing out that it is a subsidy.
>Oh, come now. It is a fact that, both historically and today, part (not all, but part) of the reason why US gas prices are lower than in the rest of the developed world is because the US is relatively self-sufficient.
That's simply not true! 2008 was the beginning of the trend toward energy self sufficiency; please look at the data [0]. The US was far from self-sufficient for most of the 20th Century.
>You and I both know that when people here and on Reddit claim that "the US subsidizes gas and that's why it's so cheap", 99% of the time it's meant to convey the claim "US gas companies get zillions in handouts from the government" (in the sort of bags with dollar signs that Mayor Quimby receives his bribes in), as opposed to "gas is taxed less in the US than elsewhere" (much less "US gas isn't appropriately pricing in externalities"), and 99% of the time that's the message that's taken away by the reader.
This is hard to follow, but we do not agree there. I am just making the point that subsidies are poorly understood by you and the people on Reddit, yet no one seems to want to learn more about them.
>Your statement, however, implies that other countries' gas taxes are higher because they are more appropriately pricing said externalities. We both know that Canada or Belgium or Portugal's gas taxes are not higher than in the US because their governments have duly, nobly, and wisely calculated the impact of climate change and have set the tax rates accordingly. (Maybe Norway.) Said taxes are higher because their governments believe they are acceptable to the public, and are spent accordingly as part of general funds as opposed to being all (or even part) sent to a "global warming lockbox", or somesuch.
Again, pricing externalities does not work that way. It's not a "pay to solve the problems" thing, it's a "make the true cost apparent to the consumer" thing. Politicians often use the funds to solve the problem because that's a popular thing to do, but it's usually not the best use of funds anyway (receipts and expenditures, in general, should not be tied because then you're just randomly skewing markets rather than rationally governing).
>First, both the degree and kind matter.
Agreed, but again, what does the US Federal government receipts have to do with how much these subsidies skew the market?
>Second, EIA says (<https://www.eia.gov/tools/faqs/faq.php?id=23&t=10>) that in 2021 135 billion gallons of gas were consumed in the US. We'll simplistically say that every cent of the $4 billion a year in "subsidies", which the EESI is presumably citing as a worst-case figure, can be assigned to gas. So each gallon is being "subsidized" by about $0.34. Not nothing (and, again, this is a worst-case figure), but relatively small versus the massive swing we saw in 2022 in the price per gallon (<https://www.cnn.com/2022/12/29/energy/oil-gas-prices-2022/in...>). More importantly, said amount is absolutely not the explanation for the difference in price per gallon/liter between the US and other developed countries, either.
This is the most wildly incorrect part of your comment. Subsidies skew supply/demand curves, both of which have slopes. If it were as easy as you make it seem, we'd all be Economics PhDs.
The government does not give Exxon $0.34 per gallon it sells, it gives them incentives to do things they wouldn't otherwise do, and given supply and demand dynamics that has complex effects on the market.
And again, circling back here, all I'm trying to say is that the US meaningfully subsidizes energy, which they do. I honestly think it's a good thing! It's been the economic engine of growth for the US. I don't think we should stop! (notice how you assumed I had an agenda just by trying to state facts, maybe examine that a little...)
If you go up to my top comment, you'll see the GP was trying to imply that it's somehow unfair to subsidize solar and I was pointing out that they were ignoring all the other subsidies that we've gotten used to/take for granted.
>That's simply not true! 2008 was the beginning of the trend toward energy self sufficiency; please look at the data [0]. The US was far from self-sufficient for most of the 20th Century.
Oh, good grief. The second sentence of your cite is "Up to the early 1950s, the United States produced most of the energy it consumed."
Let me repeat:
* The US was self-sufficient in oil until the 1960s.
* The US could always have been self-sufficient, or reasonably close to it, especially with Canadian supply. It would have been difficult/very expensive at times, but it could have been done.
* This is not true of the rest of the developed world. The Gulf War was fought to maintain supply to Europe, not the US (although the US benefited from the overall lower global prices as a result).
* Since 2008 or so fracking has brought about a second period of US self-sufficiency without most of the difficulties that would have been necessary in the second half of the 20th century.
>This is hard to follow
You know exactly what I meant.
You wrote:
>the US Federal government subsidized your gas (ever wonder why US gas is cheaper than Canada/EU?)
Even if you weren't trying to insinuate with the word "subsidies" that the US wasn't actually giving out huge handouts to oil companies, that's what 99% of readers would take away from your statement.
>Again, pricing externalities does not work that way. It's not a "pay to solve the problems" thing, it's a "make the true cost apparent to the consumer" thing.
"Pricing externalities" implies that there is some sort of closer relationship in other countries between the price of gas and its "actual" cost (in terms of long-term environmental impact), including solutions for same. As I said, we both know that this is almost never the case.
As for affecting behavior in the here and now, higher gas taxes in, say, rural France or Ireland does not mean that French or Irish farmers are less willing to use gas-fueled trucks to carry produce to markets, because they have no alternative (at least right now; maybe this will change in the future with EV trucks). It means that they pay more per gallon to do so than their American counterparts. Period.
>This is the most wildly incorrect part of your comment. Subsidies skew supply/demand curves, both of which have slopes.
I would never deny that subsidies skew behavior. My point was that said "subsidies" were, even in the EESI paper you initially cited, relatively small compared to price per gallon or the overall petroleum market's size and behavior. This was true with the $0.34/gallon figure I initially erroneously calculated, and is certainly true for the correct ~$0.03/gallon figure (again, a "worst case scenario") I provided later! Feel free to tell people about how the US government distorts the gas market at three cents a gallon. I fear that the scale of their reaction will disappoint you.
>And again, circling back here, all I'm trying to say is that the US meaningfully subsidizes energy, which they do. I honestly think it's a good thing! It's been the economic engine of growth for the US. I don't think we should stop! (notice how you assumed I had an agenda just by trying to state facts, maybe examine that a little...)
This is you rapidly beating a retreat.
You began your first reply to me with
>Oh fun! I love when people Dunning-Kruger themselves on accounting (I literally just sat up straight in my chair!)
Basically, you saw the chance to unleash your superior accounting skills. Nothing wrong with that; I've certainly enjoyed doing so many times for things I know more about than other people.
You then said
>Let's go through your argument (would you believe I did both read and understand the document I linked!?),
Based on your missing the second sentence of the EIA document you cited above, the answer remains "No, I do not believe that you actually read the EESI paper before citing it the first time".
You are, of course, by now realizing that although you know more about accounting than me, all you have done is to "prove" that US gas is "subsidized" by the munificent sum of three cents per gallon. I thank you for doing so.
> Oh, good grief. The second sentence of your cite is "Up to the early 1950s, the United States produced most of the energy in consumed." Let me repeat: The US was self-sufficient in oil until the 1960s.
Good grief to me?!?! You're saying that America making enough oil for the 70mn registered cars in 1960 [0] is evidence that it could make enough oil for the 225mn cars registered in 1999? [1] Don't even get me started on miles driven!
Aramco began Saudi nationalization in 1950, and you claim that the US chose to up its reliance on them during that period? You have a very interesting understanding of O&G, my friend.
>You know exactly what I meant.
Lol. Ok. I guess I do, then.
>Even if you weren't trying to insinuate with the word "subsides" that the US wasn't actually giving out huge handouts to oil companies, that's what 99% of readers would take away from your statement.
Lol. Ok. I guess that's true, then.
>"Pricing externalities" implies that there is some sort of closer relationship in other countries between the price of gas and its "actual" cost (in terms of long-term environmental impact), including solutions for same.
Again, my entire point is you have no idea what you're talking about here. All that consumption taxes do is disincentivize consumption. Per my last comment, often to make the taxes more popular, proceeds are used to "address" the underlying problem, but that's theater/bad governance that ignores the fungibility of tax revenue.
>As for affecting behavior in the here and now, higher gas taxes in, say, rural France or Ireland does not mean that French or Irish farmers are less willing to use gas-fueled trucks to carry produce to markets, because they have no alternative (at least right now; maybe this will change in the future with EV trucks). It means that they pay more per gallon to do so than their American counterparts. Period.
Boy I love a good binary. You don't think higher fuel costs encourage them to sell to more local distributors with lower transportation costs? I mean, we ship roses on airplanes from Ecuador en masse because of cheap fuel! [2]
>Feel free to tell people about how the US government distorts the gas market at three cents a gallon. I fear that the scale of their reaction will disappoint you.
Again, it's amazing to me that you think that these subsidies are direct consumer-facing price supports and I don't know how to explain more clearly that they are not.
>This is you rapidly beating a retreat.
What? Where's the contradiction? I don't care about your politics, I just want you to understand the situation better.
>You began your first reply to me with
Oh fun! I love when people Dunning-Kruger themselves on accounting (I literally just sat up straight in my chair!)
Yeah, because you made false claims about accounting? Are we just entering the "who is going to win" mode? Because then, by all means you've won if that's important to you, but please don't let it cloud your ability to incorporate new information (even if that new information goes against claims you made earlier. That's OK! That's learning!)
>Based on your missing the second sentence of the EIA document you cited above, the answer remains "No, I do not believe that you actually read the EESI paper before citing it the first time". You are, of course, by now realizing that although you know more about accounting than me, all you have done is to "prove" that US gas is "subsidized" by the munificent sum of three cents per gallon. I thank you for doing so.
This is tough to follow too, but again, small brain over here. I guess once I see the light about how (in your opinion) small subsidies aren't subsidies and subsidies to suppliers are best understood in consumer-facing terms, then I'll understand.
BTW, if corn is subsidized by like $5bn/year in the US, what's that per kernel? I bet a really small number!
> I’m always a bit confused if seemingly otherwise smart people simply don’t understand something so basic
Are you surprised, truly? All right, I’ll play along.
We have a progressive tax system. So the majority of the costs are borne by the wealthy and the well-off, rather than the working class. Some of the well-to-do got that way by grit and hard work; others by luck; and at least a few by fraud and worse. They’re the winners of a somewhat-arbitrary game, and I don’t see anything wrong with tweaking the rules in pursuit of a collective good, like wider solar panel deployment.
Meanwhile! We know that most industries have quite dramatic learning-by-doing and returns to scale. And yet in this fallen world transaction costs and imperfect information can prevent new technologies from getting enough scale practice to become economically viable. So temporary subsidies in the early stage have a good chance of bringing costs down for everyone.
I’m very glad to be the first to introduce you to these arguments, if I am. You don’t have to agree: I left my mind-control goggles in my other coat. But please, don’t play dumb.
Sort of. The literal tax bracket for earned income is progressive. But due to lower tax rates for certain other types of income, and the ability for the ultra rich to utilize the tax code to the fullest, the net effect is that the actual overall tax rates are progressive at the bottom end, and regressive at the top end.
Unless you walk to that restaurant, it will absolutely be subsidized. 6 percent of all state and local spending is on roads. And that's not counting any negative externalities, which subsidies should be reducing (like, for example, a solar installation subsidy), not increasing.
> Would anyone like to subsidize my next restaurant visit by paying for my entree? No? But making someone unknown working class person pay for my $25,000 solar install through inflation is ok?
Yes, it obviously is ok? Your next restaurant meal doesn't have a broad impact on the world we live in, whereas the deployment of solar panels does.
Perhaps instead of being angry that people are using taxes that come from the working class, instead you should ask why the wealthy aren't paying their share of the taxes?
And nobody pays for anything with inflation. That's not how inflation works.
> instead you should ask why the wealthy aren't paying their share of the taxes?
And what is the right share that the wealthy should pay, and how wealthy should one be to be paying that amount? I'm assuming it's some level of wealth above yours.
About half of the US doesn't pay any federal income tax, and a little over a quarter pays no net federal taxes at all, including payroll taxes. The labor force participation rate hit its peak in the 90s and has been steadily decreasing since then, with the last reading at 62.1%. Perhaps you should ask why two-fifths of adults aren't doing their share of the labor?
I say this without any snark or sarcasm: I welcome your analysis on the matter, if you have thoughts to add. It's inconceivable to me where this belief comes from, that everything will be so much better if only the wealthy paid their "fair share" of taxes.
Otherwise, I ask that you keep empty comments out of the forum.
Well, given that the top 1% in the US owns over 43% of all wealth (numbers as of 2011), and who by definition are getting a bigger benefit from all the infrastructure and protections of the system than everyone else, not seeing how a 38.8% contribution of total tax income is anything but a distortion favoring the wealthy.
Particularly since you carefully ignore that income is typically small fraction of the wealth of the 1%, most of which is capital and which largely escapes taxation.
Capital gains tax is a thing, and in most cases that capital is taxed multiple times through its "lifecycle". The reason capital gains are taxed at a lower rate is because we want to encourage capital investment. It's what makes things. Food. Jobs.
You're also following a flawed premise (many people do this) that wealth is a finite pool, or a zero-sum game.
It isn't.
There's no upper bound to wealth creation. It's not a limited pool. It's unlimited.
Wealth is created when a person, or group of people, create a product or service that other people desire. That's it. Anyone can do it, and successful people do - through a combination of work, opportunity identification and luck.
The ideology I see behind "wealth inequality" in the U.S. typically boils down to some derivative of envy.
The solution to "wealth inequality" isn't taxation - that only has the power to destroy.
The solution is individual, different for each person, their life goals and what sorts of things are important to them, but rests in creation and innovation.
That's what's created wealth through human history, and it's what will continue to do so, if we allow it.
Taxing others and giving it away is lots of fun until you run out of people to tax, and you've managed to disincentivize and destroy what wealth remains.
Take a look at Prodrazvyorstka [1] to see how that works out.
It's the "top 1 percent of taxpayers". So the people who paid the most taxes, paid the most taxes. They also made 20% of the total income. Or roughly 2 trillion dollars.
So if you managed to make a lot of money but also managed to avoid paying taxes on that money, you would not be included in "the top 1 percent of taxpayers".
The bottom 50% of taxpayers only made 11.1% of the income.
And that's the problem with all of this, there are ways to slice the data to try and gloss over the very real problem of wealth inequality we have right now.
Solar won't keep you online through a power outage. The inverter is required by law to cut power from solar panels when there is no power from the grid, so as to avoid feeding electricity back into the grid lines (which technicians may be working on!) - this is known as anti-islanding protection. If you want to be online through a grid outage you need a battery array.
From a circuit perspective, it seems trivial to open a switch back to the grid to prevent grid-feed, while still feeding the internal circuit that is the house. Can you help me understand why this is hard?
The problem is that it's not a DC connection to the grid.
Not only do you have to match voltage on re-connection, but you have to match frequency AND phase.
It's "doable", but doing so reliably without a stable reference can be challenging. That's why the DC requirement is there. They want some form of reliable power to well-tested electronics to match phase and frequency when the grid comes back up.
A heavily loaded AC line will drift in voltage and frequency stability (say when your AC kicks on and browns out your solar panels).
I've read that before, and I have a further question if you could indulge me
What if, when the grid goes out the inverter keeps powering the house (and disconnects from the grid).
Then when the grid comes back on, kill the output of inverter (which results in outage to the house), reconnect to grid, then power up the inverter again, which skirts all the sync issues.
The house will see a few second outage... but it will have been "on" for the entire grid outage, which I think is much preferable.
The overall problem is more or less solved, it just adds equipment to the installation, both to make the power usable and also to ensure that it is isolated from the grid.
It's not hard. There is a type of inverter called a "hybrid inverter". I have one that can keep power off-grid or on-grid, or even on-grid in import-only mode (so it imports any needed power without exporting). It can even work off-grid without a battery (from solar alone), but that is not recommended unless you know your load is much smaller than the solar output; even a small lead-acid battery can mitigate this issue.
The way the inverter works is that when the grid goes off, it automatically switches the power export off and keeps the load running from solar and/or battery in isolated mode.
It's doable, but it requires a bunch of extra expensive equipment. Most people that want to accomplish that goal just install a battery array instead, since you're spending the money and it also gets you power coverage when the sun isn't shining. But the point stands that the typical grid-tied solar system just turns off when the grid goes down.
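The disconnect/reconnect sequence proposed a few comments up can be sketched as a toy state machine. This is purely illustrative (my own names and states, not any vendor's actual control logic):

```python
# Toy sketch of a hybrid inverter's islanding behavior as described in
# this thread: island on grid loss, briefly drop output on grid return
# to skirt the phase/frequency sync problem, then reconnect.
def next_state(state: str, grid_up: bool) -> str:
    if state == "GRID_TIED" and not grid_up:
        return "ISLANDED"    # open the grid relay, keep powering the house
    if state == "ISLANDED" and grid_up:
        return "RESYNC"      # briefly kill house output before reconnecting
    if state == "RESYNC":
        return "GRID_TIED"   # relay closed, back to normal operation
    return state

state = "GRID_TIED"
for grid_up in [False, False, True, True]:
    state = next_state(state, grid_up)
    print(state)   # ISLANDED, ISLANDED, RESYNC, GRID_TIED
```

The house sees only the few-second gap during RESYNC rather than the whole outage, which is the trade-off described above.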
No, as I haven’t lost power since installation. However, during a power outage I can configure the SMA inverter to detach from my breaker box and operate a single 15 A outlet.
Why would I? I’m on NEM 2.0. The grid is my “battery”. Every kWh I generate that isn’t used is sent back to the grid. In exchange I get an IOU for the equivalent $/kWh price during that time. This is the retail $/kWh, which includes transmission costs.
Assuming you have batteries, you discharge and charge them on a daily cycle, not a yearly cycle, to make the most money/have the most benefit. There are predictable time of day based troughs and peaks in the price that you can take advantage of.
It's a lot more efficient to sell back to the power company. I'd be more comfortable with having a dual fuel generator for apocalyptic situations vs a bunch of batteries.
Some (all?) electric companies in CA do net metering, where you can sell them your excess electricity at any time and storage essentially becomes their problem.
While you wait decades for home battery storage technology to improve, you can make a water battery in the meantime with a few things from the hardware store.
You have to be joking, right? That's 55 gallons of water as a battery that requires 5.5 hours to charge and stores 3.9 Wh of energy. Not enough to even power a 10W LED lamp for half an hour. A 10,000 mAh power bank is 37 Wh.
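The 3.9 Wh figure checks out as gravitational potential energy. A quick sketch (the ~7 m of head is my assumption to back out the quoted number; the comment doesn't state one):

```python
# Energy stored by raising 55 US gallons of water through ~7 m of head.
mass_kg = 55 * 3.785   # 55 US gallons of water ≈ 208 kg
g = 9.81               # m/s^2
head_m = 7.0           # assumed lift height (roughly a two-story house)

energy_j = mass_kg * g * head_m
energy_wh = energy_j / 3600
print(f"{energy_wh:.1f} Wh")   # ~4 Wh — trivially small
```

Gravity batteries at household scale really are that bad: you'd need thousands of tons of water, or hundreds of meters of head, to match even a small chemical battery.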
Having 21 panels requires a rectangle 17ft by 24ft (each panel is 5.4ft by 3.25ft). That's far from an estate: if it's a two story home with a flat roof you're talking 816sqft compared to a US median of 2,300sqft.
1. Most roofs aren't flat.
2. For pitched roofs, you're only going to want to install on the south or west facing sides. So only some percentage of your roof area is ideal.
A pitched roof and installing on only half doesn't change it that much. If your roof is 45deg then you're talking about a 24x24 building. If that's two stories this is 1152sqft, still less than half the size of the median US house.
There are roof designs where even on a large house 21 panels won't fit, but looking around my neighborhood most houses could easily do that many.
If my math is right, 21 panels is 38.2 m^2 or 411 ft^2. That's not a huge amount of roof space. Depends on the roof pitch, but I think it could cover just the south facing sides of a roof of a 2000 ft^2 home.
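The two estimates in this subthread differ slightly because they assume different panel sizes (~17.6 ft² vs ~19.6 ft² per panel); a quick check with the 5.4 ft × 3.25 ft figure from the earlier comment:

```python
# Total roof area for 21 panels at the dimensions quoted upthread.
panels = 21
panel_area_ft2 = 5.4 * 3.25          # ~17.55 ft² per panel
total_ft2 = panels * panel_area_ft2  # ~369 ft²
total_m2 = total_ft2 * 0.092903     # ~34 m²
print(f"{total_ft2:.0f} ft² ({total_m2:.1f} m²)")
```

Either way, the conclusion holds: a few hundred square feet of south/west-facing roof is well within reach of a typical single-family home.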
I live in a much hotter and much more humid area than the author does, and my bills are not even remotely close to these. I do run my AC through the summer, probably more than most
~$135 for 1190 kWh (charged at ~$0.114/kWh)
~$10.50 tax
~$15.50 other fees/riders, which cannot be recouped with panels
OP is somehow paying ~25c/kWh, though I have to believe they are counting the connection fees etc. in their estimate; based on national averages they would be an outlier in most states. https://www.eia.gov/electricity/state/
I don't have A/C, do have net metering and do have solar panels, so I intentionally picked the plan with the highest costs during the day in the summers and I pay 24c/kWh only in peak hours during the summer.
Author may be a high energy user and on a tiered monthly-usage plan rather than a TOU plan; if that is the case, then the solar panels may be saving energy billed at the highest marginal rate, which might be 25c/kWh. My previous house had in-ceiling electrical resistive heating only and I accidentally got a 4-figure electric bill the first January I lived there because one thermostat was broken. IIRC everything after the first NkWh (where N was a crazy high number) was billed at around 25c.
California rates (not sure the OP is in California, but still) are just high. Lowest Off-Peak rate is $0.22/kWh in my area, and On-Peak (4-9pm) can be as high as $0.62/kWh.
That's wild. I live in an extremely hot and humid part of the country, in the Gulf Coast; probably the area with the highest AC demand in the summer. I routinely exceed $300 a month in the hotter months.
Do you have a particularly small or recently-built, well-insulated house?
New England is generally not that hot and humid especially outside cities--except for fairly brief spells. That said, some people do regularly run the AC to keep houses cooler in the summer. His savings are probably more than I pay in electricity in a given year (with oil heat and a small window unit AC in my office if I bother to put it in.)
This is a great point. Avg cost for me last year was less than half, around 11c/kWh
Though, https://www.eia.gov/electricity/state/ makes me wonder how you are getting 25c/kWh prices, as author estimates. Maybe higher price tiers due to usage.
I don't find his methods particularly rigorous. He's just taking total generated kWh and multiplying it by current utility rates and then saying this represents savings.
Does it, though? We don't know how much of the generated power was actually used instead of sold back into the grid, or if this is even a grid tied installation, and what the return sales rate would be.
The only way to actually compare this is to look at the actual bills. What did you pay last year, what did you pay this year? That's _actual_ savings. This is calculated plausible savings. It's not at all clear they're the same thing in this scenario.
Finally, if you're in a position to even say you're _saving_ $233/mo, then how inefficient is your home in the first place? How many people live there to generate this large of a bill? How much of a difference would it have made to make that house more thermally or electrically efficient instead?
If you live in a place that does net metering, it truly just is as simple as generated kWh * utility rate. That's what net metering means. Each kWh you provide via solar offsets the cost of a kWh from the grid.
Additionally, if you live in a place where power is $0.20-$0.30/kWh, then $233/mo is not a particularly large amount of electricity, especially for a single-family house. At $0.30/kWh, that's only ~775kWh of electricity.
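Under 1-for-1 net metering, the arithmetic really is that direct. A quick check of the numbers above (the $0.30/kWh rate is the upper end of the range cited):

```python
# Claimed monthly savings implied by 1-for-1 net metering:
# each generated kWh offsets one retail-priced kWh.
rate = 0.30      # $/kWh retail
savings = 233    # $/month claimed

kwh = savings / rate
print(f"{kwh:.0f} kWh/month offset")   # ~777 kWh
```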
There are a few different forms and a few different rules surrounding it. I can probably assume what he means, but a more direct comparison would obviate any of these factors.
Right, but that's not his total cost, that's his total savings. So, with those factors, which make sense for New England, he's got a 7kW system getting light for about 3 hours a day on average to net that 775kWh and earn the $233 savings in a month. Would that be right?
If the system was totally efficient, then yes, but the 7kW nameplate number is peak efficiency... so most likely it's some averaged number (say, 4-5kW?) that is across a longer period of time.
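That back-and-forth is just a capacity-factor calculation; a sketch with the numbers from this exchange:

```python
# Capacity factor for a 7 kW nameplate system producing ~775 kWh
# in a 30-day month.
nameplate_kw = 7.0
kwh_month = 775
hours = 30 * 24   # hours in the month

capacity_factor = kwh_month / (nameplate_kw * hours)
equiv_sun_hours = kwh_month / nameplate_kw / 30
print(f"{capacity_factor:.0%} capacity factor, "
      f"{equiv_sun_hours:.1f} full-sun-equivalent hours/day")
```

~15% capacity factor and ~3.7 full-sun-equivalent hours/day is plausible for a New England installation, which squares with the "about 3 hours a day" intuition above.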
I am too. My last energy bill was $39 for 174kWh for a house built in 1926. Considering that I pay ~$12 in grid connection fees, the payback period is functionally never.
My parents had panels installed on their house, it is a large single story house in Austin, Texas, which is pretty ideal (more surface area for panels, less volume inside vs two story). No trees in the way either. They haven't had to pay anything for electricity yet. And that is with an EV charging in the garage.
We have panels on our roof, a two story house, also in Austin Texas, not so lucky as my parents. With the area maxed out we still have to pay for about 25% of the electricity we use.
It sounds like they have a good net energy metering plan that is crediting them for their solar being put back to the grid. (So they don't have to pay for charging the electric car in the evening when solar stops working) I wonder if Texas is working to walk that back like California is right now.
I'm in the Austin suburbs on Pedernales Electric Coop (not Austin Energy as in the city proper). In 2022 they switched away from net metering to a scheme that pays just over 50% of the cost for power returned to the grid. So I now pay $0.090337/kWh for power consumed from the grid, and am credited only $0.05377/kWh for power returned to the grid.
This has changed the payback equation quite a bit. I'm now incentivized to do everything I can to use the power I generate rather than returning it to the grid, whether running more A/C during daylight, running laundry during the day, or storing it on my Powerwalls for using off hours.
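The asymmetric tariff described above can be illustrated directly with the quoted PEC rates (the monthly kWh figures in the example are made up):

```python
# PEC-style tariff: pay retail for imports, get credited ~half that
# for exports. Rates are the ones quoted in the parent comment.
buy_rate = 0.090337   # $/kWh consumed from the grid
sell_rate = 0.05377   # $/kWh credited for power returned to the grid

def monthly_bill(kwh_imported: float, kwh_exported: float) -> float:
    return kwh_imported * buy_rate - kwh_exported * sell_rate

# Exporting 500 kWh no longer cancels importing 500 kWh:
print(f"${monthly_bill(500, 500):.2f}")   # ~$18.28 rather than $0
```

That gap is exactly why self-consumption (daytime A/C and laundry, Powerwall storage) now beats exporting.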
Although the payback equation doesn't really make sense anymore, we're still happy with our decision to install solar + powerwalls because we can run the A/C as much as we want without feeling guilty, and also get a solid backup in case of major storm events like the 2021 winter storms. We both work from home and have an infant so keeping the house comfortable is important.
Almost certainly. Net metering is ultimately unsustainable. Realistically we should be paying homeowners wholesale electricity rates + an amount to cover transmission savings + whatever incentive we decide on for the energy they produce (along with time-of-use based consumption charges).
This would likely incentivize storage, and could be extremely valuable with V2G technologies that are incoming.
Agreed, incentivizing storage right now is a great idea because there are many creative ways to get it done. From used EV battery packs to EVs that can run the power both ways, and who knows what else people will think of depending on their location and skills.
In Austin I'd like to see some studies on solar vs. tree coverage. My empirical evidence would suggest that great tree coverage alone keeps the inside livable on 100°F+ summer days. In my house without trees I could barely keep some of the rooms cool.
That makes me wonder if the installation of solar panels on a roof act as "shade" similar to partial tree coverage in some way, since in many cases they add a thin cushion of shaded air between the panel and the roof. I wonder if that provides any meaningful shade or temperature reduction.
Not making an argument either way, I just hadn't really considered if this was a factor until I read your comment.
I would certainly never cut a tree down to get more solar on my roof, even if the math worked out favorably from a cost perspective it just feels gross.
One side benefit of solar for me was reducing cooling requirements of the top floor, which I anticipated but didn't expect it to be this significant. Expectedly but interestingly, installing solar panels didn't make any difference, but starting energy production did. Physics is fascinating.
Interesting to see a post about this from someone else who I'm assuming is in a northern climate (they say New England).
One of my biggest questions about solar is, what happens when it snows? Do you have to get up on the roof and clean them off by hand? Or do they have built-in defrosters?
> One of my biggest questions about solar is, what happens when it snows? Do you have to get up on the roof and clean them off by hand? Or do they have built-in defrosters?
Nothing.
In northern climates rooftops usually have more of a pitch to them so snow doesn't accumulate (because you don't want the roof to collapse). The panels, because they are black, will generally collect enough heat to melt the snow in a day or 2 after a snowstorm.
If you want to clean the panels, you risk damaging them. So the common advice is don't do anything and let the sun do the work.
Northern climates is relative. I live in the north of Sweden and the solar panels here will be completely covered from December to February or similar.
It's true though that the snow will melt or fall off quickly when it's not that cold anymore.
All it takes is a little snow to melt or blow off and the exposed panel will heat and cause the rest of the snow to melt and slide off. The panels are relatively slippery so a little melt at a pitch will clear them in no time.
My partner has panels on her house around Boston. They work well. The snow slides off, usually with a great "wump" as it drops two stories into the yard. It's actually a little startling at first.
They work better in the summer, but provide most of the house's power. They don't help much with the EV she's gotten since, except in the summer. She got an electric water heater as well.
If her water heater is in the basement, when it comes time to replace look at the heat pump water heaters. As a bonus, they dehumidify as they work - which is usually a very good thing for basements!
You don't ever get a few inches of snow on the top of the roof. The point of the aggressive pitch is so the weight of the snow pulls itself off the roof so your roof/home doesn't have to support 5 tons of water weight.
IDK if you live in a place that sees significant snow, if so, just drive around a neighborhood and notice what the snow looks like on the rooftops. Generally none on the top and some on the bottom. That's by design.
While the bottom of the panels aren't doing anything to melt the snow, the top portions are, and that accelerates as the snow melts.
>You don't ever get a few inches of snow on the top of the roof. The point of the aggressive pitch is so the weight of the snow pulls itself off the roof so your roof/home doesn't have to support 5 tons of water weight.
This is definitely not how snow on roofs works. Here is an example of a roof with lots of snow on it:
Roofs in cold climates are designed to hold the weight of snow on top of them.
ETA:
The no snow on top but some on the bottom is because their attic isn't insulated well enough. Ideally you lose no heat through the ceiling into the attic. But without enough insulation you will lose a significant amount and the hotter air will rise to the highest point and melt the peak first. You don't want this because the melted snow can run down your roof and freeze again on the colder parts.
It's a low pitch roof which allows for a lot more buildup.
Also, if you read the article I linked, you'll know that pitch isn't perfect (pun intended), but it does go a long way in preventing snow buildup. A major snowstorm with wind blowing snow in the right direction can thwart a high-pitched roof, but that sort of condition is much rarer.
You don't produce much in the winter anyway, you can brush it off the panels with an extendable broom but it will naturally melt and slide off. Unless you have snow cover all winter you'd likely spend more on the extendable broom than you'd produce from the panels.
My brother-in-law's panels produce just as much electricity at noon in the winter time as they do in the summer time. Winter sun in the Canadian prairies is quite intense, and the panels are more efficient at -20C than they are at +20C.
What the GP is saying is that even with the losses from the lower sun angle and shorter duration of insolation, the efficiency gain from the decreased temperatures is enough to match the summer's output. Heat in the summer reduces a panel's efficiency.
Sure, but the sun is going down at 3pm; Canada gets 1.5 hours of peak sunlight at best in the winter… on a lot of houses the cost of removing the snow is more than what’s gained during the short daily window of direct sunlight… eventually the sun solves the problem for free :)
According to the folks on this YouTube channel that have an installation in a very snowy climate if you have two sided panels they melt any accumulating snow off much quicker.
My issue with New England (I live there) is that only 50 to 60% of days are sunny - we get a lot of overcast skies. That, combined with the lack of available roof space on our Four Square style house, has kept me from seriously considering installing panels.
Take most of the sibling replies here with a grain of salt. In areas where there are large snowfall events (eg Tahoe) the story is very different; panels can and do cause ice dams, dangerous snow slides, and even the panels can get ripped right off the roof.
I've lived in very snowy areas. Snow mostly falls off the roof because of the pitch, or it melts off in the sun. It's pretty much only in the absolute worst storms, and on roofs with low pitches, that you need to get up there and clear them.
Usually if any significant amount of the panel is exposed, it will start to heat up in the sun and defrost the rest. It's a significant reduction in performance, of course, but still worthwhile if utility power is expensive.
I had them when I lived in Pennsylvania. The panels self-cleared in a literal avalanche that destroyed all the bushes where it landed. It sounds like a freight train in your attic when it happens.
Had someone been standing underneath when it happened, they would have been seriously injured.
My system generated 14.36 MWh in 2022, which comes out to roughly $4,308 USD. The ROI is about 5-6 years for the system itself. But a new roof was also in order... so more like 10-12 years. Still a great deal if you can afford to invest for the long term, especially with a 30-year warranty on most of the equipment.
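For readers checking the arithmetic, here is a back-of-envelope sketch of the payback math implied by these numbers. The comment doesn't state the system or roof cost, so `SYSTEM_COST` and `ROOF_COST` are hypothetical values chosen only to be consistent with the quoted 5-6 and 10-12 year figures.

```python
# Back-of-envelope payback check for the numbers quoted above.
ANNUAL_KWH = 14_360      # 14.36 MWh generated in 2022
ANNUAL_VALUE = 4_308     # USD of electricity, as quoted

implied_rate = ANNUAL_VALUE / ANNUAL_KWH          # effective $/kWh
SYSTEM_COST = 24_000     # hypothetical, USD
ROOF_COST = 20_000       # hypothetical, USD

print(f"implied electricity rate: ${implied_rate:.2f}/kWh")
print(f"system-only payback: {SYSTEM_COST / ANNUAL_VALUE:.1f} years")
print(f"system + new roof:   {(SYSTEM_COST + ROOF_COST) / ANNUAL_VALUE:.1f} years")
```

The implied rate works out to about $0.30/kWh, which is what makes the ROI so short.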
Well you need to have the solar removed before you can get your roofing work done. Then re-installed afterwards. Typically roofing and solar work is very different, so two different companies involved - just makes things a little more complicated.
Most solar companies that I've spoken to won't (or will strongly recommend against) installing solar on roofs with less than 10 years of useful life remaining.
That being said, roofs typically have a 30 to 50 year lifespan. Depending on the material, location, etc. Most quality solar panels have a warrantied life of at least 25 years. So if you're like me and had an old roof, that gets replaced first before solar gets installed.
And by the time the solar panels are EOL, the roof will be close to it as well. So they'll both get replaced again. Assuming I'm still in the same house in 30 years time.
My mom tells me she's practically been making money with hers over the last few years.
She had them installed when my parents were still running a GP practice together in a small village. That was only a few years before they retired though, so now half of the building is basically storage space that's barely used and unheated (also no sterilizing equipment in autoclaves, fewer fridges needed, etc). She also has a solar water heater, she cooks with gas, and her house is very well insulated. As a result of all this she ended up producing more electricity than she uses.
However, she's also still on an "old" contract with her energy provider that is quite beneficial towards her because they didn't anticipate this scenario - this will likely change soon when she has to renew it.
This is in the Netherlands, can't speak for other countries.
This was built by a solar company, not the author directly, but the actual parts making up this system are pretty damned cheap. ~$4200 in panels + $2000 inverter plus additional hardware (mounting, cables, fasteners). Then labor costs.
Net output was 10,894 kWh, in line with a 20-year estimate of 193,545 kWh. If we just divide $6,200/193,545 kWh (ignoring additional costs!!) that'd be $0.032/kWh.
One other data point: daily output averaged 29.8 kWh from this 7.56 kW array.
> One other data point: daily output averaged 29.8 kWh from this 7.56 kW array.
This is useful information that is surprisingly hard to find: if I install panels with a nameplate rating of x kW, how many kWh will I generate per day? In Bruenig's case the multiplier is just less than 4x.
The linked page provides a widely available map for "Peak Sun Hours" (defined as average hours per day greater than 1000 W/m^2 of sunlight), but it's not immediately clear how this relates to the multiple. In theory, you could figure out the efficiency of the panels and their area, but this still doesn't account for the actual curve of sunlight intensity.
Bruenig is (I think) in Connecticut, which according to the map has 4.2 Peak Sun Hours, only a little greater than the measured kW-to-kWh multiplier. Is it safe to assume that the Peak Sun Hours figure is always slightly more than the multiplier? Or is there a better map somewhere that shows this conversion directly?
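One way to frame the question: divide the measured multiplier by the map's Peak Sun Hours to get an implied system derate, in the spirit of PVWatts-style loss factors. The derate interpretation is my assumption, not something stated on the linked page.

```python
# Implied derate between nameplate rating and measured daily output.
NAMEPLATE_KW = 7.56      # array size from the article
AVG_DAILY_KWH = 29.8     # measured average daily output
PEAK_SUN_HOURS = 4.2     # from the cited map (Connecticut)

multiplier = AVG_DAILY_KWH / NAMEPLATE_KW        # kWh/day per nameplate kW
implied_derate = multiplier / PEAK_SUN_HOURS     # losses vs. ideal PSH

print(f"multiplier:     {multiplier:.2f}")       # just under 4x, as observed
print(f"implied derate: {implied_derate:.2f}")
```

If that derate generalizes, Peak Sun Hours should indeed always sit slightly above the multiplier, since real systems lose some percentage to inverter, wiring, soiling and temperature effects.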
The author linked the parts. The link to the Longi LR4-60HPH-360M panels shows a $200 price (x 21). I had to google for the inverter, a Solar Edge SE6000H-USS3, to find a price ($2000 x 1).
> I ended up getting a 7.56 kW solar panel system consisting of 21 LONGi panels[1] and 1 Solar Edge inverter[2].
(At the very end of the article, some hint is given as to where the author is located.)
Those numbers - for USA, and for 2022 - seem very high.
Author obtained 7.56 kW of panels with a single phase inverter for a list price of USD$26k - which was reduced via subsidies down to USD$19k actual.
Here in Australia, 24 months ago I had 12kW of panels with a 10kW three-phase inverter installed for an on-paper price of AUD$13k (USD$9k), similarly subsidised with government incentives down to AUD$8k (USD$5k).
I'm about 200km north-west of Sydney ('regional' by any definition), and installation involved 3 guys for one full day (presumably they were a multi-day loop to customers in the area, as we're 3h from their base).
In any case, ignoring post-subsidy delta, why are the list prices so savagely different? I know all these panels and inverters come out of China, but I'd have expected stateside pricing to be much more competitive a year ago than rural Straya pricing two years ago.
In terms of finances/payback - the author seems to overlook one (mostly positive) factor, which is the behavioural change that comes with 'free' power. It's summer here now, which means regular 40C+ temperatures, so the air conditioners (3 x 500W max draw) go on daily, automatically, from 10:00 to 17:00. This obviously improves QoL but also has some non-obvious benefits - it extends the shelf life of food, takes a significant load off the fridge, etc.
Basically, a raw before/after comparison often won't be as compelling, because it assumes no changes to how you consume power. I've got $0.33 per kWh import / $0.07 export, so the trade-off is fairly simple to calculate - along with the installation of an $80 timer switch for the 1800W HWS (so it only draws power during the middle of the day), I calculate my effective payback is on the order of 3-4 years.
Anything in the US that would reduce the profit of established players in a given industry is heavily taxed and very expensive. This is the result of decades and billions of dollars in lobbying (i.e. a legal way to give politicians money so they pass laws that help you make more money).
Your panels might be 3x more productive, being in sunny Australia. Where does the author live in the US? I either missed it or he didn't say. Even then, the layout of his roof could have a big effect.
Another point is that certain solar panel imports are tariffed as much as 30% in the US in order to protect the (non-existent?) domestic solar panel manufacturing industry. But it appears Biden temporarily suspended the tariff about 6 months ago. Unclear whether it added to his purchase price.
Author mentions New England, which I gather is a large-ish area, but let's say about 44 degrees north.
I'm 33 degrees south, so obviously 'better' in terms of solar potential. We probably also have fewer cloudy days (though much hotter days during summer, I'm not sure how that compares out over the year).
My panels are not optimally installed, at least not optimally for capture - but definitely optimally for installation cost. I have a gable-roof shed facing roughly NW, with 22 panels on the NW side, and 11 on the SE side. Roof slope is about 9 degrees.
I just start to flat-line my 10kW inverter, fed from 12kW of panels, from about 3 weeks on the winter side of each equinox - so there's definitely room to improve the configuration there. FYI in summer I generate on average ~70kWh a day, and in winter about 30.
It sounds like author put these panels on in late 2021 --which should post-date the worst of the COVID-induced shipping/trade bumps, and pre-date the Ukraine / Russia induced effects.
EDIT: You mentioned "... tariffed as much as 30% in the US ..." -- I agree that kind of figure could be explained by duties, tariffs, etc, but the difference here is actually 3x (26k vs 9k) for a system that's only about 60% the capacity of mine. A like-for-like system, extrapolating from TFA's numbers, would be US$35k, or nearly 4x difference.
>Realizing that the solar company was in quite the pickle insofar as repossessing the panels would require costly legal actions on top of paying people to take out the system, I just waited on the payment. They would contact me periodically asking me to make payment and offering to help me get approved for a new loan. But I just ignored all of that.
Seems pretty unethical tbh. The solar company delivered, but this guy refused to pay? Sure, he paid later, but not without blackmailing them into a $5k discount.
Author definitely took advantage of the situation to force a larger discount on the company, which I also consider a dick move. However, the situation was complicated. The solar company failed to install within the original timeline, so when they did, the original approval on the loan was no longer valid. It seems the interest rate went up during that time as well so the author’s new loan would have been more expensive to him/her.
The solar company messed up by not getting a new confirmation of loan approval prior to installation, and the author then leveraged that into a fat discount.
They took 7 months to install the panels. I'm sure that during all that time the OP wondered whether they had forgotten about him, called them to ask for an installation date, etc.
To head off any dispute about undue wait times, the company wrote a contract guaranteeing installation within 6 months of signing. They didn't comply, and the loan lapsed. Now the company needed to negotiate a new contract with the OP or go the legal route. There is nothing unethical about this.
The company had everything to lose by going the legal route, with all the long procedures and money spent for a payback that might never come. They are probably grateful that the OP just asked for a discount instead.
I don't consider it ethical to strong-arm someone from a newly advantageous position over something you previously agreed upon and they reasonably delivered (7 months instead of 6 - we're not talking about multiple years here).
A similar ethical framework would allow someone in a marriage to say "I quit my job, I can get alimony from you, better do as I say or I'm filing for divorce" - it's all legal. Still unethical.
OP wasn't hatching some devious plan here. The 6-month term seems standard from the company's side. They shot themselves in the foot and the loan got voided. Without a new loan, OP would have been paying the company with nothing in place.
> (7 months instead of 6, we're not talking about multiple years later here)
Maybe this is a disagreement based on past experiences, but the most infuriating thing about working with home contractors is chasing them around so they finish the job. I get it if this were a business-to-business transaction, where year-long delays are routine, but as a person I find having a half-finished ceiling to be a big disruption in my daily life.
To be fair, OP didn't say anything about having to fix their roof before installing the solar panels, but having to put up with business delays in your home projects isn't the most pleasant experience.
If we go to the ethical side of things: in this case the company installed something without doing proper diligence on their part. This could have been solved if they had presented a new contract at OP's home when they went to install the panels. Paying without a loan would have been foolish on OP's side, and the same goes for paying at the more expensive 2022 rates. In fact, OP would probably have been forced to do so if they had asked him before installing the panels. That would have been unethical from the company's side: delaying the installation 6+ months and then proposing a more expensive loan before installing.
Let's not skip over the company trying to swindle OP into paying without a proper contract by scaring him. Some of us may be used to these kinds of threats; that doesn't mean they are ethical. They should have been upfront about the problem from the beginning, not bullying people with scary letters.
> Let's not skip over the company trying to swindle OP into paying without a proper contract by scaring him. Some of us may be used to these kinds of threats; that doesn't mean they are ethical. They should have been upfront about the problem from the beginning, not bullying people with scary letters.
If I understand correctly those only came after OP refused to pay their bill.
Expand the argument to the economy as a whole, and it's just the Prisoner's Dilemma. You'll get screwed by someone else someday, the solar company will screw someone other than you etc. I'll certainly agree it would be better if everyone cooperated. Whether it's unethical depends on your ethical framework.
From my read, the solar company delivered late, causing the original loan to fall through and requiring the homeowner to refinance at a higher rate. Since that delay cost money it does seem fair that the company should bear the cost.
Ok, so adding in another $200 a month of lost opportunity (say for 7 months), we're now at $2,500... So I guess these details are adding up - still not $5k, but the case for a discount is definitely growing.
Interest rates increased by 4 basis points from Dec 2021 to Dec 2022.[1] So a 3% loan then would be a 7% loan now or a 5% loan then would be a 9% loan now. The starting rate would depend on your credit score or other factors.
1) A basis point is 0.01%. 2) We should count the rate increase from the contracted delivery date to actual delivery date, i.e. one month instead of 12 months.
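To put the rate change in dollar terms, here is a standard loan amortization calculation. The $20,000 principal and 20-year term are hypothetical; only the 3%-then vs 7%-now rates come from the comment above.

```python
# Monthly payment on a fully amortizing loan: P*r / (1 - (1+r)^-n),
# where r is the monthly rate and n the number of monthly payments.
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

PRINCIPAL, YEARS = 20_000, 20   # hypothetical solar loan
for rate in (0.03, 0.07):
    print(f"{rate:.0%}: ${monthly_payment(PRINCIPAL, rate, YEARS):.2f}/month")
```

On those assumptions, the jump from 3% to 7% adds roughly $44 to the monthly payment (about $111 vs $155), which gives a sense of how much a lapsed loan approval could cost the buyer.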
Based on his explanation, the company has at least some level of fault:
> my non-payment would not be a default because default is defined by a breach of contract and no such contract existed. Indeed, if anyone breached the contract, it was the solar company by not installing in the six-month period.
but they did install it in the 7th month, and we see no real hardship applied to the author, perhaps save for $200 per month lost opportunity for savings (less interest applied, would have to think that through more) ...
I have a large 20kw system for an old leaky farmhouse and it provides me about $6k in energy and credits a year.
I like thinking about it as a near guaranteed, medium return long term investment that's good for the environment. Each year is roughly 13% returns and I should get that for at least 20 years or so.
Don't kid yourself that residential electricity prices have tremendous price jumps every year. Nationally, prices increased by about 2 cents per kWh from 2009 to 2021. That is a 1.47% CAGR; CPI inflation was 1.55% per year over the same period.
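The 1.47% figure checks out if you assume endpoint prices consistent with the quoted 2-cent rise. The ~11.5¢ and ~13.7¢ endpoints below are my assumption, not stated in the comment.

```python
# Compound annual growth rate of residential electricity prices.
p_2009 = 0.115   # $/kWh, assumed starting price
p_2021 = 0.137   # $/kWh, assumed ending price (+2 cents)
years = 2021 - 2009

cagr = (p_2021 / p_2009) ** (1 / years) - 1
print(f"CAGR: {cagr:.2%}")   # ≈ 1.47%
```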
We aren’t currently grid tied (we may eventually put in a line to reduce sun-drought generator set use, but at about 100 hours a year it’s kinda a tough sell)
Our little solar plant operates a small coffee plantation, 4 houses, 8 cabins, and several outbuildings , parks, and community spaces.
When we do tie in we probably won’t sell power or do net metering. Instead we focus on robotic or mechanised industry where we can generate saleable goods using the excess power. (Computing power, 3d printing, dry ice manufacturing, foamed concrete block manufacturing, etc)
We also manage our utility loads, like bulk water production and large-scale refrigeration, so that they occur during excess production periods, and they are normally suspended while we are operating on stored power. I'm thinking of implementing an anhydrous ammonia plant and distribution system so that we can store more energy usable for air conditioning and refrigeration, which are some of our largest loads. It's getting hard to find ways to store energy that work out cheaper than just buying more batteries, though.
I’m really stoked about LTO (lithium-titanate) batteries coming down in price eventually (or similar tech not yet released), because they last 30+ years / 30,000 cycles… and they can take a charge at a ridiculous rate, which solves other challenges.
I have a pretty ideally angled roof near Boston and was considering a solar setup. However as far as I can tell we won't get paid for excess generation in any given month, so since we don't really use much electricity it seems this is not a good investment. Are there any common updates possible to take advantage of the excess power? Before ETH switched to POS I thought about mining crypto... otherwise we could switch to electric heat but that is a huge outlay.
> otherwise we could switch to electric heat but that is a huge outlay.
You can walk into any major retailer, buy a space heater, and plug it into any outlet in your home. Keep in mind that your gas furnace is controlled by a thermostat, so if you lower the set-point by, say, 1 degree and then locate the coldest parts of your home, or the areas you want to prioritize (e.g. living rooms/bedrooms), you can target space heaters at those locations and trade solar electricity for gas savings.
In terms of ease of deployment, electric resistance heat is the cheapest and easiest. You're assuming it must be whole-home or nothing, which isn't the case: you can turn down the whole-home system and spot-heat as needed.
Sure but I don't want a bunch of random space heaters around the house. Regardless of whether they are safe or not it's a waste of space if there is an existing central heating system.
That's one of the options I've been considering. Another is to switch to heat pumps instead of natural gas burners, which has the benefit of being drop-in for the existing furnace.
That's a pretty good return. I paid around $20k for my system (parts and installation) and get ~$1700 a year out of it (I live pretty far north). After tax credits and other state incentives I break even in 7 years. Output is warrantied by the manufacturer for 25 years (there's a schedule, but should still be 85% at 25 years IIRC). It's a great time to get solar if you own your home.
"per-kWh price of electricity was around 25 cents"
This sounds like the total price of electricity, including distribution. However my understanding of net-metering means that you have to pay distribution fees to put your power on the grid, so you don't get back the full 25 cents, only around half that (the generation portion).
If the author of this post is on here, could you please clarify?
It depends on your state and the laws surrounding net metering. Some will let you put it back into the grid at the full per-kwh price, even if some of that is for distribution. California used to, but they're rolling this back (I think old installs are grandfathered in).
Does anyone have any detailed, up-to-date resources for personal solar use? I'd love to learn more, in particular the cost/output/lifespan progress by time. Last time I checked in my area it seemed that you didn't quite break even, but that was a while ago and my understanding was/is very limited.
In Australia a 6.6kw system with 5kw inverter runs about 3500-4500AUD installed and grid connected.
The rebates in WA are fairly low for exported power (about 5c/kwh) but you can fully offset your usage during the sunny parts of the day running aircon/washing machine.
The price has gone up of late, you are probably looking 4000-5500 depending on inverter quality (that's based on 2023 STCs)
But that said, Western Power's inflexible blanket ban on any feed-in above 5kW (single phase) has had a wonderful side benefit of enabling solar companies to be ultra competitive with the 6.6/5 kW system installs because basically everybody wants the same equipment, allowing much greater standardisation.
Electricity is still dirt cheap in WA (30c/kWh) but that will soon change.
Interestingly, solar has penetrated so far in WA that there's almost no value to the grid operator of putting more daytime solar onto the grid.
Yeah it makes sense that a limitation like that forces competition, along with cheaper import costs from China I imagine not hurting.
30c/kWh is not really cheap by most countries standards, especially when you really need aircon during summer and from this year can't build a new home with gas heating or cooking in WA.
Going to be an interesting few years, I think, as grids start to move away from coal base load en masse.
> if you use a 5 percent discount rate, 20 years of $2,677 of annual electricity bill savings yields a net present value of $33,361.
Yes, but with an 8% discount rate, the PV is $26,283. It's true that the cost of electricity will probably rise over time, as he notes. But current interest rates make this a much less attractive proposition from a financial perspective (that is, assuming you don't care about solar generation for environmental reasons). The homeowner takes on significant risk, and the gain is not all that great. Your panels can be destroyed by hurricane, earthquake, or falling trees, rendering them worthless, for example.
I'm not saying this isn't worth doing, just that the financial calculations seem a bit flawed in today's economic environment.
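Both discount-rate figures in this exchange can be reproduced with the standard present value of a level annuity:

```python
# PV of a level annuity: A * (1 - (1+r)^-n) / r
def annuity_pv(payment: float, rate: float, years: int) -> float:
    return payment * (1 - (1 + rate) ** -years) / rate

SAVINGS, YEARS = 2_677, 20   # annual bill savings and horizon, from the quote
for r in (0.05, 0.08):
    print(f"{r:.0%} discount rate: ${annuity_pv(SAVINGS, r, YEARS):,.0f}")
```

This yields approximately $33,361 at 5% and $26,283 at 8%, matching the figures above, and shows concretely how sensitive the investment case is to the discount rate.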
Good point! I guess I view panels as being especially vulnerable because of their fragility and exterior location.
I've also heard that one way to get booted from your homeowner's insurance is to ever file a claim. I know several people who lost coverage after their first claim (decades after becoming a customer).
Anecdotally, I know plenty of people who've filed claims without repercussions. Rates might go up a bit, though. Your friends might consider contacting their state's insurance regulator.
Anyone built their own battery arrays? I am interested in solar in part to avoid blackouts, but for that I will need batteries, and with my house being 100% electric, Tesla's Powerwall quickly becomes expensive. At first glance, using lead-acid batteries appears straightforward, but I wonder what edge cases and gotchas others have experienced with them, and how much time is needed for maintenance.
So far 10kW gas generator was able to power most of the house's essentials before resistive heater kicks in in HVAC system (two heatpump units, each has a 7.5kw resistive heater as a backup when outside temperature drops too much.) But I would feel better if I did not have to rely on availability of propane/gas and clean roads during periods like 2021 blizzard in Texas.
The DIY part of the solar market, and now most non-Tesla options, have moved towards lithium iron phosphate (LiFePO4) batteries, particularly in rack-mount format. They're denser and way easier to handle than lead-acid.
30 kWh of batteries is about $10k wholesale, and inverters run about $1,500 each, although most of the time you need 2 if you want to supply 240V.
The conversation here seems a bit binary. Maybe I missed a few posts. The promise of solar won't be fully realized until cooperatives de-grid in favor of their own local grids, along with an arbitrage sharing model. That has the potential to manage all power needs (including electric cars). It's a scaling question that sits between full centralization and rugged individualism.
Of course, the tech for this is a real threat to utilities, so it wouldn't surprise me if they went to any length to stop such an effort. The same could be said for any deflationary tech like micro reactors.
In the UK, I've had a battery plus solar for just over a year.
It's a 5 kWp set of panels and a 13 kWh battery. From mid-March to late November, we only drew from the grid once (that was because we had a portable hot tub set up, which took ~28 kWh to heat).
It's raining now, and January and December are shit months for solar, mostly because of the short days. In December we were 45% independent from the grid.
Currently we do not heat or cool with electricity. If that were the case, we'd need 10 kWp plus more storage.
We had solar panels on our last house. Even in the cloudy PNW, we generated enough electricity to cover 100% of our electrical needs. And, the monthly cost of the loan payment was about $30 less than our previous cheapest monthly electrical payment.
Unfortunately our new house doesn't have a roof configuration that allows for solar panels. I wonder if new home architecture and alignment will change to make solar panels work better?
It varies. A lot of solar installers will push the benefits of microinverters or optimisers, because they can make your system cope better with partial shading or panel failure. However, the microinverters or optimisers are expensive, and can easily double the cost of the system. They also have dubious merit, because if you design your system properly in the first place it will cope with partial shading and panel failure quite happily anyway. However, selling the units makes sense for the installer, because it's more money that passes through their hands. Remember, the installers are not necessarily working for your best interests.
You can buy panels off Amazon at about £0.73/W. Add in the various bits that you need to go along with it, and that probably pushes it up to around £1.10/W, if you're doing it yourself.
Australia is even cheaper, I paid around USD 0.6/W for my system (including subsidies, it's closer to USD 1.0/W excluding subsidies). Break-even will be in around 3-4 years.
I noticed (by following the solar sub-reddit) that almost every single US install has either Enphase or Solaredge inverters (which happen to be on the pricier side). I am not sure if that's an indication for the lack of competition but surely this seems like an industry where disruption is possible?
I've just installed solar & batteries (not 2 weeks ago we turned them on!). Be interesting to see how things will unfold and I love reading stories on how people have got on with theirs. It was roughly ~£12,900 installed (in the UK) and we pay 32.596p/kWh & 42.290p/day standing charge...
Will either pay off well or be an expensive lesson
This is interesting to me, I like reading people's experiences with solar panels since I'm waiting for the price to make sense for my own home. But as someone spending about $1,000/year, it's just not there yet. My options seem to be: diy, 20-30 year roi, find more ways to use electricity, or wait.
I’m not sure if you were serious with this comment, but for the sake of other readers I wanted to point out this is a legitimate strategy. Many people pair solar with installing mini splits, EV chargers, or some other home upgrade that adds value at the cost of a higher electric bill. This can make the solar investment make sense where it previously didn’t.
Residential solar changes the calculus for some appliances. You can go with all electric appliances and have their use covered by your solar. Same with EVs, your solar can offset your charging costs.
That's unfortunate. I got my 8.25KW system + Powerwall installed through Tesla for ~$26K in 2021. Just got my 26% tax credit this year. Also added a Tesla Model 3 in 2021 to complete the Tesla Ecosystem. Truly a dream come true.
My loan payments are less than $140/mo and I plan to completely pay off the system in the next year or so. Living off the sun is great.
I know someone who did a 5,000W system for around $15,000 in 2018.
Batteries/Powerwalls let the solar work when the power goes out. (Grid-tied DC-to-AC inverters sync to the 60 Hz signal from the grid. Her system has no disconnect from the grid, so for safety reasons it shuts down when the power is out.)
Oh sure, and I wanted a battery of some kind to keep things running in an outage. I just mean I don’t know why they bid a Powerwall specifically instead of some other battery system.
Any sort of battery system that lets you keep running in an outage is going to be pretty expensive, unfortunately.
But at that point your question isn't "what would be the payback period if I got solar" but "how much is it worth to me to have power when the grid is down". And so not surprisingly many people get solar but don't install a battery.
(Systems that provide a small amount of best-effort when-the-sun-is-shining emergency power when the grid is down seem like they should be a sweet spot here, because they can be implemented very efficiently. Unfortunately a combination of consumers not wanting it, the NEC not prioritizing it, and the shift to microinverters means that instead of adding ~$200 like it did when I got solar several years ago it's now more like $7k)
I get that it’s not gonna pay for itself any time soon, and I’m on board with paying for the privilege of keeping things running when the grid is down. But 50 grand? Come on.
Another benefit is you can use stored energy in your battery during utility peak hours. 4-9pm in California costs [a lot] more than any other time of day.
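A rough sketch of that peak-shifting value, with a hypothetical battery size and TOU rates (neither number is from the comment above):

```python
# Value of shifting stored solar into the 4-9pm peak window.
BATTERY_USABLE_KWH = 13.5   # hypothetical Powerwall-class battery
PEAK_RATE = 0.50            # $/kWh 4-9pm, hypothetical TOU tariff
OFFPEAK_RATE = 0.25         # $/kWh otherwise, hypothetical

daily = BATTERY_USABLE_KWH * (PEAK_RATE - OFFPEAK_RATE)
print(f"~${daily:.2f}/day, ~${daily * 365:,.0f}/year if cycled daily")
```

On these made-up rates the arbitrage is worth a few dollars a day; the real value depends entirely on your utility's peak/off-peak spread and how fully you can cycle the battery.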
There is an associated video where he goes through the article and answers many of the questions I see in the comments here: https://youtu.be/YWwJ-DsVT4A
Great summary of this man’s positive experiences with installing solar. Can anyone answer how having solar affixed to your roof affects or complicates getting a new roof on your home every 10 years?
if you don't have net metering you should consume as much as you can yourself, cook when the sun is out, run your washer / dryer / vacuum / dishwasher and whatever else you have when you have excess power and minimize during the times the sun is below the horizon. 60% self consumption is doable without too much effort, 70 is work and if you get to 80%+ please let me know how you are doing that.
Also: the big trick to running off renewables is that conservation comes first, solar second, and finally the grid as a fallback for the remainder.
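For reference, a self-consumption percentage like the 60% above is just the fraction of generated energy used on-site, hour by hour. A minimal sketch with made-up hourly profiles:

```python
# Self-consumption = energy used on-site / energy generated.
# Hourly kWh profiles below are illustrative, not real data.
generation = [0]*6 + [0.2, 1.0, 2.5, 3.5, 4.0, 4.2,
                      4.1, 3.8, 3.0, 2.0, 1.0, 0.3] + [0]*6
load = [0.4]*6 + [0.8, 1.2, 1.0, 0.9, 1.5, 2.0,
                  1.8, 1.0, 0.9, 1.0, 1.5, 2.5,
                  2.0, 1.5, 1.0, 0.8, 0.5, 0.4]

used_on_site = sum(min(g, l) for g, l in zip(generation, load))
self_consumption = used_on_site / sum(generation)
print(f"self-consumption: {self_consumption:.0%}")
```

Shifting flexible loads (washer, dishwasher, HWS) into the midday hours raises the `min(g, l)` terms, which is exactly what the advice above amounts to.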
Slightly off topic, but is there a service that would quote out a new roof + solar install? We're waiting on doing solar until we need a new roof as ours is ~9-10 years old, and not near EOL.
You could consider getting a solar roof! I have no particular expertise here, but a guy down the street from me got one, and he loves it so far (~1.5 years into it).
If you're interested in learning a lot about doing home/campervan installations, I highly suggest watching/reading everything that Will Prowse has published.
My installation has separate circuit breakers for the various components (grid, home, backup that is on even if grid is off, panels), plus some extra protection against overcurrent in the inverter and batteries.
It's surprising that it's possible to produce electricity at home for a lower cost than the wholesale price.
I can't think of any other commodity for which this is possible. You couldn't make your own steel, bread or printer paper at home and beat manufacturing at scale.
Because the cost of grid electricity includes the comfort of having it 24/7. Solar is damn cheap, but you can rely on it perhaps 10% of the time (0 hours on very cloudy days, up to 4-5 hours a day in the summer). There are ways to do some arbitrage even without net metering (charge your car during the day, install batteries), but solar alone is by far not enough to leave the grid.
Largely it is because you don't have to pay for delivery. It is very similar to having a well. If the water is available at your house already it is most likely cheaper to just pump it versus paying someone to supply water to your house.
The short answer is delivery costs. And that's what makes this current era of solar power so exciting, in my opinion! The market is practically begging us for microgrids.
With panels on your house the grid is still providing you a key service, providing you with any power you need in excess of what you produce, and sinking any extra for you. But with net metering you only pay in proportion to your net usage, effectively compensating you at retail rates for the power you supply to the grid.
It isn't: other customers are paying, via the net metering subsidy, for solar owners' ability to have electricity whenever (and in whatever quantity) they want it. If you divide a system's cost by its lifetime expected generation, you get 9-13 cents per kWh depending on the panel subsidy. That sounds great in California but isn't far off from retail prices in some states. And the actual grid "wholesale" price is usually far less than that, except at peak times or in exceptional situations. You pay retail to still have power in those situations; with net metering, everyone else pays you that price for something you are not providing.
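That "divide cost by lifetime generation" figure is easy to reproduce. A sketch with assumed round numbers (a $25k system, 10,000 kWh/year, 0.5%/year degradation, 25-year life; only the 30% federal tax credit is a real policy number):

```python
# Back-of-envelope cost per kWh for a home solar system.
# All inputs except the 30% federal ITC are illustrative assumptions.
SYSTEM_COST = 25_000        # $ installed, before incentives
TAX_CREDIT = 0.30           # federal investment tax credit
ANNUAL_KWH = 10_000         # year-1 production
DEGRADATION = 0.005         # 0.5% output loss per year
YEARS = 25

lifetime_kwh = sum(ANNUAL_KWH * (1 - DEGRADATION) ** y for y in range(YEARS))

gross = SYSTEM_COST / lifetime_kwh
net = SYSTEM_COST * (1 - TAX_CREDIT) / lifetime_kwh

print(f"lifetime generation: {lifetime_kwh:,.0f} kWh")
print(f"cost per kWh (no subsidy):   ${gross:.3f}")
print(f"cost per kWh (with subsidy): ${net:.3f}")
```

With these assumptions you land in the 7-11 cents/kWh range depending on whether you count the subsidy, consistent with the 9-13 cent range above once you vary the system cost.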
PG&E will find a way to keep their profit margin.
https://www.npr.org/2022/12/15/1142927418/california-plans-t...