Hacker News
Mining Ethereum on M1 Mac GPU (yifangu.com)
153 points by gyf304 on Feb 27, 2021 | 125 comments



> Is it worth it?

> Um. Not really. At current Ethereum prices (2021-02-26), it generates $0.14 of profit per day. It’s still a profit, but very minuscule.

The real question is whether it makes more money than it costs in electricity. Apple claims it uses 39 W max. That adds up to 0.936 kWh of electricity per day, which we can approximate as 1 kWh. Electricity prices range from ~4¢ to ~20¢ per kWh in the USA, which means that depending on where you live, you'll net anywhere from a 6¢ loss to a 10¢ profit per day.
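The arithmetic can be sketched in a few lines (a rough model: the $0.14/day revenue is from the article, the 39 W figure from Apple's spec, and capital cost is ignored):

```python
# Rough daily P&L for mining on an M1 at wall-socket power.
# Inputs are assumptions from the thread, not measurements.

def daily_profit(revenue_usd, watts, price_per_kwh):
    """Net profit per day after electricity."""
    kwh_per_day = watts * 24 / 1000   # 39 W -> 0.936 kWh/day
    return revenue_usd - kwh_per_day * price_per_kwh

cheap = daily_profit(0.14, 39, 0.04)   # best-case US rate
pricey = daily_profit(0.14, 39, 0.20)  # worst-case US rate
print(f"{cheap:+.3f}, {pricey:+.3f}")  # roughly +0.103, -0.047
```

Without the 1 kWh rounding the range works out to about -4.7¢ to +10.3¢ per day, close to the -6¢ to +10¢ quoted.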

So yeah, this thing can be profitable. But the capital cost required to buy the M1 in the first place is going to be really hard to pay off.


>The real question is if it makes more money than it costs in electricity.

Profit implies that costs are taken into account. If I plug 2 MH/s, 30 W, $0.10/kWh into a mining calculator, I get a profit of 11 cents per day against 18 cents of revenue.
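As a sanity check on those calculator numbers (a sketch assuming the same 30 W, $0.10/kWh, and $0.18/day revenue inputs):

```python
# Re-deriving the calculator's profit figure from its own inputs.
watts, price = 30, 0.10
revenue = 0.18                            # USD per day, from the calculator
electricity = watts * 24 / 1000 * price   # 0.72 kWh/day -> $0.072/day
profit = revenue - electricity            # ~$0.108/day, the quoted 11 cents
print(round(profit, 3))                   # 0.108
```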


Well, the word profit implies they are, but many people use the word incorrectly. It would inspire a little more confidence if the author had posted the gross figure or his electric bill.


Unless a post explicitly mentions all operating costs, I would not assume that.

There are too many different versions of the word profit, and in this case it looks like gross instead of net/EBITDA.


All Ethereum mining calculators are dead wrong.

In their assumptions they all tell you "2 ether per block," but Ethereum has been averaging 5-7 ether per block for the past year, and hit something like 13 ether per block just the other day.

So just multiply by 3, or by 7, up to you.


Is electricity really that cheap in the USA? The cheapest price I've found for my city here in Germany is around €0.24/kWh. And that's already including a new-customer discount and cashback. Not even guaranteed green energy...

€0.24 is currently ~$0.29. That's $0.29 for my cheapest price against your most expensive price of $0.20.


Where I live (southeastern US) it's $0.089/kWh. There's additional taxes on it that make it functionally closer to $0.10. But yeah, fairly cheap.

Which is a good thing, because we have electric heat and it's been a very cold month. I'm expecting a bill of about $350.


It is also a bad thing, because it makes it reasonable to live in houses that are not properly insulated and just run the air heater on full blast all winter, like I did when I was living in the US. In Denmark the kWh price is around $0.50 (the vast majority of that is taxes), and the energy efficiency of the house is a key data point in any house purchase and is always listed next to the price.


In many US locations we use gas heat instead of electric. It’s far more efficient to burn fuel for heat instead of burning fuel for electricity and then converting that electricity to heat.

US building codes are also increasingly strict about insulation and energy efficiency. Premium home builders go even further with insulation. In some locations, new houses are so well insulated and sealed that extra fresh air heat recovery units are necessary to exchange the indoor air.


Building codes are strict, but unfortunately building codes aren't often followed closely enough.

I recently noticed in the house I just purchased that a lot of cold air was coming in behind my kitchen cabinets and appliances. I pulled out my range and noticed that cold air was rushing through the gap between the drywall and the floor. Fortunately a can of expandable foam was an easy fix.

This house is only 2 years old. Corners were cut when building which makes this home not as efficient as it could be. I'm hoping to make gradual improvements.

Even though my electricity rate is only around $0.12/kwh and my gas bill isn't high, I'd still like to do what I can both for the environment and to save money.


Electricity may be a bad example. Generating electricity is very efficient; is gas really that much more efficient? What's better about gas heat is that it takes load off the (possibly) overloaded electric grid, taking pressure off the need to expand it.


This is only true if you don't enforce building codes that ensure everything is built with good insulation.

In Canada (Quebec) we pay $0.06/kWh, but the government gives incentives for energy efficiency in the form of tax deductions.


Varies greatly based on state or even county. Hundreds or maybe thousands of utility districts each with a unique set of regulations and resources.


Is there any state or nation wide search for good utility prices? We've got Check24[1] in Germany which basically is a no brainer to find good price for stuff like this.

[1] https://www.check24.de/strom/


The approach to utility regulations varies a good bit by state, and things are always changing so I doubt there’s a good lookup beyond the data the federal government collects.

https://www.eia.gov/electricity/state/

Although, the states that are “energy deregulated” will often have a marketplace type website where you can compare suppliers. In those states, the utility commission sets rates for your local utility to run the transmission and local distribution wires, but allows you to choose which supplier generates the power on the grid on your behalf.

Really, the best way to figure out utility rates in the US is to check with that state’s utility commission or directly with the utility that provides physical infrastructure in a particular area.


Not really sure. Most places I've lived we haven't had a ton of options on pricing so haven't dug around.


We pay about $0.10 in British Columbia, Canada. There is a $0.20 daily base rate and over a certain amount it steps up to $0.14. It’s 95% hydro electric so it’s green too.

https://app.bchydro.com/accounts-billing/rates-energy-use/el...


Hydro is really not that green: salmon are losing their spawning runs, for example in your area. It's also not renewable, because dams fill with sediment and lose capacity over time.


Hydro has a big upfront cost because you submerge a lot of land that was usually covered with trees. It also adds a bunch of mercury to the water. However, over the lifespan of the dam this is not a real concern, especially compared to gas/oil. Furthermore, if you account for the energy storage (batteries) that wind/solar need because they depend on the weather, the environmental cost analysis becomes much closer once you include all the manufacturing externalities. Hydro, on the other hand, can increase and reduce production by drawing from its reservoir, which makes it at least as green as the other renewables we have right now.


> dams get filled with sediment and lose capacity over time

I'm struggling to believe this. Can you name one dam on Earth that has lost over 10% capacity due to sediment?


> Because the source of the Indus River is glacial meltwater from the Himalayas, the river carries huge amounts of sediment, with an annual suspended sediment load of 200 million tons.[15] Live storage capacity of Tarbela reservoir had declined more than 33.5 per cent to 6.434 million acre feet (MAF) against its original capacity of 9.679 MAF because of sedimentation over the past 38 years.

https://en.wikipedia.org/wiki/Tarbela_Dam#Lifespan


Dams all over the world capture sediment. This not only reduces their capacity, it deprives downstream areas of sediment needed for natural habitat, to fertilise wetlands and to buffer against oceanic incursion.

The Sanmenxia Dam in China lost 17% of its capacity in the first 18 months of its operation due to sediment accumulation [1]

Countless other dams have already been dismantled due to sediment build up, and many more will follow, considering that around 19000 large dams worldwide are over 50 years old [2]

[1] https://en.wikipedia.org/wiki/Sanmenxia_Dam [2] https://e360.yale.edu/features/water-warning-the-looming-thr...


Honestly this sounds mostly like poor engineering planning and analysis. If you look at the dams with a lot of sediment, they're mostly in Southeast Asia and Africa, while Europe/North America have little of it.


Asia's rivers have by far the highest sediment levels of any continent [1]; meanwhile the US has quietly destroyed over 1000 dysfunctional dams in the postwar period [2]

[1] https://agupubs.onlinelibrary.wiley.com/doi/epdf/10.1029/WR0...

[2] https://www.pbs.org/wgbh/nova/article/dam-removals/


> I'm struggling to believe this.

It seems you didn't even do the most basic cursory search on the issue.

Here's literally the first hit on a google search for "dam sediment".

https://www.hydroreview.com/world-regions/dealing-with-sedim...

And think about it: what's a dam reservoir other than a big old sedimentation basin?

https://en.wikipedia.org/wiki/Sediment_basin

Also, think about it: hydro power is not the only use for a dam. They are also typically used for things such as flood control, water supply, and energy storage. All of those use cases rely on the dam's ability to hold water (and its sediment) and lower the discharge, if not totally block it.


Depending on the impounded area they can (in some cases) be worse than fossil fuels for greenhouse gases.

https://www.scientificamerican.com/podcast/episode/not-all-h...


I wish that went into more detail! I have no clue how it could be worse than fossil fuels. I get that trees are a store of carbon and if they get submerged they will break down and release it all, but surely over the lifespan of the dam that’s significantly lower than burning coal non-stop? Wouldn’t it be equivalent to a forest fire clearing the area?

I always assumed they would clear the area of trees first anyways. Why let the wood rot when you can cut it down and use it?


> I always assumed they would clear the area of trees first anyways. Why let the wood rot when you can cut it down and use it?

The spaces are often so vast that chopping down the wood could delay a project by impractical lengths of time.

> I get that trees are a store of carbon and if they get submerged they will break down and release it all, but surely over the lifespan of the dam that’s significantly lower than burning coal non-stop? Wouldn’t it be equivalent to a forest fire clearing the area?

My understanding is that microbial breakdown means a lot more carbon is released as methane vs. CO2 from fires — methane being a much more potent greenhouse gas.

Here’s one related paper I found

https://academic.oup.com/bioscience/article/66/11/949/275427...

(I work for a hydro developer but I’m not an expert).


While it’s not perfect it’s significantly better than burning coal.

Geographically we can’t rely on solar; I’m not sure why we don’t have more wind turbines!


see: https://www.eia.gov/electricity/monthly/epm_table_grapher.ph...

There are several states where the retail rates are lower than 10 cent/kwh.


Thanks for the link. Insane to see that some of them even got cheaper in 2020.


US households use 3x the electricity vs German households. They get wholesale prices.


Germany has super high prices for electricity.


See https://de.statista.com/statistik/daten/studie/13020/umfrage... for a comparison of prices around the world. Germany is at the top.


> stat

Paywalled. Any openly accessible source for this data?



Here in Lithuania it's ~€0.18/kWh for peak rates. That includes all connection and admin fees, which in the US is usually billed as a separate line item. Most of our electricity is imported from Sweden. Our media keeps saying we have the highest electricity prices in the EU, but I guess not :-)


Germany, Denmark and Belgium are all bordering €0.30/kWh. The cheapest is Bulgaria at around €0.10. So Lithuanian pricing seems quite average. Maybe they mean highest relative to average income?


The majority of the consumer energy prices in Europe go to taxes. Here in the Netherlands, average rates are around €0.22/kWh, which breaks down into €0.12 energy tax, €0.04 VAT and €0.06 electricity.


In the California Bay Area with PG&E it is $0.25/$0.31/$0.39 per kWh in tiers:

The first 300 kWh @ $0.25, the next 1000 kWh @ $0.31, and above that you pay $0.39.


USD $0.05/kWh here. Not in the US. My bill comes in around $25 a month.


In Quebec I pay CAD 0.0938/kWh.


Yes.

1) The US has good hydro resources (in the northwest, where it rains all the time and there are lots of mountains, so you have both lots of water and altitude change) as well as nuclear (especially in the long-industrialized northeast) that has (mostly) long been paid for and that we just keep running for cheap. Nuclear is about 19% of our electricity, and plants built decades ago are cheap to keep running.

2) Lots of cheap wind, especially in the middle of the continent where all the maize and stuff grows.

3) Way better solar resource especially in the south and southwest where it's about twice as much sun as in Germany and with much, MUCH less seasonal impact. See: http://www.sunisthefuture.net/wp-content/uploads/2013/03/Sun...

4) Like Germany (but even more so), LOTS of coal. Over a quarter of the world's recoverable coal reserves. However, we don't even use that much of it anymore; only the cheapest kind (usually not the crappy lignite stuff that Germany often burns) still goes to power generation. We use less than half the coal for electricity that we did even 10 years ago. Because...

5) We have a ridiculous. A RIDICULOUS amount of cheap gas right now. Fracking has opened up the spigot. Texas, Pennsylvania, Louisiana, Oklahoma, Ohio, etc., etc. (And Canada produces a bunch that we end up using because they don't have anyone else to sell it to and it's produced a long way from Canada's cities.) We make so much natural gas that we export it to Mexico, and are even starting to liquefy it and export it to Europe.

So yeah. While Germany has long relied heavily on imported fossil fuels, the shutdown of nuclear plants hasn't helped. Think of how expensive it is to send natural gas all the way from wherever Russia produces it (Siberia) to Germany, plus the markup Putin asks for. (And that's roughly a third of Germany's natural gas, IIRC.)

And I think the way wind/solar subsidies work in Germany is different than in the US. In Germany, the solar subsidy is paid for by all the ratepayers; that alone is about 4.6 US cents per kWh on top of your electricity. In the US, the solar subsidy is funded as a tax credit, which means the cost isn't directly borne by consumers but is paid for or borrowed by the federal government. (Also, having twice as much sun means the same solar panel makes roughly twice as much electricity...) This has the effect of solar actually reducing the cost of electricity in the US while increasing the cost of electricity for the consumer in Germany... which is probably why German companies have been more reluctant to switch to electric cars than you'd think they would have been.

So yeah. The US has a huge amount of land, so really good solar/wind resources (and decent geothermal while we're at it), a pretty solid hydro and nuclear power situation, and massive amounts of fossil fuel. The US is now actually a net energy exporter. That's why electricity is so cheap.

You can check out good statistics for US energy production and consumption here: https://www.eia.gov/electricity/monthly/epm_table_grapher.ph...


Also your network infrastructure is much cheaper than a lot of places around the world. This is a large part of the price of electricity where I come from (which is far more expensive).


Yes, Germany has very expensive energy because of their transition away from reliable energy sources onto renewables.


Are you implying that Germany's electric grid is unreliable?

If so, a citation would be good because my sources say Germany's electric supply has been stable.



Neither of these articles says Germany's electricity supply is unreliable. There's no mention of power outages, or equipment damage, or anything like the power crisis in Texas.

Instead, your citations say renewables are producing more energy than can be used.

That's a good problem. It can be solved by turning some of the supply off, or adding long-distance transmission to move the excess power to where it's needed.

But there's no hint of renewables causing instability or unreliability or power outages.


My source (a guy who works on the Czech power grid) says the Germans kind of abuse the Czech power grid in order to move their wind electricity from the north (wind turbines) to the south (industry). They even had to take measures on the Czech side to push some of that electricity back across the border.


Re the capital cost: assuming a profit of 10¢/day and that an M1 Mac costs about $1000, it'll be about 274 years before you pay that off.


You’re a factor of 10 off. (1000 / 0.1) / 365 ≈ 27.4 years.


Doh! You’re right of course
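For anyone following along, the corrected payback arithmetic (assuming the round numbers used above, $1000 for the machine and $0.10/day net):

```python
# Payback time on the hardware, ignoring resale value and difficulty changes.
capital = 1000.0           # assumed M1 price, USD
profit_per_day = 0.10      # assumed net profit after electricity, USD
days = capital / profit_per_day   # 10,000 days
years = days / 365                # ~27.4 years
print(f"{years:.1f} years")       # 27.4 years
```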


I have a friend who's been getting into it. As he tells me prices and how much can be made per day with various video cards versus dedicated mining ASICs, I do quick mental calculations. It seems like all of the options are market-priced at a payback of around 8-10 months. Given the current mining furor, I doubt that consumer-grade equipment will be very profitable in another year. If you already had the hardware for Bitcoin, you're probably cleaning up on Ethereum, but to start now? You're really only going to make money once the next currency catches on. I don't know. Maybe this is only a fiction I tell myself to avoid feelings of missing out on something fun and interesting, and potentially profitable.


Presumably you already have an M1 mac and mine only while you have no use for it.


You can pick up a 50 W solar panel pretty cheap nowadays so compared to the cost of the laptop that's nothing. Hopefully its battery will charge up enough on the extra 11 W to last during the night.


You must mean watt-hours or amp-hours.


I must


And you don't take into account the environmental cost.


But precisely as much as you don’t take into account the environmental cost of every other unit of electricity you use.


This is part of why we need a carbon tax


Only problem with this is I can't figure out how to spend all the money generated by the tax. Obviously it should be spent on technology or processes to remove carbon from the atmosphere, but I don't think that's what governments will decide to do. Add a new tax, waste most of it. Or worse, use it to fund societally harmful things. That's what history shows.


Take the money, pile it all up and directly give it back to the citizens, where every citizen gets the same amount of money. Effect: households will have the same amount of money on average, but people will be nudged to substitute their carbon-intensive lifestyle by greener alternatives. (or, in economic terms, there is no income effect, only a substitution effect).

You're also indirectly taking money from big corporations and putting it into the pockets of citizens, which has been proven to be one of the most effective methods of stimulating the economy.


Doesn’t this motivate people to not reduce total carbon use? It would lower the dividend unless the green energy is cheap enough to balance it out.

And if the green energy is that cheap (which actually it seems like it is) then we don’t need the dividend.

Btw, carbon emissions in the US at least typically aren’t done by “corporations” but rather poor land use decisions causing long commutes and things like that. Buying gas from a corporation may cause them to dig it out of the ground, but you’re the one who told them to do it.

The “climate change is caused by 100 corporations” thing is especially wrong since it comes from a misreading of a list of emitters that has things like Saudi Aramco and “the Chinese government” as corporations.


Does it really matter if the money is wasted, so long as it’s not wasted on something that generates more air pollution? The main purpose of a carbon tax is to stop polluters from externalising the cost of their pollution, so that they pollute less because it’s more cost effective not to. The extra revenue is just a bonus.


The money will be used to wage war to protect the petro-dollar.


> Only problem with this is I can't figure out how to spend all the money generated by the tax.

It doesn't matter. Pay debt. Dig holes. Build bridges. Just rebate it to whoever on a per-capita or per-tax-paid basis. Literally do anything with it.

The point is to incentivize conservation, not to raise revenue. You could get the same thing with a regulatory carbon penalty too, but it's simplest to administer as a tax.


Canada returns the carbon tax as a refund. The idea isn’t to take money into the coffers but to make wasteful services more expensive at the point of sale than efficient ones.


Yes, that's why a carbon tax is stupid and "emissions trading" was invented. Under such a system the government sells a limited number of certificates in an initial auction. Later, the biggest energy consumers buy emission rights from the best energy savers.


It still incentivizes the energy savers to invest in saving more, so they can sell more rights, and incentivizes the biggest energy consumers to consume a bit less so they pay less.

I wouldn't call that stupid.


Part of the emissions trading concept is to reduce the number of certificates in circulation over time to increase the energy-saving incentive ("cap and trade"). The government can do this by buying certificates or by reducing the number of free certificates assigned every year. The details are a bit more complicated, but this was implemented over ten years ago in the European Union. There is simply no need for an additional carbon tax if this scheme is maintained well.


Ignoring the horrible externalities is effectively the point of all cryptocurrencies by now.


The standby mode of devices in the US alone consumes 1.5 times the energy of the entire Bitcoin network. At least the Bitcoin network is doing something useful.


That really depends. A lot of people generate too much solar during the day and typically feed it back into the grid for peanuts (especially in regions with high solar uptake, where the wholesale cost of electricity approaches zero in the middle of the day). Maybe more people doing this instead of feeding back into the grid makes some sense in those places.


More importantly, depreciation / amortization of the laptop


The 39W max draw probably assumes the display is on? Will probably draw less in clamshell mining mode.


Tired: carbon clams ingesting solar energy (tiny plants) and outputting pearls

Wired: silicon clams ingesting solar energy and outputting dogecoin


One has high meme potential. One is a bunch of carbon.


carbon removal needs more memes, see $100 meme contest here: https://twitter.com/titojankowski/status/1354124284705181696


> The 39W max draw probably assumes the display is on?

No, this is the number for the Mac Mini, which is the obvious choice if you wanted a machine for maxing out an M1 24/7. (The Air peaks at about 30W, which it can't sustain for long unless it's in a rather cold environment.) So it has nothing to do with the display's power consumption.

Maxing out just the GPU while the rest of the machine isn't doing much would get you well below 39W, though.


Hmm, what if you have the speakers on max volume with pretty much all the other components maxed too? Here are a few I can think of: keyboard backlight, WiFi, Bluetooth, accelerometer, speakers, mic(?), camera, proximity sensor, Touch ID, maxing out the CPU, GPU, ML engine, SSD, and RAM, typing on the keyboard, using the trackpad. Using all of those at once must surely draw more than 39 W; if not, then not using most of them must mean it draws a lot less than 39 W during normal operation.


You could buy the computer to use as a computer and then use it to mine while you’re not using it.


Though capital costs can be ignored if this is something the user already purchased.


No one has mentioned that ETH mining is memory-bandwidth bound. M1 Macs, unfortunately, have slow GPU memory compared to a discrete GPU's GDDR6X.

Here's a great post showing that - https://www.vijaypradeep.com/blog/2017-04-28-ethereums-memor... - GPU mining is only 10% slower than the theoretical memory-bound maximum (if compute was infinitely fast).
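A back-of-envelope sketch of that memory bound, following the linked post's model (each Ethash hash does 64 random 128-byte DAG reads; the bandwidth figures below are approximate assumptions, not measurements):

```python
# Why Ethash is memory-bound: 64 mix rounds * 128-byte DAG pages
# = 8 KiB of random memory traffic per hash.
BYTES_PER_HASH = 64 * 128  # 8192 B

def max_hashrate_mhs(bandwidth_gb_s):
    """Theoretical hashrate ceiling (MH/s) if compute were free."""
    return bandwidth_gb_s * 1e9 / BYTES_PER_HASH / 1e6

print(round(max_hashrate_mhs(68), 1))   # M1 unified memory: ~8.3 MH/s
print(round(max_hashrate_mhs(760), 1))  # RTX 3080 GDDR6X: ~92.8 MH/s
```

The 3080 ceiling lines up with the ~90 MH/s figures reported elsewhere in this thread, while the M1's ~2 MH/s suggests it is nowhere near even its own bandwidth ceiling.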


I suspected that as well. Since the M1's CPU and GPU share memory, this also means you really can't do anything else while mining. You can't even scroll a webpage smoothly.


Wouldn't it be better to mine RandomX using the CPU?


Generally yes. The RandomX dev seems to have some issues getting it to work well on Apple silicon; the xmrig dev has it working, but it seems to cap out at 2.3 kH/s due to not being able to take advantage of JIT or large pages. (At least from what I read of their respective GitHub repos: https://github.com/tevador/RandomX/issues/196 and https://github.com/xmrig/xmrig/issues/1991.)

RandomX mining is better at the moment, but that may be because nobody has developed a GPU miner tailored to the uniqueness of Apple's GPU, if that's even possible.


It's not crazy, but the software is being told it's an Intel GPU, and no optimizations target this GPU's strengths. I could see this becoming more useful if it were coded to Apple's GPU characteristics, but I'm not sure anyone outside of Apple even knows what those are right now, or how to optimize mining software for their GPU.


Cool experiment. But I don't recommend getting into Ethereum mining now, as the network is on the verge of switching to Proof of Stake as part of the migration to Ethereum 2.0.


Meh, it hasn’t switched yet, there’s still mining to be done.

I made ~$100 in the past month (stopped during the cold snap/storm here) on my 3080 mining ETH. Of course, I was getting closer to 90Mh/s so slightly more viable.


Sure, if you already have a mining rig, by all means keep mining!

But I personally wouldn't make a big investment on it if I was only getting started now.


Does this also imply lower energy consumption for running the ethereum network?


Yes! In fact that's one of the goals of the transition.


Well that's great for them.

Maybe (hopefully) we'll see environmental regulations against (expensive PoW) cryptocurrency before theres economically motivated regulation.


While I'm optimistic on PoS and excited about Ethereum adopting it, I think Bitcoin and PoW are being unjustly demonized and the issue is far more complex than what meets the eye.

I found this analysis insightful: https://coinshares.com/insights/beware-of-lazy-research-bitc...

Also this Twitter thread (12 tweets) and all the articles linked inside it: https://twitter.com/yassineARK/status/1360343401627979778


Does proof of stake truly end the GPU arms race?


Yep. You can run a validator node by staking at least 32 ETH (or any amount through a staking pool).

Nodes will still verify transactions and execute smart contracts, of course. But there's no hash lottery to be won for mining blocks.


I wonder if this may cause some correction in hardware stocks, as the market would be flooded with used GPUs.


There are plenty of other cryptocurrencies to mine


There are a few (mostly Bitcoin forks), and you can probably make some profit mining them, but they are not big enough to have a significant impact on the global demand for GPUs.

At this point, most crypto projects with any real traction (with the obvious exception of Bitcoin) are either ERC-20 tokens running on Ethereum, or other PoS chains that claim to be better than Ethereum (e.g. Cardano).

Bitcoin isn't mined with GPUs anymore.


As I said, there are plenty of profitable coins for GPU mining besides Ethereum [0]. Ethereum moving to PoS will not move the needle, people will just move on to something else.

[0]: https://whattomine.com/


The hash rate for all the other projects beside Ethereum is minuscule.

Sure, some of it will move to other currencies, but my guess would be it won't be anywhere near 100% as you're suggesting.


Why is the current hashrate relevant? The market will readjust to changes in hashrate.


It is relevant because I don't think the rest of the projects will end up absorbing all of Ethereum's hashrate. First of all, some of the current miners will sell their hardware and stake ETH instead, becoming validators.

Secondly, I tend to think there'll be a mass extinction event, where most useless projects will eventually die off. You can mine all you want, but if there's no real utility in yet another PoW project (and there isn't), you won't be able to sell the coins at a price that justifies running the operation.


What is bitcoin mined with now?


ASICs like the Ant miner series.


ASICs.


Quick, rough calculations suggest to me that the RTX 3080 is about what you need to turn a profit: based on $200k annual revenue you'd need $600k in hardware. Three years to pay itself off, about the lifetime of the hardware, at my guess. That's if you could even find the hardware, and I didn't include stuff like motherboards/racks/etc.

If it's that hard to make a profit on the RTX 3080 then I doubt the M1 Mac Mini was ever in the running.


You're not taking into account the higher block rewards in the current environment. The base reward is 2 ETH per block, but the gas fees have been averaging near 3 ETH per block. Most of the mining calculators don't take this into account.

To be fair EIP-1559 probably turns on in July. So that's going to cut block rewards by a lot.


Maybe I'm misunderstanding you, but if a 3080 costs $700 and makes $8 a day, with an electricity cost of $0.10/kWh, or roughly $0.60 a day (which is possible in many cities, and can be improved by overclocking the memory and underclocking the core), it's making at least $200 a month. That's an ROI of three and a half months at worst, and then it can be flipped on the used market to nearly recoup the initial investment (or more, if supply issues are still a thing at the time).


Really? A 1080 TI still produces a few dollars a day easily.


It depends on what his assumptions are. Currently a 3080 produces $7.71/day in revenue[1]. If you extrapolate that out for a year you can break-even on your investment in less than a year, even if you pay scalped prices. However, this doesn't factor in difficulty increases, and depending on your forecasts it can vary from never breaking even to breaking even in less than two years.

[1] https://whattomine.com/coins
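A rough break-even sketch under those inputs ($7.71/day revenue; the $700 MSRP, $1500 scalped price, 230 W draw, and $0.10/kWh are assumptions; it deliberately ignores difficulty growth, which is the big caveat):

```python
# 3080 break-even time, ignoring difficulty increases and resale value.
def breakeven_days(card_cost, revenue_per_day, watts, price_per_kwh):
    """Days until the card pays for itself, or inf if it never does."""
    electricity = watts * 24 / 1000 * price_per_kwh  # USD/day
    net = revenue_per_day - electricity
    return card_cost / net if net > 0 else float("inf")

msrp = breakeven_days(700, 7.71, 230, 0.10)      # ~98 days at MSRP
scalped = breakeven_days(1500, 7.71, 230, 0.10)  # ~210 days scalped
```

Even doubling the card price only pushes break-even to around seven months at today's revenue, which is why difficulty forecasts dominate the outcome.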


Can confirm. I'm mining on 2x 1080s and 2x 2070 Supers, and definitely, easily turn a profit after electricity costs. However, profit seems to be dropping rapidly, to the point where I seriously question if it's worth it. The GPUs run at 56 °C / 120 W avg., so it's not too taxing, but it's still putting wear on all the hardware, fans, etc.

Selling the cards (at above MSRP) now and putting that straight into BTC would probably turn a better profit long term, albeit, very much a gamble. The mining is definitely getting harder/less profitable and at some point soon (2021?) ETH is changing from PoW to PoS.


I'm just doing rough numbers here, and I claim to know basically nothing about mining outside of a simple calculator off Google. Hardware-wise (more my knowledge base), I imagine the older hardware might have interesting economic characteristics: like stocking a fleet of company vehicles with really great 20-year-old Toyotas versus newer Toyotas. Problem is finding that many 20-year-old Toyotas of that reasonable quality, etc.

Edit - "quality" not "quantity"!


With free power? IIRC the RTX 3080 draws 200+ W.


I mined Ethereum about 3 years ago using Nvidia 1060s, and they never drew max power. Conversely, you could use nvidia-smi to limit their power draw without much impact on hash rate.

I’d be surprised if it was any different with the current lineup of cards.

It would be really interesting to see what the M1 draws doing this. Probably not much. But then it’s not doing a whole lot of hashing either.


Was there any particular reason you stopped mining?


I didn’t like the noise and the price had collapsed. I sold the cards for a very good price. I would have made 5 or 6 figures if I had just left them on. Oh well, it’s not the worst financial decision I’ve ever made. Hell, I bought TSLA at $15 and sold at $30. F me.


I thought the main strengths distinguishing the performance of the M1 were the idle power chips that saved a lot of energy, and the shared CPU/GPU memory. Am I remembering that right?

Do either of those things equal better intensive mining efficiency, much over a dedicated traditional rig, where you're flat out all the time and emphasizing GPU?


No, and Apple even tells you that when they indicate the processor is geared toward running simplified AI for application purposes, not toward training new models. Mining coins seems clearly more like training new models.


The neural engine for simplified AI models is separate silicon from the GPU.


If mining speed is any indication of GPU performance (I am thinking of TensorFlow training), the M1 is about 50 times slower than an RTX 3090, and about 18 times slower than a 1080 Ti.


Yes, yes, miners. Please go bother the Apple community and leave NVIDIA GPUs alone.


How do you know that the mining application is using the M1 GPU and not the M1 CPU?


You can see this line in the log: "Using Device : Intel GPU 0.0 Apple M1 OpenCL 1.2 Memory : 10.67 GB (11453251584 B)"


What's the spec of the M1 Mac GPU?



