GeForce RTX 3060 Ethereum Mining Restrictions Have Been Broken (videocardz.com)
256 points by optimalsolver on March 16, 2021 | 313 comments



"The GPU maker seemed confident that its restrictions couldn’t be defeated, even claiming it wasn’t just a driver holding back performance. “It’s not just a driver thing,” said Bryan Del Rizzo, Nvidia’s head of communications, last month. “There is a secure handshake between the driver, the RTX 3060 silicon, and the BIOS (firmware) that prevents removal of the hash rate limiter.”"[1]

Funny how it turned out to just be a driver thing.

[1]: https://www.theverge.com/2021/3/16/22333544/nvidia-rtx-3060-...


>Funny how it turned out to just be a driver thing.

sort of makes one wonder how many whizz-bang hardware features get advertised that are little more than just software, but are oversold as unique physical engineering methods/techniques.


I understand that $3000+ "professional" GPUs that get certified for AutoCAD and other workstation systems are pretty much identical to $1000 "gaming" GPUs, just with a different on-card BIOS/firmware and e-fuses. I remember ages ago that people were able to simply re-flash their ATI Radeons into FirePro cards and that unlocked higher performance in some applications.

Same thing with Cisco switches for port unlocks and Tesla cars (EAP and FSD are just software features, assuming you have HW2+, and rear heated seats are standard hardware, just not activated unless you pay to enable them if you didn't get the cold-weather package when you ordered).


Keep in mind, Nvidia does have some genuinely good process mastery around handling encryption intended to keep users from utilizing hardware in ways Nvidia doesn't intend, and creating trusted computing platforms where safety-critical systems are concerned.

https://docs.nvidia.com/drive/drive_os_5.1.6.1L/nvvib_docs/i...

They've moved from keeping that sort of thing in EEPROMs to burnable fuses from what I understand, and I'm pretty sure what I'm aware of is pretty far behind the state of the art.

If they've managed to set up the key management as well as they have and keep things hush-hush enough to keep the nouveau folks obstructed, this seems like either an uncharacteristically careless mistake or some seriously well-executed malicious compliance from somebody.

Either way, I still find it irritating in the extreme that this type of thing only seems to happen to the detriment of users. Nvidia wins either way: miners or gamers will buy out their cards.


It is my understanding Nvidia allowed cryptominers to pull up semis and load them up with GPUs in exchange for about a billion dollars.

My guess is this was accidentally on purpose (see also the recent shareholder lawsuit; Nvidia did win that one btw).


> It is my understanding Nvidia allowed cryptominers to pull up semis and load them up with GPUs in exchange for about a billion dollars.

Source?

Never heard such a thing. Partners (MSI, ASUS, etc): absolutely. NVIDIA, I've never heard any evidence they sold directly to miners.

> My guess is this was accidentally on purpose (see also the recent shareholder lawsuit; Nvidia did win that one btw).

I think it's incredibly likely that we see this re-introduced on 3080 Ti or 3080 Super or whatever they call it which is about to be released, as well as with future generations of cards, which strongly argues against this being an "accidentally-on-purpose" thing.

People love to make up conspiracies about NVIDIA, but all it takes is someone building a beta driver from a feature branch based on an old build that didn't have the mining brake yet; it's quite probable that this is just an accident. But it's NVIDIA, and they're the only company people love to hate more than Apple.

People hate NVIDIA for releasing drivers that broke the mining brake, and they'll hate NVIDIA for reinstating the mining brake in the future. People get mad no matter what NVIDIA does.


I'm assuming they're implying that this "theft" was something else: https://arstechnica.com/gadgets/2020/12/340000-of-nvidia-rtx...


Which is still partners. And that's like, a single container getting lost. $340k is a far cry from billions as well.

I'm assuming he's referring to an RBS article which was mischaracterized by Barron's and then widely broadcast in the tech media... the RBS article looked at hashrate growth since launch and attempted to estimate the number of 30-series GPUs that ended up mining (the number they came up with was about 7% of gaming revenue). Barron's then took that number and trumpeted it as NVIDIA selling cards directly to miners, when that's not what the original article said at all; they were just estimating how many cards ended up mining from any source, and it was sub-10% regardless.

But of course nobody actually went back and looked at the original article, and because it's NVIDIA everyone is completely willing to give full faith to any claims that make them look bad. A lie goes around the world before the truth has its pants on, etc.

But if there's an article I'm willing to read it.


I was indeed referring to the RBS article, which as you mention was bunk. I read it in passing several months ago. And it was $175 million they estimated; I must've crossed wires with the recent $1B shareholder suit.

Thank you for making me do a little leg work. Perhaps I should have done that in the first place.


Finally dug it up, source methodology from the original RBS article:

https://twitter.com/dylan522p/status/1332502890104188929/pho...


Thanks for being open-minded!


It's too easy to hate a company like nvidia. Even if you make things up, they're probably guilty of them.

I'll start to like nvidia when they provide anything remotely similar to the AMDGPU+Mesa experience on linux. Until then, hacking optimus and unstable kernel updates are the norm.


That's not really true anymore.

One of the big differences is RAM: the A6000 has 48GB, a 3090 only 24GB (and the 3080 10GB). Also, the RAM in the A6000 is ECC, and of course in the consumer cards it isn't.

The A6000 also has ~10% more CUDA and Tensor cores.

So, no, it isn't just a different firmware.


The differences you're listing off are all matters of how the same underlying silicon is configured during the final stages of manufacturing. The workstation cards that support ECC RAM are using the same RAM chips as the consumer cards, and the memory controller is just configured to use a portion of the memory for ECC instead of as additional capacity. Having one, two or four DRAM dies per memory channel doesn't have any direct connection to whether it's a workstation/server GPU or a consumer GPU, and the consumer GPUs could just as easily be equipped with the same 48GB instead of 12GB or 24GB, if there were sufficient demand.

When you look at the workstation cards that aren't top of the line, it's clear that they are nearly identical hardware to more mainstream consumer gaming GPUs, and the most significant hardware differences are usually smaller coolers on the workstation cards leading to lower clock speeds.


Note: it's less about demand and more about price discrimination. Prosumers realize a higher benefit from their graphics cards and therefore have a higher willingness to pay, but they have higher RAM requirements than do others. This allows nvidia to segment their customer base by restricting the RAM on their gaming cards. If they didn't restrict the RAM on their gaming cards then they couldn't segment them. That doesn't mean there's not consumer demand for cards with higher RAM and marginally higher prices.


Starting with the 3000 series, the consumer cards use ECC too, right?


Newer generations of DRAM (both GDDRx and the DDRx used in desktops and servers) are dense enough that they need on-die ECC. But that only offers some protection for data at rest, while the traditional kind of ECC involves also transmitting extra bits in order to also protect against transmission errors, providing end-to-end data protection. I don't think Nvidia has started allowing the latter kind of ECC to be enabled on consumer products.
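
For intuition, ECC just means carrying a few extra check bits alongside the data. A toy sketch of the classic Hamming(7,4) code (purely illustrative; real on-die ECC like DDR5's uses much wider codes, e.g. 128 data + 8 check bits):

    # Toy Hamming(7,4): 4 data bits + 3 parity bits, enough to locate
    # and flip any single-bit error.
    def encode(d):                       # d = 4 data bits
        p1 = d[0] ^ d[1] ^ d[3]
        p2 = d[0] ^ d[2] ^ d[3]
        p3 = d[1] ^ d[2] ^ d[3]
        return [p1, p2, d[0], p3, d[1], d[2], d[3]]

    def decode(c):                       # c = 7-bit codeword, maybe corrupted
        s = ((c[0] ^ c[2] ^ c[4] ^ c[6])
             + 2 * (c[1] ^ c[2] ^ c[5] ^ c[6])
             + 4 * (c[3] ^ c[4] ^ c[5] ^ c[6]))
        if s:                            # syndrome = 1-based position of flip
            c[s - 1] ^= 1
        return [c[2], c[4], c[5], c[6]]

    cw = encode([1, 0, 1, 1])
    cw[4] ^= 1                           # simulate a bit flip in storage
    assert decode(cw) == [1, 0, 1, 1]    # error corrected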


Sounds like it isn't exactly the same, but they are able to detect some transmission errors:

https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-f...


That surprises me, because memory corruption in something like texture memory for game world rendering is far less important than memory corruption in shader program memory. GPU memory needs to be fast, and ECC can slow it down and add cost. So if they were going to use ECC in gaming GPUs, I'd expect it to be limited to areas reserved for shader programs and "program state" rather than output texture memory. If all of VRAM is ECC-backed then I'm wowed, and kudos to NVIDIA for doing that.


ECC slows down RAM; however, RAM speeds are starting to become bounded by error rates. To push into the faster timings, ECC actually becomes essential to keep the system from becoming unstable.

This is the reason why DDR5 is so much faster despite using ECC effectively by default.


For GPUs, it's not just the RAM timings and clock speeds that matter. The way they implement ECC also directly subtracts from the effective memory bus width and therefore the total bandwidth.
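
Quick worked example of that cost (assumed SECDED-style overhead, not NVIDIA's actual figures): if every 64 data bits carry 8 in-band check bits, roughly 11% of raw bandwidth and capacity goes to ECC.

    # Illustrative only: in-band ECC on a GPU memory bus.
    data_bits, check_bits = 64, 8
    overhead = check_bits / (data_bits + check_bits)
    print(f"{overhead:.1%} of raw bandwidth/capacity spent on ECC")  # 11.1%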


This is true for off-die ECC (i.e., the kind that computes ECC between chips using a separate chip); however, on-die ECC (the ECC logic and parity memory being included in the memory chip itself), like what DDR5 will have, doesn't have this bandwidth issue. It makes the dies cost slightly more but otherwise largely results in the same performance as non-ECC memory.

In the GPU space, this is why HBM2 memory doesn't really have a performance difference between ECC and non-ECC memory.


10% more cores sounds like soft-locked or lasered-off cores, rather than a physically different card (unless it's a single GPU with 10% more cores than a dual GPU).


> I understand that pretty much the only difference between $3000+ "professional" GPUs that get certified for AutoCAD and other workstation systems are identical to $1000 "gaming" GPUs, just with a different on-card BIOS/firmware and e-fuses. I remember ages ago that people were able to simply re-flash their ATI Radeons into FirePro cards and that unlocked higher performance in some applications.

Yep. I have an old laptop with an Nvidia Quadro FX 770M GPU that (hackintoshed) macOS sees as a GeForce 9600M GT, and it runs the included GeForce drivers quite happily because the FX 770M is such close kin to its consumer counterpart.

In fact, in a twist of irony for years the supposedly-more-stable Quadro Windows drivers had a power state bug with this card that would cause the laptop to bluescreen if the GPU tried to ramp down to an idle state. The only solution was to prevent the GPU from idling or to run a different OS with more generic drivers.


I'm not sure if I got tricked or not; it sounds ridiculous, but apparently you have to pay extra to get heated back seats in Teslas. It's a software thing, so it's not like you need to install new seats. It works similarly to mining: consume electricity and generate heat.


Who knows, maybe the seat "is" actually using mining to heat the seats. That would be quite the April fools joke.


Every resistive heating device, be it a hot water system, space heater, kettle, oven, stove, or heat lamp, could just be a proof-of-work IoT device that generates (very little) income.


If everything did that, the profit from doing it would be almost nothing, as difficulty increased. It would also be wildly more expensive than just a simple piece of nichrome wire, and you also now have to have a cellular module and a significant amount of continuous data traffic over it.


Yes, but people would make money in the gradient in between when nothing does it, and when everything does it.


Or infrared cameras: they have the same image sensor, but cost much, much more if you buy one with a higher resolution (enabled).

https://www.eevblog.com/forum/thermal-imaging/flir-e4-therma...


Flashing Radeon 6950 cards with 6970 firmware was commonplace in the 2012 era


That was before the era of VBIOS-signing. You straight-up can't boot any VBIOS image that's not signed anymore.

It was also before the era when PROM fuses were used to permanently disable hardware at the factory. So nowadays, when you do something like flash a Vega 56 with Vega 64 firmware, or flash a 5700 with 5700 XT firmware, it doesn't actually turn on any cores anymore; they're permanently disabled. What changes is things like memory speed (Vega) or power limits (5700) that are used to artificially gimp the cards and increase the performance difference beyond the actual core count, since just disabling cores often doesn't give enough difference anymore to be marketable (Vega especially was very bottlenecked by parts of the card other than the cores).


2012 :-) How about the 1999 3dfx Velocity 100 TMU unlock to Voodoo3 2000? https://www.anandtech.com/show/401/3

then there was 2003 Radeon 128MB 9500 to 9700 mod https://forum.beyond3d.com/threads/turn-your-radeon-9500-int...


sorta similar to cars that can get over-the-air updates and suddenly accelerate from 0-60 faster than before...


Yeah, little known fact. The Tesla Model 3 Performance and the Model 3 LR AWD have the exact same batteries and motors. There's literally nothing stopping Tesla from offering an even higher performance boost for LR AWD owners to grant the same 0-60 as the Performance model, but they don't because it would cannibalize Performance sales.


I present you with the FLIR E4, 80×60 resolution, one setting away from the full 320×240 of the E8

https://hackaday.com/2013/11/04/manufacturer-crippled-flir-e...

Oscilloscopes are another product notorious for shipping the same hardware with different jumpers/variables in the firmware.

https://tomverbeure.github.io/2020/06/27/In-the-Lab-Tektroni...

somewhat famous DS1052E 50 to 100MHz hack https://www.youtube.com/watch?v=LnhXfVYWYXE

DS2072A 70 to 300MHz https://hackaday.io/project/8115-rigol-ds2072a-upgrade-to-ds...

MSO5000 70 to 350 MHz https://www.eevblog.com/forum/testgear/hacking-the-rigol-mso...


I wouldn't say nothing. A higher power profile will cause more wear, leading to more warranty claims. TANSTAAFL.


The warranty is likely to expire before the higher wear actually starts leading to reliability issues.

I'm also unsure how much it really makes a difference in an electric motor compared to a combustion engine. Unless you're going to the track (or significantly breaking speed limits), you're only putting extra wear on the motors for 2-4 seconds at a time, which is all it takes to get to your 40-70 mph target speed.


That's okay, warranties exist for a reason.


Download more RAM, but for real. What a world we live in.


Oh, like RTX Voice, which 'requires' a 20-series GPU. Of course, after editing the installer it works just fine on pretty much any GPU with CUDA cores.


I'm 100% sure RTX Voice could run happily on a CPU. Might be a fun target for one of the groups working on CPU back ends. It's a really good NR though; I'll give it that.


https://krisp.ai/ is basically that, but I've found RTX voice running on the GPU (1660ti) doesn't slow down my laptop while Krisp does a bit.


Did anyone ever really try to pitch that RTX Voice actually needed 20 series hardware to work as opposed to being something Nvidia were gating to try to sell more cards? I thought it was pretty transparent myself but maybe I just assumed.


Marketing needs features to sell the higher-end products on. It's easier to bullshit and artificially limit than to come up with things that can physically only work on the high-end SKU.


I'm pretty sure this is exactly the reason why Nvidia is entirely unwilling to cooperate on FOSS drivers, and why neither Nvidia nor AMD are willing to support FOSS device firmware.

Which is a shame, because a GPU with FOSS firmware and comparable specs to the AMD/Nvidia hotness could use that software-driven flexibility to its advantage.


Reminds me of how every modern chip has a "cutting edge, future technology" neural network built in.

Then you notice the asterisk and realize it's just an accelerator (usually just a few instructions with some dedicated logic) and all the NN is still in software.


And you notice the NN software doesn't even use the accelerator half of the time


Raytracing (RTX) is largely a software thing, although real hardware got added to improve ray-triangle and ray-AABB intersection performance.


> real hardware got added to improve ray-triangle and ray-AABB intersection performance.

Isn't that the whole point? I was doing raytracing in the CPU in 2002 for a university project. Doesn't mean I could use it in a game.


Everything is a software thing until it's accelerated by fixed function hardware like, you know, GPUs.


It’s hardware if it’s embedded with C! Never forget that -

With love,

USPTO


You meant NVidia Quadro/Titan workstation performance?


I don't think the comment was wrong. The driver is limiting the rate and the card validates you're using a signed driver. It's still the whole protection system, not just the driver.

I.e., you couldn't update the driver yourself. The relevant "hack" seems to be the official beta/development driver without the restriction. (The other solutions don't modify anything.)


How would the card validate you're using a signed driver? The card only sees what the driver sends it, so presumably the input could be spoofed. Also the card is not the root on the system's TPM.

Usually it's the reverse - normally drivers validate that the card's firmware is signed.

As an example, people would hack AMD Polaris card firmware memory timings for better mining performance.

To do so you needed to disable the firmware signature check in the AMD driver, and to do this you needed to disable the driver signature check in Windows.
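
The general pattern looks something like this (a minimal sketch of a driver-side signature check, assuming RSA/SHA-256; the real schemes are vendor-specific):

    # Hypothetical driver-side check that a firmware image is vendor-signed.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding
    from cryptography.hazmat.primitives.serialization import load_pem_public_key

    def firmware_is_signed(fw_image: bytes, signature: bytes,
                           vendor_key_pem: bytes) -> bool:
        vendor_key = load_pem_public_key(vendor_key_pem)  # baked into the driver
        try:
            vendor_key.verify(signature, fw_image,
                              padding.PKCS1v15(), hashes.SHA256())
            return True                  # OK to upload firmware to the card
        except InvalidSignature:
            return False                 # refuse tampered firmware

Which is exactly why that check could be bypassed: the verifier runs on the host, not on the card.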


Yeah, people seem to forget those mining GPUs that were released specifically to prevent their resale in the aftermarket gaming community, which people still managed to get to output video _despite the cards lacking physical ports_. They did it with a modded driver.


For those looking for details, I remember watching a Linus Tech Tips video on it a while ago. Just managed to dig it up: https://www.youtube.com/watch?v=TY4s35uULg4

Forum thread has more: https://linustechtips.com/topic/1021372-nvidia-said-we-could...


Well, somebody could sign the driver with an Authenticode cert; it just costs some bucks, and you'd need to rewrite the publisher.


Is it some Looking Glass [1] kind of thing, or did they add physical ports to some unused traces or something?

[1]: https://looking-glass.io/


Sort of. It's the exact same thing as Nvidia optimus. Linux calls it DRI-PRIME or PRIME offload. Get it =P ?

Big, display-less GPU renders frame, image is copied into VRAM of smaller GPU with a display. Smaller GPU draws image to screen.

Most of the gamer laptops do this. There's no physical link between the display and the big GPU. I believe the mobile cards enable a second DMA engine that is usually soft locked on GeForce to handle the transfer.


Same as the first PowerVR gaming chips.

Midas3 (1996) https://vintage3d.org/midas3.php

PCX1 (1996) https://vintage3d.org/pcx1.php

and PCX2 (1997) https://vintage3d.org/pcx2.php

Tomb Raider running 1024x768 30FPS on a card half the cost of a 3Dfx Voodoo1: https://www.youtube.com/watch?v=5GMesT4WKzI It could even run up to 60FPS at 640x480, making it the best hardware platform for running Tomb Raider, and the only one allowing over 30FPS.


They basically render frames on the video port-less GPU, and then pass them over to a second GPU for actual output. So probably quite similar to what Looking Glass is doing.


Sounds interesting. Any link for info about this?


I did rely on the assumption that some check exists. Of course it could be spoofed, but that could be hard enough to require reverse engineering the whole driver to figure out. Or the limit could rely on the identification done on the card itself and sent back.

Either way - my point is, we don't have enough details to say the original description from nvidia was wrong.


The way it most likely works is that there is a Falcon microprocessor with a cryptographic coprocessor built into everything post-Maxwell. Nvidia burns in keys during manufacture that the cryptographic coprocessor uses to check a signature over the firmware loaded in by the driver. Since Nvidia/the manufacturer hold that key, they can sign firmware that can access the most important, impactful API calls on the card. Likely, the hashrate limiter was intended to be implemented in the firmware.

This is the same thing that has thwarted Nouveau all these years, and if you don't screw up your driver release like they did, it is quite effective. You could write your own firmware and load it onto the card, but you can't access the Hi-Security APIs like power management and reclocking.

There's no network connection, so if the OEM's key was ever leaked, there's nothing they could do to prevent anyone from writing whatever firmware they wished. Just as they can't take back the driver they done goofed releasing.
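
The usual fused-key pattern, as I read the above (a hedged sketch, not NVIDIA documentation): fuses hold a digest of the vendor public key, the boot code checks the key against the fuses, then checks the firmware signature before unlocking the Hi-Security APIs.

    # Hypothetical on-card boot check against factory-burned fuses.
    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding
    from cryptography.hazmat.primitives.serialization import load_pem_public_key

    def boot_firmware(fused_digest: bytes, key_pem: bytes,
                      fw: bytes, sig: bytes) -> bool:
        if hashlib.sha256(key_pem).digest() != fused_digest:
            return False                 # key wasn't burned in at the factory
        try:
            load_pem_public_key(key_pem).verify(
                sig, fw, padding.PKCS1v15(), hashes.SHA256())
        except InvalidSignature:
            return False                 # unsigned firmware: no reclocking etc.
        return True

And note there's no revocation path in that scheme: anything ever signed with the key verifies forever.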


> "There is a secure handshake between the driver, the RTX 3060 silicon, and the BIOS"

What does this person think "driver" means?


Right? That’s exactly what a driver is lol.


Doesn't the post say that this only partially removes the rate limit, still not 100%? So clearly something is still working, no?


Probably the same as many fancy copy protections that get turned off by patching a single jnz/jz instruction


oh, i guarantee you it is not just a driver thing.

In a few months, when games start to get awful performance on those cards, they will claim they can't fix it because it is in the silicon and can't be patched.

But this is on everyone hoping graphic card driver authors would know anything about anything. heh.


There's also the fact that a few days ago Nvidia released a signed driver that disabled the restrictions [1] _by accident_. But now that the signed driver is out, anyone can just revert to that at any time and mine whatever they want.

A fantastic failure for a flawed endeavour.

[1] https://www.theverge.com/2021/3/16/22333544/nvidia-rtx-3060-...


That's what this article is about, yeah, NVIDIA is confirming they screwed up.

It also appears the VBIOS on MSI's 3060 cards didn't include it either, I'm not 100% clear but I'm guessing you need both here. I've seen other discussions where people report flashing it on EVGA cards and then it increases performance to the expected levels.


They can just revoke the signature, right?


They can't revoke the math that makes the signature valid. As long as the card is unable to get a network connection to work out the driver is revoked, there is no going back on this.


Only if there's network access to check revocation


Even if they do some CA-side revoking crap you could just have your system not check, or revert to a previous state in which the signature was not revoked.


They can adopt online signatures, it's not hard to prevent rolling back an install like that. But you now need an internet connection at update time.


I can always restore my hard disk to an earlier image, or reinstall the driver from offline sources and lock down updates with apt-mark hold


You can't do that to an iPhone (or a Mac with full boot security enabled) and they could do the same thing if they wanted.


Well, that's why I don't use iPhones or Macs then.


You could just turn off full boot security, but it makes you susceptible to downgrade attacks.


Makes me wonder if it was intentional. They make more money from miners buying up all their cards, no?


i'm pretty sure they did this to prevent a class-action lawsuit

limiting a product after purchase, i'm pretty sure, is illegal


I have my cheque from Sony for OtherOS class action sitting on my desk. The cheque is dated 09/16/2019 [nine years after OtherOS was disabled via firmware], and it's for three dollars and two cents.


Still makes me mad. I bought a device that plays PS3 games and runs Linux and then Sony said I have to choose.


> limiting a product after a purchase, i'm pretty sure this is illegal

NVIDIA announced this limit prior to public availability and it made widespread news, meaning everyone who bought a 3060 could reasonably be expected to know about the restriction. No chance for a class action.


Plus Nvidia can offer a full refund of the MSRP, fully knowing that even used cards go for more.


I'm quite surprised if there isn't a binding arbitration clause somewhere, which blocks class actions


> could reasonably be expected to know

I’d wager I know about 1% or less of how restricted 99% of what I buy is.

There’s probably some term or condition on my water bill saying I can’t export it to Cuba or use it to moderate nuclear fission.


It's a GPU. It does GPU stuff, about as well as can be expected.

If you had very specific numbers in mind, you must have been looking at benchmarks, and all the benchmarks were equally locked and mining benchmarks certainly would have mentioned it.


There is not. Right of first sale. Once it's sold it's done. That's why places are so eager to add microprocessors to everything. With the right cryptography, you can lock the user out of tampering with anything.


source?


A month ago on HN, for example (a couple dupes were merged into that iirc): https://news.ycombinator.com/item?id=26180260


I’m curious how this is different from shutting down a paid service or removing software features. Seems like there’s a large grey area.


My take on this is that Nvidia knew fine well that their code would be broken. And in short order too.

I reckon this was just lip-service to consumers, and possibly their investors, that they're "doing something about those bad cryptominers that are making us loads of money".

Plus, whenever something is marketed as unbreakable, I picture guys reading it saying "Oh really? Challenge accepted!"


I was also wondering why nvidia felt the need to step in. Selling cards is selling cards, having too much demand doesn't seem like a problem they should want solved.

Sure it does incur some PR cost as gamers are frustrated not to be able to purchase the cards, but I doubt it really damages their brand in the long term, and AFAIK AMD also faces similar issues so it's not like it massively benefits the competition.

If I were nvidia I'd be cynically very happy that miners are buying my cards in droves. What am I missing?


The problem is that the mining bubble is temporary, and when it crashes, these miners dump their cards in the used market with _very_ attractive pricing.

In 2018, something similar happened; ETH mining became unprofitable, and the result was that used Pascal cards were available on the cheap in the second-hand market. The problem for NVIDIA? Turing, the successor to Pascal, released in September 2018... and many people, instead of buying that, just bought used Pascal cards.

IIRC, NVIDIA even commented that high end sales were low compared to previous generations in a quarterly report.

Granted, the second hand market cannot be solely blamed for this, as Turing was really unattractively priced. (They shifted each tier up, so the $370 GTX 1070 was replaced with a $500 RTX 2070)

These mining cards cannot be used for gaming (this is not the first time NVIDIA has released dedicated mining cards; 2017 had P106 mining cards as well, and when people tried using them for gaming on the cheap++, NVIDIA blocked it through a driver update) and thus reduce the risk for NVIDIA in the future.

++there was some software that allowed you to render on one GPU, and output video from another GPU like the integrated GPU of Intel CPUs. Looking Glass, I think? But I don't fully remember. These days, this functionality is actually integrated into Windows 10.


Market segmentation is always better if you can pull it off. That way you can optimally serve two demand curves instead of just one. Sell more units at a lower price to gamers, for higher total profit from gamers, and sell at an even higher price to crypto miners, for a higher total profit from miners.
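
Toy numbers to make that concrete (entirely made up, just to show why segmentation pays):

    # Two demand curves, one product.
    gamers, gamer_price = 1_000_000, 500
    miners, miner_price =   200_000, 900

    one_price = (gamers + miners) * gamer_price               # everyone at $500
    two_price = gamers * gamer_price + miners * miner_price   # segmented
    print(one_price, two_price)   # $600M vs $680M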


And those miner cards can't become gamer cards so gamers will need to buy a new card instead of an old high end miner card when crypto falls out of style again.


There is the issue that miners tend to dump the used cards on the market much sooner than a gamer would.


Also quite toxic for your brand, since those used cards may have been run under high power loads and at high temperatures for a long time - the card may wear out sooner and make the gamer unhappy that it was "unreliable" and they wasted their money


Miners undervolt their cards so as to run cooler and under a lower load.


IMO Nvidia needs to sell their video cards to the people who want to buy a gaming device right now, while the new gaming consoles are just released and in shortage. PC vs console.


They anticipated selling stripped down cards to miners for a higher price.

If they wanted gamers to have these cards they would just give out purchase invites via gaming hardware sites like Gamers Nexus. They already make deals to supply OEMs before general retail, so all the gaming hardware being built is getting a supply either way.


You forget the other part of the equation. Gaming companies. If gamers can't get their hands on decent hardware they'll stop buying games altogether. Thus the pressure doesn't come from consumers alone.


Even if they knew it would be broken quickly (which I also assume they did), it still wasn't necessarily lip service; even slight deterrence against miners during launch probably helped get more cards into the hands of gamers, which was the intention. Of course it is impossible to know how big of an impact it really had, but I'd like to think it had at least some effect on the all-important launch day orders.


> helped to get more cards in the hands of gamers

Well, you know, except for the cards that they binned specifically so they could be sold as 'mining' cards. There's no such thing; these are all 3060 GPUs, and the mining cards have a slightly different firmware that wasn't even able to do the one thing it was supposed to do.


Please, I was here on Hacker News writing "challenge accepted", prepared to buy as many of the non-mining cards as I could on credit cards, just waiting for the limitation to be removed


> which was the intention

Neither I, _Understated_, nor you know exactly why nvidia did what they did. What we do know is that ultimately they answer to their shareholders and they have to show profit. Their motivations for why they do the things they do can usually be boiled down to: "because it makes us more money".

Nvidia probably doesn't care where the graphic cards go, miners or gamers, as they still make the same amount of money.


You want brand loyalty. If Nvidia cards are soaked up in mining & game devs start optimizing more heavily for AMD GPUs, suddenly Nvidia loses its capture of the gaming GPU market & ends up at the whim of crypto. Miners are also much less brand loyal: someone comes out with a better hashrate/$ card & you've lost your customers

So there are arguments that miners aren't a dependable customer base, which would encourage keeping gamers on board. But I'm saying this without knowing any details, so I don't know


There is also the fact that nvidia wants to get people on board the RTX train as quickly as possible; that is a fairly long-term strategic initiative. And miners absolutely do not help there.


Gamers will be around in 10 years, miners might not be


Funnily enough, that's what everyone said about gamers 20 years ago.


As compared to who ?


They crippled their Geforce cards in terms of mining while simultaneously launching different, more expensive, dedicated mining cards[0].

[0] https://www.nvidia.com/en-us/cmp/


If they could actually enforce this then I would be all for it. Crypto mining is the biggest and most pointless waste of energy since nuclear weapons testing. All these crypto miners are just making the planet hotter, while pissing off people who have more legitimate uses for these cards but cannot realistically obtain one because they are being hogged by the miners.


Tell me about it. The company I work for uses tons of video cards to run ML models for online advertising/customer profiling/etc., and it's gotten noticeably more expensive thanks to crypto miners.


And I would far rather those video cards be used by crypto miners than online profiling and advertising assholes like your company.

You and GP don't get to decide what is a "legitimate" use for a GPU. All the outrage that other people want GPUs for purposes you disagree with and entitlement that only some users really deserve to have access to them is just ridiculous.


I was actually being sarcastic. I own crypto and agree that it's hypocritical to rate some unnecessary uses of power/video cards as better than others.

I do work for a Big-N, so the first part of the comment is true.


I'm putting together a workstation for my research group and the only thing I'm missing is the GPU. Apparently supply is the worst it's ever been and I can't find anything even 1-2 generations old that isn't marked up 2x.


Nuclear weapons testing kept us from using nuclear weapons, worth it.


Who is "us"? Surely you don't mean the USA.

https://en.wikipedia.org/wiki/Atomic_bombings_of_Hiroshima_a...


The event you are linking to was the real world "test" of nuclear weapons. And quite a lot of people attribute the lack of real world "testing" of nuclear weapons since then to that specific event.


Eh, the vast majority of these cards are used by people who are rendering fantasy worlds so they can run around pretending to be an elf, or a soldier, or some other stupid thing. Lots of them are full grown adults too, just wasting time that they could be doing something meaningful with.

That's way WAY dumber than powering a next generation global financial infrastructure.


While we’re at it, let’s get rid of the other meaningless stuff, like fiction novels, watching sporting events, poetry, visual arts, theater, and being a tired technocrat with shallow self-righteous ideas about what is and isn’t meaningful.

Of course, most gamers don’t game 24x7 while their multi-GPU rack is pegged at 100% either.


Heck, while we're at it both sex and reproduction can be put on the chopping block as well - nothing has meaning except the acquisition of wealth, right?


Wow, that's new. A cryptocurrency proponent who hates video games.


I would argue that entertainment is of greater societal value than polynomial solutions you can trade for crack.

That isn't to say that I think crypto is worthless, I think it has the same worth as any fiat currency. We don't spend an entire 1% of the world's energy printing and handling a tiny fraction of the world's banknotes. Crypto is orders of magnitude less energy efficient than every other form of currency on the planet. It's like if everyone started commuting to work in 18-wheeler semis.


This is the third comment on this thread I've seen that *really* needed an /s


It's possible those cards are being made with chips that are too defective to output video. Unfortunately only Nvidia knows for sure and they're not going to say anything truthful about it.


All of the mining card SKUs except one were confirmed to be using older Turing-based architecture on a completely different fab and node (and one that's not currently bottlenecked to the same extent as TSMC 7nm/Samsung 8nm are).

So they are not harvested silicon, but they also don't compete for fab time either.


If the cards were cheaper, or even the same price but available, then I doubt there'd be an outcry. But they charge more, for a surely defective part.

It'd be like buying a car and finding a clause that says if you manage to make money off of it they'll take a percentage or "brick" the hatch or something, to prevent your profitable usage. Nvidia should back the fuck off and just sell a product.


Even if they were cheaper miners wouldn't like them: one thing which substantially reduces miners' operating costs is the resale value of the GPUs they are using. They can buy a GPU, mine on it for a year or two, and then sell it for ~50% of its original value (or even higher in today's markets). The mining cards with zero resale value (which will much more quickly become landfill) would need to be substantially cheaper to make this worthwhile.


Right. But if they openly sold defective half-finished cards for mining they'd still be less popular products but people wouldn't feel insulted.


I haven't seen any pricing on the CMP cards yet. Are you sure they're more expensive?


I would assume they would be. Miners price hardware based on the hashrate, resale value is nice as a safety but serious miners aren't figuring on dumping all their hardware the first time price drops a little bit.

Take a look at the data compiled by Tom's Hardware and you'll see the 3060 Ti has a 51% price premium over the 3060 on ebay. That is the "mining premium" in action. The 3060 is a bit slower on paper but it mines pretty much as well.

https://www.tomshardware.com/news/gpu-pricing-index

As such, if the mining cards mine twice as fast, they're twice as valuable to miners, and they'll pay twice as much for them. Pricing them less than gaming cards is leaving money on the table and completely unnecessary when they'll sell anyway.


Well, in this case it looks like NVIDIA themselves broke the restrictions - or at least they are not present in the beta drivers.


> I reckon this was just lip-service to consumers

Nope, this is price discrimination so that crypto miners buy their more expensive mining card. They don't want this to be broken.

See https://en.wikipedia.org/wiki/Price_discrimination


At this point NVIDIA seems to be motivated by shareholder value, and nothing else.

They shifted part of their production capacity to release dedicated cards for mining that have a way shorter life-cycle, because they can't be used by anyone else.

They found a way to sell more cards with a higher price tag.


It's mostly short-sighted shareholders looking at quarterly results. I'm not sure long-term shareholders like the games NVIDIA is playing with the companies.

They lost Tesla for self-driving; Waymo didn't even consider them, even though they had an early advantage in AI. Comma.ai started with an NVIDIA chip as well. I think Jensen Huang is brilliant, but he's listening to the wrong people.


> At this point NVIDIA seems to be motivated by shareholders value, and nothing else.

It has been for a long time. This is just the latest layer.


You're right!


What's wrong with that? Being an NVDA shareholder isn't some select privilege for an elite few. Just buy the stock. I'm currently holding an NVDA position worth over $300k off a modest investment I made 5 years ago. To attack companies for delivering value to shareholders is just jealousy, and speaks to ignorance of the market.


The median US net worth is about $121k [0]. When you're talking about having $300k available to put into stocks, and saying "Just buy the stock.", that is something entirely out of reach for the majority of people.

[0] https://www.cnbc.com/select/average-net-worth-by-age/


The OP said they opened their position 5 years ago, when the price of NVDA was 20 times less. So they probably only invested about $15K out of that median $120K net worth.


The median net worth represents assets you own, not cash on hand. When someone has a median net worth of $120k, that is likely the value of their house minus the outstanding debt, plus their car and their pension fund, and unless you're willing to take a loan out on your house to gamble on the stock market, the median American does not have $15k to buy stock. In fact the median American can barely cover $400 in emergency bills.

https://www.theatlantic.com/magazine/archive/2016/05/my-secr...

Not to mention that it would be insane for someone to pump their entire savings into the stock of an individual company.


> take a loan out on your house to gamble on the stock market

Most homeowners do this.


Boo hoo? What's your point? Nobody has a human right to affordable graphics cards.


xwdv started his post by saying that having significant stock was not something restricted to an elite few, and then followed up by giving examples that directly contradict that point. My goal wasn't to comment on the graphics cards themselves, but rather to push back against the pervasive and self-destructive idea that maximizing shareholder value is the sole duty of a corporation.


Aww. Game over for the discussion, so you metaphorically flip over the table? Geesh.


>modest investment

>$300k

You live in a reality that is much different than most Americans, Europeans or Asians. Reasonably, the average person can maybe get a single share, and being willing or able to blow $500 on a single share is rare. Get back down to earth, you're part of an extremely privileged group of people that fully benefits from your shares, while actual customers do not get to see returns.

Putting money in a company is not work. You are not owed anything.


Regarding your last sentence... that is opinion, not fact; there are a lot of laws protecting shareholders.

But I generally agree with the rest of your comment: an investment of approximately $20,000 in a single stock 5 years ago is not a modest investment for the average individual.


I'm just trying to understand your point. Are you implying that a majority of Nvidia's customers can't afford to invest $500 in something with inherent risk? Or are you using the term "customers" to mean average people in general? The former doesn't really make much sense to me, considering the majority of Nvidia customers are presumably spending hundreds of dollars on luxury goods.


> You are not owed anything.

He is owed whatever the price someone is willing to pay for his shares which for now seems to be around $300k. He is not owed anything else though.

> Putting money in a company is not work.

He did work for that money, and he did take a risk investing it in these shares, a risk many were not willing to take. This is what he is being rewarded for now. Doesn't make him a hero or whatever of course.


> He did work for that money

How do you know that? People who have lots of money often did not work for it, it's inherited.

> he did take a risk investing it in these shares

People often talk about investment risk like it's a real thing, but it's not. You can either afford it or you can't.

If you can afford the loss it's not risky, it's just gambling.

"I'm taking all the risk, I deserve most of the benefit" is just bullshit rich people talk to avoid the fact that the people who actually do all of the work deserve more of the upside.


Most people in the highest net worth lists are self made. Inheritance doesn't help you that much if you are retarded. Mostly it's gone in one or two generations.


This is basically just rich people propaganda, and it's straight up not true.

The majority of the top 400 richest inherited at least 1 million.

And of the ones who did not inherit that much, often they were given that kind of money as seed money from family.

Straight up 20% of the top 400 richest were literally born in the top 400.

Knock it off with this self-made malarkey. It's just not true.


>a risk many were not willing to take.

A risk many _cannot_ take. For a company as large as NVidia, the risk of it collapsing is basically nil. Investing at random in the stock market gives you pretty much guaranteed growth on average.

However, many of us cannot blow 20k. We can't blow 10k. We can barely blow 1k. Once again, OP ought to have a little bit of humility and recognize his incredibly privileged position.


Many don't, but that's mostly on them; stop blaming others for misfortune.

And dragging them down too.

Humility is a good thing. Tiptoeing near snowflakes is not.

He never mistreated or harped at anyone. He just said his opinion that he likes the stock. He wasn't shitting on poor people.


"This is not just for the elite few"

"I have a modest 300k position"

When you look at these two statements in close proximity, you feel that there's perfect humility to the fortune of that position?


I have a 300k position as a result of a modest investment of about $20k I made 5 years ago. Christ, I never said $300k was the modest investment, and on a forum filled with software developers making 6 figures, I'm sure more than a handful can invest $20k toward their future retirement.

Of course now I’ve been pelted with downvotes and no one will remember what I actually said.


Even then...

"60% of Americans could not come up with $400 for an unexpected expense".

Based on that, what proportion of Americans could afford to put even $20k into the stock market, _let alone call it a MODEST $20k_?


I suspect that there's a relatively large body of folk that read your comment and understood your intended meaning without offence, either agreed or disagreed with it, and adjusted their personal opinion on the subject appropriately without feeling the need to argue with you about it.

Good on you for investing smartly and/or getting lucky. Hopefully you use your good fortune to somehow make the world a better place, and find personal happiness in the process.

It's seemingly a non-sequitur, but I just remembered reading a short story in primary school English class years ago. It was about a dystopian society where attractive, athletic folk had to wear ugly masks and weights to encumber them. Did anyone else read that?


What are you doing on a software board then? California houses cost at least 2-3 mil. What's the home ownership rate? Are you implying that all Californian home owners are part of some elite?


I struggle to see this as a good-faith argument.

My income is in the "2%". I also entirely understand that that makes me in "the elite few".

> California houses cost at least 2-3 mil.

You mean "houses in a few select neighborhoods and locations", such as SF, more prestigious areas in LA. Also, they don't. Median SF house price: $1.4M. Sunnyvale, $1.6M.

Not "at least 2-3M". And certainly not in conjunction with this:

> implying that all Californian home owners are part of some elite

The median Californian home price is $700K. So no. But since you seem to imply that California is somehow defined as "places where homes cost $2-3M" then yes, absolutely. If you own a home worth $2M+ you are unequivocally "one of the elite". You may not be buying a new private jet every five years, but you are also entirely capable of a lifestyle that the VERY VAST majority of Americans have no chance of attaining.

For reference, a $3M mortgage with a substantial downpayment results in a mortgage payment of nearly $14,000/month, which with Jumbo loans requires an annual income in the region of $800K/year.

Please don't try to continue an argument that says that someone making just shy of a million dollars a year is somehow neither privileged nor elite.


I am stunned that people don't know that Americans are far and away outliers on the programming pay scale. Please tell me you're aware that not all programmers are in America making those crazy amounts.


> Are you implying that all Californian home owners are part of some elite?

This is certainly a true statement.


Here's an idea. Don't buy shit you don't need and you'll have more to spend on stock plays.


username checks out


> "blow"

Implying this is some frivolous expense akin to gambling your savings away at a casino or buying an expensive car is the wrong mindset. This is not entertainment; saving for your future and retirement is a life impacting matter that people need to take seriously. Yes I can't afford to and won't blow $3k on a fancy new TV. But I can afford to set aside $100 a month to put into a stock account even if I skip eating out or having a fancy phone plan.


Having a good phone plan is a more important investment than retirement if you're using it for work.


Legally wrong with it? Nothing. Brand-wise? Huge. Gamers and miners hate them for this.

As a shareholder I'd imagine you want continuing revenue, not a short bump before the market switches to AMD.


Why would gamers hate them for this?

The whole point was to try and free up more 3060 supply for gamers.


The restrictions were never going to work against industrial miners. What they actually do is make it harder for gamers to mine in their spare time, making it harder to offset the card cost and meaning big miners have more profit and can bid up prices even more.


Because they screwed up the earlier launches unrelated to mining, because they practice annoying market segmentation anyways, because it's seen as a cash grab, because it isn't helping one bit despite the hassle, because stopping mining also degrades the value of the card for resale even if the original buyer is a gamer, and because restricted drivers are DRM and people hate that.

They just have no credibility now so nobody believes it was to help anyone but themselves.


Getting a strong vibe of 'let them eat cake'.


> Plus, whenever something is marketed as unbreakable, I picture guys reading it saying "Oh really? Challenge accepted!"

https://www.roadtovr.com/developer-reward-jailbreak-quest-2/

> Oculus Founder to Match $5,000 Reward for Anyone Who Can Jailbreak Quest 2


Perhaps this segmentation has some contractual clauses that effectively bans companies from buying gaming cards to mine with? They're not enforceable everywhere in the world, but if it covers a big chunk of the cryptomining world then that's probably good enough. That way they can charge a premium for mining cards.


Nvidia already did this once before: their driver license specifically prohibits you from using GeForce cards in datacenters. Only Tesla-branded cards (no, not the car maker) are licensed for use in a datacenter.

The license is pretty much unavoidable, even if you're using Nouveau, because the cards won't work at all unless you give its power-management processor an Nvidia-signed binary to run, and Nvidia won't sign Free replacements for that binary.


Mining operations are pretty fucking sketch for the most part (in China they just straight-up steal electricity) and miners won't care about terms of service in the same way that a billion-dollar corporation would. You're not going to get ratted out by the secretary and have all your assets seized; short of requiring an always-on internet connection and ratting the user out for mining (which would be incredibly unpopular), there's not a whole lot you can do. It's all possible to do stealthily.


I thought they were trying to stop used up cryptominer cards from flooding the graphics card market. So this doesn't help Nvidia in that respect.


I came here to write this comment

Nvidia gets to blame the "super tech savvy crypto hackers" while printing revenue through the roof


No they don't. They acknowledged they released the beta driver themselves that didn't have the mining brake.


I don't know why they thought that would work for their marketing though. Every tech news source I follow saw right through that. And it was reasonable to be upset at the restriction because someone buying the card for their gaming machine would be the most affected if they wanted to try to make back some of the cost of the card by mining when their computer is idle.


It was my understanding that you need very cheap electricity for cryptomining to be worth it, am I mistaken? If not I suspect that most people don't really have a good incentive to mine coins, unless the use it to heat up their flat or something (and assuming that they don't have more efficient heating available).


5700 XT makes about 6-7 USD a day, depending on electricity cost. It hasn't really been a factor for a while now. You'd make profit even in Germany.
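
Back-of-envelope version (all figures assumed, roughly early-2021 Ethash, not live rates):

    hashrate_mh    = 50      # ~5700 XT, undervolted
    power_w        = 120
    usd_per_mh_day = 0.13    # assumed revenue per MH/s per day
    usd_per_kwh    = 0.30    # roughly German household pricing

    revenue    = hashrate_mh * usd_per_mh_day           # ~$6.50/day
    power_cost = power_w / 1000 * 24 * usd_per_kwh      # ~$0.86/day
    print(f"profit ~ ${revenue - power_cost:.2f}/day")  # ~$5.64/day

So even at expensive electricity, power is a small fraction of revenue at those rates.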


As a technical point.


Every penny spent on electric heat would be more efficiently spent mining.


Not necessarily, if you have access to a heat pump for instance you could get much more efficient and cheap heating than a mining rig would offer you. You could also be heating your home using natural gas or some other resources costing a lot less than electricity in some markets.

Being able to reuse the heat from mining does make it easier to break even of course, but given how competitive the sector is I doubt that it's enough to offset the costs in most places.


Yes, a heat pump would be better. I was thinking of what most renters have available - baseboards and portable heaters.

Cover the costs, maybe not. Offset, certainly.


These miners are really getting on my nerves: not only are they polluting the planet (1% of electricity in 2018 reportedly went to cryptomining), they're also blowing up the budgets of deep learning practitioners


Carbon tax, carbon tax, carbon tax.

Rather than coming up with a billion restrictions for every new way people invent to waste resources, just charge people for resource usage.


Crypto mining can't run on renewables?

Careful here. The 'renewables are cheaper' narrative is sacred.


If crypto miners want to run on renewables, they can pay the cost of the resources to build those renewables and then enjoy the power they create for free.

Crypto mining on its own is not bad or evil. Only ruining our environment is bad.

How this would work in practice is you charge it at power generation. If the power company uses coal as a power source, they are hit with a carbon tax which gets passed on to the customers. Then every power company that doesn't use dirty sources has cheaper prices and all customers move over.

In the mean time you can use all profits from a carbon tax to discount other taxes so the end user pays no extra but they are constantly pushed to cleaner sources. This has been tested in the real world and it works very well.


Crypto mining only runs on renewables when both available and profitable, and will happily run on coal if that's the cheapest method.


I'd argue deep learning is just as much a waste for most businesses anyway


Strong argument.


Most business logic neural nets simply train into what could've been an or-gate


Shh!

You can’t get a $700k/y salary for being an “OR-gate logic engineer”, but you can for being an “AI engineer”.


I purposefully waited a bit to build a new computer (first one I've been able to build for myself and not others!), holding out for the new AMD and Ampere cards so I could do some research on my home machine and not a lab one. I figured I'd have to wait a month or two after launch, no biggie, but I got my 5900x last week and still can't find a 3080. This shortage is insane and no one seems to be doing anything about it. I'm extra peeved at Newegg's shuffle system, which I'm pretty sure anyone who's taken a week of a stats class could tell you is giving the edge to bots and scalpers. While I'm peeved at the miners, I'm also upset that retailers aren't combating them and helping consumers.


With all this talk of crypto, it's ironic that the problem is essentially the same as bitcoin's: a Sybil attack. Scalpers can pretend to be dozens of people (registering multiple accounts, trivial variations on an address, privacy cards to generate unique credit card numbers, automating all this by botting, etc), and the real underlying problem is that what you need is a unique personal identifier.

I personally think hardware identifiers are a good approach. NVIDIA has GeForce Experience; they have access to hardware identifiers and telemetry that will let them see who's actually gaming. This approach works very well for games like Overwatch, where hardware bans are essentially "game over": you now have to get a new motherboard, CPU, GPU, clean OS install, and new IP address in order to hack again. It is fairly painful and slows down hackers significantly after a ban wave.

(Or something like requiring a unique driver's license, one per person.)

All of the other approaches are trivially gameable. A queue? Scalpers will register 1000 accounts and take all the queue spots. A lottery? Scalpers will register 1000 accounts and win all the lottery cards. etc etc. There will always be retailers if you do not currently have a card or are morally opposed to GFE.
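
A minimal sketch of the hardware-ID idea above (the fields and scheme are my assumptions; anything real would be vendor-specific and harder to spoof):

    # Hypothetical: hash several identifiers so swapping one part isn't
    # enough to mint a "new" customer.
    import hashlib

    def hw_fingerprint(gpu_uuid: str, board_serial: str,
                       mac: str, disk_serial: str) -> str:
        blob = "|".join([gpu_uuid, board_serial, mac, disk_serial]).encode()
        return hashlib.sha256(blob).hexdigest()

    seen = set()
    def may_purchase(fp: str) -> bool:   # one card per fingerprint per drop
        if fp in seen:
            return False
        seen.add(fp)
        return True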


BTC uses most of the energy relating to cryptomining, and people almost certainly aren't buying GPUs in bulk to mine bitcoin since ASICs are far more efficient.


0.1%, not 1%


A counterargument I heard recently: all of the work and resources needed to maintain physical money are comparable in energy needs and environmental consequences.

I have no source at hand.

I’m also not sure what to think of it.


That would still be a damning statistic given how tiny a portion of overall transactions are done through crypto.


And the security economics depend on that number increasing as the amount of economic value transacted increases. If 0.01% of the world's money moves through bitcoin today and you want that to be 10%, you're going to have to spend 1000x more energy to keep the network secure.
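A back-of-envelope version of that scaling argument, assuming (as above) that security spend must grow roughly linearly with the value protected:

    # Back-of-envelope sketch of the scaling argument above (illustrative only).
    # Assumption: to keep attacks unprofitable, energy spend scales roughly
    # linearly with the value the network protects.

    current_share = 0.0001    # 0.01% of the world's money, per the comment
    target_share = 0.10       # 10%
    current_energy_twh = 120  # rough annual Bitcoin figure cited in this thread

    scale = target_share / current_share
    print(f"{scale:.0f}x -> {current_energy_twh * scale:,.0f} TWh/year")
    # 1000x -> 120,000 TWh/year, several times total world electricity generation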


PC gaming alone wastes the same amount of energy. There are also consoles and the whole industry of game creators surrounding them. In my eyes they are both useless and pollute. Some get a kick out of games, some out of shitcoins.


Huge difference between running a graphics card at 100% capacity (ok, most eth miners underclock but still use a good amount of wattage close to 100% of the time) and someone playing a video game for a few hours after work one day.

Nowhere near the same amount of energy. Gaming certainly doesn't use 1% of the planet's energy, for example...



The number of people with high-powered gaming machines, as described in that paper, is comparatively small, and their sustained energy use would be a fraction of what cryptocurrencies waste.

Moreover, games are probably less socially damaging, and they're not attempting to strip money off people in large volumes.


It says 120 TWh a year right there, which is about the same as Bitcoin now. Do you think it's a bogus paper, or what are you arguing against?

I agree with the criticism of bitcoin; I just find the gaming industry equally useless, burning carbon and trying to milk money from the naive.


> Some get a kick out of games, some out of shitcoins.

> I just find the gaming industry equally useless, burning carbon and trying to milk money from the naive.

Can video games be divorced from the gaming industry? Are you saying that gaming is as invalid a form of entertainment as crypto mining and speculation? Are other forms of entertainment more valid, or is entertainment not a valid use of time at all?

If we assume that video gaming is not an invalid form of entertainment, how many gamers are measured compared to miners and speculators? I suspect that entertainment person-hours per joule is substantially better for video games than it is for bitcoin. In which case, if video games were a valid form of entertainment, then what we want are video games that use less energy.

If video gaming is a lesser form of entertainment, why? People said 100 years ago or thereabouts that books were a bad influence on society.

I don't quite understand the naive part of video gaming, though. Is that a reference to gacha?


Are you in favor of banning movies too then? We burn a lot of resources (both physical resources and electricity) to film and distribute and show movies, and we could repurpose all of that for other pursuits


No, all I said is that in my eyes they are on par, not that we should ban any of them.


I would not call video games a waste of energy. Leisure is important.


The same amount?

What gamers are running multiple GPUs in a game 24/7 with 100% utilization?

I don't think it even comes close.


So, does anyone have any technical details on how this works? Both what the block was in the first place and how it was defeated?

I imagine it has something to do with how Ethereum mining is mostly integer math and boolean operations for hashing, while gaming workloads tend to be floating point, but I'm just guessing.

Another factor in the background: the pandemic has caused a worldwide fab capacity shortage. Lots of manufacturers are running around with their hair on fire trying to book fab slots. Even car production is being held up due to IC shortages.


I don't know of any detailed analysis, but apparently the block is based on the memory access patterns Ethereum mining produces. Ethereum's proof of work is designed to be memory-bandwidth limited, to discourage the use of FPGAs and ASICs in mining (on the perhaps mistaken theory that this would prevent the kind of centralization present in bitcoin mining). It's quite plausible that the implementation of the proof of work could be adjusted to avoid this detection, though depending on the sophistication of the recognition it may be difficult to do without hurting performance.
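For anyone unfamiliar with why that access pattern is so distinctive, here is a highly simplified, CPU-only sketch of an Ethash-style memory-hard loop. The real DAG is gigabytes and the mix function differs; the dataset size, constants, and round count below are assumptions chosen only to show the shape of the workload:

    import hashlib

    # Highly simplified sketch of Ethash-style memory-hard proof of work.
    # The real DAG is several GB and the mix function is different; this
    # only shows why throughput is bound by random memory reads, not compute.

    DAG_WORDS = 1 << 20                        # stand-in for a multi-GB DAG
    dag = [(i * 2654435761) % 2**32 for i in range(DAG_WORDS)]

    def hashimoto(header, nonce, rounds=64):
        seed = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        mix = int.from_bytes(seed, "little")
        for _ in range(rounds):
            index = mix % DAG_WORDS            # pseudo-random, cache-defeating read
            mix = ((mix ^ dag[index]) * 0x100000001B3) % 2**64
        return mix

    # Each hash forces `rounds` dependent reads at unpredictable addresses,
    # so miners are capped by memory bandwidth -- and that sustained random
    # read pattern is what the driver reportedly fingerprints.
    print(hex(hashimoto(b"block-header", nonce=0)))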


Also your AV gear. One of the factories of a premium DSP supplier burned down, and now no-one can get DSPs for their home theatres.


Same factory made chips for audio production hardware too. In fact, I think anything using SHARC components has been hit pretty hard. Just a bad year for chip manufacturing across the board.


Link?



Isn’t this less due to Covid than to increased demand from different sectors like automotive, etc?


Apparently it's gotten even worse in the last month, since Samsung's and NXP's fabs are in winter-blizzard-stricken Texas, and TSMC is in an area suffering a once-in-a-century drought.


I heard that as of last week Samsung's Texas facilities were still not back up. When they lost power, they also basically lost containment of their materials (most of which are ultra-pure and require special handling) and pressurization of the cleanroom itself, so they have to clean and re-sanitize the whole thing.

https://www.extremetech.com/computing/320511-samsungs-austin...

Turns out that setting up your factory in a libertarian tax haven, one that went out of its way to prevent grid interconnects that would have allowed power delivery to continue, in order to forgo federal regulations that would have prevented these outages in the first place, might not have been such a great idea.

But muh Texas, "did you know we have the right to secede!?" So cool! /s


Most of Samsung's fabs are in South Korea; the Austin fab is a minority of their production.


Hopefully they give up on the idea for future cards. NVIDIA shouldn't decide what I can do with my GPU.


That's the same idea as the GeForce/Quadro split. Gaming GPUs are crippled so that they work poorly with professional software (e.g. CAD).

AMD does the same thing with Radeon/FirePro.

I don't expect Nvidia to give up on that. And to be honest, for me personally, it is a good thing. I don't mine and I don't CAD, having GPUs unavailable for the former and overpriced for the latter results in more affordable prices for myself.


Hell, can we modify Tesla cars to mine Ethereum when they're not in self-driving mode?


Apple is laughing at this comment.


Ah yes, a secure platform is the same as an aftermarket graphics card. Boy I wish I could disable SIP on my mac and then boot whatever I want or install whatever kernel extensions I want... oh wait. I can!


These mining restrictions are... interesting, given that Ethereum is supposed to move to staking fairly soon anyway.


I’ve been hearing this - “moving to staking soon” - for at least a year now.


They have it running, and people are making money off of it. The difficulty now is social, convincing people to switch.


No, it’s commercial: convincing the miners to switch who obviously have a vested interest in perpetuating PoW


The miners don't need to switch; in fact, they can't. There's no mining on the PoS network. Miners will effectively have to stay on ETH1 or move to another cryptocurrency.

I think it's largely political, getting stake pool nodes and people using ETH2. There will likely be ETH1 and ETH2 running side by side for some time. ETH1 can be burned to get ETH2, so there's a one-way transition (though I'm not sure if this story is complete yet, that's the plan anyway).


I wonder if we will have a new wave of miners. If you want to get in, instead of spending $10,000 on mining gear, spend the same amount on coins which are easier to sell off later when you want to get out.


Staking income is much lower than mining income. Eth 2.0 is targeting a 7% annual return, not counting any currency inflation. So there will be stakers, but it won't be the gold rush that mining was.


>soon

The earliest likely merge date is Q1 2022. The only exception is if the situation becomes urgent (miners attacking, or NiceHash controlling a dangerously high share of the hashrate); in that case the merge itself could be done in a month, I think.


My god, this whole thing went pear-shaped so fast, it's incredible. I bet this launch will be taught in B-schools as a lesson for would-be marketers.

Nvidia seemed to be doing the right thing at first, taking the concerns of their most enthusiastic customers seriously and doing something about it. Given how badly the company usually treats their customers, it seemed like a nice gesture.

Then the Internet trolls came out and accused the company of merely paying the issue lip service: since it was obviously just a software fix, it wouldn't really stop the big crypto farms from using the cards. Nvidia is evil, as per normal.

Then the company responds with a lie, that it's actually a hardware implementation. No one believed it, but that's their story. Nvidia is trying to do the right thing.

Then all the gamers (most of whom aren't even in the market for an RTX 3060) start flipping out about how the resale value of the cards is going to be lower because they're all nerfed. So by charging full price for the card anyway, Nvidia is actually ripping them off!! Nvidia is evil again.

Then this screwup happens and everyone is now screeching about how they knew it all along and Nvidia is actually lying about EVERYTHING!! Maybe they're creating artificial shortages in the first place! It's all a conspiracy, man!!

I don't think I've ever seen a reasonable idea go tits up so fast. Remember when Netflix announced Qwikster and it bit them in the ass so hard they had to cancel the whole spinoff? That's what this reminds me of: good intentions badly implemented.


I'm not too surprised. I think it's in Nvidia's best interest both to claim that the hash limit is unbreakable and, at the same time, to make it breakable.


In my opinion they set their prices too low and have not invested enough in fabs. Demand now far exceeds their capacity to manufacture, and unfortunately the only way to "restart" is to set prices at a level that funds building enough capacity to satisfy demand at a lower price point later. They can try these PR tricks, but they'll just waste even more money without addressing the problem.


NVIDIA doesn’t own any fabs, they don’t really have a choice.


Interestingly, I was watching some interviews with the founders of 3dfx, and when asked why the company died, their straight answer was: because they bought a fab.


I thought it was because 3dfx started to compete with their own hardware partners?

3dfx was a chipset vendor; whether or not they fabbed their own chips shouldn't matter all that much. It's who makes and sells the boards that counts. The Voodoo1 and early Voodoo2 cards were made, packaged, and sold by hardware partners like Creative, Diamond, etc., but with the Voodoo3 and later SKUs of the Voodoo2, 3dfx did it all themselves. So why would a miffed Creative Labs or Diamond lend their sales channel expertise to 3dfx? PC OEMs like Dell, HP, etc. also probably had deals with Creative for their Sound Blaster cards, and it wouldn't surprise me if Creative politely asked them not to buy 3dfx-made boards in exchange for a sweet discount on Sound Blasters...


Regardless of the competition with hardware partners, the GeForce 256 blew everything away, with nothing on the 3dfx side to respond with.

Fun to read discussion here: https://m.slashdot.org/story/7743


Yes, exactly this, they bought STB Systems and began manufacturing their own boards. I wasn't sure how much of the manufacturing process this entailed.


If they raised prices they'd still make more money, rather than the money going to scalpers, and they could offer TSMC more money for more fab time, which TSMC could take into account when planning future fabs.

As far as I can tell the reason they don't raise prices is that the PR hit they'd take from all the gamers hating them for it would be a bigger deal than the additional revenue.


They went to Samsung instead to get more capacity this gen.


They had to reduce prices on the 3xxx series due to poor sales of the 2xxx series; it just so happened that the crypto boom hit at the same time. It was a bad situation, but raising prices doesn't solve the problem.


Can anyone enlighten and explain why there were restrictions in the first place?


The official reason is availability -- it's nigh impossible for a gamer to buy a GPU right now for something close to MSRP. The real reason is probably a desire to prevent miners from selling their GPUs on the second hand market, thus increasing sales of new GPUs.


Nvidia is still trying to fulfill orders for 3080 from launch day. The availability problem is real. They do not need to artificially increase the demand for their GPUs if they already are selling way more than they can deliver.


That's true today, but will it remain true when the cryptocurrency bull run ends, we enter a bear market, and mining becomes far less profitable? Not just that, but the most profitable coin to mine, Ethereum, is on track to move away from Proof-of-Work, so we can expect a lot of second-hand cards to be sold at fire-sale prices in 2022.


The problem isn't miners. The problem is no one wants to stop bots and scalpers from buying them en masse.

If eBay prevented scalping and e-commerce created bot protections, demand would stop being absurd.

Nvidia stopped selling cards on their site because they couldn't figure out how to prevent bots from buying them all. Not to mention Nvidia is selling cards directly to large miners by the pallet load.


Why should ebay prevent scalping? That's the entire reason their site exists, to see who is willing to pay the most for scarce goods.

Ebay is not going to go away just because it hurts gamers' fee-fees. Nor should it. And I say this as someone who is generally opposed to mining and generally upset with 30-series availability. Demanding that ebay shut down is both a ludicrous over-reaction and will not solve anything anyway.

You're literally blaming the market for providing a clearing-price. If you want to reduce demand for gaming cards, forcing miners into their own product segment is the most plausible way to do that, then the clearing-price falls.


I don’t think scalpers can drive up prices long term. What’s the difference between all the scalpers having the cards and nvidia having the cards? Either way people are only going to pay what the card is worth to them. If cryptocurrency miners buy the cards, they’re not reselling them.


Remember about ten years ago, when there was a fad to raise money and use it to buy every single item on the shelves of a small convenience store? The intention was to keep small locally-owned stores in business by buying more from them. However, even though it brought a lot of profit on that one day, it meant the shelves were empty for the next few weeks. The regulars saw that and needed to find somewhere else to shop. The regulars left, and some never came back, leaving the store in a worse financial position than before.

Cryptocurrency miners are driving up the prices of GPUs. NVIDIA wants to make sure that they have stock available for their regular customers, because that is where the long-term profit comes from. Ramping up production is not feasible on the short time scale that cryptocurrencies have been around, nor is it known whether cryptocurrencies will be around for long enough to recover such an investment.

TL;DR: Cryptocurrency miners are messing up the long-term GPU market, and NVIDIA is trying to maintain that market.


>Ramping up production is not feasible on the short time scale that cryptocurrencies have been around, nor is it known whether cryptocurrencies will be around for long enough to recover such an investment.

Bitcoin has been around since 2009. They've had plenty of time to realize this was coming. Even the thickest person could've spotted this wave coming in 2013, in addition to the continued rise of computer and console gaming.


Hard to say. I started mining bitcoin in 2013 and it had already moved past GPUs. Bitcoin wasn't taken very seriously back then, and the alt coins were taken even less seriously, so I can see NVidia not taking it seriously too.


But they didn't want to think about it for even five minutes and figure out a friendly way to do this. They should have given gaming sites purchase invites to hand out to members.

There are other ideas too, higher prices on raw hardware but cash-back incentives if bought with games or gaming hardware.

Now everyone hates them. AMD couldn't have paid for marketing this good. AMD gives everyone ECC support and unlocked cards. They're (currently) the anti-Intel/Nvidia, and the market darling.


Officially it is a supply/demand issue. The crypto miners are buying up all the high-end cards, so NVIDIA's main target audiences (gamers, workstations, etc.) end up empty-handed.

What a lot of people seem to think: used-up cards could flood the market while miners migrate to the newest cards, cutting into NVIDIA's profit; or NVIDIA wants to make more money by selling pure mining cards that can't be reused for anything else.


This doesn't make sense to me: why don't they simply price their cards 3x higher and sell them all to miners? Selling to the highest bidder is kind of "Capitalism 101". Their profits would skyrocket.

And what about gamers? Well... they can buy used previous gen cards from miners.


Some companies might be interested in keeping their long-term market happy instead of losing their flagship products to what they might consider a relatively short-lived craze.

> Well... they can buy used previous gen cards from miners.

Yeah, last-gen cards, which would ruin any lead NVIDIA has over its competition. Going by the Steam hardware survey, NVIDIA currently owns the PC market; that won't last if their whole inventory is bought up by crypto miners, and it might lead to serious long-term consequences if gamers and engine developers shift their focus to AMD and Intel.


There is evidence that many of the new GPUs are being bought by crypto miners, and there is vocal outcry because some feel these cards should be for consumers (gamers), which is frankly bizarre.


I don't know what is bizarre about it. I think cryptocurrencies are fundamentally flawed due to their environmental impact, and should be banned on that merit alone. Add in the inability to reverse fraudulent transactions and their role in the rise of ransomware, and cryptocurrencies are easily something that should be banned.

I don't see it as bizarre to be frustrated that one's hobby is being priced out of reach by what amounts to an environmentally-damaging pyramid scheme.


>I think cryptocurrencies are fundamentally flawed due to their environmental impact, and should be banned on that merit alone

A GPU is a GPU. Why is its environmental impact fine when it's 31 million people pretending to be a cowboy in RDR2, but bad when it's being used for financial transactions?


For the same reason that sending a letter to a friend is different from using the "Send-a-Dime" chain letter. One is something that improves the human condition and brings enjoyment, while the other is a pyramid scheme with negative externalities under the guise of a get-rich-quick scheme.


My crypto investments have definitely improved my life more than Red Dead Redemption 2 did so I disagree.


That is not at all inconsistent with my viewpoint. In the same way that a traditional pyramid scheme enriches a few while the majority lose money, bitcoin speculation can enrich a few before either the bubble pops naturally or it has legislation made against it for being environmentally damaging.


The same number of financial transactions can be handled with orders of magnitude less energy. A secure distributed ledger of financial transactions does not inherently need anywhere near this much computational power to maintain, as evidenced by Ethereum's impending move to proof-of-stake.

You can't really say the same for video games. To reduce the carbon footprint of a user playing RDR2, you either need newer more energy efficient hardware, or you need to alter the experience the game provides to make it less computationally expensive.


> A secure distributed ledger of financial transactions does not inherently need nearly this much computational power to maintain, as is evidenced by Ethereum's impending move to proof-of-stake.

You can claim that as evidence after Ethereum has actually moved to proof-of-stake and operated in that mode for a significant length of time without any notable vulnerabilities. Proof-of-stake has some known drawbacks compared to proof-of-work; in particular, at least in naïve implementations, there is nothing to prevent a malicious party from staking the same coins in multiple chains (forks) simultaneously, a flaw which proof-of-work systems are specifically designed to avoid by making the proof depend on each chain's history. One assumes that the Ethereum developers came up with some sort of mitigation for that issue, among others, but it has yet to see real-world testing with significant funds at risk should it fail.
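For readers unfamiliar with that flaw (usually called "nothing at stake"), here is a toy illustration; the numbers are arbitrary, and the point is only the asymmetry:

    # Toy illustration of the "nothing at stake" problem described above.
    # In naive proof-of-stake, backing a second fork costs nothing extra;
    # in proof-of-work, hash power split across forks weakens each vote.

    forks = ["chain-A", "chain-B"]

    # Naive proof of stake: the same coins can back every fork at once.
    stake = 100
    pos_weight = {f: stake for f in forks}            # full weight on both!

    # Proof of work: physical hash power must be divided between forks.
    hashrate = 100
    pow_weight = {f: hashrate / len(forks) for f in forks}

    print(pos_weight)  # {'chain-A': 100, 'chain-B': 100}
    print(pow_weight)  # {'chain-A': 50.0, 'chain-B': 50.0}

Mitigations such as slashing (destroying a validator's stake when it signs conflicting blocks) exist, but as noted above, they have yet to be tested with significant funds at risk.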


I'd claim traditional databases as an example of a secure distributed ledger of financial transactions. Every single credit card processing network does exactly this. What cryptocurrencies do is add "trustless" as a requirement, and that's where the power consumption comes in. I also think that's a weird requirement to have, precisely because treating every interaction as adversarial introduces so much overhead.


The requirement exists regardless of how hard or easy it is to implement. "Trustless" is a requirement because history shows that there are many circumstances where existing payment networks cannot be trusted. Payment networks sometimes refuse to do business with certain parties merely because it's not profitable due to bad credit, a higher-than-average chargeback rate, public relations, or other reasons. Even when the networks themselves are not actively antagonistic, they are vulnerable to political influence which may take the decision out of their hands.

Where trust is feasible, transactions can be settled cheaply in separate records rather than on the blockchain itself. The Lightning Network is one such protocol; support for inter-account transfers within the same exchange is another. However, it's good that the trustless option exists for the cases where trust would not be justified.


>To reduce the carbon footprint of a user playing RDR2, you either need newer more energy efficient hardware, or you need to alter the experience the game provides to make it less computationally expensive.

Or add an extra tax on those who play video games to cover the carbon impact, or ban high-powered GPUs so they can't cause that impact at all and keep all gaming at low-end mobile processor levels.

It's all still code running over the same circuits, whether it's verifying crypto or letting people pretend to be a cowboy. If crypto is taxed more for its impact, then so should video games, and maybe things like Marvel movies for the impact of their CGI rendering.

Or does the environmental-impact argument go out of the window when it's Rockstar and Disney making money from running processors at full blast, rather than some nobody getting rich from crypto?


There are two separate use cases/markets wanting the same product with significantly different budgets: gaming and crypto mining.

The crypto mining market has way more money to burn because they're making it back over time via mining, which means that unless there's enough supply for both markets the gaming market gets (almost) nothing until the crypto mining market has bought all the cards they want.

I have no idea what the solution to this problem is, and I don't think poorly designed driver restrictions is it. But it is an actual problem.

Are there any historical examples of similar situations?


Supply and demand should normalize when enough capacity has been provided.

The challenge is at least two-fold, in my opinion. First, semiconductor fabrication capacity is stretched right now, and additional capacity has a significant capital requirement. Second, it's probably not clear whether crypto demand will stick around long enough to warrant additional capacity; the bottom has fallen out of it before.


It seems like a good strategy for NVIDIA to prevent losing market share among gamers. They could maximize near-term profits by selling cards at whatever the market will bear, but they'd cede their gamer share to AMD, and that would have long-term negative consequences.


Not that AMD seems to be able to get any cards out the door either.


Ahh, the old "let's make it look like a mistake and release a driver that unlocks the mining restrictions".

Maybe I'm old and cynical but does anyone actually think this was an error on NVIDIA part? Their goal is to sell as much GPU as they can, their shareholders wouldn't want it any other way. My opinion is it was a convenient mistake that will sell a lot more cards.


> Maybe I'm old and cynical but does anyone actually think this was an error on NVIDIA part?

Yes. And I assume we'll see the mining brake return on 3080 Ti / 3080 Super and on 4000 series cards as well.

This looks very much to me like someone built a feature branch on an old version that didn't have the mining brake installed, and then pushed the branch somewhere they shouldn't. I see no reason to doubt that - classic case of Hanlon's Razor in action:

"never attribute to malice that which is adequately explained by stupidity".

> Their goal is to sell as much GPU as they can, their shareholders wouldn't want it any other way.

Segmenting miners to a different, higher-priced segment makes them more money. You're selling two cards instead of one, on top of increasing your effective production capacity (because the mining cards are on a different, older node that is not bottlenecked as badly).


If you really wanted to limit these cards to gamers, why not partner with Steam/Origin/etc. and have them hand out purchase tokens based on previous gameplay history, for use with vendors? A rough sketch follows below.

This of course requires vendors to play along with the scheme, and I'm not sure NVIDIA has the clout to make point-of-sale people enforce it.
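For concreteness, a minimal sketch of that voucher flow. The eligibility threshold, the shared secret, and the token format are all assumptions for illustration; no platform actually offers this:

    import hashlib
    import hmac

    # Sketch of a signed purchase-voucher scheme. Threshold, key handling,
    # and token format are assumptions for illustration only.

    PLATFORM_SECRET = b"shared-with-participating-retailers"  # hypothetical key

    def issue_token(account_id, hours_played):
        """Platform side: issue a voucher only to accounts with play history."""
        if hours_played < 100:               # assumed eligibility threshold
            return None
        return hmac.new(PLATFORM_SECRET, account_id.encode(),
                        hashlib.sha256).hexdigest()

    def redeem_token(account_id, token):
        """Retailer side: verify the voucher before allowing the purchase."""
        expected = hmac.new(PLATFORM_SECRET, account_id.encode(),
                            hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, token)

    token = issue_token("gamer42", hours_played=350.0)
    print(token is not None and redeem_token("gamer42", token))  # True
    print(issue_token("fresh-bot-account", hours_played=0.0))    # None

In practice you'd also want tokens to be single-use and expiring, which is exactly the kind of retailer-side bookkeeping the parent doubts NVIDIA could mandate.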


I wish I could reply with a nuanced take, but I can't. Ban these cryptocurrency pyramid schemes already. https://www.cynicusrex.com/file/cryptocultscience.html


This could drive the bigger question of the feasibility of hardware imposed restrictions.


Exactly as predicted the day the NVidia announcement came out.

And a good thing too.


Now if someone could enable full CUDA on RTX 3060/3070/3080, that would be noice!

And multi-stream video encoding while we are at it.


> Now if someone could enable full CUDA on RTX 3060/3070/3080, that would be noice!

I thought there were no CUDA limits on the RTX 3070/3080/3090 cards?

I've got a 3080 from launch and have not noticed any artificial caps or performance issues when doing CUDA-related things.


AFAIK double-precision performance is artificially crippled on consumer RTX cards to differentiate from professional Quadro and Tesla.

https://www.techpowerup.com/forums/threads/nerfed-fp64-perfo...



Not quite the 3 days after release predicted in the original thread, but impressive and entirely unsurprising nonetheless.


I got my first-ever downvotes in that thread for suggesting it would be rapidly cracked, followed by a lot of 'Nvidia is full of smart people' comments.

There always seems to be persistent optimism about the effectiveness of this kind of thing; for example, about how long DRM or anti-piracy measures on games will remain effective.


It wasn't cracked. Nvidia released a driver which didn't do the restriction.


Either they're smart enough to "accidentally" release that driver, or they're actually "smart" enough to accidentally release the driver.


Wait till you meet the Google engineers here, who say "we gave it a lot of thought, and we know better." Here on HN I frequently notice that people fail to recognize that, at the end of the day, to err is human. Agendas change.


> Nvidia is full of smart people

If that were truly the case, I wouldn't have to deal with multiple fuck-ups caused by Nvidia releasing drivers with improper signing (Windows seems fine with it, but VirtualBox runs into issues due to its hardening checks).


I think it’s safe to assume that PC gaming is dead for the immediate future or until Ethereum moves away from mining


No one could have seen that coming... No one!


That took longer than I thought it would.


cool cool cool. I never wanted a new video card anyways.


New idea: Break the cards.

A gamer doesn't need mathematical perfection. 99% of gamers wouldn't care if a card rendered something slightly off, a pixel or two out of place on an 8K screen. A slight mathematical error wouldn't be noticed. But that same math error would destroy any mining effort. So I propose that, rather than feeble attempts at software-based DRM, Nvidia actually break the cards: add a tiny math error that gamers won't care about but that renders the cards useless for anything requiring exact calculation.

This would of course be very difficult to design.
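To see why the asymmetry exists at all, here is a quick demonstration that a one-bit floating-point perturbation is invisible in a color value but fatal to a hash (the specific values are arbitrary):

    import hashlib
    import struct

    # Demo: a one-ULP floating-point error is invisible in a rendered color
    # but catastrophic for hashing.

    def flip_low_mantissa_bit(x):
        """Perturb a double by one unit in the last place of its mantissa."""
        bits = struct.unpack("<Q", struct.pack("<d", x))[0]
        return struct.unpack("<d", struct.pack("<Q", bits ^ 1))[0]

    # Rendering: the perturbed channel still rounds to the same 8-bit value.
    color = 0.7231
    print(round(color * 255), round(flip_low_mantissa_bit(color) * 255))  # 184 184

    # Hashing: the same perturbation produces a completely different digest.
    h1 = hashlib.sha256(struct.pack("<d", color)).hexdigest()
    h2 = hashlib.sha256(struct.pack("<d", flip_low_mantissa_bit(color))).hexdigest()
    print(h1[:16], h2[:16], h1 == h2)  # different prefixes, False

The catch, as replies below note, is that mining hashes are integer math in the first place, and plenty of legitimate GPU work (GPGPU, physics, multi-pass rendering) also depends on exact results.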


Modern engines render scenes in multiple passes, iterating over the same framebuffer until everything has been rendered. Engineering such a flaw in a way that wouldn't lead to dramatic cascading effects will indeed be very difficult.

Note that games sometimes do need absolutely exact, or at the very least reproducible, results. For instance, you may want to use something like `glDepthFunc(GL_LEQUAL)`, which accepts a fragment only if its depth is less than or equal to the depth buffer's value; multi-pass rendering relies on the equality case matching exactly. A small fudging here could cause potentially large visual issues.

I'm not saying it's impossible but I expect that it would create a lot of headache as random, potentially unmaintained games would start glitching here and there.


Look at the Dolphin blog posts for loads of examples of where minor math errors add up to completely break a game.


The other big market Nvidia wants is ML installations in data centers, which also need exact math and are pretty big business for Nvidia. Also, no one is running an 8K monitor off a 3060, or particularly well off a single card, usually. And I dread trying to track down a rendering bug only to find it's an intentional defect in the card; effects are getting pretty intricate, and with raytracing small errors accumulate over multiple bounces.


Intel managed this accidentally with the FDIV bug: for certain very specific values, floating point division gave the wrong answer.

Then there was the traditional rigged demo solution; one generation of cards behaved differently when the program running was named QUAKE.EXE. That is now almost standard, since part of the reason NVIDIA driver downloads are so large is a huge pack of compatibility tweaks for specific games.


I'm always amazed how consumers will demand less for their money.


I definitely wouldn’t buy something that had been intentionally crippled, on principle. Even if it were a very slight crippling.


Which would also make the cards useless for any GPGPU workloads.


Isn't that exactly what they want? To only allow the cards to be used for gaming, not compute tasks like mining.


The problem with that however is that games might be using the GPU for non-visual things such as physics or AI. Would those 'off by one' errors impact those tasks?


Yea. But in a good way.


That might actually suit NVIDIA's interests just fine - they could sell uncrippled GPGPU-capable cards at a premium whilst claiming that they're helping to protect the supply of cards for gamers. However I think it would be difficult to do this without also breaking rendering pipelines.


This would split the market for ML as well.


Approximate ML is a hot research topic.



