EVGA terminates Nvidia partnership [video] (youtube.com)
742 points by ribosometronome on Sept 16, 2022 | 479 comments



Having watched most of the GamersNexus video, my takeaways were:

- Will maintain some stock for RMAs

- Made pre-production boards for 4xxx, will not manufacture

- 3080 and above are not profitable for EVGA

- NVIDIA is not transparent about pricing with board manufacturers; they find out the MSRP at the same time we do

- NVIDIA caps MSRP heavily on cards, so board partners can't differentiate enough to sell more profitable higher-end cards

- It sounds like (this is my takeaway from GN's description, not their words) EVGA's CEO is tired of dealing with NVIDIA and wants to refocus on family; there isn't a clear person to replace him, and he is hesitant to sell to people who would mistreat employees, cut corners, or damage the brand.

- Will continue selling power supplies

- No plans to make boards with AMD or Intel; the CEO sounded dismissive of the idea


I'm sure NVIDIA is a serious pain in the ass to deal with (Linus_middle_finger.gif), but perhaps one of the unspoken reasons is that, as suggested by the rumor mill, there's a serious oversupply problem with next gen GPUs. NVIDIA and AMD placed orders with the foundries based on the ridiculous demand levels they were seeing in the last couple of years, but now demand has cooled off from both crypto (Ethereum merge) and gamers (waiting for the new gen), so there's still plenty of current gen stock sitting around. Supposedly they tried to cancel some of the volume with the foundries, but were not allowed to. So maybe EVGA is seeing the writing on the wall and getting out ahead of the game. I'm sure putting up with all of NVIDIA's crap is barely worth it even when making a decent profit. But it's definitely not worth it to EVGA if their cards will just end up languishing in the bargain bin along with everyone else's.


>I'm sure putting up with all of NVIDIA's crap is barely worth it even when making a decent profit

It's illustrative to remember who and what started this: nVidia's practice of intentionally undercutting their manufacturing partners. See, manufacturing partners tend to apply secondary binning to their cards and sell them with an overclock, delaying their obsolescence by a year or so, and that's bad when you're in the business of selling the chips themselves (while neutral to the secondary manufacturers; you just buy a new GPU from them a year later, who cares).

So it really isn't a surprise that, combined with artificially limited supply for the 40-series GPUs (most of that allocation likely going to, well, nVidia themselves), the price nVidia likely demanded for those GPUs is now coming around to bite the secondary manufacturers: they all bought (or were required to buy) the GPUs at inflated prices, and now they're leaving the business.

And guess what manufacturer wasn't forced to pay that premium? That's right, nVidia! So they alone have room to drop prices and still make a profit, while making the value proposition even worse for the secondaries (price premiums for overclocking make less sense the further prices diverge from the base model, and those improved coolers are more expensive to make than nVidia's own solutions).

EVGA is understandably the first to bow out; they've never dealt with AMD (who have supply problems for other reasons), whose GPUs haven't been the premium product for a while (arguably, ever), and if they say it's no longer profitable even before a primary use case for GPU compute vanished overnight, I'm inclined to believe them.

And sure, there are still going to be secondary manufacturers producing cards, but most of the other manufacturers either are do-all companies (large OEMs more famous for motherboards and laptops), produce AMD cards too (Zotac, Sapphire, Inno3D), or are otherwise diversified enough (PNY) that ramping down production wouldn't kill them as a company...


> they've never dealt with AMD (who have supply problems for other reasons), whose GPUs haven't been the premium product for a while (arguably, ever),

Yep. AMD can't write Windows graphics drivers that are stable and performant to save their lives, their cards use enormous amounts of power compared to similarly performing NVIDIA cards, they're nearly useless for computational stuff because most everyone goes for CUDA, and AMD cheated on benchmarks by overclocking/overvolting their cards and slapping huge heatsinks on them so that they'd last just long enough for a benchmark to run before starting to overheat.

There are only three markets for AMD cards that I see: Hackintoshes (increasingly irrelevant with the shift to the new Apple Silicon and the inevitable phase-out of Intel-supporting macOS releases), Linux systems, and AMD fanboys. All pretty small markets.

AMD is properly fucked in the GPU market now that Apple is no longer using discrete GPUs, unless they manage to bring to bear the sort of engineering might they did for Ryzen (where they're also slipping as Intel pushes back). NVIDIA can carry on with all their datacenter-class ML products and lose a bunch of money on the consumer side. The problem is...if AMD drops out of the GPU market, that leaves NVIDIA and Intel, and reportedly Intel's product is even worse than AMD's. If NVIDIA is insufferable now, imagine them when they're the only game in town...


> There are only three markets for AMD cards that I see: Hackintoshes (increasingly irrelevant with the shift to the new Apple Silicon and the inevitable phase-out of Intel-supporting macOS releases), Linux systems, and AMD fanboys. All pretty small markets.

You're forgetting two more, one of which is massive: consoles and high-performance embedded gaming systems (basically, the Steam Deck and other high-end handhelds). Consoles aren't going to be super profitable on a per-unit basis, but AMD is making the chips both Microsoft and Sony are using in their boxes, and they're selling a lot of those boxes (right now, about 36MM since they debuted a little under two years ago). Similarly, the Steam Deck is a pretty damn good first attempt at a cohesive handheld that runs PC titles, and I could see that extending past Valve and into a generalized marketplace (and the fact you can plug a USB-C cable into a dock and get a real Linux desktop is really, really cool).


You are also forgetting another market: HPC. AMD MI200 GPUs are the boss cards to have if you want to do HPC right now.


That's the first time I heard about that, do you have any examples? Google shows me one example of Microsoft using them but that's all I can find.


One I know of is the EU-funded LUMI supercomputer: https://www.lumi-supercomputer.eu/lumis-full-system-architec...

"The AMD MI250X GPU is in a class of its own now and for a long time to come. The technical supremacy and performance per watt were the primary reasons why AMD’s MI250X GPUs were selected for LUMI, explains Pekka Manninen, Director of LUMI Leadership Computing Facility."


You might look at the current No. 1 on the Top500 list, Frontier at ORNL. https://www.olcf.ornl.gov/frontier/ Arguably the most powerful machine today, delivering >1 Exaflop/s performance.


Also El Capitan supercomputer, and there are more.


I was very specifically speaking about the market for AMD's GPU cards, in the context of a discussion about GPU cards. I'm not being pedantic here.

AMD's embedded GPU unit may do fine, and it would be smart for them to shift their development priority to retain that market share. They lost the desktop/laptop discrete GPU battle years ago.


Replace GPU with CPU and you sound like someone who was very wrong after 2018 came around. Never say never.


I just looked at different benchmarks and the general consensus seems to be that latest Radeon and Nvidia cards are quite similar in performance. Nvidia seems to lead in raytracing and upscaling but in my opinion only the upscaling part is really important. And I bet that all GPU makers will have a decent solution available for that soon.


and Teslas. Seems like AMD’s play is high performance (somewhat) custom silicon as a service.


> AMD is properly fucked in the GPU market now

AMD GPUs are in XBox and Playstation.

> AMD can't write Windows graphics drivers that are stable and performant to save their lives

When they changed to RDNA they had teething issues. But their drivers are stable now.

> their cards use enormous amounts of power compared to similarly performing NVIDIA cards

Is this a joke? The 6800 XT draws less power than a 3080 12GB while having similar FPS. Ignoring FSR2 or DLSS.

Disclaimer. I've only ever had 1 AMD card (5700XT) and it was rock solid. Currently using a 3070 and 3050Ti.


> their cards use enormous amounts of power

People on enthusiast forums are now talking about buying 1.2kW+ power supplies for the new generation of nvidia cards. I don't think you can point at one or the other as bad here.


people on enthusiast forums have been sold a bill of goods by a couple attention-seeking twitter stars like kopite7kimi.

remember his rumors about the 4090 having an 800-900W TGP? That got whacked way down in recent months by more reputable leakers, and now the number is rumored to be around 450W TBP (more like 375-400W TGP), which is plausible given the size of the die (and much more comparable to the rumored 400W of the dual-die AMD products).

So, off by basically double.

https://hothardware.com/news/geforce-rtx-4090-ti-allegedly-a...

Similarly he had 4080 at 600W and other Shit That Ain't Gonna Happen. Stupid partner boards pushing everything to 2x the official spec? maybe, but as a reference TGP? lol no.

if you believed that 800W was ever a plausible configuration for a consumer product, then I have a bridge to sell you. It was most likely some kind of engineering test bios that turned everything up to 11 at max clocks and he heard about that number and just didn't apply any common sense. Which, why would he when he could get months of attention off it from twitter?

People just race to buy into anything negative that is rumored about NVIDIA products and it's taken as absolute gospel as soon as anyone hears it. And the leakers have realized that and play into it.


I think that the fact it was even plausible to people who are fans of nvidia, and that it is discussed on said forums, means that customers don't have any expectations of nvidia keeping a lid on the power requirements. Both big manufacturers are recommending at least 750W supplies for their current flagships.

As such I still don't think it's fair to characterise just AMD as wasteful power hogs here, the post I was responding to seems to me to be unnecessarily partisan.


>AMD can't write Windows graphics drivers that are stable and performant to save their lives

I have an AMD GPU in my Windows PC and it works fine. It's stable. n=1 of course.

The reason I bought it is because I have a linux machine that currently has an Nvidia card. Fuck Nvidia and their linux drivers.

>their cards use enormous amounts of power compared to similarly performing NVIDIA cards

Don't think this is true at all. Eg first link I came across https://www.tomshardware.com/features/graphics-card-power-co...


Aren't all the game consoles running amd?


Yes, and I'm absolutely certain they're kicking themselves for having agreed to do so.

As far as I know, console manufacture comes with supply agreements- you agree to supply X number of processors at Y cost... even if the price of those processors, when fitted to a PCI Express card instead of a PS5 motherboard, would net 3 times the profit.

It's also a nice way to get out of having bad drivers- if developers are making games to the PS5 standard, then any errata becomes their problem more than strictly AMD's...


It's far more nuanced than this, really. Much like Apple funds a lot of TSMC's early R&D for those nodes - it's not just that they're willing to bid more, they are a risk partner in the development of the technology itself - console companies fund a lot of AMD's early-stage graphics development. Without their funding, the Polaris/Vega era of GCN would have seen even less iteration from AMD. And they often also fund custom variations on the general architectures, which helps keep hardware engineers employed. And they also contribute to AMD's wafer purchases, which lets AMD get better deals on the rest of its stuff because they're buying 2x as many wafers as they could use themselves.

The situation with TSMC being super super limited during the pandemic is unusual, and it was significantly amplified by the crypto boom happening at the same time. Normally it's actually a pretty good deal for AMD in a lot of ways.

The console companies tend to be very very conservative in their designs though, and letting the console companies set the direction has left AMD significantly behind on many task-specific accelerators like tensor and RT. And tensor is no longer a niche thing, neural accelerators have been everywhere in phones for ages, Intel's laptop chips have had neural accelerators since 10th gen, Intel desktop 11th gen had it, AMD's own 7000-series CPUs have neural accelerators, Intel GPUs have it, etc. Different product segments are finding different uses for it, and AMD has not just been left out in the rain but is actually slowing the development of the whole field because of how tardy to market they are. And beyond that they've taken a strongly anti-consumer position against open APIs that allow other brands to utilize hardware accelerators or closed-source libraries.

https://youtu.be/8ve5dDQ6TQE?t=974

Similar problem with RT... AMD's implementation is very lackluster and has been slowing the whole industry down. It's about half the performance of NVIDIA's first-gen implementation. A lot of times it just seems like AMD automatically assumes that NVIDIA is incompetent and wasteful, and there's some big savings to be had, so they'll take whatever NVIDIA does and cut it in half or a third or whatever (which is what they're rumored to do with tensors next generation - it has "accelerator instructions" but less powerful than the full dedicated units on CDNA, and presumably weaker than NVIDIA's dedicated units as well). Not sure if it's that, or if they're just cheap on silicon, especially with the console market's influence...


You're overselling the importance of console chips for AMD's bottom line. Console chips have notoriously low margins due to being custom in nature and cost-reduced to oblivion, making their development an expensive PITA for very thin returns.

It's why Intel and Nvidia refused to work with Sony and Microsoft on their console chips, and only AMD took the original contract: they were close to bankruptcy at the time, so every extra dollar mattered to them.


What I'm saying is that the final sale price isn't the only relevant number here - Sony and MS are paying to develop the graphics architecture in the first place. That's revenue on the frontend that helps develop the chips in the first place. The B2B sales of the console chips that result are lower-margin than the dGPU sales into the broader consumer market, but, it's hundreds of millions of dollars of R&D money that AMD doesn't have to find elsewhere a half decade before the product goes to market. That money has a lot of time-value, because it's available now, not 5 years and 10 billion dollars from now.

Early VC funding rounds are way less than you'll see when your product goes to market too. Applying your accounting philosophy, a couple million bucks surely must be insignificant compared to billions of dollars of sales revenue we'll see later, right? And they want how much equity? What a terrible deal.

Except for the part where you fold years before you get to "later", of course.

And that was a very, very, very real concern with AMD. They were right on the edge for years and years during the Bulldozer years, Ryzen was very much a last-chance move for them and they were obviously still scraping bottom for the first couple years even then. There are numerous factors that all aligned to keep them alive - they would not have been alive without Intel misstepping on 14nm (the early days were very bad) and then 10nm, they would not have been alive without Apple writing huge R&D checks to advance TSMC 7nm and 5nm far enough ahead of the market that GloFo bowed out of the game, they would not have been alive without consoles keeping the lights on while they spent every penny on Zen R&D, and they would not have been alive without their interconnect turning all their IPs into legos and allowing them to break and scale them.

Things were really, really, really touch-and-go for AMD in 2016/2017/2018. I figured they were going under. I have no doubt that console R&D money (along with those other factors) was instrumental in keeping the lights on at RTG.


> even if the price of those processors, when fitted to a PCI Express card instead of a PS5 motherboard, would net 3 times the profit.

The GPUs in Xboxes and Playstations are custom silicon.


True, but they are based on the RDNA arch. Same as the CPU cores are monolithic Zen cores.


I doubt they were kicking themselves considering they knew what was coming based on their experience with the last gen consoles.


It's not like they have any other options. Who else is there? NVIDIA? Microsoft's issues with them show how precarious that relationship is. I'm sure issues with NVIDIA are why we haven't seen an improved Switch yet, and I think there are clear concerns about potential SOCs for a Switch 2. Unless you have to, you don't want to work with NV.


All except Nintendo ;(


RDNA2 APUs are in a similar place as the Tegra X1 was when the Switch first came out; I don't think it's impossible Nintendo switches back to AMD graphics, they have a long history of shipping Radeon in their home consoles all the way back to the GameCube.


What? Did you go out of the loop at least 4 years ago?


citation needed.

I've had zero problems with AMD hardware on Windows (the Linux kernel, on the other hand, still won't boot out of the box with this Vega; the amdgpu driver is a disgrace), and other than Windows 11 being a complete failure, nobody has reported serious problems.

Re benchmarks/power: they scale dynamically and don't have as tight power envelopes as other vendors, so obviously they will use more power [citation needed also].

Their generic GPUs are very fast, but why do they even care when they own the console market, the higher end of the car infotainment market, etc?


Holy shit, is this guy being paid by nVidia?


> whose GPUs haven't been the premium product for a while (arguably, ever)

It wasn’t until the nVidia Geforce 285+295 that they unequivocally regained the performance crown (three years after the acquisition). AMD was still competitive, at the flagship level, until about 4 generations ago; so they were a reasonable alternative, if slightly less performant.

It was the constant rehash of GCN cores that screwed them up, with their limited ROP configuration and generally weak Shaders (after the third+ respin).


That lines up with the impression I got. They lose money on the high-end cards currently, and it sure wasn't going to get any better.

Might as well get out while the getting is good.


Unsurprising, considering the stupid games NVIDIA was playing during the past two years. There are no cards, but vendors aren't allowed to raise prices. Thus creating a cottage industry of zero-value-add scalpers.

It's not shocking that the vendors aren't interested in getting zero of the upsides, but all of the downsides of the bullwhip effect.


I was actually surprised to hear that high end cards are the ones where money is lost. Typically high end products are the ones with by far the biggest margins. They might just not be worth the R&D effort - but in this case I doubt that one is the problem.

To lose money, it seems like the raw chip price differences between mid level and high end cards would need to be even higher than what the MSRP differences are.

But if that's the case, why not just sell midrange cards? Or just a tiny amount of high end ones for advertising purposes.


>To lose money, it seems like the raw chip price differences between mid level and high end cards would need to be even higher than what the MSRP differences are.

NVidia's datacenter products seem to charge for GDDR like it's made from gold. Memory prices may play a role.


> I was actually surprised to hear that high end cards are the ones where money is lost. Typically high end products are the ones with by far the biggest margins.

two things:

first, high-end cards are the ones that miners liked, so, partners really loaded up on orders of those cards. They sold a huge number of high-end cards relative to previous generations. And now the mining market has crashed, and they're stuck with a huge number of high-end cards (relative to actual sustainable demand) both at the vendor (NVIDIA) level and the partner (EVGA, et al) level. And miners are dumping their cards, so you have a huge number of high-end cards flowing back into the market too.

secondly, the card he cites as being "their highest-margin card" is a model that was a bad deal at launch and has seen virtually no price reduction since then. I have actually seen quite a few people commenting on just how little prices of that specific model have dropped, with the 3060 Ti and 3070 pushing downwards (and those are significantly faster than the base 3060). On the AMD side, you can now get a 6700XT (which again, significantly faster) for the same price, which is a performance bracket higher. It's an oddity of that specific model.

so, what he's saying is, an overpriced card that has seen no price reduction while much faster cards crash around it, has high margins - "no shit", as they say. Doesn't mean anybody is really buying it though.

the whole thing is very much "true, from a certain point of view"... like his "wow we're losing money!". Yeah, now that the GPU market is crashing you're losing money... and during those years, the company as a whole lost money because this CEO has been doing all sorts of zany business ventures that ended up in massive losses... the GPU division made money hand-over-fist for the last 2 years, and he's lost money hand-over-fist by trying to break into the enthusiast monitor market and enthusiast motherboards (both extremely support, warranty, and R&D intensive markets) and doing a ton of branding deals where he sticks EVGA logos on hdmi capture devices and pcie sound cards (in 2020, seriously) on products licensed from other vendors (who it turns out were skeezes and EVGA was on the hook for a ton of defective and falsely-marketed products).

https://youtu.be/QsICWRnj-60

he ran the company into the ground and is trying to shift blame to NVIDIA. Yeah, the board-partner thing is predatory, but partners don't really add significant value to the product anymore, and middlemen get squeezed out of business everywhere. Yeah, EVGA is in deep shit, and he's lying when he says they're not going to go under, the writing has been on the wall for months now. The GPU market crashing is almost certainly the last straw for EVGA and they won't be able to pivot to their remaining markets.

But like, he did make a ton of money on GPUs over the last 2 years, he just lost a ton of money on other stuff, and that's not NVIDIA's fault. Partners in general ordered way too much (and again, particularly the high-end stuff) trying to cash in, they made fucktons of margin on it during 2020-2021, and now they are lobbying NVIDIA to cover their downside and buy back all the chips they have left over. This is part of that push, and they know the public generally doesn't like NVIDIA, so it's worth a go.

And it's true that unless NVIDIA buys back the chips from EVGA, they probably go under. I imagine that's not a very pleasant conference room to be in during the negotiations. GN notes the extremely "personal" tenor of this guy's affront. If NVIDIA doesn't cave, his company goes under, and he's toast because EVGA is a pillar of the fucking community. Super tempting to try and go around the negotiations and take your case to the public, try and cash in on anti-NVIDIA sentiment (which is broad and intense), and try to appear to be the good guy.

Partners tried to pull this same stunt in 2018... they ran to tech media framing it as "NVIDIA is forcing us to buy obsolete chips if we want next-gen ones!" and if you read the details, what actually happened is partners wanted to cancel their contractually-agreed orders after mining tanked, and NVIDIA said no, if you do that it's over, we're not giving you 20-series chips if you break your contract on the previous batch.

https://www.techspot.com/news/76103-nvidia-putting-squeeze-a...

I bet NVIDIA wishes they could cancel their contractually-agreed orders from TSMC too. That's not how it works unfortunately. But NVIDIA did knuckle under in 2018 and bought chips back from Gigabyte and perhaps other vendors, and now the board partners are hoping for a repeat. But that was 300k chips, which, while it's a lot, was tractable. The current overstock is like, millions of chips, all in the high end.

That's why both NVIDIA and AMD are launching top-first this generation... the lower chips would compete with those massively-overstocked 30-series cards. So, launch the high-end stuff over the top and let the 30-series inventory burn through.


>first, high-end cards are the ones that miners liked

Last I checked most miners went for 3070s and below, because they're energy efficient. Though when supply ran short they used everything they could get.

And you really didn't get the point about high end cards. EVGA previously served a segment of enthusiast consumers that would spend exorbitant amounts of money for slight performance gains / overclocking sport. But under the new NVIDIA rules you can't take a 3090 and put the highest quality components and beefiest coolers on it, because the price of a 3090 is limited by NVIDIA to be around the range of an FE 3090.


BTW there's a second take on some of this with Igor's Lab here.

https://www.igorslab.de/en/evga-pulls-the-plug-with-loud-ban...

Basically: other partners have confirmed they think EVGA is just being dramatic and making a statement in general here, that this is kinda just how being a partner is. And that EVGA in general has some factors that make it worse - they're super small and contract out all their assembly, so they take lower margins there, they have a very generous warranty by industry standards, and they've been losing money on other business ventures and the GPU market collapsing makes things tough for the company as a whole.

But they describe last year as a "10x margins year" which fits what I'm talking about with TUL Corp too. They think EVGA just wants to quit on a high note, and quit cleanly during a generational changeover.


Well, that blatantly wasn't the case - the FEs were under half the market price of other cards for a good portion of the last two years! The markup was coming in somewhere, and if the partners didn't get any of that markup then there's something very wrong.

Actually there probably was a lot wrong all over the place over the last two years, the whole market was insane.


I'm sure some retailers also pumped the price.


Zotac's first-party storefront and other first-party stores were selling 3090s at well over $2k, so if there were caps they were capped at ~50% above FE pricing or higher. Not "around the range of a FE", that doesn't even pass the smell test lol.

TUL Corp is a great example to look at the profit margins of GPU companies, because their revenue is approximately 95% GPUs (they own PowerColor) and they're publicly traded, unlike most of the other examples of GPU-heavy private companies (EVGA is privately owned). I've seen them used for similar analyses of the 2018 mining boom on seeking alpha [0], for example.

https://www.morningstar.com/stocks/roco/6150/financials

So, profit in 2019: -80m. Profit in 2020: 30m. Profit in 2021: 1.08 billion.

So, they made a 36x increase in profits that year. In contrast, topline revenue increased only 2.3x. So they had a massive fucking increase in margins.
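
As a rough sanity check on the margin claim (using only the profit figures above and the stated ~2.3x revenue growth; the exact revenue numbers aren't quoted here, so treat it as an approximation):

    # Profit grew ~36x while revenue grew ~2.3x, so net margin itself had to
    # expand by roughly the ratio of the two.
    profit_2020, profit_2021 = 30e6, 1.08e9
    revenue_growth = 2.3
    profit_growth = profit_2021 / profit_2020       # ~36x
    margin_growth = profit_growth / revenue_growth  # ~15.7x fatter net margin
    print(round(profit_growth), round(margin_growth, 1))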

People need to stop being so incredibly credulous, do you really think the CEO doesn't have his own angle here?

It's completely amazing, a year ago nobody would have pissed on these CEOs if they were on fire, because everyone knew they were selling straight to mining farms. And now everybody is just taking them at their word that they didn't make any money on GPUs during the mining boom, heck they probably lost money right?

Like gosh if they say they didn't rob the bank I guess we just have to trust them, why would they lie about something like that?

It's been really fascinating watching the "green man bad" field overpower everybody's common sense. I guess the reflexive hatred of NVIDIA is stronger than the hatred of miners and companies that catered to miners. So much so that people will believe obvious lies (or significant misrepresentations) from companies that were blatantly profiteering right in front of our faces for more than a year, as long as it caters to the "green man bad" perspective. It is incredibly frustrating to watch, as someone who considers themself a seeker of truth and insight.

But then, here we are with the "partners didn't really make any money on GPUs during the mining boom" take. Could it just maybe be possible that maybe the CEO is not being fully honest here?

[0] https://seekingalpha.com/article/4181940-asian-aib-read-thro... https://seekingalpha.com/article/4186417-tul-corp-revenue-da...


Think also about the cooling solution: more copper needed, more money.


Their lack of profits from high end video cards also existed in the 2000 series. No doubt worse with the 3000 series, but this is a long term war not a short term battle.

People will pay more for EVGA, because they were the only company that respected their customers. The only company you could actually get a decent RMA from and didn't continually break the law.

But Nvidia says no, there's a max cap on prices. Nvidia also competes with their partners with their Founders Edition series. It's an all-around win-win for Nvidia.

I can't imagine why Nvidia would let one of their most respected partners go. Unless Nvidia plans to shove out their partners and sell direct, as they did with the Founders Edition.


> Unless, Nvidia plans to shove out their partners and sell direct as they did with founders edition.

It's important to remember that this was a major contributor to the downfall of 3DFX, a good chunk of which was cannibalized by nVidia.


It was a contributing factor, not a major one. Their products were not competitive with Nvidia after 3DFX completed the purchase of STB and started producing their own boards.


Not competitive how? The only card to consistently outperform the 5500 was the GeForce 2 GTS[1], which was a half generation ahead. 3DFX fans would also argue that T-Buffering and other unique technologies made the image quality better.

What killed 3DFX was a combination of no AIB boards to push the architecture past reference; their Sega deal falling through and the delay (and eventual cancellation) of the 6000 as a competitor to the GeForce 2 and Radeon 7000 series’ (later tests of engineering samples showed it outperformed both). The latter being a direct result of the two former problems.

1 - https://www.anandtech.com/show/580/7


Yeah, this is a case where money is inevitably going to be lost and everyone is saying "not it". It's not clear that there's any "fair" way to allocate the losses that will result from massive GPU oversupply.


For what it's worth, the Ethereum merge was planned years ago (but the concrete release date may have been unknown until recently).

Enjoy the GPU glut. I for one am thinking of having fun with Stable Diffusion and the like.


Same here, I can finally get a cheap GPU worker for my weekend project (https://phantasmagoria.stavros.io, if anyone cares).


Have you crunched the numbers on how many users you need to justify renting a VM with an A100 or similar?

From my brief research they seem to be available for about $30 / hour, and can run 50 iterations in roughly 6 seconds.
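
Back-of-the-envelope, taking those numbers at face value and assuming one image is roughly 50 iterations (so about one image every 6 seconds); these are assumptions, not measurements:

    # Rough per-image cost at the quoted price; both inputs are the assumed
    # numbers from the paragraph above.
    price_per_hour = 30.0        # $/hour for the rented GPU VM
    seconds_per_image = 6.0      # ~50 iterations per image, ~6 s per run
    images_per_hour = 3600 / seconds_per_image          # 600 images/hour
    cost_per_image = price_per_hour / images_per_hour   # $0.05 per image
    print(images_per_hour, cost_per_image)

So the break-even question is really how many five-cent generations users trigger per hour that the machine is actually up, which is why starting/stopping on demand matters so much.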

One of the missing pieces for me is how many parallel requests they can serve, and whether it's feasible to start/stop the machines based on demand (it sounds like some providers don't allow the stopped machines to retain state long enough for this to be practical)

Would be interested in your thoughts, have been tinkering using my laptop GPU but it's dog slow and I have a vague idea for an online game that I might try out if I can summon enough mental energy one weekend


I didn't look for an A100 specifically, but the cheapest GPU spot instance on AWS is around $200/mo, if I remember correctly.

Since this is a hobby project, I probably will never be able to break even on that. I'm thinking of writing a bit of code that will use GPU spot instances to perform the computation, and turn it off when the queue is emptyish.

You don't really need any state on the machines, the worker communicates with a broker, receives a very small bit of JSON containing the parameters, does the work and uploads the image to R2. Then, it sends a message back saying that image is done.

That way, you don't even care if you get interrupted mid-generation, since the work is idempotent. I can just spin up a machine whenever available and tear it down whenever.
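
A minimal sketch of that worker, assuming a Dramatiq-over-Redis queue and boto3 pointed at R2's S3-compatible endpoint; the actor, bucket, and render-function names here are made up for illustration:

    import boto3
    import dramatiq
    from dramatiq.brokers.redis import RedisBroker

    # Assumed setup: a Redis broker reachable from the worker, and an S3 client
    # aimed at R2 (R2 speaks the S3 API). Endpoint and bucket are placeholders.
    dramatiq.set_broker(RedisBroker(host="localhost"))
    s3 = boto3.client("s3", endpoint_url="https://<account-id>.r2.cloudflarestorage.com")

    def render(params: dict) -> bytes:
        # Placeholder for the actual Stable Diffusion call; returns PNG bytes.
        raise NotImplementedError

    @dramatiq.actor(max_retries=3)
    def generate(job: dict):
        png = render(job["params"])
        # Overwriting the same key makes retries after a spot interruption harmless.
        s3.put_object(Bucket="images", Key=f"{job['id']}.png", Body=png)
        mark_done.send(job["id"])  # tell the web app the image is ready

    @dramatiq.actor
    def mark_done(job_id: str):
        ...  # the web app records that the image exists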

It will be nice to try, though I think I'll use my desktop for a while yet due to a lack of users. That one takes around a minute to generate an image, and can't run while I'm using it.


This is essentially what I have in mind - some kind of queue, with a mediator that starts/stops a spot instance using a custom image that has everything installed and configured to start on boot

With regards to state I just meant the image - wouldn't want to have to install everything at each start. Using something like aws or gcp I don't think this would be an issue, but I was window shopping some providers that specialize in GPU VMs and it wasn't clear if they provided equivalent services. Probably easiest to just use aws.
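
For the AWS route, a minimal mediator could be a small loop run on a schedule, something like this sketch (queue URL, AMI, instance type, and tag names are all placeholders; error handling and spot-interruption niceties omitted):

    import boto3

    sqs = boto3.client("sqs")
    ec2 = boto3.client("ec2")

    QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/generation-jobs"  # placeholder
    AMI_ID = "ami-0123456789abcdef0"  # custom image with everything preinstalled

    def pending_jobs() -> int:
        attrs = sqs.get_queue_attributes(
            QueueUrl=QUEUE_URL,
            AttributeNames=["ApproximateNumberOfMessages"],
        )
        return int(attrs["Attributes"]["ApproximateNumberOfMessages"])

    def running_workers() -> list:
        resp = ec2.describe_instances(Filters=[
            {"Name": "tag:role", "Values": ["sd-worker"]},
            {"Name": "instance-state-name", "Values": ["pending", "running"]},
        ])
        return [i["InstanceId"] for r in resp["Reservations"] for i in r["Instances"]]

    def reconcile():
        jobs, workers = pending_jobs(), running_workers()
        if jobs and not workers:
            # One spot GPU instance; it boots from the prebaked image and
            # starts pulling from the queue on its own.
            ec2.run_instances(
                ImageId=AMI_ID, InstanceType="g4dn.xlarge", MinCount=1, MaxCount=1,
                InstanceMarketOptions={"MarketType": "spot"},
                TagSpecifications=[{"ResourceType": "instance",
                                    "Tags": [{"Key": "role", "Value": "sd-worker"}]}],
            )
        elif not jobs and workers:
            ec2.terminate_instances(InstanceIds=workers)

Run reconcile() every minute or so from cron; if the spot instance gets reclaimed mid-job, the queue redelivery behaviour discussed below picks up the slack.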


I think if you couldn't make a ready-made image you'd use a Docker container with everything in it instead. Though then I don't know how you'd use spot instances, because AFAIK you want those for the "whenever it's available" startup and can't really be logging into each one to start them up. If they let you run a container, presumably they let you run other commands to set stuff up as well?

This kind of depends on the provider, but it's mostly an implementation detail. The main crux is a broker and a queue, I use Dramatiq (the site is written in Python) and it works great.
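
On the enqueue side it's then just the actor's .send() from a web request handler (reusing the hypothetical generate actor from the worker sketch above):

    # Called when a user submits a prompt; the broker does the rest.
    generate.send({"id": "abc123",
                   "params": {"prompt": "a phantasmagoric landscape",
                              "steps": 50, "seed": 42}})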


With spot instances you just have to be prepared for them to get terminated at random. If you're processing messages off a queue in most cases you get this for free - the ack deadline will expire and the message will be redelivered to another subscriber.

Docker container might be a good shout, though I've not done anything using a GPU from docker before


Yeah exactly, the queue handles this for free. Docker with the Nvidia extension actually works better for me than native, Torch doesn't work on my system but works in Docker.


Finally a sane and useful realization of push messages.


Haha, more of a necessity, but yes, it is very nice to have.

Oddly, I did see an issue yesterday with the push notification arriving multiple times (far apart from each other) even though the server only sent it once.


There is a fair way: it’s the open market.


> Supposedly they tried to cancel some of the volume with the foundries, but were not allowed to.

I would hope so, considering the big chip producers bought out the majority of the supply throughout COVID. Allowing them to bail out on that contract without providing an alternate buyer would set a bad precedent and allow them to monopolize resources whenever they desired.


Who would have imagined that adjusting your production rates around the most volatile and unstable asset, a.k.a. cryptocoins, would be a bad idea.


I mean, the supply shortage was a big problem for people that just wanted to play a game. My GPU died and I had to buy a used RTX Titan (in the 3090 era) just to have anything. (It's great though, I use it for gaming, ray tracing, and machine learning. Still looking forward to the 40xx series though.)


And that's the second time they've done this, after 2018 crypto crash.


Third: there was another mining boom back in 2014, when it was bitcoin being GPU-mined. It fucked with AMD's Hawaii/290X launch really badly back then - the cards were unobtanium for months and months, which gave NVIDIA an opening to launch Maxwell. Then the prices crashed and miners dumped their inventory, so you had new aftermarket 290Xs being sold for $250-300 to try and compete with mining cards, with used/refurb 290 non-X bottoming out around $175.

I'm sure the idea that infinity cache decreases the relative mining performance was in everyone's minds at AMD when they were designing RDNA2 - by that point they'd had two major launches (vega too) fucked up by mining, and using more cache/less VRAM bandwidth makes your mining performance worse relative to raster, because mining loads aren't cachable.


can't they "easily" retool for "AI cards"?


I don't think AIBs like EVGA are allowed to make server cards and the demand for those is not unlimited.


It's not like they are not allowed, it's that no one will sell them actual chips to make them.


There are server cards that use GA102, GA104, etc. which EVGA has a glut of.


It's not the same variant of a chip, though.

A10 uses GA102-890-A1 and there is no retail card on that variant. Variant is very important because Nvidia drivers are...well...not consumer friendly?

For example, Nvidia vGPU only works on certain cards if you ignore ways that break EULA. I don't remember the exact details, but there was also a software lock that prevented using some optimized paths on the consumer's GA102, but used the optimized path for GA102-890-A1.

If Nvidia is great at anything, that would be hostile market segmentation by their drivers.


I gave up playing around with AI partly because the drivers are proprietary and not too good.

Only in the past year or two has Nouveau developed well enough that some games are playable. But only on cards not enforcing signed firmware.

How much more anti-consumer can you get after requiring signed firmware? That is when I started my boycott.


Nvidia has control to fix that though.


Having watched other coverage, some more takeaways:

- Nvidia doesn't give board partners real drivers until they are public, which is why EVGA 3090s were bricking themselves at launch; they only had limited pre-release drivers. How can EVGA make a good product when they don't have the dev tools to make the product?

- Nvidia doesn't give retail partners prices or info at all until the public has the information, but requires order size estimates long before that. How can retailers predict sales volumes if they don't know price or performance?

- Nvidia blacklisted Hardware Unboxed for giving the 3090 positive reviews, but they didn't talk about the proper talking points (8K gaming capability). They don't care about the review or increasing sales - they already have the top-performing GPU - they just care about controlling the narrative and being manipulative.

AMD GPUs aren't as good, but Nvidia isn't SO much better that AMD is just full-stop not an option.

No company is your friend. But Nvidia seems to actively be an enemy to a lot of people, and they would be profitable even if they were nicer, they have the best product. Which means they are just actively bad people.


>AMD GPUs aren't as good

Do people still believe this?

I wonder what it'll take. RDNA3 is coming soon, and it should be interesting to see what happens.


I'm looking for something that performs on par with or a bit better than an nVidia 1080Ti, at about the same price point (~300€), and I just can't find anything. Significantly newer Radeon RX 6600 cards have 10-20% worse performance according to benchmarks, for a similar price point (300€). The 6900XT outperforms it by 30% but costs ~1000€.

I just want to connect 3 screens, play Team Fortress, and have my GPU not die in the next 2 years on the cheap ;-;


Comparing a used card to a new one is never a 1:1 comparison though. An RX 5700XT performs better than a 1080Ti and is priced around the same.


Do you mean RX 6700XT? 5700XT performs worse than 1080Ti in two benchmarks I checked and I can't even find any to buy :/ Cheapest 6700 I can find is 470€ and does perform slightly better. I just might get that when the prices come down a bit further, thanks.

Edit: Also, 2017 level performance in 2022 shouldn't cost 2017 level prices.


The 1080Ti was not a $300 card, tho ;)


Orly: I paid over A$1200 for a Gigabyte Aorus Xtreme 1080 "Tai" back in 2017.

Still works stellar today, in my son's PC.


Did you factor in electricity costs?


It's a common narrative and doesn't really have a counter, but those who actually bother to look know that AMD are actually really fast (just like the old days when it was Intel v AMD)


I run Linux at home. I have heard people say over and over that AMD is king on Linux.

I have a 6900XT and while it performs well, the driver is actually shit. I have had issues with one or more of my three displays not starting on boot, or not waking up from monitor poweroff.

Pinning the kernel to an older version seems to keep things stable. I was dealing with that sort of breakage what seemed like once a month for a year.


Should have contacted bridgman from RTG on Reddit. He's very helpful with driver issues, and if what you are seeing is a bug, he will collect the info on it and study it with the team.


Seriously, one of the best things about the AMD Linux drivers is that you have direct access to the developers including a public bug tracker [0] and IRC [1]. Maybe not the kind of customer support people are used to but I honestly prefer this.

[0] https://gitlab.freedesktop.org/drm/amd/-/issues (I think this is the correct one now that FDO migrated away from https://bugs.freedesktop.org/)

[1] #radeon on OFTC - https://people.freedesktop.org/~cbrill/dri-log/?channel=rade...


> I run Linux at home. I have heard people say over and over that AMD is king on Linux.

Idk about that, I've never believed it. My N=1 anecdata over 12 years of using Linux with the binary nvidia blob says I've generally had a good time. I only have used it on a desktop though.

It’s not like I leave the card alone either as I game on Linux, do dynamic card binding with kvm passthroughs etc.


Aren't AMD GPU drivers known to be unreliable on linux at launch and stabilize later on?

I have similar issues with my 1080Ti by the way; it just won't recognize one or two screens and will mess up the display configuration if I turn one of them off. Going into display settings, applying a new configuration and reverting it usually helps.


GCN took a bit to stabilize but my RX 6900 XT worked fine on launch (well, about a week after launch when I managed to get one) with the newest drivers - if you are on a slow distro then you might lag behind what is available upstream though.


I have a 6900xt. It's like 3% slower than a 3090 at 1440p. Which means that it is "not as good" as Nvidia GPUs. I did not mean to imply that they aren't good, but simply that Nvidia ultimately holds the crown. If you are only looking at the absolute #1 slot, yes, Nvidia is "better".


The 3090 also uses ~20% more power. [0] 3% seem such a small difference considering that in reality a card's performance is not just a single number.

[0] https://www.tomshardware.com/features/geforce-rtx-3090-vs-ra...


3090 also costs 50% more at MSRP (not going to pretend to know the current state of post-ethereum GPU prices).

But ultimately, people who buy halo tier cards generally don't have a budget set, which means PSU, cooling options, and electricity bill aren't an issue. If you buy either a 6900xt or 3090 and can't afford an adequate PSU and electricity bill, you're probably better off with a completely different tier of GPU, so I get why "raw performance" is the crown that gets chased. MSRP of 3090 is roughly double that of 3080, but obviously it's not double the speeds. There's too many things to consider when you consider "best' GPU so "highest performing" is easiest to refine down to.

I don't fully agree, but I get it. And I opted for the 6900xt so you don't need to convince me of the benefits (for me, Linux open source drivers were on the list as well). But I'm not going to pretend like there is a super readily approachable alternative.

userbenchmark.com notoriously changes their benchmark ratings every time AMD releases a CPU because "AMD bad, Intel good". The Core i3 9100F is rated higher than the Threadripper 2990WX [0]. And I don't just mean "on the subjective aspects". They put the i3 at rank #167 among all CPUs and the Threadripper at #231. Yes, they claim the i3 is faster than the Threadripper, because they don't count ANY scores over "8 cores", and I'm pretty sure that means "8 threads" and not 8 cores. Because "no workload needs more than 8 threads, so we're just going to ignore that." They also release a "review" of new AMD products every time they come out claiming that AMD is garbage for made-up reasons.

All this just to say that the second any subjectivity is introduced into the mix, you'll wind up with people skewing tables to show the "real fair metrics" that favor their team. I'd rather just say "yeah, nvidia holds the crown and is technically 3% faster, but 3% isn't enough for me to care". I'm not going to start a debate over "Well actually, Nvidia is better because FPS is the only metric that matters!"

[0] https://cpu.userbenchmark.com/Compare/AMD-Ryzen-TR-2990WX-vs...


I know I'm very atypical in this but I like to game as well as run the occasional machine learning stuff. Nvidia has a hard lock on the latter, basically giving me no choice.


At least on HN I think machine learning and CUDA in general is a big part of it. For that to change, ROCm needs to be "better" than CUDA and easy to swap in.


Yes when I said very atypical I meant compared to the average consumer, not the HN crowd. Much more common here for sure.


> Nvidia blacklisted Hardware Unboxed for giving the 3090 positive reviews, but they didn't talk about the proper talking points (8K gaming capability).

erm, more like they got blacklisted because they have continually refused to talk about DLSS or raytracing, not because of 8K. Steve has outright said he doesn't think either of those are valuable features and he won't take them into consideration in the reviews. And DLSS 2.0 was well into the public consciousness at that point too, everyone knew it was much higher quality than the first iteration or other alternatives available on the market (still is better than FSR 2.0 too).

So yeah, I imagine that NVIDIA wasn't too happy about an outlet who flaunted the fact that they were choosing to ignore a 30-40% performance and perf/w gain on what was slated to be a pretty broad selection of titles. That basically is a whole performance-segment worth of gain that HUB managed to downplay and minimize and spin.

Similarly, on raytracing, even first-gen NVIDIA cards are twice as fast as AMD for a given performance segment. "Hybrid" titles tend to downplay this performance gap due to Amdahl's Law, if only half of your workload is RT then a 2x speed advantage only yields a 50% lead. But, for path-traced retro titles where the workload is 100% raytracing and 0% raster, NVIDIA performs about twice as fast, as do synthetic measurements of raytracing ability (and 3D rendering etc).

They quickly walked back the blacklisting, probably someone had a bad day and snapped and committed the worst crime of all: saying what they really think.

Because, to be honest this is nothing new for HUB, Tim's content is great but Steve is a bit of an AMD fanboy and it strongly leaks into the editorial positions and the things he chooses to review, which shape the numeric results as well. DLSS was persona non grata until FSR 2.0 came out and then it's "well DLSS is great but why not FSR!?". If he'd been saying "DLSS is great" for the preceding year then maybe it would have come off as good-faith, but it was a shitty low-quality gimmick that nobody should use, up until AMD did something similar and wow, suddenly it's great. And that's just how Steve is, it'll be the same for raytracing... raytracing is just a fad, until AMD finally gets around to a proper high-performance implementation and then it'll be all raytracing all the time. He's constantly downplaying NVIDIA's work, and minimizing and spinning the weaknesses of AMD hardware, which has to be tremendously frustrating for the people at NVIDIA who are supposed to work with him.

It's not a popular sentiment with the pro-AMD crowd, but, even among the enthusiast world in general there's growing sentiment that Steve is not on the level, in a way that few other prominent reviewers exhibit. But, you just flat-out won't review the card doing what it's designed to do? Yeah, not sure why NVIDIA would pay to give you a sample of that product, then.

"Wow this semi truck sure has shitty track times, and I won't review the hauling capacity because that's just a gimmick, everybody uses minivans".

(also, for the record, AMD has a history/pattern of blacklisting reviewers for critical reviews as well... happened to Kitguru a couple years ago, as well as TechOfTomorrow, for similar reasons... he wouldn't benchmark the hardware the way AMD wanted, so no hardware for him. he made a video about that too. Curiously, nobody keeps bringing that up, though...)

https://www.kitguru.net/tech-news/announcements/zardon/amd-w...

https://www.youtube.com/watch?v=4Qflrf6UiWk


> erm, more like they got blacklisted because they have continually refused to talk about DLSS or raytracing, not because of 8K.

Forgive me, but what is the problem here? Reviewers of all kinds focus on different things. Some GPU reviewers care more about FPS for certain games. Some care more about fidelity. Some care about power efficiency. People follow reviewers who focus on the stuff they care about. So this one guy was reviewing the stuff he and his audience care about. Because this didn't make Nvidia look good (this time), it's okay to blacklist him?

Sorry, no, that’s crazy. If reviewers focused on the things manufacturers told them to, they’d be terrible reviewers. Manufacturers don’t have a right to favourable reviews.


I agree, Steve is very much an AMD fanboy. When I called him out on his complaining about AMD not supporting certain models of motherboards, such as the B350s and cheaper hardware, I pointed out that AMD never said they would, and that their wording at the time only covered the current X570/X470 and B550/B450 boards - not the older stuff, because it just wasn't doable. He lost his shit at me in the comments of the video and proceeded to say how wrong I was and that AMD had promised him they would support all motherboards into the next generation (Zen 3). I still watch their channel but that argument left me some bitter moments with Steve.

I agree with his take on Userbenchmark, as that site is a piece of shit, but it also seems that Steve is turning into an AMD version of Userbenchmark, given his numbers always seem to show AMD topping Intel compared to other reviewers like GN. He claims he is unbiased but his review titles and thumbnails say otherwise. The guy just seems like an overall douchebag who gets butthurt when anyone criticizes his favorite brand. (P.S. I own an AMD system lol, 3700X here.)


In case the CEO is reading this -

Turn it into a employee-owned company!


I don't think that's actually easy to do without causing all current employees to incur significant taxes.


It's also difficult to win the lottery without incurring significant taxes.


The difference is that lottery winnings are liquid, you can just pay your tax burden out of the winnings.


Not sure why I'm unable to respond to iepathos' reply to this comment, but:

>The employees would not be taxed just for receiving the stock/ownership, only once they sold it and made it liquid.

This is incorrect; the initial grant is income and anything after that is capital gains. This is also how RSUs work.


Hacker News does not display the reply button for a bit after someone replies to you, as a means of trying to blunt quick back-and-forth discussion.

My sibling commenter has the workaround.


I believe in this case it’s the depth (width?) of the nesting that limits the reply-ability.


Oh, hi Steve!


If you click/tap on the relative timestamp next to the comment, you’ll be taken to a dedicated page where you can reply. Same way I replied to yours.


I know there's probably a reason this isn't done, but: could you not create a new, worthless corporation; distribute its worthless shares to the new employee-owners; and then have this holding corp acquire EVGA, thus instantaneously inflating the value of the already-held equity to ~EVGA's share price?


Acquire using what assets? You're proposing fraud.


You're interpreting "acquire" as "purchase", but "acquire" just means "acquire." You, as the majority shareholders of a company, are perfectly within your rights to give your equity in the corporation you own, to another corporation, as a gift or donation. (Or, if you like, "for consideration" of $1.)

And make no mistake, that's already exactly what's being proposed here, in making EVGA an employee-owned corporation: a gift of equity from the current shareholders, to a set of new shareholders.

The only difference in what I'm describing is that there's a trust / holding corporation in between the givers and the recipients.


Would the shell company not then owe taxes on the value of the corporation it ‘acquired’, regardless of the dollar value of the acquisition?


If you ever think of a financial trick to avoid taxes and say "why couldn't companies do X", the answer without fail is "because it's illegal and you aren't aware of the specific law you're violating".

If this was legal all sorts of companies would do this all the time.


Similarly, taxes are applied to capital gains when stock is sold, not when it is received or bought. The employees would not be taxed just for receiving the stock/ownership, only once they sold it and made it liquid.


You're taxed for the value of the stock when you receive it, and you are taxed on capital gains when you sell it, if you're selling it for more than you received it.


No, you can absolutely be taxed just for receiving the stock/ownership.


If it’s not worth much and it’s spread across many employees, we aren’t talking about much.


Wrong. It will be valued at something ridiculous like 20x EBITDA (average for tech related companies) and you'll be taxed on that number.


Doesn’t that depend on who values it? Also, I get the impression they have quite a few liabilities.


In the GN interview they said they have plenty of cash.

I'm sure they'll have to write down a lot of inventory because prices went down, but for tax purposes their valuation is still going to be very high based on earnings. That's why private companies other than startups rarely go this route.


Taxes are due at vesting.


To be clear: taxes are due when you receive the shares. For options, that's when you execute/purchase your options not when they vest.

edit: (in the US at least)


The taxes can also be due when you receive your options. That can depend on if you purchase your options or if they are granted to you and entitle you to a discount.

YMMV but an NQSO purchase agreement can be your friend.


Depends on your tax laws. Here (Canada) lottery winnings are tax-free ;)

https://www.canada.ca/en/revenue-agency/services/tax/individ...


In that case by definition you have the cash to pay the taxes.


No, you don't. The issue is liquidity. If I gift you something that is of substantial value, but cannot be easily sold or used as a security, it can really fuck you over. There are jurisdictions that allow deferring the tax liability in such cases, but the USA doesn't do that.


Gifts in the usa are taxable to the giver, not the receiver. (In most cases)


Lottery winnings are liquid.


Ah sorry, I misunderstood the post I was replying to. I was actually agreeing to it: Lottery winnings are different from major gifts of illiquid assets, because you can always just pay the taxes from the winnings. In any case, you become liable for the tax immediately after receiving the gift/winning. If you cannot do that without selling the asset, and the asset is impossible to sell in the short term, and you cannot take a loan against the asset, your best option can be to refuse the gift.

And yes, this is a little bit insane.


I should have been clearer, but we were pointing out the same difference.


I’ve heard of companies doing something similar, but instead it’s a special class of stock (and the original one gets phased out through buybacks etc). This new class basically has a one time dividend that covers a chunk of taxes.

Not sure in EVGA's case what laws govern their business, where employees are, and what the existing structure is… but where there's a will, there's a lawyer to make a way.


Throw enough lawyers and structuring at it and you can definitely make it work. Not make the tax go away permanently... but phase it over time & ensure they get dividends along with it to cover the bill, etc.


Exactly. I've had enough theoretical conversations about this to know there are organizations who would be happy to help assess any particular situation.

Different industry, but it might be worth looking into how New Belgium went employee owned, with apparently equitable results:

https://www.forbes.com/sites/christophermarquis/2020/05/01/e...


I know nothing...

...but give the current employees voting rights without equity, proportional to their seniority... (Or just 1 vote each.)

And then start rolling out equity, maybe, depending.


I don't think that is true. There are a number of paths to convert to an employee-owned co-op that don't incur taxes for the employees.

Easiest would be to convert to some sort of new 501(c) nonprofit with employee control.

People don't have to pay taxes when they are hired at an employee-owned company like WinCo, and students don't have to pay taxes when they join a housing co-op.


Taxes? You mean the thing poor people pay? /s


They are such good board designers, they should open a design service.


OK, but they will still need a CEO.


Yes but the CEO's role would be very different.


If anyone needs any inspiration: in Germany, the company's employee labor union has a board seat or two by law (for share-based companies above a certain size).

I just never see anything remotely close to that in US discussions, even from the people most interested in employee activism or union formation. I get the impression people don't even know it's something they could be putting energy towards.

First English Google result I could find on the concept:

https://www.worker-participation.eu/National-Industrial-Rela...


At every opportunity I bring up the Mondragon Corporation. They have a long and impressive history. The BBC had a documentary on it which, despite being filmed in the '80s and presumably worthless to them, is being withdrawn from YouTube due to IP. Very interesting if you can find it, but then most of what I've seen written about the topic is.

Actually:

The Mondragon Experiment:

https://vimeo.com/161252994



How so?


Accountable to employees and not "the invisible hand of the market" nor shareholders.


Not if they form a co-op!


All companies are employee owned. Vote with your feet.


> All companies are employee owned

False. Only a few are: https://en.wikipedia.org/wiki/List_of_employee-owned_compani...


The person you're responding to is not speaking literally; they're using what's called a metaphor to make a point that employees of any company exert walking power. The point may or may not be useful, but you have to finish reading the post before trying to reply with a gotcha correction.


> they're using what's called a metaphor

It isn't a metaphor, it is, in the words of Zizek, pure ideology.


Has this power ever brought down a company? Has it made bosses responsible for worker injuries at work? Has it given maternity leave?

If not, it's not really power.


It certainly has. Employees have real power.

Power is not the same as ownership


It has.


Thank you. One addition:

- Video cards are 78% of revenue for EVGA.


I think this was 78% of gross revenue. As OP stated, the GN video makes it clear that the margin on this was, in many cases, negative.


> As OP stated, the GN video makes it clear that the margin on this was, in many cases, negative.

No - the margin is in many cases negative, right now, with high-end GPUs 50% below MSRP. This is an important distinction.

Margins were great for 2 years, EVGA's GPU department made tons of money during that time even with a few gestures towards the public. He lost tons of money during those years with frivolous, high-R&D, high-ongoing-support-cost ventures like entering the motherboard and enthusiast monitor market.


Is pulling out of video cards now related to Ethereum switching to PoS? As in, downward pricing pressure on GPUs means they'll never see positive margins in the foreseeable future? That, combined with NVIDIA being hard to deal with, makes it not worth it?


Not in the slightest. Even if there were no crypto mining, video cards would all still get purchased.


Prices are in free fall and cards sit. 3090 ti is down to $1000 new. At the height of mining, a 3090 went for $3400.


Maybe… depends on how many they make. If there is an oversupply, margins will still suffer.


Thank you for the clarification!


But PSUs make 300% more profit.


Maybe so, but I wonder how much of their PSU sales come from the excellent reputation of their GPUs.


When I bought my power supply several years ago, I went with the highest rated one from johnnyguru at the time which was an EVGA one. It is only one anecdote but at the time at least, the G3 EVGA power supply came highly recommended from johnnyguru with comprehensive tests completed.


IIRC the EVGA G3 failed the ATX requirement of holding voltage after mains goes down, dropping way too quickly. It was the only one to fail out of several they tested in a group review. Not sure how much storage vendors rely on those milliseconds, but I decided to get something else.


I wish the site was still up, because I swear the EVGA SuperNova G3 was the highest-rated power supply johnnyguru ever reviewed. From my memory, their review never covered that. I may have that all wrong, though, since it seems multiple people remember the issue.


I found a review by Tom's Hardware where it barely passed the minimum[1]. It's not unreasonable that small differences between PSU units and measurement equipment led to it just failing for some other reviewer.

To be fair, I recall it rated well in most other categories, and the Tom's Hardware review reflects that as well.

[1]: https://www.tomshardware.com/reviews/evga-supernova-650-g3-p...


The G2 was the only one ever recommended; the G3 had issues with some protection circuits.


I do not remember that and this is the first I have heard of that issue. My G3 has had zero issues in 5 years of operation. Do you happen to have a link to that?

Seeing as johnnyguru is not available as far as I can tell this is the only proof I have: https://www.reddit.com/r/buildapc/comments/5kl4v0/jonny_guru...


The Internet Archive is your friend: https://web.archive.org/web/20200702165629/http://www.jonnyg...

It had a total score of 9.8 out of 10


> - NVIDIA limits MSRP heavily on cards so they can't do things to sell more profitable higher end cards

It's so weird to put a ceiling on card prices. Maybe this was a measure due to the crypto-induced GPU demand?


It's happening in cars right now. The manufacturers aren't too happy that some dealerships gouge the customers. It's bad for the brand, though profitable short-term.


Car dealerships don't do significant product modifications like GPU integrators. Unless you consider an undercoat or bedliner to be significant modifications.


These days, AIBs really don't do all that much. In the past you had "enthusiast" cards with more RAM, dual GPUs on a single card, etc. but Nvidia cracked down hard on that stuff with (IIRC) the 10-series, which is also when they introduced the FE line.

Now, the AIBs differentiate themselves mostly on coolers, aesthetics (cooler design, RGB, PCB colors, etc.), and power delivery (though the latter is much less common).


Ah, back in the day with the crazy graphics on the coolers. Orcs and wizards on them with some ridiculous blower fan out the back if it was a big card.

This was one of my first cards when I was building.

https://pics.computerbase.de/1/1/6/0/8/1-1080.1308227202.jpg


Most GPU integrators just resell reference boards with RGB leds.


Because they can’t add anything else… (but I don’t know what else they’d want to add).


>Because they can’t add anything else…

They do secondary binning; the best ones are overclocked (and more importantly, warranted to perform stably at that overclock) compared to nVidia's specifications.

Frequency gains are mostly linear; a 10% increase in clock speed does indeed yield 10% more frames rendered per second.

Importantly, they don't tend to be over-volted to achieve this overclock, even though their coolers are more than capable of dealing with that; overvolting decreases the usable life of the GPU, where "standard" overclocking, generally speaking, does not.


This is a legacy thing, NVIDIA now does binning themselves. It started with Turing and the "A" chips, now it's evolved with Ampere. Partners will buy chips that are already pre-binned into "bad/good/exceptional" tiers (bin 0, bin 1, bin 2). As a result, partners generally no longer apply any special binning to most models of cards, since that mostly covers what they need to know.

And for the Ampere generation, they hardly built any of the lower-tier card models, and those chips have to go somewhere. So they end up using low-tier chips in FTW3/Strix/Gaming X and other high-tier products too. So buying a high-tier model is not any particular guarantee of chip quality anymore either - you can buy a 3090 FTW3 and get a bin-0 chip in it. The only exception is true halo products like the Kingpin/HOF/etc targeted at the XOC market which will definitely get Bin-2 chips and maybe additional binning on top of that. But partners shipped 27 quadrillion FTW3 and Strix and other "flagship" cards and couldn't put good chips in every one of them.

For the mass-market (non-XOC market), everything goes onto the same PCB and you test the finished product and the better ones get the "OC" model sticker. That's it, that's the entire binning process for most partners/most of the products on the market. You can have a FTW3 with a bin-0 chip in it. Why do anything more, if you know it's going to sell in 0.1 milliseconds anyway?

The partners that still do binning, typically on limited segments like the Kingpin/HOF/etc, no longer actually test the chip performance, they test the electrical characteristics and then apply statistical analysis to figure out which of the inter-pin voltages/etc are statistically predictive of chip quality. But there is less and less of this because it's expensive and time-consuming, and the differences don't tend to produce significant gains in finished product performance as much as they used to, due to GPU Boost algorithms getting better, etc.

Much like Silicon Lottery exiting the CPU binning business, it's become a bit of a time-waster and partners no longer really bother with it like they used to.

https://www.igorslab.de/en/chip-is-not-equal-to-chip-first-i...

But again, that gets back to the problem with AIB partners: "so what, exactly, would you say you do here?". They made sense in a world where they were allowed some freedom to experiment and service niche product segments/etc. With super tight control, the only thing they are allowed to do anymore is pump up the TDP to try and squeeze out a win, and bigger coolers. Can't do certain kinds of smaller coolers that might compete with the server business either. They just crank up the TDP and slap a bigger cooler on it and crank the price up 20%, it's not really all that significant a benefit to consumers anymore.

They no longer bin. They don't modify the silicon or software in any way. NVIDIA even provides the memory as part of the bundle with the GPU sale. A lot of them use NVIDIA's reference PCB design, and the only common change to that design is custom VRM, which is pointless in a world where overclocking is dead and everyone is undervolting. Cooler design is all pretty functionally-interchangeable across brands, you need something, but a Gaming X is not really different from a FTW3 in any meaningful way. Everything meaningful is done by NVIDIA now.

They are supposed to "buffer inventory" but in practice every time there's a glut they come back to NVIDIA with their hands out looking for refunds for their leftovers and kickbacks on the inventory they keep. So heads they win, tails NVIDIA loses. And it's not like their prices are awesome the rest of the time either.

They no longer even act as a price anchor... 10-series launch, only EVGA and Zotac put out cards anywhere near the "official" MSRP (in hideously limited quantities) and they were still $20-30 over. Everything else was actually higher than the FE cards, in the $725-800 range. Same for the 30-series where NVIDIA cards were the only remotely affordable ones. So these days it's NVIDIA price-anchoring the partners, partners want a 20% higher street price over the NVIDIA design for what seemingly amounts to a higher factory TDP configuration and a bigger cooler. (Which is of course tied to the BOM cost which NVIDIA controls, but... the price structure kinda doesn't matter to the consumer.)

Most of them don't even build the cards themselves... EVGA for example contracts out their assembly to third-party factories in Taiwan, they don't own the factory. So they basically just take the spec from the customer, and give it to the engineers. Well, at least their secretary does that I guess.

So what, exactly, do partners do around here anymore? I realize that a lot of this is the result of product strategy by NVIDIA, some other parts are just the direction that tech has gone over the years (automatic boosting, undervolting, etc), but, regardless, consumers don't really benefit from the things that AIBs are able to offer nowadays. Partners provide almost no real value-add anymore. Functionally they are middlemen, and middlemen get squeezed to zero by the market.

Ironically, in true Office Space fashion, probably the strongest argument is that they deal with the fucking customers so NVIDIA doesn't have to, EVGA could still distinguish itself on warranty/etc. But that's really the only meaningful distinction between AIB partners anymore and it was really only a couple of them that had worthwhile service. With EVGA gone, that argument starts to fall apart too.


Factory overclocks and cooling preferences mainly. I do not like blower style coolers.


RAM.


Yeah, I so love not being able to buy a car or a graphics card / going through shady resellers rather than just paying more at a reputable place. Makes me really appreciate the brand.


For cars, I much prefer getting on a wait list and paying MSRP than having to pay some huge markup.

We’ve been waiting on a Bronco since late 2020. Got bumped to a 2022 model year which is now built and on a train to the dealer.

Yeah, we could have had one sooner for like a 50-100% markup… rather wait.


Is it better for the brand when the only way to get the product is to buy it from a scalper?


I think it might depend on the brand. This is the approach Rolex is taking, for example. They haven't ramped production, yet the brand is stronger than ever. The secondary market is able to set higher prices than dealers, so that is where the volume is found. I don't see that tactic working outside of Veblen goods, however.


Not weird at all. This is a huge reason for the Raspberry Pi's success.

And look at the last 2 years of Nvidia GPUs: you mentally associate $1,000+ with any GPU, even 5+ year old models. They want to put an end to that; it's bad for the brand.


You saved a small infant's life in the number of cumulative minutes you just saved us from watching that video, thanks.


Also a lot of energy in avoided video transmission and playback.


>It sounds like (this is my takeaway from GN's description, not their words) EVGA's CEO is tired of dealing with NVIDIA, wants to refocus on family, and there isn't a clear person to replace him and hesitant to sell to people who would mistreat employees/cut corners/damage brand.

So they're voluntarily choosing to go out of business..?

How does that work?


From other comments, ditching 78% of their revenue but not going out of business. They still make power supplies. I think the CEO sees it as exiting an unpredictable market where they have little control over what they sell.


They also make motherboards priced at the top of the line, seemingly for enthusiasts.


> How does that work?

Not at all complicated.

1. They have no debt based on the video

2. They own their HQ so no long term rental contracts. They would need to sell the building.

3. California's Labor Code contains a presumption that employees are employed at will. This means that either the employer or the employee may terminate employment at any time, with or without cause or prior notice.

4. They file and pay any outstanding taxes.

5. Since EVGA is a Stock Company they need to hold an election to dissolve then file two forms with the California SoS and you are done. The required forms each are one page long, takes a few minutes to file them online but here's the paper version: https://bpd.cdn.sos.ca.gov/corp/pdf/dissolutions/corp_stkdis...


They’re not publicly traded. That’s how. They can do whatever they want within the letter of the law and aren’t beholden to shareholders.


Revenue does not matter for a private company; profit is more meaningful. What's the point of doing $10B in volume if you lose money on it and have no path to profit? It's smarter to do $1B and make $200-300M profit.


Revenue always matters; it's how you know market share, it's how you know margins, etc.


But having a large share of an unprofitable market produces little actual value to your investors if you have no pricing power.

It only makes sense to pursue it in tech where you are looking to unload it onto the next sucker. If you aren't planning to do a rug pull or go after government subsidies, it's a bad plan.


I think you're implicitly assuming the employees want to work there forever.

The CEO told GN there won't be another round of layoffs; instead there will be voluntary attrition.

Pretty sure the implication is that everyone there, not just the CEO, is burned out after 2 years of covid, supply chain issues, and dealing with Nvidia. Employees are planning to leave regardless of EVGA's plans so it doesn't make sense to stay in a low margin market.


It sounds like they are taking losses selling video cards. Closing a portion of the business that is actively losing money on every card sold, without even accounting for lots of overhead, is not going out of business; it's stopping the bleeding, which they can't do any other way because Nvidia controls every other option.

So they'll continue to make power supplies and whatever, but they're done with video cards, where they lose money.


From what I understand, MSRP is the suggested retail price. Is EVGA contractually obligated to list at that price? And if so, how does that not violate antitrust laws?


In the US, antitrust doctrine is largely based around preventing dominant market players from causing direct harm to the consumer and does not care much about health of the markets. It is hard to argue that NVIDIA saying "use this price that you think is too low" to EVGA would constitute an antitrust violation when that means lower prices for consumers.


Well, EVGA isn't just reselling Nvidia's product. They are building a new product (a video card) that uses Nvidia's product (the GPU itself). So by setting a price ceiling, Nvidia prevents card makers from making higher-end cards with more features and/or higher quality. And I would argue that preventing the existence of a product that consumers might want hurts some consumers at least.


> And I would argue that preventing the existence of a product that consumers might want hurts some consumers at least.

unfortunately, a product that doesn't exist isn't really a harm, it's hypothetical harm at best, which would be really difficult to argue as monopolistic.

I actually think this business agreement ending is a good thing - it gives AMD and other card manufacturers a way to get in, and compete with nvidia. In the short term it might hurt, but long term is better.


EVGA is not obligated to allow the retailer to sell using EVGA's trademark, or give access to reseller pricing if the retailer does not follow their terms and conditions on advertised price. The retailer actually can sell at lower prices, so long as it is not advertised. This is why you see weird "log in to see price" online or "call for pricing" in print.


While trademarks and pricing is an element, the main lever the manufacturer has is mainly just future chip supplies. If you get one thousand 3080 chips from NVIDIA and make them into video cards that you sell well above the price NVIDIA wants, they will simply just not renew your contract when it expires. No chips, no cards.


They would be breaking US law and FTC regulations if the selling price, and not the advertised price, were used that way.


EVGA is not obligated to allow the retailer to sell using EVGA's trademark

I'm not an expert, but I imagine nominative use would be ok? i.e. you'd be able to run an ad saying "EVGA RTX-3080-DSFHDSFG - $999" or showing a picture of the box they sell it in (including the EVGA logo on that box).


No, not ok. That’s why you see login for price and call for price on some products.


NVIDIA directly competes with them. I don’t know how it’s legal, possibly not.

It’s hard to sell an “EVGA NVIDIA 3040(16GB)” at $1999 against an “NVIDIA NVIDIA 3040(16GB)” at $999. Exaggerated but basically that.


Summary strongly appreciated. Thank you very much.


If their CEO wants to refocus on their family, there's a super easy way to do that: you step down, either to a position that lets you spend more time with your family, or by just leaving the company.

A CEO doesn't run the company, the board does. If the CEO wants to spend more time with their family, that's their call to make, but then the board will replace them because they are no longer the best person for the job.


It's privately owned and doesn't seem to have a board of directors, and the owner(s) can do whatever they want.


It's a privately owned Stock Corporation registered in California. California requires at least 3 directors unless there are less than 3 shareholders.


EVGA is doing what every iOS developer secretly dreams of doing whenever they have to deal with some new stupid bullshit from Apple. Respect.

Also, I've always liked EVGA. I've bought a few of their cards over the years, and many power supplies, and I've never had a bad experience. It seems like they really care about making quality products, which is pretty rare to see nowadays.

If they start expanding into new products with the same focus on quality, that'll be a net gain for the world. Maybe they'll try making laptops again?

Or maybe this is all just an elaborate power move to put pressure on Nvidia and renegotiate their relationship.


My favorite part is having a gun pointed at my head to buy a MacBook (tbf the physical design is awesome), then having my OS version inexplicably tied to the IDE, which has a 3/5 star rating on their own App Store, and then paying $100/y for the privilege.


Could be both. If you get the negotiating power as an effect of basically telling them to get fucked, even though that wasn't your main thing, then great.


> EVGA is doing what every iOS developer secretly dreams of doing whenever they have to deal with some new stupid bullshit from Apple

Oh, I did it. I quit being an iOS dev professionally. Later decided not to even support iOS in my side projects. Just threw away that skillset and focused on developing web skills instead.


First thing I thought of when I read EVGA didn't get 30XX drivers until they were public was how Apple drops breaking updates on users and most developers same day.

It's crazy how poorly some companies treat their partners. Take Microsoft in stark contrast to Apple. They aren't perfect but I, as an END USER, have access to major Windows updates 6-12+ months before they are rolled out. New major versions of .Net are in pre-release almost as soon as the current major version drops.

Apple drops OSX/iOS updates on their consumers and developers the same day, leaving everyone scrambling to add new requirements, re-compile, and submit updates to the App Store. Apple drops the freaking M1 on the market and on the bulk of their developers the same day.


This is just false. Apple does developer betas for all of their operating systems. The first iOS 16 beta came out over 90 days before the official iOS 16 release date. For the Apple Silicon transition they released a Developer Transition Kit (DTK) Mac Mini with an ARM chip in June/July 2020, and the first M1 Mac came out in November 2020, so developers had at least 5 months or so where they could have been doing some testing. Even then, Rosetta works great, the vast majority of apps just worked out of the box, and only a small number of users were using Apple Silicon Macs at first anyway, since the actual Pro models weren't updated until 2021.

Apple doesn't give 12+ months of heads up to developers because their release cadence is too high to do that. A .NET release is more analogous to a Swift release, and Swift is an open source project so you can go and use the bleeding edge whenever you want.

Apple definitely isn't the most friendly to their developers but to say they drop Mac OS and iOS updates on consumers and developers on the same day is completely false.


Don’t let the truth get in the way of a good story.


Yeah, I quit the Apple dev ecosystem for entirely different reasons. The betas weren't hard to access.


iOS apps that support X version can only be released once X version is publicly available so testing is through TestFlight only.


It's crazy to see them go, they've been a big name in the GPU space since as long as I can remember. If they truly don't expand into AMD GPUs or something else as they've said, I don't see them sticking around in the long term. A bit sad, they have been a good company to their customers.

I don't blame them though, it sounds like Nvidia is a PITA to deal with, and the recent pains with GPU shortages and mining have likely exacerbated it as well.

My best wishes to their employees, hope they all find good work elsewhere.


I have one of their GTX 960s, and it's been rock-solid even when it has been continuously crunching on BOINC projects for months on end.

I was hoping that the crash in crypto mining would bring some sanity back to the GPU market, but (like all too many things) it's really not a market after all, and sanity is too much to ask for.

I'm sad to see them go as well, but if they can't make money building graphics cards, then I don't blame them for getting out.


The crash is pretty evident, for what it's worth. Founder's Edition 3090ti is in stock at Best Buy for nearly half off in the US.


Damn, you're right. An absolutely excessive and possibly outright pointless purchase, but for someone it's a deal


For what it's worth, I managed to get an EVGA 3090 at retail two years ago just to play Minecraft RTX.

This'll probably be a deal for anyone who has esoteric expectations of their hobbies like I do.


I wasn't aware of Minecraft RTX, but it looks sweet. So do a few other games


They’re very useful for deep learning!


I do hope it is a feint and they end up in AMD-land.

I would not be the least bit sad if I never put another dime in Nvidia's pocket. They're not the ugliest scumbags out there, but definitely in the bottom 20%.


They do a lot of things besides nvidia based cards. They have a lot of talent and will do fine. This is clearly a survival decision though, I guess nvidia has become so bad they are just losing money hand over fist trying to maintain a partnership. This is why everyone went nuts when nvidia tried to buy ARM, it's an awful company, up there with Oracle.


Right- I've been a loyal EVGA customer for 10+ years but only for GPUs. I was planning an upgrade to the upcoming 4000 series but now I'm questioning whether to upgrade at all. But I agree with the sentiment expressed above, thanks for all the high quality GPUs over the years and good luck with what's next.


EVGA had been pulling out of GPUs for some time; e.g., they didn't sell GPUs in some EU countries but did sell PSUs.


I see EVGA selling through Amazon in Italy, but I don't see any availability from their own website. Paying Amazon's cut seems like it just adds even more on top.

I miss their GPU+PSU bundles (and I need one again now). Their PSUs are as great as their GPUs.


The problem is that AMD GPUs basically don't work. Intel also has issues. ML is big, and that's NVidia's game.

Many years ago, Matrox had a nice niche. Their cards were stable, reliable, and open-source. They weren't the fastest, but they were widely used for CAD, workstation, and business work. I'd like something like that today. I'm surprised Intel can't pull it off.


Define "Basically don't work"? They've been competing at the top end with Nvidia for at least the last couple generations. It was a toss up between things like the 3080 and 6800XT for the same price. They've been on a hell of a roll for awhile now.


Yeah, what? I've been pretty happy with my 6900xt. ML workloads are not as good without Nvidia branded stuff, but games run great and with lower power draw/heat than competition.


Ryzen 5900x / RX 6900xt here

Life's pretty good in Linux land too. Around the launch of the 6000 series AMD pushed a bunch of good stuff in the kernel. With Vulkan being the big boy in town I can practically play any game I want sans shooters, and that's because of anticheat.


How is ROCm playing out? I am eyeing a new GPU, especially for Stable Diffusion, and am very curious about this. As I see it, AMD is loaded with VRAM but lacks the CUDA-thingies; is that right?


Modern AMD GPUs are fantastic, and I would recommend them over NVIDIA cards for most people. I'm not really sure where you're getting this from.


Have you bought an AMD GPU in the last decade? They don't beat Nvidia's top cards but they perform well. The Xbox/PS4 use AMD hardware and so does the Steam Deck. I have no idea why this comment was even made?? Have you compared their open-source driver to Nvidia's? Night and day difference.


Dude, what? OK, sure, older AMD cards were hot garbage, but not nowadays.


Intel did pull it off.

Which company did you think shipped billions of chips with integrated graphics in the 2010-2017 era, before Ryzen? Their iGPUs weren't amazing, but they were good enough to be used in business/office PCs across the world.


The only reason I switched to team Green was the noise reduction for microphone. My 5700XT worked just fine.


You know that you don't need an NVIDIA card to use an AI noise filter? RNNoise does the same thing and doesn't need an Nvidia GPU to work.

A happy user of RNNoise on Linux with an AMD GPU


Ah, no I didn't.

Well, to be honest the 3070 is twice the performance of the 5700 XT, so I'm not going to get upset. But that's good to know for the future.


I watched the video. Seems like Nvidia's CEO wants to go the Apple route and sell everything themselves.

I do wonder about EVGA's decision. Why call gamersnexus in for an interview? It seems to me this is part of a strategy to get intel and AMD tripping over their dicks to get EVGA working on their GPUs - and get one/both of them to agree to conditions nvidia wouldn't.

I think it's a smart business move if Nvidia has laid their cards on the table so clearly. Reminds me of how Valve went all-in on Linux development when Microsoft threatened their whole business with Windows 10.


>It seems to me this is part of a strategy to get intel and AMD

This announcement weakens EVGA's leverage since now EVGA doesn't have the NVidia option.

Also, if EVGA were willing to work with Intel, Intel would have jumped on it. Intel has a lot to gain from EVGA - EVGA could give Intel some credibility in the discrete graphics market that Intel doesn't have.

Most likely EVGA actually wants to be out of the game?


Maybe EVGA does not want the brand associated with low or mid-tier components. One day Intel will have a flagship GPU, but today is not the day.


Almost nobody buys a 3090. A huge, huge portion of the market is fought in the mid-tier. The 2060 was the most popular 20-series card, if memory serves.

AMD and Intel can compete in the mid-tier.


That would work if their reference 3090s were not notoriously hitting 110C on the memory junction with any significant memory load, spinning fans to the volume of a Concorde flying at cruise speed. Other manufacturers actually had that problem covered.


I do not understand how Nvidia thinks putting ONE tiny fan on something that draws 200 watts is a sane decision. Back in the day they were even using centrifugal fans just to add a few extra decibels.


I wish they were that good! The card is actually 350W. I run mine with power limit 250W. Fortunately the slowdown of minGPT training is only ~3% with this setup.


>Fortunately the slowdown of minGPT training is only ~3% with this setup.

The cards are run at a very unfavorable part of Freq/Voltage curve. Increasing freq. (performance) effectively scales the power in a cubic manner.


To be pedantic, 1.03^3 is ~1.093, which is much less than 1.4 = 350/250, so the behavior is still quite surprising.
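
A rough back-of-the-envelope check (my numbers, assuming the usual approximation that dynamic power scales with the cube of clock, i.e. P ~ f * V^2 with voltage roughly tracking frequency, and ignoring static and memory power):

    P ~ f * V^2 ~ f^3                              (V roughly tracks f)
    f_250W / f_350W ~ (250/350)^(1/3) ~ 0.894      ->  ~10% predicted slowdown
    1.03^3 ~ 1.09                                  ->  a 3% clock bump should only cost ~9% more power

So seeing only a ~3% slowdown after cutting the limit from 350 W to 250 W suggests the stock configuration sits even further past the knee of the curve than the cubic rule of thumb implies (and/or the workload is partly memory-bound).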


There's 2 fans, one on each side.


Not on most of the cards they make. I was talking in more general terms, every large format GPU should have 2-3 large ass fans so it can run quieter with them running at low rpm instead of one screaming its bearings off.

The fact that the 3090 has 2 of them makes me think it would probably actually melt or explode with just one; they wouldn't give you an extra 2 dollars' worth of fans on a $2k product for free, you know.


I think there's a bit of prejudice going on in this thread based on experiences in the past. The earlier FEs with 1-fan design at the end were indeed not great but the current gen is very nice.

I have the 3080TI FE and it's a really solid card. Very well made, strong metal case. Nowhere is there any exposed board (ESD risks!) except for the obviously necessary PCI header. Fans run fine and are not too noisy and it doesn't overheat.

My last card was an MSI 1080Ti beast and it was just a poor design. Yeah it was a bit quieter but physically and mechanically it wasn't as well designed as this by far. It seemed to have been designed much more for looks than performance.


You're replying to a comment about the 3090. I'm not aware of any single-fan 3090s.


I think what Nvidia wanted was to bully all of their AIBs until one of them breaks and sells to Nvidia.

> It seems to me this is part of a strategy to get intel and AMD tripping over their dicks to get EVGA working on their GPUs

I doubt it; margins are slim on GPUs. Intel & AMD have enough infrastructure to not need AIBs.


Nvidia makes the chip regardless. Surely they make more money if they sell the whole thing to customers themselves?


Well, in the past, Nvidia would make a reference card design with a horrible cooler and power delivery, then AIBs would make an actual card that you want with proper cooler and power delivery. After some time, production of reference card would slow down or stop and custom cards will dominate.

For 10x series, Nvidia decided that AIBs are parasites that leech on Nvidia success and started designing their own "good" cards called Founders Edition that has a good cooling and power delivery. That was not related to the reference design card given to AIBs.

10x series was still meh in terms of cooling, better than reference, but meh compared to say EVGA FTW or really any other AIB card that isn't a reference design. Nvidia, kept nice things like Titan to themselves.

For 20x and 30x they doubled down on their own cards; both had good coolers and power delivery. Again, certain consumer chips were only available in Nvidia-made cards.

Note: in all cases, some AIBs sell generic reference design cards under their brand.

> Surely they make more money if they sell the whole thing to customers themselves?

It's about who takes the risk. Yeah, their margin for cards is probably better than margin for chips, but:

- Nvidia had a much smaller supply chain

- Nvidia didn't have to deal with distribution

- Nvidia didn't have to deal with the end customer

- Everything was sold in bulk

- Logistics were not as complicated


To be fair, the Nvidia reference is fine as a rear-exhaust card, but the average PC user doesn't want it.


It's fine as a somewhat noisy rear-exhaust card that you won't overclock and will hide in a case without glass. There is a use case for them, but like you said, it's not for average PC users.


In the video, there is a quote by Nvidia's CEO complaining about AIBs making all the money when they (Nvidia) do all the work.


Weird. I currently have a FE (3080Ti) and I would take this card over an AIB any day. Much cleaner design without all the screaming gaudy plastic panels and RGB lights. No heat pipes sticking out the sides.

The problem is the limited availability. They just don't want to sell a lot of them, sadly.


If you enjoy jet engines in your office, certainly buy a founders edition.


My FE is absolutely not too noisy. During 2D use the fans aren't even running, and during 3D the noise is totally fine.

I always game with headphones on anyway, so the noise is not an issue either way. But it's really not bad. On this generation at least the cooling solution is fine IMO.


Yeah, I'd much rather have some screaming plastic in a case where it can't be seen than a screaming fan that's heard in the next room.


Eh anyone that cares about noise will watercool their GPU.


I do feel like Nvidia is the only one with the right mix of market muscle, knowhow, and patents/licenses to pull off an ARM-based competitor to Apple's products. AMD is doing some great things, but can you imagine an M1-level product that sips power like a champ but has the graphical muscle of a proper 3070? The M1 Ultra is decent, but just doesn't compare to a proper dedicated GPU (on the Blender benchmark, the desktop M1 Ultra seems to perform slightly below a laptop 3050).


With all that, and they still can't make a cooling setup for their cards worth a damn.

I'm still waiting for a Noctua GPU partnership.



Oh yes!


They should just bite the bullet and require water cooling.

I have a 2080ti in a custom loop and it is almost silent.


Nvidia already released the best ARM chips they could (Tegra) years ago. They were all so crappy that they were part of the reason I never believed Apple would go ARM.


They don't have the board design chops to do it. They need the partners because they can't build enough PCBs to keep up with the number of chips they make.


As a consumer I would prefer it. If all they did was narrow selection it would be a win. An online search for 3080 RTX yields more brand flavors than a toothpaste aisle. Except consumers can understand toothpaste flavors. No idea why we need dozens of otherwise indistinguishable video card variations.


> Except consumers can understand toothpaste flavors.

Can we? Here are the currently available US flavors / varieties of ONE brand, Colgate:

    Advanced White Charcoal Toothpaste
    Advanced White Toothpaste
    Anti-Tartar + Whitening Toothpaste
    Cavity Protection Toothpaste
    Charcoal Natural Extracts Charcoal Toothpaste
    Colgate Baby Toothpaste 0-2 years
    Colgate Kids Magic Toothpaste 6-9 years
    Colgate Kids Toothpaste 3-5 years
    Cool Stripe Toothpaste
    Deep Clean With Baking Soda Toothpaste
    Elixir Cool Detox Toothpaste
    Elixir Gum Booster Toothpaste
    Elixir White Restore Toothpaste
    Gum Invigorate Detox Toothpaste
    Gum Invigorate Toothpaste
    Max Fresh Cooling Crystals Toothpaste
    Max White & Protect Whitening Toothpaste
    Max White Charcoal Whitening Toothpaste
    Max White Crystals Whitening Toothpaste
    Max White Expert Anti-Stain Whitening Toothpaste
    Max White Expert Complete Whitening Toothpaste
    Max White Expert Original Whitening Toothpaste
    Max White Expert Shine Glossy Mint Whitening Toothpaste
    Max White Extra Care Enamel Protect Whitening Toothpaste
    Max White Extra Care Sensitive Protect Whitening Toothpaste
    Max White Luminous Whitening Toothpaste
    Max White One Whitening Toothpaste
    Max White Optic Whitening Toothpaste
    Max White Sparkle Diamonds Whitening Toothpaste
    Max White Ultimate Catalyst Whitening Toothpaste
    Max White Ultimate Renewal Whitening Toothpaste
    Max White Ultra Active Foam Toothpaste
    Max White Ultra Freshness Pearls Toothpaste
    Maximum Cavity Protection 3+ Kids' Toothpaste
    Maximum Cavity Protection Fresh Mint Toothpaste
    Maximum Cavity Protection Toothpaste
    Nature IQ Enamel Repair Toothpaste
    PerioGard Gum Protection + Sensitive Toothpaste
    PerioGard Gum Protection Toothpaste
    Sensitive Instant Relief Enamel Repair
    Sensitive Instant Relief Enamel Repair Toothpaste
    Sensitive Instant Relief Multiprotection
    Sensitive Instant Relief Repair & Prevent
    Sensitive Instant Relief Whitening
    Sensitive Sensifoam Multi Protection Toothpaste
    Sensitive With Sensifoam Toothpaste
    Total Active Fresh Toothpaste
    Total Advanced Clean Gel Toothpaste
    Total Advanced Deep Clean Toothpaste
    Total Advanced Enamel Health Toothpaste
    Total Advanced Gum Care
    Total Advanced Pure Breath Toothpaste
    Total Advanced Sensitive Care
    Total Original Toothpaste
    Total Plaque Protection Toothpaste
    Total Visible Proof Toothpaste
    Total Whitening Toothpaste
    Triple Action Original Mint Toothpaste

I'm an adult male. My dentist has not informed me of any particular concerns. I wouldn't mind my teeth being a bit whiter. Which of these am I supposed to buy? Or am I supposed to buy a product from one of the other brands of the same manufacturer... Elmex, Meridol, several others, each of which has their own outrageously long lineups. Or look at a different manufacturer?


One of my professors described Toothbrush buying as the absolute most pathological case for the Paradox of Choice: dozens of ostensibly differentiated products, all in basically the same price range, all basically affordable, with no real way as a consumer to discern their efficacy or quality. You'll use the product for the next 6 months to a year, and if you pick the wrong one, your breath will stink, your teeth will fall out, and you'll need expensive surgery.


What were they a professor of? I don't see a paradox of choice here; none of those Colgate products, or anything else sitting on the shelf that's ADA accepted, is going to rot your teeth.

It is, however, a classic result of oligopoly. Oligopolists compete among themselves--and, more importantly, prevent entry from newcomers--by hyper differentiating their products. The "artificial" product differentiation also makes comparison-shopping harder and softens price competition between the oligopolists. It's a really fascinating, kind of counter-intuitive, but well known dynamic.

E.g., Crest offers a slightly differentiated product--"gleaming white plus plus" or whatever--and Colgate responds with "extra pearly white super plus."

Or a newcomer tries to sell "natural something," so the oligopolists introduce "natural stuff" and "pure friendly paste" to prevent customers that like the sound of "natural" from defecting to the newcomer.

Hotels are the classic case. Hilton and Marriott have tons of brands. Take extended stays for example. Marriott has three extended-stay hotel brands. No one thinks extended-stay hotel customers started a letter-writing campaign to Marriott saying, "I really want an extended-stay hotel that's just like Residence Inn, but with a different color combination. That's where you should invest your money. I don't need nicer furniture or lower prices, thank you. Just give me the new color scheme, thanks."

The dynamic is a counterintuitive feature of oligopolies, but it's very well known. But academia is hyper-specialized, so, yeah, everyone sees their own pet theory in everything.


If your teeth fall out after 6 months of bad toothpaste, perhaps you should stop eating sugar and drinking battery acid :)


On the other hand, if you've been eating sugar and drinking battery acid all along and your teeth haven't fallen out yet, please let me know what kind of toothpaste you've been using.


It's made funnier by the fact that I've never had a dentist who's cared. I asked a couple and each time got the line that the physical act of brushing matters much more than the type of toothpaste. I vaguely remember one saying "as long as it has fluoride"

I think every toothpaste ad says "4 out of 5 dentists recommend" their brand because 4 out of 5 dentists say "Yeah sure. Okay, use that one. It doesn't matter. Just brush and floss regularly and methodically."


Toothpaste was a bad example, OP's original point is still good IMO. Sometimes analogies fail, that doesn't mean that the main point isn't valid anymore.

There is a dizzying mix of GPU brands and offerings out there. Some have better cooling than others. It is a giant mixed bag just like the toothpaste example.


I love this comment. I get what the parent commenter was trying to say, but selecting toothpaste as the shining example of clear and accessible consumer choice is really funny.


Professional cleaning at home! * by removing surface stains

Restore tooth enamel! * contains fluoride which promotes remineralisation

These appear universally on toothpaste ads in the UK.


> My dentist has not informed me of any particular concerns

So my dentist did actually explain stuff to me, and generally you want something less abrasive.

Brush more frequently (every meal), but not for long, and use a very soft brush, with the primary goal of raising pH above acidic levels and increasing gum blood flow.

CTx4, Sensodyne pronamel, something like that.


So if the main distinction I'm looking for is abrasiveness... does that mean that the key differentiator is completely undocumented to customers?


Pretty much.

Although I would specifically recommend Sensodyne 'ProNamel', because it contains BioMin F, which afaik is one of the few ingredients beyond fluoride to show efficacy at remineralizing damaged enamel. It does technically require a prescription in the US, but it's easy enough to find grey-market.

edit: Had novamin confused with BioMin.


I think that was EVGA's point: Nvidia is highly abrasive.


Is it whitening toothpaste? That's abrasive.


Right, almost nobody ever stops to ask what makes a good toothpaste.


You're obviously supposed to conduct a double-blind longitudinal study over the course of many years, with at least 30 participants in each group, for every single sub-brand you listed.

Also, make sure there's no conflict of interest at any point in the studies.


Customers also understand fan speed and cooling, which cards can compete on. Less noise, or more overclocking ability. Different additional software which may or may not have any value for you.


Here's the top three search results from the most popular online store in my country:

- MSI GeForce RTX 3080 Ti VENTUS 3X 12G OC

- Gigabyte GeForce RTX 3080 GAMING OC 10G LHR

- ASUS GeForce TUF RTX 3080 GAMING

I know Ti means better - not sure how much better. What the hell is a Ventus? Is it Ventus 3x or just 3x? Also, what is being 3x-ed here? I assume OC means overclocked - which only raises more eyebrows. Presumably 12G is better than 10G - I haven't seen any games list graphics memory requirements since games came in box sets and we counted things in megabytes, not gigabytes. The ASUS card I guess is memoryless [I kid], but comes with something called a TUF - I guess that's better than a mere OC? LHR means London Heathrow - I hope the Gigabyte card provides a more enjoyable experience than a trip to LHR. Good to know that the last two cards are for gaming; I was afraid I'd buy a $1000+ graphics card that I cannot use for games.

Imagine if Apple + friends would market phones as "ASUS IPHONE 13 Vroom 2x 8GB NRT".

TL;DR I want to buy a card to play Elden Ring at 4K that won't make my computer sound like a 747 about to take off. I don't want to spend my weekend on Reddit dissertations to decipher some marketer's idea of high-tech hieroglyphics just so I can understand what I'm buying.


Like all other parts for custom PC building, you can either arbitrarily pick a pricepoint and a cool-sounding name and slap it together, or you can read benchmark comparisons and have a rough idea what works best for you. Should I buy an Intel CPU or AMD? Which of the dozen models from each brand? Motherboards, power supplies, cases, coolers, this staggering array of choice is everywhere throughout the custom PC space. You also have the option to not deal with it and buy a full PC from an OEM or system integrator, even Dell if you hate both yourself and your money.

PS the 3080 and the 3080 Ti are different GPUs, so that is nothing to do with the AIB manufacturers, that's Nvidia directly. Like an Apple M1 vs M1 Ultrapromax.


ONE source for all Nvidia GPUs is preferable? No thanks.


Hell, if only because the reference cards use those shitty tiny loud blower fans, at least the ones in my price range.

I DGAF about the leds and bullshit but I want quiet fans.


I’m guessing they had multiple outlets call to interview, and Gamers Nexus are one of the more trustworthy ones out there. Probably just chose them for the best coverage of their departure.


I think this announcement can be summarized like this, from the CEO/owner's perspective (especially after watching GN's video):

- Why am I dealing with this crap? It's taking time that I could've spent on my family!

- And we aren't making much profit at all!

- I don't know anyone who makes products and manages the company the way I do

- Life is about people, and I wouldn't betray anyone, so I will keep paying the people that kept my company running

- AMD and Intel probably do the same crap, I don't want to do it again.

- PSUs are doing fine


[flagged]


This idea that AMD GPUs don’t work is absolutely ludicrous, and you’ve said it at least twice now.

Go ahead and back up your claim, if you don’t mind.


Here's my short history with AMD GPUs:

- Buy an AMD GPU labeled for ROCm during the Great GPU Shortage

- 6-9 months later, learn AMD discontinues ROCm support for that GPU. When I suggest this might be a false advertising or warranty issue, AMD and vendor both point to each other

- Install an old version of ROCm to get work done

- See odd crashes where my system goes down hard for no reason

- Read more documentation, and learn ROCm only works headless

- Find ROCm runs more slowly on my workloads than CPU

- Find most of the libraries I want to use don't work with ROCm in the first place, but require NVidia

- At that point, I bought NVidia, and everything Just Works.

That's a shortened version. I'd take a 5x speed hit for open source, but I won't take 'not working.'

This story is super-common.


I really, really don't think this scenario is super common. Shitty, yes, definitely. But super common? Definitely not. You're talking about running GPGPU workloads on a budget, consumer-grade GPU that had a ~5-6 year old chipset when you purchased it. Yah, you can find other people complaining if you Google it, but realistically, how many people do you think this affected?

It sounds like Nvidia was a much better option for you from the start, and I'm surprised anyone purchased a Polaris 10 AMD GPU for GPGPU in 2020.

Conversely, I've had an RX580 that I purchased shortly after launch, and I've had zero issues with it. I've used it in a normal PC, a self-built Hackintosh and an eGPU enclosure that I use with my 2016 MBP in both Windows and macOS.


Look, this is an unfortunate experience, but I'm going to give you a wake up call: ML support has literally zero relevance in consumer GPU trends.

The idea that EVGA would make this the reason they don't work with AMD is laughable.

This would be like me claiming that NVIDIA's subpar Linux support is why Sapphire only works with AMD.


ML, no, although that will change with tools like Stable Diffusion.

GPGPU, definitely.


Most people are buying AMD GPUs for gaming or productivity (e.g. blender), not machine learning, and its working great for them. ROCm is currently a joke. Maybe in a few years AMD will care enough to try to participate in the ML hardware space.


I am willing to bet less than 1% of the GPU market cares about ROCm, as much as I want ROCm to be competitive with CUDA.

i.e as a Gaming GPU, AMD is doing fine. Not Great, but fine.


>This story is super-common.

Amusing introduction for what manages to look like elaborate FUD.

Particularly, there are quite telling parts.

- It seems to be very specifically about compute, which is not what most people buy their GPUs for. Interestingly, your former "does not work" comment didn't even mention that.

- No timeline (is this 2016? 2018? 2020? 2021?). Particularly, ROCm today has nothing to do with ROCm two years ago.

- We know nothing about your application (what are you even trying to do?).

- GPU model and Vendor are omitted, so we cannot verify your story about support removal.

- Libraries "you want to use" are omitted, so we cannot check today's status of ROCm support.

NVIDIA, everything just works. (advertisement thrown in at the end)


> It seems to be very specifically about compute, which is not what most people buy their GPUs for. Interestingly, your former "does not work" comment didn't even mention that.

It's an anecdote. There are many more like it. However, compute is increasingly common, and I suspect we're hitting a critical point with tools like Stable Diffusion.

> No timeline (is this 2016? 2018? 2020? 2021?). Particularly, ROCm today has nothing to do with ROCm two years ago.

"Great GPU shortage" places it a bit after COVID hit.

> We know nothing about your application (what are you even trying to do?)

NLP, if you care, but that's true across most compute applications.

> GPU model and Vendor are omitted, so we cannot verify your story about support removal

It's a conversation, not a jury trial. RX570, if you care.

> Libraries "you want to use" are omitted, so we cannot check today's status of ROCm support.

If it please the court, the most popular NLP library for the type of work I do is:

https://spacy.io/usage

If it please the court, I was also using Cupy extensively, which has experimental ROCm support, which completely didn't work. It isn't officially supported either:

https://docs.cupy.dev/en/stable/install.html

If it please the court, I just made my own library which is tied to CUDA as well, not for lack of trying to make it work with ROCm. AMD will have a bit more of a hole to dig out of if it ever tries to be competitive.


>"Great GPU shortage" places it a bit after COVID hit.

>RX570

My condolences. I was lucky enough to get a Vega 64 (Sapphire's) on launch. I'm still using it today. RDNA3, together with new electricity prices, might finally get me to upgrade.

>If it please the court, I just made my own library which is tied to CUDA as well,

HIP is meant to solve that problem. Your library might be auto-convertible. It's pretty much CUDA with everything renamed to HIP. It can then run on both CUDA and ROCm.
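
For what it's worth, here's a minimal toy sketch (my own example, not anything from your library) of what HIP source looks like; it's essentially CUDA with the cuda* runtime calls renamed to hip*, and hipify-perl/hipify-clang do that rename mechanically for existing CUDA code:

    // Toy SAXPY written against the HIP runtime (a sketch, not production code).
    // Built with hipcc it runs on ROCm; HIP also has a CUDA back end, so the
    // same source can target NVIDIA hardware. Error handling omitted.
    #include <hip/hip_runtime.h>
    #include <vector>
    #include <cstdio>

    __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // same built-ins as CUDA
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        std::vector<float> hx(n, 1.0f), hy(n, 2.0f);

        float *dx, *dy;
        hipMalloc((void**)&dx, n * sizeof(float));   // cudaMalloc -> hipMalloc
        hipMalloc((void**)&dy, n * sizeof(float));
        hipMemcpy(dx, hx.data(), n * sizeof(float), hipMemcpyHostToDevice);  // cudaMemcpy -> hipMemcpy
        hipMemcpy(dy, hy.data(), n * sizeof(float), hipMemcpyHostToDevice);

        // <<<...>>> also works under hipcc; hipLaunchKernelGGL is the portable spelling.
        hipLaunchKernelGGL(saxpy, dim3((n + 255) / 256), dim3(256), 0, 0, n, 2.0f, dx, dy);
        hipDeviceSynchronize();                      // cudaDeviceSynchronize -> hipDeviceSynchronize

        hipMemcpy(hy.data(), dy, n * sizeof(float), hipMemcpyDeviceToHost);
        printf("y[0] = %f\n", hy[0]);                // expect 4.0 = 2*1 + 2

        hipFree(dx);
        hipFree(dy);
        return 0;
    }

The caveat, as you found, is less about the source-level port and more about which GPUs and driver stacks ROCm actually supports at any given time.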


Here's my basic problem. Neither I, nor anyone I work with, want to understand what "Vega 64 (Sapphire)" or any other of this stuff is. I'd just like things to work.

I bought a card advertised to work with ROCm, and got 9 months of use from it, which was just about enough to set up a development environment, since most of the real work is in data engineering, dashboarding, etc.

I did take the time to understand this when things broke, but that's not really a reasonable expectation. My recollection of this will be:

"AMD market tools at the stability and maturity of early prototypes as production-grade code" and "AMD GPUs might stop working in a few months if AMD gets bored." Experiences like this DO burn customers. If AMD had advertised this as being not-quite-ready for prime-time, I would not have felt bad. The gap between advertising, fine-print, and reality was astronomical.

You could point me to github issues and say all this was public, but that's not reasonable to expect of someone buying a GPU. If I walk into a store, and walk out with a card labeled for ROCm, then ROCm should work.

One of my colleagues bought an ancient NVidia card, during the same shortage. It just works.


Intel Arc sure, but I haven't experienced any issues with my 580, or my 6700 XT, nor have I heard of any widespread instability.


Your 580 no longer supports ML: https://github.com/RadeonOpenCompute/ROCm/issues/1373

If it did, it would only do so headless: https://docs.amd.com/bundle/Hardware_and_Software_Reference_...

If you tried to run it with a monitor, it would appear to work, and then your system would become very unstable and crash hard.

I hope that helps!


Wow. EVGA was always the gold standard for Nvidia cards IMO (until Nvidia started producing and selling their own). Unbeatable warranty. Cooler designs that weren't too offensive or jarring. Decent software. What a shame.


> EVGA was always the gold standard for Nvidia cards IMO (until Nvidia started producing and selling their own).

Oof, this sentence just proves their point.


>>Unbeatable warranty

They turned from god tier to trash tier after Brexit. They're literally the only company that didn't invest in a UK-based repair centre, so when I recently had to send my broken EVGA Black 2080 Ti in for repair, I was told to ship it to Germany, and then got charged £200 in customs duties for the replacement shipped back to me. The thing is, there is a way to avoid customs charges on replacement items following a warranty claim - but EVGA doesn't care to fill out the customs documents properly. I messaged them telling them exactly how to fill it out so it would go through without charges (we do this kind of thing all the time), and they basically said yeah, sorry, not our problem, we gave you a new GPU for free, so why are you complaining. And I'm not the only one with this issue - a quick look at their forums shows loads of UK-based people getting charged customs duties for parts sent by EVGA. They just advise appealing to HMRC for a refund (which is actually pretty difficult to do and can take months), as if it weren't their bloody issue in the first place.

Already made a vow to never buy anything from EVGA again after that, so I'm pretty glad this won't even be a risk now.


That sounds like it's more a problem with your country's ongoing deinternationalization than EVGA being bad. Would you expect a company to open a Russia-based repair center right now?

When a country is deliberately harming its international trade ability, it's not on companies for not playing ball—it's on the country not to pull the ground out from under its citizens and companies doing commerce there.

EVGA did nothing wrong. Your country changed the deal, EVGA chose to not humor it.


Ironically the EU warranty rules are still in place, and it feels to me like this would probably break them. EVGA can say "Britain has become a bad place to do business post-Brexit" and leave; plenty of people, especially the young people who buy graphics cards, would support it (but probably wouldn't care enough to import their products from elsewhere).

But staying and then making your customers responsible for unexpected charges when your own product turns out to be faulty seems like a bad sign.


But that is how repairs and warranties work across international borders, unless there are trade agreements that say otherwise. It sounds like this person got a new GPU as a warranty replacement, and customs was like, well, this attracts such and such a tariff.

And typically, companies and people will just ignore you if you ask them to do something they think is illegal, like declaring an item as a gift when shipping internationally. People try to get me to pull this all the time when shipping on eBay, to save on import costs.


There is a tariff code specifically for warranty replacements, though, and EVGA doesn't use it because they are incompetent. If they did, no customs duties would be charged. I had to send my other GPU to Nvidia for replacement; they shipped a brand new card to me from Hong Kong and.....no customs charges. Why? Because the paperwork was done correctly and indicated this was a replacement. EVGA includes paperwork as if it were a brand new card sold to a new customer, which it is not - and again, I'm not the first person to complain to them about this, but they just refuse to do it correctly for some reason, effectively sticking out their middle finger to all their UK customers (who maybe only bought EVGA to get their excellent support, and instead got.....this).


I disagree, seems like Nvidia actually violated the tariff code: https://www.gov.uk/guidance/using-outward-processing-to-proc...

"For replacements, you will be charged VAT on the full VAT replacement value."


No they didn't, because the replacements are zero-rated for VAT. You include the original invoice for the sale showing that VAT was originally paid, and then the replacement becomes 0% VAT rated on import. So the page is correct, but it doesn't tell you that in most cases, for commercial goods, the VAT due is 0%. You would be liable for VAT if the original didn't have VAT paid on it, or if you couldn't prove that it did.


I don't know what the rules are in the UK these days, but in the EU you can always deal with the seller (assuming they still exist). That is guaranteed to be free for you. If they want to import a replacement, they can do that and deal with the customs.

https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CEL... & https://europa.eu/youreurope/business/dealing-with-customers...


Indeed. Resellers telling you to deal with the manufacturer is such a massive shirking of responsibility


No, I'm sorry, but you are 10000% wrong about this. I hate Brexit and everything this country has become, but this is absolutely one hundred percent on EVGA. There is a way to send items internationally after warranty repairs so they don't attract charges: there is a tariff code that you use and a declaration that you make, which is something I told EVGA explicitly, and they still decided to ignore it. That's shit customer service.

>>EVGA did nothing wrong.

Yes they did. You can't fill out customs documents incorrectly and then just throw your hands in the air and say "well, it's your fault for Brexit, what do you want us to do? File official paperwork correctly? Fuck you for buying a £1200 GPU from us, I guess".

I'd honestly much rather they just said "look, we are too lazy to read up on how to fill out customs declarations, so we are just going to stop providing warranty services to our UK customers". That would have been more honest at least.


The methods you speak of cost more (in labor) for EVGA. EVGA honoring their pre-Brexit policy and not moving mountains is well within their rights.

It is your country's fault that Brexit happened, not EVGA's. EVGA did what they were supposed to—replace your GPU. You choose to stay in England, as an immigrant, meaning that while you may not like Brexit, you consent to the country's laws.


I'd love to hear how typing in the correct tariff code and attaching the original invoice to the customs paperwork costs them more money in labour. Filling out customs paperwork correctly isn't "moving mountains"; it's part of the ordinary course of doing business. If EVGA doesn't want to do this, then they should stop supporting their UK customers, full stop.

Also, this isn't a "pre-Brexit policy" - if you buy anything from EVGA right now here in the UK, are you aware that having it repaired or replaced will incur customs duties due to EVGA's incompetence? You won't have this problem buying a product from almost any other manufacturer, so not disclosing this information is dishonest at best and fraudulent at worst.


Doing the discovery to find the proper forms, working out what they have to fill in, and getting their legal department to make sure they're doing it correctly has significant costs.

You bought your card before Brexit. They're honoring the deal they made with you—not their fault they had the ground pulled out from under them with laws you consented to.


Sounds like a very weak excuse. Millions of businesses manage to fill out their paperwork correctly, but EVGA can't?

>>You bought your card before Brexit. They're honoring the deal they made with you

If I had bought it after Brexit, I would have had the exact same issue. Would that change how you feel about EVGA's behaviour?


We kind of can. England turned India into an impoverished country, backstabbed Poland at the start of WWII, raped half of Africa, broke the Middle East, killed most of the Native Americans, not to mention the whole slavery bit. It still has artifacts from all over the world. Oh, and China+Opium. Irish potato famine.

I have nothing against England, mind you, but I do have a problem with the English whining about tariffs, tourism, or whatnot after Brexit. Seriously.

If EVGA were addicting your country to drugs, shipping you off for slavery, and stealing your artifacts, you'd have a case. Complaining about someone not filling out your tariff form the way you'd like is sheer entitlement and hypocrisy. Boohoo.


>> but I do have a problem with the English whining about tariffs, tourism, or whatnot after Brexit

Good thing I'm not British then.

>>. Complaining about someone not filling out your tariff form the way you'd like is sheer entitlement and hypocrisy.

Really? Not filling out customs documents correctly is now anything other than lazy and incompetent? Wow.


> Good thing I'm not British then.

I'm not quite sure what you are. Can you clarify?

Brexit only affects the British. Do you mean you're not English?

For the general public: England is a country. Britain is an island, which includes England, Scotland, and Wales. The British Isles are a set of islands, which adds principally Ireland and the Isle of Man. The UK is a union of several countries, including Scotland, England, and Northern Ireland. Most of the English have nearly as poor a grasp of the differences here as they do of the random former imperial belongings they screwed up. Most assume Britain=England and call the English "Brits."

> Really? Not filling out customs documents correctly is now anything other than lazy and incompetent? Wow.

You mean like randomly drawing lines on a map to divide up the Middle East, without getting anyone local involved? Yes! It is.

Fortunately, it only costs you a couple hundred bucks, and not decades of war.

I have nothing against the "Brits," but each time they do this, I have this image of a rich, spoiled brat yelling at a minimum wage barista for 15 minutes about screwing up an order. Yes, she should have made a DOUBLE soy latte, but get over yourself. Sheesh.

As a footnote, I've met people who make "mistakes" like this precisely because they're annoyed at the "Brits" for any of a variety of reasons. I find it funny.


I'm an immigrant living in the UK. So British laws and decisions like Brexit affect me despite the fact that I have no say in them in any way, shape, or form. But that's beside the point.

I take issue with you calling my approach entitled. There's a correct way to do paperwork and an incorrect way. EVGA does their paperwork the incorrect way - demanding that it be done correctly is not entitlement, because... how can it be? It's like saying that asking someone to follow the speed limit is entitlement.

Or to give another example - it's as if I opened a company sending products to American customers but never bothered to read up on American customs forms, so my customers ended up getting charged extra duties unnecessarily.

If they complained about it, would you call them entitled?

>>You mean like randomly drawing maps on a line to divide up the Middle East, without getting anyone local involved? Yes! It is.

So British imperial past means that EVGA can simply not bother to fill out their paperwork correctly? That's a leap of logic if I have ever seen one.

>>Most assume Britain=England and call the English "Brits."

I think you actually have a really poor understanding of how it works. English, Welsh and Scottish people are all British, so calling English people Brits is 100% correct. The only citizens of the United Kingdom who are not British are the people of Northern Ireland, who are simply Northern Irish - but I'm just not sure how that's relevant to this discussion or why I am explaining this to you.


This is an inappropriate response for HN. Please stick to the topic and contribute in an interesting and constructive way.


"not to mention the whole slavery bit"

Stopping/banning the transatlantic slave trade started by other European nations?

Still has artifacts from around the world? Many museums do, and the previous owners often took the items in battle from other nations.

Wouldn't Americans be more responsible for killing most Native Americans?


Your mistake was to pay the customs duties and receive the goods.

What you should have done was taken them to tribunal for the missing warranty item they owed you.


Very true. Should have done that.


Did you buy it from EVGA? Or a retailer? If a retailer, how old was it? If fairly new, then it was very much their problem, and not for you to deal with the manufacturer.


From Scan UK - it was nearing 3 years old (in fact it had like 30 days left on the EVGA warranty). I contacted Scan, but they said they couldn't help; they would have in the first 2 years, but after that it's up to the manufacturer whether they still provide warranty service or not.


Yeah I feel like the whole 6 year consumer protection has got quite weak recently.

Amazon told me to go get bent recently when one of their own products broke after 18 months.

I guess it doesn’t mean much when you have to prove the defect was there originally. It seems like one of those laws that sounds good on paper if you highlight small parts in isolation.


I had the opposite experience with their warranty. I had an EVGA PSU die after less than a year, called them for warranty replacement, and they point blank refused to provide service unless I signed up for an account on their website, the sole purpose of which as far as I could tell was to harvest data for marketing purposes. I had the original proof of purchase, receipt, retail box, etc, but they didn't care. So I just don't buy EVGA products now, which isn't hard since I also don't buy Nvidia products. And it's certainly possible that this is standard (bad) practice now, but I haven't had any other PC parts fail under warranty.

That being said, I think this is a good, principled move. Nvidia is infamous for being unpleasant to work with, but their overwhelming dominance of such a large market forces people to do it anyways, so it's nice to see someone stand up to that.


Their 1080 launch was botched, with improper RAM cooling and the ensuing RMAs/replacement memory thermal pad kits they had to send out. I went through 2 cards that died on day 1 myself.

I seem to recall their 3080 launch also being a bit bumpy...


EVGA is one of the most reputable board makers, so they will be sorely missed in the market. It was always worth it to spend a few extra bucks on EVGA, because you knew that they'd have good warranties and service.

At least their PSUs will still be around?


This is a pity. I called them once, 1 week before the warranty expired on a graphics card that had become glitchy. I think the people I spoke to were in SoCal; they were knowledgeable, took their time, and were happy to help. I'll be favoring their other components the next time I want a PC built.


My limited experience with their warranty/RMA folks has been very positive, this is an unfortunate development. Watching the GN video, I understand their motives though.


My 1070 was two years out of warranty when it caught fire, and they still replaced it. They're so helpful to their customers. I'm sad to see them go.


Yikes how did it catch on fire?!?


It was a known issue with the first gen 1070/1080 cards. Something about inadequate VRAM cooling. EVGA put out a notice that this was an issue and they would replace the bad part for free but I apparently missed the memo. They also pushed out a BIOS update that would prevent it from overheating but I missed that one as well.


Presumably EVGA didn't do forensics, and if they did, they may not have had incentive to publicize the defect.

Suffice it to say that most materials are flammable if they get hot enough. And there are plenty of reasons why a circuit board with many components hooked up to a power supply might get really, really hot. I'm surprised electronic components don't catch fire more frequently.


They publicized it; I just didn't see it.

https://eu.evga.com/thermalmod/


Slowly and then very quickly.


Presumably sometime after the magic smoke escaped.


Well don't buy their PSUs, since they're just some random Chinese PSUs with an EVGA logo slapped on it.

Source: I had one of their PSUs; it randomly died due to a failing fan, and I had to wait 3-4 months for a replacement. They only sent me one after I complained on their customer satisfaction form, which was also broken and directed me to some US agents while I'm in the EU. I finally got a new PSU with a US cord and an EU cord adapter thrown into the package. Kinda bizarre.

Many people on the web report issues with their PSUs and recommend against them.

PS: I had great experience with their RMA process. They replaced a 4.5 year old GPU and even gave me a better one. So I'm still pretty happy with them.


> Well don't buy their PSUs, since they're just some random Chinese PSUs with an EVGA logo slapped on it.

That's basically every brand. Corsair/ASUS/EVGA/NZXT/Cooler Master... none of them actually make PSUs.

The only "big" brand I know of that sells directly is Seasonic.


I have an EVGA PSU made by Seasonic, and they have others made by FSP. I wouldn't say their PSU OEMs are "just some random Chinese PSUs".


I've had bad enough luck with component failures that I've probably cost them nearly as much in RMA costs as I've paid them for products - but I've still always had a good experience with their customer service and warranty processing for motherboards, power supplies and GPUs. Definitely a shame to have them step out of the GPU market.


The nVidia/board-partner relationship seemed to go completely sideways around the launch of the 3xxx series. If you remember back to the launch, nVidia just hurled chips at partners with late notice (so some glitchy 3rd-party boards shipped) and at pricing the partners couldn't compete with nVidia's own boards on.

No idea, back when crypto took off and supply was constrained, who was taking the profits - but I'd guess nVidia. Now the partners are left holding huge numbers of boards as prices crash. My guess is that to get allocation from nVidia they had to commit to volume and price, but retailers can just return unsold stock as sale-or-return (or just demand to pay less as they drop prices).

I can see why if you survived that, you might just want to step aside from the market - as next time it might take you down. Especially if you're EVGA and prided yourself on premium products/support - and smell a race-to-the-bottom coming.

As the majority of what you pay goes on the GPU itself (the memory is at least a commodity), there's probably not a lot of meat left for the partner that's taking the risks.

Does make you wonder what quality's going to be like on 4xxx.


The title is slightly wrong. The actual announcement says EVGA will not produce any next-gen video cards, not just Nvidia’s. Implying they won’t produce AMD or Intel ones either.


I am watching the GN video right now, and they mentioned that they aren't planning on producing Intel or AMD cards.


has EVGA ever sold AMD or Intel GPUs?

AFAIK they haven't.


Yeah, I'm fairly certain they were 100% NVidia.


Correct.

According to Kotaku, they won’t be in the GPU game anymore with 3rd parties: https://kotaku.com/evga-pc-gaming-nvidia-gpu-rtx-40-series-3...


What a bummer. My plan for this fall was to go all in on a new high-end machine with all EVGA parts once the 4090 and 13900 are available

Now I'll have to spend a ton of time trying to figure out what other brands to deal with instead. Makes me think I'll be better off with a pre-built instead of buying components this cycle


You will still be able to buy EVGA motherboards and PSUs, they aren't quitting the PC component space altogether.

I think ASUS is probably the next best option if you still want to stick with Nvidia.


Hard pass. I've owned a few hundred GPUs over the past couple of years, and Asus reliability and customer service are second-worst only to Nvidia itself.

I actually had much better success rates with Gigabyte cards this gen than Asus. For Nvidia RMAs, I would often have to fight with someone in India who would demand I try the GPU in two separate full PC builds. I had the ability to do that, of course, but what normal person could satisfy that RMA request?

I received more than a few asus rma cards that had severe damage. Ports didn’t line up, pins were bent, and in a couple cases there was just tape all over the card and it wouldn’t post. Tape. All over the card.


Same boat. I sat out the 30XX madness on an RTX 2080 (ironically an EVGA) and fully intended to just buy whatever EVGA had in the 4080ish range. I've used EVGA cards forever, and I'm genuinely sad they are going away, because their customer support was/is amazing, and on an 800 quid part that is reassuring.

Now I have to figure out who is the next best supplier, gonna guess Asus but will have to figure it out.

Might even bin nvidia entirely and look at the AMD options for next build.


I'm in the same situation; I had held off on purchasing a GPU during the pandemic and had decided to buy an EVGA 4000-series GPU when they came out, because I've only heard good things about them.

It was going to be my first cycle building the PC myself, too.


I know they don't have plans to work with AMD but I'd consider switching to AMD for my next card if EVGA was manufacturing it. They fixed an old card of mine way out of warranty so I have some brand loyalty here.


Same here. I was already considering AMD. Only thing keeping me back was driver stability. Is it as good as NVIDIA’s?


Seems extreme, but if you aren't making a profit, or enough profit to fund future operations, and there isn't any sign of that changing in the future - then yes you should close that part of the business.

Which is certainly rough on the employees. But the perplexing part of the announcement is that they aren't going to be entering other segments of the PC industry. Perhaps margins on motherboards aren't healthy enough? High-end designer cases could make money but I suspect the volume isn't there.


Don't they already make motherboards?


The gold mine of crypto mining has dried up. NVIDIA needs to squeeze their partners so they can maintain profits, as does AMD. They want to be like Apple, the only one selling their cards. And EVGA had something that no other board partner, or NVIDIA itself, had: great customer support.


Uhh, what does EVGA carry besides graphics cards? Are they going to carry AMD cards now or just going to downsize and survive on other minor component sales?


The latter. They have no plans of building video cards, period.

My guess? They got so stressed out dealing with the chip shortage that they're all burned-out internally.


They're being undercut by Nvidia as Nvidia rushes to clear their remaining 30-series stock. EVGA claims to be losing money on every 3080-and-higher card, and they can't compete with Nvidia's margins.


Motherboards, power supplies, a bunch of stuff. All visible on their website :)


I bought a ludicrously cheap 1200w power supply from them once. No house fires yet!


The last I checked, EVGA's PSUs have a very good reputation


Yes, but EVGA does not exactly make PSUs; they are all rebranded OEM units from Super Flower, FSP, etc.


That works for me, if there is competent quality control.


Well they only rebrand Seasonic and Super Flower so they should be very good products. I currently run a G6 1000w unit (rebadged seasonic) and hope to use it for the rest of this decade or more.


The only PSU I've ever actually had fail on me was from EVGA, but their customer service was incredible as usual and replaced it quickly with no fuss.

I'm willing to chalk that up to just bad luck on my part, because they certainly don't have a reputation for selling bad PSUs, unlike Gigabyte.


… I've been wanting one of their PSUs for a while now, but every time I try to buy it, the site is down. (And the "buy" site appears to be separate from the "marketing" site, so I can see the page with the PSU and the "it's on sale!" banner. It's just the actual "shut up and take my money" part that apparently doesn't function.)

So IDK, maybe their PSU dept. is doing great.


Their PSUs are actually quite decent quality and are usually good value for the money. Not a big fan of their motherboards though (hardware is largely fine but the BIOS can be rather flaky).


They were always a PSU company to me, and that is arguably more important than the GPU. One can cause significant damage and house fires.


I mean, from the beginning their name was styled "eVGA" and they sold VGA cards.


Does it have anything to do with the huge stockpiles of Ampere GPUs that AIBs were stuck with after the price crash and right before the Lovelace launch?


Yes. Exactly the same thing actually happened in 2018 with the launch of Turing, right after the previous crypto bubble collapsed.

I guess EVGA saw that Nvidia was making no effort to change how they handle this situation and decided to quit rather than deal with it again.


Likely. According to GamersNexus, EVGA is currently losing hundreds of dollars on high-end 3000-series GPUs.


Damn. When I saw the headline and saw it was a Youtube link, I could think of only one person I'd expect to explain this properly. Was not disappointed (at least in that regard).


Here's the reason WHY EVGA is no longer working with NVIDIA:

-----

> EVGA and competing board partners have told us at nearly every launch that they don't find out basics about the very product they're partnered to sell, like the MSRP, until Nvidia's CEO is on stage. We're told this is true even for the cost to buy the chip; Nvidia apparently only gives partners placeholder costs for some GPUs, like flagships, until the MSRP is revealed publicly.

> It's hard to run a business when you don't even know the cost of the product you're imminently launching. We've learned that NVIDIA has both a floor and a ceiling for card prices on some cards (the ceiling only on flagships), with board partners restricted from selling flagship models above a certain cost.

-----

TL;DR

EVGA and competing board partners said they don't know the MSRP until Nvidia's CEO is on stage. Even the cost of the chips themselves is unknown until the MSRP is revealed.

When you don't know your product's price, it's challenging to manage a business. NVIDIA has a price floor and cap for select cards, and board partners can't sell flagship models above a certain price.


Oh wow the price cap must be infuriating during the mining boom.


> It's hard to run a business when you don't even know what the cost of your product is that you're imminently launching

This strongly seems like a lie. They've been in business long enough that the current relationship between them and Nvidia appeared to be successful. If it truly wasn't successful, they could have moved over to manufacturing AMD GPUs. Instead, they just dropped the business entirely.

It seems much more likely that they recognize that the five-year GPU boom is over with the ETH merge, didn't want to have to weather the coming bust cycle, and are trumping up excuses to exit the business so that their brand isn't damaged for their other products.


If it is any help, their decision was made in April, 5 months before the merge. They said they lost money on 3080-and-up cards as well.


It was June, not April. See the correction.

Regardless, they knew the merge was happening, and what that would mean to the market, so it's fairly suspicious nonetheless.


Big bummer. All of my GPUs (1070 and 1080 Ti) have been quality EVGA. Was planning on going with them if there is a 48GB 4090. Fuck Nvidia, really hoping another chip designer will compete in the consumer deep learning hardware space. AMD hasn't succeeded with a cuDNN equivalent. I can run some models on the M1 Max, but at roughly the same speed as the 1080 Ti. It's really all about RAM. If the M2 Extreme or whatever has 256 GB of GPU RAM, I'm willing to tolerate the slower training time to avoid playing Jensen Huang's games. Fuck Nvidia.


I thought they were principally a GPU company, and Chinese. Wrong on both counts: they're Californian and apparently not going to be a GPU company at all. What a fascinating discovery!


I believe their founder was born in China, but the company was definitely established in California.


I have a lot of respect for EVGA for honoring GPU waitlists over the last few years.


Well, except for SKUs like the FTW3, where they took a waitlist and then promptly decided not to manufacture any for the next two years in favour of higher-markup SKUs, until they sent an email saying "there's plenty of supply now, so we're done".


I've never really understood why there are all these 3rd-party variants of GPUs in the first place and not just all from Nvidia.


Same reason fast food places franchise instead of running every store: all the benefits with none of the risk.


This is a shame. While I haven't bought a new GPU in some time, I've been using EVGA since my 7800GT. They've always had excellent customer service and RMAs were painless and quite often they'd even send you an upgraded card. One of my 780 Tis broke and I received a new GTX 980. I also had to RMA my 7800GT and received an 8800GT. Their upgrade program is also quite generous, though I personally have never taken advantage of it.

I'm not sure that it'll affect me as I don't know the next time I'll consider building a gaming pc (if ever again) but it still saddens me that one of the "good ones" is exiting the business. I do hope they consider supporting AMD cards in the future, which I find easier to support as a consumer than Nvidia.


There was an official statement released by an EVGA employee on their company forum: https://forums.evga.com/Official-Message-from-EVGA-Managemen...

- EVGA will not carry the next generation graphics cards.

- EVGA will continue to support the existing current generation products.

- EVGA will continue to provide the current generation products.

"EVGA is committed to our customers and will continue to offer sales and support on the current lineup. Also, EVGA would like to say thank you to our great community for the many years of support and enthusiasm for EVGA graphics cards."


EVGA will sell you the new power supply and AIO cooler you'll definitely need for the 4090 to reach its advertised clock speeds.


Did it ever make sense to have partner cards? I never understood the business model. You effectively let go of substantial control over the product. I understand EVGA had a good/great reputation, but this might be another indication that Nvidia is preparing for the coming downswing as crypto mining comes to a halt. That said, Nvidia still has a monopoly.


Yes, depending on the class of GPU

* at the lower end, partner cards can support special form factors and lower power variations.

* at the midrange, partner cards can, like motherboards, use better power delivery and cooling solutions to get more out of a specific chip than the reference designs.

* at the high end, it's all about different cooling solutions and better power delivery to squeeze every last bit of performance out of these chips.

Adding to that, don't underestimate how much industry knowledge the partners have. They often know better than Nvidia and AMD themselves how to make a board for a GPU. There are lots of optimizations that require years of knowledge and experience to design.


Midrange cards are low margin and are redesigned to be cheaper than the reference, not to have better power delivery. It's more like outsourcing because a Taiwanese OEM is well positioned to have an army of labor design and manufacture boards for cheaper than Nvidia or AMD can.

> They often know better how to make a board for a GPU than Nvidia and AMD themselves.

Maybe a long time ago, but not for recent generations. AFAICT almost all 3rd party AMD cards sold are reference design.


Is this a nice way of saying they're winding down their business?


Sounds like a wind-down but still offering support for existing products.

It's an expensive thing to do, but is very noble!

I guess the CEO will be taking the odd support call at 2am 10 years from now...


There's a picture of Linus Torvalds that seems appropriate here.


This is crazy, but I don't blame them. I've heard that Nvidia really takes advantage of its partners.


This is a big blow for Nvidia. I currently own a Founders Edition card, but all of my other NV cards were EVGA. I'd exclusively buy EVGA cards; the only reason I don't now is that 2-slot cards are hard to come by from 3rd-party vendors. EVGA is the first NV brand I go to. It was my first card after I finally moved from 3dfx - a GeForce 2 MX. I used the step-up program many times over the years. They honored their RMAs, with advance RMA.

If I'm NV, I'd give Evga privileged status over other vendors (more profits). They've done a lot for Nvidia.

The other brand I could see taking their place is Corsair, but no one is going to take a bad deal with Nvidia. This may be the beginning of the end of 3rd party cards. Just buy them all direct, like CPUs.


As somebody who is out of the loop, why might this be?


GamersNexus has a long video on it: https://youtu.be/cV9QES-FUAM


Anyone care to summarize it?


"EVGA has terminated its relationship with NVIDIA. EVGA will no longer be manufacturing video cards of any type, citing a souring relationship with NVIDIA as the cause (among other reasons that were minimized). EVGA will not be exploring relationships with AMD or Intel at this time, and the company will be downsizing imminently as it exits the video card market. Customers will still be covered by EVGA policies, but EVGA will no longer make RTX or other video cards. "



EVGA was probably the best Nvidia AIB, very shocking


The video says they have around 280 employees. That seems really thin for a company that's so well-known and produced so many products for so long, especially also making PSUs and accessories too. They must be really efficient.


Still rocking my EVGA 1080 Ti Hybrid SC2 from 4 years ago, and it's rock solid - not a single issue.

Even mined with it during the winter and obviously lots of gaming over covid...

Sad I won't be able to get a 4080 hybrid (so much quieter with the AIO).


High sales != profit.

Even if EVGA's GPU sales are high, the company can lose money.

EVGA probably makes more money on their PSUs, considering the cost is low and the markup can be high.


It would be interesting to know what the board thinks of this. Does Founder/CEO Han control a majority of the board? What about the minority shareholders? Will minority shareholders bring a lawsuit to stop the majority from harming their interest in the company (breach of duty to minority shareholders)?


I got an FTW 3080 Ti for my new PC a few months ago, so I am very sad to see them exit the market. I worry about my current card's warranty though - will they still have 3080 Tis in storage in 2-3 years if I get unlucky and my card dies...


They said they are planning to keep some stock for those purposes


Their mice are on sale, highly recommend them: https://www.evga.com/products/productlist.aspx?type=12


I'm surprised they're not pivoting to AMD cards. I'd have guessed they were a top tier AMD card manufacturer, based on their brand, and the fact that I have some of their old NVIDIA cards laying around.


Intel would seem to be the bigger opportunity since Intel is trying to get established and has a lot of money.


There was _ONE_ truth in PC gaming: EVGA is quality. Period. If you had the cash, you bought EVGA for peace of mind. This is the end of a dynasty, wow!

What are they going to make to continue making money? Sell power supplies alone?


I hope Sapphire will stick with AMD and that this isn't part of some bad trend.


I downgraded my GPUs to a passively cooled GT 1030 with GDDR5 memory.

My bet is all mainstream games will work on those at 60 FPS for eternity, because of peak performance per watt and energy costs rising.


I'm just glad the ARM acquisition didn't go through. Jealous of the margins Apple pulls out of their products, nvidia clearly wants to be apple. Not good for the consumers.


Now the ARM ISA can die in peace, snuffed out in loving grace by Apple engineers desperately trying to turn it into x86.


I got a 3080 thanks to their waitlist program. This sucks to hear and hopefully they get back into the game in the future. Great support as well.

Which brand is as good as EVGA now that they’re out?


I’m incredibly disappointed but not surprised with NVIDIA. My next card will likely be from AMD. Never thought I’d say that (I haven’t had an AMD card in more than a decade.)


Great decision by EVGA. They were not going to make any profit on 4000 series. Now is the best time to negotiate a big sign-up bonus from Intel.


I never understood what value these third parties (EVGA, MSI, etc) were providing over just getting the card straight from NVIDIA / AMD.


There was like fifteen years where you couldn't get the card direct from Nvidia/AMD because they didn't want to be in the business of making boards, just chips.

Also right now you can't buy direct from Nvidia outside select markets and specific retailers, who spent much of 2021 charging for that privilege.


Linus Tech Tips is piling on with some serious shade-throwing too. Sure seems like there was a lot of resentment below the surface.


Can someone think of a single company that has a positive relationship with Nvidia? They were laughed out of the CPU industry when they tried to buy ARM; virtually nobody wanted them to close that deal. They're such a bizarre company. I mean, I've bought several GPUs simply because they were usually the best available when I was looking, but most loyalty belongs to the AIBs and not to Nvidia themselves. They're kind of this weird entity that many want but nobody likes.

My guess is that Nvidia, realizing they represent 80% of EVGA's revenue, tried to get them to sign a very one-sided contract, and now they're down one of their biggest AIBs on the eve of their latest GPU launch.

Indeed I think there must be some pretty bad blood here between Nvidia and EVGA leadership.


Strange, NVidia entered existence off of their great relationship with SGI. Who would think they would have relationship issues?


Any chance they plan to make their own graphics chips to directly compete with NVIDIA and AMD?


Not much information, wonder why?


As much as I hate video, Gamers Nexus did a video on it - https://www.youtube.com/watch?v=cV9QES-FUAM

Seems like the main reasons are that NVIDIA has been a very hard partner to work with, by not disclosing the MSRP to partners until announcing it to the public, along with the direct competition in the form of Founders Edition cards that compete with AIB cards.


It would have been better if they had spun off their GPU division, but that's fair.


Maybe. The CEO told GN their margins were negative on high-end GPUs. I dunno if the GPU division is worth enough to make up for losing control of the EVGA brand once it's shared by 2 companies.


Spinning it off under a new name was my idea.


This must be one of the biggest dummy spits in the history of computer hardware!


Imagine working for a company and one day the CEO announces that it will literally commit company suicide.

I can't understand why he wouldn't step down; this puts the livelihoods of 500+ people at extreme risk for what seems like a selfish decision.


Did you even watch the GN video? "voluntary attrition" = employees are burned out and intend to quit.

I love how people pretend CEOs can make decisions that sacrifice profit just because they feel like it.


Beginning of the end for the recent GPU boom


Anyone want to provide a little context?


Wow EVGA was my favorite company


I wonder if this has anything to do with Intel ARC release, what if EVGA and Intel have a secret upcoming deal?!


The CEO is on record pretty much saying AMD and Intel could go to hell too.


They said at length they weren't doing GPUs from Intel or AMD.


On their forums people don't seem so sure: https://forums.evga.com/Will-evga-be-a-card-partner-for-new-...


? What does it matter what people on forums say? They said it wasn't an option and that they were shutting down their GPU business.

Also, I dunno if you heard, but Intel may be getting out of it [consumer GPUs] as well, as they're not competitive at all at profitable prices.


How much of this has to do with the overnight evaporation of the GPU mining market due to the Ethereum merge?


The timing is suspect. But I imagine having to deal with crypto bouncing all over and bubbles popping every few years is an incredible headache for them, especially given that their margin is even negative on certain GPUs. Who would want to deal with that much unpredictability, and with Nvidia holding you hostage? Sounds pretty sane to want to get out of that.

Really hate to see them go though. I've been with EVGA GPUs for more than a decade.


This was my first thought. They were prepared to do this, and once that market dried up, they are ditching people and trying to lay the blame elsewhere - and it seems to be working.


Announcing you aren't making the next gen product is not ditching people. They specifically state that they will continue to sell and service current gen gear.


They are firing (i.e. ditching) employees.


In the video, they say they informed nVidia of this in April of 2022, so probably nothing.


Beacon chain launched in late 2020. The writing was on the wall by early 2022.


Some of you may not realise that VGA stands for Video Graphics Array, so it's a bit silly for EVGA to stop selling video cards while keeping the name as is.


Guess I might as well RMA my 3090 FTW3 before the jig is up.

Very sad to see this announcement; some of my happiest nerd years in high school (the ones not spent blowing stuff up with friends in my backyard) were spent building PCs and trading up EVGA GPUs.

RIP EVGA - I hope PNY picks up the slack; ASUS boards have only really been trouble for me.


They will continue supporting 30 series cards and will respect warranties.


EVGA needs Nvidia, Nvidia does not need EVGA.

So this is either business suicide (Doubtful), a publicity stunt or an attempt to get the community to take side in a pricing beef between Nvidia and EVGA.


What is EVGA? Seems like a forum?


A well-known graphics card manufacturer since the late 90s, with a strong presence in the US market.


So they manufacture GPUs?


tl;dr – they turn Nvidia's product (the GPU, memory configuration, and chip design) into the thing that consumers buy (the actual graphics card you slot into your PC).

They're what's called an AIB (stands for Add In Board). They buy the actual GPU chips from Nvidia (or AMD, for other companies) and manufacture their own graphics cards.

For some cards, an AIB might design their own PCB, or they might license the "reference" design from Nvidia. They'll also engineer their own cooling solution.

Often, an AIB will offer several different variants based on the same GPU. For example, EVGA offers a few different models of Nvidia's 3080 GPU – mainly their XC3 and FTW3 models. The FTW version is a little more expensive, but has a larger heatsink, better power delivery, etc. The XC version will probably run hotter and louder, but can fit in smaller cases. They're both "RTX 3080"s, but EVGA has built 2 distinct products (actually way more than 2, in real life) from the same Nvidia GPU.


On August's earnings call, Jensen Huang stated that Nvidia was going to make moves to clear cards before the upcoming launch of the 40 series: "We implemented programs with our Gaming channel partners to adjust pricing in the channel and to price-position current high-end desktop GPUs as we prepare for a new architecture launch." <- https://www.yahoo.com/video/edited-transcript-nvda-oq-earnin...

Could this be a move in that direction? Will EVGA suddenly come back once they sell out? Is this planned? Conspiracy?


Basically NVIDIA can withstand the hit to margins by huge price drops but AIBs cannot.


EVGA is forever on my shitlist for selling me two cards with unlimited warranties (GeForce 4s?), then switching teams (red/AMD) & telling me they wouldn't replace the card that up & died like 2.5 years later.


Can you elaborate? I don't recall EVGA ever making AMD GPUs


He might've gotten them confused with XFX, which switched from Nvidia to AMD.


This is, by far, the most confusing thing to happen in the industry in quite a while.

Per the video: Nvidia represents 80% of EVGA's revenue. EVGA isn't planning to expand their product lines. They're "financially sound", staying in business, and not planning to sell the company. EVGA's view is that Nvidia has been disrespectful as a board partner, making it "difficult if not impossible" to be profitable.

This is just incomprehensible through any lens except that most of EVGA's statements are lies or half-lies and they're not actually financially sound.

They're a company that has tried, several times, to expand into other market segments, motherboards & power supplies being the two biggest ones. These never took off quite like graphics cards, but it feels like a big reason is that they never invested enough in those areas.

Really sounds like hard financial straits, not enough R&D, and an aging CEO who just wants to retire. I wonder if they ever reached out to a company like Corsair about an acquisition?


Is the percentage of revenue from a given source really a relevant metric? Hypothetically if 80% of their revenue comes from selling GPUs, but the GPUs aren't profitable, then they are just doing free work for NVIDIA.


>This is just incomprehensible through any lens

What about the lens of "Nvidia only allows us to sell at a 1% margin"? Revenue means nothing if you're not making profit.


Checking out EVGA's Glassdoor page, it looks like employee satisfaction is abysmal, with little approval (23%) for the CEO.



