Hacker News | blihp's comments

I don't know... I think it's pretty cool. In fact, I just found a podcast talking about this very thread (about a minute in, your comment even gets discussed ;-) https://drive.google.com/file/d/130s6OzcfsZam8V-6S5ugKmc0M7O...

I may have to re-task my paperclip making AI on generating podcasts...


They can't since Meta can spend billions on models that they give away and never need to get a direct ROI on it. But don't expect Meta's largess to persist much beyond wiping out the competition. Then their models will probably start to look about as open as Android does today. (either through licensing restrictions or more 'advanced' capabilities being paywalled and/or API-only)

> But don't expect Meta's largess to persist much beyond wiping out the competition

I don't quite follow your argument - what exactly is Meta competing for? It doesn't sell access to hosted models and shows no interest in being involved in the cloud business. My guess is Meta is driven by enabling wider adoption of AI, and their bet is that more (AI-generated) content is good for its existing content-hosting-and-ad-selling business, and good for its aspirational Metaverse business too, should it pan out.


I'm arguing that Meta isn't in this for altruistic reasons. In the short term, they're doing this so Apple/Google can't do to them with AI tech what they've done to them with mobile/browsers. (i.e. Meta doesn't want them owning the stack, and therefore controlling and dictating who can do what with it) In the longer term: Meta doesn't sell access... yet. Meta shows no interest... yet. You could have said the same thing about Apple and Google 15+ years ago about a great many things. This has all happened before and this will all happen again.

Likely not tense, as it seems pretty clear where everyone stands on this. OpenAI and Anthropic are likely having conversations along the lines of 'move faster and build the moat!' while Meta is having a conversation along the lines of 'move faster and destroy the moat!'

It's not that Meta has anything against moats, it's just that they've seen how it works when they try to build their moat on top of someone else's (i.e. Apple and Google re: mobile) and it didn't work out too well.


Also the EU is far more aggressive and opinionated about what they want to see in the market than during the smartphone era. And this is spreading across the world.

So being committed to an open, standards-based strategy is an easy way for them to avoid most of the risks.


So they’re trying to be what Android is to the iPhone?

Precisely, and that extends to their XR ambitions as well: https://www.meta.com/blog/quest/meta-horizon-os-open-hardwar...

When we get to the end of the hype cycle, they will be. The only question is if people will be interested in footing the power bill for any of the ocean of obsolete data center GPUs that companies will be dumping.

This was just the author's current issue on Android. In another month or two it would have been something else. I fought similar battles for the better part of a decade before finally giving up when Google's policies made it so that even keeping apps running (at least in my case) was an economic non-starter. The sheer amount of bureaucratic B.S.[1] they constantly fling at you while simultaneously bit-rotting existing applications is insane.

[1] Sometimes it's related to their store listing policies which are constantly changing, sometimes it's related to taxation in a specific country, sometimes it's related to laws in a specific country, sometimes it's actually related to software (on-device or web services!) they are forcing you to update/change etc. etc.


This was the company that made all sorts of noise about how they couldn't release GPT-2 to the public because it was too dangerous[1]. While there are many very useful applications being developed, OpenAI's main deliverable appears to be hype that I suspect, when all is said and done, they will fail to deliver on. I think the main thing they are doing quite successfully is cashing in on the hype before people figure it out.

[1] https://slate.com/technology/2019/02/openai-gpt2-text-genera...


GPT-2 and descendants have polluted the internet with AI spam. I don't think that this is too unreasonable of a claim.


I use FreeCAD on a very regular basis and can understand why it's not more popular: it's very powerful but has some very sharp edges that will often have me using it in a state of near rage. Topological naming comes to mind, but there are various other issues that I've hit like a brick wall (in that you can't work around the bugs/limitations so much as you must rework your design to avoid them, which can be tedious and frustrating) when designing something non-trivial.
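
For anyone who hasn't hit it: "topological naming" refers to the fact that downstream features reference geometry by generated names like "Face6" rather than by any stable identity. A minimal sketch of that referencing scheme, run in FreeCAD's Python console (the Part::Box and the property edit are just illustrative assumptions, not from my actual designs):

    # Run in FreeCAD's Python console. Shows that faces are addressed only by
    # generated index names ("Face1".."Face6"); downstream features (sketch
    # attachments, fillets, pads) store references like ("Box", "Face6").
    import FreeCAD as App

    doc = App.newDocument("TopoNamingDemo")
    box = doc.addObject("Part::Box", "Box")   # parametric 10x10x10 box
    doc.recompute()

    for i, face in enumerate(box.Shape.Faces, start=1):
        print(f"Face{i}: area = {face.Area:.1f} mm^2")

    # Edit the model upstream and recompute. A trivial resize like this keeps
    # the face order, but once an edit changes the topology (booleans, fillets,
    # added holes), the regenerated "FaceN" numbering can shuffle, and anything
    # that referenced "Face6" may silently end up pointing at a different face.
    box.Height = 25
    doc.recompute()
    for i, face in enumerate(box.Shape.Faces, start=1):
        print(f"Face{i}: area = {face.Area:.1f} mm^2")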

That said, each release continues to improve; it just has further to go than most open source projects.


There were lots of PC databases, of which dBase was one of the more popular ones.


So there was a content-layer market for the early PC-based DBMS's, a la someone marketing the early 1980s equivalent of Wikipedia or Encarta?


Yes, but as I recall it was a rather tiny subset of the database market, unless one was very forward thinking and had Internet access and/or was part of a BBS community that shared information electronically, which put you in the minority of the minority at the time. Financial market data was probably the largest one early on. Mostly, individuals and small businesses were just rolling their own solutions and entering the data themselves until the late 80's / early 90's. Larger businesses had many islands of systems and data, but it was nearly all internal or used to generate paper documents... even internally.

Roughly speaking: prior to the 80's one (often) wrote their own programs, prior to the 90's one entered their own data (i.e. for structured content like databases and spreadsheets), prior to the 00's one created their own content (i.e. documents etc.)


Beyond the technical reasons for the limit, it provides a relatively painless way to begin building out for RISC-V[1] without an uncomfortable transition. For those who just want a better next iteration of the controller, they have it. For those who build tools, want to A/B test the architectures, or just do whatever with RISC-V, they have that too. All without necessarily setting the expectation that both will continue to coexist long term.

[1] While it's possible they are envisioning dual architecture indefinitely, it's hard to imagine why this would be desirable long term, especially when one architecture can be royalty free and the other not, plus considerations like power efficiency, paying for dark silicon, etc.


The tweet thread makes it clear: AMD has consistently based future strategy on the past, which is why they were so ill-prepared for every major non-PC CPU trend this century until Ryzen (which was basically catching up, plus much better value than Intel). This also translates to their GPUs, where they seem to have absolutely no consumer GPU vision beyond 'we want some of what nVidia's getting.' Their current strategy seems to be weaker hardware + weak compute drivers + a little cheaper than nVidia = success.


I wonder if the lack of a "consumer GPU vision" is sort of a forced conclusion.

With ~15% of the market, it's going to be very difficult to pull the market in a direction you want, so you're forced to say "I can offer you what nVidia does, but cheaper."


AMD pulls the market with near 100% of consoles.

PC vs console is weird. They are different markets, but you'd think the x86-based PS5 would have more pull these days.


The PC gaming market doesn't seem to track that well with the console market, though. I was always surprised by this-- you'd think that all the optimization skills and tricks they learned to get the most out of console APUs would result in a lot of ports being optimized by default for Radeon cards.

I suspect the problem is that the PC gaming market is very halo-product steered: Intel's product credibility is still buttressed by whatever 700-watt-from-the-wall 16900WTFBBQ they can showcase for benchmarks, and Radeons winning at various price/performance tiers means nothing when they don't have a 4090 killer.

I was also surprised how effectively ray-tracing was sold to the market, considering plenty of games still don't use it, and those that do take a big performance hit for it. The RTX2xxx cards were sort of turkeys, but I suspect it now provides an excellent FOMO/FUD scenario for newer cards-- that 7900XT might not ray-trace as well as a 4080.


Ryzen wasn't "catching up". Ryzen was literally inventing the future. It was stubbornly insisting on shipping chiplets on a fabric at a time when Intel and Nvidia were both insistent on monolithic as the right choice.

The GPUs aren't playing catch-up either. They're the shared-memory APU systems widely celebrated as novel on current Apple silicon, and they've been shipping in that configuration since the PlayStation 4.

