On desktop, CPU decoding is passable, but it's still better to have a graphics card for 4K. On mobile, you definitely want to stick to codecs like H.264/AVC or HEVC that are supported by your phone's hardware decoder.
CPU chipsets have borrowed dedicated video decode blocks and SIMD instructions from GPU-land, but the idea that video decoding is now a generic CPU task isn't really true.
Now maybe every computer will come with an integrated NPU and it won't be made by Nvidia, although so far integrated GPUs haven't supplanted discrete ones.
I tend to think today's state-of-the-art models are ... not very bright, so it might be a bit premature to say "640B parameters ought to be enough for anybody" or that people won't pay more for high-end dedicated hardware.
> Now maybe every computer will come with an integrated NPU and it won't be made by Nvidia, although so far integrated GPUs haven't supplanted discrete ones.
Depends on what form factor you are looking at. The majority of computers these days are smartphones, and those are dominated by systems-on-a-chip.
I would generally agree, but in many cases 1) people don't read the comments/replies, 2) interesting responses get drowned out by low-quality ones, and 3) the criteria by which useful responses get highlighted can be skewed by a variety of factors: vote brigading, algorithmic bias, or sometimes just a bias toward the earliest comments (which get upvotes, which then get more views, which get more upvotes).
Waymo is pretty good (but not perfect) as far as safety, but there are too many ways it can get stuck, including human vandalism like "coning". And if a significant number of them are on the road, it could gum up traffic when that happens.
I still think it'll do well, because even if you need to hire one person to remotely monitor every 10 cars (I doubt Waymo has anywhere near that many support staff), it's still better than having to pay 10 drivers who may or may not actually be good at driving. But to really take over, they'll need to be much more independent.
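The 1-monitor-per-10-cars argument can be sanity-checked with back-of-the-envelope numbers. A minimal sketch, where both wages are invented purely for illustration:

```python
# Rough labor cost per car-hour under each model.
# All numbers are hypothetical; only the ratio matters.
driver_wage = 25.0      # $/hour for a human driver (assumed)
monitor_wage = 30.0     # $/hour for a remote monitor (assumed, possibly higher-paid)
cars_per_monitor = 10   # one monitor overseeing 10 cars

cost_per_car_human = driver_wage                       # one driver per car
cost_per_car_remote = monitor_wage / cars_per_monitor  # monitor cost split 10 ways

print(f"human-driven:     ${cost_per_car_human:.2f} per car-hour")
print(f"remote-monitored: ${cost_per_car_remote:.2f} per car-hour")
```

Even if the remote monitor earns more per hour than a driver, spreading that wage across 10 cars makes the per-car labor cost far lower, which is why the ratio is what matters.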
The Information claimed today that ByteDance is renting GPUs in the cloud, although ByteDance denies it (well, they call it "inaccurate" which is not exactly a strong rebuttal).
They're industrial robot arms, not humanoids, although the concept of android workers getting paid an hourly fee or "wage" (going to their masters, an android rental corporation) would be fascinating.
Even just a robot arm with appropriate sensors and a hand attachment could replace human employees in the world's oldest profession. Consider what drove the video industry if you're looking to invest.
You can be smart and still slip up, especially under pressure (and as we're learning, possibly mental and physical pain). Presumably one of the reasons why professionals are better is that, beyond working experience, they have a lot of practice.
"Engineers are more likely to be terrorists" is different from "terrorists are more likely to be engineers".
You can imagine that engineering is a useful skill for terrorism and thus terrorist organizations might spend extra effort trying to recruit engineers. They may also have a higher survival rate working on behind the scenes tasks rather than firefights and suicide missions, which could cause a survivorship bias in data collection.
(It's also interesting how many foreign leaders and dictators have engineering or science degrees, and/or went to US universities prior to becoming leading figures in their home countries.)
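The asymmetry between those two statements is just conditional probability. A toy calculation with invented counts (not real statistics) makes it concrete:

```python
# Purely illustrative numbers for a hypothetical population -- not real data.
engineers = 2_000_000       # people with engineering backgrounds
terrorists = 1_000          # terrorists in the same population
engineer_terrorists = 200   # people in both groups

# P(engineer | terrorist): of the terrorists, what fraction are engineers?
p_eng_given_terr = engineer_terrorists / terrorists

# P(terrorist | engineer): of the engineers, what fraction are terrorists?
p_terr_given_eng = engineer_terrorists / engineers

print(p_eng_given_terr)   # 0.2  -- terrorists are "likely" to be engineers
print(p_terr_given_eng)   # 0.0001 -- engineers are overwhelmingly not terrorists
```

With these numbers, 20% of terrorists are engineers while only 0.01% of engineers are terrorists: the two conditional probabilities differ by a factor of 2,000 even though they describe the same overlap.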
The guy's face has been plastered over the news for several days and there's a $60,000 reward. Getting a tip from a fast food worker is very plausible.
More plausible in my opinion than the FBI having some kind of agreement with McDonald's to access their store surveillance network in real time.
People vastly overestimate the ability of giant bureaucracies to keep secrets. It only works if a few people are in on it (that's part of what compartmentalization is for). I'm always suspicious of claims that federal agencies are colluding with companies for the purposes of mass surveillance, because while I trust those agencies to keep secrets, I absolutely do not trust the vast majority of companies to do so. There are narrow exceptions--the defense industry, telecommunications, aerospace--but mostly secrets like that are hard to keep unless your org is built around keeping secrets. The orgs I've worked for are the opposite of compartmentalized. I doubt McDonald's software engineering org is, but I'd be curious to be surprised!