The AI is the least interesting part of this announcement. Microsoft is giving ARM another try with a special branding that's supposed to guarantee some level of performance and quality. That's possibly huge news.
I still won't believe it until reviewers get hands-on. In October 2022 I wanted an efficient laptop for web development but didn't want to grab a Mac, so I bought a Lenovo ThinkPad X13s with the Snapdragon 8cx Gen 3, which Qualcomm made similar promises about, and it lived up to exactly 0% of them. The main draw, battery life, was worse than my old Blade 14 running Fedora; Windows on ARM and its current emulation continue to suck; and every so often core services would simply stop working. Linux support is only just barely getting there: Ubuntu and Armbian have custom images, but a lot of hardware still doesn't work. The camera will never work because of proprietary blobs, battery life is far worse than under Windows, there's no suspend, audio barely works, etc. And those aren't problems specific to Lenovo or that ThinkPad; they stem from the platform. I ended up buying a MacBook a few weeks ago, two years late. I should have gotten it in the first place.
It's also notable that Qualcomm is officially upstreaming kernel support for the Snapdragon Elite platform that Microsoft is pushing, so those systems may actually not suck at running Linux.
Yes, and Lenovo is releasing a new Qualcomm-powered ThinkPad, and ThinkPads are known to be Linux-friendly laptops.
> The ThinkPad T14s Gen 6 is, of course, business-focused. It will have the same Snapdragon chip, storage capacity, and webcam but will support up to 64GB of memory and one of three 14-inch display options: an IPS with up to 400 nits of brightness; an IPS touch display; or an OLED that covers 100 percent of the DCI-P3 color gamut, also with 400 nits of brightness.
>Lenovo expects the Yoga Slim 7x 14 Gen 9 to start at $1,199 and the ThinkPad T14s Gen 6 to start at $1,699. Both will be available in June.
Lenovo has been selling a Qualcomm laptop, the ThinkPad X13s, for several years already with questionable-at-best Linux support, so I wouldn't expect the new ones to be much better.
It isn't so much about the chip but that there are almost no standards for how things should work when compared to a PC. Almost every PC boots the same. Almost no ARM device boots like another.
There has only ever been 1 or 2 ARM SoCs for Windows for the last 10 years. It didn't make Linux support on them easy.
They might be "known" (recent experience may vary; mine sure did), but if you can't get support for the pre-installed Linux, you're just asking for pain.
Hmm. The first thing I do when I buy any new machine is to format the hard drive so I can install my own OS on it. I don't really trust any manufacturer to get that right.
So, if it's so hard to get Linux running that you need to have it preinstalled for you, then it's not really a good Linux machine in my view.
It's not so much "How do I install Linux?" as it is "The manufacturer supports these Linux images, but my hardware isn't working right with them, so I can call them and they're supposed to have an answer for making it actually work." Otherwise you may as well buy any random laptop and try to verify everything is supported yourself (not a horrible option, just not the premise of these kinds of laptops).
I had decent luck with Dell on this ~5 years back (though it was an n=1 interaction, so I'm not sure how representative it is). There was an issue with the dual-GPU nature of the 7730: on that model you could completely bypass the iGPU for the main screen (it wouldn't even show up as a PCIe device anymore), but that caused some sort of display desync after a few minutes on Linux, not on Windows. I loaded up the official image, reproduced the issue, opened a ticket, they sent a firmware patch, and it worked.
That and that the manufacturer has worked to ensure that at least some version of Linux works on it well, i.e. has done the systems integration work. Otherwise it can be a death of a thousand paper cuts, where things kind of mostly work, sorta, occasionally.
I usually take it as, they installed Linux on it and support it. I will use my own install after formatting a drive (or carrying one over from the previous machine) but it’s more like a seal of approval that Linux works.
And if the company are good stewards, they will upstream any drivers/kernel modules for that hardware too.
In the past a few Snapdragon 8 Gen x chip dev kits have had Linux support if I recall correctly. I'd love to have built a device out of them but they seem quite expensive unfortunately for consumers ($800-1000, often from grey market sources). It's nonetheless good to see Linux support.
Snapdragon 8cx Gen 3 laptops are only just getting there. The last time I tried to daily-drive one, maybe 6 months ago, they didn't even have hardware rendering on the GPU; everything was rendered on the CPU. Ubuntu and Armbian have custom images for the X13s (no idea about the Volterra), but it's much worse than the Windows-on-ARM experience on that laptop, and that's saying something.
I thought Microsoft requires UEFI + Secure Boot for Windows on ARM PCs, with no option to disable it in the firmware setup? Or maybe it was "only" Microsoft's Secure Boot keys, and you can actually use a Linux distro? If it's the latter, can you build and run the kernel you want or not?
Yeah, “ARM-based Linux laptop with 22h battery life” is much more interesting to me than “Windows 11 ARM-based AI PC”. If the TPU can eventually be utilized by open models under Linux that’s just a cherry on top.
Acorn tried very hard, but without a critical mass of software ported to the platform it was nearly impossible - Apple was the only other platform that survived.
Now, with open source, it’s easier to open up a place for other architectures. Android and iOS took over a lot of space traditionally reserved to Windows and Microsoft is not oblivious to that fact.
Laptop battery life estimates outside of Apple are downright criminal. I'm still bitter over the time I bought a fully loaded Sony Vaio with an advertised battery life of 11 hours; in actuality I got less than three. When I complained, they explained that they meant 11 hours of standby mode with the lid closed. I made the mistake of buying a non-Apple laptop only one more time, a fully loaded XPS that could barely manage 4 hours. I honestly have no idea how anyone can put up with non-Apple laptops. It's pure garbage out there.
2-4 hours sounds wrong. Were you mining bitcoin in the background?
"Battery life on the ThinkPad X13s will depend on your workload ... and while you might not get close to the 28 hours that Lenovo touts in its own testing, our testing with PCMark 10 got us an amazing 15 and a half hours..."
I was working on a Rails project with a YouTube video playing in the background. It's a 49 Wh battery, and the CPU can easily burst above 15 W and usually sustains around 7 W, plus the backlight and all the other components of a laptop; a sustained power draw of 15 W while doing development work isn't unusual.
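To put numbers on it, a quick sketch using the figures above (49 Wh battery, roughly 7 to 15 W sustained draw):

```python
battery_wh = 49.0  # X13s battery capacity from the comment above

def runtime_hours(draw_w: float) -> float:
    """Hours of runtime at a given sustained system draw.
    Ignores charging losses and battery wear, so these are optimistic upper bounds."""
    return battery_wh / draw_w

for draw in (7.0, 15.0):
    print(f"{draw:>4} W sustained -> {runtime_hours(draw):.1f} h")
```

At a 15 W sustained draw that's a bit over three hours, squarely in the 2-4 hour range reported above.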
Exactly. They're now aggressively competing with Apple with supposedly better performance and better battery life. That's huge. Apple was years ahead with the introduction of the M1 but it appears competition has finally caught up. All of this has to be proven in real life though, but so far all the newly announced ARM devices [0] look like impressive MacBook competitors on their own.
[0] These are the devices that seem to have been announced so far:
Microsoft: Surface Laptop 7, Surface Pro 11
Dell: XPS 13, Inspiron 14 and 14 Plus, Latitude 5455, Latitude 7455
They did some sleight of hand in the comparisons during today's release.
The Surface Laptop is physically a competitor to the actively cooled MacBook Pro (in fact it’s thicker).
Their performance and battery metrics are against the slimmer and passively cooled MacBook Air.
Their performance comparison is between the M3 and the Snapdragon X Elite when the M3 has throttled (their wording is sustained performance)
Their battery comparison is between the M3 and the Snapdragon X Plus.
That they interchange both freely is a strong tell that their devices don’t compete on all fronts like your comment suggests.
The only area where I think the snapdragon X will compete is price, because they’re claiming it’s $200 lower than a comparable M3 MacBook Air (but don’t disclose the exact comparison).
Still, a very strong showing from Qualcomm/Nuvia. However the sleight of hand leaves a bad taste in my mouth.
I'm excited about having ARM on a PC with Linux support, but I've never seen Windows as an OS optimized for battery life, whether on ARM or x86. Apple's advantage continues to be complete control of the device, from the hardware to the operating system, while Microsoft's Windows bloatware makes arbitrary use of resources.
This is not to say that Windows is not an advanced OS; the problem is that it is not laser-focused on optimization.
I guess that's true but you can't make a leap like Apple did with their transition to ARM every few years. So it's good to see the competition catch up. And I'm perfectly happy with Windows laptops trailing Apple by a close margin instead of a 4 year gap.
Unfortunately, even though I wish I could get an M4 (or equivalent) ARM SoC in a PC, it is very unlikely that these chips are going to live up to the hype. Qualcomm has a bad track record of overstating and cherry-picking benchmarks.
All of their claims so far have been both impressive, and very squishy. Microsoft's own messaging around performance and battery life for their Surface devices was also unusually squishy. And I am a huge Surface Pro fan that would love nothing more than a fanless AND fast ARM Surface Pro.
I live in San Diego and know countless Qualcomm employees; none of them really give a shit about anything other than modems. The rest of the SoC is just something to push 5G. They care about CPUs/SoCs as much as Intel did about 5G modems.
> Microsoft is giving ARM another try with a special branding that's supposed to guarantee some level of performance and quality.
I hope so. I've been a happy Windows for Arm user (via Parallels on Apple Silicon) for a year+ and it's been good. Based on that, I think drivers are going to be the biggest PITA for ARM-based PC users for the first couple years — for example, Google Drive doesn't work for that reason.
> I think drivers are going to be the biggest PITA for ARM-based PC users for the first couple years — for example, Google Drive doesn't work for that reason.
Google Drive does ship with Arm64 drivers, and patching the platform check out of the installer gets them installed just fine (40 84 f6 74 08 -> 40 84 f6 90 90).
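For anyone curious, a patch like that is easy to script. A minimal sketch using the byte sequence quoted above (40 84 f6 decodes as `test sil, sil`, and the following 74 08 is the `jz` that gets overwritten with two `nop`s; the filler bytes around the needle here are made up for illustration):

```python
def patch_first(data: bytes, needle: bytes, replacement: bytes) -> bytes:
    """Replace the first occurrence of needle with an equal-length replacement."""
    if len(needle) != len(replacement):
        raise ValueError("patch must not change the file length")
    idx = data.find(needle)
    if idx == -1:
        raise ValueError("byte pattern not found")
    return data[:idx] + replacement + data[idx + len(needle):]

needle      = bytes.fromhex("4084f67408")  # test sil, sil / jz +8
replacement = bytes.fromhex("4084f69090")  # test sil, sil / nop nop

# Stand-in for the installer bytes; in practice read the .exe with open(path, "rb").
blob = bytes.fromhex("cccc") + needle + bytes.fromhex("cccc")
patched = patch_first(blob, needle, replacement)
print(patched.hex())  # -> cccc4084f69090cccc
```

In practice you'd read the installer binary, patch, and write a copy; the exact bytes can differ between installer versions, which is why the equal-length check matters.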
I tried Windows on UTM (based on qemu, I believe) and the graphics were choppy. I attributed that to the lack of graphics acceleration. Is it also the same on Parallels?
Yep. Both Parallels and VMware have good graphics acceleration, but Parallels is better. I too have been running a Windows ARM VM for work for a year or so.
Not quite. Hidden at the very end of Microsoft's blognouncement[1] is this tidbit (emphasis mine):
> We look forward to expanding through deep partnerships with Intel and AMD, starting with Lunar Lake and Strix. We will bring new Copilot+ PC experiences at a later date.
So it's less that Microsoft is pivoting to ARM and giving it another try, and more that it's testing the waters and spreading the risk by introducing ARM into a line of laptops and tablets that will still be fundamentally x86. Arguably, the only reason ARM is first to store shelves is that Qualcomm shipped this generation before Intel and AMD did.
This isn't as significant as Apple throwing Intel out to pasture and converting to ARM wholesale, not yet anyway.
Apple never wanted to use Intel hardware; they were forced into it by Motorola/IBM et al. Whoever was selling them the PowerPC chips told them to pound sand because the Xbox and PlayStation needed way more PPC chips than Apple did. Apple made a business decision to switch to Intel, which caused a bit of a to-do in the community at the time. That Apple switched off Intel at the earliest possible chance, which took the time needed to "design" their own ARM CPU, doesn't really mean anything, in my opinion.
I wonder how many people remember all of the hardware platforms that NT 3.51 and NT 4 ran on (MIPS, Alpha, PowerPC, etc.)
IBM didn't want to design a power-efficient PowerPC chip. And later Apple switched away from Intel once Intel stopped being able to design new power-efficient chips.
Well, that's not what the story was at the time. "You don't order enough" was the reasoning.
Further, nearly contemporaneously with the debut of Apple's M chips, Intel released chips with P and E cores. Furthermore, Intel has made lots of chips with TDPs under 10 W, even 5 W, multicore ones at that. I still use them to run HA VMs for emergency-communications internet gateways.
"power-efficient" is a weird thing to claim, anyhow. What does it mean? PPC were much faster per socket than maybe even server class chips by intel, if you wanted power efficiency you could run them slower and get whatever FLOPS/J intel could give.
I am very sure i remember Apple not having a choice.
> "power-efficient" is a weird thing to claim, anyhow. What does it mean?
Cycles per Joule, or cycles per second per watt. While staying at the laptop-level available power envelope.
> PPC were much faster per socket than maybe even server class chips by intel
The absolutely top POWER chips were at the time _somewhat_ superior to Intel (and AMD), but not by much. And that superiority was achieved by raw strength, the POWER chips had a very large die with tons of additional cache, and ran at higher frequencies (i.e. more power dissipation).
However, laptop-class chips were absolutely underpowered; Intel ran circles around them. The same was true for consoles: the PS3 had a puny underpowered CPU (with multiple co-processors that were supposed to make parallel tasks easier), and the Xbox 360 was barely better than the then-current top desktop CPUs.
> The relationship between Apple and IBM has been rocky at times. Apple openly criticized IBM for chip delivery problems, though Big Blue said it fixed the issue. More recent concerns, which helped spur the Intel deal, included tension between Apple's desire for a wide variety of PowerPC processors and IBM's concerns about the profitability of a low-volume business, according to one source familiar with the partnership.
>IBM loses cachet with the end of the Apple partnership, but it can take consolation in that it's designing and manufacturing the Power family processors for future gaming consoles from Microsoft, Sony and Nintendo, said Clay Ryder, a Sageza Group analyst. "I would think in the sheer volume, all the stuff they're doing with the game consoles would be bigger. But anytime you lose a high-profile customer, that hurts in ways that are not quantifiable but that still hurt," Ryder said.
Furthermore, if you look past the "Intel is just too slow for the power envelope" claim that you stated:
>The "bad quality assurance of Skylake" was responsible for Apple finally making the decision to ditch Intel and focus on its own ARM-based processors for high-performance machines. That's the claim made by outspoken former Intel principal engineer, François Piednoël. "For me this is the inflection point," says Piednoël. "This is where the Apple guys who were always contemplating to switch, they went and looked at it and said: 'Well, we've probably got to do it.' Basically the bad quality assurance of Skylake is responsible for them to actually go away from the platform."
So both of the claims about cycles/J being the reason are just ahistorical sound bites.
>The PowerXCell 8i powered super computers also dominated all of the top 6 "greenest" systems in the Green500 list, with highest MFLOPS/Watt ratio supercomputers in the world.
That's 2008, after Apple had already completed its move to Intel.
PS3 underpowered?
>According to Folding@Home in a statement on its website; “Using the Cell processor of the PS3, we should be able to do more folding than what one could do on a PC. Also, since the PS3 has a powerful GPU, the PS3 client will offer real time visualization for the first time.
Also, if you compare like for like: the CPUs IBM put out in 2005 at 3.2 GHz had a TDP of 75 W, while Intel CPUs in 2005 at 3.2 GHz had a 130 W TDP. Both used 90 nm processes.
You're fractally wrong. I could actually keep going with every claim you made and put paragraphs in. I'm sorry.
> furthermore, if you look past the "Intel is just too slow for the power envelope" that you stated:
Well, yes. Intel was not able to deliver fast mobile CPUs on time. So Apple decided to give up and do it on their own.
> PS3 underpowered?
Yes. It was terribly underpowered. I worked with it back then, and it was slower than a 1GHz Pentium for practical tasks. The CPU in PS3 was clocked at around 3GHz, but it had full in-order execution. So any code with branches just died.
Sony's way around it was SPUs - special coprocessors, that were even more underpowered with little local memory. But there were 8 of them, and the common data bus was pretty fast.
It worked great for graphics and for something computation-heavy like Folding@Home. Kinda like modern GPUs. But it sucked for general-purpose code.
> Also if you compare like for like, the CPUs IBM put out in 2005 with 3.2ghz had a TDP of 75W. Intel CPUs in 2005 at 3.2GHZ used 130W TDP. both used 90nm processes.
Now compare their performance for actual tasks. You'd be surprised.
The Cell architecture bombed. As a result, Sony switched to a regular architecture for the PS4.
Microsoft has no courage. They have to keep catering to every possible audience, so they’re not willing to pull the plug on x86, which means ARM will always play second fiddle.
Microsoft and Windows (and by extension x86) achieved their desktop market dominance by respecting that most people want backwards compatibility.
Everything that has tried to go or is going against that tide either failed (eg: Itanium, Windows RT) or never had market share to lose in the first place (eg: MacOS, Linux in the consumer space).
Microsoft would be stupid to be "courageous" and drop backwards compatibility, that would even trump Apple's courage abandoning the headphone jack. It also makes business sense to keep your eggs in multiple baskets, assuming those baskets are each commercially viable.
> They have to keep catering to every possible audience
That's not Microsoft's job at this point. They heavily invest and push the envelope where makers don't want to take the risk, but from there it's for Lenovo, Dell, HP, Asus, etc. to decide what they want to market and which chip to push, the same way some of them put weight behind AMD while others went Intel 100% of the time.
I think it's worth mentioning that their Qualcomm-exclusive Windows laptop deal ends soon, which should allow AMD and NVIDIA to ramp up ARM Windows laptop CPUs within the next few years.
Microsoft has been consistently trying to give ARM a go since the Surface RT. Consumers are not going to bite; a marginal power saving is not meaningful.
The first iteration of Windows on ARM didn't have any x86 emulation layer, so it was doomed from the start. The second iteration did, but it initially couldn't run 64-bit apps and the performance was poor. It now has 64-bit support, and it sounds like the emulation performance has come a long way.
Here is my question, though, comparing how this works on the Mac.
Will Windows have the opposite: ARM running on x86?
I continue to wonder how Microsoft expects this to work long term. Are they expecting every developer to just maintain x86 and ARM builds perpetually, or for users to be stuck with that emulation layer forever if they're running ARM?
Microsoft won't be able to transition 100% to ARM like the Mac did. At some point all Intel Macs will be old enough to no longer get the latest version of macOS, developers will stop targeting them, and Apple can drop Intel support.
I just don't see many developers bothering with an ARM native Windows version when doing so means they have to support both or risk annoying customers later.
> I just don't see many developers bothering with an ARM native Windows version when doing so means they have to support both or risk annoying customers later.
The market dictates what developers do. If Windows on ARM is the new shiny and it hits the three key laptop parameters of no fan noise, long battery life, cool case, then people will buy it and developers will build for it.
I think the official line from Microsoft would be that most software should be using .NET anyway, and in that case the same binary should Just Work on either architecture. In reality there is still a lot of native software though, so who knows how that will play out. Games in particular will always be native.
You have to understand that Windows comes from a separate division than .NET and they have no overlap. Microsoft isn't a cohesive company. .NET comes from the developer division (DevDiv) and UWP comes from the Windows division (now Server & Cloud). The Windows folks always hated .NET and the developer division has been lukewarm about UWP.
It's actually kinda annoying once I started paying attention: many software vendors just detect "Windows" and give you an x86/x64 installer, even when the company offers an ARM64 build that would presumably be faster or more energy-efficient. I installed a bunch of stuff that was Intel binaries without even knowing I wasn't running native. But I haven't noticed any performance issues, and yeah, everything just works.
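Part of the reason vendors get this wrong is that architecture detection lies under emulation: an x86 build of the detection code itself sees an x86 machine. A hedged sketch of the mapping a download page or installer stub could apply (the function and string mapping are my own illustration, not any vendor's actual logic):

```python
def pick_build(machine: str) -> str:
    """Map a reported machine string to the preferred installer build (sketch)."""
    m = machine.lower()
    if m in ("arm64", "aarch64"):
        return "arm64"
    if m in ("amd64", "x86_64"):
        return "x64"
    return "x86"

# On a live system you'd pass something like platform.machine(). Native
# Windows-on-ARM processes report "ARM64", but an emulated x64 process sees
# "AMD64", which is exactly how vendors end up shipping the Intel build to
# ARM machines without noticing.
print(pick_build("ARM64"))  # -> arm64
print(pick_build("AMD64"))  # -> x64
```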
In 2018 that lockdown situation morphed into "S Mode" which you can turn off in the control panel. The only trick is that you can't turn it back on. It's just that the ecosystem isn't there, both in terms of developers and performant devices.
Hopefully today's announcement is a turning point for that, but at the moment Windows on ARM is about on the same tier as a pre-CarPlay infotainment system.
I think the idea is for all apps and developers to gradually transition to developing with ARM support; after all, even mobile devices run on ARM, so future apps and games will be developed with ARM in mind anyway. x86 apps will still be supported, perhaps with paid support.
But it all depends on ARM's market share at that point. You can still run DOS apps, so with an emulation layer, and ARM's increasing performance, old apps will be able to run on ARM one way or another, for those who need them.
Unlike Mac, Microsoft just can't drop past generations and call it a day.
> But it all depends on the market share of ARM at one point.
Right, that's kinda my point: unless I have missed it, I have yet to see any real talk about ARM on custom-built machines, and I doubt gamers are going to give those up anytime soon.
Apple was able to force the transition to happen. I highly doubt Microsoft is going to risk actually dropping x86 from Windows on any reasonable timescale, so there would also have to be an ARM-to-x86 story.
Unlike when Apple announced that all of Mac was transitioning, there isn't a reason for a developer to think that anytime soon they can drop x86, so why complicate what they have now by adding ARM?
> Right thats kinda my point, unless I have missed it I have yet to see any real talk about ARM on custom built machines and I doubt gamers are going to give that up anytime soon.
A lot of gaming these days runs on mobile phones and portable PCs, and laptops will very likely leverage ARM sooner or later. Add some eGPU with an Nvidia card and you get a monster.
Intel is in deep trouble.
>Unlike when Apple announced that all of Mac was transitioning, there isn't a reason for a developer to think that anytime soon they can drop x86, so why complicate what they have now by adding ARM?
ARM is the future, given the desire for long battery life and increased performance. Microsoft already has an x86 emulation layer, and app support is much better now than it was before (in the RT era there wasn't even an emulator).
Devs are developing apps across all devices, and ARM-based Macs already require you to ship ARM-compatible apps.
>I have yet to see any real talk about ARM on custom built machines and I doubt gamers are going to give that up anytime soon.
The vast majority of gamers game on smartphones and tablets with ARM processors.
Some of the biggest gaming hits recently have also been cross-architecture and cross-platform, namely Genshin Impact and Honkai: Star Rail. Native ARM and x86 releases, runs on Windows, Android, and iOS. There are also gaming hits like Fate/Grand Order that don't have an x86/Windows release at all due to not even considering desktops/laptops.
> The vast majority of gamers game on smartphones and tablets with ARM processors.
Those are clearly not the gamers I am talking about. The gamers I am referring to are not switching to playing on mobile phones; if they are switching to handheld devices, they are going with x86 devices like the Steam Deck.
There is a massive market of games that do not support those platforms, one whose surface is only just now being scratched with titles like Death Stranding releasing on iPhone and Mac.
Except for Nintendo, the two main AAA consoles are x86-based, and I have seen no rumors of that changing.
So great, there are large mobile games, but let's not pretend there isn't a huge market where that future hasn't arrived yet and which shows very little sign of actually changing anytime soon.
https://steamcharts.com/ is what I am talking about. Unless I am mistaken, the only one of those in the top list that actually runs on mobile is PUBG.
> There are also gaming hits like Fate/Grand Order that don't have an x86/Windows release at all due to not even considering desktops/laptops.
That is nothing new; Pokémon GO came out in 2016. It isn't a sign that gaming is changing but that gaming is expanding to include new types of players. The "hardcore" AAA gaming market still very much exists, and it is firmly on x86 right now.
Porting a game from x86 Windows to ARM Windows may take some effort, but for most games, nowhere near as much as porting to a different operating system. There just isn’t that much assembly code or even SIMD intrinsic use in your average game. And thanks to Microsoft’s Arm64EC ABI, the conversion from x86 to ARM can be done piecemeal. If, say, the game depends on some proprietary third-party library that isn’t willing to offer an ARM version, that library can be run in emulation while the rest of the game is compiled natively for ARM.
The AAA game world is very conservative, so I can’t guarantee that PC game developers will port their codebases to ARM. It really depends on the size of the audience and how well the x86 emulator works as a substitute. Even if ARM takes over on Windows laptops, I’m not sure laptops are enough, when laptop users are already accustomed to not being able to run AAA games well.
But if the audience gets large enough, it’s hard to believe that developers won’t try recompiling. It’s just not the same level of effort as a port to Mac or Linux.
> The AAA game world is very conservative, so I can’t guarantee that PC game developers will port their codebases to ARM.
Unreal, Unity, CryEngine and Godot all support ARM, so - testing and third-party binary libraries aside - there shouldn't be any reason to not have an ARM port.
> Which unless I am mistaken the only one of those in the top list that actually runs on mobile is PUBG.
Even in that case it's "kind of but not really". PUBG Mobile is a distinct game from regular PUBG, they have similar core gameplay but they are developed independently of each other.
Fortnite is the outlier there, being the exact same game across every platform. COD Mobile and Apex Mobile are/were also officially sanctioned clones of the original game, similar to PUBG Mobile.
>Those are clearly not the gamers I am talking about.
You specified gamers, you should have explicitly specified PC gamers if they are who you referred to.
Note that PC gamers, as much as they deny it, are a minority of all gamers as a whole. The vast majority of gamers play on mobile or consoles, and of those, mobile far outnumbers consoles too.
Consoles can also switch processor architectures with the changing forces of the wind, they don't have to support backwards compatibility unlike x86 and Windows. If Windows ends up becoming more ARM dominant than x86, consoles will likely follow suit to make subsequent Windows ports (and then also mobile ports?) easier.
Going on a tangent, I find it very annoying that PC gamers despite being the minority somehow want to claim gamers aren't gamers. PC Master Race is a meme, not reality.
>Which unless I am mistaken the only one of those in the top list that actually runs on mobile is PUBG.
Stardew Valley at #10 also has mobile ports.[1][2]
>But the "hardcore" AAA gaming market still very much exists, and is firmly on x86 right now.
The games I cited are AAA games, FSVO AAA; they are developed and/or published by big, established studios and/or publishers. Frankly, I find the AAA moniker worthless these days, but I digress.
That didn't sound correct to me, and I found an article [0] that says the numbers are pretty similar.
I play some games on PC+PS5 and some on mobile, so I'd probably count as both a mobile and "legacy" gamer, but if I had to choose one gaming market to immediately disappear from the face of the earth, it would be mobile gaming for me, absolutely.
The graph is misleading because they group PC and consoles together against mobile. That implies mobile would slaughter both of those segments individually.
It's also missing some very important markets like Japan and South Korea, presumably included in "48 markets multi-market average" but not explicitly shown individually. Makes one wonder, eh? :V
I'm a Windows ARM user (Surface Pro X). For me the benefits (fanless, battery not running down randomly in your backpack, phone charger compatibility, integrated LTE, 16G RAM in that envelope), are worthwhile.
No one cares for power saving. Turn it into higher performance at same power usage and people will bite. Of course it has to actually be a real upgrade like the Apple Silicon chips were.
An interview between Satya and the WSJ indicates it is actively cooled. Joanna asks him whether you'll be able to hear the fans, and he says they're quiet rather than absent.
There's the AI code-assistant thing that GitHub actually started; there's the horrible chatbot-maker GUI demoware; there's AI stuff you might be able to do with your SharePoint (if only you could get hold of the right MS sales rep to take your money); there's an app that does genAI things on your personal MS account... And now there's a Surface rebrand?
That org chart meme about Microsoft being little fiefdoms pointing guns at each other never stops being relevant.
See the hellish naming and branding they've done with the .NET ecosystem. It is so convoluted that even people actively developing on their stacks would have an issue figuring out if they are downloading the right stuff.
Microsoft's seemingly random naming conventions make me want to actively avoid using their products, because I know the web searches will be heavily polluted.
I thought big companies like this would have some sort of internal committee that decides if a new products branding makes it easier for customers to understand what the product is and where it fits in their offerings.
Having had to come up with a name for a corporate product and deal with things like trademarking, I can understand why they seem to repurpose names frequently. It is enormously time consuming to generate, vet, and apply for a trademark. There are so many other products and names and trademarks out there that it’s no wonder drug companies end up with unpronounceable gibberish for new product names.
The same with Visual Studio. It is referred to by a year, eg Visual Studio 2019, but then you need to look in the make files for the actual version number of the project, and then look on Wikipedia for a table matching versions to years.
It was very confusing, and for a while there every team inside Microsoft started adding .NET to their name for some internal visibility points regardless of any connection with the common language runtime.
That's how you ended up with names like Windows .NET Server 2003.
ASP.NET Core is not a web framework for .NET; it's at least four different web frameworks you can choose between. It's more like the overarching branding for anything web-related in the dotnet ecosystem.
All these, effectively, plug into ASP.NET Core, on top of WebApplicationBuilder and WebApplication. They are then, usually, hosted with Kestrel (web server) and operate with the same set of abstractions. Razor Pages and Blazor are distinct names and I have never seen anyone confuse them with the ASP.NET Core itself.
You see, this is where it gets confusing. ASP refers to a specific technology, active server pages, but it's also the overarching term used for anything to do with dotnet and the web. So you get this: https://learn.microsoft.com/en-us/aspnet/core/tutorials/choo... - you can have an entirely frontend SPA, in at least two ways, and still be called "ASP.NET"
I have an application which only serves over gRPC. It has to pull in "Asp.Net" NuGet packages, because that's the branding under which the Kestrel HTTP/2 server lives.
Yes, it's still called "ASP.NET Core", even though ".NET Core" was renamed ".NET" from version 5 -- not to be confused with ".NET Framework 5", which was renamed ".NET Core 1.0" before launch.
Don't forget ".NET Standard" that could be used from both .NET Core and .NET Framework, until version 2.1.
You see, that just doesn't lend itself as nicely to tribal parroting of worse than mediocre developers here in the comments (if such people are able to code at all).
I mean you literally work on Dotnet, don't you? That's probably why you don't have issues with the naming. I agree that it's currently fine now that it stabilized on the core naming scheme, but I don't see how it was very confusing when there's stuff like ASP.NET Core on framework, and ASP.NET Core on Dotnet core... that's just confusing, especially since other ecosystems don't usually have such a weird naming scheme.
It’s only a problem in the eyes of HN because it is a low-effort complaint from people who never used .NET and simply repeat what they read elsewhere. It’s just a popular thing to do, to make negative comments like this.
Otherwise, this problem is completely made up in terms of anything that has happened in the last three-four years.
And no, I don’t work on .NET save for a few simple contributions.
So the asp.net thing I highlighted isn't true? I'm sure the issue isn't that bad, but it sure is weird to claim that the naming wasn't horrible. Like, surely you could agree that the naming was much worse than it should've been?
It seems they identify so closely with dotnet that any perceived criticism is taken as a personal slight. It's the only thing that explains such a rabid response to a reasonable observation.
I don't identify with something that is just a tool (although one of the best ones). What does piss me off however is when people perpetuate false facts, straight up lie about arbitrary matters, are incapable of changing their mind when facts change and when disagreed with, resort to personal attacks.
This can be seen through other issues in the industry but is particularly felt in bad teams - social cohesion resides on a set of commonly agreed upon beliefs within a group and the worse the team is the more such beliefs are at odds with reality, and all I've been seeing in the past year is HN slipping more and more into this when it comes to programming.
Which languages do you program in? (asp.net core name is fine, because - who cares? it's not like js does any better, it's something you don't think about twice and is irrelevant to the experience)
Well, no one said that it was a huge deal. The point was that msft is horrible at naming things. Do you have any example of something like this happening in any other ecosystem?
"
In summary:
ASP.NET MVC 5:
ASP.NET MVC 5 was a short-lived successor
to ASP.NET MVC 4.
It was released alongside ASP.NET Web API 2 in 2014.
It actually ran on top of ASP.NET 4 (i.e. .NET 4.x version of System.Web.dll). Note that the entire
ASP.NET MVC library is now obsolete.
ASP.NET 5 was EOL'd and rebranded as ASP.NET Core and it includes the functionality of "ASP.NET MVC 5" built-in.
ASP.NET Core 1 and ASP.NET Core 2 can run on either .NET Core (cross-platform) or .NET Framework (Windows) because it targets .NET Standard.
ASP.NET Core 3 now only runs on .NET Core 3.0.
ASP.NET Core 4 does not exist and never has.
ASP.NET Core 5 exists (as of August 2020) however its official name seems to be "ASP.NET Core for .NET 5" and it only runs on .NET 5."
Again, not a big deal in retrospect now that it has stabilized. But it was a huge deal. Because you couldn't easily figure out if you needed to use Asp.net MVC, or if that version is now deprecated, and if the core you're using means dotnetcore or aspnet core on framework... again, it's the type of stuff that matters when it happens and leaves a mark afterwards.
Whereas I do use dotnet daily, have done so for years, and I like the C# language and a lot of the ecosystem, certainly over Java .. and I still hate the naming scheme. I know the difference between all the confusingly named things, it's just that Microsoft branding insists on flattening them all together.
The internal codenames were better. If I say "Roslyn" it's a lot clearer what I'm referring to.
Do you expect anything else from the company that introduced "Plays For Sure" branding over a wide ecosystem, only to kill the ENTIRE thing (not just the branding) less than 5 years later? (as in, all purchased content became unplayable)
Still one of the funniest pieces of corporate stupidity ever.
And not just because they killed it. Even when they introduced it, the "plays for sure" brand meant music that couldn't be played on the ipod or most mobile phones. That it quickly came to mean music that couldn't be played anywhere was just the icing on the cake.
I was peripherally involved around that time. Trying to go through in my head what caused a lot of that, it feels like meetings were MBA heavy and light on engineering and design; whereas other startups have had more balanced meetings where the grunts could push back against the totally out-of-touch ideas of the pointy-hairs.
It's kind of like those cryptographic keys they use a dictionary for: a nonsense noun phrase representing a number. Perhaps they're just encoding a SKU?
Windows 95 email client "Exchange" / email server platform "Exchange"
"Outlook" / "Outlook Web Access" / "Outlook Web App" / "Outlook.com" / "new Outlook for Windows"
"Microsoft Teams" / "New Microsoft Teams"
"Office Communicator" / "Microsoft Lync" / "Skype for Business" / "Skype" / "Skype for Business Online" / "Skype for Business for Microsoft 365"
The most guffaw-inducing branding, to me, was the recently-announced remote desktop client called "Windows App". That's going to be an easy one for users to search for.
(For guffaw-inducing I suppose there's also the Windows 98-era "Critical Update Notification Tool"[0])
> Universal Windows Platform (UWP) apps (formerly Windows Store apps, Metro-style apps and Modern apps)...
Ironically, that list misses another former name, "Windows App" (different from the "Windows App" you guffawed at). That name was used around 2017 and used extensively in the 7th edition of Windows Internals.
To add to this, I have always found the Xbox naming conventions to be confusing, personally. "Xbox One" is the third one, not the original "Xbox" and the two newest models are named almost identically; "Xbox Series X" vs "Xbox Series S".
To make it even more confusing, the Xbox One had the mid-generation updates called Xbox One S (slimmer, a few additional features) and the Xbox One X (more powerful.)
So from oldest to newest it's
- Xbox
- Xbox 360
- Xbox One
- Xbox One S
- Xbox One X
- Xbox Series X and Series S (released simultaneously: S is smaller, X is more powerful)
So for a period of time in stores you might see a One S, a One X, a Series S, and a Series X. If you aren't a gamer, it's a complete mystery which is the newest and most powerful. I'm sure some kids got the wrong console for Christmas, as the One X was at times more expensive than a Series S, despite being an older console that would later not support many games that the Series S supports. This would be even more likely to happen if the Series X was out of stock (so the most expensive Xbox console at the store might be a discontinued model that won't support all the new games.)
In contrast, it's pretty obvious that a PlayStation 5 is going to be better than a PlayStation 4. Yes, a quick search will show which is the newest and most powerful Xbox, but if people have to do research to find out which is your best console and they don't have to do that for your competitor, then you have a confusing naming scheme.
I owned an Xbox One something. I believed "Series X" was shorthand for "Xbox One X", as I believed there were maybe other kinds of "Xbox X". I even bought a game that didn't run on my console because it was for a "Series" something, which was not actually what I had. "Series" is often used as an English word to identify a product line. Like "Is that the 'premium' series?"
Sometimes I joke about how confusing the xbox names are. I probably couldn't come up with a more confusing set of names if I tried.
Rumour has it (not sure if this was ever confirmed) that one of the big reasons the second Xbox was called the Xbox 360 was to avoid unfavourable number comparisons with Sony. The Xbox launched vs the PS2, which meant the "Xbox 2" would compete against the PS3. As 3 is bigger than 2, it would make the second Xbox look bad. Hence, Xbox 360. Both have a 3, no number issues. For what it's worth, Robbie Bach (former Chief Xbox Officer) is on the record as saying one of the potential names for the second Xbox was just "Xbox 3" to catch up the PS3.
While officially the meaning of the "Xbox One" name was something about it being an all-in-one entertainment system, I would put money on it being chosen as some kind of subliminal naming scheme as it sounds like "Xbox Won".
Steve Ballmer was hoping people would call it "the one". This was also around the time that SkyDrive had to be renamed to OneDrive due to trademark issues with Sky.
I always judge corporations whenever they resort to "One" as a brand because it signals a total lack of creativity and is likely the result of executives fighting each other and settling on the most mundane and inoffensive concept to represent "it does everything".
I always thought calling it Xbox One was the most bizarre choice in the history of branding and marketing. Given how common it is to retroactively refer to the first item in a series as "One" (Rambo 1, Rocky 1, Playstation 1, etc), it seems intentionally designed to cause confusion.
This is beyond being bizarre. I have never owned an Xbox, and always thought that Xbox One was a re-release of the original Xbox, similar to the Original PlayStation -> PS One. I am hearing it for the first time here that it was a third generation device.
I find that name even more baffling when the reason they apparently branded the previous one Xbox 360 was so that they wouldn't go against the PS3 with an Xbox 2. Somehow it was now fine for an Xbox One to go against a PS4.
Sure, but the problem is S and X sound very similar when spoken, causing more confusion. Try clarifying which one you are talking about in a loud room at a conference.
Ugh... and don't even get me started about the pronunciation of "Azure" (or the fact that, somehow, they took a project code-named "Red Dog" and named it after the color blue. Then there's the Jet Red and Jet Blue database engines, one of which was used by Active Directory...)
You're forgetting Azure Active Directory Domain Services, which is presumably now named Entra AD Domain Services which is different from Azure AD/Entra AD because it's a managed domain controller in Azure...
They're actually still separate products. They don't want to sell office 365 but it still exists.
M365 = office 365 plus windows as a service licensing. If you buy your licenses as lifetime with your laptops it is much cheaper to simply subscribe to O365. Thus Microsoft is gating more and more things behind M365 to get companies to pay for the expensive windows subscription.
It’s called “Intune” / sorry, “Microsoft Endpoint Manager” is a way better name / just kidding, it’s “Intune” again! We had you there for a second though!
Edit: you updated defender, but you missed the depth of the rabbit hole. There's defender for office 365, there's defender for IoT, for Containers, for cloud, for cloud apps, for identity. There's one for gramma too
dotnet for me is the most obvious example of how terrible they are at branding. First there was .NET Framework, which was Windows-only (note I'm ignoring Xamarin because it was originally not owned by MS).
Then they decided to do a reboot with cross platform support and named that Dotnet Core. This was honestly fine. But then we reach late in the 3.x timeframe and they declare for real and for true that Framework is a dead end, and Dotnet Core will be the one true Dotnet moving forward. And to indicate this, the next version will remove Core from the name, skip 4 because it would be too confusing with Framework, and just call it Dotnet 5.
I wish they'd stuck with the Core name, if for no other reason than so that if they decide in another 15 years to do a major rebuild, they can come up with another new descriptor, the way Core described the transition away from Framework and toward real cross-platform support from MS itself.
It was a joke about the nature of these names and the nature of that particular problem (and nothing about engineers, really) -- but I guess the downvoters didn't get it.
The thing about Copilot is that outside of the developer niche, nobody knows the term. The vast majority of Microsoft's customer base will recognise neither the Xbox feature nor the Github product.
In the same way, their .NET naming has never bothered anyone they actually care about selling stuff to. It's a tad annoying for developers, but nothing more than that.
I find the way they renamed their Office products every five years much more baffling. Consumers probably don't care beyond "office" but I'd expect them to protect their business clients from their ever changing names for office products at least.
It's not a terrible name either. Assistant would be better, but they can't use that name[1] of course... I'm having a hard time thinking of something else that instantly communicates this idea of an intelligent subordinate aide that doesn't have negative associations.
Personally, I think Microsoft wastes the name "Cortana" on their mediocre Google Assistant competitor. I haven't really seen Copilot do much copiloting, it mostly seems to answer questions and follow instructions. Maybe Windows 12 will be different, but I kind of doubt it.
A Microsoft employee did once joke to me that if Microsoft had invented the holy grail it would be called the Microsoft life preserver 3.4 Pro+ or something like that
I don't find it all that confusing, most of them work similarly and I don't see how you could call them different things. I just hate that their logos aren't uniform. The Windows & Bing logo is a rainbow color (looks awful), and the Edge version is blue and green (looks way better). It's not remotely comparable to their other branding flubs (Teams, .NET, etc). I wish they would have kept the Cortana and Continuum brands. Recall is basically what Continuum should have been, why not just keep the name? Co-pilot works for naming in some cases, but when you look at the GPT 4o voice demo that seems like the Microsoft white labelled version should be Cortana.
Cortana's biggest fault was mostly that it wasn't very good, and the things it was good at required the Cloud...but with the new AI chips, some of that can be offloaded and work much faster. It's like when they added Cortana to Xbox and killed the other voice commands. Then it just became a very slow process when the old on-board model was way faster. Even the voice commands became longer "Xbox on" to "Hey Cortana, turn on my Xbox" then having to wait for it to ping a server and come back to your device.
> Their "copilot" brand is so weird and... muddled.
It’s Watson.
Ha ha, only serious. You’re right. It feels like an umbrella brand they’re just tossing around, because AI — and Copilot in particular — is hot in Redmond.
I find the new Copilot key funny, because it feels like a pantomime of the Windows 95 keys[0], but with Logitech characteristics.
[0] Okay, it’s been 30 years. I haven’t used a Windows computer in almost as long, and so I ask. Do people who use Windows actually use any of those keys? It always seemed weird that you’d need the start menu at a single button press, and the right-click menu at a keyboard press felt even weirder. I think I only used the Windows key as a meta under Linux, and I don’t think I ever hit the context menu key out of anything but curiosity.
I use the "Windows" key pretty extensively, including to open the start menu (and then type in a search term, i.e. a program on my computer to launch).
I also use it extensively for "Windows" (operating system) level shortcuts: Win-R to open a run dialog, Win-E to open Explorer, Win-<left arrow|right arrow> to move/resize windows, etc.)
That being said...I use it in basically the same way on Linux, and use the Command (Apple) key on Macs for essentially the same purposes.
I don't think I've ever used the "right click menu" key for anything, though. Most modern Windows keyboards don't include it, or have it hidden behind a manufacturer-specific function key.
Huh. You’re right. Some keyboards have both the Windows and the menu key, and others have only one menu key. I don’t know if this means Microsoft relaxed their “Made for Windows” standards, or higher-profile manufacturers don’t care.
I use it a lot. Win and start typing to launch just about any app or open any document is really handy. Win and a number key launches or switches to that app pinned at that position on the taskbar. Win+L locks the screen whenever I get up from my desk. Win+Shift+S starts the screen clipper. Win+Left/Right snaps an app from one side to the other, win+shift+left/right switches between desktops, Win+Tab lets me drag apps from one desktop to another and see what's open where if needed, Win+E opens a new explorer window, Win+. opens the emoji keyboard. Those are just the ones I use almost every day, I probably use a few others a lot as well.
The windows key is pretty handy. Lots of good shortcuts, and they add new useful ones often. I don't tend to hit it by itself much anymore, because the start menu is so terrible and inconsistent, and anyway, we only run three programs anymore.
I don't think I've used the menu key... If I want to right click, there's the mouse, or mousekeys... But maybe I just missed out on learning to use it. Mostly everything in the context menu is in other places too that you might get to with the keyboard.
> you really can't convince me the above is the result of a coherent company-wide strategy
Well, that was an ignorant thing to say given how famously Microsoft implemented its stack-ranking system and the toxic culture it produced (as intended). This is exactly the result of that strategy. People who thought it had gone away are mostly parroting Microsoft propaganda. I live here in town with these people. You don't get hired without having "Microsoft morals". It's all a desperate gold rush to find out who is going to get promoted.
Yes, and additionally the Copilot 360 user interface is a mess, processing time is slow, and the quality of results is poor. Using the ChatGPT or Claude interface produces much quicker/better results.
Copilot is also the name of a windows (and presumably xbox console) feature that allows you to combine multiple controllers and have them show up as one device.
That name was cursed from the start, too. The original "Microsoft Surface 1.0" was a tabletop platform that got renamed "PixelSense"[0] years before there was a "Surface" tablet computer.
I recall it was renamed to PixelSense right when the Surface tablets arrived.
Then they named an entire input device line "Surface" as well. When you search for "surface keyboard" you will get results for desktop keyboards and type-covers for tablets.
I agree, and it's strange to see 'Copilot' everywhere. By the way, does anyone know how the development/update of GitHub Copilot is going? I tried to look for a blog from the engineering team or something similar, but I can't find anything.
Microsoft is a "student body left groupthink" company. We once named EVERYTHING "Active" something, then EVERYTHING ".NET" something. This is just the latest in a long line of tradition of groupthink.
Copilot is a game-changer in coding, helping developers with real-time suggestions and code snippets. However, it's controversial because it relies on large datasets scraped from public code repositories. This raises intellectual property issues and concerns about algorithmic bias, as Copilot's suggestions are influenced by that data. Despite that, it's super useful, but Microsoft needs to address the legal and ethical issues around data usage and bias to keep it on the right track.
If the following does not happen in 2 years I will eat my hat:
They will have performance problems. They will have compatibility problems. They will have poor repairability and zero repair network and support. The software will be abandoned and completely useless within 18 months. They will fail very early but just outside a standard 1 year warranty. This will be a lot of e-waste. Regulatory or national bodies will step in and force privacy regulations which make all of this unworkable.
They've been selling ARM Surface devices for a while now. Not sure why these would suddenly fall over and stop working.
Even if the AI stuff doesn't pan out that doesn't make it e-waste, that just makes it a normal PC that could do everything previous PCs could do anyway.
If in 18 months anyone feels that their ARM laptop isn't cutting it, i'll gladly pay $0.15/$1.00 for them. I know lots of people who could definitely benefit from a laptop that cannot afford them, and IME ARM runs linux just fine.
You're lucky to get that far. I got a tablet with 32-bit Windows 10, which doesn't get updates any more. When I tap the search box, the on-screen keyboard appears ... and then immediately closes again.
I think Valve disables Baloo (understandable, since you don't want file indexing while gaming). That's probably why the default KDE search performs poorly.
You should be able to use KFind which will work kind of like the Windows XP search.
This Apple-genesis (Nuvia) and Microsoft-led (Pluton) Arm Oryon hardware provides rare boot standardization and optional upstream Linux on Arm EL2. With enterprise PC OEMs on board, there should be a healthy supply of used Arm Linux laptops in a few years. https://news.ycombinator.com/item?id=40350408#40355554
The system firmware will ship with 3 boot modes selectable via the setup interface:
- Windows (this one has the Windows tcblauncher escalated to EL2 through Secure Launch)
- Linux (this one stays at EL1)
- Linux w/ KVM (which jumps to EL2 [at ExitBootServices] before kernel handover)
Mainline Linux support is underway via Linaro/Qualcomm and Dell supports Ubuntu Linux as a first class OS. Linux support won't be perfect in OryonV1, but if enough customers use these devices with Linux, it can only improve. Device trees are likely still needed for Linux.
Arm SystemReady SR/ES assumes sane ACPI tables. That's something that Snapdragon X (1st-gen) very much doesn't have. The ACPI tables present there are pretty much only usable for Windows if you want full functionality.. ESXi-Arm is bootable, but a number of patches were required.
> the upstream kernel was used during the Snapdragon X1 Elite SoC Linux bringup.. demo booting upstream kernel with a Debian/Ubuntu userspace on a Snapdragon X1 Elite QRD (Qualcomm Reference Device).. Boot to console support has already landed on kernel version v6.7 and is on track to have remaining kernel support land by the time the first commercial device with X1 Elite SoC comes out on the market.
> We set out to solve one of the most frustrating problems we encounter daily – finding something we know we have seen before on our PC. Today, we must remember what file folder it was stored in, what website it was on, or scroll through hundreds of emails trying to find it.
It is indeed frustrating that one still cannot effectively search a local device, but it doesn't need AI to solve. It needs a proper search engine, and Microsoft has resisted that, for some mysterious reason, for 30+ years.
What, there are still people who don't install Voidtools' Everything on all their Windows PCs and assign it a global shortcut? Since I got it I don't even bother organizing stuff.
It's amazing. It does what Google Desktop tried and failed to do on Windows XP. And it's blazingly fast by reading NTFS directly instead of using the Windows API. (OK, back then needing 1GB of RAM was pretty much impossible.)
Does Everything search file contents? I've only used Everything for searching filenames (and it is fantastic for that). Google Desktop's sweet spot was searching within files.
The stable version cannot. 1.5 Alpha can do it. It can also index properties (as in resolution, author, song/movie length, etc.)
So far I would recommend that one, it has tons of new features (including a change journal that shows renames, moves and deletes in real time) and I didn't experience a single bug yet. And it is even faster.
Does anyone know of a similar tool for macOS? I'd gotten really comfortable with a Ctrl+Alt+S binding on Windows for Everything and miss it these days on macOS.
If you had it index everything you want to find it would do just that.
And it would be excruciatingly slow, unreliable, and waste gigabytes of storage, all while consuming tons of CPU time in the background.
The idea of an indexing service is good though. KDE's Baloo faces similar issues. It's not that easy to make it a good experience.
Everything (mentioned in this thread) comes closest to being fast, reliable, and usable. It can even index external disks and search them while offline.
It always baffles me how long it takes for Windows to search for a file while the GNU's `find` command churns through filenames and paths like it's nothing.
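The speed gap comes down to indexing versus a live walk: tools like Everything (or `locate` on Linux) answer queries from a prebuilt index, while a fresh `find` has to touch every directory entry each time. A minimal, self-contained sketch of the two approaches (all paths and file names here are illustrative):

```shell
# Demo setup: a throwaway folder with one file.
mkdir -p /tmp/index_demo
touch /tmp/index_demo/My_Invoice_2024.pdf

# Live walk: find re-reads the directory tree on every query.
find /tmp/index_demo -iname '*invoice*'

# Index approach (what Everything/locate do, greatly simplified):
# walk once to build an index, then every query is just a text scan.
find /tmp/index_demo -type f > /tmp/file_index.txt
grep -i 'invoice' /tmp/file_index.txt
```

Real indexers go further (they watch the NTFS change journal or use inotify to keep the index fresh), but the basic trade is the same: pay the walk cost once, then answer queries from the index.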
Windows Everything finds any file globally in the blink of an eye, lets you sort in real time by size/date, live-edit your query, and also perform operations on the results in the list. It's so much better than any find.
It's highly optimized, but it still needs quite a bit of RAM when the index is large. But RAM is cheap nowadays.
It's the only thing I really, really miss when I am on Linux. (there is FSearch, but it's not quite as great, yet)
Losing Google Wave, then Google Desktop search and then Google Reader like shot ducks in a row truly signified the death-knell of "Don't Be Evil" days.
It's really baffling, isn't it? A part of me wonders if Microsoft is simply unable to figure out how to make a good Windows search and so is looking for AI to do it for them.
But it seems like shooting a mosquito with an elephant gun.
Microsoft has tried to make a "better" search before, by redesigning the file system and the metadata that can be tied to files.
The idea was that you could search this:
> the phone numbers of all persons who live in Acapulco and each have more than 100 appearances in my photo collection and with whom I have had e-mail within last month
They had hyped this up as coming in Project Longhorn (which was eventually split into Vista and salvaged in Windows 7), but the new filesystem was eventually dropped like their other attempts:
They should implement simple search first, so I don't need to reach for grep for everything. Then they should implement pdftotext and fuzzy search. Of course, everything should work instantly for small folders. That would be enough for 99% of people.
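The "simple search first" pipeline the comment describes already exists as standard Unix tools; a hedged, self-contained sketch (folder and file names are illustrative, `pdftotext` comes from poppler-utils):

```shell
# Demo setup: a throwaway folder with one text file.
mkdir -p /tmp/search_demo
printf 'quarterly summary: revenue up\n' > /tmp/search_demo/notes.txt

# Filename search, case-insensitive:
find /tmp/search_demo -iname '*notes*'

# Content search across plain-text files (recursive, list matches):
grep -ril 'quarterly summary' /tmp/search_demo

# For PDFs, pdftotext extracts text to stdout for grepping, e.g.:
#   pdftotext report.pdf - | grep -i 'quarterly summary'
```

None of this is instant on large trees without an index, which is exactly the piece a proper OS-level search engine would add on top.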
A million times this. Windows search is, as near as I can tell, completely worthless and I have to install other utilities to have working basic search.
I don't need or want anything fancy like semantic searching. I just want to be able to grep and find things by filename.
The server version called "Windows Search Services" was rather alright. It was even integrated into the Win7 Libraries and you could search through contents server-side from Explorer.
It supported plugins and was reasonably fast. Needed a lot of RAM though and some admin to babysit it.
It is less baffling when you consider decisions are not being driven by engineers. Investors don't want a search engine. Investors (and other non-engineers) want "AI"
I don’t understand why you say it doesn’t need AI. It’s like saying a financial application doesn’t need a database. Maybe it can use something else, but who cares what technology is used as long as it’s good? I’m willing to believe that AI is making this tech much more effective
There is locate32, which also builds an index, but then is blazingly fast. Windows does build an index too, but it doesn't feel like it uses it in any way.
Billions of dollars to cram "AI" down our throats, but they can't be bothered to get Excel back up to the level of quality and performance it had 20 years ago.
One thing that has continued to baffle me about Excel is the fact that they force worksheet tabs and the scrollbar to co-exist in the same pane. Why in the world can I not have a full horizontal scrollbar and a full width tab collection?
Imagine the crass software quality of an AI startup on top of 30+ years of legacy code. That's Copilot+PC for you, boys. Enjoy it to the fullest when your company IT sends it your way.
I assume (without any evidence) that excel is basically mothballed / offshored other than the occasional LET and LAMBDA type of deal every now and again
Being the designated "tech" person for my extended family and friends circle, I don't think I could recommend this to any of them because of the privacy nightmare.
At least with Apple you have a single vendor who is vertically integrated and makes a huge song and dance about data privacy. Even if you discount their PR and marketing spin, IMHO you are still miles ahead of the likes of Microsoft + (pick one) HP, Asus, Lenovo and the rest of them.
There is no way I would trust any of them not to take advantage of the data gold mine.
I was speaking to someone at the weekend about this. She said "but why would I need this"?
I suspect most people aren't putting AI into any purchasing decisions. Most people really actually don't give a shit about it. They just want things to work exactly how they did before without people moving stuff around because they just want to get stuff done.
As it has been said so many times, tech privacy aware people are the minority. It won't make any difference for my non-tech neighbor (in fact, he will probably be delighted if it helps him with something).
Unless there is a specific, believable, near term risk people will just ignore it.
Most would submit genetic material to 23andme and similar organizations with no restriction on its use. Yes, if could theoretically backfire not just on them, but also on their kids. But unless they see it as a near-term likelihood they will not care enough. My 2c.
They're trying to have it both ways and it's not clear to me as a consumer what is local and what is cloud. (As a developer, I can tell they're doing a few things locally, like OCR and webcam background blur on the NPU, but they are not running ChatGPT on a laptop anytime soon.)
Although the line can get fuzzy when they want to ship a feature that's too big to run locally. Android has run into that: some of the AI features run locally, some of them run on Google's servers, and some of them might run locally or on Google's servers depending on which device you happen to have.
The whole point is making consumers pay the cost of running LLMs (both in hardware and power), not protecting your privacy; they will still get your data to train better models.
The above was about people who care about privacy, though. It's not that surprising, but most people don't care about it to the degree that would make them use OSes that respect it. People who actually care tend to use Linux, since they have an obvious reason to.
It is only possible for photos to resurface because they were stored in iCloud for months or maybe years, essentially in spite of the user's intention for those photos to be deleted.
Apple is a heavily lock-in-oriented company with ulterior motives, and macOS is filled with DRM to boot. That's poorly compatible with the concept of privacy by definition.
I cannot imagine building on Microsoft in 2024. They have aggressively thrown their weight around doing anti-consumer moves. These are the warning signs they don't need to care about their customers and can fully exploit their market monopoly.
I still have ads after an update in Windows 11 Pro.
Not to mention I have file explorer bugs and one note path hijacking annoyances.
Legacy software is why we still use Windows. There is nothing better about Windows than, say, Fedora.
That is true.
I do not use windows anymore.
None of my servers ever ran windows.
One can only pretend that closed-source operating systems are acceptable for so long before one has to address the lack of configurability, in a deep sense.
A few details here: "Recall leverages your personal semantic index, built and stored entirely on your device. Your snapshots are yours; they stay locally on your PC. You can delete individual snapshots, adjust and delete ranges of time in Settings, or pause at any point right from the icon in the System Tray on your Taskbar. You can also filter apps and websites from ever being saved. You are always in control with privacy you can trust."
>can law enforcement ask your laptop "Has your user done anything suspicious lately"?
The logging already exists. Whether the interface is AI or a command line tool is not so interesting.
Besides the "suspicious" stuff has other touch points like Google, ISPs, AWS, etc, etc. Those logs have been easy to get a hold of (with legal backing) - historically speaking.
Coverage on Windows Central says it is all local. As for the law enforcement question, it doesn't matter. If they want to get info from your computer, they will and they don't need AI to help.
> As for the law enforcement question, it doesn't matter. If they want to get info from your computer, they will and they don't need AI to help.
This is a poor take. Normally, law enforcement needs a warrant to seize your computer and get your data. With cloud data, they only need a compliant and willing corporate person to hand your data over. It's about who owns the data, where it is located, and the protections around it.
Getting a warrant requires probable cause and a judge to approve it. Maybe that's not hard, maybe it is, but I'd personally prefer more steps being required to access what data I have.
Indeed. It's the difference between already having substantive reason to believe you committed a crime, and surveilling you willy-nilly looking for a crime to pin on you. The less arbitrary access law enforcement has to your data, the more difficult the latter becomes.
Exactly. There are some judges that will rubber stamp warrants without even looking at them closely, but there are still plenty who don't and there's a paper trail as well that provides some protections. When you just go to where no warrant is required, it won't make a difference when the police want to get a specific person, but it will make a big difference on whether they are able to surveil wide swathes of the population and how easy it all is for them.
In Germany it's the same in theory; in reality even bogus causes are approved by judges who have neither the time nor the knowledge to really check the inquiry.
In the United States, protection from unlawful search and seizure has been in our constitution since the very beginning through the fourth amendment. This is a critical component of what we believe are human rights, so we will continue to insist that law enforcement should "get a fuckin warrant" if they want to dig through our personal data, regardless of your opinion on how easily that warrant will be granted.
I mean, you can encrypt and refuse to give up (forget?) the password in the USA, but there's really not much else. Judges (at least here in Texas) give out warrants like they're America Online (TM) CDs, and the state attorney general is a well-known "alleged" grifter. It's looking real bad for the 4th Amendment in the USA; it's on the ropes and has been especially since the Patriot Act kicked off a lot of anti-constitutional police actions.
Even if it is run locally, does it sync between your devices?
Considering we seem to be putting safety largely out the window, I can see someone saying "yeah, I want to be able to ask my other computer what I did on a different computer" or whatever. That's valuable! Ignoring the risks that involves.
Also, even if it is run locally, that likely means some sort of additional logging (possibly screenshots, given the mention of "photographic memory") on top of what is already in the system logs to achieve this.
Windows 11 LTSC, which will hopefully have these features ripped out, can't come soon enough.
Windows 11, if one has OneDrive turned on, backs up _everything_ in terms of data/files.
I bought a second computer recently, the same model as my Samsung Galaxy Book 3 Pro 360. After setting up Windows I found that it had downloaded _all_ of my files onto the new computer, but the wallpaper wasn't the same. I went to my old one, figured out where the file was, and copied it to the desktop. While I was trying to figure out how I'd copy the file over, it appeared on the desktop of my new computer.
Genuinely curious: do any of the features/use cases they market here sound appealing to people? I am struggling to imagine wanting to use any of them. I can, however, easily imagine the sound my cooling fan will make while the NPU is running 24/7.
I find this useful as a productivity tool. For example, this can give me my standup update summary. It knows what I worked on and can summarize it for me.
AI/LLMs are great at staying organized over huge amounts of data and this is the perfect application.
disclosure:
I am the founder of Perfect Memory AI https://www.perfectmemory.ai/ that does something very similar today.
Until the day it submits something to standup that you didn't want it to (and don't tell me you will always carefully filter it), and then in the best case you get fired. Worst case, you get criminally prosecuted.
Not in the least. I'm looking forward to when this trend dies and all of these worse-than-useless features hopefully get scrapped. Meanwhile, there is now extra hardware being put inside new laptops. I wonder if we'll be able to buy CPUs without NPUs?
My feeling is much of this client-side AI hardware push is to dodge the power cost. They're looking at spending $100B on a 5GW 'Stargate' datacenter for AI, and paying back that investment will go better without the ongoing (forever?) cost of running the resulting models centrally.
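The scale of that ongoing cost is easy to sanity-check. A back-of-envelope sketch, with assumptions that are mine rather than from the thread (100% utilization, a flat $0.05/kWh industrial rate, and electricity only, ignoring cooling overhead and hardware depreciation):

```python
# Rough annual electricity bill for a datacenter drawing 5 GW continuously.
POWER_GW = 5
RATE_USD_PER_KWH = 0.05      # assumed industrial rate
HOURS_PER_YEAR = 8760

kwh_per_year = POWER_GW * 1e6 * HOURS_PER_YEAR   # 1 GW = 1e6 kW
annual_cost_usd = kwh_per_year * RATE_USD_PER_KWH
print(f"~${annual_cost_usd / 1e9:.1f}B/year in electricity alone")
```

On those assumptions it lands around two billion dollars a year, which makes "let the customer's NPU pay for inference" look like a very rational strategy.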
Honestly no. But then computer marketing hasn't sounded appealing for a while IMHO.
Were you excited by the Bezos charts explaining Apple Silicon's blow-away performance? Or how the MacBook Pro M3 is the pro-est MacBook you've ever seen, or how delightful the spaceness or midnightness of the metal finish is?
The machines themselves might actually be pretty good, and have an impact on our daily lives with better battery, better keyboards, clearer screens etc. But expecting the marketing events to honestly assert the incremental improvements is a tall order.
I ran windows under qemu, with a GPU, and a dedicated soundcard, and multi-monitor for years - even though ostensibly there was a 10-15% "overhead" due to emulation/virtualization. I had exactly zero issues, and to be quite frank i couldn't tell any difference in framerates, especially compared to my windows laptop running the same games with roughly equivalent GPUs.
Could you share a little more about your setup? I assume a linux host, and perhaps 2 GPUs with one per OS? I'm guessing it's a desktop build and not a laptop (iGPU+dGPU)?
I've really wanted to switch fully to linux, but I still use some "power" features in MS Office which apparently don't play nice on linux. Dualbooting Fedora is decent... when I can understand what's happening and don't need to go 4 layers deep every time I have a problem, unfortunately.
Ryzen 3700, 64GB RAM (total), Gentoo on NVMe with an Nvidia 1060; Windows on another NVMe as a qcow (or whatever) with an Nvidia 1070 Ti, 8 physical cores, no HT, 32-48GB RAM.
You have to disable the specific GPU you want to use for the other OS on the kernel command line, which means you probably need two GPUs of different spec/brand. There were some tweaks that I could probably dig out eventually, but most of what I used to troubleshoot were the Arch Linux wiki and forum posts pointing to blogs. However, if you just need Office, installing Windows in qemu on literally any GPU will probably be just fine!
You can rsync the qcow to back up the entire Windows OS.
for me to run games and audio software in windows my command line was:
You press both Ctrl buttons to switch back and forth between host and guest, and Windows gets its own IP. Running Office wouldn't require >60% of that stuff.
I should note that command line is probably from the first time I got it working; I'd have to boot that machine to get the latest version.
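Since the actual command line didn't make it into the comment, here's a rough sketch of what a VFIO GPU-passthrough qemu invocation typically looks like. This is my illustration, not the parent's setup; the PCI addresses, core/RAM counts, and `windows.qcow2` path are placeholders you'd substitute with your own:

```shell
# Sketch: launch Windows with a second GPU handed to the guest via vfio-pci.
# 01:00.0 / 01:00.1 are placeholder addresses for the GPU and its HDMI-audio
# function; bind them to vfio-pci first (e.g. vfio-pci.ids=... on the kernel
# command line, as described above).
qemu-system-x86_64 \
  -enable-kvm \
  -machine q35 -cpu host \
  -smp 8 -m 32G \
  -device vfio-pci,host=01:00.0,multifunction=on \
  -device vfio-pci,host=01:00.1 \
  -drive file=windows.qcow2,if=virtio \
  -nic user,model=virtio-net-pci
```

A real gaming setup would add more (hugepages, CPU pinning, a passed-through USB controller or soundcard), but this is the skeleton most guides build on.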
I cannot wait to see Apple's response. While Microsoft's announcement seems rushed and the products half-baked/half-thought-out, they are on to something. I would hope Apple learns from it and finds the right path.
I would really like to have an AI companion that runs locally, on my dev laptop, tailored to my developer needs. Something optimized for the hardware, and yet with tight privacy.
I don't know if this is the case for others, but I'm seeing missing images on their laptop and tablet marketing pages. The new product line feels very rushed.
Same. Our household has one PC from 2019 running Windows, 3 on Linux, and one Mac, plus two Macs from work. The only things keeping me on Windows are League of Legends and some art programs, and I might get a Mac to do that.
We've been linux-only now for quite a few years, and I've been really pleased with how much we can get done without Windows (or Mac). All the Windows only games we play work great on Proton (disclaimer: we don't do any games with anti-cheat though). The only thing that I still must have Windows for is a Cricut machine that my wife has. I've got it working in a KVM VM though with USB pass through.
I don't know. I tried NFS Undercover on my GTX 1650 laptop at the highest settings, and it couldn't do it; it lags to the point of being totally unplayable. On Windows it works perfectly fine. Why is that? Can somebody explain?
Kernel-level anti-cheat is egregious. It's a good thing that it doesn't actually run on Linux. Nobody should be installing that garbage on their computers.
Helldivers 2 has one and it still works. AFAIK the anti-cheat developer has to explicitly disallow Linux users, and at that point it's not really a Linux issue anymore (but still good to know).
My only question is how much of these models runs offline and how much is the computer just siphoning up all the actions you do and feeding them into the cloud. Microsoft has a very bad history with user privacy, especially lately.
They have surely just looked at the price of an iPad Magic Keyboard + Apple Pencil and thought, “yeah we can probably get away with that too”.
Not sure how well that will go given that even the most ardent Apple fans will concede that a keyboard case costing approximately the same as one whole entry-level iPad is a bit much.
There are a number of aspects of this I find very interesting, if only because Apple has to come up with an answer to them (which I'm sure we will see some of at WWDC), but the "Copilot" brand is clear as mud... It's like that period when they added ".NET" to a bunch of things, but way worse.
We're not always in control when telemetry cannot be turned off, when informed consent cannot be given, when choices are repeatedly overridden. They've proven this time and time again, as well as being unable to keep their own house in order.
There is a giant hole in their pitch. Recall is supposed to be local-only, right? That's what makes the record-everything OK…
Now imagine you actively use it and it has all your memory and history. And then your SSD dies.
They’re going to have to push this to cloud for backup - the entire thing. But they can’t say that part out loud at this stage because people will freak
Or you could just lose that history; backup here is basically a bonus for the experience, not a necessity. The competitor to this, Rewind.ai, lets you set your retention policy and deletes on a rolling basis anyway.
The value of this kind of data shrinks dramatically over time. I have been using Rewind for around a year now and don't think I have ever looked back past three months.
Yes, but that isn’t what I really use it for. An example recent use was to pull up a recording of a conversation I had on zoom to see a detail on one of the slides I couldn’t remember. Prior to that it was searching for a specific set of words I knew I had seen in a proprietary dataset (think Ipsos/Gartner) that their own search wasn’t finding.
I generally don’t need/recall those kinds of details after a certain time and will accept those losses.
> They’re going to have to push this to cloud for backup - the entire thing. But they can’t say that part out loud at this stage because people will freak
This doesn't mean unencrypted, though. Will have to wait and see how much it exposes and how the key management works.
I'm not optimistic, but it does seem rather unlikely they would botch the privacy of a constantly-screenshotting daemon; it would be a little on-the-nose even given MS's horrendous privacy record.
It’s less a technical problem and more principles and optics.
Can't really argue a strong local-only stance and do cloud. Encryption, sure, but that takes us into "it's not local, but trust us, it's all good" territory… from a company known to comply with all sorts of government info requests.
>Live Captions now has live translations and will turn any audio that passes through your PC into a single, English-language caption experience, in real time on your screen across all your apps consistently. You can translate any live or pre-recorded audio in any app or video platform from over 40 languages into English subtitles instantly, automatically and even while you’re offline.
No. Microsoft aren't even clear with their own copilot branding in the core MS products. In MS-land, anything to do with AI is just labeled Copilot right now.
This is another example of MS branding that is clear as mud. Skype vs. Skype for Business, Visual Studio vs. Visual Studio Code vs. Visual Studio for Mac, OneNote vs. OneNote for Windows 10...
To my knowledge, each one of the products in this list is a completely different beast despite naming similarities.
And Surface initially was an interactive table (now called PixelSense) before they reused the name for their tablets. Which made it really difficult to find relevant information on developing software for the old Surface after the tablets were announced!
Oh man, what a fail. I loved the idea of the Surface, and was bummed out when they coopted the name for their PC/tablet line, as I assumed that meant the original product was dropped. I had no idea about PixelSense and I could have been a user.
As an aside, when I was at MSFT in Building 40 in 2013 we had a Surface table in the lobby; it never worked as well as the demo videos made it look: the table-top wasn’t glass but had a rough rubberised protector layer on-top that ruined it, the built-in WPF-based demo apps that we played with were all somewhat janky: you’d get 12-15fps not 60fps, touch drag latency was also abysmal, and most of the demo apps’ rendered scenes didn’t use global lighting, so pinch-rotating two objects in different directions would just look bad.
(Pre-teamroom) campus building lobbies were where once-cool hardware goes to die; another building at the other end of campus (where the Direct3D people were) had a widescreen rear-projection TV running Windows Media Center 2005 until well into 2015 IIRC.
Don't forget .NET, which simultaneously referred to an abstract machine runtime, an SOA strategy involving SOAP and XML, rebranded versions of Microsoft services intended to align with this strategy, and even Microsoft's centralized authentication service (called .NET Passport at first). It wasn't until later years that Microsoft associated the brand more or less strictly with the runtime.
Not to pick too much on Microsoft, Sun had previously done the same thing with Java, sticking Java stickers on anything and everything they could get away with. Arguably, it worked: Java was the buzz of the industry in the late nineties and table stakes for greenfield enterprise development in the early 2000s -- unless you were using .NET.
Nope. They are not even included in the same subscription. At my day job we had GitHub Copilot for the developer team for some months before everyone got MS Copilot. I had to spend quite some time explaining to central IT what the difference between these two things is, and do quite some digging into MS documentation to show them that the subscriptions are strictly disjoint.
There is also Microsoft 365 Copilot, Microsoft Copilot for Sales and I'm sure several more.
Copilot is really great branding, and also really confusing.
The coordinated launch of real ARM laptops across four different companies (Dell, Lenovo, HP, and Samsung) is quite an achievement for Microsoft after a decade of failed ARM launches. This seems uniquely different to me because there is a real and marketable limitation to what you can find in Intel/AMD (the lack of an NPU).
When Microsoft previously pushed ARM as an option for "all-day battery power," however, the huuuuuge tradeoff was compatibility with your existing tools and very underwhelming performance.
However, can I just complain for a moment?! Why are these laptops shipping fixed at exactly 1TB of storage and only 16GB of memory, with no way to specify different configurations? So with all this cross-company coordination and opportunity for eyeballs on this renewed ARM push, they still felt a timid 16GB would be more than enough to generate excitement?
Yes, yes, I know, it's fine for a generic office product, but am I the only one who is thinking this is a red flag, and this is going to be just another failed launch and huge missed opportunity at wider ARM adoption?
Both Intel's and AMD's newer chips have NPUs as well. They are also better in the power-consumption area, though not as good as ARM. I don't see a reason they can't get there as well; the actual instruction set does not matter much for power consumption.
The MS announcement also talks about Intel/AMD flavors of their "Copilot+ PCs".
The thing is that everything I need just works on my ARM Mac, and it's been so for a couple of years already. Yet I keep hearing that "MS is committing to ARM".
Don't get me wrong, I'd love myself a Surface that's super fast, responsive, has many apps and good battery life. But I just can't get myself to buy one whenever I try it in the store. So MBP+iPad it is, year after year.
But is it a normal Windows PC that supports normal Windows software or the weird abomination they had last time that supported a few specific apps and nothing else?
It's normal Windows, just on ARM. Windows on ARM already supports x86 and x64 emulation. Pretty much the only stuff that doesn't work is hardware drivers (to be expected) and games with kernel-level anti-cheat (also expected). They've also announced a new 20% faster x64 emulation layer. Recently there's been a big push of apps getting native ARM ports too, Chrome for example.
For the last 5 years, starting with Windows 10 (Snapdragon 835-based devices), you could run any x86 software (that doesn't require a driver/DRM rootkit), and 64-bit emulation was added with Windows 11 (when Intel's patents expired).
Sometimes it feels like the commenters on HN are being kept in suspended animation for 10 years then revived for a couple of days to comment then put back to sleep.
What I don't understand about this is that they tried it by shoehorning it into Windows updates (without asking if you wanted it) and it was (in my bubble at least) universally loathed. Does anyone know anyone actually using and enjoying the Windows 11 "Copilot" update?
It's clear that AI can be the catalyst for creating a lot of customer-facing value. (Though I still contend that the current incarnation of large language models ain't it yet.) Requiring a powerful (and power-hungry) GPU in all devices is untenable, so having a dedicated NPU is a pretty sleek way forward (that, to be fair, Apple spearheaded).
I'm very curious to see what APIs will be open to developers, because I think "next billion-dollar startup"-style value-generation won't be found in the generative aspect of these models, but rather the promise of automation, synthesis, summarization, agent invocation, and so on.
If Microsoft thinks I'm going to let Windows store screenshots of my work every second indefinitely with their current approach to online identity and privacy, they are nuts.
The word Copilot has been used for so much shit now by Microsoft that it's literally vomit-inducing at this point. Say Copilot one more time and I'm gonna lose it.
I think this describes almost all AI products being promoted these days. They all seem rushed-to-market, in an attempt to get a boost to the company's stock price: "They're an AI company! BUY!"
Pluton will be on by default, per the article. This is MS's latest play to lock down PCs from being general-purpose computers into 'large mobile phones', and the ARM platform seems to be the trojan horse to get that across the line.
* grabs tinfoil hat, considers how it depressingly hasn't worked before, puts it on anyways *
"In fact, 87% of the total app minutes people spend in apps today have native Arm versions." - Do they mean web browsers? I'm pretty sure the majority of time people spend in apps is in web browsers, so it feels incredibly misleading if that's what they're referring to.
Probably whatever they're calling Windows Store apps now, universal windows app or something. These are .net applications that are natively compiled when installed (I think).
I work for Microsoft, so take what I say with a grain of salt. I could not be more excited for these AI productivity features built into the OS. “Recall” is almost exactly what I’ve been dreaming of since the AI boom. I didn’t catch it in the release, but something that can “record” my work life and do things like auto-manage my todo list or answer a question I had about how I did something a year ago is the killer AI product I’ve been waiting for. Not sure I’d want this as a personal device, but would be amazing for work.
Possibly, but instant-find software already exists (eg copernic desktop search), and the extra functionality posited by Recall sounds like a potential surveillance nightmare in workplaces given Microsoft's form in the corporate arena.
Still waiting for an actual use case for these things. And no, "easily generate AI images" is not one of them, no matter how much Microsoft likes to push it.
Being able to vaguely ask my computer "what that cooking thing Greg talked about and I looked up on that kitchen supply website three months ago" and have it instantly pull up the chat and the web page is pretty handy.
Having my computer automatically reference some document I was supposed to look at an hour before a meeting scheduled titled "Important Document Review Meeting" even though it's not attached but on some vague share and a deep directory I forgot about would be nice as well. Maybe this would do that.
Running high-powered AI locally seems like a pretty good use case to me. Cloud-based AI, like OpenAI sells, is expensive and provides inadequate privacy. I want a truly personal computer, not a dumb client.
If the chance of hallucination could be statistically guaranteed to be near 0, then it would have many more use cases, imo... but why would I have it summarize a pdf or write some complex code or something if I have to always double check everything for accuracy?
A calculator has no use case if it might be wrong.
These kinds of stats really need "for 30s" and "for 10m" qualifiers. I strongly suspect that an iPhone will start throttling within seconds, not minutes, compared to a laptop with a fan.
(Although there are definitely laptops with fans that still throttle within minutes, and they usually have a fruit on the side of them.)
To really be an AI PC, I'd want to be able to run something like Llama 3 70B locally, and that's going to need a lot of RAM. Even running a 7B will take 14GB if it's fp16. So these really just need to be able to run a few MS AI apps. A quick googling tells me things like the gen-AI in Photoshop are actually done in the cloud, so that should still work but doesn't really require a special "AI PC".
There is extremely little quality loss from dropping to 4-bit for LLMs, and that “extremely little” becomes “virtually unmeasurable” loss when going to 8-bit. No one should be running these models on local devices at fp16 outside of research, since fp16 makes them half as fast as q8_0 and requires twice as much RAM for no benefit.
If a model is inadequate for a task at 4-bit, then there's virtually no chance it's going to be adequate at fp16.
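The figures being traded in this subthread all come from the same simple arithmetic: bytes per weight times parameter count. A quick sketch (my illustration; it deliberately ignores KV cache, activations, and the per-block overhead real quantization formats like q8_0/q4_0 carry, all of which add somewhat more in practice):

```python
# Rough RAM needed just to hold an LLM's weights at various precisions.
def weight_ram_gb(params_billion: float, bits_per_weight: int) -> float:
    """params * bytes-per-weight, reported in decimal GB (1e9 bytes)."""
    return params_billion * 1e9 * (bits_per_weight / 8) / 1e9

for label, bits in [("fp16", 16), ("8-bit", 8), ("4-bit", 4)]:
    print(f"7B @ {label}: {weight_ram_gb(7, bits):.1f} GB")
# fp16: 14.0 GB, 8-bit: 7.0 GB, 4-bit: 3.5 GB
```

The same formula puts a 70B model at 35GB even at 4-bit, which is why nobody in the thread expects it on a 16GB laptop.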
Microsoft has also been doing a lot of research into smaller models with the Phi series, and I would be surprised if Phi3 (or a hypothetical Phi4) doesn’t show up at some point under the hood.
I had already read the comment I was responding to, and they actually mentioned both.
Here's the exact quote for the 7B:
"Even running a 7B will take 14GB if it's fp16."
Since they called out a specific amount of memory that is entirely irrelevant to anyone actually running 7B models, I was responding to that.
I'm certain that no one at Microsoft is talking about running 70B models on consumer devices. 7B models are actually a practical consideration for the hardware that exists today.
> > Since they called out a specific amount of memory that is entirely irrelevant to anyone actually running 7B models, I was responding to that.
> Which is correct, fp16 takes two bytes per weight, so it will be 7 billion * 2 bytes which is exactly 14GB.
As I said, it is "entirely irrelevant", which is the exact wording I used. Nowhere did I say that the calculation was wrong for fp16. Irrelevant numbers like that can be misleading to people unfamiliar with the subject matter.
No one is deploying LLMs to end users at fp16. It would be a huge waste and provide a bad experience. This discussion is about Copilot+, which is all about managed AI experiences that "just work" for the end user. Professional-grade stuff, and I believe Microsoft has good enough engineers to know better than to deploy fp16 LLMs to end users.
You're not the intended market for this product. This is aimed at the general consumer who is interested in a fully tailored experience powered by "AI", and doesn't care whether the magic happens locally or in the cloud, not at someone who wants to run arbitrary models locally and tinker with the experience. Whether these machines can be repurposed for what you want to do, and whether the experience will be worth it, is yet to be seen.
Are these suitable gaming laptops? I run local LLMs (Ollama) using my NVIDIA graphics card but I can't tell if these ARM chips are suitable for gaming.
7 Days to Die, BeamNG.drive, Borderlands 3, Control, Dark Souls III, Dying Light, The Forest, God of War, Resident Evil 2, Shadow of the Tomb Raider, Skyrim SE, Sons of the Forest, Totally Accurate Battle Simulator, Unturned, Warframe, and The Witcher 3
But you will get a much better experience and longer battery life using something like GeForce Now.
I intuit the parent was referring to PC gaming, which is far more demanding than mobile gaming. Also, most PC games are compiled for x86_64, which means non-native games (the majority) will need a translation layer, which adds overhead.
Just because most gamers game on mobile and consoles does not make them any less gamers. PC gamers are a minority, not the majority; "PC Master Race" is a meme, not reality.
If anyone wants to refer to PC gamers, refer as such. "Gamers" with no additional descriptors by definition is anyone who games, and that includes gamers who play on mobile and consoles.
I also would be interested in this, because I somehow expect that this screen-logging feature is ripe for a security disaster. This is not just your browser history. Imagine a video of everything you ever did on your PC ending up on the internet.
I hope the Snapdragon D3D12 drivers come with actually working H265 video support, unlike on AMD where this is broken and no one seems to bother to fix it.
Is this it, or did they already lock down "their" hardware so you wouldn't be able to install another operating system alongside or solely? It seems that it's something we shouldn't take for granted anymore that the separation of OS and hardware allows you to install Linux or any other homebrew operating system on "your" computer.
Base macOS uses more than 4GB, or close to it, on a fresh install right after booting. 8GB really is a minimum for macOS. But tons of consumers use their computers in ways that require more resident pages in memory, like having hundreds of open tabs in Chrome or Safari. 8GB is getting really constricting even for casual use.
If they're true to the demo, the new Surface devices look impressive, with the Surface Pro clearly pushing in the direction that most have wanted the iPad Pro to go for some time.
Microsoft has clearly been baking AI into their laptops and the updated Windows 11. Will be interesting to see what Apple has up their sleeves next month.
Somewhat OT but still related - I am using less AI in the last few weeks. I pay for ChatGPT but I barely use it. I sometimes use Perplexity but then also verify the response using Google search, which means I don't use Perplexity often.
It's certainly early but as a consumer I feel the AI hype is just that.
I believe this is only the beginning of the modern OS with a user interface that is based around AI. Imagine the platform lock-in from a lifetime's worth of metadata around interactions with an AI agent, which obviously will not be transferable between vendors.
The modern OS that nobody asked for. This could backfire pretty badly if it doesn't succeed. We're still in a hype bubble. When the bubble bursts, some things will stay but many will be washed away.
This feels like classic Microsoft. Put some great marketing out there, claim victory, and deliver the actual product months from now. This puts any other vendor or company in a bad place with investors and the public.
Yeah, I was going to say. The very first Surfaces in 2012-2013 were ARM-based (using Nvidia Tegra), then they switched to Intel, then they tried ARM again in 2019-2020 (using Qualcomm Snapdragon), then went back to Intel again. Third time's the charm?
At the very least they seem to have decent x86 emulation performance this time.
AI is now that crusty squirt bottle full of mystery beige sauce they squirt all over everything at the food truck. Yum. Just drench my whole OS in this.
Qualcomm makes the CPUs in most Android devices, as well as in VR/AR headsets (e.g. Meta's Quest line). Qualcomm probably makes more CPUs (and GPUs) than any other vendor.
Hey Copilot, generate a name for our next hardware product that sounds similar to "Windows XP Professional with Service Pack 2." Cortana: Sure thing, here you go.
Intel (and AMD) are also going to be part of this line of computers:
>We look forward to expanding through deep partnerships with Intel and AMD, starting with Lunar Lake and Strix. We will bring new Copilot+ PC experiences at a later date.
I'm not blaming Microsoft for skipping Meteor Lake, it's been an awkward generation to say the least. Of course, if Lunar Lake sees delays then Intel really has shat the mattress.
Feels like corpspeak, given that the vast majority of their products ship on x86. You don't want to come out and say your current and future products are garbage. Apple could do that because they own the whole stack; Microsoft can't.
The iPod saved Apple, the iPhone made it the powerhouse it is now. Everything else seems like marginal impact. The new chip is nice and helps them have more control over and lower prices but as of yet doesn't seem to be an immediate game changer.
Disagree: the PowerPC Macs were on the verge of not being able to run Photoshop on par with Windows, and the last core group keeping the Mac alive was designers. If Apple had stuck with PowerPC, they wouldn't have survived.
Personally, I am not all that excited by their hardware prospects, but I am curious about all the "optimizations" they did at the OS level.
If they are real, can they bother trickling them into the x86 builds, pretty please?
I feel they are at a crossroads for supporting legacy, decade-old stuff with an architecture switch. I wonder if they are compromising on all that for the ARM variant.
I am honestly not excited about the "AI" tech here--I'm excited about the battery life improvements that will come from using ARM chips. Battery life was the reason I chose an M1 laptop over the alternatives. Hopefully we can get some more competition in the space with these new chips!
The last few years focused on short-term conversion metrics for Bing and Edge have already done their damage; it's going to be an uphill battle for them.
Tone-deaf. The last thing I want is AI deciding into which column of my start menu I’d like my Ads crammed. How about Microsoft design an OS that isn’t actively user-hostile and then, maybe, _maybe_ bake in some AI around that?
I suspect that there will be a LOT of TPU e-waste in about 3-5 years.
One can call me reactionary, but I see greater benefit in providing SIMD primitives for known bottlenecks and co-processing for known inner loops than in continually jumping through the hoop of committing passing trends to hardware.
I suspect that Microsoft will implement brilliantly on a bad idea, but that it still won't gain much traction, as Copilot is not the code assistant we need.
But since hammers are in fashion, expect attempts to shoehorn the world into nails.
If vendors will go for strictly on-die TPUs, then maybe. However, I expect these machines to be quite capable without the AI hype.
I hope at least some laptop vendors will use PCIe cards for their TPUs, so that there's an upgrade path of sorts and reusing older cards for simpler models remains possible.
LLMs aside I would guess we may see more modest use cases for TPU/NPUs on PC. Phones have been coming with them for several years now - granted with stronger use cases around AR, photo retouching, and voice assistants. Maybe you are right though.
Looking at the landing page for the new Surface devices (https://www.microsoft.com/en-us/surface) the top third of the page looks really amateurish in terms of web design. Clicking one of the "Meet the new" buttons really drives it home.
Looks like they're relying on system fonts instead of serving them over the web. It uses Segoe UI (a Windows font) by default. I tried accessing the page via Fedora and iOS and neither load the font correctly.
>Looks like they're relying on system fonts instead of serving them over the web.
This is a good thing. Using system fonts means less bandwidth consumed, more privacy (presumably), faster rendering, and better consistency with the rest of the user's environment.
The only failure could be not serving a web font as a failsafe, but I'm not going to count that against them because I hate the idea of web fonts.
Edit: Actually, nevermind all that. Microsoft is serving Segoe UI as a web font in addition to referring to the system copy.[1] If that's not rendering properly, either Microsoft got the URL wrong or something is fubar on the browsers concerned.
Microsoft is shifting away from Intel chips, favoring Qualcomm's latest Snapdragon X Elite processors.
The new Qualcomm chips boast better performance, power efficiency, and battery life, aiming to compete with Apple Silicon.
Microsoft is launching Copilot Plus PCs, featuring built-in AI hardware for enhanced performance.
Surface AI Announcements:
  Major updates to the Surface lineup.
  Introduction of a new era of AI-driven PCs.
Asus Vivobook S 15 (S5507):
  Powered by Qualcomm's Snapdragon X-series processors.
  Features 45 TOPS of neural processing power for AI-driven programs.
  Thinner chassis and display bezels compared to previous models.
DaVinci Resolve AI Features:
  Uses Copilot Plus PCs' neural processing unit for AI color corrections.
  CPU and GPU offload tasks to the NPU.
Acer Swift 14 AI:
  Powered by Qualcomm Snapdragon processors.
  Supports new AI features in Windows 11.
  Configurable with up to 32GB of memory and 1TB of SSD storage.
HP Laptop Lineup:
  Streamlining of product lines to OmniBook (consumer-focused), EliteBook, and ProBook (corporate-oriented).
Dell Qualcomm Laptops:
  Announcing five Qualcomm Snapdragon laptops, including XPS 13 (9345), Inspiron 14, and Latitude models.
  Offers multiple display options and up to 64GB of memory.
Lenovo Laptops:
  Introducing the Yoga Slim 7x 14 Gen 9 and a new ThinkPad with Snapdragon processors.
  Features include up to 32GB of memory, 1TB SSD storage, and a 14.5-inch OLED touch display.
Adobe Creative Cloud on Arm64:
  Full Creative Cloud suite available for new Copilot Plus laptops.
  Native Arm64 versions of Photoshop, Lightroom, Firefly, and Express.
Microsoft Real-time Translation:
  New translation feature available across any video calling or entertainment app.
  Demonstrated real-time translation capabilities.
New Surface Pro:
  First Surface Pro with an OLED display.
  Capable of producing perfect blacks and HDR output.
  Powered by Qualcomm Snapdragon X processors, up to 90% faster than previous models.
Surface Laptop:
  Arm-based Surface Laptop with Qualcomm's Snapdragon X Elite or Plus chip.
  Configurable with up to 64GB of RAM and 1TB SSD storage.
  Available in multiple colors.
Copilot Plus PCs:
  New branding highlighting built-in AI hardware and support for AI features across Windows.
  Supported by major laptop manufacturers including Dell, Lenovo, Samsung, HP, Acer, and Asus.
Copilot Assistant:
  Upgraded to GPT-4o.
  Demonstrated guiding a player through Minecraft using GPT-4o for real-time interaction.
Recall Tool:
  AI-powered tool that logs and retrieves everything you see and do on your PC.
  Can track activities in apps, meetings, and web research.
Opera Browser for Windows on Arm:
  Native version for Snapdragon-powered Windows devices.
  Promises over double the speed of emulated versions.
Dell's Future XPS Plans:
  Confidential document leak reveals detailed specs and future plans for XPS 13 variants.
  Includes multiple display options and Snapdragon X Elite chips.
Qualcomm Snapdragon X Plus Processor:
  Entry-level laptop chip with 10 cores and a 45 TOPS NPU for AI applications.
  Competes with Apple, Intel, and AMD on speed.
From my experience running "AI" locally, I would have expected 32GB to be the minimum spec machine, assuming unified memory, if they're trying to compete with Apple.
My 32GB machine is remarkably not all that useful for testing and using AI models. It can use some good models, but I’m frequently disappointed by how many I can’t even come close to running.
They're not running any LLM locally; so far the NPU is just an accelerator for OCR (for rewind.ai, oops, I mean Recall) and for webcam background replacement without a GPU.
Came here to say the same. I would expect that you'd want to be able to expand to at least 64GB and probably even more if you want to target the AI market.
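For context, the back-of-the-envelope weight-memory arithmetic behind these comments can be sketched as follows. The model sizes and precisions below are illustrative assumptions, not figures from the announcement, and real runtimes need additional memory for the KV cache and activations on top of the weights:

```python
def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Memory needed just to hold the model weights, in GiB."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

# A 7B-parameter model at fp16 (2 bytes/param) vs. 4-bit quantized (0.5 bytes/param):
print(round(weight_memory_gb(7, 2.0), 1))   # 13.0
print(round(weight_memory_gb(7, 0.5), 1))   # 3.3
# A 70B-parameter model at fp16 is already far beyond any of these machines:
print(round(weight_memory_gb(70, 2.0), 1))  # 130.4
```

So a 16GB machine only comfortably fits small or heavily quantized models, which is why 32GB reads like a floor and 64GB like the realistic target for local-AI workloads.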