AMD's mobile and APU line is generally about a year behind desktop, so expect to see Zen 2 cores in mobile around mid-2020, numbered as Ryzen 4xxx.
Zen 2 combined with an integrated GPU much better than Intel's is likely going to make a very nice dent in market share, unless Intel pulls a rabbit out of their hat.
There are no rabbits in the hat until late 2021 when 7nm comes to market. The 10nm process is completely broken. Watch how the majority of the roadmap is still on 14nm into 2021; there will never be a desktop 10nm part, for example. The Rocket Lake desktop chips in 2021, still on 14nm, will drop down to eight cores to cram the new GPU architecture into desktop chips at a truly ridiculous 125 W. The Ryzen 9 3900 today does 65 W with 12 cores, and you can easily squeeze a GTX 1650 into the remaining 60 W.
Now, TDP doesn't really reflect actual power draw and can't be compared well across brands. But a 2x difference in TDP is a lot more than the difference in how AMD and Intel calculate it.
> There are no rabbits in the hat until late 2021 when 7nm comes to market. The 10nm process is completely broken. Watch how the majority of the roadmap is still on 14nm into 2021; there will never be a desktop 10nm part, for example.
From what I heard from someone close to Intel's process crowd, Intel will simply slap a 7nm marketing designation on their current 10nm process when they finally get it going.
They will then change the design rules to deliver a density improvement comparable to a node shrink, without any change in the process itself.
Design rules don't change because marketing says so.
It is reasonable to begin early processor development in a new process with conservative design rules (in order to be sure that everything works and yields are acceptable) and rework some components, pushing the envelope a little more, if tests allow it; but such improvements are going to be small "without any change in the process". Maybe the same specifications on a slightly smaller die to reduce costs, or a slightly higher-clocked or lower-power SKU. Or nothing at all, because the tooling costs for marginally improved revised processors aren't justified.
Well, from that roadmap I see a 6-core Comet Lake U (high-end mobile) set for Q2 2020; we don't know how many cores AMD is planning for Zen 2 mobile. If AMD sticks to 4 and Intel has 6, that's going to swing some buyers. The (leaked) Intel roadmaps tend to have more specifics than the AMD ones, though.
For as long as the ultrabook design remains popular (and it shows no signs of waning), the high-end CPU options will be riding the edge of their thermal limits during sustained use. OEMs aren't going to suddenly start over-building their cooling solutions just to help out marginally with a niche use case. As long as we don't get back to the problems from a decade ago with dying mobile GPUs, there's not really anything wrong with having CPUs that boost up to the thermal limits of the system form factor.
If someone comes out with a laptop CPU that can't boost up to those thermal limits, it means the chip's undersized and that vendor will probably need a different microarchitecture for the desktop or server markets.
> OEMs aren't going to suddenly start over-building their cooling solutions just to help out marginally with a niche use case.
A friend at a laptop engineering company has worked on this exact problem recently. Chinese OEMs are all trying to squeeze 35-45 W chips into small chassis now.
To my big surprise, doing so even in thin-bezel 13-inch models is actually not that big of a deal. Big OEMs simply never bothered to try it before.
You have to take into consideration what kind of workloads lead to throttling. Laptops are usually not used for the kind of tasks that keep a CPU fully loaded for several minutes or hours at a time. People who do use laptops in that manner are a tiny fraction of the market, and when they experience throttling that does not have any bearing on whether the cooling system of an ultrabook is adequate for the kinds of more typical workloads it is actually designed for.
There have been some ultrabook-style designs that offered inadequate cooling even for fairly normal use cases, but that's a separate issue. Mainstream laptops will be designed around mainstream workloads, and heavier workloads will push them to their limits. Better cooling doesn't come free, and if it doesn't benefit mainstream workloads it's unreasonable to expect mainstream laptops to put more emphasis on cooling capabilities.
Right. I get the sense that most developers browse with adblock on, but the average user experience is for their computer to be effectively running Prime 95 during regular web browsing.
Sorry, but if I buy a six-core laptop, I'm not going to be in the casual Notepad-user category.
In many laptops, thanks to bad thermals, I'd be better off with a 4-core part whose cooling can keep up. That's where the 7nm stuff could really bring advantages.
I've been able to load up my desktop six-core plenty using e.g. Docker and a bunch of microservices. It has a fairly decent 360 mm AIO water cooler, so it stays pinned at max performance. Had a bad cooler before, though, and it really impacted performance and stability.
The point still stands that most people buying these machines are generally not running them at 100% CPU usage for extended periods of time; their usage is much more bursty, with short periods at full power separated by longer periods of idling or low power. This gives the CPU plenty of time to cool down in between bursts.
If OEMs optimize for that use case, I suspect that a more efficient CPU will simply mean that they cut even more corners on the cooling, not that the thermals will actually be significantly better.
It's sad that this is the norm. Aluminum is cheap; a bigger heatsink in a regular laptop (not an ultrabook) should cost what, a dollar more? Yet laptops are never designed to run at full load for hours, just "mainstream" use. Even business "workstations" have the same problem. I've had to do hardware mods or undervolting on every laptop I've owned. WTF.
If only the industry stopped for one second pursuing angstrom-thick laptop designs in favor of more thermally efficient ones. Laptops could have their CPU and chipset facing downward, in contact with a bottom cover made entirely of aluminium, plus a second aluminium upper shell with small, thick fins facing outward. Closed, it works as a sturdy cover protecting the lid that carries the screen; opened, it could be removed and attached to the lower shell to increase thermal exchange with the environment.
People obsessed with the thinnest hardware wouldn't touch it with a 20-meter pole, but those in need of serious performance and mobility would probably find it interesting.
So, have the bottom cover be the heatsink? That actually sounds brilliant.
HP thinned the ZBook series by turning everything upside down and having the bottom be just a dumb panel instead of the main frame.
Sadly, they once again used a standard, barely capable heatsink. It will run for days under load, but it goes over 90 degrees and even throttles, which is unacceptable IMO.
Yes, the upper cover with the fins up should be designed to match the lower one perfectly when mounted with the fins facing down. Cutouts should be arranged so that the bottom cover's rubber feet wouldn't prevent full surface contact. It would become a fairly large heatsink whose size and combined thickness would likely be enough to compensate for the small fins and the absence of fans.
Battery/disk/memory covers on the bottom side would be accessible by removing the additional cover.
"You have to take into consideration what kind of workloads lead to throttling."
I'm driving four displays at work with Windows 10 (two 21.5" 1080p monitors, my laptop flipped open, and an iPad Pro 12.9 connected via USB-C running Duet Display), and my idle desktop CPU utilization hovers around 15-20%. Having Outlook and Chrome open gets it into the mid-30s. This is a four-core i7 Dell Latitude 7490 with 16 GB of memory and an NVMe drive that I was given in May 2019.
Yes, the OS and applications I'm using are all resource hogs, but I'm not even doing software development - this is all business analyst work. Seeing that the general trend is for applications and OSes to keep getting hungrier, let's hope that six-core thermal chassis design for 14" ultrabooks is figured out in the next two or three years.
Throttling is what happens when your CPU stays at 100% for a long time. Perhaps you could describe the part of your workload that exhibits that behavior, rather than describe a workload that is obviously not causing thermal throttling?
We're talking about the current state of the real market here, not abstract hypotheticals. Laptops overheating at 30% CPU usage is not a widespread issue in the real world; to a first approximation, the only way to get an ultrabook's CPU to thermally throttle is to keep at least one of its cores completely busy so that the processor stays in its boost state long enough to pump out serious thermal energy. Bursty workloads give the CPU too many opportunities to cool off.
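If you want to see this for yourself, a quick experiment on Linux (assuming stress-ng and lm-sensors are installed; package names vary by distro) is to pin every core and watch temperatures and clocks. Most ultrabooks will hold boost clocks for a short while and then settle at whatever their cooling can sustain:

  # terminal 1: fully load all cores for ten minutes
  stress-ng --cpu $(nproc) --timeout 10m
  # terminal 2: watch package temperature and per-core clocks
  watch -n1 'sensors; grep MHz /proc/cpuinfo'

A bursty workload never keeps the package at full power long enough to hit those limits.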
I think AMD is still (slightly) behind in power consumption, especially when the CPU is idling. Some tests suggest this is heavily dependent on the motherboard's chipset, though. But power consumption is probably very important for the industry and for notebook devices.
That said, AMD becoming a worthy competitor on the market again is awesome.
I recently bought a new laptop and actually wanted AMD, but none of the laptops I was interested in were available with one. It appears that high-end laptops are still ruled by Intel. Do AMD chips run too hot for mobile or something?
It's not about heat; more probably, vendors have contracts with Intel that make Intel parts more expensive if they aren't bought exclusively (it will be worded differently, because Intel already got sued for this practice).
I recently bought a ThinkPad E495 with a Ryzen 3700U and thermals are just fine; the surface barely heats up under sustained load. The E series is more budget, though. The T495 (14") and X395 are of higher build quality (and price) and also have Ryzen 3x00U CPUs.
Typing this on my X395 (with the 3700U) and can confirm these are really well-made laptops.
Linux support is excellent. OpenBSD is good, but doesn't support this generation of wireless cards yet, which is quite inconvenient; next version, perhaps.
Full support for integrated Vega landed in June 2018 (kernel 4.17, Mesa 18 for 3D); as long as you have these or newer, plus firmware for your card in the system, you'll get full acceleration.
I've been running Debian Testing on a 2400G since 04/2018, back when I had to add the firmware and compile the kernel with config changes to enable the support, though it was already in the kernel tree. It has worked out of the box since 07/2018, way before the release happened.
At least on Arch, X doesn't even start on 4.19 (the linux-lts package), but works fine on 5.2+ (the linux package); probably also on some kernels in between, but I only bought the laptop recently.
Likely specific support for the 3700U was introduced at some point between those two kernels.
AIUI 5.4 is meant to become the next LTS, so it won't be an issue going forward.
There are also the AMDGPU-PRO drivers, which nobody uses; they're partially open source, as I understand it.
The point is that AMD releases detailed documentation for every GPU they sell, so it's very easy for Mesa to support their hardware; AMD themselves also contribute code to Mesa, of course. This is in contrast to NVIDIA, which even uses encryption and firmware signatures to thwart nouveau's efforts. I avoid NVIDIA entirely for this reason.
I haven't had issues so far. Life has been good, same as with the Vega 64 on my workstation.
GPU-accelerated graphics (2D, 3D, and video codecs) work on both Linux (5.2+, probably earlier) and OpenBSD (6.6+), with open source drivers (kernel DRM, userspace Mesa and xf86-video-amdgpu).
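A quick way to sanity-check that the open stack is actually in use on Linux (assuming glxinfo from mesa-utils/mesa-demos and vainfo from libva-utils are installed; package names vary by distro):

  # kernel driver and firmware loaded?
  dmesg | grep -i amdgpu
  # 3D acceleration: should name an AMD/RadeonSI renderer, not llvmpipe
  glxinfo | grep "OpenGL renderer"
  # hardware video decode via VA-API
  vainfo

If the renderer string says llvmpipe, you're on software rendering, usually because the firmware package is missing or the kernel/Mesa is too old.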
If you get tearing, try:
xrandr --output eDP --set TearFree on
to enable the anti-tearing workaround. This shouldn't be necessary on a modern composited desktop, but I do need it with a simpler i3 setup to avoid tearing in YouTube videos. mpv doesn't seem to need it, either.
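If you want that to persist instead of running xrandr each session, xf86-video-amdgpu also accepts TearFree as a driver option; something like this in /etc/X11/xorg.conf.d/20-amdgpu.conf should do it (the filename and Identifier are just conventions):

  Section "Device"
    Identifier "AMD Graphics"
    Driver "amdgpu"
    Option "TearFree" "true"
  EndSection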
My guess is that laptops probably have a much tighter integration with the whole system, and so it takes more effort to change the CPU. It might take a few years of consistent performance from AMD to convince the laptop manufacturers to invest in the engineering to do that integration with a new CPU.
The T495, T495s, and X395 are AMD laptops. I would guess the P line doesn't have an AMD option because AMD doesn't have a high-end laptop CPU line like Intel does.
USB4 is essentially Thunderbolt 3. Since TB3 also defines how a USB 1-3 connection can be pushed over the wire, that's what most devices can fall back to.
So you're telling me a USB microphone needs to implement HDMI and DisplayPort if it wants to be compliant? Or are these devices expected to stay on lower USB versions? What if the device needs some of the features from USB4 but it makes no sense to have Thunderbolt features?
> So you're telling me a USB microphone needs to implement HDMI and DisplayPort if it wants to be compliant? Or are these devices expected to stay on lower USB versions? What if the device needs some of the features from USB4 but it makes no sense to have Thunderbolt features?
All USB Type-C connectors have a pair of USB 2.0 wires in them. You can still be a USB 2.0 device and talk on that USB 2.0 pair under USB3 (and presumably USB4) just fine with a little bit of care (get your resistors correct on the CC lines, for example).
If you want some features of USB3 or USB4, then things get a little more complicated. For example, if you want to be able to draw slightly higher power (900 mA or 1.5 A), then you need some active circuitry on the CC lines (one of these is about 70 cents: https://www.ti.com/product/TUSB320HAI), and you have to respond properly when the system tells you to draw less power, even though you don't need to use the full-blown USB3 communication pairs.
If, however, you want USB3 or USB4 speed or very high power, then you need a full-blown controller chip and you incur all the grief that demands. Of course, if you actually need a couple Gbps, you're in the realm of doing serious signal integrity analysis anyway, and you're probably not going to balk at the $3-$5 required for a true controller chip to handle it all.
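For a sense of what that CC-line circuitry is actually deciding: the source advertises how much current it can supply via a pull-up on CC, and the device sees that as a voltage across its own pull-down. A rough sketch of the classification a TUSB320-class chip does in hardware (the thresholds are the commonly used ones derived from the Type-C spec; check the spec or your chip's datasheet before relying on them):

  #include <stdio.h>

  /* Current a Type-C source advertises on the CC line. */
  typedef enum { ADVERT_DEFAULT_USB, ADVERT_1500MA, ADVERT_3000MA } cc_advert;

  /* cc_volts: voltage the sink measures on its CC pin (across its Rd pull-down). */
  static cc_advert classify_cc(double cc_volts) {
      if (cc_volts > 1.23) return ADVERT_3000MA;   /* ~vRd-3.0 threshold */
      if (cc_volts > 0.66) return ADVERT_1500MA;   /* ~vRd-1.5 threshold */
      return ADVERT_DEFAULT_USB;                   /* default USB: 500/900 mA */
  }

  int main(void) {
      printf("%d\n", classify_cc(0.9));  /* ~0.9 V on CC -> prints 1, i.e. 1.5 A offered */
      return 0;
  }

The "respond properly when told to draw less" part is exactly this: if the advertised level drops back to the default band, your device has to fall back to 500/900 mA, with no USB3 data pairs involved at all.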
The only new feature in USB4 is Thunderbolt, so yes, if a device doesn't need Thunderbolt features it should stay on an older version. Of course, older USB versions will end up rebranded as something like "USB 4.0 1x1 High Speed".
I want to get a new laptop next year and I hope it comes with a good mobile Ryzen processor with Thunderbolt support.