> The ironic thing is, even with the benchmark booster disabled, the Note 3 still comes out faster than the G2 in this test. If the intent behind the boosting was simply to ensure that the Note 3 came out ahead in the benchmark race, it doesn't appear to have been necessary in the first place.
The Samsung phone division's upper management and marketing team don't seem burdened by an overabundance of either ethics or common sense. Or maybe I'm just being naive.
There's another possible explanation: some Samsung head demanded an extra 20% in unobtainable improvements (maybe offered bonuses?), so the engineers found a quick and dirty way to make it happen.
Reminds me of how some old ATI graphics drivers detected whether you were running Quake 3 and turned down the graphics quality for the sake of better benchmarks.
Renaming quake3.exe to quack3.exe would fool the detectors and skip these "optimizations".[1]
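The detection was that crude: the driver keyed off the executable's name, so the same binary under a different name fell through to the normal code path. A minimal sketch of that kind of check (the function, list, and degradation rule here are all invented for illustration; this is not the actual driver logic):

```python
# Hypothetical sketch of filename-keyed "optimization" gating, in the
# spirit of the quake3.exe/quack3.exe story. Not real driver code.

BOOSTED_EXECUTABLES = {"quake3.exe"}  # hardcoded list of benchmark titles

def texture_quality(executable_name: str, user_setting: int) -> int:
    """Return the texture quality actually used, 0 (lowest) to 10 (highest)."""
    if executable_name.lower() in BOOSTED_EXECUTABLES:
        # Silently degrade quality below what the user asked for,
        # trading image quality for benchmark framerate.
        return min(user_setting, 5)
    return user_setting

# The user requests max quality in both cases; only the filename differs.
print(texture_quality("quake3.exe", 10))  # degraded to 5
print(texture_quality("quack3.exe", 10))  # rename defeats the check: 10
```

The whole scheme lives or dies on that string comparison, which is exactly why a one-letter rename exposed it.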
To be fair, Quake 3 fans are known for prioritizing frames-per-second beyond all other considerations. One could argue that tweaking the graphical quality for Q3 to encourage high framerates is just giving customers what they want.
Except this was done against the explicit graphics settings the customers had set and only for a handful of games that were regularly used as benchmarks.
Customers who actually want framerate-beyond-all-else don't enable/would disable those settings in the first place.
So if it were "giving the customer what they want" they'd have respected the customer's settings.
What it actually was, was a simple recognition that people used "framerate at max quality" as a proxy for net card performance and an attempt to "game" such measures by not actually running at max quality, for those games that were common benchmarks.
One could, but that'd be a stretch. It should be a discrete option left up to the user. If they're tweaking that much, they'll know or seek the knowledge on how to activate it themselves.
Nvidia has been caught doing exactly this multiple times. Same with Intel compilers. Same with car makers who optimize for EPA testing conditions. Given a benchmark, people will "optimize".
So Samsung, the dominant, #1 manufacturer of mobile phones (Q2 2013 sales: 107 million units, market share: 24.7%)[1], has been caught unambiguously and intentionally deceiving the public about its products' performance. Approval of cheating marks a disgusting corporate culture.
>"the Note 3 CPU locks into 2.3GHz mode, the fastest speed possible, and none of the cores ever shut off."
Good! I want to know what the maximum performance is from a benchmarking app.
I don't know why regular apps are unable to get the Note 3 to use all 4 cores. Perhaps they're not multi-threaded, or they haven't enabled the right settings? Surely 3D games will access all cores?
At the end of the day, Samsung wants the benchmarking to show the theoretical maximum. Other manufacturers may not be too bothered and are happy to show benchmarking of normal usage. It's like reading the specs on a car, you're never going to drive at 160 mph, but at least you know you could.
> It's like reading the specs on a car, you're never going to drive at 160 mph, but at least you know you could.
It's like test-driving your car on the manufacturer's track, seeing it go to 160 mph, then taking the car to race day and hitting a 90 mph limiter they built into the car for fuel efficiency.
Ironically, there's a Nissan car that does enable and disable the limiter based on the GPS detecting it being on a race track or not. But they're transparent about it.
Because the OS is trying to balance performance and battery life.
Saying "I want to see the maximum performance" is fine, but they should also be forced to show how that impacts other elements of system performance.
Personally I think it would be good to see two sets of benchmarks, typical and maximum, but the most important thing is that whatever scores are being shown are clear and honest, and I think it's hard to say that these are.
I don't expect a benchmarking app to be throttled by the OS under any circumstances--ideally the benchmarking app would run on the raw hardware, with no second-guessing by the OS about whether it should be throttled because the battery is down to 10%, or because some other process would like some network or CPU, etc.
The way this probably should work is that the benchmarking apps themselves have a flag telling the OS they want all cores at full power all of the time, instead of having to hardcode this list into the OS. (Perhaps this could be one of the permissions you have to accept when installing the app.)
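A toy model of the difference between the two approaches: a governor policy keyed on a permission the app declares (and the user approves) versus one keyed on a hardcoded package list. Every name here is invented for illustration; this is not Android's or Samsung's actual code:

```python
# Toy model of two CPU-governor policies. All package names, permission
# names, and mode strings are invented for illustration.

HARDCODED_BENCHMARKS = {"com.antutu.benchmark", "com.quadrant.standard"}

def governor_mode_hardcoded(package: str) -> str:
    """What Samsung reportedly did: special-case known benchmark names."""
    return "max_all_cores" if package in HARDCODED_BENCHMARKS else "ondemand"

def governor_mode_declared(package: str, declared_permissions: set) -> str:
    """The proposed alternative: any app may declare that it wants full
    power, and the user sees and accepts that request at install time."""
    if "REQUEST_FULL_PERFORMANCE" in declared_permissions:
        return "max_all_cores"
    return "ondemand"

# A new benchmark the hardcoded list has never heard of:
print(governor_mode_hardcoded("org.new.benchmark"))         # ondemand
print(governor_mode_declared("org.new.benchmark",
                             {"REQUEST_FULL_PERFORMANCE"}))  # max_all_cores
```

The declared-flag version treats every app uniformly and puts the special mode in the open, which is the transparency the rest of this thread is asking for.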
Is that really the best way for a typical user to understand what they're getting? Shouldn't they see how the OS will treat an application under normal circumstances, as that's what they'll experience day to day?
As I said, I don't mind if they put it flat out for the benchmarking, so long as they do the same for the battery tests too.
*looks at a mobile phone with a 2-hour battery life*
>I don't know why regular apps are unable to get the Note 3 to use all 4 cores. Perhaps they're not multi-threaded, or they haven't enabled the right settings? Surely 3D games will access all cores?
That doesn't explain why the exact same app performed worse (or normally, rather) when renamed.
The customer is buying a complete product, not a cpu. Maximum cpu performance without context is not meaningful. A 500 horsepower engine is pretty awesome, until you drop it into a cargo freighter. Then it's worthless.
> At the end of the day, Samsung wants the benchmarking to show the theoretical maximum. Other manufacturers may not be too bothered and are happy to show benchmarking of normal usage. It's like reading the specs on a car, you're never going to drive at 160 mph, but at least you know you could.
And they should be vilified for reinterpreting preconditions that usually guarantee valid relative comparisons, and that are relied upon as such. ("Theoretical maximum" is a problematic goal to work towards, though: why shouldn't the clock-speed specifications suffice, then? I presume you mean stock speeds without the software stack, but cranking up frequency scaling isn't much of a meaningful hop and skip away from just running the benchmark.)
Which is great. And I totally agree with them - if they had disclosed it at least to the tech community.
After all, the Nvidia 6xx and 7xx series self-overclock when there is room in the power envelope, and this was disclosed and taken into account by reviewers.
This is nothing like that. The note doesn't have two modes that the user or the software can choose from. The adjustments are made secretly, and only when specifically named benchmarks are run.
I think the key idea here is transparency. There's nothing wrong with showing theoretical maximums, but be transparent about it!! Otherwise it looks like you're trying to get away with something.
There is very little that is transparent about Samsung and their relatively recent history is pretty seedy. High level corruption, bribery etc. That the culture of deception flows down the company into actual products doesn't surprise me.
In this case it's not the theoretical maximum. It isn't available to regular apps because the OS and CPU don't allow it, for good reason - to conserve power. The reason benchmarking apps can do it is that Samsung added code to specifically detect them.
Your car spec analogy is a good one. If the car maker removed the seats and carpet as well as inessential engine components before testing, that would be the equivalent of what Samsung have done here.
Samsung's entire business model is being a sketchy fly-by-night knock-off company. First it was Sony and other Japanese electronics makers and now it is Apple. And I don't think Samsung would really deny it. They're pretty good at what they do.
Somewhere along the way they became a huge global company that makes a lot of money, but they forgot that they need to start acting like one.
They don't need to fake benchmarks or anything like this. They are the main event when it comes to Android.
I think it's a very misleading tactic, because then many customers buy based on those benchmarks.
Ironically, I believe these smartphone OEMs are a lot less bad than PC OEMs and chip makers. I hate how Intel, Nvidia and AMD all "win benchmarks" by showing numbers much higher than the practical limit of those chips.
All of them seem to have a "base clock speed", and then a "turbo speed" that goes up to twice as high. What do you think gets shown in the benchmarks? The base clock speed, the one you'll be using most of the time? Of course not. It's the "turbo" burst speed that the benchmarks will capture.
This ends up being very misleading because for example I bought a laptop with an Nvidia GPU that was supposedly 3x faster than the integrated Intel one. But after about 30 minutes of playing a game, it will invariably get throttled to about half of that speed, and then it will only be 50 percent faster than the integrated one.
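The arithmetic of that anecdote is worth spelling out, taking the figures in the comment above at face value (a discrete GPU advertised at 3x the integrated one, throttling to half its speed after roughly 30 minutes; session lengths are my own assumption):

```python
# Back-of-envelope check of the throttling anecdote. The 3x and 30-minute
# figures come from the comment above; the session lengths are assumptions.

integrated_speed = 1.0                   # integrated GPU, normalized to 1x
discrete_burst = 3.0                     # advertised: 3x the integrated GPU
discrete_throttled = discrete_burst / 2  # after ~30 min: 1.5x

def average_speed(session_minutes: float) -> float:
    """Average speed (relative to the integrated GPU) over a session that
    runs at burst speed for the first 30 minutes, then throttles."""
    burst = min(session_minutes, 30.0)
    throttled = max(session_minutes - 30.0, 0.0)
    return (burst * discrete_burst + throttled * discrete_throttled) / session_minutes

print(average_speed(30))    # 3.0   -- a short benchmark run sees the full 3x
print(average_speed(120))   # 1.875 -- a two-hour session averages under 2x
```

So a benchmark that finishes inside the burst window reports the full advertised multiple, while a long gaming session converges toward the throttled 1.5x, which is exactly the gap the comment describes.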
Intel is even worse: as they become more desperate about the ARM competition, they get increasingly misleading to create the appearance that they're "closing the gap" with ARM much faster than they really are.
Before I start, I want to say that I use a Core i7 in my laptop, but I just hate shameless misleading of consumers, whether it's from Intel, Nvidia, AMD or Samsung.
There are 3 "misleading strategies" that Intel has used lately:
1) Turbo Boost, of course. Their new Atom chips actually run at 1.3-1.5 GHz, but they market them as 2-2.4 GHz chips, a speed that also helps them do very well in CPU benchmarks (they're still about 2x behind any other ARM GPU, but I think they'll start using turbo speeds for the Atom GPUs in the next generation too, again to make it "seem" like they caught up - except not really).
2) Launching the most powerful version they have first, even if it's not going to be used in most of the devices consumers want. This way the press is all over it, and consumers remember the "brand" that was very powerful, even if that's not what they're getting in their device.
Example: Haswell's Iris Pro. It was the first to be sent to sites and benchmarked, and it showed much bigger GPU performance than before in laptops. The only problem? It's probably never going to be in a laptop. It was meant mostly for PCs, and again as a "halo product", to make people think that "Haswell is very powerful". In fact most laptops have only seen about a 20 percent increase in GPU performance with Haswell compared to IVB.
Second example: they launched Bay Trail, the "tablet version", first, to make people think "how powerful Atom is" and that it's "good for smartphones". Then they will release Merrifield for smartphones half a year later (which would already put them behind the competition), and probably even weaker than Bay Trail performance-wise. But most consumers will just remember that Bay Trail was great in benchmarks (see 1, also), even though that's not what they're getting.
3) They're starting to push the "SDP" measure, so tech websites start writing about and promoting their chips as "5W chips" or whatever, in the hope that even if the sites append the SDP label to the power consumption number now, they will eventually forget to mention it ("Intel's 5W chips", etc).
SDP is nothing but an absolutely shameless scam from Intel. They want to have their cake and eat it, too: a) declare super high performance with a burst speed that isn't practical to sustain for long, and b) declare super low power with SDP, even though the chips themselves can get a lot hotter than Intel is implying, especially when they use the Turbo Boost speed.
If I can't use my laptop at the advertised clock speeds for more than 10-30 minutes continuously, then it's nothing but a scam, meant to fool consumers into thinking their chips are either a lot more powerful or a lot more efficient than they really are.
If your laptop doesn't have the cooling to support the maximum continuous performance of a CPU or GPU, how is that the chip-maker's fault? The chips offer you as much speed as possible under the power/cooling constraints - isn't that exactly what you want, as a user?
I think the benchmark manipulation by Samsung is in a different class; it configures the system in an impractical mode that actually isn't available to the end user at all.
In contrast, high-consumption turbo modes in laptop CPUs and GPUs really do benefit the end user, even if the benefit isn't everything you were hoping for.
Yeah. I don't really care what the manufacturers show in their marketing material. As far as I'm concerned they are all biased and their numbers should be taken with a grain of salt. There is an ethical problem, however, if you try to game benchmarks used by independent agents. At the very least you should be transparent about what you're doing.
I think you missed the part where in order to appear competitive with ARM, Intel are advertising their chips as having lower total power consumption and lower cooling requirements than are actually required to run them at full performance. They're then configuring them to throttle back in order to meet the advertised thermal figures.
Maybe all this is shocking if you're not already familiar with the way CPUs are specified and the way they have dynamically adjusted their speeds and power consumption for many years now.
ARM systems do the same tricks for the same reasons. Here an ARM CPU is throttled back to keep total system consumption below a target TDP, meaning it's impossible to achieve the rated CPU+GPU performance simultaneously. They are "configuring them to throttle back in order to meet the advertised thermal figures."
That just strikes me as good engineering. If the clocks were given simplistic hard upper limits to achieve the same ends (4W TDP limit in a system otherwise capable of consuming 8W), the result would be a less flexible, less powerful system. A waste, really.
There is a real disparity between old-school absolute-maximum-TDP and realistic modern ultra-portable battery-powered use scenarios. Maybe Intel's SDP isn't the right answer, but it doesn't strike me as purely shady marketing malarkey either.
Mobile devices are energy/power-limited (battery life / chip temperature) and you always have to optimize between those and performance. Theoretically you can add performance as much as you'd like but with a huge power draw. Basically, all these benchmarks are meaningless without knowing the power draw.
Don't care. Mobile processors in all of the high-end handsets long ago surpassed my needs. As far as I'm concerned they are all now awesome and we're living in the future.
I can't understand the issue here. If you are using a benchmarking app or a game that's heavy on the CPU, these same optimizations would kick in. Otherwise you'd be crying about the device's speed while also crying about the battery draining.
It appears, from the extracted code, to be the case that a 'special' (read: hardcoded) set of optimisations is being enabled solely for benchmarking apps.
Whilst 'smart' behaviour might enable the phone to switch on some such functionality when the SoC/CPU is put under heavy load (by something such as a game), the effect of such automated switching is unlikely to have anything like the effect of a hardcoded force-to-high-power-mode as shown here.
And you know this based on what? "hardcoded" != "faster"
It just guarantees that it will happen in that case. Unless you have some reason to believe that this mode is unattainable under any real world circumstance, I think the original point still has some credence.
If the mode were attainable in a realistic end-user scenario, there would be no point in detecting the benchmark tool in the first place. A high CPU load (like from a benchmark) would kick the CPU up into high performance mode and that would be that.
I would guess that the high-performance mode drains the battery so quickly, or creates so much heat, that it's impractical for general use.
Based on the fact that they are Samsung. That they aren't transparent about what they are doing. That there is no evidence that these optimizations kick in in other contexts.
They are doing something to fuck around with benchmark numbers, the onus is on them to explain it and be upfront about it.
Would they actually, though? If these optimizations would kick in, why does Samsung feel the need to force the optimizations to kick in for specific apps, rather than via CPU usage like any other game/app?
Because they are incapable of producing competitive products (their phone market is people who can't afford anything better or who are motivated by fanboyish prejudice) and because their corporate culture appears to emphasise cheating, deception and dishonesty above all other values.
I mean really, even if you thought Android was a better solution why would you buy Samsung?
It's only triggered if it is a benchmarking app. Did you read the article at all? They look at the name of the app running and if it matches the name of a benchmarking app, they shift into a mode that only exists for benchmarking apps.