Egad! Could Samsung be CHEATING in Galaxy benchmark tests?

Samsung has reportedly been cheating in benchmark tests, artificially
boosting the scores of its latest and greatest system-on-chip, the
Exynos 5 Octa, on those performance-ranking number generators so
beloved by reviewers and product evaluators.
"Oh hell Samsung, shame on you!" wrote a Beyond3D forum member in a
posting on one of that site's forums last month.
The poster, a Luxembourger who goes by the handle Nebuchadnezzar, had
been testing a Samsung Exynos 5 Octa when he discovered that although
he thought he was running the chip's GPU at 532MHz, it only hit that
clock speed on two benchmarks he used for testing: AnTuTu and
GLBenchmark. For all other apps, it ran at 480MHz – a speed much
kinder to battery-life testing.
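Reproducing that kind of observation on a rooted handset amounts to polling the GPU's reported clock while different apps run. The Python sketch below is purely illustrative: the sysfs node name is an assumption (the real path, and the units it reports, depend on the PowerVR kernel driver build Samsung ships), not anything taken from Nebuchadnezzar's own tooling.

```python
# Hypothetical sketch: watch the GPU clock on a rooted Exynos 5410 device
# while launching a benchmark and then an ordinary game.
# NOTE: the sysfs path below is an assumption for illustration; the actual
# node name, and whether it reports Hz, kHz, or MHz, depends on the PowerVR
# driver Samsung ships.
import time

GPU_CLOCK_NODE = "/sys/devices/platform/pvrsrvkm.0/sgx_dvfs_cur_clk"  # hypothetical path

def read_gpu_clock():
    """Return whatever the driver reports as the current GPU clock."""
    with open(GPU_CLOCK_NODE) as node:
        return node.read().strip()

if __name__ == "__main__":
    # Start GLBenchmark or AnTuTu, then a demanding game, and see whether the
    # reported clock tops out at 532MHz in one case and 480MHz in the other.
    for _ in range(120):
        print(read_gpu_clock())
        time.sleep(0.5)
```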
The Exynos 5 Octa, by the way, is so named because it has four
high-performance ARM Cortex-A15 cores and four low-power ARM Cortex-A7
cores, all baked into a single 28-nanometer die. It comes in two
versions: the 5410, which contains an Imagination Technologies PowerVR
SGX544MP3 GPU, and the 5420, which uses an ARM Mali-T628 MP6 GPU.
Nebuchadnezzar was testing a 5410.
Anand Lal Shimpi and Brian Klug over at the ever-interesting deep-tech
site AnandTech were tipped to Nebuchadnezzar's discovery, and since
they are both proud owners of the international version of the Samsung
Galaxy S 4 powered by an Exynos 5410, they decided to see if they
could replicate his findings.
They could – with a few additions and clarifications. For example,
GLBenchmark v.2.5.1 did indeed drive the GPU at 532MHz, but its latest
v.2.7.0 incarnation – GLBenchmark having been subsumed into GFXBench
along with DXBenchmark – was throttled to 480MHz.
Samsung hasn't published megahertzage for its GPU, but Shimpi and Klug
said that their sources tell them it runs at 480MHz – which in fact is
what they discovered its clock rate to be when running any games,
"even the most demanding titles." But when running GLBenchmark 2.5.1,
AnTuTu, or Quadrant – benchmarks that reviewers and product testers
might naturally use to rate a product – the GPU ran at 532MHz.
Although Nebuchadnezzar had only reported on GPU behavior, Shimpi and
Klug checked out what the CPU was doing when running GLBenchmark
v.2.5.1 and GFXBench v.2.7.0. To their surprise, they discovered that
when running v.2.5.1, the four powerhouse Cortex-A15 cores were pinned
at their top speed of 1.2GHz no matter what load the benchmark put
upon them. When running v.2.7.0, however, the Exynos 5 Octa switched
over to its less-powerful Cortex-A7 cores.
"A quick check across AnTuTu, Linpack, Benchmark Pi, and Quadrant
reveals the same behavior," they write. The CPU cores were pushed to
their highest available speeds whenever those benchmarks were running.
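The CPU side of that check is simpler to sketch, because the standard Linux cpufreq interface exposes each core's current frequency. Something along these lines, run on the device while a benchmark is active, would show whether the cores sit pinned at their ceiling; it's a minimal illustration under those assumptions, not the AnandTech methodology itself.

```python
# Minimal sketch: sample the current frequency of every online CPU core via
# the standard Linux cpufreq sysfs interface while a benchmark is running.
# On a big.LITTLE part like the Exynos 5 Octa this shows whether the active
# cluster is being held at its maximum frequency regardless of load.
import glob
import time

def sample_cpu_freqs():
    freqs = {}
    for path in sorted(glob.glob(
            "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq")):
        cpu = path.split("/")[5]                    # e.g. "cpu0"
        with open(path) as node:
            freqs[cpu] = int(node.read()) // 1000   # cpufreq reports kHz -> MHz
    return freqs

if __name__ == "__main__":
    for _ in range(30):
        print(sample_cpu_freqs())
        time.sleep(1)
```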
Digging into the Galaxy S 4's operating system support files, they
came upon one with the name TwDVFSApp.apk, and since DVFS is short for
dynamic voltage and frequency scaling (and, The Reg opines, "Tw" might
be shorthand for "tweaking"), they opened it up in a hex editor and –
behold! – in it were a list of what appeared for all the world to be a
series of strings that allowed for top performance for some apps and
not others, and a group-identification string with a rather
incriminating name.
"The string 'BenchmarkBooster' is a particularly telling one," they write.
The gun may not be belching great clouds of damning smoke, but there's
more than a mere wisp emanating from its barrel. As the AnandTech duo
put it, "This seems to be purely an optimization to produce repeatable
(and high) results in CPU tests, and deliver the highest possible GPU
performance benchmarks."
If Nebuchadnezzar, Shimpi, and Klug are correct in their testing and
analysis – and we have no reason to believe that they're not – there's
only one possible conclusion: Samsung is cheating. And if they're
cheating, there's a fair chance that others are, as well. But Samsung
got caught.
Your Reg reporter has been around the technology-evaluation block
enough times to remember – as Shimpi and Klug discuss in the
conclusion to their article – when benchmark manipulation was rampant
in the PC industry. When I ran a product-testing lab in the '90s, such
cheating was the bane of my 9-to-5 existence.
Well, here it comes again – both fairly blatantly, as in Samsung's CPU
and GPU rigging, and in a more slippery fashion, as in the use of an
Intel-specific compiler in a test that enabled Chipzilla's Clover
Trail+ platform to outperform ARM processors.
Today is different from the 90s, however. In those far-away days,
speeds and specs were important even to consumers, while in today's
shiny-shiny world, the average fandroid or fanboi couldn't care less
about gigatexels or TMUs. "Experience" rules the checkbooks of the
marketplace, not benchmark scores.
But deceit is still not right. Since they uncovered Samsung's chicanery
firsthand, let's give our cheater-finders the last word on this sorry
state of affairs.
Shimpi and Klug: "Just because we've seen things like this happen in
the past however doesn't mean they should happen now."
Nebuchadnezzar: "Oh hell Samsung, shame on you!" ®
Bootnote
Here's a li'l fairness v. bias test we suggest you might find
personally illuminating. Read the story above one more time, except
each time you see the word "Samsung", substitute "Apple".
Then ask yourself: "Is my response any different?"