
The Samsung Galaxy S4 runs a multi-core Exynos 5 Octa processor paired with a GPU (graphics processing unit) that normally operates at 480 MHz. In February, a forum member at Beyond3D suggested that Samsung overclocks the GPU to 532 MHz during benchmarking tests meant to showcase how powerful the chip is, while throttling it to just 380 MHz in actual use. The folks over at AnandTech recently investigated the claims and discovered code that Samsung uses to speed up the processor when (and only when) running popular benchmarking suites such as GLBenchmark, AnTuTu, or Quadrant.
Some may call this optimization, and others might claim it helps save battery, but most are calling it plain cheating. The practice has real, everyday consequences, since many people decide which device to buy based on benchmark performance. It's not just the GPU, either: AnandTech discovered that the same thing happens with the CPU.
Certain benchmarking suites, such as GLBenchmark, trigger the CPU to jump to 1.2 GHz, while others, such as GFXBench, don't trigger the boost and run at the default, lower-performance CPU speed of just 250 MHz. Other benchmarking apps trigger different speeds: Linpack prompts Samsung to offer up 1.6 GHz, and the Snapdragon-based variant of the phone shows the same behavior under GLBenchmark, ramping to 1.9 GHz.
The South Korean electronics giant apparently uses a settings application, TwDVFSApp.apk, which includes hard-coded CPU settings for the different benchmarks, among them a "boost mode" that kicks the CPU and GPU into high gear. Trying to game benchmarks isn't exactly new; it has been happening ever since enthusiasts started comparing Apple's PowerPC processors to Intel's x86 CPUs. In this case, however, what's going on is a deliberate attempt to maximize the attractiveness and perceived power of a particular chip while reserving that extra performance for artificial testing situations.
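To make the reported behavior concrete, here is a minimal, purely illustrative sketch of how a package-name lookup like the one described could decide which GPU clock ceiling to apply. The class name, the table, and the benchmark package names are assumptions made for illustration; only the 480 MHz and 532 MHz figures come from the reports above, and this is not Samsung's actual code.

```java
import java.util.Map;

// Hypothetical sketch only: illustrates the idea of a hard-coded "boost mode"
// lookup keyed on the foreground app's package name. Not Samsung's code.
public class BoostModeSketch {

    // Illustrative mapping of benchmark package names to a boosted GPU clock (MHz).
    private static final Map<String, Integer> BOOST_TABLE = Map.of(
            "com.glbenchmark.glbenchmark25", 532,               // GLBenchmark (illustrative name)
            "com.antutu.ABenchMark", 532,                       // AnTuTu (illustrative name)
            "com.aurorasoftworks.quadrant.ui.standard", 532     // Quadrant (illustrative name)
    );

    // Normal GPU ceiling reported for the Galaxy S4.
    private static final int DEFAULT_GPU_MHZ = 480;

    // Return the GPU clock ceiling to apply for the app currently in the foreground.
    static int gpuClockFor(String foregroundPackage) {
        return BOOST_TABLE.getOrDefault(foregroundPackage, DEFAULT_GPU_MHZ);
    }

    public static void main(String[] args) {
        System.out.println(gpuClockFor("com.glbenchmark.glbenchmark25")); // 532
        System.out.println(gpuClockFor("com.example.somegame"));          // 480
    }
}
```

The point of the sketch is simply that a whitelist keyed on specific benchmark apps, rather than a policy applied to all demanding workloads, is what makes the behavior read as benchmark gaming rather than general optimization.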
How do you feel about the whole situation?
Source: AnandTech, Beyond3D (Forums) (Thanks to MMi user Daniel DelVisco for sending this one in)