
Asus, HTC, LG and other Android licensees join Samsung in faking test results

Discussion in 'Off-Topic Discussion' started by sparkyscott21, Oct 3, 2013.

  1. sparkyscott21

    sparkyscott21 Moderator Staff Member

    Joined:
    Nov 4, 2010
    Messages:
    7,280
    Likes Received:
    103
    Trophy Points:
    63
    Location:
    Northern Mich.




    Cheating on test scores and misrepresenting device capabilities are rampant at Samsung, and research shows the same practices are in use at virtually every other Android licensee apart from Google itself.

    Samsung has been shamed twice this year by confirmed reports of its intentional falsification of performance benchmarks involving both of its flagship products: the Galaxy S4 and Note 3.

    According to testing by Anand Lal Shimpi and Brian Klug of AnandTech, benchmark cheats are not limited to Samsung, but are pandemic among Android licensees.

    "With the exception of Apple and Motorola," the site observed, "literally every single OEM we’ve worked with ships (or has shipped) at least one device" that similarly fudges benchmarks.

    Even the cheating on Android is fragmented



    However, since each Android licensee cheats in different ways on different types of benchmarks, it's difficult to benchmark the benchmark cheating. It's also hard to convince them to stop, the authors stated.

    "Virtually all Android vendors appear to keep their own lists of applications that matter and need optimizing," the report observed. "The lists grow/change over time, and they don’t all overlap. With these types of situations it’s almost impossible to get any one vendor to be the first to stop. The only hope resides in those who don’t partake today, and of course with the rest of the ecosystem."

    The lone holdouts amid Android's rampant cheating are Google's own products: its Motorola subsidiary and its Nexus brand. However, those products collectively make up only a tiny fraction of Android sales, and are intended to serve as a guiding reference model for other Android licensees.

    Specifically, Google's "pure Android" products are intended to demonstrate to mass-market Android manufacturers how to deliver clean, updated products that take advantage of features in the latest version of the OS, enable users to get timely updates after purchase, and direct users' attention and revenue exclusively to Google.

    Android licensees haven't followed Google's lead in any of those respects. Instead, Chinese and Korean hardware makers have done the same thing they did to Microsoft Windows in the PC market: race to the bottom to deliver super cheap products that cheat on specs, substitute substandard components, skimp on build quality and direct attention to their own layers of junkware, ads and subscription plans.




    10-3-13


    Source




     
  2. sparkyscott21

    sparkyscott21 Moderator Staff Member






    In the face of a second set of investigative reports detailing exactly how Samsung leads other Android makers in exaggerating its performance on specific benchmarks, the company has issued its second denial this year, one that again fails to address any specifics.

    In a statement issued to CNET UK today, Samsung stated, "The Galaxy Note 3 maximises its CPU/GPU frequencies when running features that demand substantial performance. This was not an attempt to exaggerate particular benchmarking results. We remain committed to providing our customers with the best possible user experience."

    Samsung's benchmarks for the Galaxy Note 3 do indeed indicate that the device "maximises its CPU/GPU frequencies when running" benchmarks, which certainly are "features that demand substantial performance."

    However, Samsung's claim that this "was not an attempt to exaggerate particular benchmarking results" is belied by the fact that the maximization only occurs when running specific benchmarks and stops happening when the name of the benchmark app is changed.

    The boost likewise disappears when the benchmark maker changes its app to block such cheating, as happened this week with AnTuTu, one of the benchmark apps Samsung's flagship devices identify by name expressly in order to exaggerate their results.
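
    To give a sense of how crude this kind of name-based detection can be, here is a minimal Java sketch of a package-name whitelist. Every identifier and package name in it is invented for illustration; it is not drawn from any vendor's actual code.

    Code:
    // Minimal sketch of name-based benchmark detection via a hard-coded
    // whitelist of package names. All names here are invented placeholders.
    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.Set;

    public class BoostListSketch {
        // Hypothetical list of "apps that matter" baked into the firmware.
        private static final Set<String> BOOSTED = new HashSet<>(Arrays.asList(
                "com.example.benchmark.alpha",
                "com.example.benchmark.beta"));

        // Exact string match: a renamed copy of the same benchmark
        // falls through unboosted.
        public static boolean shouldBoost(String foregroundPackage) {
            return BOOSTED.contains(foregroundPackage);
        }
    }

    Because the match is an exact string comparison, renaming the app is all it takes to make the boost vanish, which is precisely how testers exposed the behavior.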


    This all happened before


    Back in July, Samsung offered the same explanation to John Paczkowski of the Wall Street Journal "All Things Digital" blog, which the site characterized as "Samsung’s Bizarre Benchmark-Boosting Explanation."

    Samsung's statement then was nearly identical: "the maximum GPU [graphical processing unit] frequencies for the GALAXY S4 have been varied to provide optimal user experience for our customers, and were not intended to improve certain benchmark results."

    However, the company also contradicted itself and confirmed what AnandTech had reported about specifically juicing certain benchmarks within the same statement, adding:

    "A maximum GPU frequency of 533MHz is applicable for running apps that are usually used in full-screen mode," Samsung said, "such as the S Browser, Gallery, Camera, Video Player,
    and certain benchmarking apps, which also demand substantial performance."

    Additionally, the facts showed back then that tests involving Samsung's Browser, Gallery and Video Player didn't result in a GPU boost, instead leaving the processor clocked at 266MHz. And of course, the name of the software controlling the change was "BenchmarkBooster."

    So Samsung admitted boosting benchmarks back in July while also saying it didn't (in the same statement!), and said it was also boosting other apps that it actually wasn't. This time around, Samsung has only removed the explicit admission that it considers benchmarks among the apps that "demand substantial performance" and require special handling from its BenchmarkBooster code.


    Why Samsung's cheating doesn't win


    Computing benchmarks are similar to a road test designed to compare the speed and agility of different cars driving a specific route. Mobile benchmarks, however, aim to test not just brute speed, but the overall performance typical of running within real-world constraints on battery consumption and heat dissipation.

    Mobile benchmarks are therefore like a test track that involves not just a speed element, but also additional constraints related to mileage, emissions and overheating, showing not what a souped-up, modified vehicle could do, but how well a customer's car will actually perform in real-world use.

    Samsung activates various performance modes when running specific benchmarks, including activating all idle cores, increasing the clock rate and making other, unknown changes that boost its graphics performance over identical hardware.
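
    For a rough idea of what such a "performance mode" amounts to at the operating-system level, here is a hypothetical Java sketch written against the standard Linux cpufreq/hotplug sysfs nodes. The class itself is an illustrative invention and would need root/system privileges on a real device.

    Code:
    // Hypothetical "benchmark mode": wake idle cores and pin the CPU frequency
    // floor at its maximum. The sysfs paths are standard Linux cpufreq/hotplug
    // nodes; everything else is an illustrative invention.
    import java.io.FileWriter;
    import java.io.IOException;

    public class PerfModeSketch {
        private static void write(String path, String value) throws IOException {
            try (FileWriter w = new FileWriter(path)) {
                w.write(value);
            }
        }

        public static void enterBoostMode(int cpuCount, String maxFreqKhz)
                throws IOException {
            for (int cpu = 0; cpu < cpuCount; cpu++) {
                String base = "/sys/devices/system/cpu/cpu" + cpu;
                if (cpu > 0) {
                    write(base + "/online", "1"); // cpu0 usually cannot be hotplugged
                }
                // Raising the floor to the ceiling forces the governor to run flat out.
                write(base + "/cpufreq/scaling_min_freq", maxFreqKhz);
            }
        }
    }

    Running flat out like this is exactly why the boost is reserved for benchmarks: applied to everyday apps, it would incur the battery and heat costs described below.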

    While these changes boost scores by 20 to 50 percent (in AnandTech's testing), they do so at the cost of overheating components and running down the battery. These impacts are so costly to the overall experience that Samsung does not activate "maximized frequencies" when running real apps, because if it did, its other benchmarks would suffer, particularly battery life and product reliability.

    Samsung's efforts to deflect attention from the investigations by AnandTech and Ars Technica weren't enough to convince the Wall Street Journal or CNET, but they did seem to pacify the concerns of some Android fans.

    "Which is true? Will we ever know? Does it even matter?" asked Zach Epstein of BGR after reciting Samsung's statement.

    "Bam! Straight from the horses mouth, they don't cheat!" one reader commented. Another, represented by an Android avatar with the caption "hope," wrote, "I agree. I will believe anything from Google/DROID/Samsung. They are honest. I will never listen to or believe anything from or about Apple."

    A higher rated comment observed, "Cheating or not on benchmarks, their Exynos and Qualcomm chips they use in their Galaxy line up are getting beat on every benchmark by Apple's A7. So Samsung is playing catch up."




    10-3-13


    Source





     
  3. sparkyscott21

    sparkyscott21 Moderator Staff Member

    Following Tuesday's report that Samsung artificially inflates its benchmarking scores, well-respected hardware review site AnandTech has published evidence suggesting nearly all mobile manufacturers, with the exception of Apple and Motorola, use CPU/GPU optimizations to game benchmark tests.

    Samsung and other OEMs use a variety of methods to enhance device performance when a benchmark is detected. For example, with its Galaxy S 4, Samsung raised its thermal limits (and maximum GPU frequency) to get an edge on certain benchmarks, and also raised its CPU voltage/frequency to the highest state whenever a benchmark was sensed, a tactic also employed by manufacturers such as LG, HTC, and ASUS.

    In the table below, AnandTech highlights devices that detect benchmarks and immediately respond with maximum CPU frequency.





    [Image: AnandTech's table of devices that detect benchmarks and immediately jump to maximum CPU frequency]




    With the exception of Apple and Motorola, literally every single OEM we’ve worked with ships (or has shipped) at least one device that runs this silly CPU optimization. It’s possible that older Motorola devices might’ve done the same thing, but none of the newer devices we have on hand exhibited the behavior. It’s a systemic problem that seems to have surfaced over the last two years, and one that extends far beyond Samsung.
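
    As an illustration of how a reviewer might catch this behavior, the hypothetical Java probe below samples the kernel's reported CPU frequency just after an app launches; run it once against the benchmark and once against a renamed copy of the same APK. The harness is invented for illustration, though scaling_cur_freq is a standard Linux cpufreq node.

    Code:
    // Sketch of a tester's probe: sample cpu0's reported frequency right after
    // an app launches. A device that whitelists the app by name pins itself at
    // maximum frequency immediately; a renamed copy ramps up and down normally.
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class FreqProbe {
        private static final String CUR_FREQ =
                "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq";

        public static void main(String[] args)
                throws IOException, InterruptedException {
            // Launch the benchmark (or its renamed twin) by hand, then run this.
            for (int i = 0; i < 10; i++) {
                String khz = new String(
                        Files.readAllBytes(Paths.get(CUR_FREQ))).trim();
                System.out.println(khz + " kHz");
                Thread.sleep(200); // sample every 200 ms for ~2 seconds
            }
        }
    }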

    AnandTech notes that it's a continual "cat and mouse" game discovering which devices have optimized for which benchmarks, since benchmarks known to be targeted must be set aside.

    The only realistic solution is to continue to evolve the suite ahead of those optimizing for it. The more attention you draw to certain benchmarks, the more likely they are to be gamed. We constantly play this game of cat and mouse on the PC side, it's just more frustrating in mobile since there aren’t many good benchmarks to begin with.

    There's no single solution here, but rather a multi-faceted approach to make sure we’re ahead of the curve. We need to continue to rev our test suite to stay ahead of any aggressive OEM optimizations, we need to petition the OEMs to stop this madness, we need to work with the benchmark vendors to detect and disable optimizations as they happen and avoid benchmarks that are easily gamed.

    Despite all of the effort that OEMs put into benchmark optimizations, the gains are negligible. The impact on CPU tests revealed a 0 to 5 percent performance increase, and a less than 10 percent increase on GPU benchmarks.




    10-3-13


    Source



     
  4. guest

    guest Active Member

    Joined:
    Aug 9, 2012
    Messages:
    756
    Likes Received:
    42
    Trophy Points:
    28
    These things have been going on for a very, very long time. Governments going back more than half a millennium were shaving and debasing gold coins. Here's another recent "scheme" to gain an economic advantage: MPG Fraud - McCuneWright, LLP - Attorneys at Law
     
  5. jonw747

    jonw747 Well-Known Member

    Joined:
    Sep 16, 2012
    Messages:
    1,127
    Likes Received:
    163
    Trophy Points:
    63
    So, this explains why the N4 never scored very well in benchmarks in spite of its terrific perceived performance. Still, gaming benchmarks has been going on for a long time. One answer has been to use actual apps in the benchmark, and to update the benchmark to include new apps on a regular basis.
     
