How to Properly Evaluate an APT Security Solution?

Posted by George Yunaev on 2015-01-08 15:30:00

As mentioned previously in Detecting Advanced Persistent Threats – myths and realities, the technologies used by some Advanced Persistent Threat (APT) security vendors may not deliver a good detection rate. Compounding the problem, most companies providing APT protection do not participate in the standard industry detection tests run by reputable companies. These tests are the main tool for measuring the effectiveness of a security solution, i.e., how well it prevents the penetration of modern malware.

Typical excuses are:

  • Our solution is optimized for real-world usage, not for tests. This excuse takes advantage of users’ potential ignorance with regard to how industry tests are performed. It gives the impression that industry tests are artificial and do not represent real-world use scenarios. However, most tests are designed to emulate exactly such scenarios. There are, for example, tests that emulate a user trying to download and run an infected file; it is close to impossible to get a more realistic use case than that.

  • Our solution should be set up differently to perform at its best. The test company sets up the solution according to the manufacturer’s manual. Testers may ask the vendor questions if they fail to configure the solution, but they will not ask for a special version or configuration designed solely for the test, as this would not reflect real use. They test the solution the same way a user would deploy it.

  • Those tests use outdated malware nobody has seen for ages, and we focus on new threats. This is another false claim. The test companies publish their test procedures, and AV-Test, for example, clearly states[1] that it uses only current threats, discovered in the wild during the last 6–8 weeks, in its tests.

  • Tests only measure whether the solution has a signature for this malware, and we are not a signature-based solution. This excuse is an attempt to distract the user’s attention from the issue. For the user, it does not matter whether the threat was detected by a signature, by generic detection, or by some other built-in functionality; it only matters that the threat was detected. This is exactly what the test measures: whether a solution detects a specific threat or not. No solution ever failed a test because it detected a threat without having a signature for it.

  • We specialize in detecting advanced malware, not some known, stale samples. Unfortunately, it is impossible to verify such a statement, as there is no definition of “advanced malware.” The test results only show whether the tested solution detects known, real-world malware that is threatening users at the moment; whether it also detects any advanced malware is not shown. More importantly, the number of advanced malware threats cannot be high. If we consider “advanced malware” to be something like Stuxnet or Flame, probably fewer than a dozen such new threats are created in a year. At the same time, AV-Test sees over 220,000 new malware samples each day[2]. Even if we assume that the APT security solution indeed detects those extra ten threats – which is, again, unconfirmed – the commercial value of such a solution would probably be limited.
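The scale mismatch described above is easy to put in rough numbers. The following back-of-the-envelope sketch uses only the figures quoted in the text (about a dozen Stuxnet/Flame-class threats per year versus roughly 220,000 new samples per day); the constants are illustrative estimates, not AV-Test data beyond the daily figure cited.

```python
# Back-of-the-envelope comparison of "advanced malware" volume
# against the overall malware stream, using the figures cited above.

ADVANCED_PER_YEAR = 12           # generous estimate from the text
NEW_SAMPLES_PER_DAY = 220_000    # daily figure reported by AV-Test[2]

# Total new samples seen in a year at that daily rate.
total_per_year = NEW_SAMPLES_PER_DAY * 365

# Fraction of the yearly stream that "advanced" threats would represent.
share = ADVANCED_PER_YEAR / total_per_year

print(f"New samples per year: {total_per_year:,}")
print(f"'Advanced' share of the stream: {share:.8%}")
```

Even with generous assumptions, the "advanced" threats amount to well under a millionth of the yearly malware stream, which is the point the paragraph makes about the limited commercial value of detecting only those.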

  • We specialize in zero-day detections, and this is not what is being tested. AV-Test, for example, runs a zero-day detection test[3], exposing the solution to threats never seen before to find out how it reacts to them.

As a general rule of thumb, make sure the vendors’ claims are supported by evidence, seek unbiased sources, stay up to date with the latest comparatives from third-party AV testing labs, and pay extra attention to false-positive rates. Learn more about the proper evaluation of APT security solutions from this whitepaper:

Advanced Persistent Threats

[1] "The most important category where the protective effect of products is concerned is the test against current online threats (protection against zero-day malware attacks from the Internet, inclusive of web and e-mail threats)”

[2] "The AV-TEST Institute registers over 220,000 new malicious programs every day.”

[3] An example of such a test result is AV-Test report no. 140687.

George Yunaev

George Yunaev is a Senior Software Engineer at Bitdefender. He joined the company's OEM Technology Licensing Unit in 2008, after working at Kaspersky Lab for seven years. Aside from developing SDKs for various OEM solutions, George also provides partners and prospects with useful insights into emerging threats and the potential pitfalls of technology licensing. His 19 years of software engineering experience also cover reverse engineering and malware analysis. He is based in Silicon Valley, California, and enjoys traveling and active sports such as skydiving and wakeboarding.

Topics: Threats, OEM Business