
Wednesday, April 28, 2010

AV-Comparatives Tests

Last updated on 30 Aug 2011

Whole Product "Real-World" Dynamic Test

Settings

We use every security suite with its default (out-of-the-box) settings. If user interactions are required, we will choose the default option.

Our whole-product dynamic test aims to simulate real-world conditions as experienced daily by users. Therefore, if there is no predefined action, we will always choose the same action when we consider the warning/message to be very clear and definitive.

If the message leaves the decision up to the user, we will mark it as such; if the message is very vague, misleading, or even suggests trusting e.g. the malicious file/URL/behaviour, we will count it as a miss, as an ordinary user would.

This year we will be stricter with required user decisions/interactions than last year. We consider “protection” to mean that the system is not compromised.

This means that the malware is not running (or is removed/terminated) and there are no significant/malicious system changes.

An outbound firewall alert about a running malware process, asking whether or not to block traffic from the user's workstation to the internet, is too little, too late, and is not considered protection by us.

For more details on the test methodology, refer to the AV-Comparatives website.



Tested products

The following products were tested in the official Whole-Product Dynamic main test series.

In this type of test we usually include Internet Security Suites, although other product versions also qualify, because what is tested is the "protection" provided by the various products against a set of real-world threats.

Main product versions used for the monthly test-runs:




Whole Product Dynamic "Real World" Test - Graph Bar



Note:
Blocked (Green) / User Dependent (Yellow) / Compromised (Red) - [all in percentage %]


Results:
01) F-SECURE: 99.2 / 0.2 / 0.6
02) SYMANTEC: 99.1 / 0.4 / 0.4
03) BITDEFENDER: 99.1 / 0 / 0.9
04) G DATA: 98.9 / 0 / 1.1
05) TREND MICRO: 98.6 / 0 / 1.4
06) PANDA: 98.6 / 0 / 1.4
07) ESET: 98.2 / 0 / 1.8
08) KASPERSKY: 97.7 / 0.9 / 1.3
09) AVAST: 97.1 / 1.3 / 1.6
10) AVIRA: 96.9 / 0 / 3.1
11) QIHOO: 96.1 / 1.3 / 2.6
12) SOPHOS: 95.6 / 0 / 4.4
13) AVG: 95.1 / 0.2 / 4.7
14) WEBROOT: 94.7 / 0 / 5.3
15) MCAFEE: 93.5 / 0 / 6.5
16) K7: 91.8 / 0.7 / 7.5
17) PC TOOLS: 91.2 / 6.7 / 2.1
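The table above can be re-ranked by an overall protection score. As a minimal illustrative sketch, the snippet below counts user-dependent cases as half protected; this half-credit convention is an assumption for illustration and may differ from AV-Comparatives' exact scoring.

```python
# Rank products from the blocked/user-dependent/compromised percentages above.
# Assumption (hypothetical scoring): user-dependent cases count as half protected.

results = {
    "F-SECURE":    (99.2, 0.2, 0.6),
    "SYMANTEC":    (99.1, 0.4, 0.4),
    "BITDEFENDER": (99.1, 0.0, 0.9),
    "G DATA":      (98.9, 0.0, 1.1),
    "TREND MICRO": (98.6, 0.0, 1.4),
    "PANDA":       (98.6, 0.0, 1.4),
    "ESET":        (98.2, 0.0, 1.8),
    "KASPERSKY":   (97.7, 0.9, 1.3),
    "AVAST":       (97.1, 1.3, 1.6),
    "AVIRA":       (96.9, 0.0, 3.1),
    "QIHOO":       (96.1, 1.3, 2.6),
    "SOPHOS":      (95.6, 0.0, 4.4),
    "AVG":         (95.1, 0.2, 4.7),
    "WEBROOT":     (94.7, 0.0, 5.3),
    "MCAFEE":      (93.5, 0.0, 6.5),
    "K7":          (91.8, 0.7, 7.5),
    "PC TOOLS":    (91.2, 6.7, 2.1),
}

def protection_score(blocked, user_dependent, compromised):
    # Full credit for blocked, half credit for user-dependent outcomes.
    return blocked + 0.5 * user_dependent

ranked = sorted(results.items(),
                key=lambda kv: protection_score(*kv[1]),
                reverse=True)

for name, (b, u, c) in ranked:
    print(f"{name}: {protection_score(b, u, c):.2f}")
```

Note how this scoring reshuffles the tail of the list: PC Tools' large user-dependent share lifts it above products with more outright compromises.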


The readings taken from the graph differ slightly from the results given in the table above.



Top 7: F-Secure, Symantec, Bitdefender, G Data, Trend Micro, Panda, ESET



Whole-Product “False Alarm” Test (wrongly blocked domains/files)

The false alarm test in the Whole-Product Dynamic test consists of two parts: wrongly blocked domains (while browsing) and wrongly blocked files (while downloading/installing).



Top 3: AVG, Bitdefender, ESET



Overall for Whole Product "Real-World" Dynamic Test
Bitdefender and ESET have among the highest detection rates and the lowest false-alarm rates.

Overall Test for Performance test and "Real-World" Dynamic Test:
ESET is chosen because it ranks in the top 2 overall in the Performance Test and has one of the lowest false-alarm rates and one of the highest detection rates in the "Real-World" Dynamic Test.

