Tuesday, 8 October 2013

An informal anti-virus comparison

I use VirusTotal quite a lot for looking at malware and determining how difficult it is to detect, and over time I've built up a fair amount of data on which products perform well against the sort of malware that I throw at them.

This isn't a particularly scientific test: the malware I scan has a strong tendency to arrive by email rather than being a drive-by download, and the product settings in VirusTotal may not match typical settings when deployed.

The small print: Data is taken from the past six months and only products that have been active on VirusTotal for that whole time period are included. The scans are those that I took at the time, and they don't take into account that products would be updated and would probably catch the samples later (once they have infected your system). It also doesn't take into account that other components would be downloaded, some of which would subsequently be detected (again, once they have infected your system). Your mileage may vary. Other anti-virus comparisons are available.

So, which was best in this test? The full details are below, but the product that was clearly the best at detecting nastiness was Kaspersky, with a very impressive 73% of samples detected. McAfee (58%), Malwarebytes (53%) and Emsisoft (50%) were the other products that detected half or more of the 62 samples.

The hall of shame is pretty shocking. ClamAV, ViRobot and Antiy-AVL detected no samples at all. TotalDefense and TheHacker detected just one sample (1.6%). Fifteen products detected 10% or less.

The Kaspersky result was surprisingly good, but McAfee's showing indicates that this product has improved a lot over recent years, leaving arch-rival Symantec lagging with 34% detected compared to McAfee's 58%. SUPERAntiSpyware had a surprisingly low detection rate of 3.2%, considering that this is a product I often use for difficult tasks. F-Secure, Sophos, Trend and Norman all had disappointing results. But the results for TotalDefense were shocking, as this product is widely used by corporate customers and is the endpoint security business spun out of CA.. for a paid product it seems to be essentially worthless.
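For concreteness, the headline percentages are just detected-sample counts divided by the 62-sample set. A quick sketch of the arithmetic (the per-product counts here are back-computed from the quoted percentages, not raw data):

```python
SAMPLES = 62  # total samples scanned over the six months

def rate(detected, samples=SAMPLES):
    """Detection rate as a percentage, rounded to two places."""
    return round(detected / samples * 100, 2)

# Counts back-computed from the percentages quoted above:
print(rate(45))  # 72.58 -> Kaspersky's "73%"
print(rate(36))  # 58.06 -> McAfee's "58%"
print(rate(21))  # 33.87 -> Symantec's "34%"
print(rate(1))   # 1.61  -> a single sample out of 62
```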

The chart below shows the staggering difference in detection rates between the best and worst vendors.


Or if you prefer a table..

 
Product        Detection rate   Type
               72.58%           Paid
               58.06%           Paid
               53.23%           Free / Paid
               50.00%           Free / Paid
               48.39%           Paid
               48.39%           Corporate
               43.55%           Paid
               41.94%           Corporate
               38.71%           Corporate
               38.71%           Corporate
               37.10%           Free / Paid
               33.87%           Paid
               32.26%           Free / Paid
               32.26%           Paid
               32.26%           Paid
               29.03%           Paid
               27.42%           Paid
               27.42%           Paid
               25.81%           Paid
               24.19%           Free / Paid
               24.19%           Free
               19.35%           Paid
               19.35%           Paid
               17.74%           Free / Paid
               14.52%           Free
               12.90%           Free / Paid
               11.29%           Free
               11.29%           Paid
               11.29%           Paid
               9.68%            Corporate
               6.45%            Paid
               6.45%            Paid
               6.45%            Paid
               4.84%            Paid
               3.23%            Paid
               3.23%            Paid
               3.23%            Free
               3.23%            Corporate
               3.23%            Free / Paid
               1.61%            Paid
               1.61%            Paid
               0.00%            Corporate
               0.00%            Free
               0.00%            Paid


In my opinion, your anti-virus product should always be the very last line of defence. But that last line should at least be effective and it may well be time to switch if your vendor is sitting near the bottom of this list.

8 comments:

martijn said...

You know that Virus Total makes a pretty clear statement saying that their site shouldn't be used for comparative anti-virus tests.

I also don't agree that AV, especially in the way it is deployed at VT, should be the last line of defense. That last line should be something that prevents malicious activities from happening. Command-line AV scanning merely looks at the content of a file.

Full disclosure: I work for a company that performs comparative anti-virus tests. (I don't run these particular tests myself though.)

Conrad Longmore said...

Hmm.. there's a difference between "endpoint protection" which is a product that includes many elements, and signature-based AV scanning which is a lot dumber. A lot of people still rely too heavily on the last part, in my opinion.

It should also be noted that some of these products form PART of a whole defensive package, especially gateway products, and things like ClamAV also fall into that category a little.

I did say it was an informal comparison as well :)

martijn said...

Yeah, I know you said that :-)

It's just that.. well.. I think these comparisons don't really help anyone.

I think there is a serious question to be asked about how good anti-virus protects against real threats. The problem with these kinds of 'tests' (despite you being very clear about its limitations) is that "AV haters" are made to believe they're right for the wrong reasons. And "AV lovers" aren't forced to be self-critical because the methodology is flawed.

Jamie said...

The funny thing is Clam will catch stuff when it is in the .eml file that it doesn't catch as a zip or exe.

Stuff that the bottom 75% of that list doesn't catch at all.

Conrad Longmore said...

Yes, it's not as simple as a league table, and some products are great for consumers but are difficult to manage for corporates, and vice versa.

However, one key thing that is different in my methodology from a more formal approach is that the statistics are taken from emerging threats rather than established ones. It's all very well having a product that can detect malware that was written in the last couple of weeks; I'm concerned about malware that has been written in the past couple of hours.

I've had conversations with vendors who say "yes we can detect all of this" when you know for a fact that it just lets the bad stuff sail past and only gets around to detecting it much later.

I guess one interesting experiment would be to wait for the next zero detection binary and see how quickly the vendors react..
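One hedged sketch of that experiment: re-poll VirusTotal for a sample's report and log how the detection count climbs. This assumes VirusTotal's public v2 API (the file/report endpoint with "positives"/"total" fields in the JSON response); the endpoint and field names are assumptions here, and YOUR_API_KEY is a placeholder:

```python
import json
import urllib.parse
import urllib.request

# Endpoint and field names assume VirusTotal's public v2 API.
VT_URL = "https://www.virustotal.com/vtapi/v2/file/report"

def detection_ratio(report):
    """Pull (positives, total) out of a VT file-report dict."""
    return report.get("positives", 0), report.get("total", 0)

def poll_sample(api_key, file_hash):
    """Fetch the current report for a sample hash; None if VT doesn't know it."""
    params = urllib.parse.urlencode({"apikey": api_key, "resource": file_hash})
    with urllib.request.urlopen(VT_URL + "?" + params) as resp:
        report = json.load(resp)
    if report.get("response_code") != 1:  # 1 == sample known to VT
        return None
    return detection_ratio(report)
```

Calling poll_sample("YOUR_API_KEY", sha256) every hour or so against a zero-detection hash, and recording the positives count each time, would show how quickly each scan's detections rise.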

Conrad Longmore said...

@Jamie: I was surprised to see it down there. But this is one of the products you would apply as part of a layered defence.

PC.Tech said...

"... YMMV. Other anti-virus comparisons are available..."
You betcha, such as this one:
- https://www.virusbtn.com/vb100/rap-index.xml
... and this one:
- http://chart.av-comparatives.org/chart1.php
... which (for me, anyway) just adds to the "gray" area in all this.
It's been going on for years, these comparisons, and as long as there is competition, it will go on - confusing us all the more on this subject.

Steve Basford said...

Although not on VirusTotal you can add-on Sanesecurity signatures to ClamAV:

http://sanesecurity.com/usage/signatures/

rogue.hdb will detect the emailed zipped stuff, as well as the phish.ndb database.