A year of AV testing

April 14, 2014 · Security

A little under a year has passed since I took over the AV testing from AppleSerialNumberInfo.com. It has taken tremendous amounts of time and took a while to evolve into what it is today. I think I have the hang of it by now, though, and have figured out a good way to test thoroughly, reliably and frequently. To do this I had to drop the individual AV product reviews that focused on behavior, resource usage, interface and more, as they simply took up way too much time. By focusing purely on the detection of malware I am able to use my time much more efficiently and get much more testing done.

I appreciate all the feedback and help I have gotten from readers all over the world that continue to make this test better!

So, after a year, are there any trends? Have I collected enough data to definitively state a product is the best or the worst? Actually, I haven’t. But I will share what I have observed so far, taken from 22 tests done over the past year.

Top 3 AV:
The AV industry is constantly changing, so there is no definitive top 3. Looking at all the results and averaging things out, these are the names that have occupied the top 3, in order of frequency:
– Intego
– Panda
– Avast
– Sophos
– Avira

Intego had multiple products occupying spots in the top 3 for quite some time, and as Panda used Intego's databases, I suppose those results can count towards Intego as well. As rankings have shifted back and forth over time, a few things have become clear: Intego appears to be the best paid product. Other paid Mac AV products have not come close to scoring as well as Intego over the past year. Big names in the industry like Norton and McAfee have done poorly since the test started, and I am convinced they have no place in a Mac environment, at home or in a business. When it comes to free AV products, names like Avast, Sophos and Avira stand out. The well-known ClamXav scored poorly for a long time but has recently been improving a lot. While it has not made the top 3 yet, it may become a good free solution in the future.

All of this can change overnight, that’s why I do not feel a year of data is enough for a definitive answer. There are definitely trends like the ones I mentioned just now but there is no guarantee this will be the same a year or even a month from now.

Samples:
The number of samples used in the tests has greatly influenced the results. One day an AV would score great, only to take a significant plunge in rank when new samples were added. Starting out with roughly 250 samples, I have worked my way up to 420 (at the time of writing this article). Over time some samples have been removed and others were split up so that both archives and their contents can be tested.
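To illustrate why a growing sample set can reshuffle the rankings, here is a small sketch (my own illustration with hypothetical numbers, not the actual test tooling or results): a product that leads on the original ~250 samples can drop behind once 170 new samples are added, if it misses most of the new families.

```python
# Sketch: how adding new samples can flip a detection-rate ranking.
# "Product A" and "Product B" and all counts below are hypothetical.

def detection_rate(detected, total):
    """Percentage of samples a product detected, rounded to one decimal."""
    return round(100.0 * detected / total, 1)

# Against the original 250 samples, Product A leads.
before = {"Product A": detection_rate(240, 250),   # 96.0
          "Product B": detection_rate(230, 250)}   # 92.0

# 170 new samples are added (420 total); Product A catches only 60 of them,
# while Product B catches 160, so the leader flips.
after = {"Product A": detection_rate(240 + 60, 420),    # 71.4
         "Product B": detection_rate(230 + 160, 420)}   # 92.9

print("leader before:", max(before, key=before.get))  # Product A
print("leader after: ", max(after, key=after.get))    # Product B
```

The same products, tested a month apart with a larger sample set, can trade places entirely, which is why a single run says little on its own.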

I have definitely noticed a shift from obvious malware like Flashback and MacDefender to stealthier spyware, and adware is a lot more common now too. The focus has moved from tricking people into paying for a fake product to staying undetected as long as possible, mostly with targeted malware and spyware. The future of Mac AV should be quite an interesting one.

As the sample set grows, it becomes increasingly obvious (to me at least) that OS X's built-in defense against malware, X-Protect, is mostly useless. Most good AV vendors learn about new malware and update their definitions as soon as possible. Apple still takes its sweet time adding signatures to X-Protect, and a lot of them never even make it in there. Anyone unlucky enough to stumble onto a piece of malware before X-Protect is updated will be carrying that malware around for quite some time. Even once X-Protect is updated, it does not detect that a machine is already infected. X-Protect has a long way to go before it becomes useful in the fight against malware.

Other reviews:
Just do a search online and you'll find plenty of Mac AV reviews and comparisons. The vast majority of them recommend a specific product because of the features it has and/or because they get paid for the recommendation. Product X has this boatload of features and is therefore the best Mac AV! These reviews and comparisons are, in my opinion, useless. You want an AV that can detect and clean up malware, spyware, adware and all the other wares, right? So shouldn't detection be what matters in a test? I think so. Having a great firewall built into your AV is nice, but if the AV has poor detection rates, what's the point? As I have mentioned before, do your own research when looking for a good AV solution. Poke through all the garbage out there and you will find good tests and comparisons that focus on what matters.

Positive Feedback:
The majority of the feedback I get is from people who have switched from Windows to Mac and are wondering if they need AV. Other feedback is from readers who have used the tests to switch products or to convince their IT department to implement AV solutions. I also receive feedback from AV vendors about samples I use or results I have logged. If a sample turns out not to be suitable for the test and several vendors agree on this, I remove it. If a sample was marked as undetected while it should have been detected, the vendors will ask for specifics so we can figure out whether it was the database, the product or my test that didn't work as it should. Some vendors also use the test to add missing signatures to their databases, so I share samples and trace materials when possible or upload them to engines like VirusTotal and Metascan. Overall the feedback has been very positive and has led to close connections with the AV industry, security researchers and readers who contribute to the continuous improvement of the test.

Negative Feedback:
Oh yeah, there is some of the less fun stuff too. I have been accused several times (by both readers and vendors) of favoring some AV products over others because I am paid to do so. Readers assume this because other reviews out there focus on features and actually do receive compensation for a favorable review, so their results and top recommendations are usually very different from mine. If all these other websites recommend product X, why do I recommend product Y? I must be paid off to do so!

AV vendors that do not like my test, because it clearly shows their product is severely lacking, contact me with similar accusations. They quote me other reviews and tell me how many users around the world are "very satisfied with the performance of their product". These metrics mean absolutely nothing to me. Again, having a futuristic, shiny interface doesn't mean a product is actually good at detecting malware. Having millions of users doesn't mean that either; it just means all those users have either been lucky so far or have a malware infection they are unaware of. If a product receives a poor score or review from me, it is for one reason only: it performed poorly.

I wouldn't mind receiving some compensation for all the time spent performing these tests, and I may consider some affiliate programs in the future, but I am not being paid by any of the AV vendors in the test. I mention and recommend certain AV products because my test shows they are clearly superior at doing their job: protecting users from malware. If I ever join an affiliate program or receive payment from a vendor, this will be clearly mentioned on the site, and it will never affect my results.

Other feedback I have received concerns my testing methodology and how it does not live up to the AMTSO standards. I'll leave it to you, the reader, to head over to their website and read the documentation and guidelines they set for antivirus testing. I'll just say that following those guidelines and standards would require time, equipment, manpower and funding I do not have. I doubt any independent researcher has the means to live up to those standards. As I have told some of the critics: pay me enough to get the staff and equipment and I'll gladly do it. Naturally, I never hear back from them.

I try to make my tests as 'real world' as possible. The machines are set up the way most home users would have their Macs configured, and the AV products are configured for best performance, as they should be. The results are reliable and give a very good indication of a product's ability to protect you, something many readers and vendors agree with. There is always room for improvement, which is why I always listen to feedback (good or bad) and adjust the test where I can to accommodate the suggestions, which have been improving the test continuously since I started.

The Future:
I hope the feedback and suggestions never stop coming. These tests take up a lot of time, and it's good to know there is interest in the work; it keeps me motivated. I also hope more AV vendors want to get involved in the test. As I mentioned, quite a few are already working with me, and we help each other out so that both the test and their products benefit and improve. In the end it's the users of these products who benefit most, as they get the protection they want. A win for everyone 🙂

I am passionate about my blog, the topics I research and write about, learning new things all the time and, of course, the Mac antivirus test as a part of all this. In fact, I have just added a pair of brand-new SSDs to my Mac Pro that will host some of the virtual machines I use in testing. This will allow me to run more VMs simultaneously and get the testing done faster.


1 thought on “A year of AV testing”

  • Tom on April 14, 2014

    Jay,

    I have to say, I for one think Security Spread's AV test is the best Mac anti-malware test on the internet. Why? I get to see one of the biggest PDFs I have ever seen to date. LOL!

    I like to see all the malware families and files that are being tested. Other tests that just state "we tested 40 malware samples, and it looks like Intego caught all 40 and Avast caught 36" just don't cut it for me. I want to see the samples and what families of malware those files are in. Your test does that.

    While AV-Comparatives is the gold standard in Windows-based testing, I feel they completely fail at Mac AV testing. It always seems to be about telling the reader what "features" each product has. Using Avira for Mac as an example, this is all you get from AV-Comparatives as far as the malware test goes:

    “Malware protection/detection test
    Avira Free Mac Security detected all the
    malware samples, both Mac and Windows,
    used in our tests.”

    That’s it??

    Security Spread's test is 1000% better at giving us users solid data points and a way to really compare the different programs.

    I for one think you should put a two-window side bank of ads so "we users" can log into our "guest accounts", follow the ad links and spend some time on each, so you get some $$ for all your hard work.

    Great job Jay!! Your tests tell me more than any other tests done so far.

    Added note to Intego:

    I wish Intego would have added VirusBarrier X6's level of "logging" to their new X8 version, in an Advanced section. As far as design goes, X8 is still another "also ran", a boring, uninformative AV like the rest of the AVs out there. Adding an option to SEE what is being scanned would also be nice (a file-tree line like some others do and you used to do). Why Intego thinks hiding and reducing "features" is the wave of the future, I just don't get. Why be another "also ran" as far as design? I think Intego had the best-designed AV on the market with VirusBarrier X6. I sure wish the French CEO, who was pushed out, was back in charge; I bet we would have had a killer, feature-rich VirusBarrier X7 if he was still there.

    Thanks for the great work

    Tom
