I have adjusted the testing methodology, and the changes are reflected in the latest PDF. The changes are:
– Results for the top-performing AV applications now include on-access / real-time scanning. In short, every AV that has a real-time scanning capability will show an extra column of results in the PDF. AV products that use only on-access scanners have not been tested for that feature. On-access scanning, as implemented by some AV products, requires the user to actually double-click a file before it is scanned. While this may be just as effective in the end as real-time scanning, testing it would take a ridiculous amount of time, time I do not have. This is why only AV with real-time scanners has been given the extra love: testing those types of scanners is much easier and less time-consuming. This is how I performed the real-time scanner test:
A brand-new VM was created running OS X 10.8.5 with the latest software updates, latest Flash and latest Java, 4 GB of RAM and 4 CPU cores. The AV product is then installed and updated to its latest version and definitions. The AV product is then configured for the best possible performance and the VM is restarted. After the restart, all of the malware samples are loaded onto the VM; a properly functioning real-time scanner should deny access and/or quarantine the malware before it has a chance to finish copying. Malware that does make it onto the VM is then selected and duplicated in the Finder so the AV has a second chance to detect it. When done, results are logged and a VM snapshot is made.
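The copy-then-duplicate workflow above can be sketched in a few lines of Python. This is a simplified illustration only, not the actual test harness: `fake_scanner` is a made-up stand-in for a real-time engine (it "quarantines" anything matching a toy byte signature), and the helper names and result labels are all assumptions.

```python
import shutil
from pathlib import Path

def fake_scanner(path: Path) -> bool:
    """Stand-in for a real-time scanner: 'quarantines' (deletes) any file
    whose bytes match a toy signature. Purely illustrative."""
    if path.exists() and b"EVIL" in path.read_bytes():
        path.unlink()
        return True
    return False

def run_realtime_test(sample_dir, target_dir, is_quarantined=fake_scanner):
    """Copy every sample onto the test machine, then duplicate any survivor
    (a Finder-style 'second chance'), logging the outcome per sample."""
    results = {}
    for sample in sorted(Path(sample_dir).iterdir()):
        dest = Path(target_dir) / sample.name
        shutil.copy(sample, dest)
        if is_quarantined(dest):              # caught during the copy
            results[sample.name] = "blocked on copy"
            continue
        dup = dest.with_name(dest.stem + " copy" + dest.suffix)
        shutil.copy(dest, dup)                # duplicate the survivor
        if is_quarantined(dup):
            results[sample.name] = "blocked on duplicate"
        else:
            results[sample.name] = "not detected"
    return results
```

In a real run the scanner acts asynchronously on its own, so the `is_quarantined` hook would simply check whether the file was removed or denied, rather than doing the deleting itself.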
The real-time testing is done only on samples, not on trace files. For trace files to be present the machine has to be infected already, which is not the case in the real-time testing scenario. Both tests combined (on-demand and real-time) should paint a clear picture of how an AV can protect currently infected systems and how a clean system can be kept clean.
– In the past, once testing had concluded, a snapshot of the VM was made so the test results and test environment could be recalled later if needed. Since testing began, the VMs have moved from a 1 TB to a 2 TB and finally to a 3 TB hard drive. With more frequent updates, the addition of samples, etc., the number of snapshots was getting ridiculous. I had imagined this would happen eventually, but not this fast.
Since late September I only save the latest snapshot of a tested AV product. When a new VM is created with an updated OS, plug-ins, infections and samples, the old one is destroyed. I do, however, keep all the log files of previous scans archived. If there was an inconsistency or problem in a previous test, the new test should correct it anyway and the PDF will reflect this, so there is no need to keep old snapshots around.
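The retention policy described here (keep only the newest snapshot, archive every scan log) can be sketched like this. It is a hypothetical illustration, not the actual tooling; the `rotate` helper, the directory layout and the file extensions are all assumptions.

```python
import shutil
from pathlib import Path

def rotate(snapshot_dir, log_dir, archive_dir):
    """Delete every snapshot except the newest, and move all scan logs
    into a permanent archive before the next test run."""
    snaps = sorted(Path(snapshot_dir).iterdir(),
                   key=lambda p: p.stat().st_mtime)
    for old in snaps[:-1]:                 # keep only the latest snapshot
        old.unlink()
    archive = Path(archive_dir)
    archive.mkdir(parents=True, exist_ok=True)
    for log in Path(log_dir).glob("*.log"):
        shutil.move(str(log), str(archive / log.name))
```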
– X-Protect is still present in the PDF. I have changed my mind a few times about removing it from the PDF, then adding it back again… Until I can make up my mind, X-Protect will remain listed. I've heard good reasons why it should not be included and good reasons why it should be. For a little while I was convinced it should be removed, because people told me it is not fair to compare AV products against a non-AV system. While this makes sense, I must also point out that one of those people now says "XProtect is essentially just a basic anti-virus scanner". This doesn't help my confusion 😉 X-Protect is an AV system (not a product); it was designed like one and acts like one. It is a signature-based list of malware. Does that make including it in the PDF fair? I don't know. Until I do, it stays 🙂 I will go through all the samples and mark X-Protect results only for executables and malicious scripts, things that I feel should be detected as they pose a risk to the user if they are allowed to run. Not sure when I will get around to this, though.
That's all for now. Thanks again to everyone who has provided feedback: F-Secure, Intego, ClamXav, Thomas Reed, Eddie, Al, and quite a few readers, friends and colleagues. Keep the feedback coming 🙂 And here's the latest PDF.