Recently I have found it is getting harder to ensure that the latest malware definitions are really catching the latest problems. In the last three weeks we have found several pieces of malware or viruses which have had to be carefully removed by hand - with new definitions to detect them coming out 2 to 4 days after we had already discovered them.
Malware detection always lags behind the advent of new malware, as a new virus, spyware package or trojan, when it is released, normally has at least several hours', if not days', head start on the first definitions being released. In order for a new definition to be released the malware has to be noticed, caught, reported, analysed, and finally a fix / detection signature released for it. Then the update has to be downloaded by the end user.
Part of the process we employ when doing a "Virus Bust" is to run several anti-spyware / malware removal tools and rootkit detectors across a system. This of course is quite time consuming, and again - if the malware is a new one, sometimes the only way it is detected is by seeing the results of the malware still present (e.g. rubbish exiting the firewall, strange PC behaviour, pop-ups etc). Which started me thinking ....
Is it possible that the number of items of legitimate software on the average user's PC is growing at a slower rate than the number of malware instances? For example, the average user only wants to surf the net, send emails, write letters, do some word processing and listen to music or watch videos. Throw into that mix a bit of spreadsheeting, VOIP and games and you are still only looking at a fairly limited range of software.
In an average week the average user does not add much new software to a system. Microsoft updates and antivirus updates probably account for most of the changes to executable code on a system. Instead of scanning for malware, maybe a better solution would be to have a list of known good executable software and run a scan based on that. Any executable code found on a system that is not in the known-good DB can then be flagged as suspicious, and that subset of files scanned or isolated, instead of scanning an entire system of mostly good code for the odd piece of rot that has crept in.
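To make the idea concrete, here is a minimal sketch of such a white-list scan. The function names, the choice of SHA-256, and the shape of the known-good database are my own assumptions for illustration, not a finished design:

```python
# Sketch of a white-list scan: hash every file under a directory and flag
# anything whose hash is not in a database of known-good hashes.
import hashlib
import os

def file_hash(path, chunk_size=65536):
    """SHA-256 of a file, read in chunks so large executables need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(root, known_good):
    """Return every file under root whose hash is NOT in the known-good set."""
    suspicious = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if file_hash(path) not in known_good:
                suspicious.append(path)
    return suspicious
```

Building the `known_good` set from a clean reference system is the "certify the good code" step; the scan then only has to deal with whatever falls outside it.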
Security based not on positive detection of malware but on the isolation of unknown code offers the chance of quicker detection of potentially dangerous software on a PC. Certify the good code, isolate the unknown code, and then apply positive antivirus detection methods against the unknown executables.
Not only does this method have the possibility of being faster in its scanning of systems (creating and checking hashes is potentially faster than applying heuristic algorithms against an entire executable), but the ability to certify code as safe might alleviate some of the zero-hour threats we face nowadays. Certainly for someone like me, isolating the known good from the unknown means we can rapidly discard 99% of all files in a system as safe and concentrate on isolating the threats in the unknown one percent. It also offers a very positive way of providing reliable scanning from an alternative boot disk on compromised systems.
Historically, old antivirus systems (circa DOS and Windows 3.1) were able to add CRC codes or hashes to executable files and then check to see that files matched a known hash. That method presents problems today and has fallen out of favour. However, as an off-line virus scan, booted from an alternative operating system or boot disk and making use of a 'white list' database, it has the potential to add another tool to the security expert's arsenal.
Heaven knows we need it.
This has been a random thought from the fertile and over-caffeinated brain of Shane. Thoughts, feedback and offers of millions for the idea welcome.
Other related posts:
Further Cause To approach Virii With White Lists not Just Black Lists
Comment by xcubed, on 19-Sep-2008 16:34
The biggest issue with any scanner is that it needs to read the file from start to finish to determine if the contents are valid. It makes no difference if it is comparing a hashed result against a list of known good executables, or if it is comparing it 1:1 to another copy, it still requires the whole file to be parsed. Even if the executables were cryptographically signed, as many Microsoft executables are these days, you still need to hash the entire file to ensure it matches the signature.
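The commenter's point that the whole file must be read holds even for a simple hash check: a change to any single byte, anywhere in the file, produces a completely different digest, so a scanner that skipped part of the file could not tell a clean executable from a tampered one. A small illustration (the byte strings are stand-ins for real executable images):

```python
# Flipping only the final byte of a file changes its entire SHA-256 digest,
# so hash-based verification cannot safely skip any portion of the file.
import hashlib

original = b"MZ" + b"\x00" * 1024      # stand-in for an executable image
tampered = original[:-1] + b"\x01"     # differs from the original in one byte

assert hashlib.sha256(original).hexdigest() != hashlib.sha256(tampered).hexdigest()
```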
The other big issue is hash collisions. In many cases, especially with short hashes like CRC32, two or more different files can produce the same output hash. This increases the chances for malware writers to modify a known good executable to get it to do something it shouldn't, while still bypassing the scanners.
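How weak a 32-bit checksum really is can be shown with a quick birthday search: among random inputs, two with the same CRC32 turn up after only tens of thousands of tries, rather than the billions a 32-bit space might suggest. A seeded, repeatable sketch:

```python
# Birthday search for a CRC32 collision: draw random 8-byte strings until
# two different inputs produce the same 32-bit checksum. With only 2^32
# possible outputs, this succeeds after roughly 2^16 draws on average.
import random
import zlib

rng = random.Random(0)   # fixed seed so the search is repeatable
seen = {}                # crc32 value -> first input that produced it
while True:
    data = rng.getrandbits(64).to_bytes(8, "big")
    crc = zlib.crc32(data)
    if crc in seen and seen[crc] != data:
        a, b = seen[crc], data
        break
    seen[crc] = data

# Two distinct byte strings, identical CRC32 - a collision.
assert a != b and zlib.crc32(a) == zlib.crc32(b)
```

This is why a white-list database would need a modern cryptographic hash (SHA-256 or similar), where finding such a collision is computationally infeasible, rather than a checksum like CRC32.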
Ideally, the operating system would be stored on ROM so it couldn't be modified, and the operating system would refuse to execute code on any other media. However, this is still vulnerable if the OS is prone to buffer overflows and the like.
I don't think there's an easy solution to this problem.