Malware

Phishing: Is that Web site real or not?

The phishing is good, which is probably not what you wanted to hear. Let's take a look at why and figure out what to watch out for.

Phishing attacks rely on deception, pure and simple. Using realistic-looking but fake Web sites was one of the first techniques phishers employed. Eventually that approach became somewhat ineffective; the sites didn't look exactly right or the URL was wrong, alerting us to the deception.

The real thing

Phishers still use fake Web sites, but they've built a better mousetrap by altering official Web sites. How, you ask? It's simple: phishers leverage the same vulnerabilities that are used for Web site defacement and various other attacks. From their point of view it's a good approach, since there's no need to create anything, just alter what already exists. Besides, it's the perfect deception: the site obviously looks right and the correct URL is displayed.

The "how and why" Web sites are exploited is well documented, with leveraging weaknesses in PHP to gain a foothold on the Web server being one of more preferred methods. An example of this would be the vulnerability discussed in the National Cyber-Alert CVE-2008-3239:

"Unrestricted file upload vulnerability in the writeLogEntry function in system/v_cron_proc.php in PHPizabi 0.848b C1 HFP1, when register_globals is enabled, allows remote attackers to upload and execute arbitrary code via a filename in the CONF[CRON_LOGFILE] parameter and file contents in the CONF[LOCALE_LONG_DATE_TIME] parameter."

What makes this vulnerability unique is the developer's insistence that there's nothing wrong with the code. So they aren't going to change anything:

"Tough we do not intend to release a security fix for this issue at this time, we want to remind our users of the importance of disabling the "REGISTER_GLOBALS" option of their system. This option will not only enable this vulnerability to be exploited but will also open multiple breaches into your system. Note that if your system is configured properly (with "REGISTER_GLOBALS" disabled), this vulnerability does not apply to your website."

Kind of a strange statement from a vendor, but it's exactly what the bad guys like to see. As proof, I did a simple search and found several Web sites advertising exploit code for this vulnerability. I've linked one example that's published at the Milw0rm site.
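
To see why that advice matters, here is a minimal, hypothetical PHP sketch (none of it is PHPizabi code) of the kind of logic that register_globals breaks:

<?php
// Hypothetical sketch (not PHPizabi code) of why register_globals is
// dangerous. With register_globals = On, request parameters such as
// page.php?authorized=1 become PHP variables before any script logic runs.

function password_is_valid($user, $pass) {
    // Stand-in credential check, for illustration only.
    return $user === 'admin' && $pass === 'correct horse battery staple';
}

// The developer expects $authorized to be set only after a real login...
if (isset($_POST['user'], $_POST['pass'])
        && password_is_valid($_POST['user'], $_POST['pass'])) {
    $authorized = true;
}

// ...but with register_globals enabled, requesting page.php?authorized=1
// pre-fills $authorized, so this check passes without any login at all.
if (!empty($authorized)) {
    echo "Welcome to the members-only area.";
} else {
    echo "Access denied.";
}
?>

Setting register_globals = Off in php.ini (the default since PHP 4.2) closes this particular hole, which is exactly what the vendor is counting on.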

Current research

I've just finished reading a paper written by researchers Tyler Moore (CRCS Harvard University) and Richard Clayton (Computer Laboratory, University of Cambridge) titled "Evil Searching: Compromise and Recompromise of Internet Hosts for Phishing" (pdf). Don't worry about the title; the paper is a good read, shedding light on how effective altered Web sites are at stealing sensitive information. For example, one interesting statistic was the mix of compromised Web sites versus fake Web sites:

"By far the most common way to host a phishing Web site is to compromise a Web server and load the fraudulent HTML into a directory under the attacker's control. This method accounts for 75.8% of phishing.

A simpler, though less popular approach, is to load the phishing web page onto a ‘free' web host, where anyone can register and upload pages. Approximately 17.4% of phishing web pages are hosted on free web space."

Locating vulnerable Web sites

OK, we now know that phishers prefer to alter real Web sites and how they do it. The next question begging to be asked is how they find vulnerable Web sites. In reality, phishers don't have too much trouble. They use readily available scanners designed to check for PHP weaknesses. One example is the Web Vulnerability Scanner by Acunetix:

"The best way to check whether your web site & applications are vulnerable to PHP security attacks is by using a Web Vulnerability Scanner. A Web Vulnerability Scanner crawls your entire website and automatically checks for vulnerabilities to PHP attacks. It will indicate which scripts are vulnerable so that you can fix the vulnerability easily."

Still, most would admit that this type of scanning is slow and very inefficient, especially considering the number of Web sites in existence. Moore and Clayton's paper again sheds light on what phishers are using to make the locating process easier:

"An alternative approach to scanners, that will also locate vulnerable websites, is to ask an Internet search engine to perform carefully crafted searches. This leverages the scanning which the search engine has already performed, a technique that was dubbed ‘Google hacking' by Long.

He was interested not only in how compromisable systems might be located, but also in broader issues such as the discovery of information that was intended to be kept private. Long called the actual searches ‘googledorks', since many of them rely upon extended features of the Google search language, such as ‘inurl' or ‘intitle'."

The article that the above quote refers to is written by Johnny Long and titled "Google Hacking Mini-Guide". It's a treasure trove of information on how to maximize Google search instructions to get sensitive details about Web sites.

Let's see if it works. If you remember the PHP vulnerability described by CVE-2008-3239, the key search phrase would be "PHPizabi 0.848b C1 HFP1". I entered that phrase into Google, and after some digging to get past all the entries referring to the exploit itself, I found results that would definitely be of interest to phishers.
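
Site owners can also turn the trick around on themselves. The two searches below are illustrative only (example.com stands in for your own domain); they combine the operators Long documents with strings taken from the CVE entry above:

site:example.com "PHPizabi 0.848b C1 HFP1"
site:example.com inurl:v_cron_proc.php

If either search returns results, your site is advertising exactly the details a phisher goes looking for.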

Sidebar: It's not Google's fault

In researching this article, I quizzed some of my friends and walked away a bit surprised. A few remarked that Google is partially to blame for this. I totally disagree with that attitude and hope that you do as well.

Google provides a service that makes finding and retrieving data a whole lot easier. As you know I get on Google's case about storing this information safely, but totally acknowledge that their search engine is the best bar none. In my opinion, the problem lies elsewhere.

Nothing new

Using search engines to find vulnerable Web sites isn't new. What is new is the way Moore and Clayton were able to statistically link Web search results with the probability of a specific Web site becoming compromised. They accomplished this by using Webalizer, a program that creates reports from Web server logs. Of special interest to the researchers were the recorded search terms used to locate the Web site:

"In particular, one of the individual sub-reports that Webalizer creates is a list of search terms that have been used to locate the site. It can learn these if a visitor has visited a search engine, typed in particular search terms and then clicked on one of the search results.

In one of the slides from the paper (courtesy of Moore and Clayton), several of the Webalizer entries can be seen to match the search shown in the browser window.
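
To make the mechanics concrete, here is a minimal PHP sketch of the underlying idea (this is not Webalizer's code); it assumes an Apache combined-format access log at a placeholder path and pulls the search terms out of the Referer field:

<?php
// Toy version of what Webalizer's search-term sub-report does: when a
// visitor clicks through from a search engine, the query ends up in the
// HTTP Referer header, which the Web server writes to its access log.

$logFile = '/var/log/apache2/access.log';   // placeholder; adjust for your server

$lines = @file($logFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if ($lines === false) {
    die("Could not read $logFile\n");
}

foreach ($lines as $line) {
    // Grab the quoted Referer field and extract the q= search parameter.
    if (!preg_match('/"https?:\/\/[^"]*[?&]q=([^&"]+)/', $line, $match)) {
        continue;
    }
    echo "Visitor arrived via search: " . urldecode($match[1]) . "\n";
}
?>

A site owner who scans such a report for version strings or known exploit phrases gets an early warning that someone is sizing the site up.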

Key points of the report

So what does it all mean? Moore and Clayton have convincingly pulled the important data together and assembled it in a usable format, and doing so has turned up some interesting results. The following are two of the more notable points:

  • 90% of the Web sites in the study group were compromised almost immediately after suspicious search terms were found in the Webalizer report.
  • One surprising statistic was the rate at which sites were compromised multiple times. The report showed that almost 20% of infected Web servers were likely to become re-infected, but when Webalizer found suspicious search terms directed at a particular Web site, the chance of becoming re-infected jumped to 48%.

The fact that servers are being compromised multiple times is something I don't understand at all; that needs to be fixed. To that end, let's look at what the researchers suggest Web hosts do to reduce their risk.

Room for improvement

I hope Web hosting services take what the researchers learned seriously, especially the following suggestions:

  • Obfuscating targeted details: Suspicious searches would be less effective if identifying information, such as version numbers of the software being used by the Web server, were not publicized (a quick self-check sketch follows this list).
  • Suspicious search penetration testing: Motivated defenders could run searches to locate Web sites that appear vulnerable, warning their owners of the potential risk.
  • Blocking suspicious search queries: An alternative approach is for the search engines to detect suspicious searches and suppress the results.
  • Lower the reputation of previously phished hosts: In addition to flagging active phishing URLs, mark previously compromised hosts as risky due to the high likelihood of being compromised again.
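
On the first point, a quick way for a Web site owner to see what the server is advertising is to look at the response headers it sends, since version strings like "Apache/2.2.8" or "PHP/5.2.6" are exactly what suspicious searches and scanners key on. Here is a minimal PHP sketch (the URL is a placeholder, and it assumes allow_url_fopen is enabled):

<?php
// Self-check sketch: fetch your own site's response headers and report
// whether they leak server or language version details.

$url = 'http://www.example.com/';    // placeholder; use your own site

$headers = get_headers($url, 1);     // associative array of response headers
if ($headers === false) {
    die("Could not reach $url\n");
}

foreach (array('Server', 'X-Powered-By') as $name) {
    if (isset($headers[$name])) {
        $value = is_array($headers[$name]) ? implode(', ', $headers[$name]) : $headers[$name];
        echo "$name header advertises: $value\n";
    } else {
        echo "$name header is not sent (good).\n";
    }
}
?>

If the Server or X-Powered-By headers are leaking versions, directives such as Apache's ServerTokens Prod and php.ini's expose_php = Off trim them back.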

What can we do?

There are a few things that we as Internet users can do to protect ourselves. I've been suggesting that everyone use McAfee SiteAdvisor, and even Moore and Clayton mention it in their report. It works as a browser add-on:

"With SiteAdvisor software installed, your browser will look a little different than before. We add small site rating icons to your search results as well as a browser button and optional search box. Together, these alert you to potentially risky sites and help you find safer alternatives."

An alternative that's not as user-friendly is to visit the PhishTank Web site if there's any question as to whether a particular Web site is real, fake, or possibly compromised:

"PhishTank is a collaborative clearing house for data and information about phishing on the Internet. Also, PhishTank provides an open API for developers and researchers to integrate anti-phishing data into their applications at no charge."

The Anti-Phishing Working Group has a Web site that's full of good information and specifics as to what's going on in the world of phishing:

"The Anti-Phishing Working Group (APWG) is the global pan-industrial and law enforcement association focused on eliminating the fraud and identity theft that result from phishing, pharming and email spoofing of all types."

Final thoughts

All of us, businesses and individual users alike, are becoming very reliant on the Internet. So when something like phishing disrupts that trust, I tend to take it personally. Finding out that Web sites get exploited a second and third time just adds to the frustration. It's just not right. I'm not sure what to do about it, though. Do you have any ideas?

About

Information is my field...Writing is my passion...Coupling the two is my mission.

30 comments
seanferd

I've read about other PHP mega-vulnerabilities in the past, not to mention the consistent refusal of maintainers to secure their sites and turn off defaults on database engines and filter queries. I've accidentally found rather revealing results with search engines, and used dorking tools or references. And remember when Googling for web-enabled Security cams with unsecured configs and bad ActiveX controls was in? I just like the way you put this one together. Don't know how I missed it, but it was a good chaser for Chad's article I just read. Mydoom.FUD: a lesson in Fear, Uncertainty, and Doubt

howiem

Frankly, I could care less about whether a site is fake or not, since I only access banking sites by first doing some due diligence to make sure I get the correct site the first time. Then, after logging in the first time, I bookmark a random link to a https page within the secure site. Thereafter, I only use that bookmark to access the web site. Even when I get a legitimate email from my bank, I never click links in it, preferring to use the bookmark. As for the recommendation to use Site Adviser, I prefer to use Link Extend which combines the analysis from 8 site analyzers, including Site Adviser, WOT, Browser Defender and some others. Google Linkextend to find it. Most users will be better off if they are advised to focus on protection rather than analysis, and using bookmarks means they do not have to worry about fake web sites. A tool like LinkExtend will also increase protection by giving more options for warnings about bad sites.

Jaqui

PHP is not insecure in and of itself. It's the site scripts that do not have security in mind, and server side data verification that get exploited. Blaming PHP for poor coding on the part of developers [ Dreamweaver users usually ;) ] is like Blaming MS for an exploit in Acrobat that escalates into system access. No website scripting language is any more, or any less secure than any other. It is how the developer(s) using the language / technology handle security that is to blame. edit to add: Yes, many pre-written PHP functions are insecure, because the person that wrote them didn't pay attention to security, but that also applies to perl, python, ruby, .asp [ VBscript ], .NET enabled languages .... Popular scripts, like Joomla and Drupal, will be gone over by the criminal types looking for bad coding habits. This means even the pre-written functions available in PHP.

The Scummy One

These things are always changing, it is good to have up to date info. Thank You for this!

santeewelding

Certain features of Sharia law come to mind.

Michael Kassner

Is important, so thank you. I also thought Chad's article was excellent. Not sure if you are interested in Twitter or not, but I tweet when I have a new article out.

Michael Kassner

Mentioning LinkExtend. I will have to check that out. It seems that you have fake sites covered, but altered real Web sites should still be a concern.

Michael Kassner

I somewhat feel that the vulnerability I used as an example is an issue with PHP. Developers have suggested that the default be the disabled condition: "Tough we do not intend to release a security fix for this issue at this time, we want to remind our users of the importance of disabling the "REGISTER_GLOBALS" option of their system. This option will not only enable this vulnerability to be exploited but will also open multiple breaches into your system. Note that if your system is configured properly (with "REGISTER_GLOBALS" disabled), this vulnerability does not apply to your website. Please read through the documentation at PHP.Net to check if your system is configured properly. Contact your system administrator to take appropriate actions in order to void this vulnerability if your system has the "REGISTER_GLOBALS" option enabled. This vulnerability affects all versions of PHPizabi 0.8 to HFP3 SF1 (included). There is no security fix pack release planned at this time." It would be to their advantage to do so it seems.

Michael Kassner

I was too general in that comment. Thanks for pointing that out.

Michael Kassner

Is that either these reports are getting better or I'm starting to understand better. I suspect the former.

santeewelding

With his one good, bulging eye, said, "You hain't taken to rum, 'ave you boy?!"

boxfiddler

Yellow light. Glad to see you got it off the ground. ;)

seanferd

Usually I would catch on to your articles, and any others I was interested in, right away. I don't Twitter, though, but I think it's pretty cool. I'd probably have just as much trouble following my interests on Twitter as I do otherwise. :)

howiem

Michael, If I use an https bookmark within the genuine web site to access the bank site each time, it redirects me to the genuine log in page. Are you saying that I could get redirected to a login page that has been altered. Wouldn't that mean that the entire bank site has been compromised? After all it seems like the https server would have to be compromised for that to happen. At no time do use an http address to access the bank.

lastchip

Little golden nuggets of information like this and the fact you are prepared to follow up on your articles, is invaluable. It prompted me to take a look at my own php configuration, and fortunately, register_globals was already off. But if it hadn't have been, it could have been a real life saver. Let's face it, few of us have the available time to consider every possible combination of security on our servers. I know we should, but in the real world, it just doesn't happen. What I need is precise information to go to exactly where I need to in the shortest time possible and this article serves that purpose admirably. Thanks again.

fatman65535

I think he is referring to the common punishment for thieves!

howiem

I have always understood that by accessing via a random https link, I would avoid this problem unless the https server is compromised. But I am not tech savvy to the extent you are. From what you have said, it means that the attacker would have to compromise the bank's https server, as how else would an attacker know that I was trying to access the bank except by setting up a redirect for every possible secure page within the entire bank site? I also change the random https bookmark from time to time. In addition I use no-script which claims to "sanitize" XSS attacks, openDNS, and keep the Firefox phishing protection on. In one recent event I did find that one bank I use, instead of redirecting me to their https login page, was redirecting me to an http login page. I have asked the bank about this and am waiting for a reply. No-Script alerted me to the possible XSS attack, saying it had sanitized it. In addition, to reduce the potential for an attack even further I have set up a separate sandbox for each bank site,and only opening one tab on the site, so only one tab/site is open in each bank sandbox. I am using Sandboxie for this.

Michael Kassner

I understand it. Phishers are able to find scripting flaws in the code on the real Web server and inject malcode. That's the whole point of using search engine to see if a Web server is using a certain version of PHP. The malcode could be such that it is logging your keystrokes and then sending that information to a remote location.

Michael Kassner

I'm just curious about why they are so adamant about it.

lastchip

I'd be delighted to offer an answer, but I've no idea. I guess someone with php programing knowledge may be able to help. If I were pushed, I would guess that it's some sort of generic entry point, but that is just that; a guess and I could be a million miles away from the truth!

Michael Kassner

Do you have any explanation as to why the developers wouldn't want to have register_globals off by default?

Michael Kassner

I find that they tend to jar a semi-old brain into new and exciting places.

santeewelding

You won't mind, once and again, a swift kick in the butt, then?

Michael Kassner

I did a great deal of research on the subject and it's fascinating. I need to leave my sheltered geek world more and Santee is pushing me.