The underlying structure of a typical website is made up of different folders and sub-folders, much like the ones on your computer. A webmaster (is that term still in use?) transfers files back and forth with an FTP client in order to update the website.
In most cases, specific folders are created for specific reasons. For instance, the ‘pub’ folder is usually a public repository that anybody can access.
The directory structure of a website can reveal a lot about it, which is why site owners should hide it. Otherwise, anybody can see it directly from their browser:
Hiding your directories is not that complicated, and it is highly recommended. If you use a Microsoft server (IIS), here is a quick guide. If you are using Apache, you can create a file named .htaccess in the root directory and add a single line to it:
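The line in question is the standard Apache directive shown below. Note the assumption that your host actually honours .htaccess overrides (i.e. `AllowOverride` permits `Options`); some shared hosts lock this down, in which case you would set it in the main server configuration instead.

```apache
# Disable automatic directory listings for this directory and below
Options -Indexes
```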
This will turn off directory listing.
As you can imagine, loads of websites are not configured properly (alas!). Finding them is fairly easy with a simple Google query such as “index of”. As a matter of fact, this query is used a lot by people looking for ‘open’ FTP servers with, say, free mp3 songs or games…
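Spotting an exposed listing can even be automated. As a rough sketch (the marker strings below are just the common Apache defaults, which can be customised, so a match is a hint rather than proof), you could test the HTML of any page you fetch like this:

```python
def looks_like_directory_listing(html: str) -> bool:
    """Heuristic: does this page look like an auto-generated listing?

    Apache titles its listings "Index of /..." and usually includes a
    "Parent Directory" link. These are defaults, not guarantees.
    """
    markers = ("<title>Index of /", "Parent Directory")
    return any(marker in html for marker in markers)

# A fragment of a typical Apache listing page:
sample = "<html><head><title>Index of /pub</title></head><body>...</body></html>"
print(looks_like_directory_listing(sample))  # True
```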
Now what about the bad guys? Well, obviously they first check whether they can compromise a site, using, for example, an SQL injection. If they can put some malicious code in there, they may infect users browsing to that particular site. But there is another element to this. A lot of the time, the ‘exploit’ code that is injected into hundreds or thousands of regular sites calls out to another location where the payload is hosted.
The payload could be stored on each compromised site, but that would be more work. Or perhaps injecting the exploit code was successful, but the hacker does not have access to the FTP server where he would be able to upload the payload files. So it’s not unusual to see those files hosted on a dedicated server, although such servers are also easier to blacklist or shut down once spotted.
As we know, the bad guys like to be secretive about their operations, and they do not want to leave traces behind that could identify them. In that regard, hosting their scum on law-abiding citizens’ websites is actually pretty handy. In order to store their files stealthily, the bad guys like to copy them into the ‘images’ folder, because no one would really suspect it contains anything but images, right?
There are thousands of sites out there containing executables in their images folder. Now, this does not mean that these executables are all bad. In fact, most of the time site owners are just messy and put things all over the place.
Google lets you search for pages that list their content, and you can parse the results for pages that include “.exe” files.
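As a sketch of that last step, assuming you already have the HTML of a listing page in hand, pulling out the linked .exe files only takes a few lines. A regular expression is good enough for the simple markup directory indexes use, though a real crawler would want a proper HTML parser:

```python
import re

def find_exe_links(html: str) -> list[str]:
    """Return every href in the page that points at a .exe file."""
    return re.findall(r'href="([^"]+\.exe)"', html, flags=re.IGNORECASE)

# A fragment of a typical 'images' folder listing:
listing = (
    '<a href="banner.jpg">banner.jpg</a>\n'
    '<a href="setup.exe">setup.exe</a>\n'
    '<a href="Update.EXE">Update.EXE</a>\n'
)
print(find_exe_links(listing))  # ['setup.exe', 'Update.EXE']
```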
The following screenshots will show you some interesting things:
Both those files are Trojans.
In this one, we see malware files as well as exploit source code (c99.txt):
The following is an extract of nsTView, a toolkit hackers use to control the websites they have broken into. From it they can literally harvest everything, run shell commands, etc…
How do they get to work? Through a simple access page on the sites they took over:
A Google search for this web hacking kit shows that thousands of legitimate websites have been compromised:
The Internet is far from being a safe place. But as you can see, all of us play a role in its state. By leaving our websites poorly configured, or hacked and serving malware, we actually help the bad guys: we hand them the tools to perpetrate their crimes.
As security researchers, we can only hope to educate people about the dangers and responsibilities of owning a site, or even a simple PC. It’s not just about protecting your own interests (obviously that comes first, and kudos for making your PC secure) but also about not contributing to the problem. Did you know that 90% of spam comes from zombie PCs (regular Joes whose computers have been infected and are under the control of hackers)?