Security

2M New Websites a Year Compromised To Serve Malware

SkiifGeek writes "Sophos claims that they are detecting 6,000 new sites daily that have been compromised to serve malware to unsuspecting site visitors, with 80% of site owners not aware that they have been compromised — though this figure is probably on the low side. With increasingly vocal arguments being put forward by security experts criticizing the performance and capability of site validation tools (though many of these experts offer their own tools and services for similar capabilities), and rising levels of blended attacks, perhaps it is time you reviewed the security of your site and what might be hiding in infrequently used directories."
Comments Filter:
  • by Anonymous Coward on Friday January 25, 2008 @10:48AM (#22181220)
    I would just run a perl script that does a regex on the access logs for anything that does not match the files that should be delivered to clients. Put the perl script in a cron job and let it run. Also take an MD5 hash of those files regularly and check for any changes to static files. And use very strong root passwords, don't let the root account log in remotely, and use ssh keys with no interactive logins.

    my 2 cents...
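
    A rough Perl sketch of the log check described above. The log location, allow-list path, and log format are assumptions for illustration, not anything from the comment:

        #!/usr/bin/perl
        # Sketch: flag requests in the access log for paths not on an allow-list.
        # File locations and the Apache-style log format are assumptions.
        use strict;
        use warnings;

        my $allowlist_file = '/etc/webcheck/allowed_paths.txt';   # one URL path per line (hypothetical)
        my $access_log     = '/var/log/apache2/access.log';       # assumed combined-format log

        # Load the allow-list into a hash for quick lookups.
        my %allowed;
        open my $al, '<', $allowlist_file or die "Cannot read $allowlist_file: $!";
        while (<$al>) {
            chomp;
            $allowed{$_} = 1 if length;
        }
        close $al;

        # Scan the log; report any requested path that is not on the list.
        open my $log, '<', $access_log or die "Cannot read $access_log: $!";
        while (<$log>) {
            # Match the request line, e.g. "GET /index.html HTTP/1.1"
            next unless /"(?:GET|POST|HEAD)\s+(\S+)\s+HTTP/;
            my $path = $1;
            $path =~ s/\?.*//;               # ignore query strings
            print "UNEXPECTED REQUEST: $path\n" unless $allowed{$path};
        }
        close $log;

    Run from cron, anything it prints gets mailed to the job's owner by default, which fits the "put it in a cron job and let it run" suggestion.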

  • by Smidge204 ( 605297 ) on Friday January 25, 2008 @10:52AM (#22181260) Journal
    I thought about this myself. One possible solution I considered would be to maintain a local list of the files on your server and their CRC/hash values. A script on the server would scan all the files and output a similar list that you could then check against your local copy, quickly identifying any new or changed files. This could be set up as a cron job for periodic scans, or run manually whenever you like.

    Might not be the best solution but it should be easy to implement. Larger sites can do incremental scans. It would be harder to detect corruption of databases, though, unless you know what to look for or have a concrete way of validating the contents.
    =Smidge=
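
    A rough Perl sketch of the manifest idea from this comment and the one above: hash everything under the web root and diff it against a previously saved baseline. The web root and manifest paths are assumptions:

        #!/usr/bin/perl
        # Sketch: compare MD5 hashes of files under the web root against a saved baseline.
        # The web root and manifest locations are assumptions.
        use strict;
        use warnings;
        use File::Find;
        use Digest::MD5;

        my $webroot  = '/var/www/html';                    # assumed document root
        my $manifest = '/var/lib/webcheck/manifest.md5';   # assumed baseline, md5sum-style lines

        # Load the baseline, if one exists yet.
        my %baseline;
        if (open my $fh, '<', $manifest) {
            while (<$fh>) {
                chomp;
                my ($sum, $path) = split /\s+/, $_, 2;
                $baseline{$path} = $sum if defined $path;
            }
            close $fh;
        }

        # Hash every regular file under the web root.
        my %current;
        find(sub {
            return unless -f $_;
            my $name = $File::Find::name;
            open my $fh, '<', $_ or return;
            binmode $fh;
            $current{$name} = Digest::MD5->new->addfile($fh)->hexdigest;
            close $fh;
        }, $webroot);

        # Report anything new, changed, or missing relative to the baseline.
        for my $path (sort keys %current) {
            if (!exists $baseline{$path}) {
                print "NEW      $path\n";
            } elsif ($baseline{$path} ne $current{$path}) {
                print "CHANGED  $path\n";
            }
        }
        for my $path (sort keys %baseline) {
            print "MISSING  $path\n" unless exists $current{$path};
        }

    The baseline could be produced once with something like "find /var/www/html -type f -exec md5sum {} +" and, as the comment suggests, the reference copy should live off the server, since an attacker who can rewrite your files can just as easily rewrite a manifest stored next to them.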
  • by Bongfish ( 545460 ) on Friday January 25, 2008 @11:10AM (#22181510)
    For this, you'd want to use something like Tripwire or AIDE. These tools have been around for years and will detect changes to files.

    You're right that it won't help you detect that somebody has managed to insert a chunk of javascript or PHP into your insecure MySQL/PHP web app, though. Perhaps a combination of Snort, Ntop (if it wasn't shit), a "hardened" PHP binary and config, and log monitoring would alert you in the event of an attack.

    The problem is that there's a lot of badly written or out of date software out there that can be exploited, even without discovering new holes. If you're running this sort of thing and making it publicly accessible over the net, somebody is going to take advantage of it.
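
    A rough Perl sketch of the kind of content scan that could complement Tripwire/AIDE for the injected javascript/PHP case. The document root and the signatures are illustrative assumptions, so expect both misses and false positives:

        #!/usr/bin/perl
        # Sketch: look for a few signatures commonly seen in injected code.
        # The document root and the patterns below are illustrative assumptions.
        use strict;
        use warnings;
        use File::Find;

        my $webroot = '/var/www/html';    # assumed document root

        my @signatures = (
            qr/eval\s*\(\s*base64_decode\s*\(/i,               # obfuscated PHP payloads
            qr/<iframe[^>]+(?:width|height)\s*=\s*["']?0/i,    # "invisible" iframes
            qr/document\.write\s*\(\s*unescape\s*\(/i,         # obfuscated javascript
        );

        find(sub {
            return unless -f $_ && /\.(?:php|html?|js)$/i;
            my $name = $File::Find::name;
            open my $fh, '<', $_ or return;
            while (my $line = <$fh>) {
                for my $sig (@signatures) {
                    if ($line =~ $sig) {
                        print "SUSPICIOUS: $name line $.\n";
                        last;
                    }
                }
            }
            close $fh;
        }, $webroot);

    It says nothing about content stored in the database rather than on disk; for that, the same patterns could be run over a database dump instead.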
  • by flydpnkrtn ( 114575 ) on Friday January 25, 2008 @11:25AM (#22181698)
    OK I know I'm feeding the trolls but you know you can choose to NOT see certain authors' stories under Preferences->Homepage, right?
  • by Anonymous Coward on Friday January 25, 2008 @11:26AM (#22181706)
    That's exactly what Radmind does:

    http://radmind.org/ [radmind.org]
  • by the_olo ( 160789 ) on Friday January 25, 2008 @11:30AM (#22181764) Homepage

    I thought about this myself. One possible solution I considered would be to maintain a local list of the files on your server and their CRC/hash values. A script on the server would scan all the files and output a similar list that you could then check against your local copy, quickly identifying any new or changed files. This could be set up as a cron job for periodic scans, or run manually whenever you like.

    Congratulations! You have just described Tripwire [sourceforge.net].

  • Radmind (Score:2, Informative)

    by fitterhappier ( 468051 ) on Friday January 25, 2008 @01:34PM (#22183582)
    Radmind: http://radmind.org/ [radmind.org]. Radmind is designed for exactly this purpose. It's essentially a Tripwire with the ability to roll back changes, or to capture them and store them for deployment to other systems.
