Splogs Clog Blog Services 241

SuperWebTech writes "A new generation of spam has emerged lately in the form of automatically created spam blogs, or "splogs." One wily programmer manipulated Blogger's API to create a "spamalanche" of thousands of blogs whose sole purpose was to inflate the PageRank of the spammer's real sites. This clogged search engine results and filled RSS feed services with useless listings. Though Google, Blogger's owner, is doing its best to fix the problem, several services have in the meantime stopped listing any site Blogger hosts. So far nobody has found a solution."
  • Re:Username trend? (Score:3, Informative)

    by De Lemming ( 227104 ) on Monday October 24, 2005 @11:22AM (#13864050) Homepage
    That should read "Bayesian filtering" of course.
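
    For the curious, a minimal sketch of Bayesian (naive Bayes) spam scoring in Python; the training snippets and the decision threshold are invented for illustration:

        import math
        from collections import Counter

        def train(docs):
            # Count word occurrences across a list of documents.
            return Counter(w for d in docs for w in d.lower().split())

        def spam_probability(text, spam_counts, ham_counts):
            # Naive Bayes with add-one smoothing, assuming equal class priors.
            spam_total = sum(spam_counts.values())
            ham_total = sum(ham_counts.values())
            vocab = len(set(spam_counts) | set(ham_counts))
            log_spam = log_ham = 0.0
            for w in text.lower().split():
                log_spam += math.log((spam_counts[w] + 1) / (spam_total + vocab))
                log_ham += math.log((ham_counts[w] + 1) / (ham_total + vocab))
            # Convert the log-odds back into a probability.
            return 1.0 / (1.0 + math.exp(log_ham - log_spam))

        spam = train(["cheap pills online", "free poker cash"])
        ham = train(["great post about blogging", "thanks for the writeup"])
        # Hold anything scoring above ~0.9 for moderation.
        print(spam_probability("cheap poker pills", spam, ham))  # ~0.91, likely spam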
  • by digitalgimpus ( 468277 ) on Monday October 24, 2005 @11:24AM (#13864064) Homepage
    In hopes of not looking so spammy, they will take real blogs and either copy the contents or just key words (such as the author's name and perhaps the post title).

    So when you search for something... spam blogs carrying your name come up, rather than your own site.
  • by ianmassey ( 743270 ) * on Monday October 24, 2005 @11:27AM (#13864084) Homepage

    The problem surfaces when the "splogs" are used to leave comment spam and trackback spam on legitimate blogs. It's through these links that PageRank is increased. If everyone starts proactively dealing with spam on their own sites, this problem will solve itself. MovableType users can upgrade to 3.2, which has spam blocking features, or use the great plugin MT-Blacklist (the blacklist idea is sketched after this comment). Either will eliminate this problem. An AC mentioned that WordPress has a similar set of options. I know that TypePad does. The only major blog service provider left to come up with a solution is Blogger; in the interim you can require registration to post comments on your Blogger site, or turn comments off entirely. LiveJournal and all the clones are blocked from trackback by 90% of normal blog sites already, so they don't even count.

    Another poster suggested that we ignore this problem and it will go away. Untrue. Ignoring the 600 spam comments a day is exactly what the spammers would prefer you do, so that they can stink up every site on the internet with their crap. We are fortunate that in the case of this "new" form of spam, the tools necessary to get rid of it already exist and are effective; we just need to get them all turned on.
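
    A minimal sketch of the URL-blacklist idea in Python (not MT-Blacklist's actual implementation); the blacklist entries are made up:

        import re

        # Hypothetical blacklist of spamvertised domains; MT-Blacklist ships
        # a real, much longer list.
        BLACKLIST = [r"online-?drugstore\.example", r"texas-?holdem\.example"]
        PATTERN = re.compile("|".join(BLACKLIST), re.IGNORECASE)

        def is_comment_spam(comment_body):
            # Reject any comment that mentions a blacklisted domain.
            return PATTERN.search(comment_body) is not None

        print(is_comment_spam("Cheap meds at online-drugstore.example!"))  # True
        print(is_comment_spam("Great write-up, thanks."))                  # False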

  • by Animats ( 122034 ) on Monday October 24, 2005 @11:43AM (#13864223) Homepage
    Word verification is obsolete.
    • Programs have been written that can successfully decode CAPTCHAs most of the time. It turns out not to be too hard to modify OCR programs to do this.
    • Word verification can be outsourced to third world countries at low cost.
    • Most cleverly, word verification can be outsourced to users of your porno sites, who have to type in someone else's CAPTCHA to get free pictures.

    All these approaches are in active use.

  • by Myself ( 57572 ) on Monday October 24, 2005 @11:46AM (#13864254) Journal
    If someone's willing to pay for a higher search ranking, the spammer can pay humans to beat the CAPTCHAs. I can see it now, a sweatshop in a low-wage country with hundreds of workers monotonously typing in the text from the skewed and scrambled images.

    There's also PWNTcha, a CAPTCHA decoder. [zoy.org] (Previously slashdotted.)
  • Re:Capcha? (Score:2, Informative)

    by Cramer ( 69040 ) on Monday October 24, 2005 @11:51AM (#13864307) Homepage
    CAPTCHAs don't solve anything. 90% of them are easily decoded by software. (Software made them, software can decode them.) And as others love to point out, there are ways to get actual people to decode them for you. [However, I've never seen actual evidence of one of the "pr0n traps".]

    The only thing that appears to work is charging for new accounts. Yes, it's annoying. Yes, it will drive some otherwise-legit people away (because they don't use online payment systems, etc.). And yes, it's a hassle for the site. But, aside from stolen credit cards, there's no getting around it. (And very few spammers are willing to commit credit card fraud to increase their PageRank.)
  • by PeeAitchPee ( 712652 ) on Monday October 24, 2005 @12:48PM (#13864788)
    Maybe beatable, yes, but still 99%+ effective and definitely not obsolete in practice. Most of the successful existing CAPTCHA attacks use a dictionary matched to the default wordlist that ships with the CAPTCHA, and can usually be defeated by running the CAPTCHA in random mode with a few more characters than usual (sketched below). I get maybe four or five hand-entered spam comments a week, which are usually quickly blocked after the first attempt by blacklisting the target "online drugstore" / poker / whatever site's URL. If I shut my CAPTCHA off, I get *thousands* of spam comments a week. So while the technology has its limitations (such as excluding blind users), it's a tradeoff that most individual blog owners find beats sifting through hundreds or thousands of spammed comments a week.
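
    "Random mode" here just means challenging with a random string instead of a dictionary word, which defeats wordlist matching outright. A rough Python sketch; the character set and length are arbitrary choices:

        import secrets
        import string

        # Drop visually ambiguous characters (0/O, 1/I) so humans can still read it.
        ALPHABET = "".join(c for c in string.ascii_uppercase + string.digits
                           if c not in "0O1I")

        def random_challenge(length=7):
            # A few more characters than a typical dictionary word.
            return "".join(secrets.choice(ALPHABET) for _ in range(length))

        print(random_challenge())  # e.g. "X7KQ2MB"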
  • by Anonymous Coward on Monday October 24, 2005 @01:23PM (#13865039)
    While Google is the *best* commercial search engine, it completely ignores the most useful information, which can be found through "Invisible Web" research.

    Sure, if you wanna find this or that web site or some quick info, Google is great. But when you want to find something truly meaningful that you can use as a reference, try http://lii.org/ [lii.org] or http://dmoz.org/ [dmoz.org]. Of course this requires searching by subject (much like going to the library) and recognizing the set of terms you want to find. I just discovered that http://www.factbites.com/ [factbites.com] is a decent search engine that digs through other "invisible web" sites to deliver results.

    People really have to get out of this "Google or bust" mentality if they want to get any real research done.

    If you're *really* desperate for a commercial search engine, just go with www.dogpile.com; it compiles searches from Yahoo, Google, Jeeves, and MSN Search.

    P.S.: PageRank flaws like these are called "GoogleHoles," a term coined by Steven Johnson:
    http://slate.msn.com/id/2085668/ [msn.com]
  • by LocoMan ( 744414 ) on Monday October 24, 2005 @03:00PM (#13865784) Homepage
    You can already. Just add -site:(URL here, without the parentheses) at the end of the search, once for each site you want excluded from the results... :)
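
    For example, a query excluding two hypothetical splog domains would look like:

        some search terms -site:spamblog1.blogspot.com -site:spamblog2.blogspot.com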
