Cornell Software Fingers Fake Online Reviews
Eric Smalley writes "If you're like most people, you give yourself high ratings when it comes to figuring out when someone's trying to con you. Problem is, most people aren't actually good at it — at least as far as detecting fake positive consumer reviews. Fortunately, technology is poised to make up for this all-too-human failing. Cornell University researchers have developed software that they say can detect fake reviews (PDF)."
Re:read negative ones? (Score:3, Informative)
Not the GP here, different AC who mostly agrees with him. I also agree with you re: informative stuff. Both filters are useful alone, better in concert.
What about fake negative reviews posted by the competition? I have a hard time believing that anybody posting fake reviews of their own products isn't doing the same to their competitors.
It's simple economics. You spend your review-spamming money where you get the best ROI.
If I post a positive review for my product, I get all the gains, while the loss of sales is distributed over all my competitors _and_ the null competitor (i.e. people who wouldn't have purchased anything but for my review).
If I post a single negative review to a competitor, he takes all the loss, while the gains are distributed over all my other competitors and me and the null competitor (people who would have bought from my competitor, but now don't buy at all).
So in a case where there's me and one other competitor, and assuming positive and negative reviews have equal influence, positive reviews are slightly better, because they pull customers in rather than putting them off entirely. Equal influence is bullshit, of course: if my competitor is larger, a negative review there will be seen by more people and so be more effective, and many people do have a skeptical bias (as recommended by the GP and seconded by me), so negative reviews will carry more weight. This means in a two-seller market, negative reviewing is likely viable.
Still, if there's a "sufficient" number of competitors (depending on exact influence levels), I'm obviously going to see much more benefit per review from positively reviewing my own product than from negatively reviewing competitors. There are diminishing returns on piling up fake reviews for myself, though, so at some point a negative campaign becomes better ROI than further expanding the positive one; I personally suspect other advertising techniques will have better ROI before you ever reach that point, but that's just a hunch. I wouldn't go as far as suggesting negative reviewing has negative ROI, so it's still quite possible if you have a ridiculous advertising budget and have exhausted all higher-ROI approaches; it's just not prevalent in practice.
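The intuition above can be sketched as a toy model. This is my own construction (the uniform-splitting assumption, the `skepticism` factor, and all parameter names are mine, not from any study): displaced customers are assumed to split evenly among the remaining destinations.

```python
def gain_from_positive(pull=1.0):
    """My gain from one fake positive review of my own product.

    The review pulls customers in, and I keep the entire gain; the
    corresponding loss is spread over all competitors and the "null
    competitor" (people who otherwise would have bought nothing).
    """
    return pull


def gain_from_negative(push=1.0, n_competitors=2, skepticism=1.0):
    """My gain from one fake negative review of a single competitor.

    The targeted competitor takes the whole loss, but the displaced
    customers split evenly among the other competitors, me, and the
    null competitor, so I capture only one share. skepticism > 1
    models the bias that makes negative reviews more influential.
    """
    # Destinations for displaced customers: other rivals + me + null.
    destinations = (n_competitors - 1) + 1 + 1
    return skepticism * push / destinations


# Two-seller market with a skeptical audience: a negative review
# matches a positive one per review.
print(gain_from_positive(1.0))                                   # 1.0
print(gain_from_negative(1.0, n_competitors=1, skepticism=2.0))  # 1.0
# Crowded market: positive self-reviews win per review.
print(gain_from_negative(1.0, n_competitors=9))                  # 0.1
```

As the comment argues, once the number of competitors grows, my captured share of a negative review shrinks toward zero, while a positive self-review's gain stays constant.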