Cornell Software Fingers Fake Online Reviews
Eric Smalley writes "If you're like most people, you give yourself high ratings when it comes to figuring out when someone's trying to con you. Problem is, most people aren't actually good at it — at least as far as detecting fake positive consumer reviews. Fortunately, technology is poised to make up for this all-too-human failing. Cornell University researchers have developed software that they say can detect fake reviews (PDF)."
First Review? (Score:5, Funny)
This topic is TOPS! It is a beautiful summary!!! I highly recommend it!!!
Re: (Score:1)
The title needs work. I spent about 15 seconds wondering what a "software finger" is.
Re: (Score:2)
"Come on, chop chop. We haven't got all day you know."
Re: (Score:2)
Re: (Score:2)
I don't know but I know now that they are faking online reviews.
Nice. (Score:1)
They also gave hints on how to defeat it.
power (Score:1)
if (review == negative && product_made_by == someone_we_received_cash_from)
review_fake = true
Re: (Score:2)
Thanks. I was wondering why it didn't compile.
Re: (Score:1)
Ship it anyway: We'll just buy some good buzz so it sells. You can fix it when the support calls come in.
Hugs and Kisses;
The Marketing Department
P.S. Your supervisor will be calling you shortly.
Re: (Score:1)
Re: (Score:1)
Re: (Score:2, Informative)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
I see now. I misread "they" for "me", so I interpreted the joke from the perspective of lazy scientists instead of unscrupulous scientists. For instance, the following might be a lazy scientist's code:
if (review == positive && product_made_by == someone_they_received_cash_from)
Re: (Score:3, Insightful)
What about fake reviews that are posted by the competition? I have a hard time believing that anybody posting fake reviews of their own products isn't doing the same to their competitors as well.
Ultimately, I tend to look for the reviews that are the most informative, things that look plausible and give me more information than what's in the listing. Usually those are fairly reliable as they're harder to fake. Not that it's a perfect system, but it is more time consuming to post a review like that since yo
Re: (Score:2)
Anyone with an IQ over 100 can filter stuff. I was looking for a particular camera and all of the 'negative' reviews were from people who expected it to come with more than a 32MB SD card or AA batteries. They were utterly crushed that their "christmas was ruined" over these details. That, or people who had a problem with Amazon rather than with any fault of the product itself.
There was a core group of middle-of-the-pack reviews that touched on all the positives and negatives and then the praising reviews that may
Re: (Score:2)
When looking at eBay sellers I don't spend much time on the positive reviews; I spend most of my time looking at the negative ones to get a better idea about the seller, mostly about how they handle dissatisfied customers. Oh, and by not buying from the Asia region. Yeah yeah, it's just that there are high leve
Re: (Score:2)
"Anyone with an IQ over 100 can filter stuff."
That's not true: we have inherent unconscious biases (M. Gladwell, et al.), and they're powerful enough to make us eat more food, buy more products, etc., because the "game" of selling to you capitalizes on your blind spots. Yes, you yourself have some subset of the human traits that make you a sheep to certain marketing campaigns. Moo.
(: There are two kinds of people in this world: you, who think you're immune from the science about human behavior ("Duh I filter all
How long has reviewer used product? (Score:2)
Agree. Also, there are too many reviews by people who have only used the product for less than a week, and YOU CAN'T TELL WHICH ONES THESE ARE (unless they state so). "I've played with this thing for one whole day now! FIVE STARS!!!!"
I would much rather have reviews from people who have used the thing for 3 months, to see if there are any late-showing
Re: (Score:2)
Re: (Score:3, Informative)
Not the GP here, different AC who mostly agrees with him. I also agree with you re: informative stuff. Both filters are useful alone, better in concert.
What about fake reviews that are posted by the competition? I have a hard time believing that anybody posting fake reviews of their own products isn't doing the same to their competitors as well.
It's simple economics. You spend your review-spamming money where you get the best ROI.
If I post a positive review for my product, I get all the gains, while the loss of sales is distributed over all my competitors _and_ the null competitor (i.e., people who wouldn't have purchased anything but for my review).
If I post a single negative review to a competit
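A toy back-of-the-envelope version of that ROI argument, sketched in Python (the numbers and the even-split assumption are mine, purely for illustration):

# Toy ROI comparison for review spam, under the assumption that sales lost
# by one seller spread evenly over everyone else in the market.
SELLERS = 10          # me plus nine competitors
SWAYED_BUYERS = 100   # buyers influenced by a single planted review

# Positive review of my own product: I capture every swayed buyer.
gain_self_promo = SWAYED_BUYERS

# Negative review of one competitor: that competitor's lost buyers scatter
# across the other sellers (and "buy nothing"), so I only pick up a slice.
gain_attack = SWAYED_BUYERS / SELLERS

print(gain_self_promo, gain_attack)   # 100 vs. 10.0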
Re: (Score:1)
Economics does not trump morality and ethics.
Re:read negative ones? (Score:5, Interesting)
That's why I appreciate Amazon's "verified purchaser" or whatever it's called. When a review is posted by someone who actually bought the product from Amazon, it shows up beneath the title of the review. By only looking at those, I'm able to eliminate a lot of the junk reviews. Now, nothing stops the competition from actually buying the product and then trashing it, but I can also look up other reviews by that reviewer. This gives me very good insight into the way that person thinks.
I suppose nothing stops a phony from creating an Amazon account, buying a bunch of products, and then buying the product he intends to trash with a bad review. But I figure that's not really likely. And I could probably figure it out by reading their previous reviews anyway.
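If you wanted to script that kind of filtering, a minimal sketch might look like this (the verified_purchase and reviewer_id field names are hypothetical, not any particular site's API):

# Minimal sketch: keep only verified-purchase reviews, then index each
# reviewer's history so it can be eyeballed for shill patterns.
from collections import defaultdict

def filter_verified(reviews):
    # Drop reviews not marked as verified purchases.
    return [r for r in reviews if r.get("verified_purchase")]

def reviews_by_author(reviews):
    # Group review texts by reviewer so their other reviews can be inspected.
    history = defaultdict(list)
    for r in reviews:
        history[r["reviewer_id"]].append(r["text"])
    return history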
Re: (Score:2)
Except that some of the reviewers receive free merchandise to review as a part of that program that Amazon has for their top reviewers.
Re: (Score:2)
Re: (Score:2)
Yes, but as long as the recipients of the free merch are duly noted, then there's enough transparency for me to make informed decisions.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Delicious copypasta.
I'm apparently pretty good at rating myself (Score:1)
I picked out the fake review immediately.
Re: (Score:2)
I did too! It's not hard to spot fake reviews!! I'll be sure to spot more in the future!
It's easy - fake reviews are overly enthusiastic, and some of them go as far as to slam other brands. Real positive reviews are usually more sedate, with little over-use of exclamation points!! and usually point out shortcomings of a product. Also, companies who astroturf usually submit multiple reviews, and they're usually posted in the same wording and typing style, which makes the fake reviews stick out even more. That's why
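Those surface cues (over-enthusiasm, near-identical wording across reviews) are easy enough to score mechanically; a rough Python sketch, with purely illustrative thresholds:

# Crude heuristic scorer for the cues mentioned above: exclamation-mark
# density and near-duplicate wording between supposedly independent reviews.
from difflib import SequenceMatcher

def enthusiasm_score(text):
    # Exclamation marks per word; higher means more breathless.
    words = max(len(text.split()), 1)
    return text.count("!") / words

def looks_copied(a, b, threshold=0.9):
    # Flags two reviews whose wording is nearly identical.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold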
Re: (Score:2)
I look for reviews that include both positives and negatives. Nothing is perfect, and most good reviewers will find at least one drawback or limitation in the object being reviewed. Reviews that are nothing but positives I don't give a lot of credit to - even if they are real, they're probably not being objective.
I know anytime I write a review, I give a quick breakdown, hit the positives, hit the negatives, and close with a summary. (a lot of professio
Re:I'm apparently pretty good at rating myself (Score:5, Insightful)
If I was being paid to astroturf a product, I'd prepare a number of generally positive and a few 'negative' reviews that pointed out a few flaws in the product. Of course those would only be flaws that the majority of people will either not care about, or will see as positive to themselves. The idea is to seed a small element of trust in the product, and take away the feeling of blind risk from the potential customer.
In some of those reviews I'd also mention another product or two that I felt were 'better', again only in some specific way that most people wouldn't care about. This seeds an element of legitimacy to the product, especially when comparing it against known good competition. The idea is not to get 100% of people thinking your product is the best, but to catch a large number of customers that would have otherwise never even considered your product.
The third sneaky thing that I'd do would be to review a competing product or two with a very minor mention of the product I'm being paid to astroturf. In that review I'd be generally positive, while at the same time throwing in a few questions that seeded some doubt and uncertainty about some seemingly important aspect of the competitor's product that my actual product reviews (and legitimate advertisements) had covered as being fully supported. In this way, the person 'researching' the general class of product is likely to do some further research into 'my' products - specifically searching for these 'missing' or 'doubtful' aspects that I've alluded to.
The above tactics are readily seen across the board in general advertisements where a company will harp on about some new 'feature' that they have - especially a feature that competing products don't officially advertise or mention - thus implying that the feature does not (or may not) exist in those other products. In reality the feature is a straw man style argument that compares apples/oranges in a way that less than 10% of the potential market would see as an invalid or incomplete comparison.
How does anyone know that there aren't people out there readily using any or all of the above tactics?
Re: (Score:2)
These are effective strategies that have been in use and working for some time. There have been a few refinements over the past decade. People often skip the middle of the review or zone out partway through. So when you're 'turfing a product with fatal flaws like for example a phone that can't multitask, has no apps and can't even set your "I like big butts" MP3 as a ringtone for your amply so endowed love interest like other phones can, that's a good spot to pretend those slights don't exist and instead go on about s
Re: (Score:2)
So when you're 'turfing a product with fatal flaws like for example a phone that can't multitask, has no apps and can't even set your "I like big butts" MP3 as a ringtone for your amply so endowed love interest like other phones can, that's a good spot to pretend those slights don't exist and instead go on about some minor flaw that absolutely nobody could care about before dismissing it as a minor issue not worthy of subtracting a star.
It's interesting that you point out well-known issues in the original iPhone that you feel are 'fatal flaws' in one of the most successful single products on the market at the time. By extension of its success, I'd say those features were significantly less important to a huge majority of Apple's target audience, and were in fact the minor and not the major flaws.
Now imagine that Apple's advertising agency had a number of astro-turfers out there specifically seeding those 'flaws' into general discussion alon
Re: (Score:1)
Congratulations - you've outed yourself. You're who we're talking about. That wasn't hard. For your sake I hope this is a throwaway account. I've friended you here and I hope all my friends will too, so we can see the nonsense you post. Now that you're stuck in it, do you see the trap? In hindsight it should be obvious. Those were features iPhone lacked about a billion phones ago, when they faced competition that didn't have those features, or lacked other critical features iPhone did have. They're al
Re: (Score:2)
Dude, feel free to hate Apple. And hate me too if you want.
But, I have no affiliations with Apple, Microsoft, Google, etc, and I don't work in marketing or advertising.
Yeah, WM7 phones had the same issues at launch, but I used the original iPhone as a counter-example because history has shown that at the time it was incredibly successful despite those things that you claim are major flaws. Those were all issues that were regularly echoed in this forum and on many other internet blogs and boards. Yet, people
Re: (Score:2)
Now you're posting as anonymous. Hehe!
dreadnoughts of Amazonia (Score:2)
Flawless astroturf is indistinguishable from hard work. It's certainly possible, but is it actually cheaper than not making a crap product in the first place? The makers of flawless astroturf are unlikely to be employed by Motel 69. I've heard it presently costs somewhere north of $500m to get a new drug approved by the FDA. The dreadnoughts of Amazonia are overstated.
I've argued several times recently for the virtues of pseudonymous pluripotency and against the consolidated identity of Google+. But t
Re: (Score:2)
The Slashdot comic-box gods are on to me:
How sharper than a serpent's tooth is a sister's "See?" -- Linus Van Pelt
One instance of the word "sample" in my post above could perhaps have been rendered as "wind-up" instead, rife with puns and elisions ... and Charlie Brown's pitching elbow. Makes for a bad sentence. The sibilants of Spain are mainly the same. Is that a linguistic fact? I wonder.
Apples vs Oranges (Score:1)
Trash the apples in your opponent's plan and present the oranges of your own plan as apples.
Re: (Score:1)
Re: (Score:1)
Oh yeah...honest reviews of sufficient length always have unrelated details. Some promotional slob in an office somewhere isn't going to go off topic. Ya know, crap like "I treated myself to the Sopranos when I stayed there!" or "The maid was hot."
So, if you're like me, and into writing fake reviews...(yeah yeah, I know). don't capit
Re: (Score:1)
On the other hand you had a 50-50 shot to start with and had the advantage of knowing that one was fake.
Re: (Score:2)
Re: (Score:1)
Now they have to make it recognize fake profiles on dating sites (grin)
Stupid project (Score:1)
Re: (Score:1)
I think the "undecidability" factor comes into play in any machine learning problem. For example, which email is spam and which isn't? But significant strides have been made toward solving that problem, and I don't see why online review systems are any different.
Re: (Score:1)
(How, in this day and age, do we still not have the ability to edit posts on Slashdot?)
Re: (Score:1)
You clearly didn't click on his link, where you would have been disabused of your notion.
Re:Stupid project - this is Spam! (Score:1)
Re: (Score:2)
Re: (Score:1)
I think it incorrectly states "fake" where it should really say "automated".
I would say that it's more of a conflict of interest than fake or automated. The reviewer may be a real person and he or she may have actually owned/consumed/used the product being reviewed. The issue is that the reviewer is in the employ of the product manufacturer or marketing agency hired by the manufacturer.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
It's an "undecidable" problem. Think for a moment how you'd define "fake".
Do you mean fake as in promises, or fake as in breasts?
Re: (Score:2)
How about untruthful? It's not decidable in the provable mathematical sense, but they appear to have a statistical classifier system.
As for how they defined "fake", they went to Mechanical Turk and deliberately paid people for "fake" reviews. As this is one particular behavior we wish to detect and punish, I don't care about the ontological arguments. The real problem is whether this can cope once shills use it to tweak their bullshit until it "passes".
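For the curious, the kind of statistical classifier being talked about can be sketched in a few lines; this is only a generic n-gram setup (the paper's actual features and model may differ), assuming scikit-learn and a labeled set of review texts:

# Minimal sketch of an n-gram text classifier for "deceptive vs. truthful"
# reviews. Assumes `texts` is a list of review strings and `labels` marks
# each one 1 (known fake, e.g. solicited via Mechanical Turk) or 0 (real).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

def build_detector():
    # Unigrams plus bigrams, a common setup for deception detection.
    return make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=2),
        LogisticRegression(max_iter=1000),
    )

def evaluate(texts, labels):
    # Cross-validated accuracy gives a rough sense of how separable
    # the two classes are on this data.
    return cross_val_score(build_detector(), texts, labels, cv=5).mean()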
It works... (Score:1)
Link to the paper (Score:3)
I think you want the link for the paper [aclweb.org], rather than the slides.
Re: (Score:1)
Arms race? (Score:1)
If this catches on it will just make the fake-reviewers work harder, but it won't, by itself, stop them.
Reputation-based reviews are probably coming. Of course, it's possible to create a fake reputation if you plan ahead of time.
The one on the right is fake or flake (Score:1)
The bubbly, over-the-top sound of the review on the right screamed "this is not the opinion of a person exercising sober judgment."
So, even if it wasn't a fake it wouldn't be useful.
The one on the right is formulaic enough that it *could* be fake, so I would need to rely on the reputation of the author or publisher. If I saw it in a major newspaper's web site with a byline of a newspaper employee or a reputable wire service, I would assume it was legit. If I found it on some random blog or a web site where
Re: (Score:1)
well, don't use the Ringblat 330x camera to take the pictures. I heard they are not very reliable. Try the new ZoxRoc Giggity-Pixel Extreme for super best results.
Negative reviews (Score:1)
This is why I look at the negative reviews first to see if there are any particularly frequent problems with the product.
Of course, a few years ago when I was apartment shopping online, I found that this tactic doesn't always work so well. On one side, you have the management companies posting shill positive reviews for their apartments, and on the other side, you have disgruntled evicted tenants posting overblown negative reviews for those apartments.
Just like Slashdot (Score:2)
and on the other side, you have disgruntled evicted tenants posting overblown negative reviews for those apartments.
All they do is gripe about crappy Windows.
Re: (Score:1)
Cornell Software Fingers Fake Online Reviews (Score:1)
It's easier than that... (Score:1)
it's *ALL* fake. Either pissed-off people or paid supporters.
Here's news (Score:1)
Let's be realistic.... (Score:3)
... what we know is that this is just another form of information security.
The people who produce false reviews will develop a tool that not only fakes the reviews, but then applies this exact software (in the article) to analyze them, and then provides logical adjustments until this exact software cannot discern the difference between the adjusted output and real reviews.
All these cold wars suck. A little honesty and integrity in the world would be great, but when capitalism pits us against each other to survive (not for wants, but for needs), this is exactly what you should expect. I prefer cooperation over competition, when it comes to survival, but if the social environment dictates that I must do bad to survive, I must survive.
If you respond, please don't be the shallow minded bum that thinks there isn't a causal relationship between what I just said and how people fake reviews (for revenue).
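For what it's worth, that arms-race loop is trivial to sketch; both functions here are hypothetical stand-ins, not any real released tool:

# Sketch of the arms-race loop: keep perturbing a fake review until the
# detector stops flagging it. Both helpers below are placeholders.
def looks_fake(text):
    # Stand-in for the detector from the article; here, a dumb heuristic
    # that flags reviews with too many exclamation marks.
    return text.count("!") > 2

def reword(text):
    # Stand-in "logical adjustment": tone the review down a little.
    return text.replace("!", ".", 1)

def launder_review(text, max_tries=50):
    for _ in range(max_tries):
        if not looks_fake(text):
            return text          # passes the detector
        text = reword(text)
    return None                  # gave up; detector still flags it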
Re: (Score:2)
when capitalism pits us against each other to survive (not for wants, but for needs)
http://en.wikipedia.org/wiki/Fundamental_human_needs [wikipedia.org] puts this statement in context. I can sympathise with your feelings, but in a 1st world country, the items on this list are available in abundance. It is true that there are some people who are unable to take advantage of this abundance due to mental illness and the like, but I don't agree that competition for resources is a large factor.
Rather, I think that perhaps rampant consumerism couches our wants in terms of needs. We follow a logical fallacy wh
Re: (Score:2)
I wish I had time to respond in full, but I don't right now (pretty busy with life, sorry about that because you deserve more from me).
But what I can say is that the needs I'm talking about are absolutely realistic, encompassing the complexity of society as it exists today and the intricacies that induce participation in competitive roles to fulfill needs. I could brainstorm many points in the fundamental human needs you posted that are not available by social means, especially once one considers relative d
Re: (Score:2)
Most Americans would become homeless after a year of unemployment, or even lack of full-time employment, thanks to inflated rent and real estate prices. The way things are going, this is a very realistic scenario for many people, and it scares them shitless.
In most other countries unemployment will make you uncomfortable, but in the US it means you are literally left to die on the street.
Re: (Score:2)
The people who produce false reviews will develop a tool that not only fakes the reviews, but then applies this exact software (in the article) to analyze them, and then provides logical adjustments until this exact software cannot discern the difference between the adjusted output and real reviews.
That's a very real problem. A few years ago, phishing sites could be distinguished from real sites by observing bad grammar, poor layout, and other indicators of low quality. Today, phishing sites look very much like real ones to humans.
For a few years, until 2008, there was the Web Spam Challenge [lip6.fr]. A large number of web pages had been classified by humans as "spam" or "not spam", and people ran classifiers against them to try to match the human judgement. That used to have some effectiveness, but the qu
How's it do on sarcastic reviews? (Score:1)
Re: (Score:2)
Re: (Score:1)
Kind of Cool (Score:2)
Verified owners... (Score:2)
Re: (Score:2)
Most product review shows on TV (Score:2)
But how well does it work? (Score:2)
Re: (Score:1)
Leverage slashdot moderation? (Score:1)
Using the very same method, we could leverage moderation logs to predict which comments are informative, funny, deceptive...
Please admins, where's the API?