08 February 2012

Spotting the Fakes

Brad Tuttle, Time Moneyland:

Bing Liu, a University of Illinois at Chicago computer-science professor, has made a habit of calling attention to fake online reviews. He is one of several experts trying to develop sophisticated detection software to rid the Web of “opinion spam,” as he calls it, which includes fake reviews, fake comments and fake blogs. Liu has estimated that for some products, it’s a safe bet that 30% of the reviews are fake. As things now stand, it’s easy to see why businesses are interested in pumping up online ratings, even if they have to resort to fraud. Liu told the Times:

“More people are depending on reviews for what to buy and where to go, so the incentives for faking are getting bigger,” said Mr. Liu. “It’s a very cheap way of marketing.”

A group of Cornell researchers is also working on algorithms that would out fake reviews. In their study, “Finding Deceptive Opinion Spam by Any Stretch of the Imagination,” researchers used their software to sift through a pool of hotel reviews — 400 fake, 400 real. The software detected the fakes 90% of the time, while a group of humans challenged with naming the fakes was correct only slightly more than half the time.
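To make that setup concrete, here is a minimal sketch (in Python with scikit-learn, and emphatically not the Cornell team's actual software) of how such a classifier gets built: train on reviews labeled truthful or deceptive, then ask it to judge new text. The handful of reviews below are invented for illustration; getting anywhere near 90% accuracy takes a real labeled corpus like the 800 reviews used in the study.

```python
# Sketch only: a bag-of-words classifier trained on reviews labeled
# truthful (0) or deceptive (1). The training data here is made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_reviews = [
    "The room was small and cost $189 a night, but the location on Michigan Ave was perfect.",
    "Check-in took twenty minutes; the front desk finally moved us to a quieter room on the 12th floor.",
    "This is the most amazing, absolutely wonderful hotel my husband and I have ever visited!!!",
    "Truly the best vacation ever, we will definitely be coming back with our whole family!",
]
train_labels = [0, 0, 1, 1]  # 0 = truthful, 1 = deceptive

# Unigram and bigram counts, TF-IDF weighted, feeding a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_reviews, train_labels)

print(model.predict(["An absolutely perfect, luxurious stay, my wife loved every minute!"]))
```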

One of the report’s authors, Myle Ott, spoke with me last week and explained that the software takes note of subtle signs that most people overlook. “Truthful reviews tend to have more punctuation, such as dollar signs, which indicate a specific that’d only be known to someone who has been there,” he said. “There are also more specific details, like the hotel location or that the room was small or large.”

Fake reviews, by contrast, tended to have more superlatives and adverbs in the writing (makes sense) and more details that were “external to the hotel,” such as whom the reviewer was traveling with. The fakes were also filled with pronouns, rather than proper names — because someone who had never been to a hotel wouldn’t know the name of the bellman or the woman at the front desk.
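Those signals are easy to count even without a trained model. Here is a rough sketch in the spirit of Ott's description, with illustrative word lists of my own choosing rather than the features Cornell actually used: dollar signs and other concrete specifics on one side, superlatives and first-person pronouns on the other.

```python
# Rough feature counts in the spirit of Ott's description; the word lists
# are my own illustrative guesses, not the features Cornell actually used.
import re

SUPERLATIVES = {"best", "amazing", "wonderful", "perfect", "incredible",
                "absolutely", "definitely", "ever"}
PRONOUNS = {"i", "we", "my", "our", "us", "me"}

def review_signals(text: str) -> dict:
    words = re.findall(r"[a-z']+", text.lower())
    return {
        "dollar_signs": text.count("$"),  # concrete detail a real guest would know
        "superlatives": sum(w in SUPERLATIVES for w in words),
        "pronouns": sum(w in PRONOUNS for w in words),
        "words": len(words),
    }

print(review_signals("The room cost $179 and overlooked the river; small, but clean."))
print(review_signals("My husband and I had the most amazing, absolutely perfect stay ever!"))
```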

While no major websites are using Cornell’s software, anyone who is interested can give it a try at ReviewSkeptic.com. It’s still marked Beta, but you can cut and paste any online review and the software will instantly do an assessment and state whether the review is truthful or fake. The site notes that it “works best on English hotels” and that it’s “currently offered for entertainment purposes only.”

The Cornell team’s latest research, which Ott wasn’t fully at liberty to discuss, expands well beyond online hotel reviews. It was prompted in part by the team’s discovery that doctors are soliciting fake online reviews.

“That was really disturbing,” said Ott. “The worst thing that could happen because of some fake reviews at a hotel site is that you spend the night in a bad hotel. But doctors? We’re talking really high stakes.”

According to Ott’s research, at least, doctors’ reviews don’t have the highest prevalence of deception online. That ignominious distinction belongs to sites such as Yelp and TripAdvisor, which allow anyone to post a review, without requiring proof that the reviewer has actually done business with the hotel or restaurant being rated. (Some sites, including Hotels.com and Priceline, only accept reviews from customers who have booked rooms.) Ott says that at the sites with the highest rate of deception, around 10% of the reviews are flagged as fake by the software.

TripAdvisor, which, interestingly enough, changed the slogan in its online reviews section from “Reviews you can trust” to “Reviews from our community” last fall, responded to the Cornell team’s findings by pointing to a survey it commissioned in which 98% of respondents said that TripAdvisor hotel reviews accurately reflected their experience. TripAdvisor also told me that “attempts to manipulate TripAdvisor are extremely rare,” and that “we have a zero-tolerance approach to all fraudulent activity and we have measures to penalize businesses for attempts to game the system, including affecting their popularity rating on the site and posting public warning notices on properties that have made attempts to manipulate their rating.”

Lots of people out there thinking about this problem.  Interesting that Cornell is now looking at doctors…
