When It Comes to Wine, Love Is Blind

Very shortly after we released our first wines, we also attracted our first Facebook troll. Heather (I’m using her real name here because, why not?) commented “looks like shitty wine.” This was a curious statement on a couple of levels. For one, why would a stranger take time out of her day just to be rude? But more to the point, how can anyone claim to judge a wine by the way it looks in an online image? Had she poured a glass and observed signs of age, mistreatment, or poor color, then OK. But she had not. A few days later, she commented again: “Pretty pictures to sell shitty wine.” For Heather, the combination of classic art and wine in commerce was a turnoff. Fair enough. But not exactly fair.

We think the beautiful art on our bottles is a reflection of what to expect from the wine inside. But Heather sees it differently. So it should not surprise us that wine tasters (even the pros) make all sorts of judgments about a wine before the bottle is even opened: the producer, the region, the label, the alcohol content, the vintage, the weight of the bottle, the type of closure, and so on. Which is why we believe all professionals should taste exclusively blind (i.e., without knowing the source) when rating or reviewing, so as not to be swayed by irrelevant factors. Here’s a good summary of which publications do so, as well as a different take on some pros and cons of blind tasting for review.

The Emperor’s New Scores?

So why aren’t all professional tastings executed blind? There are various reasons, but it cannot be ignored that blind tasting results don’t always flatter the tasters. In fact, it has been shown many times that professional tasters can be quite subjective and inconsistent. The Guardian offers a fascinating summary of some revealing studies here. We’re not saying that all reviews and tasting results are useless, but with one substantial study demonstrating a swing of plus or minus 4 points for the exact same wine from the exact same taster (meaning a 90-point wine one day could be an 86 another day and a 94 the day after that), we think scores and reviews should be considered with a pinch of skepticism.


An admirable experiment from Wine Spectator magazine in 1996 illustrates the case. The publication placed two respected wine critics (James Laube, an authority on California wines, and James Suckling, an expert on wines from Bordeaux) in a room together to taste and rate wine simultaneously for a feature called “The Cabernet Challenge.” These two highly experienced tasters (with 31 years of professional experience between them at the time) reviewed the same wines from the same bottles in the same physical space at the same time. Of the 40 wines tasted, 5 received the exact same scores from both tasters on a common 100-point scale (meaning the experts agreed precisely 12.5% of the time). Another 10 wines were scored 1 point apart (25%), 8 were 2 points apart (20%), and 4 were 3 points apart (10%). In total, 67.5% of the wines were scored with no more than a 3-point difference. That’s pretty consistent, right?
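For anyone who wants to check the arithmetic, here’s a minimal sketch (in Python) that recomputes those percentages from the published counts; the bucket labels and variable names are ours, not the magazine’s:

```python
# Score-difference buckets from the 1996 "Cabernet Challenge"
# (40 wines, two tasters; counts as reported above).
TOTAL_WINES = 40
buckets = {
    "identical scores": 5,
    "1 point apart": 10,
    "2 points apart": 8,
    "3 points apart": 4,
}

within_three = sum(buckets.values())  # 27 wines
for label, count in buckets.items():
    print(f"{label}: {count}/{TOTAL_WINES} = {count / TOTAL_WINES:.1%}")
print(f"within 3 points: {within_three}/{TOTAL_WINES} = {within_three / TOTAL_WINES:.1%}")
print(f"4+ points apart: {TOTAL_WINES - within_three}/{TOTAL_WINES} = "
      f"{(TOTAL_WINES - within_three) / TOTAL_WINES:.1%}")
```

Run it and you get 12.5%, 25%, 20%, and 10% for the four buckets, 67.5% of wines within 3 points, and 32.5% (13 wines) diverging by 4 points or more.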

On the other hand, isn’t a 3-point spread fairly significant? All other factors being equal, if I’m presented with an 87-point wine and a 90-point wine, I know which I will select every time. For that matter, is even a 2-point spread insignificant? If you were choosing between two unknown producers of the same variety, same vintage, same region, and same price, would you not be drawn to a 90-point wine over an 88? At any rate, the two scores diverged significantly (by 4 points or more) for a third of the wines tasted. I’ve never met a wine drinker who wouldn’t think of an 86-point wine and a 90-point wine quite differently. Here are some divergent pairs of scores from the tasting:

[Image: table of divergent score pairs from the tasting]

Note that only 4 of the 20 California wines were given significantly different scores, while nearly half of the 20 French wines (9 of them) received scores 4 or more points apart. (If you have a theory about this, please email me because I’m stumped.) More interesting to me, the expert on each region consistently gave the higher score to the wine from that region (100% of the time for James Laube and 78% of the time for James Suckling). Surely that suggests a preference for whatever one knows best? Most important of all, there is a spread of as much as 9 points in these apples-to-apples scores. How crazy is that? Pretty. Unfortunately, the magazine has not since repeated such an experiment.

It’s Not Just About the Tasters

Established luxury wine brands have nothing to gain and everything to lose from blind tastings because they don’t get the benefit of their reputation. It’s not hard to imagine that some might discourage publications from tasting their wines blind, because the most expensive wines don’t always win in such conditions. On the other hand, a new winery like ours has everything to gain from blind evaluation (so we’re biased; who isn’t?). Consider a recent example: our $18 2013 Cabernet Manet received an “Outstanding” rating from a professional blind tasting panel, placing it between two better-known wines, one costing $160 and the other $100. We reveled in that, of course. The big names on either side of our affordable Cab didn’t get much out of it, I’d guess.

So What’s a Consumer to Conclude?

Scores have their place in a world where wine drinkers are swimming in options. If you’re not familiar with a region, producer, or style of wine, scores can help you get started (especially when they come from a professional taster whose past advice has suited your specific palate). But so can wine retailers you know and trust. Or friends, of course. At any rate, evaluating and scoring wine is inherently (at least somewhat) subjective, so we support those who do it blind, removing opinion-influencing factors that are not the wine itself. Blind is best!

 
NOT a good idea for visiting your local park. But the only way to taste wine for review.

If you really want to get into the weeds on this topic, the American Association of Wine Economists (yes, that’s a real thing) has you covered. Here are three in-depth studies on the reliability of wine scores.

Journal of Wine Economics article 1

Journal of Wine Economics article 2

Journal of Wine Economics article 3


Take a risk and try a few wines without knowing what they are or where they’re from. It might lead to a good wine surprise! And if you have one to share, be sure to email us.