Reviews and substantial software

This uncomfortable but entertaining exchange between Leo Laporte and Mike Arrington got me thinking about software and hardware reviews in the computer/tech business and how I’ve long distrusted them.

This is the section of last weekend’s Gillmor Gang in which Laporte blew up at Arrington.

Laporte is pissed because he thinks Arrington is saying that Palm bought him off by sending him a free Pre. I think Arrington is getting at something slightly different: that Palm sent Laporte a Pre because it believes, for whatever reasons, that he will be predisposed to giving it a good review. Arrington did a pretty poor job of getting this idea across, and it’s not surprising that Laporte took it as an attack on his integrity. Further, Arrington’s “What are you going to do about it?” at 0:36 was inexcusably childish. Laporte was right: Arrington is an asshole.

The two have since exchanged apologies, which is too bad. A good feud can be fun to follow, and it might have led to more discussion of reviews in the tech industry.

My distrust of reviews doesn’t come from any sense that the reviewers have conflicts of interest--I accept that what the reviewer says is his or her honest assessment. I’m not as sanguine about the relationship between the product makers and the publishers of product reviews. Publishers, whether print- or web-based, rely on the advertising dollars of the products’ makers. Product makers need good reviews that they can use to push their wares.

But even this potential conflict of interest isn’t my main problem with software and hardware reviews. My main problem is that most reviews are written after the reviewer has had only a very short time with the product. The reviews then necessarily end up being more descriptions than critical assessments. In particular, for the software people use regularly--word processors, spreadsheets, databases, image editors, text editors, email clients; what I call substantial software--such reviews are almost always too superficial. They don’t give the reader a sense of what it’s like to live and work with these programs day after day. In fact, they can’t give the reader that sense, because nobody wants reviews of software to come out months after the software is released. And yet, for this class of application, that’s how much time is needed to really assess its value.

No doubt, there are some products for which a few days of use is sufficient. Most games, I think, would fall into this category, as would narrowly focused applications like Photoshop plugins. It’s also reasonable to believe that software updates can be assessed in relatively short order if the reviewer has been a regular user of the previous version. I have no doubt that someone who makes his living pumping out text in Microsoft Word N would need only a week or so to take the measure of Microsoft Word N+1.

Are there any good reviews of substantial software? Sure. When Andy Ihnatko discussed the word processor Scrivener in his Sun-Times column (which seems to have dropped out of the paper’s archives), it was after he’d used the product enough to know that it fit his needs; he’d actually switched to Scrivener from whichever word processor/text editor he’d been using before.

More often, though, the best discussion of the strengths and weaknesses of substantial software is not in reviews, but in the blogs and wikis of regular users. John Gruber mentioned this in a post last year. He linked to Alex Payne’s “How I Use TextMate” post, calling it an example of “experts writing about how they use their tools.”

Many Macintosh applications give you a 30-day free trial, which is great if

  1. you’re actually going to use it for many, if not most, of those 30 days; and
  2. you have enough background information on the application to really exercise it.

For substantial software, the “experts writing about how they use their tools” articles are what you need to have in hand to get the most out of your free trial.