
Almost all audio product reviews share the same conceit: the idea that the opinions of the person doing the review will correspond with yours. That's because almost all evaluations of audio products are performed by a single reviewer, with negligible, if any, solicitation or mention of others' opinions. The premise has always been that because the writer is an expert, he (or in very rare cases, she) will understand the product well enough to predict whether or not it will be a good choice for you -- or at least he or she will be able to describe the product's characteristics accurately enough for you to get a good idea of whether or not you'll like it.

In my 28 years of writing and editing audio product reviews for at least 16 different publications, I’ve worked closely with well over 100 writers and editors. I’ve also had a chance to evaluate the work of a couple hundred more, through reading their reviews of products I’d also tested. In that time, I’ve seen how much writers’ opinions of an audio product can vary -- and, sometimes, miss the mark entirely.


I’ve seen writers overlook obvious performance flaws that measurements or user reviews easily uncovered. I’ve seen them praise products that had severe tonal colorations -- which, again, were found in other reviews or in measurements. I’ve seen them proclaim one product clearly superior to another when the two were, in fact, nearly identical. I’ve seen them trash products that were very successful in the market and, by general consensus, skillfully and thoughtfully engineered.

In short, while I often find the opinions of audio writers interesting, I see no reason to trust them as a sole gauge of audio product performance. Sadly, that goes for my own opinions, too.

I’m fortunate in that early in my career, a wise and experienced person guided me toward a better method. That person was Lance Braithwaite, former technical editor of Video magazine (since absorbed into Sound & Vision), where I began working in 1989. One of my first assignments at Video was to write up a comparison test of VCRs that Lance had organized. He used a calibrated monitor and a professional-grade switcher, and brought in all the Video editors as well as a few freelance writers to give their opinions without knowing which VCRs were in use. This way, he could get multiple perspectives on what differences the participants saw, and how important those differences were to them -- and by extension, how important these differences would likely be to readers. We later used this same structure for countless multiproduct reviews of TVs, video sources, camcorders, and audio systems.


The benefits of this approach are obvious. If, say, four out of five panelists share a viewpoint about a particular product, you can make a pretty solid case for or against it. But if all you know is that a single person likes or dislikes that product, how do you know if he’s one of the four in the majority . . . or the outlier? You don’t.

Sadly, the reasons most publications don’t use multi-listener panels are just as evident.

The logistics alone can be daunting, especially in a time when audio publications make just a small fraction of what they did when I started in this biz. Writers or editors have to spend somewhere between four and 16 hours putting together each test, then enlist (and probably pay for) the services of nearby people who are interested enough in audio to spend an hour or two in the demanding, focused work of listening tests. It’s possible to ship small audio products around to gather multiple opinions, but again, it involves hours of the writer’s time and incurs significant costs.

More troublesome, though, is that the writer and editor, if they’re honest, lose some control over the review process. I’ve conducted and published somewhere around 100 panel tests (easily more than all other active North American audio writers combined), and I’d estimate that in about 75 percent of them, the results are different from what they’d be if only I had done the test. It always pains me to present those results -- but I always present them, knowing that the group opinion will better predict what you’ll think of the product.

So while I wish audio publications would do more multi-listener tests, I don’t blame them.

Fortunately, though, because headphones are small and easy to deal with, and because I live in a city with a large population of music and audio professionals, we've figured out a way for me to bring multiple listeners into our testing at SoundStage! Solo. In most of our tests going forward (beginning with last month's review of the Monoprice Monolith M650 headphones), I'll be gathering the opinions of two additional qualified listeners. I may use different listeners from month to month, but all will share a passion for music and audio. I'm hoping they'll all be younger than I am (so their high-frequency hearing is better), and that every test will include at least one female listener.

Ideally, I’d love to do these tests with a panel of a dozen outside listeners, but that would chew up far more time than I can afford, and more money than SoundStage! would prefer to spend, without any clear economic benefit to either of us. I’d love to do them under blind conditions, but with headphones, that’s extremely complicated and time-consuming, and the listeners can still be biased by the weight and feel of the headphones.

My “gold standard” for audio product reviews has always been double-blind tests with a dozen listeners, accompanied by measurements. Unfortunately, I’ve never had the resources to accomplish that, but I think this is a step in the right direction, and I hope you enjoy it!

. . . Brent Butterworth

