Are Readers Really Abandoning Pitchfork and Gearslutz?

In this article we investigate claims that readership at the music review website Pitchfork and the audio forum Gearslutz has begun shrinking after many years of unprecedented growth. We turn to the numbers and ask whether they’re actually losing readers, what it might mean, and whether it’s truly possible to measure influence online.

The music review website Pitchfork and the audio review forum Gearslutz have a few things in common. They’ve both grown from humble origins to become dominant influencers of opinion in their fields.

Both sites command first-place results across a variety of common web searches in their fields. And although their sheer volume of web traffic casts a shadow over similar outlets, they manage to maintain their positions with a minimum of expense.

The New Economy of Music Writing…

Pitchfork, which according to Bloomberg Businessweek had estimated annual revenues of $5 million and 250,000 visitors per day as early as 2008, operates with a pretty lean crew. Some sources estimate these numbers had more than doubled by 2010, but Pitchfork’s current website lists just 4 full-time copy editors and only 2 staff writers.

The site’s small army of freelance contributors was rumored to earn $25 per review as recently as 2007, and in December of last year, Editor-in-Chief Mark Richardson said those rates had increased to a thrifty-but-worthwhile $75-$110 per review.

A small, random sampling of Pitchfork bylines suggests that the site’s average freelancer contributes less than one review per week, and occasionally goes a month without submitting any reviews. In an industry where trusted writers were frequently paid $1.50-$3.00 per word as recently as 2000, it’s clear that for all of its purported influence, Pitchfork is still a labor of love – at least on the part of the record reviewers.

…And Audio Writing

Gearslutz, for its part, doesn’t have a conventional staff to speak of. Instead, it acts as an online forum, cataloging the opinions and gear reviews of everyday audio enthusiasts. The website boasts over 140,000 registered members (not to mention casual browsers), and its homepage recorded peak traffic of over 10,000 readers online simultaneously in February 2011.

In a field where the most popular print magazines, Tape Op and Mix, attract only 50,000 monthly subscribers each, Gearslutz’ traffic is staggering, and its opportunities to sell ads that capitalize on this user-generated content are significant.

One of our sources (who preferred not to be named for this article) recently advertised on the Gearslutz forum and in the process, became very familiar with the site’s stated rates and traffic. He estimates their monthly revenue at over $10,000. Several third-party ad-ranking services estimate that the true value of the site’s traffic is higher, offering appraisals of over a quarter-of-a-million dollars per year.

The Decline?

That these two websites have detractors is nothing new.

Pitchfork has been an outlet that a great number of readers have loved to hate since the early aughts. And in recent years, Gearslutz has been on the receiving end of high-profile flak from some of the very industry veterans who once contributed much of the site’s most valuable content.

What is news is that for the first time, industry estimates and anecdotal evidence suggest that these outlets have been shrinking rather than expanding. But is it true?

Pitchfork by the Numbers.

If you have a look at this graph, the answer seems to be yes, Pitchfork is hemorrhaging readers like crazy:


Recent Pitchfork traffic as compared with two other music magazines for reference. Source:

But wait, not so fast: Just what is this graph, where did it come from, and how do we know if we can trust it?

This chart was created by Compete, a popular web-ranking service used by advertisers and webhosts to track performance of websites they don’t own. But is it accurate? Just how do they come up with their figures? We contacted them to find out.

In essence, Compete works in a way that’s similar to the Nielsen TV ratings. But where Nielsen uses a smattering of just twenty thousand or so families to draw conclusions about the viewing habits of well over 100 million TV households, Compete says they up the ante by maintaining a panel of 2 million web surfers to make estimates about the nation’s 200 million individual internet users.

“We then take this data,” says Brady Delahanty of Compete, “and use it to project the online behavior of the entire US. All of the data on Compete comes from US users, so the numbers only represent the US population; international traffic is not accounted for.”
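Compete’s actual model is proprietary, but the panel arithmetic Delahanty describes can be sketched in a few lines. Everything below — the panel counts, the scale factor, the example visit figure — is an illustrative assumption, not real Compete data:

```python
# Naive sketch of panel-based traffic extrapolation, in the spirit of
# what Compete describes. All figures here are illustrative assumptions,
# not real Compete data or methodology.

PANEL_SIZE = 2_000_000           # web surfers on the tracking panel
US_INTERNET_USERS = 200_000_000  # population the panel stands in for

def estimate_monthly_visitors(panel_visitors: int) -> int:
    """Scale the number of panelists who visited a site up to the whole
    US internet population, assuming the panel is an unbiased sample."""
    scale = US_INTERNET_USERS / PANEL_SIZE  # here, a 100x multiplier
    return round(panel_visitors * scale)

# If 60,000 panelists visited a site this month, the naive estimate is
# 6,000,000 visitors nationwide.
print(estimate_monthly_visitors(60_000))
```

Real panel estimators also reweight for demographic bias and smooth month-to-month noise; the point here is only the 100-to-1 extrapolation that makes the approach both cheap and, as we'll see, fragile.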

They do acknowledge that their stats have a weakness when it comes to niche sites with a low sample size. Smaller outlets, which draw tens of thousands of hits each month, could easily be under-represented in their sample. But a site the size of Pitchfork, they say, should be fair game.
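That weakness has a simple statistical shape: if each panelist is treated as an independent coin flip, the relative error of a site’s estimated reach grows as the site shrinks. The panel size and hit counts below are illustrative assumptions, not Compete’s figures:

```python
# Why small sites are noisy in a panel: the relative sampling error of a
# site's estimated reach grows as the site shrinks. Illustrative numbers
# only -- not Compete's actual panel size or methodology.
import math

PANEL_SIZE = 2_000_000  # assumed panel size

def relative_error(panel_hits: int) -> float:
    """Approximate relative standard error (std dev / estimate) of a
    site's reach, modeling each panelist as a Bernoulli trial."""
    p = panel_hits / PANEL_SIZE
    se = math.sqrt(p * (1 - p) / PANEL_SIZE)  # std error of the proportion
    return se / p

# A niche site seen by only 100 panelists swings ~10% month to month from
# sampling noise alone; a site seen by 60,000 panelists, only ~0.4%.
print(relative_error(100))
print(relative_error(60_000))
```

In other words, a chart for a small forum can wobble dramatically with no change in real readership, while a Pitchfork-sized sample should be comparatively stable — which is roughly the distinction Compete draws.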

Editor’s sidebar: It’s worth mentioning that feelings of schadenfreude for Pitchfork seem to run high, and the organization has been faced with similar graphs before. In 2008, still in the midst of Pitchfork’s meteoric rise, a blogger at a site called the Daily Swarm stumbled across a graph on the analytics site Alexa which showed Pitchfork suffering a precipitous drop by their page-ranking standards.

Pitchfork creator Ryan Schreiber responded in his genial midwestern manner:  “Not sure how Alexa clocks traffic, but Pitchfork’s internal metrics (Google Analytics, Mint, etc) show us up for the year to date: We’ve tracked nearly 6 million visits and 17 million page views in the past 30 days alone, which puts us just slightly above last May—the month Alexa registered as our ‘08 peak. Hope this helps.”

On closer inspection, it turns out that the blogger’s mistake was that he tracked results for, rather than the newly acquired domain, As links on the web converted over to the new domain, hits on the old address continued to drop. The trend on the domain was accurate. What this blogger missed was why.

This is one of the things we miss about newspapers. Full-time fact-checkers were once a gainfully-employed part of any decent media outlet’s staff. Watch out! They might just make a return someday soon, helping prevent facepalm moments like the one mentioned above. Let’s see how the current graphs hold up.

On a related note, the Alexa ranking service has come to be considered woefully inaccurate as an estimator of traffic. We’ll largely ignore their rankings in this study.

In addition to Compete, who have been on this side of the analytics game for some time, there’s a newer service that is fast becoming a major source for traffic estimation. Perhaps you’ve heard of them. They’re called Google. That’s right: in late 2008, Google Labs began testing its new Trends for Websites feature. It’s safe to say they’re one of the more reliable voices in all things search and web traffic, and they seem to corroborate the decline:



The Narrative Behind a Pitchfork Decline

These services have their critics, and their charts are by no means conclusive. At best, they point to the possibility of a trend and seem to support the anecdotal evidence surrounding Pitchfork’s alleged loss of influence.

One version of the story is that Pitchfork started to lose credibility in late summer of last year, after they booked Bon Iver, their highly-touted purveyor of sleepy and unremarkable background music, to headline their summer music festival in Paris. Pitchfork got their review in early, giving Bon Iver a 9.5 out of 10.

It’s worth noting that although many copycat outlets quickly followed suit, All Music, known for its measured, even-handed and optimistic reviews, eventually gave the album 2-and-a-half stars. Pop Matters gave it a 6/10. Audiences’ apprehension was echoed nicely by Alex Baumgardner in his recent piece “Did Pitchfork Kill The Music Critic?” published July 2011 in New City Music. Here, he conveys a quote from Sound Opinions host and outspoken Pitchfork critic Jim DeRogatis:

“It’s the web publication’s absolutist tone… mixed with its ambitions as a concert promoter that causes alarm. As the two grow more codependent on each other, it becomes difficult to tell when, and if, it is wearing the mask of a critical outlet or an indomitable, cash-hungry marketing organism… it causes one to ask whether or not a glowing review, say of the recent Bon Iver album (which was given a 9.5 out of 10), is a heartfelt piece of criticism, or a way to sell tickets for its upcoming Paris festival, which the band is headlining.”

That’s a fair point to make. Even if Pitchfork isn’t intentionally gaming the review system (and intentional gaming does seem doubtful), their commercial and political interests can still color their editorial leanings in insidious ways.

Where just a year ago, Pitchfork’s ratings influenced other reviewers, the music community seems to be increasingly tuning out their ratings. Pitchfork’s 5.4/10 pan of Clap Your Hands Say Yeah’s new album Hysterical was only mimicked by one major outlet this time around: Spin. With its surprisingly negative review, Pitchfork seems to be doing penance for leading the overhype parade for the band’s middling debut, causing waves of backlash and undue strain on the nascent group. A sea of other reviewers seemed to form opinions on their own this time, and the album received a variety of scores, almost all of them more positive. (And in my opinion, the positive reviews were deserved this time around.)

Pitchfork: Evil, Awesome or Innocuous?

Although the rapid decline these charts show is unlikely, some shrinkage might be necessary for the site to survive in its most natural form. When it gets too big, Pitchfork can appear to be an awful, elitist, bullying machine. But when they operate on a smaller scale, it’s easier to see the outlet for what it really is: The culmination of years of labor on the part of a geeky, affable kid named Ryan Schreiber as he sat hunched over a keyboard in his parents’ basement, writing poorly-crafted-yet-earnestly-glowing reviews of his favorite indie bands. His dream? That someday, he might actually get to meet a few of them.

Although Pitchfork often receives criticism for being “elitist”, if you really look at its roots, nothing could be further from the truth. Their writers’ early reliance on ten-cent words, confounding sentence constructions, and pompous posturing was never elitism: it was just the insecurity of unskilled or inexperienced new writers who had likely never read a writer’s style guide. To some commentators, it’s just bad writing 101.

Matthew Shaer described this phenomenon better than I can in his 2006 Slate profile, “Die, Pitchfork, Die: The indie music site that everyone loves to hate”:

“Schreiber has admitted that he trusts writers to “their own style and presentation,” but there’s not much that can excuse the writing in this 2004 review: “The epic ‘Visiting Friends’ gathers in faceless, mutated ghosts (i.e., oddly manipulated vocalizations from the duo) to hover over their dying fire in visage of nothing better than the tops of trees.”…

Clearly, this is prose that hasn’t been, like, edited. It’s dense without being insightful, personal without being interesting… Schreiber’s big wager is that music journalism should be an even more intimate affair than politics—that musical taste is deeply idiosyncratic and that writing about music requires writers who are closely in touch with what makes a band or a song matter to them.”

In “Gauging the Pitchfork Effect”, his 2007 article for the Brooklyn Rail, Matthew Ozga writes:

“Fine” [is the] mean of Pitchfork editorial content. Schreiber would be nuts to cough up any more than the twenty-five bucks he reportedly pays writers per review. Then again, you’d be even crazier to base a month’s worth of music purchases solely on Pitchfork’s recommendations. And you’d have to be downright touched in the head to pay twenty-one bucks for the Simian Mobile Disco album. Trust me.

Pitchfork’s writing has improved significantly in recent years, but the underlying culture remains. Their problems arise when that culture clashes with their new and unexpected role as an arbiter of mainstream taste. A publication with millions of dollars in revenue can’t act the same way as an iconoclastic upstart. If they’re to retain their current levels of readership and revenue, it will likely mean becoming part of a new establishment, raising the bar by increasing their writers’ pay to something that resembles a living wage. They’d likely have to offer more real journalism and continue to re-evaluate their editorial practices.

On the other hand, if Pitchfork does drop back down to its earlier levels of traffic and revenue, it would still be a media outlet of solid footing. And deservedly so. They’ve done a lot to get where they are today, even if they’ve made much of their fortune on the under-skilled, often under-paid, labor of a team of writers who may have had limited prospects in the days of traditional music journalism.

Although it’s still too early to tell which direction it’s headed, it’s likely that Pitchfork will have to continue to reinvent itself if it’s to avoid shrinking.

Editor’s Sidebar #2: Measuring traffic, let alone influence online, is not an easy task. The tracking services Alexa (which has lost favor with SEO experts) and Quantcast (which is gaining fans) tell their own stories. Both show more wild fluctuations in the short run than Compete and Google, but less pronounced changes in average traffic over time.

In Quantcast’s assessments, Pitchfork is down only slightly for the past few months, showing losses of about 10% during the same time period.

If Quantcast is the most accurate, it means that the Pitchfork story is not nearly as big as other trackers suggest, and the website could easily recover in time. What’s also surprising about the Quantcast data is that it doesn’t show nearly the same significant growth from 2008 through 2010 that every other tracker has. So if Quantcast is the tracker that has it right, does this mean that so much of the media had the story of Pitchfork wrong all along? And what would that mean for the future of Google and Compete?

So, which set of data is the most reliable?

It’s still picking up steam, but many experts are beginning to prefer Quantcast — at least in cases like Pitchfork’s. Quantcast says that when websites sign up to have their site “Quantified”, as Pitchfork has done, it’s the next best thing to internal Google Analytics. For non-“Quantified” sites like Gearslutz, however, critics allege that Quantcast’s results may be the worst of the bunch.

Although Quantcast is quickly gaining a reputation, some users report inconsistent results that suggest verified Quantcast numbers can be off by as much as 20% or even display inverse trends. Quantcast has also bolstered its marketing efforts by setting aside resources to become independently accredited by the Media Rating Council. That may set some minds at ease, but it’s also worth noting that the MRC enjoys the dubious distinction of having endorsed the woefully inaccurate and always controversial Nielsen and Arbitron rating systems.

Some webhosts show that Google and Compete can deliver trustworthy results. But they also have a vocal share of critics. We’ve even discovered signs that unexpected downward slopes like these may be endemic to recent results in Google Trends. Although the agreement between Compete and Google makes for a compelling case, based on competing evidence from Quantcast and the increasing criticism of sample-based tracking methods, we can’t use these charts to confirm that the growing narrative plays out in significantly reduced traffic for Pitchfork.

At best, these charts mirror the existing narrative. At worst, they may have helped create it. The one thing we are able to say for certain is that the industry needs better tools and greater consensus in estimating third-party web traffic.

Gearslutz: The Decline?

Like Pitchfork, Gearslutz began as a grassroots operation and slowly built a sizeable following. What began as a humble niche site gradually morphed into one of the most influential new voices in music technology.

While this sounds like the makings of a wholesome entrepreneurial fairy-tale, those of us who have seen Labyrinth will tell you that when the fantasy world and the real world intersect, things can get messy. Not surprisingly, Gearslutz is faced with some of the same problems that have plagued Pitchfork.

Labyrinth (1986): Trolls


The site is criticized for helping posters circumvent the basic tenets of journalism and civil discourse, and for generating tremendous revenue off of unskilled or inaccurate writers while hoarding the cash for itself. And, like Pitchfork, some insiders suggest that ethically questionable new strategies may be leading to a mass exodus.

The Narrative:

Gearslutz began to build its readership and revenue in the early aughts. Back then, it was one of the only games in town for online gear reviews.

Since the site is run as a forum, the writing was informal and prone to hyperbole, but that was okay. Readers were finding opinions and information they couldn’t find anywhere else online. Industry veterans started using the forum to kill some time online, while gear manufacturers and dealers started showing up to support and “pimp” their wares. Soon, Gearslutz became a huge catalog of information and opinion, a good amount of it generated by real working experts.

At the time, forums like Gearslutz were a breath of fresh air. Before Tape Op ascended to become one of the best and most popular voices in audio, and before Mix Magazine revamped their format to become more than just architectural porn, the world of commercial audio magazines was a bleak and desolate place.

Until Tape Op began to establish itself, the market never seemed to come up with a worthy replacement for the largely excellent Recording Engineer/Producer Magazine which ran from 1970-1992, at least not from a content perspective. The intervening years were filled with half-baked advice from out-of-touch writers with limited credentials and little valuable input. It was a baldly commercial, unsustainable market just waiting to be knocked over by better, more dynamic and more accessible sources on the web.

But the honeymoon couldn’t last forever. In the past several years, detractors have begun to suggest that even if readership at Gearslutz grows, it’s a bubble that’s filled with little more than hot air. As time wore on, their story goes, opinion on the site became increasingly anonymous and ill-informed. And as the most valuable and experienced contributors began to shy away, the site became a catalog of misinformation. The signal-to-noise ratio had plummeted.

To hear the busy professionals who have since left the site behind tell it, all that’s left is the blind leading the blind, and fewer of them each day. But is it true?

Gearslutz by the Numbers.

Here’s a chart from


The past 6 months of Gearslutz traffic, courtesy of

Once again, the data is not conclusive, but these charts do suggest that a sudden and significant drop occurred just after the winter in which Gearslutz introduced new, more restrictive policies, which have been blamed for driving away even more of its most valuable contributors.

Google tells a similar story. As we zoom out, the recent drop-off looks even less like a blip and more like a trend:

Gearslutz, zoomed out. Source: Google Trends For Websites

When we zoom out, the alleged drop-off in readership looks even more pronounced in Google Trends.

(Editor’s note: Once again, Quantcast and Alexa tell slightly different stories, showing wild fluctuations from month to month, with a less dramatic curve overall. Who has it right? In this case, Quantcast makes no claims to having verified data for the site. Quantcast’s critics say it may be among the least accurate of the major trackers when dealing with sites that haven’t signed up to be “Quantified”.

Our last remaining question is: could new media like the Kindle and iPad be further confounding the readings? There is the possibility that toolbar-based services like Compete could take an accuracy hit as users switch away from reading on their PC screens. We could not find a definitive answer for Compete or Google as of press time, but it is worth noting that Pitchfork reviews have no Kindle or app access, and Gearslutz launched its app in early 2010, a full year before any suggestions of sudden decline.)

Although not all services agree on the traffic, the analytic sites do seem to agree on what could be a worrisome demographic-shift for Gearslutz. The latest estimates suggest that the overwhelming majority of users are now males in the 18-24 age bracket, and that older users are shying away from the site in increasingly larger numbers.

The remaining demographic is not only potentially less valuable to high-end equipment advertisers, but these users’ eagerness to engage in anonymous knee-jerk debate can sometimes drag down the discourse, further amplifying the trend. When the opinions of the least experienced are given equal or greater airtime than the observations of the most experienced, it does seem reasonable to ask if there’s much value left in the platform.

Further complicating issues, the site’s newest policies seek to make Gearslutz an increasingly insular community, hostile to fact-based reporting and careful analysis.

“You Have Received An Infraction”

I first found out about Gearslutz’ new policies when I ran afoul of them myself.

Like many of my peers, I had shied away from the site over the years, but I’ve been known to poke my head in once every few months. I’m an avid reader and writer, so I’ve always liked sharing links to relevant stories. Someone had a question about copyright law? I had a particularly good link saved from a lawyer’s blog. More recently, I’ve been able to share my own. Someone wanted to know more about the production on the new Foo Fighters album? I had just interviewed the engineer.

When I launched the first issue of “Trust Me, I’m A Scientist”, I announced it on Gearslutz, among other forums. The response was overwhelmingly positive. Within a day there were over a thousand views and dozens of complimentary posts through Gearslutz alone. I was happy to see that so many searchers were ecstatic about the new material. They were vocal about it too, which I’ll admit was a nice ego-boost.

But then something unexpected happened. When I came back to the site days later, I found that not only the announcement, but all the links I had ever posted on Gearslutz had been taken down. I also had a message waiting from founder/moderator Jules Standen:

You have received an infraction at
Reason: “Fishing” / recruiting forum members to visit / join other audio forum / blog (not allowed)
This infraction is worth 1 point(s) and may result in restricted access until it expires. Serious infractions will never expire.

I imagined that there must have been some mistake. I wasn’t “recruiting” for a competing forum. There’s no way to “join” my magazine, unless you count signing up for once-monthly email alerts. I don’t even allow comments on the site. Many of the removed links didn’t even go to my own site! I was posting articles on his forum, getting people excited about new information and encouraging additional posts on his turf. Everybody wins, right?

When I asked about the shocking deletions, Standen replied briefly, suggesting that the links constituted spam. I reminded him that the dozens of posters who publicly praised the articles on his site would disagree. Spam, after all, is one of the few areas where online forums are remarkably good at policing themselves. Namely: they won’t stand for it. I received no reply to this message, or to further inquiries about the site’s policies.

I soon learned that, just before the turn of 2011, new rules were established and outside links were no longer welcome at Gearslutz. There seemed to be a sole exception in the audio shootouts section. Enforcement was taken seriously. Although all I received was a warning for a pretty routine posting, others were reprimanded more harshly for behavior that is generally considered normal on any virtual bulletin board.

Michael Joly, an inventive purveyor of mic mods, disappeared, and so did much of his posting history. A representative of Pro Sound Web was gone too. Among others, they had been banned. It was part of a housecleaning that affected many members who had shared links to outside information.

Apparently Gearslutz management felt other websites were getting too much free advertising out of the forum by using it as… a forum.

This message board, which has had traffic that perplexingly outranked the largest audio magazines, now appears to be unwelcoming to the idea that links to outside articles, images or information could lead eyeballs (and ad dollars) away from the site for even a moment. A first look at the numbers suggests this strategy may have backfired.

Gearslutz: Evil, Awesome, Or Innocuous?

While I may have my reservations about Gearslutz, I don’t think they’re trying to be unethical. In reality, their newest policies are ill-advised at best, and damaging to their own site at worst.

It’s the rest of us who do a disservice to audio discourse when we rely too heavily on forums as a source of information. The Gearslutz ethos seems to lie in emphasizing equipment over music, valuing knee-jerk chatter over verifiable information, and making artisans insecure about their tools instead of encouraging them to put their noses to the grindstone. If we decide to let these attitudes consume our profession, we can only place the blame on ourselves.

Some former users suggest that the site has had a detrimental impact on the economics of engineering. If that’s ever true, it would only be because they’re potentially diverting revenue that once may have gone to support an engineering magazine. A real media entity might have shared the wealth with a staff of editors, writers, and salespeople, creating valuable jobs in a sluggish economy. A commercial forum of this scale instead concentrates wealth and meaningful influence into the fewest hands possible. At their best, the industry magazines share well-researched articles and unique human stories, whereas Gearslutz has made it clear that they’re now committed to making sure new ideas stay out.

What surprised me the most in researching this article was that some members are afraid of openly speaking their minds when it means criticizing the forum’s policies. That may seem ridiculous when we step back and remember we’re talking about a geeky website about microphones, but the facts are there.

Recently, Sean Eldon of Mercenary Audio got into hot water with this virtual bulletin board for posting a humorous remark about it on an outside forum. He was quoting a colleague, who had quipped “Gearslutz: where the uneducated go to fight the misinformed.” In turn, he was told he was banned from the site, permanently.

(Editor’s note: After publicly griping about the ban on other forums, Eldon’s posting rights at Gearslutz were eventually restored.)

It makes me wonder if the TMimaS account on Gearslutz will face a ban as well, if only because we’ve published this article. To be honest, I’m not sure it really matters. With my research done, I have no plans to return to the site in the near future. If the numbers are right, I may not be the only one.

Have your habits concerning these websites changed? Do you think the cause of this downward trend lies in the sites or in the measurement tools? Do you have a different analysis or a better set of data for these sites? Let us know via email.
