
The Wall Street Journal’s Hit Piece Against Google

A few days ago, the Wall Street Journal published an article on how search engines, and particularly Google, work. The piece is titled “How Google Interferes With Its Search Algorithms and Changes Your Results,” and is subtitled “The internet giant uses blacklists, algorithm tweaks and an army of contractors to shape what you see.”

At its core, the article is a hit piece, and just about anyone who works in the SEO field knows it.

Unfortunately, not many lawyers are going to see through the smoke and mirrors. Worse, they’re going to see the article in a newspaper they’ve come to know and trust as reputable.

Here are three quick reasons why lawyers should ignore the piece, and maybe reconsider the trust they put in the WSJ. If you want more depth, Search Engine Land has published a deeper takedown of the article.


1) Their Search Engine Tests Were a Joke

The core argument of the WSJ’s article was that Google manipulates search results for its own gain, promoting content that aligns with its interests and burying content that does not.

The Journal tried to prove this by running search queries and tracking how results moved around on the search engine results page (SERP).

That experiment, however, included “17 words and phrases that covered a range of political issues and candidates, cultural phrases and names in the news.” The Journal tracked the results for those queries for 17 days.

Seventeen search queries.

Seven. Teen.

Google handles billions of search queries every day, but the WSJ thinks that the results of 17 of them can definitively prove that Google is a nefarious party manipulating reality for anyone who enters a search?

Even a lazy glance at other search engine experiments shows that legitimate studies use data sets that, well, aren’t puny.

We emphasize lazy glance here. The studies we turned up were among the first results of a basic online search; we barely tried. Most experiments consist of hundreds of thousands of data points, but when all we had to do was find more than a few dozen, literally any SEO study would suffice.

Oh, and those 17 data points that the Wall Street Journal tracked? They all came from a very particular subset of results dealing with news and politics, where freshness algorithms play an outsized role in promoting timely content over what’s gone stale. Yet the article extrapolates its findings to all search results.
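
To put 17 queries in perspective, here is a back-of-the-envelope sketch of our own – not the Journal’s methodology, and it generously pretends the 17 queries were even a random sample – using the textbook margin-of-error formula for a proportion:

```python
import math

# Back-of-the-envelope only: treats the Journal's 17 queries as if they were
# a random sample, and asks what +/- margin a proportion estimated from them
# (e.g., "share of queries where a result was buried") would carry at ~95%
# confidence, in the worst case (p = 0.5).
def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    return z * math.sqrt(p * (1 - p) / n)

print(f"n = 17:   +/- {margin_of_error(17):.0%}")    # roughly +/- 24 points
print(f"n = 1000: +/- {margin_of_error(1000):.0%}")  # roughly +/- 3 points
```

Any rate estimated from 17 queries swings roughly 24 percentage points in either direction; an estimate from a thousand queries swings about three. That gap is the difference between evidence and anecdote.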

2) The Reporters Botched Interviews Across the Spectrum of Experts

Several people were quoted in the WSJ’s hit piece.

One of them, Glenn Gabe, is an SEO expert and digital marketing consultant. The WSJ reporters contacted him in the spring, while they were still putting the article together. He spoke with them off the record to provide background information because, to him, “it was clear that the writer had a very limited understanding of how Google’s algorithms worked.”

Despite his insistence that he was speaking off the record, and his offer to talk later and provide a quote “if they needed one,” the Journal’s piece quotes Gabe. In that quote, Gabe refers to Google’s algorithms as “black magic” – something he insists he never would have said. According to him, he used the phrase “black box” in one of his phone calls with the reporters – a phrase he uses to describe what Google’s algorithms look like to business owners, who aren’t privy to how they make decisions.

When Gabe reached out to WSJ about the misquote, he was told by editors that it would not be fixed.

Another person interviewed by the WSJ team, Barry Schwartz, wrote the detailed Search Engine Land article about the WSJ piece’s problems that we alluded to earlier.

Schwartz is an expert on all things Google. We even listed his Twitter account in our short list of the most important outlets for SEO information.

Like Gabe, Schwartz was contacted by the WSJ reporters in the spring. He found that “it was clear then that they had little knowledge about how search worked. Even a basic understanding of the difference between organic listings (the free search results) and the paid listings (the ads in the search results) eluded them.”

Worse – and this is important – Schwartz opined that “they seemed to have one goal: to come up with a sensational story about how Google is abusing its power and responsibility for self gain.”

Moz founder and SEO expert Rand Fishkin agreed:

“Sadly, the WSJ reporters tried to shoehorn a narrative onto facts that don’t fit, rather than letting the discoveries themselves guide the piece. There’s a lot of unproven, speculative innuendo about how Google’s blacklists work, about the nefarious motivations behind their decisions, and no statistical or meaningful assessment of whether Google’s decisions are good or bad for businesses or users.”

Which brings us to our final point.

3) This Conspiracy Theory Has Been Peddled Before

Political conservatives have claimed for years that Google – a company known for its liberal tendencies – buries conservative-friendly articles in its results pages, and the Wall Street Journal has been at the forefront of that line of argument.

The problem: It’s a conspiracy theory. The claim is that Google “blacklists” certain sites because of who they are, burying their content in the results pages.

The Achilles’ heel of that idea, and the reason why it is just a conspiracy theory, is that Google – like every other search engine – is in the business of finding relevant and important results for a given search query. That involves identifying reputable sites and promoting them over sites that are not.

For example: Between NBC and Breitbart, which is more reputable? Based on reader metrics and linking patterns, Google and other search engines have determined that it is NBC, and they promote its content above Breitbart’s.
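
Google doesn’t publish its ranking recipe, but its best-known ingredient, PageRank, scores pages by who links to whom. Here is a toy sketch of that idea – a simplified power iteration over an invented four-site graph; the domains and numbers are ours, and Google’s real systems are vastly more complex:

```python
# Toy PageRank over a made-up four-site link graph -- an illustration of the
# general idea, not Google's production ranking system.
links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
    "d.com": ["c.com"],  # d.com links out, but nobody links to it
}

damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # power iteration; ranks settle quickly on a graph this small
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)  # a page splits its rank among its links
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")  # c.com ranks first: three sites link to it
```

A site that many others link to floats to the top on linking patterns alone; no editorial blacklist is needed to produce that outcome.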

Conservatives see this as political bias on Google’s part. Perhaps they should focus instead on the quality of content that Breitbart provides, its status in the journalistic community, and whether it can be counted on to provide – or at least attempt to provide – objectively neutral reporting that does not push a particular narrative.

Revisiting the Concept of Fake News

Long ago, we published an article on fake news and the levels of journalism.

In it, we focused on a few different aspects of journalism and how news is presented. We saw that there were three levels:

  • News – facts gathered through investigative reporting and presented to tell the truth,
  • Spin – interpretations of the news, and
  • Fake news – fictions or groundless interpretations of what happened in order to manipulate an audience.

This Wall Street Journal piece presents itself as an example of something more worrisome. It speaks with the structured objectivity of news, but its stories and content seem to have been assembled with something closer to spin or manipulation in mind.

Lawyers interested in marketing their websites online should disregard it, and perhaps reflect on whether the article should change their reading habits.

It changed ours.