Ranking Reports - a Defence.

You know you are getting old when you have an uncontrollable urge to start a post with words like "in my day...", "I remember when..." or "I've seen this before..."

But it's true - that's the problem. People who don't learn from the past are doomed to repeat it, or even to live through something worse. I was reminded of this while reading a recent forum thread about ranking reports.

Someone started complaining that Google had prevented a certain piece of ranking software from getting information from them. This was almost universally followed not by useful advice, but by a bunch of people saying ranking reports were useless. You see this type of thing with regard to PageRank, as well.

I appreciate that the posters meant well, but I really wish people would think clearly rather than spouting off the latest over-correction crap they've heard. Regardless of the topic, there seems to be a consistent flow to this:

  1. Something is pushed too far. Could be a theory, or political view, or whatever.
  2. Experts, concerned about this, issue reasons why those pushing too far are wrong.
  3. A new generation of people, in a fit of rebellion against the status quo, quickly grab the "new way of thinking" and then, you guessed it, go too far with it.
  4. Eventually, there is a counter-rebellion and the thinking about the topic matures, now that both sides have been expressed fully and challenged.

The examples are legion. Think about any political, civil rights or scientific revolution and you will see this. So it should be no surprise that it affects SEO, as well.

That being the case, let this be the first shot fired in the counter-revolution against Ranking Reports.

See, in the old days, no one did ranking reports. Who cares how you rank if no one ever visits your site? Instead, server reports were used. In this case, it was often "hits". At the time, hits were not a bad measurement. Pages were very light on graphics, and e-commerce wasn't really a workable concept - people were after "branding" and "buzz". In this context, measuring hits was an easy and fairly accurate way to measure the popularity of a site.

Even today, a popular site will tend to have more hits than a similar unpopular site. You can complain about issues with hits all you want, but the broad brush stroke is usually still representative.

Of course, there are many problems with hits - they are a method of measuring server activity, not visitor activity. A page with a single graphic on it counts as 2 hits (one for the HTML page and one for the graphic), whereas a page with 50 images on it (even 1x1 pixel images) counts as 51 hits. In both cases, there is only one visitor. This was exploited by spammers and bad SEOs who, rather than increasing visitors to a site, would sometimes just slowly add more invisible images to the site's pages, thereby increasing the hits without increasing the actual visitors.
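A toy sketch in Python makes that arithmetic concrete. The request log, visitor names and helper functions below are all invented for illustration, not taken from any real server:

```python
# Illustrative only: a toy request log for two visitors. Each tuple is
# (visitor_id, resource); the data is invented for the example.
REQUEST_LOG = [
    ("alice", "/index.html"),
    ("alice", "/logo.gif"),                          # a page + 1 image = 2 hits
    ("bob", "/page.html"),
] + [("bob", "/pixel%d.gif" % i) for i in range(50)]  # a page + 50 images = 51 hits

def hits(log):
    """Every request counts as a hit, even a 1x1 tracking pixel."""
    return len(log)

def unique_visitors(log):
    """Count distinct visitors, no matter how many resources each loaded."""
    return len({visitor for visitor, _ in log})

print(hits(REQUEST_LOG))             # 53 hits...
print(unique_visitors(REQUEST_LOG))  # ...from only 2 visitors
```

Padding pages with invisible images inflates the first number while leaving the second untouched - which is exactly why hits alone were such an easy metric to game.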

Soon, the industry became more sophisticated, the experts began to call for ranking reports instead of "hits" reports, and we began to move from server-based measurement to search engine based measurement. Instead of measuring our performance using internal criteria (server logs), we switched to external criteria (ranking reports).

After all, it can clearly be shown that your rankings directly affect your traffic and sales. I see this in both PPC and SEO every day, across many industries. Additionally, measuring by rankings was more closely tied to the job an SEO does - optimizing for search engines, not for server performance. Life was good. For a while.

Then the problems with ranking reports began to emerge. First, search rankings can fluctuate from day to day and even minute to minute. Second, some clients wanted daily or even twice-daily ranking reports for their top terms. This placed a lot of pressure on both SEOs and search engines, in many cases for no good reason. There is a point at which granularity exceeds the information it yields. You can measure too much, and too deeply, getting lost looking at bark under a microscope when you should be looking at the forest, or at least at groups of trees.

Other issues with ranking reports were just as serious - ranking well for a term that no one searches for is useless, and sending unqualified traffic is just as bad as (or even worse than) sending no traffic at all. Finally, search engines began using robots.txt to shut out automated rank-checking tools. Suddenly, ranking reports didn't look so good anymore. The experts changed course again and began to recommend analytics instead.

Ironically, we've come full circle. We started with server-based measurement (hits and unique visitors), abandoned it in favor of search engine based measurement (ranking reports) and have now come back to server-based measurement - except now we are a lot more sophisticated. We don't want hits anymore; we want unique visitors, time on site, fallout reports, and all sorts of other Key Performance Indicators (KPIs). This is cool. It's scientific, it's visitor-focused, and it's got all sorts of pretty graphs and charts.

Ranking reports are now in disfavor, to the relief of bad SEOs who can now justify poor rankings (and their dislike of running linking campaigns) by talking about how rankings don't matter - it's visitors and sales. How very convenient. It sounds great. Very forward-thinking and modern. Except it's wrong.

Why? Because there is no context. Your visitors and buyers have increased - big deal. Maybe your competition's numbers have also increased, but at ten times your rate. You won't know that, because guess what? You are only looking at your own server for information. If it's not on your server, it doesn't exist. Analytics ignores the outside world and just builds a better navel-gazer.

It's all well and good to make yourself better, to optimize the user experience and to increase conversions, but I have news for you - a qualified visitor does not usually stumble onto your site at random. They arrive from somewhere, and that somewhere is very often a search engine. To say rankings don't matter is to do your clients and yourself a great disservice. The best website in the world is no good if no one can find it.

So what now? Go back to ranking reports? No - their weaknesses are well known and legitimate. The next step is to blend your rankings and other external data with your analytics. A visitor's experience of your website doesn't begin when they land on your page; it starts as soon as they begin looking for what you have to offer. Branding, buzz, social media, PPC and SEO all play a part in this, and so far analytics has not adequately addressed it, particularly in SEO and social media.

Oh, they can say that people coming to the site on keyword "X" convert at such-and-such a rate, but they usually don't suggest new keywords, or tell the analyst where the site ranked for keyword "X" that day, or whether the site was on the front page of Digg that day, or whether there was a news article, blog post or Twitter campaign going on. That's important, but since it's not on their server, it doesn't get reported. There is no context, and therefore the information is suspect.

SEOs who "don't believe in ranking reports" don't get it. Checking rankings is part of their job. If you want to avoid the robots.txt issues, do it manually or use a human-operated script rather than a robot (yes, there is a difference). It will help stop you from overdoing it, anyway. But to say that you optimize sites for search engines, and then say that you don't believe in checking a search engine to see whether your site is optimized for it, is ludicrous.
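A "human-operated" check can be as simple as feeding a hand-gathered, ordered list of result URLs into a small script that finds your position. Everything below - the domain, the URLs, and the `rank_of` helper - is hypothetical, and the list is assumed to have been collected by a person looking at a results page, not by an automated crawler:

```python
from urllib.parse import urlparse

def rank_of(site_domain, result_urls):
    """Return the 1-based position of site_domain in an ordered list of
    result URLs, or None if it doesn't appear at all."""
    for position, url in enumerate(result_urls, start=1):
        if urlparse(url).netloc.endswith(site_domain):
            return position
    return None

# Hypothetical results, gathered by hand for this illustration.
serp = [
    "http://www.competitor-a.example/widgets",
    "http://www.example.com/widgets.html",
    "http://www.competitor-b.example/buy-widgets",
]
print(rank_of("example.com", serp))  # -> 2
```

The point isn't the code - it's that the human does the querying and the script only does the bookkeeping, which keeps you honest about how often you really need to check.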

If you have done your keyword research, a ranking report is a legitimate and extremely important part of the puzzle. It's only a part of the puzzle, not the solution, but it's a very important part of it. Data without context is meaningless, and context is provided from external sources of information, not self-referencing (internal) sources.

I think that, like analytics moving from "hits" to KPIs, search reports need to become more sophisticated as well. A simple ranking isn't enough. You need more information. In many cases this information can come from analytics, but it can also come from linking reports, search engine saturation, social media buzz references, and many other places. I call not for the return of ranking reports 1.0, but for ranking reports 2.0 - newer, improved and more connected to the visitor.
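As a sketch of what such a blend might look like, the snippet below joins a hand-checked ranking report with per-keyword analytics figures and flags terms that convert well but rank poorly - exactly the "context" a server-only report can't provide. All of the data, thresholds and field names are invented for illustration:

```python
# Hypothetical inputs: a hand-checked ranking report and per-keyword
# analytics figures. None of these numbers come from a real site.
rankings = {"blue widgets": 14, "widget repair": 3, "cheap widgets": 41}
analytics = {
    "blue widgets":  {"visits": 120, "conversion_rate": 0.065},
    "widget repair": {"visits": 900, "conversion_rate": 0.012},
    "cheap widgets": {"visits": 15,  "conversion_rate": 0.080},
}

def blended_report(rankings, analytics, good_cr=0.05, poor_rank=10):
    """Flag keywords that convert well but rank poorly: the external data
    (rank) gives the internal data (conversions) its missing context."""
    flagged = []
    for kw, rank in rankings.items():
        stats = analytics.get(kw, {})
        if stats.get("conversion_rate", 0) >= good_cr and rank > poor_rank:
            flagged.append((kw, rank, stats["conversion_rate"]))
    return sorted(flagged, key=lambda row: row[1])

for kw, rank, cr in blended_report(rankings, analytics):
    print(f"{kw}: ranked #{rank}, converts at {cr:.1%} -- worth a link campaign")
```

Neither input alone tells you where to spend effort; the analytics data says "cheap widgets" converts beautifully, but only the ranking data reveals that almost nobody is seeing the listing.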

Doing the job properly requires BOTH server-based and external information sources. I hereby issue a call for all search marketers to move past the current myopic focus on analytics and to look at the whole picture - that's where true understanding lies.



nathan holman said...

Good points and timely as well, Ian. I see the two extremes often - either paralysis by loads of modern analytics information or blindly saying 'we want to be #1 for this term'...I think that finding that blended balance between the two points and then figuring out an actual plan of action to maximize the bottom line for your sites (or your clients') is the path to success.

clickfire said...

Beautifully put, Ian. Sign me up for the counter-revolution.

Stoney deGeyter said...

Excellent post Ian. You went into much more detail than I did in my post the other day which basically also stated the value in ranking reports. Rankings are not the only measure one should ever use, but they certainly shouldn't be left out of the analytics puzzle.

Will Scott said...

This is a great point. Histrionics often lead to over-correction.

The good news is, in my opinion, that if the over correction leads to a (wrong) change in consensus it leaves a hole the size of a locomotive for the rest of us to drive through :)

Mel66 said...

Very good post. Rankings are not the end-all, but are definitely a needed piece of information.