Geolocation - Your tool is probably wrong!

As you can tell from my recent posts on IP addresses and geotargeting AdWords, I've been thinking a lot about geolocation lately. Until recently, I've been suggesting that people use IP2Location for checking geolocation, since there is a free demo online and they have free/cheap API tools.

The problem is that I've recently been testing IP2Location, and its database is not very accurate in my tests. Unfortunately, since it's easy to use and cheap/free, any tools you use to check geolocation are likely using it.

For example, SEOMoz's Geotargetting Detection Tool appears to use IP2Location, and as a result my website (www.mcanerin.com) is apparently located in Glen Ellyn, Illinois, USA, rather than Toronto, ON, Canada.

Quite a difference. I could maybe understand getting the wrong city, but the whole country? It pretty much makes the tool useless.

SEOMoz Geotargetting Tool Screenshot for www.mcanerin.com:

It's nice that they provide the disclaimer at the bottom that the results may not be accurate, but it would be better to improve accuracy than to work on more obvious disclaimers, so I'm going to skip the disclaimer and focus on the accuracy.

The thing is, I doubt the fine folks at SEOMoz have any reason to suspect that these results are wrong, since it's not their database, and they are apparently pulling it from a very well known IP geolocation database (IP2Location):



The problem is that there is no point in providing a tool if it's wrong. Worse, what if you were using this database to deliver ads? To identify where a visitor is coming from? To look for click fraud by cross-referencing IPs? Suddenly it gets much more serious (and expensive) for this information to be wrong.

It's one thing for a free tool to be wrong, but when that same information is feeding your ad delivery software, wise marketers start asking questions and double checking results.

The search majors (Google, Yahoo, MSN, Ask, etc.) use higher-end IP geolocation companies because the results are more accurate. This means you probably should too.

The industry leaders in the commercial space are Digital Envoy (Google used them until DE sued them for more money), Quova, and MaxMind. Let's look at MaxMind's online tool results for "www.mcanerin.com":


Finally! The right answer! Fortunately for me and my site's geolocation, this is also the type of database that the search engines use. The only problem is that this quality comes at a price.

There is a possible free option, however: IPligence also gets the location correct:



I haven't fully checked this one out, but so far the (free) results have been very accurate. Certainly better than IP2Location's.

This post isn't about coming down on anyone; it's a warning about the dangers of assuming that just because something gives you results for your query, those results are accurate or up to date. Which you'd think we search marketers would know by now, myself included.

I apologise for recommending IP2Location up until now, and am now recommending either the MaxMind or IPligence online tools instead for a quick geolocation check.

Ian

IP Addresses, SEO and Your Site

I haven't talked about IP addresses for a while (and never in this blog), but since I've been asked about them several times recently, I think it's a good time.

What is an IP Address?


Computers "think" in numbers, not words and letters. Humans tend to be the opposite. In order to deal with this, programmers and technicians have been creating technologies that translate between the two since shortly after computers were invented. Programming languages are an example of this.

Another example of this is DNS, the Domain Name System. See, your website really can't be found at your domain name - it's found at an IP address. The domain name is just a human-friendly way of finding the IP address. Think of it like a telephone number. It's much easier to remember someone's name than a telephone number, but your telephone doesn't understand names, just phone numbers. So we invented phone books. You look up the name you remember in a phone book, which gives you a phone number that the phone system can understand. DNS is like a phone book for the internet.

If you type in http://www.mcanerin.com/ into a browser, you don't go over to my company site right away. Instead, your browser looks up the name in a DNS server and the DNS server tells it an IP Address (in this case, 64.34.120.55) that matches the domain name. Then the browser can go to the right website.
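You can watch this lookup happen yourself. Here's a minimal sketch using Python's standard socket module (whatever address it prints will reflect wherever the site is hosted when you run it):

```python
import socket

# Ask DNS for the IP address behind a domain name - the same lookup
# your browser performs before it can fetch the page.
print(socket.gethostbyname("www.mcanerin.com"))  # 64.34.120.55 at the time of writing
```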

So the bottom line is that an IP address is your website's real address on the internet. Now, there are only so many IP addresses in the world, so people have figured out ways for more than one website to share an IP address. In this case, after the browser gets the IP address, it goes to the webserver and also gives it the domain name it's looking for. The server then sends the requested site. This is like having one phone number for your home, rather than one phone number for each family member living there. You have to phone the house, then ask for the person you want to talk to.
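That "phone the house, then ask" step is literally just an HTTP header. A quick sketch, using the IP from this post (whether it still answers today is another matter):

```python
import http.client

# Connect to the shared IP directly, then use the Host header to say
# which of the sites living at that address we actually want.
conn = http.client.HTTPConnection("64.34.120.55")
conn.request("GET", "/", headers={"Host": "www.mcanerin.com"})
response = conn.getresponse()
print(response.status, response.reason)
```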

Giving each person their very own phone number is more expensive and often unnecessary, but it has some advantages, like your personal cell phone does. The same applies online.

Dedicated IP Address VS Shared IP Address


A Dedicated IP (sometimes wrongly called a "static" IP by some web hosts) is an IP address that points at only one website. A Shared IP is an IP address that can be shared by more than one website.

Now, search engines don't usually care about IP addresses - they index you based on your domain name. That's why having more than one domain name for a site can confuse them.

Note on Dynamic and Static IP Addresses

When you use your ISP to connect to the internet, you will often get what is called a "dynamic" IP, which basically means that it changes. Most ISPs have a big pool of IP addresses and just hand a random one out to you whenever you connect.

A static IP is simply an IP that you have each time you log in - it doesn't change. You really don't need a static IP for just surfing the net, but if you host your own server at your home or office then it's usually best to get a static IP address.

There are ways to host websites on a dynamic IP address. I hosted mcanerin.com at home on a dynamic IP address for years and ranked very well, thankyouverymuch. I just ran a script that constantly checked my IP, and when it changed, the script would update my DNS server with the new IP and my site was back up and running, often within a minute or so. This is a good example of why IP really doesn't matter as much as some people think it does for search engines.
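My original script is long gone, but the idea is simple enough to sketch. This version assumes a "what is my IP" endpoint (api.ipify.org here) and leaves the actual DNS update as a placeholder, since that call depends entirely on who runs your DNS:

```python
import time
import urllib.request

def public_ip():
    # Any "what is my IP" service will do; this endpoint is an assumption.
    return urllib.request.urlopen("https://api.ipify.org").read().decode()

def update_dns(ip):
    # Placeholder: call your DNS provider's update API here.
    print(f"Pointing the A record at {ip}")

last_ip = None
while True:
    current = public_ip()
    if current != last_ip:   # IP changed (or first run)
        update_dns(current)
        last_ip = current
    time.sleep(60)           # re-check every minute, as described above
```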

Keep this in mind when some tech tries to tell you that your website is doomed if you switch servers or IPs. Nonsense. I used to switch IPs as often as several times a day without any problems :)

Geolocation by IP


There are, however, two ways a search engine will use your IP address. Neither is directly for ranking purposes; they are additional processes that Google applies to sites during the ranking process. The first is geolocation.

Since IP addresses are assigned to webhosts, and then given to those webhosts' clients, a search engine can look up where that webhost is, and therefore know approximately where your website is hosted. This is one way that Google knows your site is from the US, Canada or China. Google will give websites that it knows are from the UK a boost in results shown to searchers from the UK, on the assumption that they would probably consider UK sites more relevant to them. This is almost always a good assumption.

The problem is that if you are a UK company but host your .com in the USA for some reason, you will be considered a US site, not a UK one. Dealing with issues like this is actually my specialty, and I assure you there can be some tricky aspects to it, especially if you have sites for different areas of the world but one CMS that controls them all. So it's important to know where your IP geolocates to.
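If you'd rather check programmatically than use an online tool, here's a minimal sketch. It assumes you've downloaded MaxMind's free GeoLite2-Country database and installed their geoip2 Python package (the filename below is just my assumption):

```python
import geoip2.database

# Assumes the free GeoLite2-Country database file has been downloaded
# from MaxMind and saved next to this script.
with geoip2.database.Reader("GeoLite2-Country.mmdb") as reader:
    response = reader.country("64.34.120.55")
    print(response.country.iso_code)  # the country this IP geolocates to
```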

IP Addresses and Spam


The second problem is a little more insidious and difficult to pin down. IP Addresses, being known physical locations, are a better method of detecting search engine spam than domain names, which can be moved around very quickly and are cheaper than hosting. So search engines look at IP Addresses (among many other things) during spam checks. This can cause problems for some sites.

The thing is, if you have a hosting account and decide to use it to create a link network of thousands of sites, then all of those sites will either have the same IP address or will be within the block of IP addresses that your website host has available. Website hosts are typically assigned a Class "C" or part of a Class "C" to use. I'll explain what this is in a moment.

What Google knows at this point, however, is that chances are a whole bunch of sites on the same IP or within the same Class "C" IP address space have something in common: at the very least, they are hosted by the same webhost in the same location. This may mean nothing, or it may mean that they are all owned or controlled by the same person or persons. In short, if they start linking to each other, the links may not be independent.

There is no guarantee of this, of course. Some towns (and small countries) only have one host, and therefore they all share a Class "C".

What is a Class "C" IP Address Range?


What's a Class "C", you are still asking me? Ok, it's actually really easy.

A typical IP address has 4 sets of up to 3 numbers. In cases where some of the leading digits are 0, you can leave them out. So the IP address for mcanerin.com is 064.034.120.055, or in its shorter form, 64.34.120.55. Since firewalls usually block the long form in favor of the short form, it's usually a waste of time to try to use the long form.

Let's look at this IP address. If we replace the numbers with X's to symbolize a generic IP address, it looks like this: XXX.XXX.XXX.XXX. Now, each of these sets of three X's is a class. If I change the X's to their class letters, it looks like this: AAA.BBB.CCC.XXX.

The X's at the end are correct - they are not a "D" class or anything like that. Now, do you see the "CCC" section? That's the Class "C" number. In my case (064.034.120.055) the Class "C" address is 120. This means that other sites that have a 120 right there (XXX.XXX.120.XXX) would be considered related in some way.
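In code terms, the Class "C" is just the first three sets of the address. A trivial sketch:

```python
def class_c(ip):
    """Return the Class "C" network (the first three number groups)."""
    return ".".join(ip.split(".")[:3])

print(class_c("64.34.120.55"))   # 64.34.120
print(class_c("64.34.120.200"))  # 64.34.120 - same Class "C", so these
                                 # two sites look "related"
```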

Relationships Can be Good (or Very Bad)


By itself, a relationship means little. Linking or being linked to is a relationship, as well. So is being in the same country, having similar WHOIS contact information for your domains, and all sorts of other things.

The problem is that, in general, the more relationships are involved, the more likely a site will be considered directly related. Google really doesn't want to show more than one directly related site in any particular SERP - it's not really fair to have the same owner hold 5 of the top 10 slots, for example. Additionally, links from sites that are related to you should (and usually do) count for less than links from total strangers.

This means that you should be aware of your Class "C", and of the Class "C"'s of other sites you may own. This sounds like I'm telling you how to spam better, but here is why I recommend this even for the most milky white of white hat sites.

Real Life Example of Why You should Care About Your Class "C"


I had a client who had 2 sites. They were both in the same general topic range, and hosted on the same Class "C". One site was on the topic (for example) of "New York Lawyer", and the other was on the topic of (for example) "Lawyer Resources". There was no overlap in content, and the target audiences were completely different. Additionally, the keywords for each site were also different. Or so we thought.

It turns out that "New York Lawyer" and "Lawyer Resources" have something in common: the keyword "Lawyer", even though neither site was actually pursuing that term! Now what we had was 2 sites that were related (on the same Class "C") and, in Google's mind, relevant for the same keyword ("Lawyer"). One site dropped off the SERPs for almost everything except a few long tail terms, and the other lost rankings as well. Why? Because multiple related sites on the same topic hit a spam filter. That's not a problem in theory, except Google assumed the issue was a keyword neither site was pursuing.

The fix was to totally separate the sites. We moved them to 2 different Class "C" addresses and, just to be sure, changed the WHOIS data to the second owner (NEVER fake WHOIS data - it's against the law). This fixed the problem, and both sites now rank well for their respective keywords. If you check the shared keyword, Google's duplication filter kicks in and only the site with the higher link pop shows up - which is exactly the type of behaviour I would expect, and have no problem with.

So keeping your sites on different Class "C"s is a good idea, even if you are not inclined to spam at all.

Why You Should Care About Other People's Class "C"'s


Here is another issue: what happens if you are on a shared IP with other sites that are going after your keyword? You now have 2 relationships with them, whether you know it or not. This can cause problems even if you are totally innocent. It's a good idea to check the other sites on shared IP addresses for this reason, and make sure none of them are competitors or otherwise related to your keywords. Same with Class "C"s, to a lesser degree.

Finally, what if you have sites on 2 different Class "C"s, but the same people link to both of your sites? Does that show a relationship? Of course! But here is another scenario you may not have thought about. What if the people pointing at your site have relationships with each other?

What if you have 10 websites pointing to you, but they are all from the same IP? Or the same Class "C"? Of course links from related sites won't pass on as much PR as unrelated ones. Geez, things just keep getting more complicated, don't they?
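You can at least spot the obvious cases by resolving the sites that link to you and grouping them by Class "C". A rough sketch - the domain names here are hypothetical placeholders, not real linkers:

```python
import socket
from collections import defaultdict

def class_c(ip):
    return ".".join(ip.split(".")[:3])

# Hypothetical list of domains that link to your site.
linking_domains = ["example-a.com", "example-b.com", "example-c.com"]

neighborhoods = defaultdict(list)
for domain in linking_domains:
    try:
        neighborhoods[class_c(socket.gethostbyname(domain))].append(domain)
    except socket.gaierror:
        pass  # skip domains that don't resolve

for network, domains in neighborhoods.items():
    if len(domains) > 1:
        print(f"{network}.x hosts several of your linkers: {domains}")
```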

So What Do I Do?


Well, you have to host *somewhere*. And you can't really control the IP's of everyone who links to you. Even a link that passes on less than the full PR is still passing on PR. At some point you just have to decide to stop worrying and get on with marketing your website. Having the "perfect" IP address won't rank you for anything. It's just a technical detail that can bite you on occasion.

In general, people still rank well for all sorts of things, even if they don't even know what their IP address is, so this isn't the end of the world. I would personally only worry about it if:

  1. You have more than one site on a topic that could even be slightly related (if so, consider merging them)
  2. You are considering begging/trading/buying links from groups of sites
  3. You have an inexpensive, popular host (spammers like cheap hosting)

If one or more of the above are true, then you should start paying attention to relationships, including links and IP addresses. Relationships are the currency of the internet, and it's much better to have good relationships with others than bad or questionable ones.

Some tools to help you:

Enjoy,

Ian

AdWords PPC Geotargeting / Language Setup

"When geo-targeting ads by a client's country, what is the best practice for language targeting?"

As a general rule of thumb, your ad should be in the same language as the SERP the searcher is looking at.

Let's say you target Korea, to use an example. If you are just beginning, then it would be best to geotarget Korea and also target the Korean language. Although many Koreans read/speak English, it's jarring to see an English ad when the rest of the SERP is in Korean. It makes it stand out, but in the wrong way. Usually they decide that the company is clueless and "doesn't understand Koreans". I've had many discussions with Koreans on this very topic. The same also applies to Chinese, and especially to Japanese.

If you wanted to be more accurate and do a really thorough job in the market, you could do the following (though it's more work and for some markets isn't worth it):

  1. Geotarget Korea (or whatever country you are looking at)
  2. Create a keyword list. Separate out the keywords that are the same in English and Korean (i.e. "Samsung") from the pure Korean words (see the sketch after this list).
  3. Anything that is pure Korean, target Korean language only.
  4. Anything that could be both (and would result in a SERP with both English and Korean in it) you would use as two different groups – one targeting English with English ads, and one targeting Korean with Korean ads.
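Step 2 is easy to automate for Korean, since Hangul occupies its own Unicode ranges. A rough sketch (the keyword list is purely illustrative):

```python
import re

# Hangul syllables and jamo; anything containing Latin letters is
# treated as English or dual-language.
HANGUL = re.compile(r"[\uAC00-\uD7A3\u1100-\u11FF\u3130-\u318F]")
LATIN = re.compile(r"[A-Za-z]")

def split_keywords(keywords):
    korean_only, dual_or_english = [], []
    for kw in keywords:
        if HANGUL.search(kw) and not LATIN.search(kw):
            korean_only.append(kw)
        else:
            dual_or_english.append(kw)
    return korean_only, dual_or_english

korean, dual = split_keywords(["삼성", "Samsung", "Samsung office in Seoul"])
print(korean)  # ['삼성'] - target Korean language only
print(dual)    # the rest goes into the dual-language groups
```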

Some other observations:

  1. A single English word ("Samsung") could be equally in either language, but multiple English words ("Samsung office in Seoul") is usually (though not always) an indication that the target language should be English. This also works the other way - one English word in a Korean phrase is probably Korean.
  2. If in doubt, use the official national language of whatever country you are in, or the most common language of the region if there is more than one. For example, in Canada, you would default to English for western Canada and French for Quebec, unless someone indicates that they are looking for a language specific Keyword.
  3. Due to the different character sets between Asian languages and English, this might seem more complicated than it needs to be (you are normally safe in assuming any keyword written in Chinese characters has a preference for Chinese ads, for example), but as a best practice it's a good idea to language target as well as geotarget, especially when you begin to work with multiple languages that share characters (English/Spanish/French or Chinese/Japanese/Korean).

Practical Final Answer: Start off with Korean language ads geotargeted to Korea and the Korean language keywords, including the dual-language Korean list. See how that goes. If it goes badly, you are unlikely to fix it by adding English to the mix, and it will just complicate things.

If it does well, then add the dual-language English list to the mix. In this case, you would just create a second campaign, but this one geotargeting Korea with only the English language, then use the dual-language English words. In this case, Google (for example) would not treat that as a duplicate, but would trigger the English ads for searchers who had indicated a preference for English, and Korean for those who indicated a preference for Korean.

This type of system is especially useful when you have products and numbers involved - for example, the "SGH-L760" from Samsung is the same search term in any language - Korean, English, Chinese, Japanese, etc. You simply can't just geotarget it - you have to also target the language in order to trigger the correct ad.

I hope that helps,

Ian

Microsoft FrontPage SEO? Say it isn't so!

I admit it. I've been using MS FrontPage since before it could even be called a proper website editor - a copy came free with Windows NT 4. Yes, it was crap. But up until then I'd been using a text editor (vi) and actually having a semi-WYSIWYG was a big improvement at the time.

Since then, I've used Dreamweaver, CoffeeCup, HTML Kit, Amaya, Homesite, and even Notepad. Of them I was most drawn to Homesite, but I can deal with almost any interface (eventually). But time and time again I ended up back with FrontPage - v2, v3, 98, 2000, and finally 2003.

I complained about its proprietary features, code bloat, etc. But the fact that my clients are almost always from Windows shops and tend to send me things in MS Office format simply made it easier to use FP. Whatever its drawbacks, FrontPage had two things going for it - I was very familiar with the interface, and it handled MS Office files better than anything else (no surprise, of course). Then one day Microsoft discontinued FrontPage.

That was an interesting day. At first, I was kind of shocked. After all, FP had been a comfortable if occasionally annoying tool for me for more than 12 years, and I had finally figured out how to generate clean, compliant code with it quickly and easily. Yup, you could do it. It just wasn't in its out-of-the-box settings.

But then MS announced that it was replacing FP with Expression Web. Naturally, I figured this was just another name change. But EW really is different. It is actually designed for web professionals, instead of Office users who need to make a web page.

First, it starts with standards, then adds .NET functionality, rather than the other way around. It designs using XML and CSS rather than FP templates. CSS support is really good, rather than being a clumsy add-on like in FP2003 or non-existent like in previous versions. It has built-in checkers for W3C standards and usability. If you declare a doctype, it warns you when you use coding that strays from the doctype, even if you are hand coding. Speaking of which, the hand coding editor is really good. The list goes on, but it's a good list.

Anyway, once I got used to the new interface, I was a very happy camper. If you use FP, dump it and get EW. Don't even wait. If you use DW, it's a tougher call. If you want to support standards, use EW. Yes, you heard me right. The MS product is better at supporting standards than DW! It's better at detecting issues, better at dealing with them (if you open old pages, for example) and better at creating compliant code.

On the other hand DW has tons more widgets, plugins and so forth. So if you are new to website design and need your hand held more, DW is the tool of choice.

Funny, not so long ago FP was for the newbs and DW was the pro choice, but now EW is the standards gorilla and I'm finding that newbs have embraced DW to the point that being a DW user doesn't say anything about your skill set. You could be a web god, or a total drag and drop drone. Weird. The world always changes and now the perceived roles have reversed.

I personally know a lot of so-called "professional" web developers using DW that can't understand raw HTML code for the life of them. Maybe I'm getting old, but I still think that a website designer should be able to read, understand and hand-edit HTML, regardless of the other tools they use. This applies to FP drones, as well, of course. It's just that the DW users are more likely to claim "pro" status, and I hold anyone claiming that status to a higher standard. Like knowing HTML. Anyway, I digress.

Even Adobe (DW's owner) has posted an article about how good EW is, and they were only looking at the beta at the time. I've no doubt they are planning upgrades to address EW. I understand that CS3 is pretty good.

No web tool is perfect, and EW is missing some things (no Mac support, etc.). One thing it has been missing is SEO tools. Oh, the usability features like automatically bringing up the ALT attribute editor when you drop in an image certainly help with SEO, since SEO is 80% about usability when you get down to it. But some SEO-friendly functionality was missing.

Today, I found a nifty site called Expression Extras and they have a plugin for EW that lets you:


  • Create an XML sitemap (for Google, Yahoo, Ask, etc.) at the push of a button, supporting EW's "Do Not Publish" tag for internal docs.
  • Create a Google Webmaster Tools compliant robots.txt that automatically includes the XML sitemap autodiscovery directive (example below).
  • Track and edit the Title, Keywords and Description tags easily (EW lets you do this, of course, but the tool makes it way easier).
  • Check and update all image ALT attributes on a page with an ALT attribute checker.
  • Track time, for people who design and bill by the minute or hour.
  • ...plus more.
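For reference, the sitemap autodiscovery directive the plugin adds is just one line in robots.txt. A minimal example (the URL is a placeholder, not the plugin's actual output):

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```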

I don't know this guy, don't have an affiliate link, or anything like that. I'm just sharing a cool tool. It only costs $16. Oh, and there are some FREE downloadable Web 2.0 style "glass" buttons and medals on the site, as well.

If you use FP, or are looking for a standards compliant but easy to use WYSIWYG website editor, get EW. If you have EW, I recommend you check out the Expression Extras Site.

There are some other EW plugins available from other sources, as well. I'm having fun.

Ian

SMA-NA Dissolved

It's with a very heavy heart that I was forced to dissolve the Search Marketing Association of North America (SMA-NA) today.

I'm the last man standing from the Board, and the only communications I've received from members recently have been requests to cancel their subscriptions. So that's it. I really wish it were different.

The History

The SMA-UK was started back in late 2004 as a response to several issues, notably concerns over SEMPO's teething problems. I had been contributing to SEMPO up to that point but had grown disenchanted with some aspects of it as well, so I flew to Stansted, England around Christmas 2004 to meet with them with the express idea of founding a North American chapter - the SMA-NA. Mike Grehan was instrumental in providing help and encouragement during this stage.

At first, there was a lot of interest and excitement about a new search marketing organization, and I remember the early meetings being full of lively conversation and lots of ideas. At this point a veritable "Who's Who" in SEO stood up to help out: Christine Churchill, Debra Mastaler, Andrew Goodman, Ben Pfeiffer, Fionn Downhill, Bill Slawski, Beth Abernathy, Karl Ribas, Ignacio (Nacho) Hernandez, Kim Krause Berg, Jeff Nelson, Barry Welford, Rand Fishkin, Eric Martin, Matt Service and too many others to list, but all greatly appreciated! (Yes, I know that there are a couple of errors in the membership list - I'll fix them as soon as I can).

The Current Situation

The problem, I think, is that we started off with the idea that a search marketing association should be readily available to everyone who wanted to join. This means low membership fees. The problem is, that means restricted access to resources due to a lack of money, and a subsequently higher reliance on volunteers from an extremely busy industry. Worse, we didn't want to appear beholden to large, money-rich sponsors (such as search engines, etc.) that might attempt to control or direct the organization, so we were not very aggressive in looking for sponsors. This all combined into a significant cash crunch, even though we used as much volunteer time as possible, traded services (i.e. hosting) for memberships, and hired a part-time employee to do updates rather than a full-time management staff.

The other members of the board held on for as long as they could, but there was only so long they could try to run their own businesses as well as the SMA-NA. Myself, I've been getting very busy as well, and combining this with my recent health problems, it simply is not reasonable to continue attempting to run the organization. It's also unreasonable to expect members to keep paying fees in return for few benefits outside of a nice link, some discounts, and very little communication.

In the meantime, SEMPO has hired professional managers and the initial concerns I had 2 years ago are no longer as valid as they once were. The remaining concerns can probably be addressed internally. In short, I'm fighting a problem that really doesn't exist with an organization that in most practical aspects also doesn't exist. No matter how strongly I feel about the goals of the SMA, it's come to the point where I feel I can make more of a contribution to the industry using other methods and processes, and this is where I should focus my efforts to promote and engage the industry I love and am so deeply a part of.

What's Next

From an administrative standpoint, I will maintain my position as President of the SMA-NA long enough to properly wind it down, pay its remaining bills and so on, but I will not be actively engaged in anything else SMA-NA related.

On one hand, this is a very sad day for me, but on the other hand, I believe that the fact that we are at this stage means that this industry is growing and maturing, and that the issues of the past are no longer holding us back from dealing with the problems and challenges of the future.

Ian McAnerin

Google Proxy Hack - Part 3

Well, my client is now ranking again for negotiation training and the proxy has disappeared. The funny thing is that it was fixed before we were even able to start blocking IPs.

If you are from Google and brought this to the attention of those who fixed it - thanks :) Hopefully this is the start of this entire issue being fixed for everyone.

Ian

Google Proxy Hack - Part 2

As I mentioned in a previous post, Google is susceptible to a proxy hack that can wipe sites off the SERPs, and do all sorts of other nasty things, as well.

Naturally, my client is not the only one with this problem - it's a well-known issue with Google and has been for almost a year. Based on how long it took them to address the 302 hijack, I'm not holding my breath for a quick fix on proxy hijacking, either.

Dan Thies posted a great article on this issue, so I'll link to it rather than repeat the information in it. The takeaway is that it's a real issue, and it's not that easy to solve from a victim's standpoint.

The easy fix is to block the IP of the proxy. This only works if the issue was accidental and involved only one proxy.

If it's a case of someone deliberately gunning for you, then don't expect any help from Google. In this case, you need to get a bit fancier. There are several different methods, all of which could be countered, but Jaimie Sirovich wrote a nice script to do what is basically reverse cloaking: feeding the normal pages to search engines and putting noindex,nofollow on pages sent to everyone else (including, hopefully, proxies).
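I won't reproduce Jaimie's script here, but the core idea is easy to sketch. This is my own rough take, not his code: trust a "Googlebot" user-agent only after verifying the requesting IP with a full-circle DNS check, and serve a noindex meta tag to everyone else (you would extend the check to the other engines' spiders the same way):

```python
import socket

def is_verified_googlebot(ip, user_agent):
    """Trust the Googlebot UA only if the IP passes a full-circle DNS
    check: IP -> hostname -> IP must round-trip, and the hostname must
    belong to Google."""
    if "googlebot" not in user_agent.lower():
        return False
    try:
        host = socket.gethostbyaddr(ip)[0]
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.error:
        return False

def robots_meta(ip, user_agent):
    """Verified spiders get the normal page; browsers and proxies get
    noindex - so a proxy re-serving your page hands Google a noindex
    copy of itself."""
    if is_verified_googlebot(ip, user_agent):
        return ""
    return '<meta name="robots" content="noindex,nofollow">'
```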

I hate proxy hackers. Proxies are a useful part of the internet, and people who abuse them in this way are NOT "pointing out flaws in Google to make them better", they are simply people who don't understand that pissing in your own well water is stupid, and they deserve nothing but contempt.

Ian

Exploiting Google's Proxy Weakness

I have a client. He does negotiation training and has clients like IBM. His website is http://www.negotiationdynamics.com/. He does the actual SEO on his site and phones me for help when needed, which is a business model I rather like.

Today, I got a phone call from him, and he told me he'd disappeared off of Google. Worse, his site seemed to have been hijacked. When you type in the search negotiation training, the result that used to be his now looks like this:

You will see that instead of the URL being negotiationdynamics.com, it's now a search result from a known open proxy called firewalldown.net. Naturally, efforts to contact these guys and tell them to put in a damn robots.txt file that would exclude the spidering of results have not been successful.

But wait! There's more! If you sign up now for the Google Proxy Hack we'll throw in the following for FREE:

First, Google will cache (and rank) your website (or that of a competitor you don't like):


BUT, when hapless searchers attempt to connect to the site, they are redirected to a domain reseller! Cool! Google R0X! (rolls eyes):



It's not that I blame Google for being gamed - that can happen to any company approaching the size and influence of a public resource. But I *do* blame Google 100% for indexing obvious and PUBLIC proxy results. It's sloppy programming and poor usability.

Proxies are all over the internet. Many are used for perfectly benign purposes. It's simple to identify them. The answer is not to pretend they don't exist, or to attempt to ban all of them. It's certainly not to try to find and kill proxy results one at a time by hand as they are reported, which apparently is Google's current method. The answer is much simpler.

Here is a thought: don't index URLs that have other URLs in them as a variable. Or is that too complicated? This could be done in about 20 minutes. But it hasn't been. For shame.
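To illustrate how simple the check is, here's a rough sketch of the filter I have in mind (the proxy URL format below is hypothetical):

```python
from urllib.parse import urlparse, parse_qs

def carries_url_as_variable(url):
    """True if any query-string value looks like another URL - the
    telltale shape of an open-proxy result page."""
    params = parse_qs(urlparse(url).query)
    return any(value.startswith(("http://", "https://", "www."))
               for values in params.values()
               for value in values)

print(carries_url_as_variable(
    "http://firewalldown.net/browse.php?u=http://www.negotiationdynamics.com/"))
# True - a crawler could simply refuse to index this
```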

Ian

The Ultimate SEO Effectiveness Formula

I was thinking today about what makes a specific SEO tactic effective or not, and had an epiphany. I've discussed my opinions on marketing VS content before, but this time I came up with a very different angle.

You see, what really drives traffic to sites is not any one tactic, but rather a combination of factors that work together as an attractant. What is that combination of factors? Well, if you boil them down to their essence, all the details disappear and you are left with one basic universal concept:

The Effectiveness of an SEO tactic is based on a combination of Marketing and Content.

You need to have content people want, and those people must know about it and/or be able to find it easily.

Marketing gets the word out. It doesn't matter how great your content is if no one knows it exists. Exactly how you market can vary (and must vary based on your audience and the medium you are using), but you must market to be successful. Linking campaigns, social media promotion, paid advertising and linkbaiting tactics are all marketing. Even sending out print flyers can be an effective SEO tactic for local companies. The point is that you need marketing or your site (and business) will be at a standstill.

Content is also critical. Without a reason to visit (and return, and link), marketing will only get you so far. You can become the internet equivalent of a one-hit wonder with marketing alone, but in order to be truly successful, your content must pick up and deliver on the implied promises your marketing makes. This makes content more important than marketing, but also reliant on it.

Additionally, your content helps your marketing, once a certain level of awareness is achieved. At a certain point, your direct marketing efforts stop driving the majority of traffic, and indirect and residual marketing - word of mouth, blogging, reviews, etc. - take over and propel your site to heights that are unattainable by your own efforts alone. Content is what creates this virtual (unpaid) army of marketers working for you. Content markets.

Therefore, content is extremely important, because it not only affects traffic by virtue of its own merits, it also multiplies the efforts of your marketing.

Further, the more content you have, and the better it is, the more pronounced the effect will be. Rating content on a scale of 1 to 10, content rated a 1 would be far less effective than content rated a 5, for example. The content rated a 5 would be far more than 5 times as effective - it would be orders of magnitude more effective, since content tends to support other content as far as both real and perceived value is concerned.

A single paragraph is less impressive and useful than several paragraphs of content, assuming the quality is the same. The knowledge and background of the first paragraph influences and enhances the value of the following paragraphs, and the following paragraphs support the earlier ones.

Content quality is not just a linear progression - it's exponential.

For the purposes of this article, I'll square it. The actual exponential value might be more or less, but I think squaring is pretty close based on my own experiences.

So, where are we now?

Well, Marketing increases the Effectiveness of SEO, but only up to a point. The more Content you have, and the better it is, the more it affects your Effectiveness - exponentially.

Further, Marketing creates an initial impression of the quality of the Content - a link with anchor text implies that the content linked to is relevant to that anchor text, and that the person who linked to it is recommending it. In the case of 2 equal pieces of content (such as 2 identical articles hosted on 2 different sites), the one that is better marketed will generally get the most traffic.

In this case, the Marketing affects the perception of the Content. Given a choice between reading the same article in the Wall Street Journal or on a blog you've never heard of, you'll probably read the one printed by the WSJ. This is an example of Marketing affecting the perception of the quality of the Content.

However, that is just the beginning. A bad article in the WSJ is still a bad article. Content trumps. One of the reasons the WSJ is so well regarded is that it tends to have lots of great Content. This is Content affecting Marketing, since being regarded as having great Content is why people trust the WSJ more than some blog they've never heard of. Its name and reputation for quality Content are part of its Marketing.

So Marketing affects Content, and Content affects Marketing, but of the two, Content has an exponentially stronger influence on the total Effectiveness of the relationship.

Therefore, in order to determine the Effectiveness of a particular SEO tactic or campaign, you can turn this all into a simple, easy to understand formula:

Effectiveness equals Marketing times the Content squared.

-or-

E=MC²
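To make the squaring concrete, here's a toy calculation using scores from the 1-to-10 rating scale above:

```python
def effectiveness(marketing, content):
    """E = M * C**2 - Content quality counts exponentially."""
    return marketing * content ** 2

print(effectiveness(5, 3))  # 45: strong Marketing, mediocre Content
print(effectiveness(3, 5))  # 75: modest Marketing, strong Content wins
```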

Remember this simple formula, and it will help guide you through your next SEO campaign. Simply judge the effectiveness of what you are doing by how you are marketing it, and what content you are using, keeping in mind the relationship between the two.

Ian