
Current Search Share 2010/2011




This is the current organic search share for the end of 2010. You can click on the image for a larger version.

Organic Search Engine Market Share - Global 2010/2011
(Rounded data in brackets)
  1. Google/Ask 86.21% (87%)
  2. Bing/Yahoo 9.06% (9%)
  3. Baidu 2.97% (3%)
  4. Others 0.68% (1%)
2010 was a busy year for search: Google pulled out of mainland China, Ask dropped its own search technology, and Yahoo and Bing completed their search alliance. In order for the above chart to be useful, I've combined the Bing/Yahoo and Google/Ask data, even though for much of the year these were separate search engines.

Yes, you can use this in your presentations, as long as credit is given.

Raw sources: Hitslink, Comscore, E-Consultancy, eMarketer, Hitwise, Nielsen Media Research, and McAnerin International Inc. internal analytics data.

SEO-Browser Updated!

We've updated the SEO-Browser!

You can now use it to visit sites from the US or Canada, which is very helpful for multi-country sites. More countries will be added later, as the demand grows.

Canada: ca.seo-browser.com
USA: us.seo-browser.com

In order to add this functionality, we had to put in a free, one-time signup. Don't worry, we don't share emails with anyone.

You can still use it in simple mode without a signup, but the advanced mode is starting to get too difficult to program without assigning users and sessions, hence the login.

Eventually, the advanced mode will add things like ranking reports, etc - all of which require user/password combinations for privacy and data storage purposes, so this is preparation for that.

Open Office VS Bing

It seems that MS Bing is either really bad at search, or is allowing its results to be unduly influenced by either money or corporate policy. I'm leaning towards the censorship version of the story, myself.

Do a search for open office in Google. Now do the same one in Bing. You'll notice the complete absence of openoffice.org in Bing. They send you almost everywhere but to the right site.

Think that's odd? Or maybe just a coincidence? Try searching for "openoffice.org". Nuff said.

Here is the real kicker - Bing's automated recommendations and suggestions all clearly show people are searching for openoffice.org, but I guess that's just too bad.

For shame, Microsoft.

UPDATE:

Matt and Vanessa did some checking, and found it was a technical glitch, and not an evil conspiracy. Although I'm happy the net result is positive for searchers, I'm annoyed at myself for not checking deeper into other possible explanations.

Redirect Issue with IIS

I'm not sure if this is just an IIS 6.0 issue, but that's where I just ran into it.

A user of the SEO Browser contacted us today because the browser couldn't load his site. That alone wouldn't have been so bad, except that Google apparently couldn't load his site either (it was unindexed). Browsers, however, loaded it just fine.

It took some investigating, but I finally found the issue. The site was set to redirect visitors to the domain root to an internal page. No problem. The redirect was the default 302 that IIS uses. Ok, a bit of a problem, but not enough to stop Google or the SEO Browser from loading it.

But buried in the HTTP headers (I used SEO Consultants' "Check Server Headers Tool") I found this directive:

FollowRedirects=False; Server requested redirection, but sent no new location.

Oops. If you set a redirect (301 or 302) but then say that you should not follow any redirects, strange things happen to your site as the two incompatible directives play out.

In this case, browsers (Firefox and IE) followed the redirect and ignored the server directive saying they should not.

Spiders (Googlebot and the SEO Browser) acknowledged that a redirect existed, but also acknowledged that the server was telling them not to follow it. Therefore they did what they were told and stopped.

Naturally, the fix is a very simple checkbox in IIS that changes FollowRedirects=False to FollowRedirects=True.

I've never encountered this issue before, but I thought I'd share, since it might help someone out there in the future. I'll see if I can have this issue detected by the SEO Browser, as well.
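
If you want to check for this sort of thing yourself, here's a minimal sketch (Python, using the requests library) that fetches a URL without following redirects and prints the status and headers, so a redirect status with a missing Location header or an odd directive stands out. The URL is just a placeholder.

    import requests

    def check_redirect(url):
        # Fetch without following redirects so we can see the raw server response
        resp = requests.get(url, allow_redirects=False)
        print("Status:", resp.status_code)
        location = resp.headers.get("Location")
        if resp.status_code in (301, 302, 303, 307, 308):
            if location:
                print("Redirects to:", location)
            else:
                print("Redirect status returned, but no Location header was sent")
        # Print all headers - any odd directives (like the one above) show up here
        for name, value in resp.headers.items():
            print(name + ":", value)

    check_redirect("http://www.example.com/")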

Ian

PS I need to blog more often. I feel a New Years Resolution coming on...

Cannot start Microsoft Outlook. Cannot open the Outlook window

M$ (automatically) killed my Outlook 2007 last night on my Windows Vista 64 system. It was one of those "Windows Update is finished so we restarted your computer for you even though you didn't say we could" issues, followed by the usual Microsoft screw-up of hosing the system they were supposed to be fixing and protecting.

I'd be a little more patient with them if this wasn't caused directly by following their RECOMMENDED procedures. Yeah, I could disable updates, etc, but that rather defeats the purpose of updates in the first place. No, this is totally on M$. Their recommendations, their update, their software, their restart, their screwup.

They don't get off the hook because I didn't disable updates, any more than rapists get off the hook because their victims don't wear chastity belts. (Well, that might be an extreme analogy, but you get the drift. The bad guy is at fault, period.)

But, of course, it's still MY problem.*

The error is simple. Office works fine, but any attempt to launch Outlook results in this error: "Cannot start Microsoft Outlook. Cannot open the Outlook window".

If you ask Microsoft, their answer is to use the system restore. Don't get me started on their answer being for me to undo their screw-ups. One guy was told to wipe his system!

Anyway, a much better answer by a user called Dayneb in the new (to me) Microsoft Answers forum was something far simpler:

Start -> Run, then type: Outlook.exe /resetnavpane

In some cases, you have to make sure you are in the Office Directory on your hard drive for this to work, but work it does. Instantly and easily. Thanks Dayneb!

I'd thank MS for the Answers Forum, but that just feels wrong, since I should not have needed the answer in the first place...

Ian

----
*Serves me right for not switching to Gmail. I'll do that as soon as I can be anywhere in the world, including airplanes and ships, and still be connected. Hopefully that will come sooner rather than later. I travel too much right now to rely on Gmail for anything other than a backup system and spam catcher.

What You (Probably) Don't Know About Redirects.

Most SEOs are taught the simple mantra that a 302 (temporary redirect) is bad, and a 301 (permanent redirect) is good.

This is wrong. Or at least, part of it is.

An SEO quoting the above to someone who is really knowledgeable about web servers will have just shot themselves in the virtual foot, and probably made their own job harder, since the server expert is now more likely to dismiss other things they say as oversimplified and misleading, too.

The Facts

The good news is that a 301 is usually what you want as an SEO, and you want to avoid 302's, so even though you may have been wrong in your ideas as to how things work, the net effect was probably correct, or at least, good enough.

As with many things that are "good enough, it works", most people never bother to look further. The rest of this article is for those who actually prefer to understand things, rather than those who just follow checklists blindly. The rest of you can stop here and be happy that 301's normally do the job you think they are doing.

A 302 actually isn't a Temporary Redirect

A 307 is. In reality, a 302 just means "Object Moved", or "Found", which, if accompanied by a target URL in the Location header, browsers and servers interpret as a redirect. But with no target URL, they will happily stay put, and it's not an error.

In reality, a 303 is what most SEOs think a 302 is. A 303 means "See Other". A 307 is the actual Temporary Redirect. It really means temporary, as in the very next request should also be made to the old URL, and the new one should not even be cached. This is usually only used for emergency redirects (like when a primary server is down) and the like.

A 302 doesn't dictate a redirect; it just says that what you were looking for has moved and has been found there. You are usually redirected only as a courtesy and for usability purposes. Technically, you should use a 303, which really does the job properly.

The Problem With 301's

A pure 301 actually isn't always the best choice for a redirect, either. The problem is that a 301 is cacheable, so if you ever change that 301 to point to a second place, it may take quite a bit of time for the search engine to update its files, which is why there is often a delay in seeing results after changing 301s.

For example, let's say you 301 me.com to you.com. Then later you decide you want to change the me.com redirect to us.com. There can be a significant delay in this working, because the search engines will cache the original redirect for quite some time.

Want to fix this?

You can put a 301 on you.com (pointing to us.com) as well as changing the one on me.com. This creates a second hop in some cases, but it gives the caches time to be updated and speeds up indexing, in some cases dramatically.

It's a simple fix, but it can save you a lot of time and headaches.

Want another fix? (Best Practice)

When you create a 301 redirect, prevent it from being cached if there is a possibility that you will change it in the future.

If you don't think you'll be changing it, then just do what you usually do - it will cache automatically, which is normally a good thing.

If you might change it, or expect to change it, you can disable the cache in one of two ways: in the server response or in the redirect page itself. The server response is better, IMO.

  • (BEST) Use HTTP headers in the server response to send "Cache-Control: no-cache" (see the sketch after this list).
  • Or, if you can't do that on your server, you can use a "Pragma: no-cache" meta tag in the head area of your HTML redirect page. But real server headers are better than the pragma.
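
As an illustration of the server-response approach, here's a minimal sketch in Python using Flask (the framework, route and URLs are just placeholders - the same idea applies to IIS, Apache, or whatever you actually run): a 301 that explicitly tells caches not to store it.

    from flask import Flask, redirect, make_response

    app = Flask(__name__)

    @app.route("/old-page")
    def old_page():
        # Permanent (301) redirect, but marked uncacheable so it can be changed later
        resp = make_response(redirect("https://www.example.com/new-page", code=301))
        resp.headers["Cache-Control"] = "no-cache"
        return resp

    if __name__ == "__main__":
        app.run()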
Redirect Codes

  • 300 Multiple Choices – HTTP_MULTIPLE_CHOICES
  • 301 Moved Permanently – HTTP_MOVED_PERMANENTLY
  • 302 Found – HTTP_MOVED_TEMPORARILY
  • 303 See Other – HTTP_SEE_OTHER
  • 304 Not Modified – HTTP_NOT_MODIFIED
  • 305 Use Proxy – HTTP_USE_PROXY
  • 307 Temporary Redirect – HTTP_TEMPORARY_REDIRECT
Ian

Google International PPC Ad Targeting Twist

After some testing and a confirmation from Google, I've discovered a new (to me, anyway) twist to Google's PPC Ad targeting for international sites.

In SEO, Google decides on the country your site is relevant for, first by ccTLD (Country Code Top Level Domain - like .ca, .uk, .mx, and so on) and then by IP address and other methods.

Turns out this thinking also applies to PPC.

If you target a specific country, like in my case, Mexico, Google will only show ads to surfers within Mexico, right?

Wrong. They show ads to surfers in Mexico, yes. But they also show ads on Mexican tagged sites, such as sites with a Mexican IP address or sites that use the ccTLD of .mx, even if they are hosted in the US, etc.

It makes sense if you think about it. Of course, you may be thinking that this would only apply if you have chosen to advertise on the content network. You'd be wrong.

See, even if you don't advertise on the content network, there is a site that is a .mx and is not on the content network - www.google.com.mx

That's right. Even if you target only Mexico, your ads can show up to anyone anywhere in the world who happens to use google.com.mx. Anyone who needs to look at or verify geo-targeted ads knows it's sometimes hit-and-miss, but they do show up.

I first really put it all together when I was looking at traffic patterns for my Mexican campaign and noticed a bunch of traffic from the US and even India. This is normal, and is not necessarily an issue, as long as you are aware of it.

But let's say you really don't like it. There is a fix.

Instead of targeting Mexico (or wherever) you simply target physical locations within Mexico, such as cities, and lat/long areas. This forces a geolocation check and gets rid of your Japanese traffic.

Just remember that the traffic from Japan may be traveling Mexican business-people, since they would have to be using google.com.mx, and you are now excluding them.

But if that's what you need to do, then that's how you do it.

Ian

Interesting Antispam Tactic: Bot Bomb

My son Tas plays an online game called Mabinogi and today mentioned to me a new feature in the game, which had been overrun by something called bots.

Bots are basically programs that control player characters within the game, taking over for the person who is supposed to be doing it. There are many types of bots, including ones that do repetitive tasks so the player doesn't have to, and, the worst, spam bots.

These basically stand in the middle of places where people congregate and shout out advertising (typically for game money that was itself farmed by bots and sold for real-world money). This is against the game rules and makes it no fun for players who are actually trying to play the game.

In order to combat this (which was ruining this and many other online games), the creators of the game did all they could from a game security perspective, but it was simply too difficult to keep up with and detect the spammers, who went to great lengths to simulate human behavior well enough to fool a computer.

But not a human. Experienced human players can detect bots (and spam) almost instinctively.

The game designers finally decided to try something different: using humans to detect bots. Within the game, they created something called a "bot bomb" which can be thrown at a suspected bot. The "bot bomb" then asks a very simple question, that any human player would be able to answer with no difficulty. Essentially a reverse Turing test, like a CAPTCHA.

If the bot fails, it's logged out and the account flagged. Time limits prevent humans from being "bot bombed" so often that they can't respond properly.
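Just to make the mechanic concrete, here's a rough sketch of how such a challenge-plus-rate-limit might look in code. Everything here (the cooldown value, the questions, the function names) is hypothetical - it illustrates the idea, not how Mabinogi actually implements it.

    import random
    import time

    CHALLENGE_COOLDOWN = 300   # assumed: minimum seconds between challenges per player
    last_challenged = {}       # player_id -> timestamp of the last challenge

    QUESTIONS = [
        ("What color is the sky on a clear day?", "blue"),
        ("How many legs does a horse have?", "4"),
    ]

    def bot_bomb(player_id, answer_fn):
        """Ask a simple human-verifiable question; flag the account on failure."""
        now = time.time()
        if now - last_challenged.get(player_id, 0) < CHALLENGE_COOLDOWN:
            return "skipped"   # rate limit protects real players from challenge spam
        last_challenged[player_id] = now
        question, expected = random.choice(QUESTIONS)
        if answer_fn(question).strip().lower() == expected:
            return "passed"
        return "flagged"       # log the account out and flag it for review

    # Example: a "player" that always answers "blue" (passes or fails depending on the question)
    print(bot_bomb("player_1", lambda q: "blue"))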

Why is this on my blog? Because game theory is a very important aspect of how the modern web functions. Dealing with search and SEO has more in common with an online "game" than with offline human-to-human behavior, to the ongoing annoyance of the search engines.

It occurs to me that a method that accurately and easily harnesses humans as spam detectors, while not overloading the system with unmanageable amounts of spam reports (and fake spam reports) is something that can be learned from by website owners and search engines alike.

Use humans to detect the spam, but use the system to verify it independently, in order to minimize false or malicious reports.

Interesting.

Ian

New Use for Canonical Tag - Geolocation.

I've been messing around with it, and I believe that I've found a new use for the Canonical tag - geolocation of gTLD's.

Simply park (not redirect) a ccTLD on your site, upload an HTML sitemap that points to all your pages with the ccTLD, then place the canonical tag with the gTLD on your pages.

Voila - the search engine "tags" your pages as geolocated via the ccTLD, but only displays the gTLD.

You keep your .com, but have now geolocated your site without having to host it locally.

This is a variation on a technique I've been using for a while, but much cleaner due to the addition of the canonical tag.

Ian

The Relationship Between Search Behaviour and SERP Landscape (Part 2)

This is the second and final part of a 2 part series on the relationship between search behavior and SERP landscape. Part 1 is here.

PART 2: THE EFFECT OF SERP LANDSCAPE ON SEARCHER BEHAVIOR

The seeds of this article arose when I was studying differences in searcher behavior between countries.

I noticed that in many countries, especially those where widely available internet access is relatively new, searchers seem to show, in general, lower stages of search behavior, often progressing no further than Stage 2 - Exploration.

At first, I assumed that this less sophisticated behavior was due to less experience with the internet, an assumption I now know to be false and, to a degree, ethnocentric.

The assumption that searchers in these countries are less sophisticated due to a lack of experience can be supported by the massive growth rate (500% or higher is common) of internet availability, but does not explain why even people who have had the internet for years in these countries still tend to show less sophisticated search behavior as a whole than searchers in developed countries.

There is no "national searcher" - everyone progresses by themselves as an individual due to their own experiences. Additionally, in many of these countries (typically developing countries) people who have internet tend to be well educated and spend a lot of time online - they should progress fairly quickly, yet often do not. So the question is - what's going on?


I believe the answer lies in the SERP landscape, and it's been hidden there due to the halo effect of the online industry as a whole.

You see, my research appears to show that searchers everywhere will naturally try to progress through the stages of search sophistication as they use search more and more.

However, if the SERPs do not support that progression, the progression stops. In short, increases in the sophistication of tool use are limited by the sophistication of the tool itself. Some tools simply lend themselves to more sophisticated actions than others.

How could this happen? Many ways.

Sometimes, it's due to limitations of the search engines themselves. Back in the days before Google, searcher behavior was also fairly unsophisticated, since attempts at sophisticated searches were generally foiled by small indexes filled with spam. It simply wasn't worth the effort to try to improve your search techniques when the search engine still gave you poor results.

Another reason this could happen is due to the available results. If nothing but spam is available, then you will get nothing but spam. If local websites are designed in a manner that is not search friendly, then no matter how good you are at searching, you are still unlikely to find what you need, because the search engine simply doesn't have it to give you. It may exist, but it's not available through a search engine.

GIGO - Garbage In, Garbage Out.

It is this second reason that appears to be why searcher behavior tends to "stall" in some countries and languages.

Although part of the blame sometimes lies on the search engines themselves for not adapting to local restrictions, culture and resources, the majority of the blame for this is, very simply, poor websites from an SEO (and often usability) perspective.

As websites for the region or language become more sophisticated and accessible to search engines (and start carrying useful information rather than being just web brochures), the search engines will begin to be able to provide a better sampling of results, thus allowing more advanced searcher behavior.

Analogy - Buying some Beans

As an analogy, think of going shopping to go get a can of beans at two different stores. The first store is a well run national chain with a huge selection and a logical, clear layout. Since you have been there many times before, and you already know exactly what you want, you will probably know exactly what to do, even if you have never bought beans before at that store.

You will go to the aisle that is most likely to contain the beans, glance quickly down it for the area most likely to contain the beans and head right there. On the way, you have probably already checked for coupons and sales, and have a very good idea of exactly what brand and size you want, along with what brands you may not want. You may also know whether you want to get the beans from the "All Natural" aisle, the "Ethnic Foods" aisle, or the general "Soups and Canned Goods" aisle.

Within seconds, you have exactly what you want, after following what is, if you think about it, a very sophisticated and effective search pattern, that was helped by a large, well-organized selection to choose from. This is Stage 4 - Control mode. You take control of your search and make it work for you.

Next, contrast this with going to get a can of beans from a nearby store that is very messy, disorganized and has little stock. First, forget coupons, the "All Natural" aisle and all of that. If you are lucky, there might actually be a "groceries" aisle. Your can will be there, in with the other cans, if you are lucky. If there is an organization to it, it's not readily apparent.

At this point, you have to throw planning and sophistication out and basically just start hunting through the shelves until you find a can of beans. The chances of it being exactly what you were looking for are remote, so you may then end up sifting through yet more cans in a vain hope that there might be another choice (hopefully one that has not expired). You may even decide to give up and either skip the beans altogether, or go check a different store. You are in Stage 2 - the Exploration mode.

In this scenario, it's not YOU who has changed and become less sophisticated, it's the shopping environment that has. You basically had to degrade your planning and shopping behavior to deal with the fact that sophisticated actions can only take place at the top end of the available actions, and the available actions are reliant upon the choices, support and quality present at the time.

Conclusion

In short, searchers in China, Mexico and elsewhere are only searching in unsophisticated manners because the SERPs themselves are unsophisticated, not because of some sort of cultural norm, which is often currently the assumption.

I hear this all the time: "The Chinese search like this" or "Mexicans tend to do searches this way", but this is ethnocentric and misleading.

It would be more accurate to say that "searchers in China do this" or "searchers in Mexico have to search this way". The sophistication of the searches is based on the sophistication of the search landscape, not the searchers themselves.

There are 2 major conclusions of interest to the search community that can be taken away from this, IMO.

  1. If a market shows unsophisticated searcher behavior (as evidenced by the types of searches performed), then there is almost certainly an excellent potential market for SEO (along with an attendant lack of awareness of SEO in the first place). Additionally, due to this lack of sophistication within the market, SEO and PPC are likely to be very effective. There is simply less high-quality competition.

  2. Since searchers will increase in sophistication as the available search landscape evolves, it is important to prepare websites for more sophisticated searches (ie long tail terms, searches for specific on-site information rather than just contact information, etc) rather than simply rely on current KW research. You will need to evolve within your market as your searchers do. And they will, as quickly as the market does.
Hopefully these 2 articles have been helpful to you. I know the insights I've taken from coming up with them are serving me well with my current international SEO efforts.

Ian

The Relationship Between Search Behaviour and SERP Landscape (Part 1).

PART 1: STAGES OF SEARCHER BEHAVIOR

The following is original research developed from several hundred interviews across 9 countries (Canada, USA, Mexico, Brazil, China, Japan, Korea, England and France), combined with insights gleaned from both formal training in cultural anthropology and HCI (Human-Computer Interaction - aka usability). That said, I could be totally wrong. But I don't think so.

I was explaining this theory to a businesswoman in Brazil a couple of days ago and it occurred to me that I hadn't written about it yet, nor had I actually shared any of this with any of my SEO colleagues - forgive me, I'm writing a book on international SEO and as a result my blogging has been slow lately.

Here is the insight: People search differently at different stages in their comfort levels and experience with search engines, but progression through these stages is in turn affected by the SERP landscape they are provided.

This is most prominently (though not exclusively) visible within intranational searches, but this behavior can also be seen in any area where the SERP landscape is different from "standard", such as mobile search, image search, local search, and so on.

Searcher Behavior Stages

The speed of progression through these stages can vary based on how often the searcher searches (dozens of times per day vs a few times per month), the topics for which they search (each major topic will have its own progression), and whether they get help or training (or have previous experience).

But this is the general progression of search stages:

  1. Trust. A searcher at this stage is a novice, and has no idea what to expect as far as results for their search. In general, they will click on the first link that is not clearly spam or inappropriate (some novices to computers in general as well as search will not even attempt to look for spam or inappropriate results - they will simply trust the search engine to answer their question). If they don't find what they are looking for, they tend to blame themselves for making a bad search or assume the information is not available.

    Searches tend to be simple and general ("taxi" or "how to babysit"). Repeat searches may get slightly more complex (ie the addition of a location - usually at the end of the search as an addition to the original) but not too much more complex - they tend to assume the search engine is much smarter than they are and will figure it out. Users at this level often do not realize the difference between PPC ads and organic results (or don't care).

  2. Exploration. Fairly quickly, searchers tend to progress to the next stage (since the first tends to be unfulfilling), which is exploration. At this stage, there are several ways it can play out, but the most common is the "serial-clicker" - someone who goes through every (or almost every) result in the desperate hope that one of them will be an answer. Another common scenario is that the searcher, rather than going back to the original search results, will begin to click on links within the site they landed on, surfing from page to page and site to site. MFA sites make a lot of money based on this behavior. This is a very important stage to remember, and I'll tell you why in Part 2. These searchers may go several levels deep in the results (page 3 and beyond).

    At this point the user is still trusting of the search engine, so search queries tend to be very similar to the trust stage. Instead, the searcher is modifying their own behavior (still assuming the problem is with themselves or with the available data) by clicking differently and exploring the results to try to find their answers.

  3. Analysis. At this stage, the searcher starts getting smarter and more experienced. They have come to realize that clicking on more links isn't really the answer. At this point, depending on their personality, the results they have seen so far, and other criteria, they will begin to change their tactics. Some of the tactics they may try include one or more of the following:

    • looking at the results page for likely candidate sites before clicking on any (aka "sniper" mode)
    • trying other search engines
    • beginning to use more sophisticated searches and planning ahead (ie putting the location first, then the query)
    • Figuring out the difference between PPC and organic listings (and tending to avoid PPC)
    • Finding and re-using "tried and true" search patterns (like "X reviews", "X FAQ" or "X wiki")

  4. Control. At this stage, the searcher becomes sophisticated and takes control over their searches. They realize that the results a search engine provides are in part controlled by the search query itself, as well as by the results available. This is usually the level most people eventually find themselves at.

    At this stage, advanced search tactics are used, such as:

    • tiered searches (searching in a general manner, then using information gleaned from those results to perform the "real search" using the information and keywords from the previous search - like looking up the wikipedia entry for a topic, then using keywords and ideas from that to perform a second, "real" search)
    • searches based on likely content or title of a desired result, rather than the user's question
    • long tail searches become more prevalent
    • Simple parameters such as quotes or "results from this country" are more likely to be used
    • Actively trying to prevent bad or off-topic results by using negative parameters or less ambiguous terms.

  5. Expert. Most searchers do not reach this stage, as it requires study and is more difficult to do than the results are generally worth for most searches. Experts will use advanced search parameters, tiered searches, and other advanced techniques that require a good knowledge of search engine behavior. This category includes information and search professionals (SEO's, researchers, topic experts, advanced students).

    Searches are planned out, often "long tail" or tiered, and can include advanced and multiple parameters. PPC ads, often avoided at the control and analysis stages, will begin to be clicked on if they appear to answer the query. Experts are after the best result, and don't usually care how they get it. They will intelligently make exceptions to general rules of good searching if they believe the result will be good.

Well, that's my list of searcher behavior stages.

The next thing to realize is that these stages may repeat themselves for different queries or topics.

For example, someone may search at a Control or even Expert stage for a topic related to their work or hobby, but when confronted with something totally new (like planning a wedding) is likely to go through the stages again, though usually at a much faster rate than a new searcher would (sometimes in a few hours or less).

They know nothing about the topic, so they need to start by trusting the search engine again.

While the above information is useful in and of itself, in Part 2, I'll go over how people (and entire cultures/nations) can get "stuck" at certain stages, and the effect this has on international SEO and SEM.

Stay tuned.

Ian

Optimizing Powerpoint Presentations

I give a lot of presentations, and I optimize things for a living, so it was just a matter of time for nature to take its course and for me to start wondering what the hell I was doing.

See, there are a lot of different types of presentations (I'm talking about ones that use slides or PowerPoint here). There are presentations intended to inspire, presentations intended to sell, presentations intended to convince, and presentations intended to teach. Each type of presentation will require its own format and style.

I do mostly the teaching type, and I love doing them. Like anything else, if it's worth doing, it's worth doing right. For years I've been a frustrated victim of PowerPoint and so-called "advice" on how to make a great presentation. Enough is enough. It's simply not fair to the audience for someone in a technical field like SEO to try to teach them using the wrong presentation style.

Bad Advice

It's not so much bad advice, as it is the wrong advice. If you go looking, as I have, on how to make a great PowerPoint presentation, you'll find that most of the advice boils down to: "Make a presentation like Steve Jobs". All pictures, little or no text.


What was the point of this again?


Maybe that's great advice for Steve Jobs, who is trying to sell you overpriced, proprietary hardware and software by making it seem really, really cool, but for someone just trying to teach people how to get better rankings or convert visitors into customers, cute graphics and drop shadows simply don't cut it.

People at search marketing conferences don't want to be inspired (they already are, that's why they have a site they want to promote), they want to get their money's worth by learning tactics and strategies they can actually take back and use. It's about being practical and effective, not about getting excited over the latest color available for the iPod.

You simply can't use a Steve Jobs / Seth Godin type of presentation to do anything other than inspire. There is nothing wrong with inspiration (I love Seth's work), but inspiration won't fix that 302 redirect issue you have, or give you a checklist for finding good links.

The worst part is when you need to take notes. As an audience member, you probably can't write fast enough (or perhaps you were at the session next door) and may want to download and review the presentation later instead. Fat lot of good that text-free picture of a monkey with a banana will do then!

Presenters at technical conferences learn this very quickly (assuming they even get invited back) and change their presentations to include examples, data and text so that the presentation will be more useful - basically helping the audience take notes, so they can spend their time listening to you rather than scribbling madly.

That's the point, really - avoiding too much note-taking or boredom. Inspirational presentations do this with big picture statements and great graphics, technical/teaching presentations do this by providing the notes so you don't have to write them yourself. Both accomplish the same goals, just in different ways.

Over-Correction

This brings me to the next annoyance with technical/teaching presentations - information overload. These presentations are the opposite of inspirational ones - they are basically a speech or lecture in slide form. I've even seen people do the entire presentation simply by reading every line of every slide - horrors!


An old presentation of mine from 2006. Ouch.


I think these presentations are even worse than the inspirational ones, in that although it solves the problem of note-taking, there really isn't a reason to attend the conference anymore, is there? These presentations are boring - which is one of the worst insults you can give to a presentation, in my opinion.

The font size keeps getting smaller as more and more information is packed in, the number of slides increases dramatically, and many presenters find themselves fast-forwarding or skipping slides due to time constraints. This is very, very annoying.

Showing me less information than you know is one thing, but skipping past slides of potentially juicy information because you ran out of time is unprofessional and frustrating.

Making the Ideal SEO/SEM Conference Presentation

So, what to do? How do you inspire people enough to make them interested in what you have to say, but at the same time, give them enough information that it's worth it for them to get inspired in the first place?

Well, let's break it down. What do you need to accomplish? You need to:

  • Make the presentation interesting and even entertaining to a live audience.

  • Free people from the necessity of writing a lot of notes. It's great for them to make "AhHa" notes to themselves as they listen, but they should not be trying to scribble down every word you say for fear of missing something. If your audience is doing this, you should have stuck with blogging.

  • Provide answers and resources, which you can use during the Q&A, as well as helping people who download your presentation. Although some people don't like to allow their presentations to be downloaded, they are missing the point (among other things).

    If your presentation is about your branding, then you are giving a commercial rather than a presentation and you should do everyone a favor and get the hell off the stage (unless the audience knew they were going to see a commercial in the first place). If you are trying to teach people, put a copyright notice on it, ask for a link and attribution, and let them download your notes.

  • Identify yourself. This is not only great from a branding and marketing standpoint, but it's also helpful to your audience, should they wish to contact you. How will they remember? You've let them download the presentation! It's now like a giveaway pen with your logo on it, except it's a lot more useful, more likely to be saved, and contains much better contact information. Virtual swag, and costs you nothing to reproduce!
The Perfect* Presentation

*A good presentation should be like a website - always being tested and improved. Perfection is a goal, not a state of being, especially for anything using computers.

Section 1 - Splash Page. This should introduce your presentation and yourself in a simple and easy to understand format - no marketing hype. Also the perfect place to point out where you can download the presentation, since the audience looks at this while you are being introduced, and it's really the only time they can/should take notes. They usually also look at this slide the longest, since it's a placeholder during the introduction. Never underestimate the value of a splash page. A splash page/slide has a job to do, and they are only useless if you forget that. Use it wisely and well, or don't use it at all.

Section 2 - TOC. As my old drill instructor said, "Tell 'em what you're gonna tell 'em, tell 'em, then tell 'em what you told 'em". Resist the urge to put in every slide - just the main topics and sections. This is the perfect opportunity to point out you have additional information at the end of the presentation that won't be shown (I'll get to this in a moment).

Section 3 - Visual Presentation. The main event! This is where you keep the graphics clean, the text light, large and punchy, and the entertainment value high. Don't go into excruciating detail. Provide an overview and enough information so that people will be introduced to what they don't know and what kind of questions they should be asking you and themselves.

Remember many members of your audience will only be interested in certain sections, and just want an overview on how it all fits together. But don't over correct and be too shallow. A good way to accomplish this is to do an overview on a topic or sub-topic and then give some specific, practical tips related to that topic or sub-topic.

Section 4 - FAQ. If you have one or two questions that always come up that are not easily addressed in section 3 - put them in here. This is an audience favorite, and establishes you as helpful and available, rather than just a lecturer. This section is optional, but recommended. It does 3 jobs - it answers questions, it signifies the end of the official presentation, and it provides a time cushion where you can easily skip this part or go into greater detail, depending on how much time you have, without making the audience feel they have been ripped off.

Section 5 - Resources. This is where you put all that juicy info that you can't put in the visual presentation because you'd need to use small fonts and crowded screens. Charts, data, links, a bibliography and so on can all be placed in here. This is a great incentive for the audience to download your presentation if they want, but if they only wanted an overview, they don't feel they are forced to. You can also put in the kind of information you may want handy during the Q&A part of the session, but didn't want to get into during the main presentation.

Section 6 - Contact information. This is more than just your name and email. You can include a company overview, services you offer and even things like special offers and discount codes. This is your reward and incentive for making the presentation useful and available for download. It's low-key and not pushy, but highly effective at getting your marketing message out without ruining the on-stage presentation.

Wrap Up

I hope this has been helpful to you (and your audience). If you decide to use this format, feel free to leave a link in my comments area to the presentation for others to see and be inspired by.

Ian

Are you an SEO/SEM in Mexico or Brazil?

I'll be visiting Mexico from Feb 16-24 and Brazil from Feb 24-Mar 3 for the purposes of meeting other SEO's, learning about the local search marketing environment and culture, and generally beginning to learn more about and promote those two countries as excellent SEO targets for international businesses.

If you live in either place (specifically Mexico City, Acapulco, Cancun, Sao Paulo and Rio de Janeiro) I'd love to meet with you - I'm also looking for local contacts and sources, and can in turn introduce you to my other contacts and sources internationally. I'll buy the beer :)

And, just to make this post more interesting, I'm in the process of finishing up (finally) my book (International SEO) and I'd like to share with everyone the top 20 international SEO/SEM countries (there may be a few surprises in here for some people):

This list was created by comparing overall population, internet users, internet penetration, broadband subscribers, and user growth for 194 countries. Ranked in order of internet broadband users:

1. United States
2. China
3. Japan
4. Germany
5. Korea, South
6. United Kingdom
7. France
8. Italy
9. Canada
10. Spain
11. Brazil
12. Netherlands
13. Taiwan
14. Australia
15. Mexico
16. Turkey
17. Russia
18. Poland
19. India
20. Vietnam

I intend to meet with SEO's and other search marketers in every single country before publishing the book, hence my trip to Brazil and Mexico.

Cheers,

Ian

3 Factors of International SEO

International SEO is just like regular SEO, but with a type of personalization.

Arguably, the first step towards true search personalization by the search engines was international search. This was followed by local search and universal search, with even more ideas coming along, including logins, "answers" and so on.

But it started with international search, because that was the big issue - how do you deliver relevant results if you can't even present them in the language of the visitor? How can you claim a result for someone looking to buy something is relevant if the companies in the results don't even ship to where that searcher is? Unless you restrict yourself only to a certain region, you can't.

As a result, international search is, even with all its little issues and odd behaviours, one of the more settled and stable types of personalization. Which means you can actually DO international SEO, rather than just hoping you are doing it. As a matter of fact, I'm writing a book on that very subject right now.

First things first: standard SEO still counts. You still need good technological setup, links and content. If you don't start here, then internationalizing your site will only make things harder, not easier.

Once you have that in place, here are the 3 main factors for internationalization:

Localization

Localization is the act of making your site relevant and useful to visitors from a particular locale. This includes:

  • Translation and language localization (examples include US vs UK spellings and regional terms like "pop" vs "soda")

  • User interface (examples: sites aimed at Asian languages should be more link-heavy than English sites, you need to avoid menu systems that assume that the words in the menu are a specific width, and your order forms need to accept addresses in the local order)

  • Keyword research - there is no Chinese "WordTracker" or "Keyword Discovery" program currently available. I often need to run PPC campaigns to do any KW research at all. This is why I use SEO-trained translators - they cost more, but are worth it.

Geolocation

Geolocation is the process of figuring out the best localized site to present to a particular visitor. Often this is done by matching a visitor's IP address to a country and serving the country-specific version of a website (IP geolocation), but this is a simplistic method that doesn't account for today's highly mobile workforce - an American visiting Japan may want an American site, or they may want a Japanese site; you don't know, and can't guess easily.

And what country is Google from? More specifically, which site do you present to search bots, and how do you do it while avoiding problems? This is an entire article by itself, and it's not as easy as you may think.

Geolocation, then, is the process of both choosing the right version of your site for visitors, and also helping the search engines make the same choice.

Here are some ways to help do this:

  • Geolocating the website. The most important step of the process is letting everyone know what country your site is targeted to. You can ONLY target one country per page, and in reality, for most organizations, one country per site or sub-site. Methods of doing this include using a ccTLD (country code top level domain, like .ca and .uk) and using an IP address that is assigned to a particular country.

  • Geolocating the visitor. This can be done in several ways. You can detect their IP address, you can ask them to identify their preferred site, and you can detect their browser settings.

  • Visitor language detection. Many countries have more than one official language, and even countries with only one official language may have substantial alternate linguistic populations (for example, Spanish speakers in the US, English speakers in Korea). This means you can't just assume that country = language. Language can be detected using browser settings, the keyword used, user choice, or (as a last resort) geolocation. (A sketch of the browser-settings method follows this list.)

  • Website language detection. This is harder than you may think. Many languages share words or have similar characters, making it hard for a computer to automatically detect a language. This can usually be solved by declaring the language within the code.
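
As an illustration of the browser-settings method of visitor language detection, here's a minimal sketch in Python that parses an Accept-Language header into an ordered list of language preferences. This is just one signal among those listed above, and the header value shown is made up.

    def parse_accept_language(header_value):
        # Turn "es-MX,es;q=0.8,en;q=0.5" into [("es-MX", 1.0), ("es", 0.8), ("en", 0.5)]
        languages = []
        for part in header_value.split(","):
            piece = part.strip()
            if not piece:
                continue
            if ";q=" in piece:
                lang, q = piece.split(";q=", 1)
                try:
                    quality = float(q)
                except ValueError:
                    quality = 0.0
            else:
                lang, quality = piece, 1.0
            languages.append((lang.strip(), quality))
        return sorted(languages, key=lambda item: item[1], reverse=True)

    # A visitor who prefers Mexican Spanish, then generic Spanish, then English
    print(parse_accept_language("es-MX,es;q=0.8,en;q=0.5"))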
Globalization

Globalization in this context means bringing it all together. Combining the localization and geolocation in order to present the best site for any visitor. Creating a system that treats every visitor as special, and dealing with their language choice as easily as dealing with the fact that they want your product in size 8, or shipped overnight, or paid for in Euros.

Geolocation and Localization focus on the differences between the countries, languages, and cultures of each visitor, while Globalization focuses on the similarities between visitors, and works towards communicating a message or selling a product seamlessly.

The main mistake I see during globalization efforts is an inappropriate division of responsibility. There are some things that head office should be in control of, and some that the local office should be in control of.

Head office should run the branding and overall marketing strategy for the company. Local offices should then take that strategy, and devise tactics that will work in the local market that further the global strategy.

This means, of course, that head office should control the urge to create "strategies" that are really tactical - like choosing the exact wording of ads, and so forth. Because none of that will matter during translation, and will almost certainly make it worse. Let the locals sell to the locals. You just provide them with the tools and support to do it.

Likewise, local offices should control the local tactics, such as specific marketing copy, photographic images (other than product shots) and, within reason, timing. Launching a major campaign on your new fast food item may not be a brilliant strategy at the beginning of Ramadan, for example.

But local offices tend to be focused on their own area. They may not understand the global issues (including supply line problems) and often don't have as strong a sensitivity to the "brand" as head office does. A local office in South America infamously changed the BMW logo to better match its new website, for example. You don't do that!

Conclusion

In order to properly do international SEO, you need to address all three aspects of it - localization, geolocation, and globalization. Once you get these parts working in harmony, you'll find that it's actually fairly easy to keep doing and improving upon.


Ian

Search Engine Share 2008

I just made these for a presentation, based on the latest information I have for this year. Enjoy.



Organic Search Engine Share 2008 for North America

PPC (Pay Per Click) Search Engine Share 2008 for North America

Ranking Reports - a Defence.

You know you are getting old when you have an uncontrollable urge to start a post with words like "in my day...", "I remember when..." or "I've seen this before...".

But it's true - that's the problem. People who don't learn from the past are doomed to repeat it, or to even live through something worse. I was reminded of this while reading a recent forum thread about ranking reports.

Someone started complaining that Google had prevented a certain piece of ranking software from getting information from them. This was almost universally followed not by useful advice, but by a bunch of people saying ranking reports were useless. You see this type of thing with regard to PageRank, as well.

I appreciate that the posters meant well, but I really wish people would think clearly, rather than spouting off the latest over-correction crap they've heard recently. Regardless of the topic, there seems to be a consistent flow to this:

  1. Something is pushed too far. Could be a theory, or political view, or whatever.
  2. Experts, concerned about this, issue reasons why those pushing too far are wrong
  3. A new generation of people, in a fit of rebellion against the status quo, quickly grab the "new way of thinking" and then, you guessed it, go too far with it.
  4. Eventually, there is a counter-rebellion and the thinking about the topic matures, now that both sides have been expressed fully and challenged.

The examples are legion. Think about any political, civil rights or scientific revolution and you will see this. So it should be no surprise that it affects SEO, as well.

That being the case, let this be the first shot fired in the counter-revolution against Ranking Reports.

See, in the old days, no one did ranking reports. Who cares how you rank if no one ever visits your site? Instead, server reports were used. In this case, it was often "hits". At the time, hits were not a bad measurement. Pages were very light on graphics, and e-commerce wasn't really a workable concept - people were after "branding" and "buzz". In this context, measuring hits was an easy and fairly accurate way to measure the popularity of a site.

Even today, a popular site will tend to have more hits than a similar unpopular site. You can complain about issues with hits all you want, but the broad brush stroke is usually still representative.

Of course, there are many problems with hits - they are a method of measuring server performance, not visitor performance. A page with a single graphic on it counts as 2 hits (one for the html page and one for the graphic) whereas a page with 50 images on it (even 1x1 pixel images) would count as 51 hits. In both cases, there is only one visitor, however. This was exploited by spammers and bad SEO's who, rather than increasing visitors to a site, would sometimes just slowly add more invisible pictures to the pages in the site, thereby increasing the hits without increasing the actual visitors.

Soon, the industry became more sophisticated, the experts began to call for ranking reports instead of "hits" reports, and we began to move from server-based measurement to search engine based measurement. Instead of measuring our performance using internal criteria (server logs), we switched to external criteria (ranking reports).

After all, it can clearly be shown that your rankings directly affect your traffic and sales. I can see this in both PPC and SEO everyday across many industries. Additionally, measuring based on rankings was more closely tied to the job an SEO does - optimizing for search engines, not for server performance. Life was good. For a while.

Then, the problems with ranking reports began to emerge - first, search rankings can fluctuate from day to day and even minute to minute. Second, some clients wanted daily and even twice daily ranking reports for their top terms. This placed a lot of pressure on both SEO's and search engines, in many cases for no good reason. There is a point when granularity exceeds the information. You can measure too much, and too deeply, getting lost looking at bark under a microscope when you should be looking at the forest, or at least groups of trees.

Other issues with ranking reports were just as serious - ranking well for a term that no one searches for is useless, and sending unqualified traffic was just as bad as (or even worse than) sending no traffic at all. Finally, search engines started disallowing their results pages in robots.txt, blocking automated rank checkers. Suddenly, ranking reports didn't look so good anymore. The experts changed course again and began to recommend analytics instead.

Ironically, we've come full circle. We started doing server based measurement (hits and unique visitors), abandoned it in favor of search engine based measurement (ranking reports) and have now come back to server based measurement - except now we are a lot more sophisticated. We don't want hits anymore, we want unique visitors, time on site, fallout reports, and all sorts of other Key Performance Indicators (KPI). This is cool. It's scientific, it's visitor-focused, and it's got all sorts of pretty graphs and charts.

Ranking reports are now in disfavor, to the relief of bad SEO's who can now justify bad rankings (and their dislike of doing linking campaigns) by talking about how rankings don't matter - it's visitors and sales. How very convenient. It sounds great. Very forward-thinking and modern. Except it's wrong.

Why? Because there is no context. Your visitors and buyers have increased - big deal. Maybe your competition's numbers have also increased, but at a rate ten times yours. But you won't know that, because guess what? You are only looking at your own server for information. If it's not on your server, it doesn't exist. Analytics ignores the outside world and just builds a better navel gazer.

It's all well and good to make yourself better, to optimize the user experience and to increase conversions, but I have news for you - a qualified visitor does not usually stumble randomly to your site - they arrive from somewhere, and that is very often from a search engine. To say rankings don't matter is to do your clients and yourself a great disservice. The best website in the world is no good if no one can find it.

So what now? Go back to ranking reports? No - their weaknesses are well known and legitimate. The next step is to blend your rankings and other external data with your analytics. The visitor experience of your website doesn't begin when they land on your page, it should start as soon as they start looking for what you have to offer. Branding, buzz, social media, PPC and SEO all play a part in this, and thus far analytics has not adequately addressed it, particularly in SEO and social media.

Oh, they can say that people coming to the site based on keyword "X" convert at such and such a rate, but they usually don't suggest new keywords, or tell the analyst where the site ranked for keyword "X" that day, or whether the site was on the front page of Digg that day, or whether there was a news article, blog post or twitter campaign going on. That's important, but since it's not on their server, it doesn't get reported. There is no context, and therefore the information is suspect.

SEO's who "don't believe in ranking reports" don't get it. That's part of their job. If you want to avoid the robots.txt issues, do it manually or use a human-operated script rather than a robot (yes, there is a difference). It will help stop you from over-doing it, anyway. But to say that you optimize sites for search engines and then say that you don't believe in checking a search engine to see if your site is optimized for it is ludicrous.

If you have done your keyword research, a ranking report is a legitimate and extremely important part of the puzzle. It's only a part of the puzzle, not the solution, but it's a very important part of it. Data without context is meaningless, and context is provided from external sources of information, not self-referencing (internal) sources.

I think, like analytics moving from "hits" to KPI, search reports need to become more sophisticated, as well. A simple ranking isn't enough. You need more information. In many cases this information can come from analytics, but it can also come from linking reports, SE saturation, social media buzz references, and many other places. I call not for the return of ranking reports 1.0, but for ranking reports 2.0 - newer, improved and more connected to the visitor.
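As a rough sketch of what "ranking reports 2.0" might look like in practice, here is a small example that joins external rank data with per-keyword analytics KPIs so each number has some context. This is my own illustration - all of the data, field names and numbers are made up.

```python
# Hypothetical inputs: ranks from an external rank check, KPIs from analytics.
rankings = {"blue widgets": 4, "widget repair": 12, "cheap widgets": 31}
analytics = {
    "blue widgets": {"visits": 820, "conversions": 41},
    "widget repair": {"visits": 95, "conversions": 7},
    "cheap widgets": {"visits": 12, "conversions": 0},
}

def blended_report(rankings, analytics):
    """Merge external rank data with internal analytics so each keyword
    is seen in context rather than in isolation."""
    report = []
    for keyword in sorted(set(rankings) | set(analytics)):
        kpis = analytics.get(keyword, {})
        visits = kpis.get("visits", 0)
        conversions = kpis.get("conversions", 0)
        report.append({
            "keyword": keyword,
            "rank": rankings.get(keyword),  # None means not checked / not ranked
            "visits": visits,
            "conversions": conversions,
            "conversion_rate": conversions / visits if visits else 0.0,
        })
    return report

for row in blended_report(rankings, analytics):
    print(row)
```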

Doing the job properly requires BOTH server-based and external information sources. I hereby issue a call for all search marketers to move past the current myopic focus on analytics and to look at the whole picture - that's where true understanding lies.

Ian

Muphry's Law

AS mny of you know, I kan't spel good sumtimes.

However, I've learned to use the Google Toolbar spellchecker (and the Word spellchecker, when I'm in there) and now most of the time my misspellings are more of the "properly spelled wrong word" variety. You know, typing "it it" instead of "it is" or my most infamous mistake, which, no matter how hard I try, I seem to consistently use wrongly: "its" vs "it's" vs "its'"

This is a bit of a Yak Shaving day for me, as I was reading a post by Rebecca over at SEOMoz and it led me to another post which led me to another with this wonderful term in it: "Muphry's Law" (no, not "Murphy's Law", that's something related but different).

Muphry's Law dictates that (a) if you write anything criticizing editing or proofreading, there will be a fault of some kind in what you have written; (b) if an author thanks you in a book for your editing or proofreading, there will be mistakes in the book; (c) the stronger the sentiment expressed in (a) and (b), the greater the fault; (d) any book devoted to editing or style will be internally inconsistent.

In short, people wielding the sword of editorial righteousness tend to cut themselves with it. I've noticed this myself, and think it's really funny, especially since I'm almost always in the "spelling correctee" camp...

On a related note, there is another word I like: incorrection, or a correction that itself is incorrect. I used to get this all the time in legal discussions. I'd say something was X, then someone who'd obviously learned about the law watching the Jerry Springer show "corrects" me with a really stupid definition/usage. My favorite part is when they do this with that look on their face that says they pity me for being so stupid...

Anyway, this whole thing reminded me of misspellings and such, so I thought I'd throw together some interesting references:


Of course, not all misspellings are bad - you can do very well for yourself as a marketer if you bid on, SEO for, and register domains with, misspellings. Here are some tools and resources for this:


Other Hints

  1. If you bid on typos in PPC, make sure you put them into their own Adgroup and then remove the DKI (Dynamic Keyword Insertion) in your ads for that group - otherwise you look illiterate.
  2. Testimonials, ALT attributes, the keyword metatag (for Yahoo and MSN, not Google), filenames (ie misspelling.htm), image names (misspelling.jpg), and incoming anchor text from a "misspelling glossary" on your site, or from other sites are all great ways to show up for misspellings in organic SEO.
  3. For non-English languages, always include the spelling of words without special characters - if the word is "Montréal" or "piñata", then also optimize for "Montreal" or "pinata". Some people are using a US-style keyboard and find it easier to type without special characters, even if they know the language perfectly. They are used to Google helping them and giving them good results even though the query isn't perfectly spelled. You want to be one of those good results (see the sketch after this list).
  4. For English, don't forget "s" and "z" transpositions between the US and UK spellings - ie "optimize" vs "optimise".
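As a quick illustration of hints 3 and 4, here is a small Python sketch - my own, purely illustrative - that generates accent-free and "s"/"z" variants of a keyword list. The keyword list and function names are made up, and you should review the output by hand before bidding on or optimizing for it.

```python
import unicodedata

def strip_accents(word):
    """Return the word with diacritics removed, e.g. 'Montréal' -> 'Montreal'."""
    decomposed = unicodedata.normalize("NFKD", word)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

def sz_variants(word):
    """Swap the common -ize/-ise spellings between US and UK English."""
    variants = set()
    if "ize" in word:
        variants.add(word.replace("ize", "ise"))
    if "ise" in word:
        variants.add(word.replace("ise", "ize"))
    variants.discard(word)
    return variants

def keyword_variants(keywords):
    """Build a simple list of spelling variants to consider for PPC or SEO."""
    out = {}
    for kw in keywords:
        variants = {strip_accents(kw)} | sz_variants(kw)
        variants.discard(kw)
        out[kw] = sorted(variants)
    return out

if __name__ == "__main__":
    # Hypothetical keyword list
    sample = ["Montréal hotels", "piñata supplies", "optimize website"]
    for kw, variants in keyword_variants(sample).items():
        print(kw, "->", variants)
```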

Ian

PS: I'm well aware that since this is a post about spelling and I'm a lousy speller, I'm a prime candidate for Muphry's Law. So I've short-circuited the problem by deliberately leaving in some errors and announcing that I've done so. Let's see the damn Law deal with THAT... ;)

SEO-Browser.com Update

One of the nice things about being the co-creator of an SEO tool like the SEO Browser is that when I have an idea for something I'd like fixed or added, it's a lot easier to make sure it happens ;)

Of course, I have to pay for it, but at least it happens...

For those of you who don't know what the SEO Browser is, it's an online SEO tool that lets you see your site the way a search engine sees it. Although this sounds like something you can do in any text-only browser like Lynx, or in Firefox with certain options turned off, there is a lot more to it than that.

Some of the features (for some of them you have to switch to "Advanced Mode", found at the top of the SEO Browser page):
  • Text-only browsing of the site.
  • Image and object alt attributes are shown, but italicized so you know it's alt text.
  • Issues with robots directives, metatags, etc. are highlighted.
  • Character counts for titles, meta information, etc. are listed.
  • You can see pages in advanced modes, such as with "stop characters" removed or in compressed mode (how a search engine actually stores the page).
  • You can see WHOIS info, DNS info, header responses, etc.
  • Toggles highlighting of your keywords so you can visualize the page more easily.
  • Lists the keyword density for every keyword on the page (a quick sketch of this calculation follows the list).
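For anyone curious what a keyword density number actually means, here is a minimal sketch of the usual calculation - my own illustration, not the SEO Browser's actual code; the sample text is made up.

```python
import re
from collections import Counter

def keyword_density(text):
    """Return each word's share of the total word count, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    counts = Counter(words)
    return {word: round(100 * count / total, 2) for word, count in counts.most_common()}

if __name__ == "__main__":
    # Hypothetical page text. In practice you'd use the visible text of the page,
    # possibly including the title and alt attributes, depending on how you define "density".
    sample = "Blue widgets and widget repair. Buy blue widgets online."
    for word, pct in keyword_density(sample).items():
        print(f"{word}: {pct}%")
```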
One of the only things it doesn't do is give advice. It's a tool for professionals to gather information with, not some sort of mechanical SEO tool. I don't believe in programs that try to replace the skill of a real SEO. Basically, I designed it to do what I needed. Then shared it.

Anyway, I'm pretty proud of the latest tweak. It's such a simple thing, but can be huge when you are dealing with complicated sites. When you go to each page, it lists the response code at the top in orange.

That's all. But in practice, it's actually really important. I'll give you a few examples. Non-SEOs may not appreciate these, but the rest of you should be able to figure out why I like this so much.

First, you can just go to my home page: mcanerin.com [SEO Browser Version]

You'll see it's just a 200 OK. Big deal. Now, let's get more interesting. Check out these pages by loading them into the SEO Browser:
Cool huh? And very, very useful for people debugging sites. Try to spot the errors (some of them serious) in the example sites above. Then maybe take a look at your own.
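If you want to check a response code outside of the SEO Browser, a quick way is something like the sketch below - my own example, assuming the third-party requests package is installed; the URL is just the one from my home page example above.

```python
import requests

def check_status(url):
    """Print the raw response code and, if it's a redirect, where it points."""
    # allow_redirects=False so we see the actual code the server returns,
    # rather than the final page after any redirect chain.
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", response.status_code)
    if 300 <= response.status_code < 400:
        print("  redirects to:", response.headers.get("Location"))

if __name__ == "__main__":
    check_status("http://www.mcanerin.com/")
```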

Ian

Search Marketing In Latin America?

Every year, I go visit at least one of the major international search markets. Since my specialty is in global SEO/SEM it would not be very credible for me to claim to be an expert on marketing to a country I've never even been to, would it?

Actually, there are a depressing number of companies that do exactly this - they sell single, typically poorly translated "international" pages for sites, and then "submit" them to the local version of Google (ie Google.co.uk, etc). This is a scam.

First, you don't need to submit to Google - any version of it. Second, one page of content is highly unlikely to bring in any traffic to speak of, beyond desperation or long tail traffic. Finally, the whole concept is wrong. If you are not willing to invest more than one page in an entire market, then you have bigger problems with your marketing plan than SEO.

Anyway, although I have offices in Mexico, Brasil and Argentina, they are through a contract firm and I've honestly never been there or met the staff. So I'm going. As someone formally trained in anthropology, I'm a huge believer in the value of "field work" and "being there". Now we get to the reason I'm bothering to tell you all this:

If YOU were going on an SEM-related trip to South/Latin America - what would you be interested in learning? Where would you go? What would you do? I'll be happy to share what I learn with you, if you want to give me some guidance as to what you want to know.

I'm thinking of going in the fall, on advice from some friends in Argentina regarding the weather. Since I only have about a week or so, I need to be very focused. Here is a (very) rough idea of what I'm interested in seeing/doing:

Countries to visit: Mexico and Brasil. Argentina if I can, but it's iffy this trip, given the time constraints. Just so you know, the top 3 markets in the region are (in order): Brasil, Mexico, then Argentina.

Search Companies to Meet with (if possible): Yahoo. Anyone else?

Questions to be Answered: What is the real search market share in these countries? What are the scams that need to be avoided? What tactics seem to work the best?

If you can think of anything else, let me know, and I'll find out for you :)

Ian

PS: No, I'm not spelling "Brazil" wrong - the locals call it "Brasil".

Yay! Patent Number Assigned.

I've already mentioned that I've applied for a search related patent recently.

Well, I finally got official confirmation of the "Patent Pending" status and my very own USPTO Patent Application Number - 60/999,180, "System and Method for Website IP Address Based Geolocation".

Cool. Now I just have to finish up the control panel and it will be ready for public use.

Ian