
SEO-Browser Updated!

We've updated the SEO-Browser!

You can now use it to visit sites from the US or Canada, which is very helpful for multi-country sites. More countries will be added later, as the demand grows.

Canada: ca.seo-browser.com
USA: us.seo-browser.com

In order to add this functionality, we had to put in a free, one-time signup. Don't worry, we don't share emails with anyone.

You can still use it in simple mode without a signup, but the advanced mode is starting to get too difficult to program without assigning users and sessions, hence the login.

Eventually, the advanced mode will add things like ranking reports, etc - all of which require user/password combinations for privacy and data storage purposes, so this is preparation for that.

Google International PPC Ad Targeting Twist

After some testing and a confirmation from Google, I've discovered a new (to me, anyway) twist to Google's PPC Ad targeting for international sites.

In SEO, Google decides on the country your site is relevant for first by ccTLD (Country Code Top Level Domain - like .ca, .uk, .mx, and so on) and then by IP address and other methods.

Turns out this thinking also applies to PPC.

If you target a specific country, like in my case, Mexico, Google will only show ads to surfers within Mexico, right?

Wrong. They show ads to surfers in Mexico, yes. But they also show ads on Mexican-tagged sites, such as sites with a Mexican IP address or sites that use the .mx ccTLD, even if those sites are hosted in the US.

It makes sense if you think about it. Of course, you may be thinking that this would only apply if you have chosen to advertise on the content network. You'd be wrong.

See, even if you don't advertise on the content network, there is a site that is a .mx and is not on the content network - www.google.com.mx

That's right. Even if you target only Mexico, your ads can show up to anyone anywhere in the world who happens to use google.com.mx. Anyone who needs to look at or verify geo-targeted ads knows it's sometimes hit-and-miss, but they do show up.

I first really put it all together when I was looking at traffic patterns for my Mexican campaign and noticed a bunch of traffic from the US and even India. This is normal, and is not necessarily an issue, as long as you are aware of it.

But let's say you really don't like it. There is a fix.

Instead of targeting Mexico (or wherever), you simply target physical locations within Mexico, such as cities and lat/long areas. This forces a geolocation check and gets rid of your Japanese traffic.

Just remember that the traffic from Japan may be traveling Mexican businesspeople, since they would have to be using google.com.mx, and you are now excluding them.

But if that's what you need to do, then that's how you do it.

Ian

New Use for Canonical Tag - Geolocation.

I've been messing around with it, and I believe that I've found a new use for the canonical tag - geolocation of gTLDs.

Simply park (not redirect) a ccTLD on your site, upload an HTML sitemap that points to all your pages using the ccTLD, then place a canonical tag pointing to the gTLD version on your pages.

Voilà - the search engine "tags" your pages as geolocated via the ccTLD, but only displays the gTLD.

You keep your .com, but have now geolocated your site without having to host it locally.
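As a purely illustrative sketch of the canonical part of this setup (the domains here are hypothetical, and in a real CMS this logic would live in your page template), every copy of a page, whether reached through the parked ccTLD or the gTLD, emits the same canonical link pointing at the gTLD:

```
# Minimal sketch of the parked-ccTLD + canonical technique described above.
# Domains are hypothetical examples, not a specific implementation.

GTLD = "https://www.example.com"   # the domain you want displayed in the SERPs

def canonical_tag(path):
    """Return the canonical <link> element for a page, always pointing at the gTLD."""
    return '<link rel="canonical" href="%s%s" />' % (GTLD, path)

# The same page is reachable as example.com.mx/widgets/ (parked ccTLD, supplying
# the geographic signal) and example.com/widgets/ (the gTLD you want to keep).
print(canonical_tag("/widgets/"))
# -> <link rel="canonical" href="https://www.example.com/widgets/" />
```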

This is a variation on a technique I've been using for a while, but much cleaner due to the addition of the canonical tag.

Ian

The Relationship Between Search Behaviour and SERP Landscape (Part 2)

This is the second and final part of a 2 part series on the relationship between search behavior and SERP landscape. Part 1 is here.

PART 2: THE EFFECT OF SERP LANDSCAPE ON SEARCHER BEHAVIOR

The seeds of this article arose when I was studying differences in searcher behavior between countries.

I noticed that in many countries, especially those where the internet has only recently become widely available, searchers seem to show, in general, lower stages of search behavior, often progressing no further than Stage 2 - Exploration.

At first, I assumed that this less sophisticated behavior was due to less experience with the internet, an assumption I now know to be false and, to a degree, ethnocentric.

The assumption that searchers in these countries are less sophisticated due to a lack of experience can be supported by the massive growth rate (500% or higher is common) of internet availability, but does not explain why even people who have had the internet for years in these countries still tend to show less sophisticated search behavior as a whole than searchers in developed countries.

There is no "national searcher" - everyone progresses by themselves as an individual due to their own experiences. Additionally, in many of these countries (typically developing countries) people who have internet tend to be well educated and spend a lot of time online - they should progress fairly quickly, yet often do not. So the question is - what's going on?


I believe the answer lies in the SERP landscape, and it's been hidden there due to the halo effect of the online industry as a whole.

You see, my research appears to show that searchers everywhere will naturally try to progress through the stages of search sophistication as they use search more and more.

However, if the SERPs do not support that progression, the progression stops. In short, increases in the sophistication of tool use are limited by the sophistication of the tool itself. Some tools simply lend themselves to more sophisticated actions than others.

How could this happen? Many ways.

Sometimes, it's due to limitations of the search engines themselves. Back in the days before Google, searcher behavior was also fairly unsophisticated, since attempts at sophisticated searches were generally foiled by small indexes filled with spam. It simply wasn't worth the effort to try to improve your search techniques when the search engine still gave you poor results.

Another reason this could happen is due to the available results. If nothing but spam is available, then you will get nothing but spam. If local websites are designed in a manner that is not search friendly, then no matter how good you are at searching, you are still unlikely to find what you need, because the search engine simply doesn't have it to give you. It may exist, but it's not available through a search engine.

GIGO - Garbage In, Garbage Out.

It is this second reason that appears to be why searcher behavior tends to "stall" in some countries and languages.

Although part of the blame sometimes lies with the search engines themselves for not adapting to local restrictions, culture and resources, the majority of the blame for this is, very simply, poor websites from an SEO (and often usability) perspective.

As websites for the region or language become more sophisticated and accessible to search engines (and start carrying useful information rather than being just web brochures), the search engines will begin to be able to provide a better sampling of results, thus allowing more advanced searcher behavior.

Analogy - Buying some Beans

As an analogy, think of going shopping to go get a can of beans at two different stores. The first store is a well run national chain with a huge selection and a logical, clear layout. Since you have been there many times before, and you already know exactly what you want, you will probably know exactly what to do, even if you have never bought beans before at that store.

You will go to the aisle that is most likely to contain the beans, glance quickly down it for the area most likely to contain the beans and head right there. On the way, you have probably already checked for coupons and sales, and have a very good idea of exactly what brand and size you want, along with what brands you may not want. You may also know whether you want to get the beans from the "All Natural" aisle, the "Ethnic Foods" aisle, or the general "Soups and Canned Goods" aisle.

Within seconds, you have exactly what you want, after following what is, if you think about it, a very sophisticated and effective search pattern helped by a large, well-organized selection to choose from. This is Stage 4 - Control mode. You take control of your search and make it work for you.

Next, contrast this with going to get a can of beans from a nearby store that is very messy, disorganized and has little stock. First, forget coupons, the "All Natural" aisle and all of that. If you are lucky, there might actually be a "groceries" aisle. Your can will be there, in with the other cans, if you are lucky. If there is an organization to it, it's not readily apparent.

At this point, you have to throw planning and sophistication out and basically just start hunting through the shelves until you find a can of beans. The chances of it being exactly what you were looking for are remote, so you may then end up sifting through yet more cans in a vain hope that there might be another choice (hopefully one that has not expired). You may even decide to give up and either skip the beans altogether, or go check a different store. You are in Stage 2 - the Exploration mode.

In this scenario, it's not YOU who has changed and become less sophisticated, it's the shopping environment that has. You basically had to degrade your planning and shopping behavior to deal with the fact that sophisticated actions can only take place at the top end of the available actions, and the available actions are reliant upon the choices, support and quality present at the time.

Conclusion

In short, searchers in China, Mexico and elsewhere are only searching in unsophisticated manners because the SERPs themselves are unsophisticated, not because of some sort of cultural norm, which is often currently the assumption.

I hear this all the time: "The Chinese search like this" or "Mexicans tend to do searches this way", but this is ethnocentric and misleading.

It would be more accurate to say that "searchers in China do this" or "searchers in Mexico have to search this way". The sophistication of the searches is based on the sophistication of the search landscape, not the searchers themselves.

There are 2 major conclusions of interest to the search community that can be taken away from this, IMO.

  1. If a market shows unsophisticated searcher behavior (as evidenced by the types of searches performed), then there is almost certainly an excellent potential market for SEO (along with an attendant lack of awareness of SEO in the first place). Additionally, due to this lack of sophistication within the market, SEO and PPC are likely to be very effective. There is simply less high-quality competition.

  2. Since searchers will increase in sophistication as the available search landscape evolves, it is important to prepare websites for more sophisticated searches (i.e. long-tail terms, searches for specific on-site information rather than just contact information, etc.) rather than simply rely on current KW research. You will need to evolve within your market as your searchers do. And they will, as quickly as the market does.

Hopefully these 2 articles have been helpful to you. I know the insights I've taken from coming up with them are serving me well with my current international SEO efforts.

Ian

The Relationship Between Search Behaviour and SERP Landscape (Part 1).

PART 1: STAGES OF SEARCHER BEHAVIOR

The following is original research developed from several hundred interviews across 9 countries (Canada, USA, Mexico, Brazil, China, Japan, Korea, England and France), combined with insights gleaned from both formal training in cultural anthropology and HCI (Human-Computer Interaction - aka usability). That said, I could be totally wrong. But I don't think so.

I was explaining this theory to a businesswoman in Brazil a couple of days ago and it occurred to me that I hadn't written about it yet, nor had I actually shared any of this with any of my SEO colleagues - forgive me, I'm writing a book on international SEO and as a result my blogging has been slow lately.

Here is the insight: People search differently at different stages in their comfort levels and experience with search engines, but progression through these stages is in turn affected by the SERP landscape they are provided.

This is most prominently (though not exclusively) visible within international searches, but this behavior can also be seen in any area where the SERP landscape is different from "standard", such as mobile search, image search, local search, and so on.

Searcher Behavior Stages

The speed of progression through these stages can vary based on how often the searcher searches (dozens of times per day vs a few times per month), the topics for which they search (each major topic will have its own progression), and whether they get help or training (or have previous experience).

But this is the general progression of search stages:

  1. Trust. A searcher at this stage is a novice, and has no idea what to expect as far as results for their search. In general, they will click on the first link that is not clearly spam or inappropriate (some novices to computers in general as well as search will not even attempt to look for spam or inappropriate results - they will simply trust the search engine to answer their question). If they don't find what they are looking for, they tend to blame themselves for making a bad search or assume the information is not available.

    Searches tend to be simple and general ("taxi" or "how to babysit"). Repeat searches may get slightly more complex (ie the addition of a location - usually at the end of the search as an addition to the original) but not too much more complex - they tend to assume the search engine is much smarter than they are and will figure it out. Users at this level often do not realize the difference between PPC ads and organic results (or don't care).

  2. Exploration. Fairly quickly, searchers tend to progress to the next stage (since the first tends to be unfulfilling), which is exploration. At this stage, there are several ways it will play out, but the most common is the "serial-clicker" - someone who goes through every (or almost every) result in the desperate hope that one of the results will be an answer. Another common scenario is that the searcher, rather than going back to the original search results, will begin to click on links within the site they landed on, surfing from page to page and site to site. MFA sites make a lot of money based on this behavior. This is a very important stage to remember, and I'll tell you why in Part 2. These searchers may go several levels deep in the results (page 3 and beyond).

    At this point the user is still trusting of the search engine, so search queries tend to be very similar to the trust stage. Instead, the searcher is modifying their own behavior (still assuming the problem is with themselves or with the available data) by clicking differently and exploring the results to try to find their answers.

  3. Analysis. At this stage, the searcher starts getting smarter and more experienced. They have come to realize that clicking on more links isn't really the answer. At this point, depending on their personality, the results they have seen so far, and other criteria, they will begin to change their tactics. Some of the tactics they may try include one or more of the following:

    • looking at the results page for likely candidate sites before clicking on any (aka "sniper" mode)
    • trying other search engines
    • beginning to use more sophisticated searches and planning ahead (ie putting the location first, then the query)
    • figuring out the difference between PPC and organic listings (and tending to avoid PPC)
    • finding and re-using "tried and true" search patterns (like "X reviews", "X FAQ" or "X wiki")

  4. Control. At this stage, the searcher becomes sophisticated and takes control over their searches. They realize that the results a search engine provides are in part controlled by the search query itself, as well as by the results available. This is usually the level most people eventually find themselves at.

    At this stage, advanced search tactics are used, such as:

    • tiered searches (searching in a general manner, then using information gleaned from those results to perform the "real search" using the information and keywords from the previous search - like looking up the wikipedia entry for a topic, then using keywords and ideas from that to perform a second, "real" search)
    • searches based on likely content or title of a desired result, rather than the user's question
    • long tail searches become more prevalent
    • simple parameters such as quotes or "results from this country" are more likely to be used
    • actively trying to prevent bad or off-topic results by using negative parameters or less ambiguous terms.

  5. Expert. Most searchers do not reach this stage, as it requires study and is more difficult to do than the results are generally worth for most searches. Experts will use advanced search parameters, tiered searches, and other advanced techniques that require a good knowledge of search engine behavior. This category includes information and search professionals (SEOs, researchers, topic experts, advanced students).

    Searches are planned out, often "long tail" or tiered, and can include advanced and multiple parameters. PPC ads, often avoided at the control and analysis stages, will begin to be clicked on if they appear to answer the query. Experts are after the best result, and don't usually care how they get it. They will intelligently make exceptions to general rules of good searching if they believe the result will be good.

Well, that's my list of searcher behavior stages.

The next thing to realize is that these stages may repeat themselves for different queries or topics.

For example, someone may search at a Control or even Expert stage for a topic related to their work or hobby, but when confronted with something totally new (like planning a wedding), they are likely to go through the stages again, though usually at a much faster rate than a new searcher would (sometimes in a few hours or less).

They know nothing about the topic, so they need to start by trusting the search engine again.

While the above information is useful in and of itself, in Part 2, I'll go over how people (and entire cultures/nations) can get "stuck" at certain stages, and the effect this has on international SEO and SEM.

Stay tuned.

Ian

3 Factors of International SEO

International SEO is just like regular SEO, but with a type of personalization.

Arguably, the first step towards true search personalization by the search engines was international search. This was followed by local search and universal search, with even more ideas coming along, including logins, "answers" and so on.

But it started with international search, because that was the big issue - how do you develop relevant results if you can't even present them in the language of the visitor? How can you claim a result for someone looking to buy something is relevant if the companies in the results don't even ship to where that searcher is? Unless you restrict yourself only to a certain region, you can't.

As a result, international search is, even with all its little issues and odd behaviours, one of the more settled and stable types of personalization. Which means you can actually DO international SEO, rather than hoping you are doing it. As a matter of fact, I'm writing a book on that very subject right now.

First things first: standard SEO still counts. You still need good technological setup, links and content. If you don't start here, then internationalizing your site will only make things harder, not easier.

Once you have that in place, here are the 3 main factors for internationalization:

Localization

Localization is the act of making your site relevant and useful to visitors from a particular locale. This includes:

  • Translation and language localization (examples include US vs UK spellings and regional terms like "pop" vs "soda")

  • User interface (examples: sites aimed at Asian languages should be more link-heavy than English sites, you need to avoid menu systems that assume that the words in the menu are a specific width, and your order forms need to accept addresses in the local order)

  • Keyword research - there is no Chinese "WordTracker" or "Keyword Discovery" program currently available. I often need to do PPC campaigns to do any KW research at all. This is why I use SEO-trained translators - they cost more, but are worth it.

Geolocation

Geolocation is the process of figuring out the best localized site to present to a particular visitor. Often this is done by matching a visitor IP from a country to the country-specific version of a website (IP geolocation), but this is a simplistic method that doesn't account for today's highly mobile workforce - an American visiting Japan may want an American site, or they may want a Japanese site; you don't know, and can't easily guess.

And what country is Google from? More specifically, which site do you present to search bots, and how do you do it without causing problems? This is an entire article by itself, and it's not as easy as you may think.

Geolocation, then, is the process of both choosing the right version of your site for visitors, and also helping the search engines make the same choice.

Here are some ways to help do this:

  • Geolocating the website. The most important step of the process is letting everyone know what country your site is targeted to. You can ONLY target one country per page, and in reality, for most organizations, one country per site or sub-site. Methods of doing this are using a ccTLD (country code top level domain like .ca and .uk) and using an IP address that is assigned to a particular country.

  • Geolocating the visitor. This can be done in several ways. You can detect their IP address, you can ask them to identify their preferred site, and you can detect their browser settings.

  • Visitor language detection. Many countries have more than one official language, and even countries that only have one official language may have substantial alternate linguistic populations (for example, Spanish speakers in the US, English speakers in Korea). This means you can't just assume that country = language. This can be detected using browser settings, the keyword used, user choice, or (as a last resort) geolocation. A rough sketch of browser-based detection follows this list.

  • Website language detection. This is harder than you may think. Many languages share words or have similar characters, making it hard for a computer to automatically detect a language. This can usually be solved by declaring the language within the code.
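As promised above, here is a rough sketch of browser-based visitor language detection: parsing the HTTP Accept-Language header that browsers send with every request. The header format is standard; the parsing and tie-breaking details here are just one reasonable way to do it, not the only one.

```
# Rough sketch: parse an Accept-Language header into language codes, best first.
# This is only one input to the language decision, alongside user choice, the
# keyword language, and (as a last resort) IP geolocation.

def preferred_languages(accept_language):
    """Return language codes from an Accept-Language header, highest q-value first."""
    weighted = []
    for part in accept_language.split(","):
        pieces = part.strip().split(";q=")
        code = pieces[0].strip().lower()
        quality = float(pieces[1]) if len(pieces) > 1 else 1.0
        if code:
            weighted.append((quality, code))
    return [code for quality, code in sorted(weighted, reverse=True)]

# A visitor in the US whose browser is set to prefer Spanish:
print(preferred_languages("es-US,es;q=0.9,en-US;q=0.8,en;q=0.7"))
# -> ['es-us', 'es', 'en-us', 'en']  (serve the Spanish-language US version first)
```
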
Globalization

Globalization in this context means bringing it all together: combining localization and geolocation in order to present the best site to any visitor, and creating a system that treats every visitor as special, dealing with their language choice as easily as dealing with the fact that they want your product in size 8, or shipped overnight, or paid for in Euros.

Geolocation and Localization focus on the differences between the countries, languages, and cultures of each visitor, while Globalization focuses on the similarities between visitors, and works towards communicating a message or selling a product seamlessly.

The main mistake I see during globalization efforts is an inappropriate division of responsibility. There are some things that head office should be in control of, and some that the local office should be in control of.

Head office should run the branding and overall marketing strategy for the company. Local offices should then take that strategy, and devise tactics that will work in the local market that further the global strategy.

This means, of course, that head office should resist the urge to create "strategies" that are really tactical - like choosing the exact wording of ads, and so forth - because none of that will survive translation, and trying will almost certainly make things worse. Let the locals sell to the locals. You just provide them with the tools and support to do it.

Likewise, local offices should control the local tactics, such as specific marketing copy, photographic images (other than product shots) and, within reason, timing. Launching a major campaign on your new fast food item may not be a brilliant strategy at the beginning of Ramadan, for example.

But local offices tend to be focused on their own area. They may not understand the global issues (including supply line problems) and often don't have as strong a sensitivity to the "brand" as head office does. A local office in South America infamously changed the BMW logo to better match its new website, for example. You don't do that!

Conclusion

In order to properly do international SEO, you need to address all three aspects of it - localization, geolocation, and globalization. Once you get these parts working in harmony, you'll find that it's actually fairly easy to continue to do and improve upon.


Ian

Yahoo (YSM) Mexico Upgraded to Panama

I just received an email today from Yahoo informing me that Mexico is now on the new Panama PPC system.

This is good news for international SEMs like myself, since the old Yahoo system kinda sucked. Worse, it was really, really hard to integrate into PPC software like Omniture Search Center.

I haven't played with it yet, but I'll keep you updated.

Ian

-----------------------------------------------------------------------------------------

Original Email (translated into English with personal info removed, I apologize in advance for my bad Spanish):

-----------------------------------------------------------------------------------------

Dear Advertiser, Congratulations!

The new Yahoo! Search Marketing has been activated. Our system has new and advanced features that are very easy to use and that will help you connect better with the vast and valuable audience of Yahoo! and sites associated with Yahoo! Search Marketing.

Your new Sponsored Results account

Your new user name is 'XXX'. Before you can access your account, you will be asked to reset your password. After that, you will receive an email with instructions.

After you reset your password, you can access your account at https://login.marketingsolutions.yahoo.com/es_MX. If you cannot click on the link, please copy and paste it into your Internet browser.

To help you have a successful start, we suggest you become familiar with your new account by logging in today at https://login.marketingsolutions.yahoo.com/es_MX. Please be sure to have a complete campaign and add your billing information. We suggest that you bookmark the account login page in your favorites for easy access in the future.

The new Sponsored Results account structure

The interface of your account is as follows:

Administration - Manage your account information, including payment and billing information. In addition, set your daily spending limit, user privileges, and other options.

Please note that all information in your account, including the balance and budget, has been transferred to your new Sponsored Results account under a new account number. However, we will continue accepting pesos as payment. The amount will simply be converted to the equivalent amount in US dollars at the current exchange rate.

Control Panel - See a list of all your campaigns, a summary of performance and alerts that require your attention.

Campaigns - Here is where you do most of the management of your account. You can see all your campaigns, ad groups, ads and keywords. You can also create or edit campaigns and access the auction and forecasting tools.

Reports - Access numerous performance reports to monitor the success of your campaigns, including impressions, clicks, conversions and costs.

Access to your previous account

You still have access to your previous account (for reporting purposes only), which will be available for the next six months, but you may not make any changes.

We are very pleased to have you as an advertiser and we thank you for trusting Yahoo! for your business. If you need assistance, please contact our customer service team at:

Mexico: (+52 55) 3003-1909, or toll-free from the interior at 01800 123-8593. Argentina: (+54 11) 4837-8108. Or by email: latam-cs-ysm@cc.yahoo-inc.com

Sincerely,

Yahoo! Search Marketing and its Partners

Search Marketing In Latin America?

Every year, I go visit at least one of the major international search markets. Since my specialty is in global SEO/SEM it would not be very credible for me to claim to be an expert on marketing to a country I've never even been to, would it?

Actually, there are a depressing number of companies that do exactly this - sell one-page "international" pages for sites, typically poorly translated, and then "submit" them to the local Google (ie Google.co.uk, etc). This is a scam.

First, you don't need to submit to Google - any version. Second, one page of content is highly unlikely to bring in any traffic to speak of, unless it's desperation or long tail traffic. Finally, the whole concept is wrong. If you are not willing to invest more than one page for an entire market, then you have more problems with your marketing plan than SEO.

Anyway, although I have offices in Mexico, Brasil and Argentina, they are through a contract firm and I've honestly never been there or met the staff. So I'm going. As someone formally trained in anthropology, I'm a huge believer in the value of "field work" and "being there". Now we get to the reason I'm bothering to tell you all this:

If YOU were going on an SEM-related trip to South/Latin America - what would you be interested in learning? Where would you go? What would you do? I'll be happy to share what I learn with you, if you want to give me some guidance as to what you want to know.

I'm thinking of going in the fall, on advice from some friends in Argentina regarding the weather. Since I only have about a week or so, I need to be very focused. Here is a (very) rough idea of what I'm interested in seeing/doing:

Countries to visit: Mexico and Brasil. Argentina if I can, but it's iffy this trip, given the time constraints. Just so you know, the top 3 are (in order): Brasil, Mexico, then Argentina.

Search Companies to Meet with (if possible): Yahoo. Anyone else?

Questions to be Answered: What is the real search market share in these countries? What are the scams that need to be avoided? What tactics seem to work the best?

If you can think of anything else, let me know, and I'll find out for you :)

Ian

PS: No, I'm not spelling "Brazil" wrong - the locals call it "Brasil".

Yay! Patent Number Assigned.

I've already mentioned that I've applied for a search related patent recently.

Well, I finally got official confirmation of the "Patent Pending" status and my very own USPTO Patent Application Number - 60/999,180: "System and Method for Website IP Address Based Geolocation".

Cool. Now I just have to finish up the control panel and it will be ready for public use.

Ian

Google Webmaster Tools Geotargeting Added

Google just unveiled a new tool in their Webmaster Tools - the ability to set geotargeting for sites, even down to the street level if necessary. Naturally this is very cool, and interesting to those of us that deal with geolocation issues all the time.

The tool will not allow you to override the ccTLD, so you can't declare your .ca site to be from the US, for example, but if you have a gTLD like .com, .net and so forth, then you can. For many people, this is an excellent method for accomplishing what they want.

If you want your site to cover multiple countries, you would just create sub-domains - france.domain.com, for example - and then set the geolocation for that sub-domain to France. You can do this with multiple sub-domains, each targeting a different country. No site can be geolocated to more than one country, however.



Naturally, this will bring up the next question - "Hey Ian, since Google has this now, why would someone need your IPGeoTarget™ tool?"

Well, if you only care about Google and use Webmaster Tools, then you would not, and I would not in good faith recommend paying for any system as long as Google has a free tool that does the same thing.

However, since this only works for Google, you would not be able to geolocate for Yahoo, MSN, Ask, or other search engines, so that's certainly a limiting factor.

Second, this currently only works for entire sites (including sub-domains), but not directories or pages. Many companies have set up their sites like this: domain.com/canada/ - and Google's system would not help them in this case. This can cause big issues if your CMS doesn't support cross-site editing, or if you are looking at the possibility of 301'ing thousands of indexed pages (along with the related drop in traffic/rankings during the switchover).


Finally, not every company uses or likes to use Webmaster Tools (though I admit I find them pretty useful).

I admit the timing is a little annoying - I have no idea if it's a pure coincidence or if someone decided to speed up the announcement because of the interest in IPGeoTarget. Either way, it was just a matter of time before Google did this, since webmasters have been clamoring for it for some time, so I'm not worried.

My recommendation would be to use it right now if your site structure allows - Google does drive a fair amount of traffic, so it's not like it's a waste of time - and then add IPGeoTarget to the mix once it's available for your target country - hopefully next month.

Ian

A Patent Rant.

Andrew R H Girdwood is a very smart fellow. He's already posited that my new IPGeoTarget system is likely based on proxying, and of course he's correct - it really can't be done any other way.

Well, there is one other way - one could alter the geographic tag for an IP, either directly at the IP Mapping provider or after the fact, like a private database. Both of these have issues - mainly the problem of shared IP addresses. The real answer to the whole geolocation mess is to identify *domains* (or better yet, pages and directories) as geographically located using something additional to a ccTLD. I'm leaning towards either a metatag or an entry in a robots.txt file, myself.

Registering with Google would only help for Google, and not everyone wants to register everything they do with Google, particularly since they have been acting less and less like idealists, and more like a shareholder-owned corporation (unsurprisingly).

In particular, companies located outside of the US are hesitant to give additional information to companies (like Google) that are easily targeted by US laws that may not have their best privacy interests at heart. The feeling sometimes outside the US is that anything the Chinese government can order Google to do, so can the US government, and better, since that's where the head office is. It's not that they actively distrust them, it's just that international companies tend to not get to where they are by being blindly trusting with their data.

So until the search engines get together on this issue, it's going to continue to be an issue. Even afterward, it would still be nice to speed up connection times to visitors without having to physically move a site - there are reasons other than geolocation to use this type of technology.
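For anyone curious what "based on proxying" means in practice, here is a very rough sketch of the general idea (hostnames are hypothetical, and this is emphatically not the IPGeoTarget implementation): a small server hosted in the target country fetches pages from the real origin and passes them through, so visitors and search bots see an IP address assigned to that country.

```
# Very rough sketch of a pass-through (reverse) proxy; the origin hostname is a
# placeholder, and this omits everything a real service needs (request headers,
# POST support, caching, error handling, status passthrough, and so on).
from http.server import BaseHTTPRequestHandler, HTTPServer
import urllib.request

ORIGIN = "https://www.example.com"   # where the site is actually hosted

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Fetch the requested page from the real origin...
        with urllib.request.urlopen(ORIGIN + self.path) as upstream:
            body = upstream.read()
        # ...and return it from this server, whose IP is in the target country.
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), ProxyHandler).serve_forever()
```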

Anyway, Andrew also posts a worry that I may be trying to patent a technique that's already been done, or that tries to lock down common internet technology. I'll directly address that, since it's a legitimate concern and he's right to bring it up.

1) To the best of my knowledge, it's not covered/prevented by prior art (though of course almost everything on the internet has some sort of prior art connection simply by being on the internet), and

2) I'm not trying to patent the concept of a proxy, IP address, "Click to buy" button or anything that basic or obvious. Though you'd be surprised what can be patented nowadays.

At least, I hope so on the second item - every patent applicant has either nagging doubts or is delusionally self-important. I think my teen-aged delusions of infallibility have been quashed out of me after years of being in flame wars on forums, working with non-profit organizations, and having a family. I guess it's up to the patent office to ultimately decide, and for now I'm leaving it to them. The point is that I'm acting in good faith and trying to make things better, not prevent competition or cash in on anyone else's hard work.

Beginning of Patent Rant

I'm changing topics now - this has nothing to do with Andrew's post. I'm just on a roll and am too lazy to start a new post. Besides, if you read this blog you are probably used to really long, wandering posts by now. It's because I type exactly like I talk.

I've been asked several times what I would do if the patent office said no to my poor pending patent proposal for pinpointing positioning (how's that for an alliteration?), and I'm drawing upon my previous experience as the patent manager for a company with 72 patents worldwide for the answer.

The answer is that it doesn't matter. Surprised? Then you don't know as much about patents as you think you do. Experienced patent lawyers would not be surprised by my attitude (though they may be dismayed at the thought of losing all that money made during the process), and I'll tell you why.

At the end of the day, a patent is simply protection for a business idea, so if you can't make a business work from it, you've wasted your time on what is basically an ego trip. So patents don't matter, business concepts do. There, I said it.

It's more important to have a legitimate and profitable business than a patent, and some people (notably inventors, dreamers and narcissists) never seem to really get that, which is too bad, because then they get screwed by businesses that may not be as creative, but have a stronger drive to succeed and profit.

It happened to the original inventor in my previous company - he's broke now and doesn't even own any shares in his own company anymore, which is still going strong. The last I heard he was in hiding. That's what happens when you trust venture capitalists to run your company for you while you hope to rake in the royalties.

I was part of the "cleanup crew" hired after his original company imploded when the VC's exercised their "exit strategy", and I learned a lot from the experience.

In particular, I learned 2 very important lessons from the whole mess:

1) You don't have control over the patent process - other people do. Lawyers, competing companies with their own patents, owners of prior art patents who think they also own everything even slightly related to their own patents, law firms that buy vague patents and then make money suing people at the drop of a hat, "free spirits" who don't think anything should be patented/copyrighted/trademarked, naysayers who think everyone else's ideas are always wrong, friends who are worried you might be hurt, and, of course, the patent office. And not just the patent office, but the particular patent examiner you get. Then the whole thing starts over in every single country in the world that you try to patent in. I'm surprised anyone bothers to even try anymore!

2) You DO have control over your business - unless you give it up. Too many people think that if they get a patent then they can sit back and let the royalties roll in while others do all the work. Well, it's not that easy in the real world, which is why usually the only patent holders you meet that are rich are those that are astute business people, and it was their business dealings that made them rich, not the patent. IBM makes tons of money on royalties from its patents, but it's not because they sit around waiting for people to send them money - they work the angles and earn the royalties actively. The patent process itself can make you broke very quickly. Therefore, forget the patent, and focus on the idea. Is it a good idea? Great! Go make it work. Who cares if you don't have the patent yet? The fact that you are at "Patent Pending" generally scares off those that care about such things, and for those that don't, they don't care about whether the patent is granted or not.

I am personally aware of a well-known person in the SEO world with a patent that Google and Yahoo are both flagrantly in violation of. It didn't seem to stop them at all. This person knows if they sue it will likely be more trouble than it's worth. So really, what is the patent worth? Once again, it's not the patent, it's the business. Learn that lesson well before you decide to patent anything.

(I don't think this person is trying to keep this a secret, since otherwise they would not have done something as public as a patent, but I'll let them identify themselves or not as a courtesy, just in case.)

In the meantime, I'm going to proceed on the basis that even if the patent office disagrees with me, the usefulness of a company being able to open an account at IPGeoTarget.com, type in their URL or domain, choose a target country, and then be geolocated to that country with little other fuss or muss, will be a viable business model.

As a matter of fact, I'm about to spend a whole bunch of money on exactly that. The patent is icing on the cake and a nice angle in a sales pitch or press release. But a patent is not a business. Work on what you can actually control, and for the rest, do your best to set things up so they end up in your favor, then forget about it and deal with things as they come.

Bottom line: Thinking of applying for a patent? Make a detailed business plan first, because that's what it's really all about.

Ian

WINNER! - IPGeoTarget

That was fast - I already have a winner for the Name My Geo IP Service contest I just announced.

A big congratulations to Jill Whalen of HighRankings.com for the winning entry of IPGeoTarget™.

A special thanks also to Barry Welford of Strategic Marketing Montreal for his close runner-up suggestion - much appreciated!

I guess now I'm gonna have to get the trademark registered and start making the darn website with the IPGeoTarget™ service launched. I wonder if it's possible to code HTML in Braille... ;)

Ian

Win $100, a nice link, and more!

OK, I'm having a problem here. I'm trying to figure out a trade name for my new patent pending service. I actually thought this would be the easy part, but it turns out it's harder than coming up with the darn patent in the first place!

It doesn't help that I'm mostly blind and can't look at a computer screen for more than a few minutes at a time.

The service basically allows you to "set" the IP of your website to any country in the world that you may wish it to be. Why would you want to do that? Because if you are a .com and are hosted in the US, but are trying to sell to people in the UK, Google and the other search engines will decide that you are a US site based on your US IP address and you will show up well in the US, but not in the UK.

Normally, the answers to this are to:

  • register a ccTLD (not popular due to branding issues)
  • host in the target country (not popular with head office, usually for political reasons)
  • park a ccTLD on the .com (complicated, slow, and easy to mess up)

Now, you can just say "I want my website to look like it's hosted in the UK (or any other country) but actually be hosted here on my preferred servers in my own country." I do some magic and bingo, that's what happens.

In answer to some of the more usual questions at this point: No, it's not a spam technique, and can't be used as one (at least no more than anything else on the web), no, it doesn't create a duplication issue, and yes, your website logs and analytics will continue to work perfectly.

If you want to see it in action, you can check www.mcanerin.net, which is actually hosted in Toronto, ON, Canada, but appears to be hosted in the USA. Yeah, the site's ugly - it's a placeholder site until I get the new one up with a new name.

Which brings me to my problem - I CAN'T THINK OF A DAMN NAME!

Oh, I've thought of lots of names for the service/concept: geoswitch, geomirror, etc, but they have all been taken. Since I can barely see, this is a very painful process for me.

So I'm gonna try bribery...errr...a contest.

The rules are simple:

  1. the name has to be Trademarkable, and unusual enough that there are no websites with the name already (this is the tough one - I really liked "geoswitch"!)
  2. It should be easy for a non-tech marketing guy to explain to his/her boss and to reference in a PowerPoint presentation. "Sub-directed geotargeted reverse proxy system" just doesn't cut it. Think catch-phrase, not technical description.

Send your suggestion to mcanerin(at)gmail.com and I'll pick the winner from there. In case of identical suggestions, the first one submitted wins. There is no limit on suggestions, but PLEASE do a basic Google check before submitting it. The contest ends when I find something I like and can use. I'm trying to get this done as quickly as possible.

The winner gets $100 USD PayPal'ed to his/her account, fame (and a link), and a FREE lifetime small-to-medium website geolocation account for a country of your choice (as long as I have a server there) as soon as the system goes live. And my everlasting gratitude. :)

Ian

Geolocation - Your tool is probably wrong!

As you can tell from my most recent posts on IP Addresses and Geotargetting Adwords, I've been thinking a lot about geolocation recently. Up until recently, I've been suggesting that people use the IP2Location tool for checking geolocation, since there is a free demo online and they have free/cheap API tools.

The problem is that I've recently been testing IP2Location, and its database is not very accurate in my tests. Unfortunately, since it's easy to use and cheap/free, chances are any tools you use to check geolocation are likely using it.

For example, SEOMoz's Geotargetting Detection Tool uses what looks like IP2Location, and as a result my website (www.mcanerin.com) is apparently located in Glen Ellyn, Illinois, USA, rather than Toronto, ON, Canada.

Quite a difference. I can maybe see if you get the wrong city, but the whole country? It pretty much makes the tool useless.

SEOMoz Geotargetting Tool Screenshot for www.mcanerin.com:

It's nice that they provide the disclaimer at the bottom that the results may not be accurate, but it would be better to improve accuracy rather than work on more obvious disclaimers, so I'm not going to talk about the disclaimer and focus instead on the accuracy.

The thing is that I doubt the fine folks at SEOMoz have any reason to suspect that these results are wrong, since it's not their database, and they are apparently pulling it from a very well known IP geolocation database (IP2Location):



The problem is that there is no point in providing a tool if it's wrong. Worse, what if you were using this database to deliver ads? Identify where a visitor is coming from? Look for click fraud by trying to cross-reference IPs? Suddenly it gets more serious (and expensive) for this information to be wrong.

It's one thing for a free tool to be wrong, but when that same information is feeding your ad delivery software, wise marketers start asking questions and double checking results.

The Search Majors (Google, Yahoo, MSN, Ask, etc) use higher end IP Geolocation companies because the results are more accurate. This means that you probably should too.

The industry leaders in the commercial space are Digital Envoy (Google used to use them until DE sued them for more money), Quova, and MaxMind. Let's look at MaxMind's online tool results for "www.mcanerin.com":


Finally! The right answer! Fortunately for me and my site's geolocation, this is also the type of database that the search engines use. The only problem is that this quality comes at a price.

There is a possible free option, however: IPligence also gets the location correct:



I haven't fully checked this one out, but so far the (free) results have been very accurate. Certainly better than IP2Location's.

This post isn't about coming down on anyone, but rather a warning about the dangers of assuming that just because something gives you results for your query, it's accurate or up to date. Which you'd think us search marketers should know by now, including myself.

I apologise for recommending IP2Location up until now, and am now recommending either the MaxMind or IPligence online tools instead for a quick geolocation check.

Ian

IP Addresses, SEO and Your Site

I haven't talked about IP addresses for a while (and never in this blog). Since I've been asked several times recently about them, I think it's a good time to talk about them.

What is an IP Address?


Computers "think" in numbers, not words and letters. Humans tend to be the opposite. In order to deal with this, programmers and technicians have been creating technologies that translate between the two since shortly after computers were invented. Programming languages are an example of this.

Another example of this is DNS, or Domain Name Services. See, your website really can't be found at your domain name - it's found at an IP address. The domain name is just a human-friendly way of finding the IP address. Think of it like a telephone number. It's much easier to remember someone's name than a telephone number, but your telephone doesn't understand names, just phone numbers. So we invented phone books. You look up the name you remember in a phone book, which gives you a phone number that the phone system can understand. DNS is like a phone book for the internet.

If you type in http://www.mcanerin.com/ into a browser, you don't go over to my company site right away. Instead, your browser looks up the name in a DNS server and the DNS server tells it an IP Address (in this case, 64.34.120.55) that matches the domain name. Then the browser can go to the right website.
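If you want to see that phone-book lookup yourself, a couple of lines of Python's standard library will do it (the IP you get back today may of course differ from the one quoted above):

```
# Minimal sketch of the DNS lookup a browser performs before fetching a page.
import socket

print(socket.gethostbyname("www.mcanerin.com"))   # domain name -> IP address, e.g. 64.34.120.55
```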

So the bottom line is that an IP address is your website's real address on the internet. Now, there are only so many IP addresses in the world, so people have figured out ways that more than one website can share an IP address. In this case, after the browser gets the IP address, it goes to the webserver and also gives it the domain name it's looking for. The server then sends the requested site. This is like having one phone number for your home, rather than one phone number for each family member living at your home. You have to phone the house, then ask for who you want to talk to.

When each person has their very own phone number, it's more expensive and often unnecessary, but it has some advantages, like your personal cell phone. The same applies online.

Dedicated IP Address VS Shared IP Address


A Dedicated IP (sometimes wrongly called a "static" IP by some web hosts) is an IP address that only points at one website. A Shared IP is an IP address that can be shared by more than 1 website.

Now, search engines don't usually care about IP addresses - they index you based on your domain name. That's why having more than one domain name for a site can confuse them.

Note on Dynamic and Static IP Addresses

When you use your ISP to connect to the internet, you will often get what is called a "dynamic" IP, which means basically that it changes. Most ISPs have a big pool of IP addresses and they just hand a random one out to you whenever you connect.

A static IP is simply an IP that you have each time you log in - it doesn't change. You really don't need a static IP for just surfing the net, but if you host your own server at your home or office then it's usually best to get a static IP address.

There are ways to host websites with a dynamic IP address. I hosted mcanerin.com for years at home on a dynamic IP address and ranked very well, thankyouverymuch. I just ran a script that automatically checked my IP constantly, and when it changed, the script would update my DNS server with the new IP and my site was back up and running again, often within a minute or so. This is a good example of why IP really doesn't matter as much as some people think it does for search engines.
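Here is a rough sketch of that kind of script, with two big assumptions: it uses a public "what is my IP" echo service (api.ipify.org here, purely as an example), and the actual DNS update call is left as a stub because every DNS host has its own API.

```
# Rough sketch of a "watch my dynamic IP and update DNS" loop, as described above.
# The echo service and the update mechanism are assumptions, not a specific product.
import time
import urllib.request

def current_public_ip():
    with urllib.request.urlopen("https://api.ipify.org") as resp:
        return resp.read().decode().strip()

def update_dns(ip):
    # Stub: replace with whatever update API your DNS host provides.
    print("Updating the A record to point at", ip)

last_ip = None
while True:
    ip = current_public_ip()
    if ip != last_ip:          # only touch DNS when the address actually changes
        update_dns(ip)
        last_ip = ip
    time.sleep(60)             # check roughly once a minute
```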

Keep this in mind when some tech tries to tell you that your website is doomed if you switch servers or IPs. Nonsense. I used to switch IPs as often as several times a day without any problems :)

Geolocation by IP


There are, however, two ways a search engine will use your IP address. Neither is directly for ranking purposes; they are additional processes that Google applies to sites during the ranking process. The first is geolocation.

Since IP addresses are assigned to webhosts, and then given to that webhost's clients, a search engine can look up where that webhost is, and therefore know where, approximately, your website is hosted. This is one way that Google knows your site is from the US, Canada or China. Google will give websites that it knows are from the UK a boost in results shown to searchers from the UK, on the assumption that they would probably consider UK sites to be more relevant to them. This is almost always a good assumption.

The problem is that if you are a UK company but host your .com in the USA for some reason, you will be considered a US site, not a UK one. Dealing with issues like this is actually my specialty, and I assure you there can be some tricky aspects to it, especially if you have sites for different areas of the world but one CMS that controls them all. So it's important to know where your IP is geolocated to.
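If you want to check this programmatically rather than through an online tool, one option (an assumption on my part, not what the search engines themselves use) is MaxMind's geoip2 Python package with a downloaded country database file:

```
# Sketch of an IP-to-country lookup using MaxMind's geoip2 package and a local
# GeoLite2 country database file (both assumed to be installed/downloaded).
import geoip2.database

reader = geoip2.database.Reader("GeoLite2-Country.mmdb")
record = reader.country("64.34.120.55")        # the IP used as an example above
print(record.country.iso_code, record.country.name)
reader.close()
```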

IP Address and Spam.


The second problem is a little more insidious and difficult to pin down. IP Addresses, being known physical locations, are a better method of detecting search engine spam than domain names, which can be moved around very quickly and are cheaper than hosting. So search engines look at IP Addresses (among many other things) during spam checks. This can cause problems for some sites.

The thing is that if you have a hosting account and decide to use it to create a link network of thousands of sites, then all of those sites will either have the same IP address, or will be within the block of IP addresses that your website host has available. Website hosts are typically assigned a Class "C" or part of a Class "C" to use. I'll explain what this is in a moment.

What Google knows at this point, however, is that chances are a whole bunch of sites on the same IP or within the same Class "C" IP address space have something in common. At the very least, they are hosted by the same webhost in the same location together. This may mean nothing, or it may mean that they are all owned or controlled by the same person or persons. In short, if they start linking to each other, the links may not be independent.

There is no guarantee of this, of course. Some towns (and small countries) only have one host, and therefore they all share a Class "C".

What is a Class "C" IP Address Range?


What's a Class "C", you are still asking me? Ok, it's actually really easy.

A typical IP address has 4 sets of 3 numbers. In cases where some of the numbers are 0, you can leave them out. So the IP address for mcanerin.com is 064.034.120.055, or its shorter form, 64.34.120.55. Since firewalls are usually set to block the long form in favor of the short form, it's usually a waste of time to try to use the long form.

Let's look at this IP address. If we replace the numbers with X's to symbolize a generic IP Address, it looks like this: XXX.XXX.XXX.XXX - now, each of these sets of three X's is a Class. If I now change the X's to the Class, it would look like this: AAA.BBB.CCC.XXX.

The X's at the end are correct, they are not the "D" class or anything like that. Now, do you see the "CCC" section? That's the Class "C" number. In my case (064.034.120.055) the Class "C" address is 120. This means that other sites that have a 120 right there (XXX.XXX.120.XXX) would be considered related in some way.
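In code, the check is as simple as comparing the first three octets. A tiny sketch:

```
# Two IPs are in the same Class "C" block (in the loose SEO sense used here)
# if their first three octets match.

def class_c(ip):
    """Return the first three octets, e.g. '64.34.120' for '64.34.120.55'."""
    return ".".join(ip.split(".")[:3])

print(class_c("64.34.120.55"))                                 # -> 64.34.120
print(class_c("64.34.120.55") == class_c("64.34.120.200"))     # -> True  (same block)
print(class_c("64.34.120.55") == class_c("64.34.121.200"))     # -> False (different blocks)
```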

Relationships Can be Good (or Very Bad)


By itself, a relationship means little. Linking or being linked to is a relationship, as well. So is being in the same country, having similar WHOIS contact information for your domains, and all sorts of other things.

The problem is that in general, the more relationships that are involved, the more likely a site will be considered directly related. Google really doesn't want to show more than one directly related site in any particular SERP - it's not really fair to have the same owner holding 5 of the top 10 slots, for example. Additionally, links from sites that are related to you should (and usually do) count for less than links from total strangers.

This means that you should be aware of your Class "C", and of the Class "C"s of other sites you may own. This sounds like I'm telling you how to spam better, but here is why I recommend this even for the most milky white of white hat sites.

Real Life Example of Why You should Care About Your Class "C"


I had a client who had 2 sites. They were both in the same general topic range, and hosted on the same Class "C". One site was on the topic (for example) of "New York Lawyer", and the other was on the topic of (for example) "Lawyer Resources". There was no overlap in content, and the target audiences were completely different. Additionally, the keywords for each site were also different. Or so we thought.

It turns out that "New York Lawyer" and "Lawyer Resources" have something in common: the keyword "Lawyer", even though neither site was actually pursuing that term! Now what we had was 2 sites that were related (on the same Class "C") and, in Google's mind, relevant for the same keyword ("Lawyer"). One site dropped off the SERPs for almost everything except a few long tail terms, and the other lost rankings as well. Why? Because multiple related sites on the same topic hit a spam filter. That's not a problem in theory, except Google assumed a keyword neither site was pursuing to be the issue.

The fix was to totally separate the sites. We moved them to 2 different Class "C" addresses, and just to be sure changed the WHOIS data to the second owner (NEVER fake WHOIS data - it's against the law). This fixed the problem, and both sites now rank well for their respective keywords. If you check the shared keyword, Google's duplication filter kicks in and only the site with the highest link pop shows up - which is exactly the type of behaviour I would expect, and have no problem with.

So keeping your sites on different Class "C"s is a good idea, even if you are not inclined to spam at all.

Why You Should Care About Other People's Class "C"s


Here is another issue: what happens if you are on a shared IP with other sites that are going after your keyword? You now have 2 relationships with them, whether you know it or not. This can cause problems even if you are totally innocent. It's a good idea to check the other sites on shared IP addresses for this reason, and make sure none of them are competitors or otherwise related to your keywords. The same goes for Class "C"s, to a lesser degree.

Finally, what if you have sites on 2 different Class "C"s, but the same people link to both of your sites? Does that show a relationship? Of course! But here is another scenario you may not have thought about. What if the people pointing at your site have relationships with each other?

What if you have 10 websites pointing to you, but they are all from the same IP? Or same Class "C"? Of course links from related sites won't pass on as much PR as unrelated ones. Geez, things just keep getting more complicated, don't they?

So What Do I Do?


Well, you have to host *somewhere*. And you can't really control the IP's of everyone who links to you. Even a link that passes on less than the full PR is still passing on PR. At some point you just have to decide to stop worrying and get on with marketing your website. Having the "perfect" IP address won't rank you for anything. It's just a technical detail that can bite you on occasion.

In general, people still rank well for all sorts of things, even if they don't even know what their IP address is, so this isn't the end of the world. I would personally only worry about it if:

  1. You have more than one site on a topic that could even be slightly related (if so, consider merging them)
  2. You are considering begging/trading/buying links from groups of sites
  3. You have an inexpensive, popular host (spammers like cheap hosting)

If any one or more of the above are true, then you should start paying attention to relationships, including links and IP addresses. Relationships are the currency of the internet, and it's much better to have good relationships with others than bad or questionable ones.

Some tools to help you:

Enjoy,

Ian