SEO is Dead... Long Live SEO!

I'm Old.

Well, not THAT old, but old enough to remember that newfangled "desktop publishing" craze, followed by the hosting craze, then the web design craze, and now the SEO craze.

I think there are some similarities, if you will be kind enough to follow along with me.

There comes a point with almost every technology-based skill set where it dies, merges with something else, or becomes specialized. Sometimes all three. It's the nature of the beast, and part and parcel of progress.

When desktop publishing came along, I was still manually typing news stories into a Linotype machine, having the copy printed in a big roll by a custom print shop, then using an X-Acto knife to cut the roll into columns and pasting them onto a big, newspaper-sized board. If there was a spelling error, we painfully cut out individual letters and corrected them one by one (not a job for the clumsy!). For photos, we had to send things off to a PMT shop to turn the photograph into dots, after we had taken the photo in black and white and developed it in our own darkroom. Turnaround time for even small changes was measured in hours or days.

Then along came this little Macintosh computer with an 8x8-inch black and white screen, a program called PageMaker, and the joy of Adobe, a fairly unknown company that specialized mostly in fonts. And a laser printer! Suddenly, all the skill needed for the old process went out the window, and the skills needed for the new one were far easier to attain, with better and faster results.

The next thing you know, it seemed like everyone was a "desktop publisher". You would see signs up on lampposts, ads in newspapers, and so forth. It was the heyday of public publishing. And with it came some severe growing pains. People with no training, a love of technology, and a desire to get rich quick began calling themselves "desktop publishers", and skills across the industry declined severely. The same skilled people were still there, but they were drowned out by those who would use 12 font types on the same page, had no color sense, and switched clip art styles with gleeful abandon.

The promise of "publishing for the people" quickly gave way to claims that publishing was a dead art, and there was a lot of evidence to support them. But the art didn't die. Why? Because, to use a Darwinian metaphor, the weak were eaten.

Technology didn't stop at PageMaker and laser printers. The DTP programs became cheaper and easier to use, laser printers became cheaper, newfangled "inkjet" printers became common, and WYSIWYG came to word processing (boy, did THAT cause an uproar! You are not supposed to mix content and presentation, didn't you know?)

Suddenly, almost anyone could do simple DTP. It was so simple that no one would pay the so-called DTP types a lot of money for something they could do themselves, better and for free or cheap. Most DTP types left for greener pastures.

But some stayed: the best ones. And they formed dedicated shops with high-quality printers and demanded the highest-quality skill sets in design and typography. Today, they flourish. I don't think twice about printing off my own documents, but I go to a printer for my business cards rather than use an inkjet, for example.

In short, while it was happening, it looked like technology had created a new industry. In hindsight, it actually transformed the old one into something new.

The same type of thing happened to website hosting. At first, you needed expert knowledge of ISDN modem technology, and the technical skills required to keep a server running and connected were huge. It was a time for highly skilled technicians. Then along came Linux packages and Apache, internet connections became cheaper, and front ends made things easier. Suddenly "everyone" was a web host. It turns out that it's not so easy to make money if your skills are marginal, you don't invest a lot, and the competition is extremely intense.

Once again, a lot of low-end hosting is now done in-house, and when people go outside, they look for companies with multiple dedicated redundant connections, tons of technology, and extremely competent staff. The middlemen have mostly died out, and the remainder spend a lot of time either bottom feeding or offering hosting as part of a much larger package of services. The days of the "web host" operating from his basement are over. There are a few holdouts (like me, and I'm slowly allowing attrition to get rid of my clients) but most are gone.

I can see web design going the same way (it's near the tail end), and I believe SEO is heading there right now. It's almost time for the shakeout. Two years, tops.

I predict, based on previous experience, that a lot of what people call "SEO" today will disappear and simply become part of the web design toolset. The so-called "SEOs" who offer to "fix your metatags", who believe that you can achieve long-term goals by running a script you bought (as if no one else will buy the same script!), and so forth, will move on to the Next Big Thing (thank God!), and those with true skills will consolidate and specialize.

A lot of the basic stuff will be done by users, software programs will come "out of the box" with spider-friendly designs and tools, and most of the low-end "easy fix" stuff will be done by web designers and owners.

I suspect, for example, that most of the members of this forum are not "SEOs" but rather people learning how to do SEO for their own sites, and maybe to use the basics for their design clients. And good for them! I think they will be the agents of destruction for the pseudo-SEOs that most people complain about today. Not any kind of "ethical revolution" or central organization. Pure Darwinian selection of the simple and effective over the expensive and weak.

However, the skills necessary to do high end campaigns, to compete in very competitive markets, to keep up with the latest changes in the industry - those will still be needed, and will remain in demand. SEO as a professional service will do very well, and will be run mostly by professionals.

The public will take over the easy stuff, the specialists will deal with the hard stuff, and the parasites in the middle will (mostly - it's hard to kill a parasite) go away.

SEO is dead (or dying) ... Long Live SEO!

Ian

SEO Browser - Online Lynx (Text Only) Viewer

Like many people, I use lynx to test websites as part of my SEO work. Lynx is a text-only browser that allows people to view websites in text-only mode. I used it in the old days, when I would telnet into a SunOS server in order to access that newfangled fad called the web.

Once graphical browsers became common, lynx lost a lot of popularity. However, text browsing has seen a revival recently due to two main issues. First, people who are vision-impaired can't use graphical browsers, and there is a strong movement among forward-thinking designers and companies to make sure that websites can be visited and used by everyone, not just people with good eyesight, fast connections, and graphical browsers. I think this is a very good thing.

The second reason is less selfless and more pragmatic: search engines, which often deliver a huge portion of a website's visitors, are essentially text-only browsers. If you don't make them happy, you can find yourself losing a substantial number of potential visitors to competitors whose sites might not have all the nifty Flash that yours does, but are instead visible to search engines. Since that's a direct hit to the pocketbook, it's caught the attention of the business world as well.

Arguably, the single most important visitor to a website is a search engine, and a search engine is, for all intents and purposes, blind. It's kind of sad that it took this type of pressure to get a lot of sites to care about something they should care about anyway, but I take my victories where I can.

Accordingly, having a text-only browser to test your website with has moved from "it would be nice" to "it's absolutely critical". A text browser is an essential part of the toolkit of both the SEO and the web designer.

Normally, I use the actual lynx executable, but it's kind of hard to use after several years of Windows (no, the mouse doesn't work in it! Blind people can't see pointers, silly).

Additionally, when I'm talking to a client or giving a presentation, I can't ask them to install an executable on their system just to demonstrate something quickly. Some clients are disallowed from doing so by corporate policy even if they wanted to. Plus, it's hard to use for the mouse generation.

Up until now, the answer has been the Delorie tool, which is basically a web-based lynx viewer. Recently, due to bandwidth issues and hacks, they have discontinued its easy functionality and now require a site to install a custom page before the tool will view pages from that site. Well, clients are not going to do that any more than they would install lynx. Now the tool can basically only be used for sites under your own control, which presumably is the intent.

But that makes it useless for most SEOs and designers, except on personal projects or for clients they already have.

Additionally, this tool didn't do what I wanted it to do (I still had to use 10 or 15 sites just to check one site), so I started to develop my own. Lynx is close, but it doesn't show you what a search engine sees, only what is intended for text browsers; there is a big difference between the two in the handling of graphics, headings, lists, and other page components.

In short, lynx is good, but not good enough. I needed a true SEO browser. So I made one, in conjunction with a marketing company I work with a lot (Anduro) and a developer here in Calgary (Commerx). They are key to the fact that this project exists anywhere outside of my head. :D

The concept is simple: create a tool that mimics what a search engine sees, and then add the other tools an SEO needs in conjunction with it.

Due to bandwidth usage, we are planning to make it half commercial, half free. The basic functionality will be, of course, free. It's essentially an online lynx-type viewer with some enhancements specific to SEOs (like how it handles some things that a search engine might care about but a pure text browser would not).

Then the idea is to have a much more robust version behind a login that will do all sorts of fun things, many of which will require the Google API, etc. (thus the login).

The basic functional version is here: http://www.seo-browser.com/

For now, the advanced version is being added to daily and is open to the public for testing purposes. After we get it working perfectly, it will be a paid area (probably along the lines of the Wordtracker style: pay by day, week, year, etc.).

One thing I'd like to add once it's done is the ability to get an XML feed from the advanced section that people can use to format their own reports with, etc. Please feel free to test it and provide feedback. We are committed to making both versions (including the free one) available and the best we can make them.

So far, the response has been amazing! Especially in view of the fact that this is still very much in development and is nowhere near being finished.

Some additional information it will hopefully provide in the future includes:

  • A link to the HTTP header information for the page.
  • A link to the CSS file.
  • A word and character count for the meta data, title, and body text.
  • A link count: how many links are on the page, how many are internal, and how many are external (see the sketch after this list).
  • An image count, along with how many do not have ALT parameters (alt="" would count as an ALT parameter for SEO purposes).
  • A link to check the W3C validity of the page.
  • The number of backlinks this page has under Google, Yahoo, MSN, and Teoma (and whether it exists in the index at all).
  • Whether the Javascript (if it exists) on the page is on-page or external.
  • Whether the page contains Flash, Java, imagemaps, or DHTML (in red; these are usually bad).
  • The meta keywords would be links that add extra functionality to the page (explained below) and would also show a usage count and density % (example: "SEO [14, 12.5%], promotion [3, 1.2%], mcanerin [5, 4.6%]"). I would suggest a limit of 15 keywords; if there are more than that, it should be noted as an error and only the first 15 shown.
  • A list of cookies requested/sent. This tool should never accept cookies, however (search engines don't).
  • A link to the robots.txt, and an error if it does not exist.
  • The IP address displayed.
  • Page load time displayed.
  • A list of comments (how many, and their contents; this can be an unformatted dump).
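
To give a feel for the count-type items above, here's a minimal sketch in Python of how the link and image counts might be computed. This is purely illustrative: the actual seo-browser.com implementation is not public, and the class name (PageStats), the base-host test, and the sample HTML are all my own assumptions.

    # Illustrative only: counting links (internal vs. external) and images
    # missing ALT parameters, as described in the list above. This is NOT
    # the seo-browser.com code, just a sketch of the idea.
    from html.parser import HTMLParser
    from urllib.parse import urlparse

    class PageStats(HTMLParser):
        def __init__(self, base_host):
            super().__init__()
            self.base_host = base_host
            self.internal = self.external = 0
            self.images = self.images_missing_alt = 0

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "a" and attrs.get("href"):
                host = urlparse(attrs["href"]).netloc
                # A relative URL, or a URL on the same host, is internal.
                if not host or host == self.base_host:
                    self.internal += 1
                else:
                    self.external += 1
            elif tag == "img":
                self.images += 1
                # Note: alt="" still counts as present for SEO purposes.
                if "alt" not in attrs:
                    self.images_missing_alt += 1

    stats = PageStats("www.mcanerin.com")
    stats.feed('<a href="/tools">Tools</a> <img src="logo.gif">')
    print(stats.internal, stats.external, stats.images_missing_alt)  # 1 0 1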

Under this is the text-only page itself. This page would look the same as the free version, except:

Text that is hidden using CSS or in other manners (i.e. black text on black, or 1-point-high text) would be italicized. Text that is within a header tag is actually displayed in a header tag (H1, H2, etc.). When you click on a keyword in the keyword list, it is highlighted in bold. Up to 3 keywords can be highlighted at once.
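
Detecting hidden text reliably is hard (you would have to resolve external stylesheets and inherited styles), but a rough sketch of the idea, assuming only inline styles and just the two cases mentioned above, might look like this. The function name and the naive color comparison are my own assumptions, not how our tool actually does it:

    # A rough sketch of hidden-text detection for the two cases above
    # (same-color text, tiny text). Real detection would need to resolve
    # external CSS and inheritance; this only inspects inline styles.
    import re

    def looks_hidden(inline_style, background="black"):
        color = re.search(r"(?<![-\w])color\s*:\s*([^;]+)", inline_style)
        size = re.search(r"font-size\s*:\s*(\d+)\s*pt", inline_style)
        if color and color.group(1).strip().lower() == background.lower():
            return True   # e.g. black text on a black background
        if size and int(size.group(1)) <= 1:
            return True   # e.g. 1-point-high text
        return False

    print(looks_hidden("color: black; font-size: 10pt"))  # True (on black)
    print(looks_hidden("color: white; font-size: 1pt"))   # True (tiny text)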

Under this should be another section with 3 links: Compress, Tokenize, and Index.

Compress would pop up another window and show the text as a pure, unformatted text dump with no punctuation. Tokenize would pop up another window and show the compressed text, but with stop words removed (a, the, and, but, etc.). Index would take the tokenized list and display it as a word list with number counts for each word, as well as density (like the meta keywords).
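
To make the three steps concrete, here is a minimal sketch of the pipeline in Python. The stop word list is a tiny stand-in for a real one, and computing density against the tokenized total is my assumption; it simply mirrors the count-and-density display described for the meta keywords above.

    # Sketch of the Compress / Tokenize / Index pipeline described above.
    import re
    from collections import Counter

    STOP_WORDS = {"a", "the", "and", "but", "of", "to", "in", "is", "it"}

    def compress(text):
        """Pure, unformatted text dump with no punctuation."""
        return " ".join(re.findall(r"[a-z0-9]+", text.lower()))

    def tokenize(text):
        """The compressed text, with stop words removed."""
        return [w for w in compress(text).split() if w not in STOP_WORDS]

    def index(text):
        """Word list with counts and density, like the meta keyword display."""
        words = tokenize(text)
        total = len(words) or 1  # assumption: density vs. tokenized total
        return {w: (n, round(100.0 * n / total, 1))
                for w, n in Counter(words).most_common()}

    print(index("SEO is dead, long live SEO!"))
    # {'seo': (2, 40.0), 'dead': (1, 20.0), 'long': (1, 20.0), 'live': (1, 20.0)}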

Where does the Description Come From for Google?

As we all know, along with the title, the description that a search engine uses for your site in a SERP can have a significant impact on whether or not someone chooses to click on your link.

Apparently, the rules have changed recently.

We are now seeing a preference for DMOZ descriptions where available, as well as an apparent devaluing of text snippets, which I always kind of liked (they made it easier to spot garbage phrases associated with poorly made doorway pages, etc.).

I responded to a question about descriptions in a thread in the High Rankings Forum and a member pointed out that, based on my own data, the rules had apparently changed since the last time I looked at them. So I checked further.

Here is the data I came up with.

The problem is that many people use the same description for DMOZ as they do for their homepage description, so now we have to be very careful about assuming that the description that is being shown is being pulled from the meta description and not DMOZ, since they may be identical.

Since I know my site (McAnerin Networks) is in DMOZ and has a different meta description on the page, I'll use it as a test subject. I know my meta tags are valid and not broken, due to testing I did earlier for an article and a meta-tag generator, so that eliminates broken tags as a possible factor.

Note that this site sells SEO services, but I'm not currently taking new clients without referrals, so please don't consider this as promotional - it's just the site I know best.

INITIAL DATA

Page: mcanerin.com homepage

Meta Description: Website promotion company specializing in cross-border internet marketing. Offices in Calgary, Alberta, Canada and Las Vegas. Free ranking report, SEO tools

DMOZ Description: Offers Internet and website promotion, search engine optimization and search engine submission. Includes location.

Page Title: Website Promotion Internet Marketing - McAnerin Networks Inc.

Keywords Tested: mcanerin, internet promotion canada, website promotion canada, internet promotion USA, mcanerin internet marketing, mcanerin seo, mcanerin website promotion

Keywords were chosen based on a combination of being (or not being) in each description, as well as causing the site to show up easily, rather than being keywords I actually target for marketing purposes, which is irrelevant here.

The name "mcanerin" does not appear in either description, but is in the domain, title and many incoming links. It is also in the body text and headers on the page being tested.

TESTS (using Google)

mcanerin: Offers Internet and website promotion, search engine optimization and search engine submission. Includes location. (DMOZ Description)

mcanerin internet marketing: Website promotion company specializing in cross-border internet marketing. Offices in Calgary, Alberta, Canada and Las Vegas. (Meta Description)

mcanerin seo: ... Ian McAnerin, founder of MNI, is one of the best known SEO's in Canada, and speaks frequently. He has also has written many articles and is currently ... (Text Snippet)

mcanerin website promotion: Offers Internet and website promotion, search engine optimization and search engine submission. Includes location. (DMOZ Description)

internet promotion canada: Offers Internet and website promotion, search engine optimization and search ... Website and Internet Promotion. In the US and Canada, website promotion can ... (Combination of DMOZ + Text Snippet)

website promotion canada: ... In the US and Canada, website promotion can take many forms, including search engine optimization ( SEO ), directory and search engine submission, ... (Text Snippet)

internet promotion USA: Offers Internet and website promotion, search engine optimization and search engine submission. Includes location. (DMOZ Description)

CONCLUSION

I found it very interesting that although "mcanerin" was used frequently on the page, Google usually chose not to use a text snippet containing it.

By itself, it triggered a DMOZ description. Combined with words contained only in the meta description, it triggered the meta description, and combined with words only contained on the page, it triggered a text snippet.

The rules seem to be:

  1. Use the DMOZ description as the default.
  2. If the terms appear in both descriptions, use DMOZ.
  3. If some, but not all, of the terms appear in order (exactly) in DMOZ, use DMOZ.
  4. If some terms appear in DMOZ but not in order, use DMOZ plus a text snippet containing the remainder.
  5. If none of the terms appear in DMOZ but they do appear in the meta, use the meta.
  6. If the terms do not appear in either description, use a text snippet.
  7. If the terms appear exactly in the text (order does not matter, but they must be beside each other) but not in either description, use a text snippet.

There may be other rules, but that's what it looks like so far.
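
For clarity, here's how I'd express those rules in code. This is strictly a reconstruction of observed behavior, not Google's actual algorithm; the helper names and the phrase/subset tests are simplifications of my own.

    # The seven rules above, made runnable. A guess reconstructed from
    # observed SERPs, not anything Google has published.
    import re

    def words_in(terms, text):
        """Return the query terms that appear anywhere in the text."""
        found = set(re.findall(r"[a-z0-9]+", text.lower()))
        return [t for t in terms if t.lower() in found]

    def in_order(terms, text):
        """True if the terms appear as a contiguous phrase in the text."""
        return " ".join(t.lower() for t in terms) in text.lower()

    def pick_description(terms, dmoz, meta):
        hits = words_in(terms, dmoz)
        if len(hits) == len(terms):
            return "DMOZ"                  # rules 1 and 2
        if hits and in_order(hits, dmoz):
            return "DMOZ"                  # rule 3: some terms, in order
        if hits:
            return "DMOZ + text snippet"   # rule 4: some terms, out of order
        if words_in(terms, meta):
            return "meta description"      # rule 5
        return "text snippet"              # rules 6 and 7

    dmoz = ("Offers Internet and website promotion, search engine "
            "optimization and search engine submission. Includes location.")
    meta = ("Website promotion company specializing in cross-border "
            "internet marketing. Offices in Calgary, Alberta, Canada "
            "and Las Vegas. Free ranking report, SEO tools")
    print(pick_description(["website", "promotion"], dmoz, meta))  # DMOZ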

I was interested to see the results of a site that is not in DMOZ, so I used a client. Due to client confidentiality issues, I won't disclose the URL or keywords used.

The client is a PR 7 pharmacy whose DMOZ application has not been reviewed yet. It ranks very well organically for its keywords, however.

Search 1: keyword is in both text and meta description exactly. Description is shown.

Search 2: keyword is in text only. Text snippet is shown.

Search 3: keyword is in both text and meta description but out of order: text snippet is shown

Search 4: exact keyword is in meta description only: shows the meta description, but not of the home page. Instead, it shows the meta of a PR 0 page buried in the depths of a dynamic system. This is the same meta description as the home page (PR 7).

Odd... that last search is unexpected. Can anyone confirm this behavior with a different site? That's my information so far. I'd be interested in seeing what others get.

I made a chart suggesting the order of priority. Please note that this is under heavy testing and is probably not accurate yet; I'd appreciate additional information that might make it more accurate.

"Exact" means the keywords are right next to each other, "Spread" means they are in the same sentance or paragraph, but not next to each other. "Partial" means that only some of the words exist in the description.

Possible Order of Preference (not fully tested)

  1. DMOZ Exact
  2. Meta Exact
  3. Text Exact
  4. DMOZ Spread
  5. Text Spread
  6. Meta Spread
  7. DMOZ Partial
  8. Text Partial
  9. Meta Partial

No On-Page Usage: DMOZ, then Meta.

It will only use a maximum of two sources in the case of partials.
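
And, for completeness, the match types and the preference table as code. The same caveat applies: this is a sketch of a hypothesis under test, not a known algorithm, and the whitespace-split matching (no punctuation handling) plus the function names are my own simplifications.

    # Sketch of the Exact / Spread / Partial match types defined above,
    # plus the (not fully tested) order of preference. A hypothesis only.
    def match_type(terms, text):
        words = text.lower().split()  # naive split; punctuation ignored
        terms = [t.lower() for t in terms]
        n = len(terms)
        if any(words[i:i + n] == terms for i in range(len(words) - n + 1)):
            return "Exact"    # keywords right next to each other
        if all(t in words for t in terms):
            return "Spread"   # all present, but not adjacent
        if any(t in words for t in terms):
            return "Partial"  # only some of the words exist
        return None

    PREFERENCE = ["DMOZ Exact", "Meta Exact", "Text Exact",
                  "DMOZ Spread", "Text Spread", "Meta Spread",
                  "DMOZ Partial", "Text Partial", "Meta Partial"]

    def pick_source(terms, dmoz, meta, body):
        found = {f"{name} {match_type(terms, text)}"
                 for name, text in (("DMOZ", dmoz), ("Meta", meta),
                                    ("Text", body))
                 if match_type(terms, text)}
        for entry in PREFERENCE:
            if entry in found:
                return entry
        return "DMOZ, then Meta"  # no on-page usage at all

    print(match_type(["website", "promotion"], "website promotion canada"))
    # Exact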