AJAX, Web 2.0 and SEO

by Scott Allen - January 9, 2007 
Filed Under AJAX, SEO

AJAX offers some incredible new functionality for web sites, but it is not SEO-friendly by default. However, you can still successfully optimize a site that uses AJAX and Web 2.0 technologies. This article discusses some of the problems and solutions for AJAX SEO.

Recently a colleague and I were discussing AJAX and SEO, and he needed some solutions for a web site he was developing. After our conversation, I realized that there is a lot of confusion about search engine optimization and Web 2.0 technologies, which prompted me to share my thoughts on the topic. To start, let’s discuss an article I recently read at Search Engine Watch entitled “Web 2.0 Technologies and Search Visibility,” about AJAX / Web 2.0 and search engine optimization.

Here is an excerpt from the article:

And what about Web 2.0? Is it just a popular buzz word or does it hold additional value? Scott Orth presented a case study to demonstrate that Web 2.0 is all about a user experience. The case study revealed how a static site was improved by adding a lot of dynamic tools and demos that improved user experience. The site also featured an inquiry form that delivered instant results, rather than the typical “thank you, someone will be in touch shortly” message. How ingenious!

Web 2.0 is an entity that people are still trying to wrap their heads around, like an out-of-focus image that gets clearer the closer you get to it. As Web 2.0 evolves and is embraced by more people, I think it will become more clearly defined in the minds of developers and consumers. Web 2.0 and all its elements bring a lot of new functionality to the web user experience, but they create problems for search engine optimization. The author of this article echoes my point:

This all sounds wonderful but what about the search engines? Are these three technologies good, bad or just plain ugly to the search engines?

And now to address AJAX and where it fits into the SEO picture:

Ajax would simply fall under the category of “ugly,” not because it makes for ugly web pages but rather it is invisible to search engines. Because search engines do not support JavaScript, and Ajax uses JavaScript to function, search engines will not see Ajax-delivered content. One example of this would be if your navigation was delivered with Ajax. If this is the only source of navigating the site, engines will not be able to crawl and find additional pages beyond the first page. The same is true of content. If content is delivered by Ajax, search engines will not see it.

So we have part of our answer: AJAX itself is not visible to search engines, and therefore content delivered by AJAX will not be seen by them. This presents a problem, but there is always a solution, which we will address in a moment.

In my opinion, there are two main issues to bear in mind when doing search engine optimization for AJAX:

  1. Lack of content and navigational links on initial page load. This is a problem because the initial HTML page load is all the search engine spiders will see.
  2. Lack of unique URLs for search engines to index. AJAX makes it easy to serve all your content on one page, just like Flash. Unfortunately, search engines need unique URLs.
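To make Issue #1 concrete, here is a minimal sketch of the anti-pattern: navigation that exists only after JavaScript runs. All the names (the `nav` id, the page list, the `buildNavHtml()` helper) are my own illustrations, not from the article. The HTML the spider downloads contains an empty `<ul id="nav"></ul>`, so a spider that does not execute JavaScript finds no links at all:

```javascript
// Pure helper: build navigation markup from a list of pages.
function buildNavHtml(pages) {
  var html = '';
  for (var i = 0; i < pages.length; i++) {
    html += '<li><a href="' + pages[i].url + '">' + pages[i].title + '</a></li>';
  }
  return html;
}

// The links only come into existence here, in the browser.
// A search engine spider fetching the raw HTML never sees them.
if (typeof document !== 'undefined') {
  document.getElementById('nav').innerHTML = buildNavHtml([
    { url: '/about.html',   title: 'About' },
    { url: '/contact.html', title: 'Contact' }
  ]);
}
```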

Let’s address the solution to Issue #1. The author of our article offers one possible solution that, in my opinion, is a good start but doesn’t go far enough:

So how can one enjoy the benefits of Ajax while pleasing the search engines at the same time? The simple answer to this is to make sure navigation and content is in html. This will not only help with search visibility but will serve those end users who browse with JavaScript turned off. It is really about accommodating everybody, using Ajax features to enhance the web site but ensuring html content is accessible for those who cannot decipher Ajax commands — search engines and users.

The initial load of the page definitely needs to contain valid HTML content and navigation. I could not agree more, but I would take the solution a step further: if the initial state of the page needs to be served dynamically, then use a server-side scripting language such as PHP, Perl, ASP, or ColdFusion rather than client-side scripting such as JavaScript/AJAX. I’ll give you a little background, in case you’re not familiar with server-side vs. client-side scripting.

Picture the internet as a window to a house (more like a one-way reflective mirror, really). Inside the house is the server side, and you can’t see what’s going on in there from outside. Server-side scripts are processed on the server, inside the house. On the client side (the browser side), everything is in plain view and can be seen from both outside and inside the house.

Client-side scripts such as JavaScript (and the AJAX techniques built on it) are executed on the client (browser) side, outside the house, and the code is in full view to anyone who chooses to click “View Source”. Your code is naked, free for anyone to steal or tamper with. But there are a lot of times when executing code on the client side is very advantageous, such as in Web 2.0 AJAX applications where you need to update the page without reloading it, which can greatly improve the user experience.

Server-side scripts such as PHP, ASP, ColdFusion, Perl (CGI), etc., have a lot of advantages, including better security. They execute on the server and write their output to the page, so that by the time it is viewable in the browser (outside the window), it looks like static HTML and you can’t tell exactly what went on inside the house (the server). Search engines like this better, and a casual viewer can’t grab your code and mess with it. The main drawback is that you can’t update the page without reloading it, as you can with JavaScript or AJAX.

The skill lies in knowing when to use which type of scripting. Creating an excellent user experience while satisfying the search engines requires a careful balance of these technologies. In my opinion, if at all possible, use server-side scripts as your first line of attack and client-side scripts as your second. When the capabilities of server-side languages do not satisfy your requirements for functionality, bring in AJAX, JavaScript, and so on.

Make sure all content that you want the search engines to evaluate is written to the page (either as static HTML or by a dynamic server-side script) before it crosses the Internet “window” from our analogy. This is the initial state of the page, and the only one the search engine will see. After that point, the page has been downloaded to the user’s computer, and I say let the AJAX/JavaScript do its thing!

Just remember not to alter the content of the page drastically, or the search engine will penalize you. It’s fine if things change when the user mouses over something, clicks a link, or a change is time-triggered; it’s just that the default state should be about the same for the user and the search engine. If not, you’ll be penalized or removed from the search engine altogether, as it would be considered cloaking (which I am NOT advocating). I have used this method of mixing server-side and client-side scripting successfully for years, creating many web sites that deliver a high-end user experience and are search-engine optimized.
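One way to sketch this approach (sometimes called “Hijax”) is below. The navigation is ordinary HTML links that spiders and no-JavaScript users can follow; when JavaScript is available, it intercepts the clicks and fetches content with XMLHttpRequest instead. The `nav` and `content` ids, the `/fragments/` URL convention, and the `fragmentUrl()` helper are my own assumptions for illustration, not something the article prescribes:

```javascript
// Turn a normal page URL into the URL of its content fragment.
// Assumed convention: "/about.html" -> "/fragments/about.html".
function fragmentUrl(href) {
  var page = href.substring(href.lastIndexOf('/') + 1);
  return '/fragments/' + page;
}

// Only enhance when the browser supports it; otherwise the plain
// links keep working, and the spider sees ordinary crawlable HTML.
if (typeof document !== 'undefined' && typeof XMLHttpRequest !== 'undefined') {
  var links = document.getElementById('nav').getElementsByTagName('a');
  for (var i = 0; i < links.length; i++) {
    links[i].onclick = function () {
      var target = document.getElementById('content');
      var xhr = new XMLHttpRequest();
      xhr.open('GET', fragmentUrl(this.href), true);
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          target.innerHTML = xhr.responseText;
        }
      };
      xhr.send(null);
      return false; // cancel the normal navigation for JS users
    };
  }
}
```

The key point is that the default state of the page is identical for the spider and the user; the AJAX only changes things in response to a user action.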

Now, let’s address Issue #2: unique URLs. You need to think like a search engine spider when creating the structure of an AJAX site, and make sure the spider can reach a unique page with valid, quality content. You can still have dynamic pages, but you may want to consider using URL rewrites to create search-engine-friendly URLs.
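For example, Apache’s mod_rewrite module can map a clean, crawlable URL onto the real dynamic script behind it. The paths and parameter names below are assumptions for illustration only:

```apache
# Map a friendly URL like /products/widgets/42/ onto the real
# dynamic page /products.php?cat=widgets&id=42 (names assumed).
RewriteEngine On
RewriteRule ^products/([^/]+)/([0-9]+)/?$ /products.php?cat=$1&id=$2 [L]
```

Spiders then index the friendly URL, while your server-side script still does the work of assembling the page.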

Back to our article…let’s see what the experts say in closing:

Yahoo’s Amit Kumar added that while technology is awesome, simplicity is also crucial so engines can understand content of page. Google’s Dan Crow explained that using CSS, Ajax and Web 2.0 technologies with workarounds will accommodate for search engines in their current state of understanding, but at the same time be prepared for the future when search engines are able to better comprehend these technologies.

Hopefully that gives a useful overview of search engine optimization in a Web 2.0 world. If you would like to discuss this topic further, or you need search engine optimization for your AJAX web site, feel free to contact me.




Comments

4 Responses to “AJAX, Web 2.0 and SEO”

  1. TagNe.ws on January 9th, 2007 5:20 am

    I’ve tagged this story at TagNe.ws – I hope it gets enough votes to make it to the front page!

    http://www.tagne.ws/InternetTrends/AJAX-Web-20-SEO-In-Summary-1/

  2. Scott Allen on January 11th, 2007 3:13 pm

    Thanks!

  3. Aaron Shear on January 15th, 2007 7:35 pm

    So I know the DHTML works, but have you proven this concept? I would love to see samples of it.

  4. Scott Allen on January 15th, 2007 7:53 pm

    Yes. It works perfectly. :)
