Continuing our Web Directions South build-up, we present an interview with Scott Gledhill. Scott is presenting a talk at the conference next week entitled “Is SEO Evil?” We talked to Scott to find out whether it is.

Scott is a Web designer, consultant and expert in Search Engine Optimisation (SEO), and blogs about Web standards, accessibility and SEO on his Web site Standardzilla.

How important is SEO for Web developers?

SEO is just as important as Web standards and accessibility when developing your Web sites. When you’re launching a new Web site, blog or business, why create something that works well in all browsers and can be accessed universally by all, but cannot be found efficiently?

Starting early with the basic development concepts of accessibility and standards is one of your first steps. Identifying problem areas and optimising your Web site for search engines (without compromising content) is something that should also be done in these initial stages, side-by-side with those other development methodologies. Many of these issues can’t be reverse engineered once they are committed or built into a Web site before it launches; these are the SEO issues that are best scoped out in the early days.

How have AJAX and Flash complicated the SEO process?

Flash and AJAX are interesting because when used wisely, and in small snippets, they can enhance the usability of your Web site, which I see as good SEO. When used to create whole Web sites or sections of content, you will run into similar issues as you do with accessibility, involving dynamically generated content and the fact that it’s hidden from the user and/or the search engine.

With AJAX you can develop with HIJAX in mind: you first create a functional Web site that works without JavaScript enabled, then layer the enhanced functionality on top for users who have JavaScript switched on, so everyone is happy and can see the content.
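
The HIJAX idea can be sketched in a few lines of JavaScript. This is a minimal, hypothetical example (the `a.hijax` class, `#content` id and `fragment=1` query parameter are made up for illustration): every link is an ordinary, crawlable `href` that loads a full page, and the script, only when it actually runs, intercepts the click and fetches a content fragment instead.

```javascript
// Hypothetical helper: derive the fragment URL that the AJAX layer
// fetches from the link's normal, crawlable href. Without JavaScript
// the plain href still loads a full page, so nothing is hidden from
// users or search engines.
function fragmentUrl(href) {
  // e.g. "/comments?page=2" -> "/comments?page=2&fragment=1"
  return href + (href.indexOf('?') === -1 ? '?' : '&') + 'fragment=1';
}

// Progressive enhancement: this branch only runs in a browser with
// JavaScript enabled; otherwise the links behave as normal links.
if (typeof document !== 'undefined') {
  document.querySelectorAll('a.hijax').forEach(function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault(); // cancel the full page load, JS is available
      fetch(fragmentUrl(link.href))
        .then(function (res) { return res.text(); })
        .then(function (html) {
          document.querySelector('#content').innerHTML = html;
        });
    });
  });
}
```

The key design point is that the enhanced behaviour is added on top of markup that already works, rather than the content existing only behind a script.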

Flash will always require alternate content, as search engines can’t crawl .swf files efficiently. Implementation can be done quite easily using SWFObject or something similar. Getting the actual content in there depends on how you are publishing your files. XML feeds are great, as you can sometimes just convert them straight into HTML for your alternate content, which aids accessibility and SEO.
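
The SWFObject pattern looks roughly like this. Note this is a hedged sketch, not a definitive implementation: the file names and element id are invented, and SWFObject’s API has changed between versions, so this assumes the 2.x `embedSWF`-style call.

```html
<!-- Alternate HTML content lives in the page: crawlable and accessible. -->
<div id="tour">
  <h2>Product tour</h2>
  <p>Plain HTML version of the content, e.g. converted from an XML feed.</p>
</div>

<script src="swfobject.js"></script>
<script>
  // If Flash 9+ is available, SWFObject replaces the div with the movie;
  // otherwise (and for search engines) the HTML above remains in place.
  swfobject.embedSWF("tour.swf", "tour", "640", "480", "9.0.0");
</script>
```

Because the alternate content is real markup in the document rather than something injected for crawlers, the same source serves users without Flash, assistive technology and search engines alike.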

Another interesting point with AJAX is that some businesses discourage its use because it causes a drop in page impressions, which can look bad in analytics and reduce ad placement revenue. That’s probably a decision for the business rather than the developer, but it’s a shame to see those factors determine whether a technology gets used.

To what degree can good content compensate for a lack of SEO?

Good content is the real key to SEO as we head into the future. Google is continually updating its algorithms to analyse relationships between external Web sites, trends in how they link to each other, and navigational structures, to move away from signals that can be easily manipulated, such as <title> elements and meta-data.

Amazon, eBay and Wikipedia are great examples of Web sites that win out because of the quality, and sheer quantity, of their content. These Web sites might not necessarily be perfectly optimised (all use table layouts to some degree), but the amount of content across them gives them major clout with Google. It’s much harder for smaller sites to do the same: even if they have very high quality content, they don’t have the volume that could make that sort of difference in the search results.

Most Web sites still benefit from optimising the other factors of SEO and will still see a difference in ranking when they do so, but content is extremely important and plays into most other factors of search.

Do you consider SEO parasitic given that it depends on the implementation of specific search products?

Good question. Google are trying to shake SEOs with their ever-changing, unreasonably strict Webmaster Guidelines. They are trying to deter SEOs from doing any kind of manipulation of search engine rankings, which really does make everyone analyse each granular detail and hypothesise about what each rule really means.

At the same time, SEOs are all hanging on as tight as they can, waiting for Google’s next algorithmic change and then tweaking their Web sites accordingly. I believe this can cause a bit of an “us vs. them” attitude, and Google don’t really communicate any of their intentions to the real world.

It’s a very different industry and mindset from development, which can be interesting at times, but equally frustrating.

Is the SEO industry dominated by working with one search engine?

Google is the obvious dominant player at the moment; they are pretty hard to miss across the industry, and unfortunately that skews SEO techniques towards treating the Google Webmaster Guidelines as if they were the Bible.

I try to treat SEO more holistically and think forward to a time when Google might not be the primary search engine, and other technologies, such as Microformats or natural language search, begin to mature and create more variety in how you can search the Internet.

You can create a solid foundation of SEO for general search, and then optimise for Google later on if you need to. Some Web sites do get a lot of traffic from other search engines such as Yahoo or MSN, but the principles are essentially the same, minus a few engine-specific tweaks or additions to the code.

How does the SEO industry typically react to a change in Google’s PageRank algorithm?

PageRank is now just one small factor amongst hundreds that determine how your Web site (and each individual page on it) ranks in the search engines. It’s always nice to see a quick snapshot of your pages through the PageRank lens, but the industry is becoming more switched on to the fact that it is not the sole measure of how a Web site ranks.

It’s the non-SEOs who will usually panic over any PageRank shifts they see in the Google Toolbar installed in many browsers. The reality, though, is that the toolbar value is only refreshed every three months or so, as a sample of how your PageRank is doing. Google updates the underlying value all the time, but again, it’s only one of many factors in measuring the effectiveness of your Web page.

Is SEO involved in a rankings race with search providers, where increased rankings are neutered by search algorithm changes to produce more “natural” results?

I think that used to be the case back in the day, when search engines trusted Webmasters to label their own content through methods such as meta keywords and alt attributes for images, and put more weight on them in search results.

Nowadays the algorithms are getting a lot more complex, and the search engines are a lot more sophisticated at separating low-quality noise from real, quality content.

There are definitely still ways to artificially boost your rankings, but the consequences can really hurt businesses, and the focus for most SEOs is going back to making sure that Web sites are technically sound and to creating content strategies that deliver long-term SEO success.