I was recently asked to review a former associate’s new website to advise him on how he could improve it for targeted search results. I provided him with a list of the SEO Best Practices I have developed over the years. I recommended he compare his site to them and bring his site up to these best practices. These are built around my philosophy of how SEO works and have helped websites large and small achieve their goals.
The goal of SEO is to bring targeted search engine visitors to your website and ideally convert them to take some specific action.
The goal of the searcher is to have their problems solved, their needs filled, or their questions answered.
The goal of the search engine is to show the best, most relevant website to their users — the searchers.
Sites that rank well are those that best solve the searcher’s (user’s) problem, fill their needs or provide them with the information they are seeking. This can be accomplished through implementing SEO Best Practices in three areas.
First is Technical Optimization. This is the area most people associate with SEO, what I refer to as the “under the hood” stuff: your website’s title tags, meta tags, internal linking strategy, and proper use of alt tags. A recent and extremely important addition to technical optimization is the use of microdata designed for and by the search engines, called schemas. These schemas are being actively pushed by Google and Bing.
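To make the schema idea concrete, here is a minimal sketch of Schema.org microdata for a local business page. The business name, address, and phone number are invented for illustration:

```html
<!-- Hypothetical example: Schema.org microdata for a local business -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <h1 itemprop="name">Acme Web Design</h1>
  <p itemprop="description">Custom websites for small businesses.</p>
  <span itemprop="telephone">(555) 555-0100</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Springfield</span>
  </div>
</div>
```

The itemscope/itemtype attributes tell the search engines what kind of thing the page describes, and each itemprop labels a specific fact about it.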
Next is Content Optimization. The expression “Content is King” has always been, and should always be, the mantra when it comes to what goes into a website. Content optimization means ensuring you have the right type of content on the right page for the right audience. It also includes header tags, description tags, and image and video optimization.
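As a quick illustration, a page built around a single topic might structure its copy like this (the topic, headings, and file names are invented for the example):

```html
<body>
  <!-- One H1 stating the page's single topic -->
  <h1>Choosing the Right Oak Dining Table</h1>
  <p>Opening paragraph that introduces the page's one topic...</p>

  <!-- H2 subheadings break the copy into scannable sections -->
  <h2>Sizing a Table for Your Room</h2>
  <p>Supporting copy under a descriptive subheading...</p>

  <!-- Image with a descriptive file name and alt attribute -->
  <img src="oak-dining-table-sizes.jpg"
       alt="Chart comparing oak dining table sizes for four to twelve people">
</body>
```

The headings give both readers and search engines a clear outline of the page, and the image carries its own descriptive signals.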
Finally there is Linking and Social Optimization. This covers both the number and the quality of incoming links from other sites, and the anchor text those sites use when linking to you. The social aspect asks: how popular are your tweets? Are your posts being liked and shared? Are you using the rel="author" attribute to establish authority with Google?
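Two small markup sketches of the ideas above (all URLs and the profile ID are placeholders, not real destinations): descriptive anchor text on a link, and the rel="author" attribution Google recognizes:

```html
<!-- Descriptive anchor text tells users and engines what the target page is about -->
<a href="http://www.example.com/seo-best-practices/">Find out more about our SEO best practices</a>

<!-- rel="author" pointing to the author's Google+ profile (hypothetical ID) -->
<a href="https://plus.google.com/000000000000000000000" rel="author">About the Author</a>
```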
Even though it’s called “Search Engine Optimization,” I always recommend optimizing the site so that a human visitor who arrives can make sense of it. Search engines may penalize sites that don’t read well, but the greater risk is losing prospective clients and customers who can’t comprehend your message.
SEO Best Practices
I began developing these Best Practices in 1998, and from ’98 until 2008 they changed very little; only in the past four years have I made significant updates. They remain consistent with Google’s Webmaster Guidelines.
1. Don’t try to jam five or six topics onto one page; keep it simple. By splitting pages into subtopics you capitalize on the long-tail keyword opportunity each new page opens up, and you increase your likelihood of capturing traffic further down the conversion funnel.
2. The URLs of dynamic, database-driven pages should look simple and static, and should contain a keyword phrase relevant to the theme of the page.
3. Each page should have a unique title tag. Page titles should be no longer than 70 characters.
4. Each page should have a unique description metatag of no more than 156 characters. Some search engines will use your description in their results pages, so make the copy informative about the content of the page, but avoid sales pitches.
5. Keyword metatags are not important to Google at all; however, Yahoo and Bing (which together account for 20+% of search traffic) do pay attention. Keep keyword phrases relevant to the copy on the page and use no more than eight per page.
6. When using an <Hx> tag, the text inside it should be more important than the rest of the content on the page. Treat Hx (header) tags as you would a headline, drawing people’s (and search engines’) attention to the headline and the paragraphs below it.
7. The permanent body copy should be contextually sufficient, with a recommended minimum of 300 words. The more copy, the more opportunities you have to work in additional relevant keyword phrases.
8. Text links that point users to pages within the site should use anchor text that describes the destination page rather than generic text. For example, “Find out more about our best practices” pointing to a best-practices page is preferable to “Click here” or “Read more.”
9. Graphic or image file names should contain descriptive phrases where applicable. For example, instead of calling an image image123.jpg, call it descriptive-name.jpg. Graphics and images used in the site should also have descriptive alt attributes that are useful to visitors.
10. Each site should have a single, properly configured robots.txt file in its root directory. Spiders only request robots.txt from the root, so placing copies elsewhere does nothing. Do not use a “visit after” metatag; there is no such recognized tag.
11. The site should have an XML sitemap for submission to Google.
12. Canonicalization is the process by which URLs are standardized: each URL is transformed into a single canonical form, making it possible to determine whether two syntactically different URLs are equivalent.
13. The site should have links to popular social networking sites such as Facebook, Twitter, LinkedIn or YouTube.
14. The site should have inbound links from other sites. Generally, the more quality inbound links you have, the higher your site will rank.
15. The site should have a custom error (404) page with a search option, giving visitors a chance to stick around and explore your site further.
16. The site should avoid using pop-ups, especially for critical functions such as registering; visitors with a pop-up blocker turned on will never see them.
17. The exact same content must be visible to both users and search engine spiders.
18. Avoid auto start video or audio playback; give visitors the option to control media they encounter on your site.
19. A means of gathering data, such as Google Analytics and the Google Webmaster Tools verification code, should be installed.
20. Structured data formats such as Open Graph (OG) for Facebook and Schema.org for Google should be incorporated.
21. It is highly recommended that a mobile version of your site be made available. Google prefers responsive design, i.e. one site for all devices; however, Google will not penalize sites that use a mobile subdomain (e.g. m.sitename.com).
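Pulling several of these practices together, here is a sketch of what one page’s <head> might contain. The product, site name, and URLs are all invented for illustration:

```html
<head>
  <!-- #3: a unique title under 70 characters -->
  <title>Handmade Oak Dining Tables | Smith Furniture</title>

  <!-- #4: a unique, informative description under 156 characters -->
  <meta name="description" content="Handmade oak dining tables built to order, seating four to twelve. See finishes, dimensions and care instructions.">

  <!-- #5: a few keyword phrases relevant to this page's copy -->
  <meta name="keywords" content="oak dining tables, handmade furniture, custom dining tables">

  <!-- #12: one canonical URL for this page -->
  <link rel="canonical" href="http://www.example.com/oak-dining-tables/">

  <!-- #20: Open Graph tags for Facebook sharing -->
  <meta property="og:title" content="Handmade Oak Dining Tables">
  <meta property="og:type" content="website">
  <meta property="og:url" content="http://www.example.com/oak-dining-tables/">
  <meta property="og:image" content="http://www.example.com/images/oak-dining-table.jpg">
</head>
```

Each commented line maps back to the numbered best practice it illustrates; none of this is a substitute for the fuller explanations coming later in the series.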
As I was sharing these it occurred to me they would make the basis for a blog post series. This post is the first in the series. In future posts (rolling out over the coming days) I will explain each best practice in detail.