5 Tips for Creating a Search-Friendly Website

Jun 27, 2012

Building a website is only one step in developing your shop’s online presence.

Once you have a site, you need to focus your attention on making sure search engines can find your website and direct customers to you, according to AJ Kumar, who recently covered the topic for Entrepreneur.com.

“To understand this issue, you need to know how search engines build the indices from which they derive the website listings displayed on their results pages,” Kumar wrote. “Google and the other search engines don’t have teams of people who archive every single page on the Web. They rely on programs called ‘spiders,’ automated robots that move between links and store information from a site’s code in their databases.”

“Making sure these spiders can access all of the content on your site is extremely important for SEO,” he continued. “Unfortunately, a number of website architecture mistakes can make large portions of your site unavailable to the search engines’ spider programs.”

Kumar identified these five common website-architecture mistakes to avoid if you want your website to be search friendly.

1. Overuse of content in image or script files. “Because they aren’t living, breathing human readers, search engine spiders can read only the text-based content that’s presented to them,” he wrote. “If you store information in image files, Flash animations or JavaScript code, such as your website’s header graphic or introductory video, the spiders may be unable to process the content appropriately.” Kumar suggested duplicating the information stored in these formats with text versions. There are also online tools available that can show you what spiders see when they visit your site so you can fill in any missing information.
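The “what spiders see” idea can be approximated in a few lines. Here is a minimal sketch using Python’s standard `html.parser` (the sample page and shop name are invented for illustration): it collects only the text a crawler can read, skipping the contents of script tags and counting an image only when it carries an `alt` attribute.

```python
from html.parser import HTMLParser

class VisibleTextExtractor(HTMLParser):
    """Collect only the text a crawler can read: data outside
    <script>/<style> tags, plus any alt text on images."""

    def __init__(self):
        super().__init__()
        self.hidden_depth = 0   # nesting depth inside <script>/<style>
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.hidden_depth += 1
        elif tag == "img":
            alt = dict(attrs).get("alt", "")
            if alt:                      # image with no alt contributes nothing
                self.chunks.append(alt)

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.hidden_depth:
            self.hidden_depth -= 1

    def handle_data(self, data):
        if not self.hidden_depth and data.strip():
            self.chunks.append(data.strip())

# A hypothetical page: one scripted greeting, one labeled image,
# one unlabeled image, and one paragraph of real text.
page = """
<html><body>
  <script>var greeting = "Hello";</script>
  <img src="header.png" alt="Smith's Bike Shop">
  <img src="banner.png">
  <p>Repairs, rentals and new bikes.</p>
</body></html>
"""

extractor = VisibleTextExtractor()
extractor.feed(page)
print(extractor.chunks)
```

The unlabeled banner image and the script contents vanish from the output, which is exactly the information a text duplicate (or an `alt` attribute) would restore for the spiders.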

2. Deep vs. shallow navigation. “Because search engine spiders move between the pages of your site through the links you’ve created, it’s important to make this movement as easy as possible for them,” he wrote. “If your navigation structure is deep, meaning certain pages can be accessed only after a long string of sequential clicks, you run the risk that the spiders won’t penetrate deeply enough into your site to index all of your pages appropriately.” Kumar suggested breaking up your navigation into sub-categories or incorporating additional internal links.
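Click depth is easy to measure directly. A short sketch (the link graph is an invented toy site, not anyone’s real structure): a breadth-first search from the home page gives each page’s minimum number of clicks, which is the “depth” Kumar is describing.

```python
from collections import deque

def click_depths(links, start="/"):
    """Return the minimum number of clicks from `start` to each page,
    via breadth-first search over the internal-link graph."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# A toy site where a product page is reachable only through a long chain.
site = {
    "/": ["/catalog"],
    "/catalog": ["/catalog/bikes"],
    "/catalog/bikes": ["/catalog/bikes/road"],
    "/catalog/bikes/road": ["/product/roadster-500"],
}

depths = click_depths(site)
print(depths)
```

Here the product page sits four clicks deep; adding a direct internal link from "/" to the deeper pages, as Kumar suggests, would immediately lower those numbers.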

3. Inconsistent linking practices. “Again, because the search engines can’t apply human judgment to see what you meant to do, their spider programs may index the URLs ‘www.yoursite.com/page1.html’ and ‘yoursite.com/page1.html’ as two separate pages, even though both links direct visitors to the same location,” he wrote. “To prevent these indexing errors, be consistent in the way you build and name links. If you’ve made this mistake in the past, use 301 redirects to let the search engine spiders know that both the ‘www’ and ‘non-www’ versions of your URLs are the same.”
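The ‘www’ vs. ‘non-www’ duplication can be caught before it reaches a search engine. A minimal sketch using the standard `urllib.parse` module (the `preferred_host` default mirrors the example domain from the quote): it maps both variants of a URL to one canonical form so duplicates become string-equal.

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, preferred_host="www.yoursite.com"):
    """Rewrite www/non-www variants of the same page to a single
    canonical URL, so duplicate links can be detected by comparison."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    bare = preferred_host[4:] if preferred_host.startswith("www.") else preferred_host
    if host in (bare, "www." + bare):
        host = preferred_host
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

a = canonicalize("http://www.yoursite.com/page1.html")
b = canonicalize("http://yoursite.com/page1.html")
print(a)
print(b)
```

Both calls now yield the same string, which is the consistency Kumar is asking for; a 301 redirect then performs the same mapping for spiders that already hold the old variant.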

4. Incorrect redirections. “When it comes to 301 redirects, any time you move the pages on your website, whether you’re simply renaming them or transferring your entire site to a new hosting account or URL, you’ll want to put the correct redirects into place,” Kumar wrote. “Failing to do so can result in future indexing errors and eliminate the benefits provided by the backlinks you’ve spent time acquiring, as these links no longer point to valid pages. Both of these issues can reduce the search engine results rankings you’ve worked hard to develop.”
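The mechanics of a 301 are just a lookup table. A minimal sketch (the paths are hypothetical; a real site would configure redirects in its web server rather than in application code, but the logic is the same): requests for moved pages answer with status 301 and the new location, everything else with 200.

```python
# Hypothetical table of pages that have moved on this site.
REDIRECTS = {
    "/old-about.html": "/about",
    "/services.html": "/services/repairs",
}

def respond(path):
    """Return (status, location) for a requested path: a 301 pointing
    at the new page if it has moved, otherwise a 200 at the same path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(respond("/old-about.html"))  # (301, '/about')
print(respond("/contact"))         # (200, '/contact')
```

The 301 status is what tells spiders the move is permanent, so the old page’s backlinks keep counting toward the new location instead of dead-ending.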

5. Failure to include a site map. “As you improve the accessibility features of your website’s architecture, make sure you have a site map in place,” he wrote. “This file provides the spiders with an accessible reference of all the pages on your site, allowing indexing to proceed correctly.”
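A site map in this sense is a plain XML file listing every page. A minimal sketch using Python’s standard `xml.etree` (the page URLs reuse the example domain from the quote): it builds a `sitemap.xml` body in the sitemaps.org format that spiders read.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml body: a <urlset> containing one
    <url>/<loc> entry per page, in the sitemaps.org 0.9 format."""
    root = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(root, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(root, encoding="unicode")

pages = [
    "http://www.yoursite.com/",
    "http://www.yoursite.com/page1.html",
]

sitemap = build_sitemap(pages)
print(sitemap)
```

Saved at the site root as `sitemap.xml` (and typically referenced from robots.txt), this gives spiders a complete page list even when the navigation itself is hard to crawl.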

The complete article is available on Entrepreneur.com.