Thursday, May 24, 2012

Best SEO 2012: Correct Your Website Architecture Issues

Website architecture is a very important part of the overall search engine optimization strategy. Your site can get a great head start in achieving top rankings by ensuring you have implemented SEO friendly website architecture. Make sure you evaluate your client's website against the following SEO best practices:


Domain redirections


Check your redirections. The non-WWW version of the domain name should be 301 redirected to its WWW version. If any existing pages have been moved temporarily, use a 302 redirect. If pages have been deleted and your Analytics/Webmaster tool still shows links pointing to them, set up 301 redirects from the deleted URLs to their new versions or to the most relevant pages on your website.
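As a minimal sketch of the non-WWW to WWW redirect (assuming an Apache server with mod_rewrite enabled; example.com and the page names are placeholders), the rules in .htaccess would look like this:

    RewriteEngine On
    # Send any request for the bare domain to the www version with a permanent (301) redirect
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

    # Hypothetical deleted page 301 redirected to its most relevant replacement
    Redirect 301 /old-page.html http://www.example.com/new-page.html

Note that mod_alias (Redirect) and mod_rewrite rules can interact, so if you mix them as above, test the combination on your own server.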


Preferred domain setting


Make sure you set the preferred domain to the WWW version in Google Webmaster Tools.


Checking internal links for inconsistencies


Check the internal links report (for example, in Google Webmaster Tools) and see which page on your site tops the list. Typically this should be one of your service offering pages, product pages or the home page. If you see the "privacy policy" or "terms and conditions" page at the top, your internal link structure needs attention. Also, extremely large link counts can indicate poor or rampant site-wide internal linking.


Domain blacklist checks


Check if your domain name has been blacklisted. (domain-blacklist.e-dns.org/)


If you do e-mail marketing, use a separate domain name rather than your main business domain name.


Checking sitelinks


Check the Google results page (or your Webmaster Tools account) to see if your website has been awarded any sitelinks. If so, check whether any of them point to pages you do not want shown in the search results.


Web crawl error reporting


Check your Google Webmaster account. Navigate to "Diagnostics > Crawl Errors" and check for any errors. Make sure you rectify these issues ASAP.


URL deep directory depth issues


Search engines measure the importance of a page's subject matter, in part, by its proximity to the homepage. Content kept more than 3 directories deep is often considered low value and will have difficulty ranking well for its topic.
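For example (hypothetical URLs), compare:

www.example.com/services/seo-audit/ (2 directories deep - fine)

www.example.com/content/en/pages/misc/seo-audit/ (5 directories deep - likely to struggle)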


URL Structure - Check Points


1. How many directories deep is the deepest content?


2. How many clicks from the homepage is major content?


URL separators


I always recommend the use of hyphens in URLs rather than underscores. Do not use any special characters as URL separators. More than 4 separators in a URL can incur a spamming penalty and significantly reduce the page's ability to rank.


Note examples of URLs using separators, and which type of separator is used. The types include:


1. Hyphens (considered a space)


2. Plus Signs (considered a character)


3. Underscores (considered a character)


Note: Some types of websites, such as blogs, get a pass on this issue.
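For example (hypothetical URLs):

Preferred: www.example.com/red-widgets-sale/

Avoid: www.example.com/red_widgets_sale/ or www.example.com/red+widgets+sale/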


URL capitalizations


DO NOT use capital letters in URLs. If you already have URLs with capital letters in them, I strongly recommend setting up permanent (301) redirects to their lower-case versions to avoid URL confusion issues.


Even slight changes to URL formatting, such as added capitalization, can result in a splitting of PageRank and link value.


Note examples of URLs using capitalization. Check primary navigation pages and see if you can spot PR splits:

www.example.com/Games/

vs.

www.example.com/games/
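As a minimal sketch of the fix (assuming Apache with mod_rewrite; /Games/ is a placeholder path), a specific mixed-case URL can be permanently redirected to its lower-case version in .htaccess:

    RewriteEngine On
    # 301 redirect the capitalized path to the canonical lower-case path
    RewriteRule ^Games/(.*)$ /games/$1 [R=301,L]

A blanket "lowercase everything" rule requires a RewriteMap (tolower) defined in the main Apache configuration, so per-URL rules like the above are the practical option in a plain .htaccess file.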


XML sitemaps


XML Sitemaps are created to be submitted directly to search engines, providing them with the exact contents of your website.


Typically the name of the XML Sitemap is: "sitemap.xml"


If present, look for the following settings:

Check "changefreq"

Check "priority"

Check "lastmod"


Check for:


1. non-www versions of URLs (assuming www is the canonicalized version)


2. HTTPS URLs


3. URLs for other domains or sub domains


Other Points of Interest


1. XML Sitemaps should not include URLs that redirect; list the final destination URLs instead


2. XML Sitemaps can only include URLs which appear in the directory or sub-directories of the sitemap itself.


3. XML Sitemaps should only list pages which have unique content. Avoid listing poor quality pages.


4. XML Sitemaps should contain no more than 50,000 URLs

5. XML Sitemaps should be no larger than 10 MB in size.
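A minimal sitemap.xml sketch (placeholder URL and values) showing the "lastmod", "changefreq" and "priority" settings mentioned above:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2012-05-24</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
    </urlset>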


XML Sitemap Verified in GWT


Make sure your XML sitemap has been submitted through Google Webmaster Tools and has been verified.


XML Sitemap in Robots.txt


Listing your XML Sitemap in your robots.txt is a good way to ensure that Yahoo, Bing and Google can regularly find and crawl your current XML Sitemap.
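The directive is a single line that can go anywhere in robots.txt (placeholder URL):

    Sitemap: http://www.example.com/sitemap.xml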


robots.txt issues


Check if your site has a robots.txt file (www.example.com/robots.txt). If not, create one, even if it is just a default robots.txt file on your server. You might want to block search engines from indexing files in folders like "images", "admin", "scripts" and any other specific folders on your website.
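A minimal robots.txt sketch along these lines (the folder names are placeholders; only block folders you genuinely do not want indexed):

    User-agent: *
    Disallow: /admin/
    Disallow: /scripts/

    Sitemap: http://www.example.com/sitemap.xml

Keep in mind that blocking an "images" folder will also keep those images out of image search, so weigh that against the image optimization points later in this checklist.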


Appended Parameters


Multiple appended parameters can cause problems for search engines and create URL confusion by generating unnecessarily large numbers of URLs that point to identical or similar content. As a result, spiders may consume much more bandwidth than necessary, or may be unable to completely index all the content on your site.


Check:


1. For URLs with multiple appended parameters


2. Note the largest number of parameters you found in a single URL


3. Note the length of the parameters (anything over 10 characters could appear to be a Session ID)
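For example (hypothetical URLs), all of the following may return the same product page:

www.example.com/products?cat=5

www.example.com/products?cat=5&sort=price

www.example.com/products?cat=5&sort=price&sid=8f14e45fceea167a5a36dedd4bea2543

The last one also carries a 32-character parameter value that looks like a session ID.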


URL Encodings


Encoded URLs (URLs containing the '%' sign) can sometimes cause spider loops, and they generally increase URL confusion by serving the same content on multiple unique URLs. Strip the encodings from your URLs and stop linking to the encoded versions. Redirecting the existing encoded URLs to their new "clean" versions would also be ideal.
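For example (hypothetical URLs):

Encoded: www.example.com/red%20widgets/

Clean: www.example.com/red-widgets/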


Session IDs


Session IDs appended to a URL can produce spider loops and cause search engines to crawl a site slowly, abandon sections of the site, or even abandon the site entirely.


Breadcrumbs


Breadcrumbs are a clean, user-friendly way to increase the effectiveness of content siloing through internal linking, pass valuable keywords through anchor text, and provide a crawlable path for search engines to follow. Make sure your site has breadcrumbs implemented.
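A minimal HTML sketch (the pages and class name are placeholders):

    <div class="breadcrumbs">
      <a href="/">Home</a> &raquo;
      <a href="/games/">Games</a> &raquo;
      Chess
    </div>

Each link passes keyword-rich anchor text up the silo while giving spiders a clean path back toward the homepage.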


Background Images issues


Images set as background images provide no SEO benefit. Only on-page images can have descriptive alt tags and SEO-friendly file names. Try to avoid background images as much as possible.
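For example (hypothetical markup), the first image offers search engines nothing to index, while the second is fully crawlable:

    <!-- Background image: no alt text, no crawlable file-name signal -->
    <div style="background-image: url(/images/header.jpg);"></div>

    <!-- On-page image: alt text and a descriptive file name -->
    <img src="/images/blue-widget.jpg" alt="Blue widget with chrome trim">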


Source code size issues


This is the page size search engines will download during their crawl. It does not include images and dynamic elements of the website. The threshold is 300k per page. Make sure none of your pages has source code exceeding the 300k threshold.


User download size issues


This is the page size human visitors will download during their visit. It includes images and dynamic elements. The threshold is 500k per page. Make sure none of your pages has a user download size exceeding the 500k threshold.


Homepage Meta-Tags


Check for the following meta-tags and make sure you do not use them on your website:


Meta Refresh


<meta http-equiv="refresh" content="0;url=http://www.example.com/redirect.aspx">


Redirects a visitor to another URL after a specified amount of time. It does not pass along full PageRank and link value.


NoArchive


Prevents a cached copy of this page from being available in the search results.


NoSnippet


Prevents a description from appearing below the page in the search results, and also prevents caching of the page.
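For reference, these directives look like this in the page head (a sketch; confirm they are absent from your pages):

    <meta name="robots" content="noarchive">
    <meta name="robots" content="nosnippet">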


Noodp and Noydir Meta Tags


Use the following tags on your home page:


Noodp


This tag notifies search engines that you do not want them to replace your existing title and meta description tags in the SERPs with equivalent data found in your current DMOZ.org listing.


Noydir


Prevents the use of Titles and descriptions from the Yahoo Directory in Search Results.
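Both directives can be combined in a single robots meta tag in the page head:

    <meta name="robots" content="noodp, noydir">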


Hijacking


Find the IP address of your domain name using any online tool. Check if the IP address stays in the URL box (address bar) the entire time while clicking through the site.


1. Ping the site


2. Navigate to the site via IP


3. Does the IP stay in the browser's URL box when you click from page to page? If so, the site is vulnerable.
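One way to close this hole (a sketch assuming Apache with mod_rewrite; the IP and domain are placeholders) is to 301 redirect requests that arrive by raw IP address back to the canonical host name:

    RewriteEngine On
    # If the request's Host header is the bare server IP, redirect to the real domain
    RewriteCond %{HTTP_HOST} ^123\.45\.67\.89$
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]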


Image Alt-Tags


Alt-tags allow for further optimization of a webpage by adding spiderable descriptions of images. They are the primary source of information search engines rely on to assign value to images; they can help increase topic authority and affect an image's ability to appear in an "Image Search" or a traditional search result.
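Borrowing the file name from the "Descriptive Filenames" example further down this checklist (hypothetical markup and alt text), a well-described image looks like:

    <img src="/images/iphone4s.gif" alt="Apple iPhone 4S in white">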


Use Caption Text for Images


Images surrounded by caption text have a better chance of appearing in "Image Searches".


Descriptive Filenames for Images


Images with descriptive file names provide yet another opportunity to add a keyword on-page and can increase the ability of the specific image to appear in "Image Searches".


Bad Example: /images/image908761.gif


Good Example: /images/iphone4s.gif


Check for Oversized Images


Note any oversized images (over 500k) you can find. Large images take longer to download and, according to Google, the response time for requesting your images and their file size can affect their ability to rank in "Image Searches".


Site Hosting Location


If localization is important, the country of hosting can help determine if a website appears in local search results.


These points cover the most important factors of website architecture which, if addressed, can have a very positive impact on your organic SEO efforts.


Vikas Solanki is the owner and founder of SEOzy.com and an SEO consultant with 8+ years of experience. More than 90% of our clients hold top positions on major search engines.

