Through the Spyglass

IDENTIFYING TRENDS, ANTICIPATING CHANGES, CHARTING NEW DIRECTIONS IN THE DIGITAL SPACE.


Ten Essential SEO Checklist Items

People starting an SEO campaign often ask, “What are the main things I need to focus on to start moving up in the ranks?” That question is hard to answer in the abstract because not all websites are created equal. Website A may need significantly more original content, while website B may have major under-the-hood technical issues like bloated JavaScript or a missing XML sitemap. In other words, the main things to focus on will differ depending on your website; however, we’ve compiled a list of ten essential items to check off when performing an initial site audit.

10.) JavaScript & CSS Bloating

JavaScript and CSS can really help present a visually appealing experience to the user, but you can have too much of a good thing with these two. All too often we find that our clients have an overabundance of inline JavaScript and/or CSS just below the <head>. This can cause serious load-time issues that will annoy the crawlers and in some cases scare them off completely.

A good practice is to load JavaScript from external files or push those scripts to the bottom of the page so the user at least gets the essence of what they are looking for while the JavaScript and CSS load. You want to keep the homepage as lean as possible: although prominent SEO opinion these days treats load time as a minor factor, we have noticed a rankings difference among competitors with faster load times.
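If you want to quantify that bloat during an audit, a quick script can help. Here's a minimal Python sketch (the URL is a placeholder) that counts how many bytes of inline JavaScript and CSS are sitting in the <head> of a page:

```python
# A minimal audit sketch (placeholder URL) that measures how much inline
# JavaScript and CSS a page carries before the closing </head> tag.
from html.parser import HTMLParser
from urllib.request import urlopen

class HeadBloatParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_head = True          # count only until </head> is seen
        self.in_block = None         # "script" or "style" while inside one
        self.inline_bytes = {"script": 0, "style": 0}

    def handle_starttag(self, tag, attrs):
        if self.in_head and tag in ("script", "style"):
            self.in_block = tag

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False
        if tag == self.in_block:
            self.in_block = None

    def handle_data(self, data):
        if self.in_head and self.in_block:
            self.inline_bytes[self.in_block] += len(data.encode("utf-8"))

url = "https://www.example.com/"   # hypothetical page to audit
parser = HeadBloatParser()
parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))
print(f"Inline JS in <head>:  {parser.inline_bytes['script']} bytes")
print(f"Inline CSS in <head>: {parser.inline_bytes['style']} bytes")
```

Large numbers here suggest code that could be moved to external files or pushed further down the page.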

9.) Non-Use or Misuse of Canonicals

If you have poor rel=canonical tagging, you could face a pretty hard-hitting penalty, as the crawlers don't take too kindly to duplicate content. We often find that clients run into this issue unintentionally for any number of reasons, whether they were running a test or had a change in their development team midway through a build-out.

Check your canonical tags to ensure they are placed on the correct pages and point to the preferred version of each URL. According to Google, canonicals should only be used on pages that have duplicate content.
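A spot check is easy to script. This is a minimal sketch, with placeholder URLs, that pulls the rel=canonical tag from a page and flags it if it's missing or pointing somewhere you didn't expect:

```python
# A minimal sketch (URLs are placeholders) that pulls the rel=canonical tag
# from a page and flags it when it points somewhere unexpected.
from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

page_url = "https://www.example.com/red-running-shoes?color=red"  # hypothetical page
expected = "https://www.example.com/red-running-shoes"            # preferred version

parser = CanonicalParser()
parser.feed(urlopen(page_url).read().decode("utf-8", errors="replace"))

if parser.canonical is None:
    print("No canonical tag found.")
elif parser.canonical.rstrip("/") != expected.rstrip("/"):
    print(f"Canonical mismatch: {parser.canonical} (expected {expected})")
else:
    print("Canonical looks correct.")
```

Run this across the pages you know carry parameterized or duplicated versions and the mismatches surface quickly.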

8.) Flat vs. Deep Site Architecture

This one can arguably be categorized as either a strategic or a technical issue. There are usually two formats businesses choose when building a site. The first and more common is a silo structure with many drill-down pages supporting each silo, so as to show that the company is in fact an authority on the search subject matter. The second format has fewer vertical levels and sublevels, resulting in fewer layers of content. Sites that choose a flat architecture rather than a "siloed" deep architecture run the risk of not having enough content for the search engines to deem them an authority on a given subject. This is not always the case, as there are many sites with a flat, horizontal structure that rank very well, but the odds are more in your favor with a deep structure.

After you create a structure, ensure that your site, especially one with virtual and physical silos that contain plenty of depth, has a strong internal linking network. You want to make sure that your links are leading the user to relevant pages. For example, if I am looking for red running shoes within the content of the site, I don't want to end up on a level 3 or level 4 page showing me bicycle shorts. Neither do the crawlers.  
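One way to eyeball this during an audit is to dump a page's internal links alongside their anchor text. The sketch below uses only the Python standard library and a hypothetical category page URL; swap in your own pages:

```python
# A minimal sketch (placeholder URL) that lists a page's internal links with
# their anchor text, so you can check whether each link leads somewhere relevant.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class InternalLinkParser(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.host = urlparse(base_url).netloc
        self.current_href = None
        self.links = []              # list of [absolute URL, anchor text]

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                absolute = urljoin(self.base, href)
                if urlparse(absolute).netloc == self.host:
                    self.current_href = absolute
                    self.links.append([absolute, ""])

    def handle_data(self, data):
        if self.current_href and self.links:
            self.links[-1][1] += data.strip()

    def handle_endtag(self, tag):
        if tag == "a":
            self.current_href = None

base = "https://www.example.com/shoes/running/red/"  # hypothetical category page
parser = InternalLinkParser(base)
parser.feed(urlopen(base).read().decode("utf-8", errors="replace"))

for href, anchor in parser.links:
    print(f"{anchor or '(no anchor text)'} -> {href}")
```

If the anchor text on a red running shoes page is pointing users toward bicycle shorts three levels down, it will show up here.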

7.) Improper Use or Non-Use of Meta Tags

This one is self-explanatory but still very much a common problem. Obviously you will have trouble ranking on key search terms if you aren't titling your pages correctly.

It is essential that you label things correctly (assuming you have already done the keyword research, of course). This matters even more post-Hummingbird, because Google can now parse the entire query and understand in fine detail exactly what a user is searching for. Because of this update, it behooves businesses to ensure they are leveraging their metadata as effectively as possible.
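A simple script can surface missing or weak tags page by page. The sketch below (placeholder URL) reports the <title> and meta description along with their lengths, so you can compare them against the commonly cited rough guidelines of about 60 characters for titles and 160 for descriptions:

```python
# A minimal sketch (placeholder URL) that checks a page's <title> and meta
# description and reports their character lengths.
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

url = "https://www.example.com/red-running-shoes"  # hypothetical page
parser = MetaParser()
parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))

title = parser.title.strip()
print(f"Title ({len(title)} chars): {title or 'MISSING'}")
if parser.description is None:
    print("Meta description: MISSING")
else:
    print(f"Meta description ({len(parser.description)} chars): {parser.description}")
```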

6.) Bad or No XML Sitemap

Submitting an XML sitemap should be standard practice for just about every business. An XML sitemap tells Google which pages you want crawled and indexed, and you want to make sure you are giving the crawlers good information.

You want your sitemap to be completely free of URLs that return 404 errors or 302 redirects. Google has almost zero tolerance for a bad sitemap; anything beyond roughly 1% bad URLs can keep the sitemap from helping and start hurting your SEO.
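Checking this by hand gets old fast, so it's worth scripting. Here's a minimal sketch, with a placeholder sitemap URL, that pulls every <loc> from the sitemap and reports any URL that doesn't come back as a clean 200:

```python
# A minimal sketch (placeholder sitemap URL) that pulls every <loc> out of an
# XML sitemap and reports any URL that does not answer with a 200.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Return None so 301/302 responses surface as HTTPError instead of being followed.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)
sitemap_url = "https://www.example.com/sitemap.xml"   # hypothetical sitemap

tree = ET.fromstring(urllib.request.urlopen(sitemap_url).read())
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", ns)]

bad = []
for url in urls:
    try:
        status = opener.open(url, timeout=10).getcode()
    except urllib.error.HTTPError as err:      # 3xx/4xx/5xx land here
        status = err.code
    if status != 200:
        bad.append((url, status))

print(f"{len(bad)} of {len(urls)} sitemap URLs are not clean 200s:")
for url, status in bad:
    print(f"  {status}  {url}")
```

Anything that shows up in that list should be fixed or pulled from the sitemap before you resubmit it.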

5.) Inconsistent URLs/Parameters

URL parameters pass extra information about a page, often changing its content to show variations of certain products, such as different sort orders or filters. A URL parameter can be identified by the "?" in the URL, which marks where the query string begins. If your URLs carry inconsistent or unnecessary parameters, you could potentially wind up with a duplicate content situation.

You will want to utilize your robots.txt file here, blocking potential duplicate-content pages if you are currently working on them; the most common fix, though, is to add canonical tags to those pages.
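Before you add those tags, it helps to see how your parameterized URLs collapse into duplicates. The sketch below is illustrative only: the URL list and the set of "ignorable" parameters are assumptions you would replace with your own crawl data and tracking tags:

```python
# A minimal sketch showing how parameterized URLs can collapse into duplicates.
# The URL list and the IGNORABLE parameter set are assumptions; adjust them
# to match your own site and analytics tags.
from collections import defaultdict
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORABLE = {"utm_source", "utm_medium", "utm_campaign", "sort", "sessionid"}

def normalize(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORABLE]
    return urlunsplit((parts.scheme, parts.netloc, parts.path.rstrip("/"),
                       urlencode(sorted(kept)), ""))

crawled = [  # hypothetical URLs pulled from a crawl or server logs
    "https://www.example.com/shoes/red?utm_source=email",
    "https://www.example.com/shoes/red?sort=price",
    "https://www.example.com/shoes/red",
    "https://www.example.com/shoes/red?size=10",
]

groups = defaultdict(list)
for url in crawled:
    groups[normalize(url)].append(url)

for preferred, variants in groups.items():
    if len(variants) > 1:
        print(f"These URLs likely need a canonical tag pointing to {preferred}:")
        for v in variants:
            print(f"  {v}")
```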

4.) Content Presentation/Site-wide Links

A while back, Google recommended a limit of roughly 100 links per page, a guideline that doesn't hold much weight these days. Now there are plenty of sites with hundreds of links per page, but sometimes those links are not accompanied by enough supporting content. This can be a real ranking nightmare, because the crawlers will quickly suspect a site of link spam when they come across such pages.

Although a page with more than 50 or even 100 links looks pretty sketchy, it doesn't automatically mean you will be penalized. The key is to make sure the links and whatever content you have are synergistic, working together to contribute to the purpose of the page. The popular opinion is that site-wide links are bad, but if they were earned naturally, they are not necessarily a problem, even post-Panda/Penguin. If you did in fact earn them naturally, you should keep the equity from those links rather than remove them.
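If you want a rough sense of whether a page is all links and no substance, a quick ratio check helps. This sketch (placeholder URL) counts links and visible words on a page; treat the numbers as a prompt for review, not a verdict:

```python
# A minimal sketch (placeholder URL) that compares the number of links on a
# page with the amount of visible text, as a rough "is this page all links?" check.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkTextParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = 0
        self.words = 0
        self.skip = 0                # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += 1
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.words += len(data.split())

url = "https://www.example.com/"   # hypothetical page
parser = LinkTextParser()
parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))

print(f"{parser.links} links, {parser.words} words "
      f"(~{parser.words / max(parser.links, 1):.1f} words per link)")
```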

3.) Cloaking

Cloaking is one of those things that can draw a site penalty very quickly. It is basically showing the search engine one thing and the user another. This is a direct violation of Google's Webmaster Guidelines and will have a severe impact on your rankings.

Always be sure to show both the search engines and the users relevant content that relates to what the user is looking for. Google penalizes cloaking and similar tactics because it wants to provide the best user experience possible, and to do that, it can't have websites reporting content to the crawlers that is ultimately useless to the user.
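A basic sanity check is to request the same page as a regular browser and as Googlebot and compare what comes back. The sketch below does exactly that with placeholder values; a big difference warrants a closer look, though it isn't proof of cloaking on its own:

```python
# A minimal sketch (placeholder URL) that fetches the same page with a browser
# User-Agent and with a Googlebot User-Agent, then compares the responses.
from urllib.request import Request, urlopen

URL = "https://www.example.com/"   # hypothetical page
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(user_agent):
    req = Request(URL, headers={"User-Agent": user_agent})
    return urlopen(req, timeout=10).read()

browser_body = fetch(BROWSER_UA)
googlebot_body = fetch(GOOGLEBOT_UA)

print(f"Browser response:   {len(browser_body)} bytes")
print(f"Googlebot response: {len(googlebot_body)} bytes")
print("Identical bodies." if browser_body == googlebot_body
      else "Bodies differ; compare them manually before drawing conclusions.")
```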

2.) 301 Redirect Failures

Many clients who perform a site overhaul or site migration run into a very big problem here, one that can have major ranking implications: you think you have implemented your list of 301 redirects correctly, only to find out months later that most, if not all, of them failed for one reason or another.

It is vital that you check and test your 301 redirects after going live with a new site. There are a number of reasons why 301 redirects "break," including human database errors and coding changes. Always keep a watchful eye on those redirects.
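Testing a redirect map is easy to automate. Here's a minimal sketch in which the old-to-new URL pairs are placeholders; load your real list from the migration plan and confirm each old URL answers with a 301 to the expected destination:

```python
# A minimal sketch that tests a redirect map after a migration. The old/new
# URL pairs are placeholders; load your real list from the migration plan.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Surface 3xx responses as HTTPError so we can read the code and Location.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

redirect_map = {   # hypothetical old URL -> expected new URL
    "https://www.example.com/old-shoes-page": "https://www.example.com/shoes/",
    "https://www.example.com/old-contact":    "https://www.example.com/contact/",
}

for old, expected in redirect_map.items():
    try:
        response = opener.open(old, timeout=10)
        print(f"FAIL {old}: returned {response.getcode()} instead of redirecting")
    except urllib.error.HTTPError as err:
        location = err.headers.get("Location", "")
        if err.code == 301 and location.rstrip("/") == expected.rstrip("/"):
            print(f"OK   {old} -> {location}")
        else:
            print(f"FAIL {old}: {err.code} -> {location or '(no Location header)'}")
```

Run it right after launch and again a few weeks later; redirects have a way of quietly breaking when databases or code change.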

1.) Robots.txt

This may seem like an obvious or even an unlikely choice for this list, but there have been times when a client couldn't figure out what was happening with their rankings, and the only problem was a robots.txt file that had somehow been configured to block the crawlers from the key pages of the site. It is important to note that the robots.txt file will not exclude your pages from being indexed, but it will tell the crawlers not to crawl the content of the pages you designate in the file.
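A quick way to catch this is to run your most important URLs through the robots.txt rules and confirm Googlebot is allowed to crawl them. The sketch below uses Python's built-in robots.txt parser and placeholder URLs:

```python
# A minimal sketch that reads a site's robots.txt and confirms Googlebot is
# allowed to crawl a short list of key pages. The URLs are placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://www.example.com/robots.txt")  # hypothetical site
robots.read()

key_pages = [
    "https://www.example.com/",
    "https://www.example.com/shoes/running/",
    "https://www.example.com/contact/",
]

for url in key_pages:
    if robots.can_fetch("Googlebot", url):
        print(f"OK      {url}")
    else:
        print(f"BLOCKED {url}  <- check your Disallow rules")
```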

It's always good to have an SEO checklist, and starting with your robots.txt file isn't a bad idea.