Last week, we kicked off our SEO audit series by covering arguably the two largest aspects of the SEO process: content and keywords. Quality content targeting relevant keywords will get you very far in SEO, but if you really want to beat out the competition in Google, you’ve got to leave no stone unturned in your SEO audit.
To reiterate when and why you’d want to perform an SEO audit: when you have a website that you want to rank well in Google, you need to A) identify potential problems that could prevent this from happening, and B) come up with a strategy that will get you to that coveted first page again and again.
Our previous post outlines the steps necessary to analyze content problems — perhaps there’s not enough content on a website, or perhaps the content that is there has been copied and pasted from another website. That’s a problem, and you should really check out part 1 of this series for tips on tackling such issues.
So, let’s say you’re past that. You’ve got a website filled with wonderful content, and now you want to make sure Google is aware that you’re out there. That’s where Google Analytics and Webmaster Tools come in.
Google Analytics and Webmaster Tools Integration
These tools are fairly easy to set up, and they greatly enhance your insight as a webmaster (plus your website’s visibility potential). First, we’ll tackle Webmaster Tools, which gives you access to backlink reports, feedback from Google’s crawl bots (i.e., if Google has issues crawling your website, you will be notified), and much more. Once verified, you can directly submit pages of your site to Google’s index. You can also submit a sitemap.xml file, which maps out the structure of your website all nice and tidy like for Google’s crawl bots. More on this feature a little later.
So, in order to get started, you’ll find that a Google Account is required. If you happen to have a Gmail account, that will be sufficient. If not, then you really should create one.
Once you’re logged into your Google account, you’ll be able to add a website in Webmaster Tools. When that’s done, you have to verify your ownership. There are a few ways you can do this, but I generally prefer downloading the .html file Google provides and dropping it in the root directory of your website (this can be done using an FTP client such as FileZilla).
Now that Webmaster Tools is up and running, let’s move on to Google Analytics. This tool lets you monitor the traffic coming to your website. You can analyze which pages are most popular, which keywords are bringing in the most visits, and so on. This data can be used to really fine-tune your website and determine what’s working and what isn’t. Watching organic search engine traffic increase month by month is one of the most rewarding aspects of doing SEO.
So, let’s get this thing set up. Follow that Analytics link, and if you’re still signed into your Google account, you can visit the “Admin” section (orange button near the top) and add a new account (click that “+ New Account” button).
Now you should be at the page pictured above. You can assign whatever name you wish, but do be sure to get that URL correct. It’s up to you if you want to share your site’s data. When you’re done here, click that “Get Tracking ID” button at the bottom. If you’re not intimidated by web code, you can implement the tracking code within the <head></head> section on each page that you want to track. If this is a bit much for you, feel free to ask a web developer to take care of it for you.
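To make the placement clear, here’s a sketch of where the tracking code belongs. Don’t copy this literally — paste the exact snippet Google gives you on the “Get Tracking ID” page (it contains your unique ID, which looks something like UA-12345678-1; that format is just an illustration):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Your Page Title</title>
  <!-- Paste the tracking code from the "Get Tracking ID" page here,
       exactly as Google provides it, on every page you want to track. -->
</head>
<body>
  <!-- Your page content -->
</body>
</html>
```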
You’ll want to wait a month or so before you really start to analyze and compare traffic (and as you become more seasoned, you can track conversions and optimize your site to be a mean product- or service-selling machine). For right now, so long as you start to see visits being tracked within the next day or so, you can take satisfaction in knowing that it’s working properly.
Robots.txt and Sitemaps
Now it’s time to make sure your website is very “search engine-friendly.” By this, we mean that you want to make your website very accommodating to Google’s crawl bots. Think of this as being the virtual equivalent of putting a nice welcome mat by the door and leaving cookies and milk on the table for Google.
Thankfully, this can be fairly easy to do. First, you’ll want two different types of sitemaps: .xml and .html. The .html sitemap is like a table of contents for your site. It lists the various pages in a nice, presentable format that’s inviting to both search engines and human visitors. You can either manually create this sort of sitemap yourself, linking to it from either your website’s footer or main navigation, or you can let an automated program do this for you.
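If you’d rather build the .html sitemap by hand, it can be as simple as a page of links to every important page on your site. A bare-bones sketch (the page names and URLs here are placeholders — use your own):

```html
<h1>Sitemap</h1>
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/about.html">About Us</a></li>
  <li><a href="/services.html">Services</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```

Link to this page from your footer or main navigation so both visitors and crawl bots can find it.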
An automated sitemap generator lets you kill two birds with one stone, as it can also provide you with a spiffy sitemap.xml file. This type of sitemap shouldn’t be linked from anywhere on your site, as it’s specifically for the search engines. That’s why it looks a lot like what you see when you peek under the hood of a web page: naked code.
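To give you an idea of what that naked code looks like, here’s a minimal sitemap.xml following the sitemaps.org protocol. The URLs and date are placeholders — a generator will fill in your real pages (only the <loc> tag is required for each URL):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourwebsite.com/</loc>
    <lastmod>2014-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.yourwebsite.com/about.html</loc>
  </url>
</urlset>
```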
The sitemap.xml file gets dropped into your website’s root directory, much like the Webmaster Tools verification .html file we dealt with a little while ago. In fact, speaking of Webmaster Tools, we should probably revisit it right now to add this sitemap.
When you’re back in Webmaster Tools (while logged into your Google Account and having clicked on the newly-created and verified website profile), you can add a sitemap by clicking the “Sitemaps” section on the right-hand side of the page. On the next page, there will be an “Add/Test Sitemap” button. Click that, and seeing as you just placed the file in your website’s root directory, you can simply enter sitemap.xml in the field that appears. Click “Submit Sitemap,” and the deed is done. You’ve now made your website’s structure known and familiar to the almighty Google.
Now, while we’re at it, let’s quickly make a robots.txt file. This is a simple .txt document (something you’d open with notepad) that you can utilize to block Google’s crawl bots from seeing certain pages of your site. Doing that’s a bit more advanced SEO work, so we’ll just stick with the basic implementation.
Open Notepad (or a Mac equivalent, such as TextEdit) and copy and paste the following:
User-agent: *
Disallow:
This tells Google... well, not a whole lot. It’s a placeholder for now. However, below that second line, you can skip a line and add a new one that points Google to your sitemap.xml, like so:
Sitemap: http://www.yourwebsite.com/sitemap.xml
Save that file, name it “robots,” and make sure it has a .txt extension. Then, once again, drop that file into your website’s root directory. That’s all you’ll need to do for now.
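For reference, when you do eventually want to block a section of your site, the same file handles it. A hypothetical example — the /private/ directory here is made up, so substitute a directory you actually want hidden from crawl bots:

```
User-agent: *
Disallow: /private/

Sitemap: http://www.yourwebsite.com/sitemap.xml
```

The blank Disallow line from before meant “block nothing”; adding a path after it tells crawl bots to skip everything under that path.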
Between the implementation of Google Analytics and Webmaster Tools and the creation and submission of sitemaps and a robots.txt file, you’ve now opened a line of communication with Google — a relationship, if you will. Like any healthy relationship, you’ll want to be open and honest with Google. Listen to Google. Be there for Google, and in turn, a beautiful online presence should begin to blossom. Next week, we’ll wrap up our SEO audit series with the ins and outs of onsite optimization and link acquisition.
Until next time, if you have anything you’d like to add to our SEO audit series, or you’d like to ask us a question, feel free to do so in the comments section below.