Log File Analysis 101 – Whiteboard Friday

Log file analysis can provide some of the most detailed insights about what Googlebot is doing on your site, but it can be an intimidating subject. In this week's Whiteboard Friday, Britney Muller breaks down log file analysis to make it a little more accessible to SEOs everywhere.


Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we're going over all things log file analysis, which is so incredibly important because it really tells you the ins and outs of what Googlebot is doing on your sites.

So I'm going to walk you through the three primary areas, the first being the types of logs that you might see from a particular site, what that looks like, and what that information means. The second being how to analyze that data and how to get insights, and then the third being how to use that to optimize your pages and your site.

For a primer on what log file analysis is and its application in SEO, check out our article: How to Use Server Log Analysis for Technical SEO

1. Types

So let's get right into it. There are three primary types of logs, the first one being Apache. But you'll also see W3C, elastic load balancing, which you might see a lot with things like Kibana. But you also will likely come across some custom log files. So for those larger sites, that's not uncommon. I know Moz has a custom log file system. Fastly is a custom type of setup. So just be aware that those are out there.

Log data

So what are you going to see in these logs? The data that comes in is primarily in these colored ones here.

So you'll hopefully for sure see:

  • the request server IP;
  • the timestamp, meaning the date and time that the request was made;
  • the URL requested, so what page are they visiting;
  • the HTTP status code, was it a 200, did it resolve, was it a 301 redirect;
  • the user agent, and so for us SEOs we're just looking at those user agents that are Googlebot.

So log files traditionally house all data, all visits from individuals and traffic, but we want to analyze the Googlebot traffic. Method (Get/Post), and then time taken, client IP, and the referrer are sometimes included. So what this looks like, it's kind of like glibbery gloop.

It's a word I just made up, and it just looks like that. It's just like bleh. What is that? It looks crazy. It's a new language. But essentially you'll likely see that IP, so that red IP address, that timestamp, which will commonly look like that, that method (get/post), which I don't completely understand or necessarily need to use in some of the analysis, but it's good to be aware of all these things, the URL requested, that status code, all of these things here.
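To make that "glibbery gloop" a little more concrete, here's a minimal sketch of what a single Apache combined-format line might look like and how you could pull the fields above out of it in Python. The sample IP, path, and regex are illustrative assumptions; your own server's format may differ.

```python
import re

# A made-up Apache combined-format line, for illustration only.
sample_line = (
    '66.249.66.1 - - [08/Dec/2017:04:54:20 -0400] "GET /contact/ HTTP/1.1" '
    '200 11179 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"'
)

# Rough pattern for the combined log format: IP, timestamp, method, URL,
# status code, response size, referrer, and user agent.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<size>\S+) "(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

match = LOG_PATTERN.match(sample_line)
if match:
    fields = match.groupdict()
    print(fields["ip"], fields["timestamp"], fields["method"],
          fields["url"], fields["status"], fields["user_agent"])
```

Once each line is broken into named fields like this, the rest of the analysis below is just counting and filtering.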

2. Analyzing

So what are you going to do with that data? How can we use it? So there are a number of tools that are really great for doing some of the heavy lifting for you. Screaming Frog Log File Analyser is great. I've used it a lot. I really, really like it. But you have to have your log files in a specific type of format for them to use it.

Splunk is also a great resource. Sumo Logic and I know there's a bunch of others. If you're working with really large sites, like I have in the past, you're going to run into problems here because it's not going to be in a common log file. So what you can do is to manually do some of this yourself, which I know sounds a little bit crazy.

Manual Excel analysis

But hang in there. Trust me, it's fun and super interesting. So what I've done in the past is I'll import a CSV log file into Excel, and I'll use the Text Import Wizard and you can basically delineate what the separators are for this craziness. So whether it be a space or a comma or a quote, you can kind of break those up so that each of those lives within their own columns. I wouldn't worry about having extra blank columns, but you can separate those. From there, what you would do is just create pivot tables. So I can link to a resource on how you can easily do that.
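If you'd rather script the same workflow instead of using the Text Import Wizard, here's a minimal sketch in Python with pandas. The filename, the assumption that there's no header row, and the column names are all placeholders for whatever your own export looks like.

```python
import pandas as pd

# Assumed: a headerless CSV export of the parsed log, one row per request.
df = pd.read_csv(
    "access_log.csv",
    names=["ip", "timestamp", "method", "url", "status", "size", "referrer", "user_agent"],
)

# Parse the timestamp so you can later group by day, week, or month.
df["timestamp"] = pd.to_datetime(
    df["timestamp"], format="%d/%b/%Y:%H:%M:%S %z", errors="coerce"
)

# We only care about Googlebot traffic for this analysis.
googlebot = df[df["user_agent"].str.contains("Googlebot", na=False)]
print(googlebot.head())
```

From here, the pivot tables described below map onto simple `value_counts()` and crosstab operations.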

Top pages

But essentially what you can look at in Excel is: Okay, what are the top pages that Googlebot hits by frequency? What are those top pages by the number of times they're requested?

Top folders

You can also look at the top folder requests, which is really interesting and really important. On top of that, you can also look into: What are the most common Googlebot types that are hitting your site? Is it Googlebot mobile? Is it Googlebot images? Are they hitting the correct resources? Super important. You can also do a pivot table with status codes and look at that. I like to apply some of those purple things to the top pages and top folders reports. So now you're getting some insights into: Okay, how did some of these top pages resolve? What are the top folders looking like?
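As a rough sketch of those pivot tables in code, continuing from the hypothetical `googlebot` DataFrame in the previous example (the column names are still assumptions):

```python
import pandas as pd

# Top pages by number of Googlebot requests.
top_pages = googlebot["url"].value_counts().head(25)

# Top folders: treat the first path segment of each URL as the "folder".
googlebot = googlebot.assign(folder=googlebot["url"].str.split("/").str[1])
top_folders = googlebot["folder"].value_counts().head(25)

# Which Googlebot variants are hitting the site (desktop, smartphone, images, ...)?
bot_types = googlebot["user_agent"].value_counts()

# Status codes crossed against the top pages, pivot-table style.
status_by_page = pd.crosstab(googlebot["url"], googlebot["status"])
status_by_page = status_by_page.loc[status_by_page.index.isin(top_pages.index)]

print(top_pages, top_folders, bot_types, status_by_page, sep="\n\n")
```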

You can also do that for Googlebot IPs. This is the best hack I've found with log file analysis. I will create a pivot table just with Googlebot IPs, this right here. So I will often get, sometimes it's a bunch of them, but I'll get all of the unique ones, and I can go to the terminal on your computer, on most standard computers.

I tried to draw it. It looks like that. But all you do is you type in "host" and then you put in that IP address. You can do it in your terminal with this IP address, and you will see it resolve as a Google.com. That verifies that it's indeed a Googlebot and not some other crawler spoofing Google. So that's something that these tools tend to automatically take care of, but there are ways to do it manually too, which is just good to be aware of.
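If you have a lot of unique IPs, here's one way you might script that reverse-DNS check rather than typing `host` for each one by hand. This is a sketch under the assumption that your unique IPs live in a Python list; the example IPs are only illustrative. A genuine Googlebot IP should resolve to a googlebot.com or google.com hostname.

```python
import socket

# Hypothetical list of unique IPs pulled from the Googlebot pivot table.
candidate_ips = ["66.249.66.1", "66.249.66.200"]

for ip in candidate_ips:
    try:
        # Equivalent to running `host <ip>` in the terminal: a reverse DNS lookup.
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        print(f"{ip}: no reverse DNS entry -- likely not Googlebot")
        continue
    verified = hostname.endswith((".googlebot.com", ".google.com"))
    print(f"{ip} -> {hostname} ({'verified' if verified else 'possible spoof'})")
```

Google's own verification guidance also suggests running a forward DNS lookup on the returned hostname and checking that it points back to the original IP, which the dedicated log tools typically handle for you.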

3. Optimize pages and crawl budget

All right, so how do you optimize around this data and really start to enhance your crawl budget? When I say "crawl budget," it primarily just means the number of times that Googlebot is coming to your site and the number of pages that they typically crawl. So what does that crawl budget look like, and how can you make it more efficient?

  • Server error awareness: So server error awareness is a really important one. It's good to keep an eye on an increase in 500 errors on some of your pages (there's a rough sketch of these log checks after this list).
  • 404s: Valid? Referrer?: Another thing to take a look at is all of the 400s that Googlebot is finding. It's so important to see: Okay, is that 400 request a valid 400? Does that page not exist? Or is it a page that should exist and no longer does, but you could maybe fix? If there's an error there or if it shouldn't be there, what is the referrer? How is Googlebot finding that, and how can you start to clean some of those things up?
  • Isolate 301s and fix frequently hit 301 chains: 301s, so a lot of questions about 301s in these log files. The best trick that I've kind of discovered, and I know other people have discovered, is to isolate and fix the most frequently hit 301 chains. So you can do that in a pivot table. It's actually a lot easier to do this when you have kind of paired it up with crawl data, because now you have some more insights into that chain. What you can do is you can look at the most frequently hit 301s and see: Are there any easy, quick fixes for that chain? Is there something you can remove and quickly resolve to just be like a one hop or a two hop?
  • Mobile first: You can keep an eye on mobile first. If your site has gone mobile first, you can dig into that, into the logs and evaluate what that looks like. Interestingly, the Googlebot is still going to look like the compatible Googlebot/2.1. However, it's going to have all of the mobile implications in the parentheses before it. So I'm sure these tools can automatically know that. But if you're doing some of the stuff manually, it's good to be aware of what that looks like.
  • Missed content: So what's really important is to take a look at: What is Googlebot finding and crawling, and what are they just completely missing? So the easiest way to do that is to cross-compare with your sitemap. It's a really valuable way to take a look at what might be missed and why, and how you can maybe reprioritize that data in the sitemap or integrate it into navigation if at all possible.
  • Compare frequency of hits to traffic: This was an awesome tip I got on Twitter, and I can't remember who said it. They said compare frequency of Googlebot hits to traffic. I thought that was brilliant, because one, not only do you see a potential correlation, but you can also see where you might want to increase crawl traffic or crawls on a specific, high-traffic page. Really interesting to kind of take a look at that.
  • URL parameters: Take a look at whether Googlebot is hitting any URLs with parameter strings. You don't want that. It's commonly just duplicate content or something that can be assigned in Google Search Console with the parameter section. So any e-commerce sites out there, definitely check that out and kind of get that all straightened out.
  • Evaluate days, weeks, months: You can evaluate the days, weeks, and months that it's hit. So is there a spike every Wednesday? Is there a spike every month? It's kind of interesting to know, not totally critical.
  • Evaluate speed and external resources: You can evaluate the speed of the requests and whether there are any external resources that can potentially be cleaned up to speed up the crawling process a bit.
  • Optimize navigation and internal links: You also want to optimize that navigation, like I said earlier, and use that meta noindex.
  • Meta noindex and robots.txt disallow: So if there are things that you don't want in the index and if there are things that you don't want to be crawled, from your robots.txt you can add all those things and start to help some of this stuff out as well.
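As promised above, here's a rough sketch of a few of those checks, continuing from the same hypothetical `googlebot` DataFrame used earlier (the column names remain assumptions, not a standard):

```python
# Server errors: watch for an increase in 5xx responses.
server_errors = googlebot[googlebot["status"].between(500, 599)]
print(server_errors["url"].value_counts().head(10))

# 404s with their referrers, so you can see how Googlebot is finding them.
not_found = googlebot[googlebot["status"] == 404]
print(not_found.groupby(["url", "referrer"]).size().sort_values(ascending=False).head(10))

# Most frequently hit 301s -- good candidates for shortening redirect chains.
redirects = googlebot[googlebot["status"] == 301]
print(redirects["url"].value_counts().head(10))

# URLs with parameter strings that probably shouldn't be crawled.
parameter_urls = googlebot[googlebot["url"].str.contains("?", regex=False, na=False)]
print(parameter_urls["url"].value_counts().head(10))

# Crawl frequency by day of week -- is there a spike every Wednesday?
print(googlebot.groupby(googlebot["timestamp"].dt.day_name()).size())
```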

Reevaluate

Lastly, it's really helpful to connect the crawl data with some of this data. So if you're using something like Screaming Frog or DeepCrawl, they allow these integrations with different server log files, and it gives you more insight. From there, you just want to reevaluate. So you want to kind of continue this cycle over and over again.

You want to look at what's going on, have some of your efforts worked, is it being cleaned up, and go from there. So I hope this helps. I know it was a lot, but I want it to be kind of a broad overview of log file analysis. I look forward to all of your questions and comments below. I will see you again soon on another Whiteboard Friday. Thanks.

Video transcription by Speechpad.com


