
Eliminate Google Webmaster Tools crawl errors

One of the first concerns when analyzing a website is to check the reports in Google Webmaster Tools (WMT), the tool Google offers for monitoring several aspects of our site: keywords, duplicate content, indexing, and even the different types of errors that can occur on the site.

It obviously isn't the only tool, nor the best, but it is a valuable complement to the information Google holds about our site, information we rely on to improve our rankings as part of SEO services.

The crawl errors reported in WMT come in several types and are broken down by device group, whether desktop or mobile.

What are the 50x and 40x crawl errors?

Type 500 errors generally originate at the server level, for example when Apache or Nginx runs out of resources or when the application can't reach the database.

Webmaster Tools 40x errors report

Type 400 errors are application errors; the most common are 404 and 410, which Google barely differentiates.
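When we know a page is gone for good, answering with a 410 instead of a 404 makes that intent explicit to the robot. A minimal sketch, assuming an Apache server with mod_alias enabled and a hypothetical path:

```
# .htaccess — answer "410 Gone" for a permanently removed page
# (hypothetical path; requires Apache's mod_alias)
Redirect gone /old-page.html
```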

All crawl errors are important, and bringing their number down, especially when it exceeds the number of indexed pages, must be priority one.

We should always take our site's authority into account to work out whether errors are still being generated or are merely historical, because if the site doesn't have much authority the Google robot can take months, even more than a year, to recheck an error page.

How to eliminate our site's crawl errors

Type 500 errors are easy to spot; they are caused by poor-quality hosting or by badly managed traffic peaks, assuming they aren't serious application bugs.

404 errors can arise for two reasons: pages that were, or still are, linked from our own site no longer exist, or the same situation with links from external sites. The links may or may not still exist, but it's up to the Google robot to recheck them.
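Before touching anything, it's worth rechecking which of the reported URLs still fail. A minimal Python sketch, assuming the requests library and a hypothetical crawl_errors.txt file with one URL per line exported from WMT:

```python
# recheck.py — print the current HTTP status of each reported error URL
import requests

with open("crawl_errors.txt") as f:  # hypothetical WMT export
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # HEAD is enough to read the status code without downloading the body
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        status = f"failed ({exc})"
    print(url, status)
```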

Once we identify the error pages we want to fix, we have to check whether they're still linked from our site. If they are, the first step is to remove those links. The next step is to take the page out of Google's index; the best way is to add it to the robots.txt file with a Disallow directive. If we can find a pattern in the error URLs, we can add the pattern to robots.txt and keep the file short.
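As a minimal sketch (the paths here are hypothetical, not from any real report), the resulting robots.txt could look like this:

```
User-agent: *
# A single removed page
Disallow: /old-offer.html
# A pattern covering a whole family of removed URLs
Disallow: /catalog/2012/
```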

Generally, on a large site with many errors, say around 10,000 indexed pages and 6,000 errors, it's normal that, as the Google robot removes some errors, it keeps discovering new ones, so the total may rise at first; after a few days it will start to drop as well.

Note that the new errors can follow different URL patterns, so stay attentive and add them to robots.txt as they appear.

Example: we recently migrated a site from Drupal to WordPress and, on roughly 3,000 indexed pages, the error count climbed to 8,000. Applying these rules corrected the errors in just 7 days.
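In a migration like that, most of the errors tend to share the old platform's URL patterns. A hedged sketch of the kind of rules involved, using typical Drupal-style paths as hypothetical examples:

```
User-agent: *
# Hypothetical Drupal-era patterns that no longer exist after the move to WordPress
Disallow: /node/
Disallow: /taxonomy/term/
Disallow: /?q=
```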

Errors eliminated from Webmaster Tools

We must be especially careful when adding Disallow rules to robots.txt that match URLs listed in sitemap.xml, because we'll get a warning message.
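A quick illustration of the conflict, with a hypothetical URL: if sitemap.xml announces a page that robots.txt blocks, WMT flags it.

```
# sitemap.xml contains (hypothetical entry):
#   <url><loc>https://www.example.com/blog/post-1/</loc></url>
#
# ...while robots.txt blocks the same path, so WMT will warn about it:
User-agent: *
Disallow: /blog/
```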

Another, more drastic solution for sites with few indexed pages and full control over published pages is to permanently add a Disallow: / rule to the robots.txt file and explicitly mark each page we want indexed with an Allow rule.
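A minimal sketch of that whitelist approach, with hypothetical paths; note that for Googlebot the most specific matching rule wins, so the Allow lines take precedence over the blanket Disallow:

```
User-agent: *
# Allow only the pages we want crawled (hypothetical paths)
Allow: /$
Allow: /services/
Allow: /contact.html
# Block everything else
Disallow: /
```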

This way we won't have error problems, and we won't make the Google robot do extra work; it will surely be grateful for that.

Author: Mauro Flores
