See how frequently Googlebot is crawling your site and identify the most commonly crawled pages and folders.
Check status codes for each of your site’s pages to eliminate errors. Create 301 redirects for old content in a single click.
Is your site’s crawl budget being wasted on secondary pages?
Test locally or download and import web server logs from AWS S3.
Web server logs are the best source of information for identifying which URLs search engines are actually crawling. This data is critical for diagnosing low search engine rankings.
Static Site SEO
Import server access logs produced by Amazon's CloudFront and Simple Storage Service (S3) or Apache and translate them into human-readable statistics, reports and graphs.
Read S3 access logs directly from Amazon S3 buckets or load a standard Apache (W3C format) log from a file. Parse the logs to analyse the type of visitor and verify whether a bot request is genuine via reverse DNS lookup, eliminating fake bots. Check which URLs are crawled by which bot, and whether crawl budget is being wasted on static resources while updated content is indexed infrequently.
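As a sketch of the two steps above, the following assumes the Apache "combined" log format and uses Google's published forward-confirmed reverse DNS check (PTR record ending in googlebot.com or google.com that resolves back to the same IP). The field names and regex are illustrative; adjust them to your own log format.

```python
import re
import socket

# Regex for the Apache "combined" log format (an assumption; adapt as needed).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_log_line(line):
    """Parse one combined-format access log line into a dict, or None."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

def is_verified_googlebot(ip):
    """Forward-confirmed reverse DNS check: the PTR name must end in
    googlebot.com or google.com and must resolve back to the same IP.
    Requires network access, so it is defined but not called here."""
    try:
        host = socket.gethostbyaddr(ip)[0]
    except (socket.herror, socket.gaierror):
        return False
    if not host.endswith(('.googlebot.com', '.google.com')):
        return False
    try:
        return socket.gethostbyname(host) == ip
    except socket.gaierror:
        return False

line = ('66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] '
        '"GET /blog/post HTTP/1.1" 200 5120 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')
hit = parse_log_line(line)
```

A claimed Googlebot hit would then be accepted only if `is_verified_googlebot(hit['ip'])` returns True; matching the user-agent string alone is not enough, since fake bots routinely spoof it.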
For SEO, URL consistency is essential. When you update content or change navigation, 301 redirects should be created to avoid 404 errors. Managing 301 redirects is not a simple task: use web log analysis to check that all redirects are working and pointing to the intended target.
A 301 status code returned by the server tells the browser (and the crawler) that the page has moved permanently. Google has confirmed that a 301 redirect does not lower the page's rank. This is essential if you are updating content or modifying your site's navigation.
Without a 301 redirect, the web server replies with a 404 error when content is missing. Use the web server logs to identify errors caused by moved or updated content and, with a single click, create a 301 redirect to the new target.
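The 404-to-redirect workflow can be sketched as below. The function and the `new_targets` mapping (old URL to new URL) are hypothetical names for illustration; the output is written as Apache `Redirect 301` directives, one common way to deploy such redirects.

```python
def redirects_for_404s(entries, new_targets):
    """From parsed log entries, collect every URL that returned 404 and,
    where a replacement is known, emit an Apache `Redirect 301` directive.
    `new_targets` maps old URL -> new URL and is maintained by you."""
    missing = sorted({e['url'] for e in entries if e['status'] == '404'})
    return [f'Redirect 301 {old} {new_targets[old]}'
            for old in missing if old in new_targets]

# Minimal sample data standing in for parsed log lines.
entries = [
    {'url': '/old-post', 'status': '404'},
    {'url': '/about',    'status': '200'},
]
print(redirects_for_404s(entries, {'/old-post': '/blog/new-post'}))
# -> ['Redirect 301 /old-post /blog/new-post']
```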
Bot count
Crawl Frequency
Status Codes
File Types
Googlebot can discover URLs, but it doesn't always index them. If you are publishing new content, the web logs will reveal whether Googlebot has crawled the page and is therefore in a position to index it.
Broken links affect your pages' search rankings. Analyse web logs to find 404 errors caused by broken internal and external links.
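A minimal sketch of that analysis, assuming log entries already parsed into dicts with `url`, `status` and `referrer` fields: counting 404 hits per URL and keeping the referrers lets you trace each broken link back to the page that contains it.

```python
from collections import Counter

def broken_link_report(entries):
    """Count 404 hits per URL and record the referrers pointing at them,
    so broken internal and external links can be traced to their source."""
    counts, referrers = Counter(), {}
    for e in entries:
        if e['status'] == '404':
            counts[e['url']] += 1
            ref = e.get('referrer', '-')
            if ref not in ('-', ''):
                referrers.setdefault(e['url'], set()).add(ref)
    return counts, referrers

# Illustrative sample entries.
entries = [
    {'url': '/gone', 'status': '404', 'referrer': 'https://example.com/links'},
    {'url': '/gone', 'status': '404', 'referrer': '-'},
    {'url': '/home', 'status': '200', 'referrer': '-'},
]
counts, referrers = broken_link_report(entries)
```

A referrer on your own domain points to a broken internal link you can fix directly; an external referrer is a candidate for a 301 redirect, since you cannot change the linking site.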
Orphan pages can occur when a page is generated for a paginated list but is not included in the menu navigation. Use web log analysis to check whether all of your pages have been crawled by Google and indexed.
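At its core, orphan-page detection is a set difference between the URLs you publish (for example, from your sitemap) and the URLs that appear in the crawl logs; the function name and sample URLs below are illustrative.

```python
def orphan_candidates(published_urls, crawled_urls):
    """URLs that exist on the site but never appear in the crawl logs:
    candidates for orphan pages that no navigation path reaches."""
    return sorted(set(published_urls) - set(crawled_urls))

sitemap = ['/', '/blog/', '/blog/page/7/']   # e.g. parsed from sitemap.xml
crawled = ['/', '/blog/']                    # e.g. URLs seen in Googlebot hits
print(orphan_candidates(sitemap, crawled))
# -> ['/blog/page/7/']  (the pagination page was never crawled)
```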
Speed is a known ranking factor in SEO, particularly for mobile. Use web logs to identify large pages that are slow to load, fix the performance issues and improve your score in Google PageSpeed Insights.
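One way to surface those pages from the logs, assuming entries parsed from the combined format: rank URLs by the largest response size recorded for them. Byte count is only a proxy for load time (a `%D` response-time field, if your server logs one, would be more direct), and the names below are illustrative.

```python
def heaviest_pages(entries, top=10):
    """Rank URLs by the largest response size seen in the logs.
    Assumes the combined-format size field, where '-' means no body."""
    sizes = {}
    for e in entries:
        if e.get('size', '-') != '-':
            sizes[e['url']] = max(sizes.get(e['url'], 0), int(e['size']))
    return sorted(sizes.items(), key=lambda kv: kv[1], reverse=True)[:top]

# Illustrative sample entries.
entries = [
    {'url': '/gallery', 'size': '2048000'},
    {'url': '/home',    'size': '15000'},
    {'url': '/gallery', 'size': '-'},
]
print(heaviest_pages(entries))
# -> [('/gallery', 2048000), ('/home', 15000)]
```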