Watch our latest video: How to Go Viral on Quora https://bit.ly/ViralOnQuora1

Subscribe to our YouTube channel! http://smr.sh/KdD

In this lesson, you’ll learn how to improve your website’s health with the help of our all-round technical SEO tool – Site Audit.
Watch the full course for free: https://bit.ly/3d7lPIu

0:20 Technical SEO
0:55 Site Audit tool
3:22 Total Score metric
4:10 Thematic reports
5:08 Crawled Pages tab
5:45 Statistics tab
6:47 Compare Crawls report
7:00 Progress tab
7:30 Summary

✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹
You might find it useful:
Apply your newly acquired knowledge by practicing with SEMrush and build up your real-world skills.
Go to Site Audit:
➠ https://bit.ly/2XzKdfj

Get a comprehensive and detailed overview of technical SEO in our free course:
➠ https://bit.ly/2ZCU9Hp
✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹

This is the beginning of our course, and it’s devoted to technical SEO. Technical SEO is the foundation of your entire SEO strategy. It’s all about how your website is built and structured, and how easy it is for search engines to crawl and index your content. Some technical SEO issues might even render your website invisible to search engines, so it’s critical to understand, identify, and fix them.

So, in the first lesson of our module, we will check your website’s health and take steps to improve it. We will streamline this process with the help of our all-round technical SEO tool – Site Audit.

It helps you find, prioritize and fix technical and on-page issues to boost your website’s health and your SEO.

Site Audit
Let’s learn more about Site Audit, and start with setting it up.

In the first step:

Enter your crawl scope. It can be a domain, a subdomain, or a subfolder (see the examples after this list).
Set the limit of crawled pages. You can crawl up to 100,000 pages per audit.
Specify the crawl source. It can be the website itself, its sitemap, or even a custom sitemap. If you want to give your AMP pages priority over other pages during the crawl, check the Crawl AMP pages first checkbox below.
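To make the scope options concrete, here’s a hypothetical illustration (example.com and its sections are placeholders, not taken from the lesson):

Domain: example.com – audits the entire site
Subdomain: blog.example.com – audits only that subdomain
Subfolder: example.com/shop/ – audits only URLs under /shop/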
In the second step:

Choose the user agent that will crawl your site: the mobile or desktop version of either the SEMrush Bot or the Google Bot.
Then set a crawl-delay. We recommend the minimum delay option to maximize auditing speed. If you suspect that our crawler slows down your website, choose the second option instead, which tells the crawler to respect the crawl-delay directive from your robots.txt file. The third option covers the case where you don’t have access to your robots.txt file but still want to avoid a drop in your website’s performance during crawling; with it, the crawling speed is limited to 1 URL per 2 seconds.
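For reference, the crawl-delay directive that the second option respects sits in your robots.txt file and looks like this sketch (the wildcard bot name and the two-second value are just illustrative):

User-agent: *
Crawl-delay: 2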
In the third step:

Apply allow/disallow rules to include or exclude specific parts of your site from the audit. Entering a specific subfolder in the Allow box will narrow down the audit scope to that subfolder. Disallow rules, in turn, tell the tool to exclude the entered subfolder from crawling.
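As a quick sketch, masks for auditing only a blog while skipping its search-result pages might look like this (both paths are hypothetical):

Allow: /blog/
Disallow: /blog/search/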
In the fourth step:

Specify URL parameters you want the tool to ignore during crawling. As a result, URLs with and without those parameters will be treated as the same page.
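For example, if you tell the tool to ignore a hypothetical sessionid parameter, these two URLs would be counted as one page:

example.com/products/shoes
example.com/products/shoes?sessionid=12345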
Finally, in the fifth step:

Schedule automatic audits by setting the preferred day of the week. You can also tick the checkbox above the Start Site Audit button to have the tool email you once an audit is complete. Click this button to start the audit.
Later on, you’ll be able to connect your Google Analytics account. We recommend doing this for the sake of prioritization: your top-viewed pages will show up first in the audit list. Remember, you can always readjust the setup parameters for future recrawls.

Once the report is ready, you will see the tool’s main screen. It comprises:

The Total Score metric, which reflects how widespread issues are across your website.
The crawled pages count, with a breakdown of your pages by status: healthy, broken, with issues, redirected, or blocked.
The Robots.txt Updates widget, which checks the robots.txt file for availability and changes made to it since the previous crawl. It’s important to keep abreast of even minor changes to your robots.txt file, since any issue may damage your rankings.
Three types of issues that prevent your website from getting high rankings or cause a bad user experience.
These types are grouped by their impact on your website’s health in descending order: errors, warnings and notices. We advise you to focus on errors first, since those affect your SEO efforts the most.

#TechnicalSEO #TechnicalSEOcourse #SiteAudit #SEOaudit #SEMrushAcademy
