Tips for Using Google Search Console to Grow Your Website

Google Search Console was previously known as Webmaster Tools. It is a collection of tools, data, and diagnostics that Google provides to help you build a Google-friendly site.

It is the only place to get search engine optimization data directly from Google.

Side note – Bing also offers similar tools at Bing Webmaster Tools.

Google Search Console (formerly Webmaster Tools) is free to use. To make an impact on your website’s SEO, you must first understand it.

Google launched the new Google Search Console in beta in 2018.

It can be difficult to understand at first. This tutorial will show you how to use each feature and give you creative ideas for applying it.

Getting Started

To get started, you’ll need a website. Once your site is linked to Search Console, you can begin working through the reports.


There are many methods to verify your Search Console account. We prefer verifying through Google Analytics because it reduces the number of files and tags you have to maintain.

If you are using WordPress, the Yoast SEO plugin is a great choice. Keep in mind that this verification only remains valid while the plugin stays active.
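For reference, HTML-tag verification (one of the methods a plugin like Yoast handles for you) is just a single meta tag in your site’s head. The token below is a placeholder; Google generates a unique one for your property:

```html
<head>
  <!-- Google generates a unique token for your property; this value is a placeholder -->
  <meta name="google-site-verification" content="YOUR-TOKEN-HERE" />
</head>
```

As long as the tag stays in place, the verification holds; removing the plugin or theme that outputs it will un-verify the property.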

Search Console treats each subdomain and protocol (HTTP vs. HTTPS) as a separate property when verifying accounts.

This matters if your website recently moved to HTTPS. Your data will not be correct if the property you verified in Google Search Console is different from the one Google shows in search results.

Understanding Settings

The new Search Console is simpler and easier to use. Maintain ownership verification and manage users if needed.

You can change your ownership verification method (e.g., if you switch DNS providers or plugins).

Access can be granted to anyone by going to Users & Permissions. This section is useful for website owners and DIYers who want to give a consultant access.

Like other sections in Search Console, the interface is clean and straightforward…once you realize that most things are clickable and sortable. The sort & filter button is the upside-down triangle icon. Click on any section to access more information.

Improvements & URL Inspection

Search Console’s primary purpose is to help you improve your website’s content so it is more useful to Google’s users. The URL Inspection tool and Enhancements tab help you see how Google and your users view your website.

Website problems can be difficult to spot in an environment where every person has their own browser, device, and settings. We will discuss technical troubleshooting in the Coverage section.

Mobile Usability

Google has said that it will remove listings for sites that aren’t mobile-friendly from mobile search results. Users don’t like websites that do not work on mobile devices.

Googlebot will flag any usability problems it finds. These are not website killers; they are more like weights that slow your website down.

Keep in mind that just because your site “has no errors” does not mean it is mobile-friendly.

Google’s Pop-up Glossary is a useful reference tool that can help you understand each element. It lists all the elements of a SERP as well as how you can affect them.

Google will tell you how many pages have been validated.

Once you’ve found a page with an error, you can use the URL Inspection tool to determine why.

URL Inspection

Enter any URL from your website into the URL Inspection tool to see detailed information about how Googlebot perceives and handles the page.

You can also use the URL Inspection tool to submit pages to Google for indexing. The report provides incredible detail about how Googlebot crawled and indexed each page.

You can see how Googlebot pulls the HTML to determine if your site has errors.

Googlebot will render your website visually in the same way as a human browser.

You can also see which resources, such as CSS files, Googlebot is having trouble downloading.

The URL Inspection tool is amazing. Its one drawback is that it doesn’t do bulk processing. For that, we’ll turn to the Coverage and Performance reports.


Performance

The Performance section is the most important in Search Console. It contains all the data you need to optimize for search results and increase organic traffic.

The Performance section replaces what was previously called Search Analytics. It reports four metrics:

Queries: The keywords that users searched for on Google Search.

Clicks: The number of clicks that brought a user to your property from a Google search results page.

Impressions: The number of times a link to your site appeared in Google search results. This includes links the user did not scroll far enough to see.

CTR (click-through rate): The click count divided by the impression count. A dash (-) is displayed if the row has no impressions.

Position: The average topmost position of your site in search results. If your site has three results at positions 2, 4, and 6 for a query, its position for that query is 2 (the topmost). If a second query returned results at positions 3, 5, and 9, its position is 3, and your average position across both queries would be (2 + 3) / 2 = 2.5.
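To make the CTR and average-position arithmetic concrete, here is a minimal Python sketch; the sample rows are invented for illustration, not real export data:

```python
# Sketch of how CTR and average position are derived from per-query rows.
# Each row carries the topmost position the site reached for that query.
rows = [
    {"query": "search console tips", "clicks": 40, "impressions": 1000, "top_position": 2},
    {"query": "webmaster tools",     "clicks": 5,  "impressions": 250,  "top_position": 3},
]

def ctr(row):
    # CTR is clicks divided by impressions; Search Console shows "-"
    # when a row has no impressions, which we model as None.
    return row["clicks"] / row["impressions"] if row["impressions"] else None

# Average position is the mean of the topmost position per query.
avg_position = sum(r["top_position"] for r in rows) / len(rows)

print([ctr(r) for r in rows])  # [0.04, 0.02]
print(avg_position)            # 2.5
```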

To use Search Console effectively, you must adjust the groupings and filters. For example, you can filter by a page and then view the Queries for just that page.

We recommend that you start with a hypothesis and then sort, drill down, and filter as needed. Below are the two analyses we currently run most often.

Learn why a page is losing visitors

  1. Check all the metrics boxes
  2. Select a date range or filter by page
  3. Click through the tabs to look for a suspect query, country, or device

Get New and Revised Content Ideas

  1. Check all the metrics boxes
  2. Filter by page
  3. Click to see Queries
  4. Sort by Impressions
  5. Look for queries that are not directly related to the page but for which it still ranks
  6. Use this data to create or revise a page that answers the query
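The sorting and filtering steps above can be sketched in Python against a set of Performance rows. The sample data and the CTR threshold are assumptions for illustration:

```python
# Find content ideas: queries with many impressions but few clicks on a page.
# Sample rows stand in for a page-filtered Performance report.
rows = [
    {"query": "google search console tutorial", "clicks": 120, "impressions": 3000},
    {"query": "remove url from google",         "clicks": 2,   "impressions": 2500},
    {"query": "search console login",           "clicks": 90,  "impressions": 400},
]

# Step 4: sort by impressions, highest first.
rows.sort(key=lambda r: r["impressions"], reverse=True)

# Step 5: flag queries that show up often (impressions) but rarely get
# clicked (low CTR). These are candidates for new or revised content.
ideas = [r["query"] for r in rows
         if r["impressions"] > 1000 and r["clicks"] / r["impressions"] < 0.01]

print(ideas)  # ['remove url from google']
```

The thresholds (1,000 impressions, 1% CTR) are starting points to tune, not rules from Google.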

This video shows how we think about tabs. The design is slightly different, but the process, data, and tabs remain the same.

Links

Success in Google’s organic search results depends on links. The major problem is insufficient link data.

The new Search Console offers more link data than the old Webmaster Tools. The links are easier to find than ever, and the reports may even be more complete.

You should analyze these reports carefully. They are biased toward link quantity over quality.

A third-party tool such as Ahrefs can supplement this data.

Search Console links are comprehensive and free.

Analyze your Search Console link data, then export it to Excel and combine it with your Ahrefs data.
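One way to combine the two exports is to merge them on the linking site. This sketch uses Python’s csv module with inlined sample data; the column names ("URL", "Domain rating") are assumptions, since both tools let you customize their exports:

```python
import csv
import io

# Two exported CSVs, inlined here for illustration: Search Console's
# "Top linking sites" and a hypothetical Ahrefs export with a quality metric.
gsc_csv = "URL\nexample.com\nblog.example.org\n"
ahrefs_csv = "URL,Domain rating\nexample.com,55\nspammy.example.net,3\n"

gsc = {row["URL"] for row in csv.DictReader(io.StringIO(gsc_csv))}
ahrefs = {row["URL"]: int(row["Domain rating"])
          for row in csv.DictReader(io.StringIO(ahrefs_csv))}

# Pair Search Console links with Ahrefs quality data where both tools
# report the site; None marks links only Search Console found.
merged = {url: ahrefs.get(url) for url in sorted(gsc)}
print(merged)  # {'blog.example.org': None, 'example.com': 55}
```

With real exports you would read the files from disk instead of strings, but the merge logic is the same.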

Let’s have a look at each report.

Google uses links to understand how the Web works.

There are three key points in this section.

First, Google will not give you all of its link data. Professional SEOs and site owners with a budget can use SEMrush’s Backlink Tool, which provides more qualitative and useful data. LinkMiner and SEMrush both offer free trials.

Second, it is impossible to determine exactly how Google uses this data for any given query. Instead, focus on the bigger picture and on diagnosing problems.

Third, this section contains more information than you might expect. Keep clicking to find more.

Here is what you should do when reviewing your Links report.

First, understand which of your content is being linked to.

Second, use the export to manually tag high-quality links and pair them with your SEO tool data.

This data can be used to find similar websites to market your content to. It can also help you determine how spammy your link profile is.

Third, make sure your anchor text tells the right story.

Many website owners find it suffices to look at which content earns links and carry on doing the same thing.

The internal links report is an underutilized advanced tool, especially when combined with data from third-party SEO tools.

This section lets you understand the links on your site as well as how Googlebot crawls it.

This report should be used only to identify outliers.

Sort the report by most and fewest links. Look for pages that are not linked to as often as they should be.

Pages with more internal links are generally crawled more often and treated as more important. Googlebot also weighs the anchor text and context of those links, not just their count.

If you have underperforming content, your internal links might not be telling Googlebot the correct story about which pages are most popular and relevant.

Pages with more links than necessary might sit in an out-of-date tag or category; if so, update your category structure.

Unlike backlinks, which are not easy to earn, internal links are very simple to build.

Manual Actions

Google uses a combination of threats, announcements, and rewards to shape webmaster behavior. This creates better signals for Googlebot.

A Web Spam Team member who discovers unusual marketing or website behavior will let you know via this section.

If you get a message, it is important to act quickly.


Coverage

Google stores copies of your site on its servers and uses these copies to analyze your pages and return relevant search results.

Google Search can be used to understand how Google obtains these copies.

A website that looks good to visitors but has a poor copy in Google’s index will not rank well for relevant searches.

This report will show you exactly what Google has, and whether it matches your website.


Google must have an “indexed” copy of the URL to be able to show it in its search results.

Google won’t give organic traffic for pages that aren’t in its index.

This report lets you verify that Google has indexed all the pages you want indexed.
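That verification can be done with a simple set comparison between the pages you want indexed (e.g., from your sitemap) and the pages the Coverage report marks as valid. The URLs below are invented placeholders:

```python
# Compare the pages you want indexed against the Coverage report's "Valid" export.
want_indexed = {
    "https://example.com/",
    "https://example.com/pricing",
    "https://example.com/blog/guide",
}
reported_valid = {
    "https://example.com/",
    "https://example.com/blog/guide",
}

missing = sorted(want_indexed - reported_valid)     # wanted but not indexed
unexpected = sorted(reported_valid - want_indexed)  # indexed but not wanted

print(missing)     # ['https://example.com/pricing']
print(unexpected)  # []
```

Each URL in `missing` is a candidate for the URL Inspection tool to find out why Google excluded it.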

It is easy to get lost in the data, just as with the Performance report.

Create a hypothesis (e.g. “Some pages don’t perform as well because they were incorrectly excluded”) and then click to drill down for the answers.

The new Search Console provides detailed reporting on errors. You can also see exact URLs, trends, and other information that could help you to match errors with site changes.

The data can also be used to confirm a positive hypothesis, e.g. “My plugin should be excluding low-quality pages from the index.”

The Coverage report has a lot of data. It deserves its own post.

Start with the Coverage Report.


Sitemaps

Sitemaps help search engines crawl your website more effectively. Googlebot crawls your website and looks to your sitemap for guidance.

Sitemaps need to be in XML format and shouldn’t have any errors. Googlebot may ignore a sitemap that contains errors, though it is less ruthless about this than Bingbot.
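A minimal, well-formed XML sitemap can be generated with Python’s standard library; the URLs below are placeholders:

```python
import xml.etree.ElementTree as ET

# Build a minimal sitemap following the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

for page in ["https://example.com/", "https://example.com/about"]:
    url = ET.SubElement(urlset, "url")
    # <loc> is the only required child of <url>.
    ET.SubElement(url, "loc").text = page

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

In practice a CMS or plugin (Yoast, for example) generates this file for you; the point is that the format is simple enough to validate by eye.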

Use this report to find errors that need to be corrected, and to reverse-engineer why certain pages are not being indexed.

Other resources

Google provides many additional resources to help address specific issues.

Search Console’s help section is extensive. Please submit feedback if you have any questions or find anything unclear.

Next steps

Your first step is a verified Google Search Console account. Make sure all versions of your website are verified.

Keep in mind that Google Search Console is a tool; it does not do the work for you. Learn how it works and what the data mean.

Editor’s Note – This article is intended for both professionals and DIY SEOs. Please let me know if you notice any errors or awkward phrasings.
