Though your Google ranking is supposed to go up as your Domain Authority (DA) increases, some factors will likely lower your rankings over time if not managed.
Before we start, why not download my Spike Google Traffic Checklist that outlines all the strategies implemented in this post? Go ahead and get it.
Spike Google Traffic Checklist
Download the free SEO checklist to maximize each blog post for organic traffic
If you’ve ever noticed your Google ranking declining without knowing why, I’ve been there and I’m here for you. That’s why I want to show you how to increase the SEO ranking of your site, using the same steps that worked for this blog.
At one point, my blog had a DA of 18 and was receiving enough steady Google traffic to satisfy a newbie blogger.
After some successful link building efforts, my DA jumped to 22. Yay!
“A higher domain authority meant more traffic,” so I thought to myself.
I remember clicking on the Moz extension in my Google Chrome browser just to see the DA of my blog. I’d leave my blog homepage open in my browser, with Moz proudly displaying my DA, and open a new tab.
When I was done with whatever I was doing online, I’d go back to my homepage and stare at it. Look at the beautiful design of my blog and the DA to back it up.
I was elated that I had crossed the 20-point DA mark. My next goal would be 25, then 30.
But my joy was mercilessly short-lived.
I took a peek into my Google Analytics and saw that my organic traffic had slowly started to dwindle. Surely, this wasn’t what I expected.
This new development pretty much messed up the doctrine I had religiously held on to about DA and Google rankings.
I painfully learned that a higher DA doesn’t always equal higher organic rankings.
As your blog grows, some menacing factors will creep in to ruin your SEO. I had to go through a steep learning curve to identify and fix them.
I’m not going to share general SEO tips here.
I’m only going to show you the exact things I corrected to attain the organic traffic that befitted my new Domain Authority (DA).
This post may contain affiliate links. See the full disclosure here.
How To Improve Google Ranking – The Plain Truth
While creating this guide on how to increase website traffic through Google, I began to see the results of the strategies I’m about to share with you now.
My Google traffic increased as a result of what I did a week before writing this blog post.
I’m cutting out all the fluff you must’ve read online about this topic and giving you an honest, real-world solution to this issue because I know how terrible it feels to see your traffic and rankings dip after devoting your life to the growth of your blog.
How To Increase SEO Ranking Of Your Blog
My Google traffic shot up a week after I completed the steps I’m about to show you.
I’ve included step-by-step screenshots to help you implement these tips on your blog while you read.
Disavow incoming toxic links
For most sites, including mine, unhealthy sites linking to your blog posts could account for 80% of all the SEO issues you have.
Most times, you have no control over who chooses to link to you.
As you keep creating amazing content and growing your blog, you’ll naturally gather healthy links pointing to your blog – as well as toxic links.
My first big win after a period of battling with declining organic traffic was realizing that there’s such a thing as toxic backlinks.
I then identified the bad links pointing to my affected blog posts. A week after I either asked the site owner to remove each link or disavowed it, the rankings picked up.
What if the owner of the site linking to your blog refuses to remove the toxic link?
Google reluctantly came up with a solution for that.
The link disavow tool.
Luckily, with Google’s disavow tool, you can let Google know which links should be associated with your blog when ranking it, and which links should not.
Google’s reluctance about disavowing links shows in how well the tool is hidden: it isn’t linked from Search Console’s menus, so a website owner has to specifically search Google for the link disavow tool to find it.
The disavow tool was created to dissipate the pressure mounted on Google by those in the SEO community affected by the Google Penguin update.
Blog owners who participated in building links through black-hat strategies were the worst casualties of the Penguin update.
Since there was no way to remove unnatural links from all the sites linking to them, they kept asking for a tool to disavow such links.
John Mueller from Google, when asked which kinds of links should be disavowed, said to disavow links that caused a manual action and links that would trigger one if the webspam team were to look at them.
How do you know the links that can trigger a manual action from Google on your website?
Answer: As a beginner, some SEO tools can give you a clue.
I normally use SEMrush to determine which links are toxic, as they examine each link against 30 toxic SEO factors to establish whether it’s harmful to your site or not.
The idea is to generate a plain text document containing all the domains you want to disavow and submit it to Google.
If this is your first time creating a disavow file, you might not know the correct template to use.
Here’s where SEMrush helps us: it not only identifies toxic links but lets us export them in .txt format, in the right order, before uploading the file to the Google disavow tool.
To be fair, the disavow process can be done using a tool like SEMrush or Google Search Console.
For beginners, it’s best to start with a tool until you’ve outgrown it.
Alright, let’s get started.
Disavowing Using SEMrush
The backlink audit tool in SEMrush is free; you don’t need a paid plan to use it.
Head over to SEMrush and type in your domain name. Then click on “Get Insights.”
A pop-up box will come up requiring you to create an account or log in to your existing account.
If this is your first time, you’ll need to create an account, confirm your email and then continue to SEMrush.
In your SEMrush dashboard, make sure you’ve keyed in your domain name in the search bar, then click on the ‘+’ sign beside the “Project” button.
You’ll be required to enter the domain of the project and the project name. Do that and click on the “Create” button.
You’ll be taken to your project dashboard. It contains many great SEO features, but what we need to locate has to do with backlinks.
So, go to the “Backlink Audit” section and click on the “Set Up” button.
A pop-up alert requesting that you specify the scope of the backlink audit will come up.
You can set your backlinks to be analyzed from the root domain, the www version of your site, the non-www version of your site, or both the www and non-www version of your site.
I recommend you tick the box for the “root domain” and start the backlink audit.
Once you start the backlink audit, SEMrush will automatically pull and display some backlinks to your site.
But those are not the complete list of links pointing to your site.
You’ll need to connect your SEMrush account to Google Search Console so that the complete and latest links to your site can be generated.
To do this, go to the “About” tab and click on the “Connect Google Search Console” button.
SEMrush needs to have access to your Google Search Console account, so it can accurately pull all the links to your website and crawl them.
After you’ve allowed SEMrush access, you’ll need to start uploading the backlinks.
Click on “Check connection and upload backlinks.”
The process will take a little while.
When it’s done, the time to start auditing your backlinks begins.
Head over to the “Audit” tab so you can preview all incoming links to your site.
Before we go on, let me tell you some things beforehand.
SEMrush uses thirty factors to rate the toxicity of each backlink to your site.
A red rating of 60-100 means the referring domain is highly toxic, an orange rating of 45-59 means it’s potentially toxic, and a green rating of 0-44 means the link is safe.
While this serves as an excellent clue, you mustn’t rush to disavow any link just because of SEMrush metrics.
You could shoot yourself in the foot if links are disavowed carelessly. Your organic rankings can take a big hit if a good domain passing powerful link juice is mistakenly disavowed.
Before you decide to disavow any link, I advise you to manually visit the site and find out why SEMrush marked the link as toxic or potentially toxic. I’ll show you how as we go on.
Let’s go on.
Scroll down to your backlinks. You’ll see some with a toxic score, potentially toxic score, and good scores.
We’re going to look at all the links, not just the toxic and potentially toxic domains.
My first port of call before disavowing any link is to know why SEMrush flagged the link as toxic or potentially toxic in the first place.
On the right side of every backlink, you’ll see the anchor text, toxic score, and actions to perform on the link such as deleting or moving it to the disavow file.
I’m going to click on the toxic score of a potentially toxic link to know why SEMrush marked it that way.
I clicked on the toxic score of a potentially toxic domain linking to me. It has an orange color toxic score of 52.
The reasons why SEMrush marked it as potentially toxic are “community error, non-indexed domain, spam TLD, and low domain power.”
Community error means that several SEMrush users have already disavowed that particular domain. This is a strong reason why I usually disavow any toxic domains linking to me, though not before visiting the site personally.
Non-indexed domain means that the domain is not indexed on Google, usually due to a penalty.
In this case, I go to Google search and type in “site:domain.com” to see if the website is actually on Google or not.
If it’s not, then it’s probably incurring a Google penalty and it’s best to disavow immediately.
Spam TLD means that the website appears to be a source of spam. To confirm, I’ll visit the site and log in via the Moz extension in my browser to view the spam score.
If the site itself seems shady or doesn’t look right, I’ll factor that in before deciding to disavow.
Low domain power means that the Domain Authority (DA) of the website is low. This in itself is not a reason to disavow a link.
If it’s just an issue of low domain power, I’ll visit the site. If it looks good with quality content and a low spam score, then it’s alright.
There are other reasons why SEMrush marks some links as toxic, but these are the most troublesome.
Disavowing URLs and disavowing domains
Sometimes, you might find yourself in a position where you want to disavow a particular URL of a domain, not the entire domain.
Maybe the domain itself isn’t shady; just that one URL displays a 404 error or a redirect.
You disavow the URL alone so that if the website links to you later in another blog post, you still get the full SEO benefit of that link.
In that case, SEMrush provides an option to disavow the URL.
Here’s an example of a URL I want to disavow. I want to disavow just the URL because the website links to me on other pages.
At the right-most side of the link, I click on “Delete” and set it to URL, then I click the “To Disavow” button.
You can either choose to disavow the entire domain or just the URL, after which you move it to the disavow file.
Now, you can go through the list of all your backlinks and determine which of them needs to be disavowed.
Don’t limit your disavow focus to just the toxic and potentially toxic links SEMrush gives you.
I’ll advise that if you have the time, go through all your links individually, clicking on the toxic score to know why SEMrush marked it the way it is, and manually visit suspicious sites.
Link Disavow Format
SEMrush automatically displays your link in the correct format for disavowing as specified by Google.
Google gave guidelines regarding how domain and URLs should be disavowed.
A domain is disavowed in the following format:
URLs are disavowed in this format:
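Per Google’s published guidelines, the disavow file is a plain text document with one entry per line; lines starting with # are comments, and example.com below is only a placeholder:

```text
# Disavow an entire domain (and all pages under it):
domain:example.com

# Disavow a single URL:
https://example.com/spammy-page/
```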
When you download your disavow file, your domain and URLs will appear in this same format.
Downloading disavow file and uploading it to Google Disavow tool
When you’re done sending all unwanted backlinks to the disavow file, you can export the file as a .txt document.
Scroll up to the “Disavow” tab and click on the “Export to TXT” button.
Your .TXT file will be downloaded in a plain text document like Notepad.
Now, you have your disavow file ready to be submitted to Google’s Disavow Tool.
Disavowing using Google Search Console
An alternative way to disavow your links is by using Google Search Console. This method is for experts who know exactly what they’re doing and what to look out for.
Head over to Google Search Console and scroll down to the “Links” section at the left.
Then go to the top linking sites section and click on the “more” text link under it.
You’ll see a list of all the links Google records as belonging to your website.
The thing about this method is that you have to manually view each site and decide if it’s worth keeping or disavowing, and that can be time-consuming.
To view each domain linking to you, click on it once and you’ll see the page on your website that’s being linked to.
You’ll see the exact page of your website that the link is directed at.
Click on the page(s) to show you the exact page where you can find the link on the linking site.
Now, you can hover above it and click on the little arrow button that appears. It’ll direct you to the page where the link to your site is found.
When you get to the website linking to you, some things to check for are duplicate content, how thin the overall page content is, annoying redirects, how related the linking page is to your website, and your gut reaction to the site.
Make a list of each website you want to be disavowed and include it in your disavow file with the format I showed you earlier.
Submitting The Disavow File
The next step is to upload the disavow file to the Google Disavow Tool.
Go to the disavow tool and click on “Disavow Links.”
You’ll be taken to a warning page where Google says the tool is an advanced feature and should be used with caution.
Once again, click on the “Disavow Links” button.
Now you can submit your disavow file. Click on the “Choose File” button to browse for the downloaded file, then go ahead and submit it.
You’ll usually see results in your organic traffic 4-5 days after submission.
Pro Tip #1: From time to time, head back to your SEMrush account and re-crawl the latest links to your site from your Google Search Console so you can evaluate the toxic ones and disavow them.
In the backlink audit tool of SEMrush at the top-right corner, you’ll see a “Re-run campaign” button.
If you click on it, it’ll start re-crawling new backlinks from your Google Search Console account.
You’ll have to be patient because it takes time to crawl.
Pro Tip #2: Re-evaluate your disavowed links occasionally so you can spot the ones that are now in good standing with Google.
Occasionally, some sites penalized by Google will do what they need to do to remove the penalty.
And there’s no way you would know.
Meanwhile, they are sitting comfortably in your disavow file.
Once in a while, it’s good to go through your disavowed links in SEMrush so you can see if there’s any difference to the linking domain.
If you find that you no longer want to disavow a particular domain, all you need do is to open your disavow file and delete that domain.
Then you upload the updated disavow file to Google Disavow Tool.
Pro Tip #3: Don’t disavow frequently.
Google doesn’t seem to like continuous use of the disavow tool, as John Mueller has implied.
Even when you crawl your backlinks regularly, disavow only those links that you think are toxic, such as paid links, non-indexed domain links, links from spam forums, links from pages with duplicate content, and any link that looks suspicious to you.
Disavowing too frequently can harm your rankings. The best interval to use the disavow tool is every 6-12 months.
It’s okay to earn links from sites with foreign languages and slightly different niches. Don’t fret about every strange backlink you have.
Anchor text of links to your blog posts shouldn’t be your target keyword
…or similar to your target keyword.
Exact match anchor texts send a strong signal to Google that’s interpreted as a black-hat link building tactic, thanks to the Penguin update.
While I was doing my backlink audit on SEMrush, it flagged a backlink to my site as potentially dangerous because it contained an anchor text that was similar to the keyword I was trying to rank for.
I was trying to rank for the term “content marketing statistics” and the anchor text of the backlink to the post was “content marketing.”
Here is the screenshot:
Before this, I also noticed that the ranking position of that blog post dropped by a few spots.
SEMrush marked the anchor text warning as “money,” meaning that the anchor text contains a non-branded keyword that the site is trying to rank for in search results.
After I contacted the website owner to change the anchor text, the blog post returned to its original position in a month.
Here’s something else to take note of.
The anchor text of internal links on your site is just as important as that of backlinks.
If the anchor texts of the internal links on your site all contain the exact keywords you’re trying to rank for, Google will interpret it as unnatural.
Most times, you have no control over the anchor text of sites linking to you.
Aside from guest blogging, the only other situation where you can choose your anchor text is during link exchanges.
A smart SEO strategy to use when picking your anchor text is to use a variation of your target keyword.
Either that, or you can use a text that’s related to what the target page is all about.
Mark sponsored post and affiliate links with the appropriate link attribution
It’s very easy for bloggers and influencers to leave dofollow links in sponsored posts, thinking that Google won’t notice.
But it’s simply a matter of time before Google spots the trend and kicks your blog out of search results.
Protecting your blog from a penalty is better than earning a couple of bucks.
The long-term interest of your blog should come before a paid post, not the other way around.
In 2005, Google introduced the nofollow attribute as a means of identifying sponsored links, affiliate links, and those gained through advertisements.
A lot of water has passed under the bridge since that time.
The web has changed.
Google has evolved.
The SEO community woke up on September 10, 2019, to the news that Google had introduced new link attributes to identify the nature of different links.
- rel=”sponsored”: This new attribute is to be used to identify links that are part of sponsorships or advertisements.
- rel=”ugc”: The User Generated Content (UGC) attribute is suited to flagging links created by independent users, such as in comments and forum posts.
- rel=”nofollow”: The nofollow attribute is used when you want to link to a page without passing a vote of confidence and SEO juice to that page.
What if you want to use more than one attribute, is it possible?
Yes, you can. For instance, rel=”sponsored nofollow” tells Google that the post is sponsored and doesn’t pass ranking credit to the linked page.
By default, all links are dofollow.
Changing the attribute value to nofollow, sponsored or ugc isn’t hard at all.
Here’s how to do it.
Head over to the editor for your blog post and switch from the “Visual” tab to the “Text” tab, which shows the HTML version of your post.
Locate the link whose attribute you wish to modify; the Text tab will take you to the exact spot where the link is placed.
Here’s an unmodified link:
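For illustration, a plain, unmodified link in the Text editor looks something like this (the URL and anchor text are placeholders):

```html
<a href="https://example.com/">example anchor text</a>
```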
Now, I want to make it nofollow. So I’m going to place the rel=”nofollow” attribute between the actual “href” link and the anchor text.
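Using a placeholder link as an illustration, the nofollow version looks like this, with the rel attribute sitting inside the opening tag after href:

```html
<a href="https://example.com/" rel="nofollow">example anchor text</a>
```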
You can switch back to your visual editor. Don’t forget that you can also include more than one link attribute like rel=”nofollow ugc” in it.
In some cases, you might find that a link has the rel=”noopener noreferrer” attribute. Something like this:
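For example (placeholder URL again), WordPress typically adds these attributes to links that open in a new tab:

```html
<a href="https://example.com/" target="_blank" rel="noopener noreferrer">example anchor text</a>
```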
All you need to do is add “nofollow” at the end. It should look like, rel=”noopener noreferrer nofollow”.
The nofollow works the same way alongside the other attributes. Update the blog post when you’re done.
That’s how the nofollow, sponsored, and ugc attributes are added.
Optimize robots.txt file
Every website ought to have a robots.txt file.
The robots.txt file tells Google and other bots which pages of your site to crawl and which ones shouldn’t be crawled.
Google makes use of crawl rate limit and crawl budget when crawling and indexing sites.
The crawl rate limit is the highest rate at which Google can fetch a site while the crawl budget is the number of URLs that Googlebot can crawl and index.
The idea behind using a robots.txt file is to stop Google from spending crawl budget on unimportant or duplicate pages of your site and instead focus it on the content that provides real value: the blog posts that need to be crawled.
This is a little SEO hack many website owners know nothing about.
You might want to find the robots.txt file of your site.
Simply type in yourdomain.com/robots.txt into a web browser and you’ll see your file.
If it shows you a 404 error or an empty file, you should fix it immediately.
Most websites usually have a default robots.txt file created upon installation. To modify it, you’ll just need to go into your cPanel, open the robots.txt, and modify the file.
WordPress sites are an exception.
Though the robots.txt file is created on installation, it is virtual. You won’t find it in your cPanel.
If you’re a WordPress user, you’ll need to create a new robots.txt file. This way you can optimize and modify it the way you want to.
First, let’s create our robots.txt file.
Use a plain text editor like Notepad when creating it. I’m going to give you the code for my robots.txt file.
In your plain text editor, paste the following code especially if it’s for a WordPress site:
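As a starting point, here’s a common WordPress-oriented robots.txt (a sketch, not a one-size-fits-all file): it blocks the admin area but keeps admin-ajax.php crawlable, since front-end features can depend on it. The sitemap line is an assumption about your setup, so replace yourdomain.com with your own domain, or remove the line if you have no XML sitemap:

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourdomain.com/sitemap_index.xml
```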
Save the file as robots.txt. Remember to add the “.txt” extension when saving it; this is very important.
Now, you can upload it to the file manager in your web hosting cPanel. Go to your file manager.
Tick the box for web root (public_html) and click on the “Go” button.
Then you’ll need to upload your robots.txt file. Click on the “Upload” button.
Click on the “Choose File” button so you can browse through your local computer to get your robots.txt file.
Upload it and go back to your public_html file manager.
You should see your robots.txt file among other files in your public_html folder now.
And if you go to the web address yourdomain.com/robots.txt, you’ll notice that its contents now match the file you uploaded to your cPanel.
It’s a healthy practice to check out the robots.txt file of other blogs, especially those ranking in first positions of Google for competitive keywords.
That way, you’ll always know what’s working and what’s not working.
Make sure your content is the best out there
“Content is king” might have been rehashed too often that it’s now hackneyed, but there’s still lots of SEO truth behind that ancient blogging phrase.
The two heavyweight factors Google considers when ranking posts on search results are links and content.
Nothing beats resonant content that users love, share, bookmark, and come back to later to peruse slowly.
SEO is no longer focused solely on keywords and backlinks, it has now gravitated towards creating an optimal experience for Google users.
Here’s something to practice before writing your next blog post.
Check out the search results on the first page for your target keyword. If possible, open all of them in new tabs.
You’re going to beat all their records with your blog post. If they didn’t explain a concept in simple terms, do it.
If the topic calls for step-by-step examples, by all means, provide it and add visuals too.
If it’s okay with you, embed a video.
Copy each competing blog post into a word processor like Microsoft Word and check the word count.
If the word count is 2,000 words, then write a 4,000-word blog post.
If it’s already a 5,000-word post, then make yours 6,000 words and above.
I know this sounds like a herculean task, but when that blog post shoots up the rank in search results, it’ll be well worth the effort.
“What the heck am I going to write that’ll take up 5,000 words and above?”
As content marketers who already practice this rule of outdoing each other will tell you, scratch beyond the surface of your topic and go deep.
As you begin to cover every little area of your topic, your word count will begin to build.
When you’re done, your blog post should contain everything that a reader needs to know concerning the topic.
A reader doesn’t have to go back to the search results and look for something that was mentioned in your blog post.
That’s what’s called an in-depth guide.
Pro Tip: Producing great content is better than cranking out ordinary, average posts. But you know what’s best? Creating linkable content.
Great content + link attraction = best content.
The best content is the one that magnetizes links on its own. Think of sources, surveys, industry studies, and anything that’ll require someone to reference your content.
Create content based on search intent
Understanding search intent is the new rule for SEO.
A major reason why some blog posts fail to show up in search results is that they were written for the wrong search intent.
Google has gotten pretty good at understanding the why behind a search keyword.
For example, a search query for “countertop ice maker” doesn’t show informational blog posts explaining the function and uses of an ice maker.
Instead, it shows product pages of different portable ice makers for sale. E-commerce brands like Amazon and Walmart dominate the search results for this keyword.
Google understands that anyone who searches for that query does so with commercial intent, ready to buy on the spot.
Writing a blog post on it and hoping it’ll crack the first page for this keyword is almost impossible.
Let’s take a look at this keyword, “sew a button” and we’ll quickly observe that the first page is filled with How-To posts.
The search intent behind “sew a button” is purely informational.
Trying to squeeze a button-sewing landing page into the first page of Google search results for that keyword will be almost fruitless.
From this analysis, it’s obvious that you can get to the first page of Google for your target keyword quickly if you correctly discern the search intent of that keyword.
It’s easy to understand the search intent behind most keywords.
However, you might come across some seed keywords that can prove tricky to understand.
In that case, it’s best to plug the keyword into Google search and take a cursory look at the search results displayed.
The organic results are a pointer to Google’s understanding of that keyword intent.
When understanding the search intent of a query, following the trend of the organic listings is usually the best and easiest way to go.
Remember meta descriptions
Meta descriptions, while not a ranking factor, help boost the click-through rate (CTR) of your blog posts and homepage.
Here is a meta description Facebook used for their homepage:
A meta description is the organic real estate Google gives you to explain what the entire content of a web page is and why readers should click through to it.
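Under the hood, a meta description is just a tag in the page’s head section; here’s a generic sketch with placeholder text:

```html
<head>
  <meta name="description" content="A short, compelling summary of the page that includes your target keyword.">
</head>
```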
A well-thought-out meta description could mean the difference between a Google user lumping your blog post in with the others, or feeling that you speak to their need accurately and clicking through.
However, Google doesn’t use the meta descriptions all the time. Sometimes, a related text from your content is pulled and used as the meta description.
Nevertheless, Google and top SEO experts encourage the use of meta descriptions.
If you have a WordPress site, creating meta descriptions is a walk in the park.
All you need to do is install the Yoast SEO plugin.
Then open the post that you want to insert a meta description and scroll down past the visual editor to the Yoast SEO section.
Click on the “Edit snippet” button.
The field where you can insert your meta description will appear. Make sure to include your target keyword in your meta description.
Click on the “Close snippet editor” button when you’re done.
Make sure you update the blog post so it can reflect the changes you’ve made.
That’s how easy it is to add a meta description.
Optimize old blog posts for greater traffic
It’s no news that Google prefers to deliver relevant and updated content to users while relegating stale content in the SERPs.
Sometimes, plain updating an old blog post can revive its dying traffic.
Other times, you need to do more than just updating a post that’s losing ground in the SERPs.
I once wrote a blog post that targeted a low-competition keyword. For a while, it ranked well, until I noticed in my Google Analytics that traffic to the post had become almost non-existent.
Well, I didn’t understand what was happening too well at the time. I was still a newbie blogger, so I conveniently ignored the issue.
It wasn’t until recently that I looked back at that post with “expert” eyes and conducted an SEO audit on it.
I quickly discovered that the keyword I was targeting ‘disappeared’ from Google Keyword Planner.
The keyword simply wasn’t there anymore.
So the post was only ranking for related keywords.
It happens that way at times. So I changed the target keywords for that post and integrated other Latent Semantic Indexing (LSI) keywords into it.
I also checked the keywords that the post was ranking for but wasn’t mentioned in the body text and title. I included those too.
Two weeks later, the traffic to that post picked up. I learned new things that day and I’m going to share them with you.
First, I learned that old blog posts are responsible for generating the most traffic to a website.
And also, an SEO audit should be carried out on them every 6-9 months.
Let’s break down the steps to optimize an old blog post.
Is your target keyword still active?
Search for your target keyword in Google Keyword Planner to know if it’s still there. It’s easy to skip this part.
Does the blog content need updating?
If the information in your blog post has become outdated, it’s best to revamp it.
While giving the content a makeover, make it longer and more comprehensive.
Infuse LSI keywords into the old post
Google takes meaning into consideration when returning results for search queries, rather than just matching the keyword verbatim.
LSI keywords are related search terms that surround a particular keyword.
Making these associated phrases a part of the content helps Google to understand the blog post better, and will help it rank for a wide variety of similar queries.
One way to get different LSI keywords for your target phrase is to type in your keyword in Google search. Then scroll down to the “searches related to” section.
Another way is to plug the keyword into Google Keyword Planner and browse through the keyword ideas generated.
Don’t forget to use your target keyword for your blog post title, URL, headers, and 2-3 times in the body of your post.
Amp up traffic from related keywords
Naturally, your blog post is going to rank for keywords that you didn’t even mention.
Keywords that you know nothing about.
With the Google Search Console tool, you can find those keywords and work them into your blog post. Doing this, you’ll be better positioned to rank for them with more traffic as a bonus.
Let’s find those unknown keywords.
Head over to Google Search Console and click on the “Performance” section. Then scroll down and go to the “Queries” tab.
Under the “Queries” tab, you’ll see a list of keywords that Google users search for to get to your site.
Among them are keywords that you weren’t even trying to rank for, but some of your blog posts show up for them on Google search results.
Now, you’re going to identify those keywords and the exact pages they direct traffic to, and then naturally slot in those keywords.
If you see a particular keyword you don’t remember targeting, click on it once then go to the “Pages” tab to see the blog post ranking for it.
In the “Pages” tab, I have two blog posts ranking for the keyword “what is a good page load time.”
This ranking is purely natural because I had no intention of targeting that keyword.
If I want to rank for this keyword, I can insert it as an h2 heading, “what is a good page load time?” inside those blog posts.
Then I can answer the question in a concise way, increasing the possibility of it showing as a featured snippet for the question keyword.
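In the post’s HTML, that addition might look like this sketch (the answer text is a made-up illustration, not a verified figure):

```html
<h2>What is a good page load time?</h2>
<p>Your concise answer goes here, ideally in one or two sentences that
directly address the question.</p>
```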
Repeat the process for all unknown keywords you want to intentionally rank for.
That’s how you do a quick SEO audit on old blog posts for more traffic.
Request Google indexing for new and updated posts
After you’ve updated your old posts, it might not be reflected in Google search results.
If you do nothing, it can take 2-3 weeks before Google re-crawls the optimized post.
That’s a lot of time to waste.
Right inside your Google Search Console, you can request indexing of any post you’ve made changes to.
In a few days, the changes will be reflected in organic results.
At the top of your Search Console account, there’s a search bar with a text in it saying, “Inspect any URL in https://www.yourdomain.com.”
Type in the URL of the blog post you’ve just updated into it.
You can also type in the URL of a just-published blog post that hasn’t been crawled by Google yet, so it can be indexed faster.
Copy and paste the URL into the search bar and press the “Enter” button on your keyboard.
You’ll be taken to a page where you can submit your request for indexing. Click on the “Request Indexing” text link and Google will start the process.
In 2-3 days, your updated page will be reflected in Google search.
A bonus point to increasing your SEO is to work on your website’s speed. Google uses page speed as a ranking factor now.
A week after you’ve implemented all the strategies contained here, you should see some improvements in your Google traffic.
Want something handy to reference when carrying out these tactics on your blog content?
Then download my Spike Google Traffic Checklist and go make that organic traffic shoot up.