Ultimate Guide: How to Restore Website Traffic After Editing the .htaccess File
Hey! Today we'll talk about a rather narrow and specific topic that is nonetheless useful and important.
I have met a lot of people with this problem, yet there is almost no material about it online. I decided to fill that gap: I will try to answer the questions site administrators ask most often and give concrete steps on what to do. Let's start from the beginning.

.htaccess File
The .htaccess file, short for "Hypertext Access," is a configuration file the Apache web server uses to control various aspects of website functionality. It is a powerful tool that allows website administrators to define rules and directives for their web server, such as URL rewriting, access control, redirection, and more. With the .htaccess file, you can modify server settings per directory without directly editing the main server configuration file. This provides flexibility and allows for the customization of specific directories or files within a website.
In simple terms: with this configuration file you set the main, default URL for your site. For example, if a user enters your website address as www.example.com, and you have chosen example.com as the default (in the .htaccess file), the user will be redirected to the default address (example.com).
The presence and proper setup of this file are a big plus for search engines and have a positive effect on SEO, because they ensure the user does not get lost and lands on the correct URL the first time. But what if you hadn't thought about which URL to choose (www or non-www) until now, and have only just made changes to the .htaccess file?
Traffic Dropped
If you have made changes to the .htaccess file, or just created one without checking it further, it may happen that every page listed in your sitemap gets a duplicate. For example, if your sitemap contains the page https://example.com, and in the .htaccess file you have specified https://www.example.com as the default URL format, then for every page such as https://example.com/blogs/blog1.html a copy will appear in the search engine as https://www.example.com/blogs/blog1.html.
So what? The pages are still successfully indexed and present in the search engine. The problem is that for Google these two hosts (https://example.com and https://www.example.com) are two completely different sites, which means that to the search engine's algorithms you appear to be copying content from someone else's site, and your page is treated as a duplicate.
For those who don't know, duplicate pages violate Google's policy, which means that under these conditions you will receive an algorithmic penalty for copying the content of someone else's page (even though you didn't). I agree, it is an unpleasant situation, but it can and should be fixed before Google indexes a copy of every one of your pages, exactly doubling their number.
Understand and Track It
There is nothing complicated here. The first thing to look at is your traffic graph. It drops very sharply, to almost zero. This means you have received an algorithmic penalty: your pages sit at monstrously low positions in the search engine and no one sees them. But don't panic, we'll fix it now.

Impressions Dropped After htaccess File Changes
The second thing to check is the "Pages" tab in Google Search Console. If the number of indexed pages is greater than the actual number of pages in your sitemap file, then each of your pages has been duplicated, as described above, and this is another sign that the problem is in the .htaccess file.
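To know what "the actual number of pages in your sitemap" is, you can count the URLs locally before comparing it with the indexed count in Search Console. A minimal sketch, assuming a standard XML sitemap; the file name and its contents below are made up purely for illustration:

```shell
# Create a tiny example sitemap (illustrative only).
cat > sitemap.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blogs/blog1.html</loc></url>
</urlset>
EOF

# Count the <loc> entries: this is the number of pages you expect to see indexed.
grep -c '<loc>' sitemap.xml
```

If Search Console reports roughly twice this number, you are almost certainly looking at the www/non-www duplication described above.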
Action Algorithm
Now that we have figured out what this problem is and how to track it down, we can move on to getting rid of it. There is nothing complicated here:
- Decide on the URL format. Choose how you want your URL to appear. The choice itself does not affect SEO in any way; to be honest, it does not affect anything at all. What matters is making the choice once and for all and reflecting it in the .htaccess file.
For the www version (https://www.example.com), the file will have the following format:
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
For the nowww version (https://example.com), the file will have the following format:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com [NC]
RewriteRule ^(.*)$ http://example.com/$1 [L,R=301]
- Remove all unnecessary pages from the search engine. If you have chosen https://example.com as the default form of your URL, open the list of all indexed pages in the "Pages" tab of Google Search Console and find every page in the https://www.example.com format. We need to get rid of them. Copy these URLs into a separate file; we will remove them next.
Detecting All www Indexed Pages
Go to the "Removals" tab and submit a request to remove these pages from the search engine. For this, we only need the URLs we copied in the previous step. The request is usually processed quickly (tearing down is easier than building), so after a while you can check how successful this step was by entering a command in the Google search box: site:https://www.example.com/blogs/blog1.html. If some pages still appear, either we have not deleted them all, or we just need to wait longer.
Removing URL From Search Engine
- Generate a new sitemap file. Once we have gotten rid of the pages that earned us the penalty from Google, we must make sure they are not indexed again. For this, either generate a new sitemap file or clean the old one, removing all the URLs we don't need (https://www.example.com or https://example.com, depending on what you chose in the first step).
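If you decide to clean the old sitemap rather than regenerate it, one quick way to strip the unwanted host is a plain text filter. A minimal sketch, assuming you chose the non-www version and that your sitemap.xml keeps each <loc> entry on its own line (the file names and contents here are illustrative):

```shell
# Example sitemap containing entries for both hosts (illustrative).
cat > sitemap.xml <<'EOF'
<url><loc>https://example.com/blogs/blog1.html</loc></url>
<url><loc>https://www.example.com/blogs/blog1.html</loc></url>
EOF

# Keep only the non-www entries and write a cleaned copy.
grep -v '://www\.example\.com' sitemap.xml > sitemap.clean.xml
cat sitemap.clean.xml
```

Note that this line-based filter is only safe when each URL sits on its own line; for a real, pretty-printed sitemap a proper XML tool is the safer choice.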
Submitting the New Version of the Sitemap in GSC
- Clean up all internal links. Now pay attention to the site's internal navigation and fix any internal links whose address is no longer valid. To be honest, I have not run into this myself, because on my site redirects follow a unique path based on the hosting's file architecture; still, if you link to your other pages this way, it is worth checking.
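To find leftover hardcoded links to the wrong host, a recursive grep over your site's files works well. A sketch, assuming non-www is your canonical choice and that the site's files live in a directory called site (both the directory name and the sample page are assumptions for illustration):

```shell
# Set up a sample page containing a stale hardcoded www link (illustrative).
mkdir -p site
cat > site/index.html <<'EOF'
<a href="https://www.example.com/blogs/blog1.html">Blog 1</a>
EOF

# List every file that still links to the www host.
grep -rl 'https://www\.example\.com' site
```

Each file the command lists needs its links updated to the canonical host; when the command prints nothing, the internal navigation is clean.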
Waiting and Monitoring
Then all that remains is to wait for your site to be restored to its former high positions. How long this will take, no one knows. It may take anywhere from several weeks to several months, depending on the following factors:
- Size of the Website: The larger the website, the longer it may take for Google to recrawl and reindex all the pages.
- Severity of the Penalty: The extent of the penalty and the violations that led to it can affect the recovery time. More severe penalties may take longer to recover from.
- Implementation of Fixes: How quickly and effectively you implement the necessary fixes, such as 301 redirects and updating internal and external links, can impact the recovery process.
- Crawl Rate and Frequency: Google's crawl rate and frequency for your website can influence how quickly the changes are detected and processed.
- Reevaluation by Google: After implementing the necessary changes, Google needs to reevaluate your website and determine if it complies with its guidelines. This evaluation process can also impact the recovery time.
- Competition and Niche: The competitiveness of your industry and the niche your website belongs to can play a role in how quickly you see improvements. Some niches may have more competition and stricter guidelines, which can prolong the recovery process.
Personally, my recovery took about two weeks, and I can say from my own experience that those were very painful weeks of waiting. So I can only advise you to be patient, and to monitor your traffic graphs and indexed pages every day so that you notice right away if something goes wrong again.
Conclusion
That's the guide I came up with. Today we figured out how to solve a small but very nasty problem with the .htaccess file and restore your search engine traffic. The steps, like the problem itself, are elementary and simple; nevertheless, it is worth thinking the consequences through, because a few characters in this configuration file can cost you months of traffic, which is unpleasant.