
How Many htaccess Redirects Are Too Many?

I hear it pretty often, “Don’t add too many redirects to htaccess or you’ll slow down a site!” Whenever I hear that I say to myself “Good point – but how many are too many?!” I’ve never encountered anyone with a good answer to that question, and I know it can depend on your server too, but I wanted to really test this out and see if there would be a predictable reduction in performance, or if there was a point at which performance just went in the toilet.

The htaccess file is read every time any resource is requested. Each time a URL is loaded, the server checks the htaccess file to see if the URL is covered by a rule. The file is not cached in memory; it is read from disk and parsed sequentially. The server works down the list looking for something that applies to the URL. If a rule applies and it carries a "last" flag ([L] in mod_rewrite, meaning read no further), this is where the server stops reading the file and follows that rule. If there is no last flag, the server continues down the list to see if anything else applies.
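To make that concrete, here is a minimal htaccess sketch (the URL patterns are invented for illustration) showing the top-to-bottom evaluation and the [L] flag:

```apache
RewriteEngine On

# Rules are evaluated in order on every request.
# The [L] ("last") flag means: if this rule matches, stop reading further rules.
RewriteRule ^blog/(.*)$ /news/$1 [R=301,L]

# Only reached when the rule above did not match the requested URL.
RewriteRule ^shop/(.*)$ /store/$1 [R=301,L]
```

A request to /blog/anything/ is redirected by the first rule and never touches the second; a request to /shop/widgets/ falls through to the second rule.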

I’ve always heard 1:1 redirects are detrimental to site speed. A 1:1 redirect is when you write an htaccess rule that says “Redirect 301 /1stURL/ /2ndURL/”. Since each line is a single redirect, you can end up with a long list in htaccess, which must be parsed.

Top tip for using 1:1 redirects: place them last in the file. If a request is already handled by a regex statement with a last ([L]) flag higher up, the 1:1 rules below it won't be parsed for that request. Problems are easier to diagnose that way, too.
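A sketch of that ordering, using the same Redirect syntax as above (all URLs here are placeholders):

```apache
RewriteEngine On

# Broad regex rule first: its [L] flag short-circuits everything below it
# for any request it matches.
RewriteRule ^category/(.*)$ /products/$1 [R=301,L]

# 1:1 redirects go last, so they are only reached when no regex rule matched.
Redirect 301 /1stURL/ /2ndURL/
Redirect 301 /old-contact/ /contact/
```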

Since I don’t like to repeat things without testing them, I decided to see if I could bring my site to its knees by adding lots and lots of redirects. I added them by the thousands and tested the site’s speed. The thing most directly impacted by the addition of huge numbers of redirects is something called Time To First Byte, or TTFB. TTFB is the time it takes for a server to get its stuff together and send something to a client’s browser. This includes checking htaccess to see if there are any rules that apply to the URL the user requested, which makes TTFB a good indicator of the impact of a large htaccess file. There are other things that could be impacted, but for the sake of isolating the data, we’ll rely on TTFB for this test.
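For anyone who wants to reproduce the setup, here is a rough sketch of how you might bulk-generate thousands of throwaway 1:1 rules to append to a test copy of htaccess. The /old-page-N/ and /new-home/ URLs are invented placeholders, not the ones I used:

```python
# Hypothetical helper: bulk-generate 1:1 redirect rules for load testing.
def make_redirects(n, src_prefix="/old-page-", dest="/new-home/"):
    """Return n mod_alias-style 'Redirect 301' lines, one rule per line."""
    return "\n".join(
        f"Redirect 301 {src_prefix}{i}/ {dest}" for i in range(1, n + 1)
    )

if __name__ == "__main__":
    # Write 10,000 rules to a file you can paste into a test htaccess.
    with open("htaccess_test_rules.txt", "w") as fh:
        fh.write(make_redirects(10_000))
```

Don't run an experiment like this on a production htaccess file; use a staging copy you can roll back.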

Baseline TTFB with my existing htaccess file: 85.3ms – 115ms
Baseline homepage load time with my existing htaccess file: 4.95s – 5.4s
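If you want to take TTFB readings yourself, a simplified stdlib-only Python sketch might look like the following (host, port, and path are placeholders; real tests should use HTTPS and average many runs, since TTFB is noisy):

```python
# Rough TTFB sketch: time from sending the request until the first
# byte of the response body arrives.
import http.client
import time

def measure_ttfb(host, port=80, path="/"):
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()  # blocks until the status line arrives
    resp.read(1)               # first byte of the body
    ttfb = time.perf_counter() - start
    conn.close()
    return ttfb
```

For example, `measure_ttfb("example.com")` would return the elapsed seconds as a float.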

Additional htaccess Redirects | TTFB (ms) | TTFB % Difference | Page Load Time (s) | Load Time % Difference
1,000   |  90 |   5.7% | 5.0 |  1.005%
2,000   |  85 |   0%   | 5.0 |  1.005%
5,000   | 110 |  25.6% | 5.3 |  6.8%
8,000   | 141 |  49.5% | 5.2 |  4.9%
10,000  | 112 |  27.4% | 5.1 |  2.9%
12,000  | 118 |  32.5% | 5.1 |  2.9%
15,000  | 135 |  45.5% | 5.3 |  6.8%
20,000  | 143 |  50.8% | 5.2 |  4.9%
25,000  | 185 |  74.1% | 5.1 |  2.9%
30,000  | 171 |  67.2% | 5.1 |  2.9%
40,000  | 188 |  75.5% | 5.2 |  4.9%
50,000  | 214 |  86.3% | 5.4 |  8.7%
100,000 | 368 | 124.5% | 5.4 |  8.7%
200,000 | 685 | 155.8% | 5.8 | 15.8%

As you can see in the table above, I eventually started adding redirects in increments of 5,000 and then 10,000, because the impact on load times was negligible and I didn’t have all day to run speed tests. TTFB climbed fairly consistently as the redirects piled up, but it didn’t land clearly outside the baseline range (85.3ms – 115ms) until more than 10,000 redirects were in the htaccess file. Total page load time was far less predictable: it wasn’t meaningfully impacted until we got to about 30,000 redirects, stayed within the baseline range (4.95s – 5.4s) all the way through 100,000 redirects, and only clearly exceeded it at 200,000.

[Chart: How many htaccess redirects are too many? The impact on page load times from htaccess redirects.]

The number of redirects it took to have a meaningful impact on the load time of my homepage was so ridiculous that it’s just not reasonable to say “Don’t use 1:1 redirects in an htaccess file because it’ll slow down the site.” I’ve shown here that until you get to an absurd number of redirects, the impact is negligible. Optimizing a few images on the homepage that I’m too lazy to “getaroundto” would completely offset the additional load time 100,000 redirects added.