The connection most people miss
When business owners think about page speed, they think about user experience — nobody likes a slow website. That's true, but for local service businesses, there's a second consequence that's less obvious: slow websites get crawled less often by AI systems, which means they get recommended less.
This is not a theory. It's how web crawlers work, and it applies to every AI system that uses real-time web search to generate recommendations.
How AI systems discover businesses
AI systems like Perplexity and ChatGPT's browsing mode don't have a static database of every business in the world. They discover businesses by crawling the web — sending automated requests to websites, reading the content, and storing what they find.
Crawlers have a budget. They can only visit a certain number of pages per day, and they allocate that budget based on how efficiently a site responds. A site that loads quickly gets more pages crawled per visit. A site that loads slowly gets fewer pages crawled — or gets deprioritised entirely in favour of faster sites.
For a local service business, this means: if your site takes four seconds to load, the crawler may only visit your homepage and skip your service pages. Your service pages are where the specific, structured information lives — the pages that tell AI systems exactly what you do and where you do it. If those pages aren't crawled, they can't be cited.
What counts as fast enough
Google's own guidance suggests that pages should load in under three seconds on mobile. For AI crawlers, the threshold is similar — but the measurement that matters most is Time to First Byte (TTFB), which is how long the server takes to start sending data after a request.
A TTFB under 800 milliseconds is considered good; above 1.8 seconds is considered poor. Shared hosting environments and WordPress sites running a stack of unoptimised plugins often sit well above that 1.8-second threshold.
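You can get a rough TTFB reading yourself. Below is a minimal sketch using only Python's standard library; it times how long a server takes to deliver the first byte of a page. It's an approximation, not a lab-grade measurement, so run it a few times and look at the typical value rather than any single reading.

```python
import time
import urllib.request

def measure_ttfb(url: str, timeout: float = 10.0) -> float:
    """Return a rough Time to First Byte in seconds: the delay between
    sending the request and receiving the first byte of the response body."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read(1)  # block until the first byte of the body arrives
    return time.perf_counter() - start
```

Calling `measure_ttfb("https://example.com")` from your own machine includes your network latency, so a dedicated tool like WebPageTest (covered below) will give more precise numbers, but this is enough to tell an 800-millisecond server from a 3-second one.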
Core Web Vitals — Google's set of user experience metrics — are also relevant. Largest Contentful Paint (LCP) measures how long the main content takes to appear; 2.5 seconds or less is considered good. Cumulative Layout Shift (CLS) measures how much the page jumps around while loading; a score of 0.1 or less is considered good. Both feed into how Google evaluates your site, and poor scores usually point to the same underlying slowness that crawlers experience.
The infrastructure question
Page speed is primarily an infrastructure problem, not a design problem. You can have a beautifully designed site that loads slowly because it's hosted on a cheap shared server, or a plain site that loads instantly because it's deployed on edge infrastructure.
Edge infrastructure means your site is served from servers physically close to the visitor — or in this case, close to the crawler. When a Perplexity crawler in a US data centre requests your site, a site hosted on a US edge network responds in milliseconds. A site hosted on a shared server in a distant data centre takes much longer.
For local service businesses, the practical implication is: where your site is hosted matters as much as what's on it. A fast host with a well-structured site will consistently outperform a slow host with perfect content.
What you can measure
You don't need to understand server infrastructure to check your site's speed. These free tools give you an accurate picture:
Google PageSpeed Insights (pagespeed.web.dev) — tests your site on both mobile and desktop, gives a score from 0–100, and lists specific issues. A score above 70 on mobile is a reasonable target for a local service business site.
Google Search Console — if your site is registered, the Core Web Vitals report shows real-world performance data from actual visitors, not just lab tests.
WebPageTest (webpagetest.org) — more detailed than PageSpeed Insights, shows TTFB, waterfall charts of every resource that loads, and performance from multiple locations.
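PageSpeed Insights also exposes its results through a free JSON API, which is handy if you want to check your score on a schedule rather than by hand. The sketch below builds a request URL for the v5 API and pulls the 0–100 performance score out of a response; the endpoint and field names reflect the public v5 API, but treat the exact response shape as something to verify against Google's documentation.

```python
import urllib.parse

# Public v5 endpoint for the PageSpeed Insights API.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(site: str, strategy: str = "mobile") -> str:
    """Build a PageSpeed Insights API request URL for the given site."""
    query = urllib.parse.urlencode({"url": site, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{query}"

def performance_score(psi_response: dict) -> int:
    """Extract the 0-100 performance score from a parsed PSI API response.

    The API reports the score as a 0-1 fraction under
    lighthouseResult -> categories -> performance -> score.
    """
    raw = psi_response["lighthouseResult"]["categories"]["performance"]["score"]
    return round(raw * 100)
```

Fetching `psi_request_url("https://yourbusiness.example")` with any HTTP client and passing the parsed JSON to `performance_score` gives you the same mobile score the web interface shows, so you can track it week over week.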
The honest picture
Page speed is one factor among several. A fast site with no schema markup and a weak Google Business Profile will still struggle to get AI recommendations. A slow site with excellent schema and strong reviews may still get recommended occasionally.
But speed is the foundation that everything else builds on. A slow site limits how often AI systems crawl it, which limits how current their information about your business is. A fast site gets crawled more often, which means updates to your schema, new service pages, and new content get picked up faster.
For a local service business, the goal is not to have the fastest site on the internet. The goal is to not have a site so slow that crawlers give up on it.
Frequently asked questions
My site looks fine on my computer. Does that mean it's fast? Not necessarily. Your computer may have the site cached from a previous visit, which makes it appear to load instantly. Use an incognito window and test with PageSpeed Insights to get an accurate reading.
Does page speed affect my regular Google ranking? Yes. Google has used page speed as a ranking signal on desktop since 2010 and on mobile since the 2018 Speed Update, and Core Web Vitals joined the ranking signals with the 2021 page experience update. Slow sites rank lower in traditional search results as well as being crawled less often.
What's the biggest cause of slow load times for small business sites? Unoptimised images are the most common cause. A homepage with a 4MB hero image will load slowly regardless of hosting quality. The second most common cause is too many third-party scripts — chat widgets, analytics tools, and social media embeds that each add load time.
Can I speed up my existing site without rebuilding it? Often yes. Compressing images, enabling caching, and removing unnecessary plugins or scripts can significantly improve load times without a full rebuild. However, if the underlying hosting is slow (shared hosting with high TTFB), there's a ceiling to how much you can improve without moving to better infrastructure.
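Oversized images are also easy to find programmatically if you have access to your site's files. Below is a minimal sketch in plain Python that lists image files over a size limit, largest first; the 300 KB default is an arbitrary rule of thumb chosen for illustration, not an official standard.

```python
from pathlib import Path

# Common web image formats; adjust to match what your site actually uses.
IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif", ".webp"}

def oversized_images(root: str, limit_kb: int = 300) -> list[tuple[str, int]]:
    """Return (path, size in KB) for images over limit_kb, largest first."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in IMAGE_EXTENSIONS:
            size_kb = path.stat().st_size // 1024
            if size_kb > limit_kb:
                hits.append((str(path), size_kb))
    return sorted(hits, key=lambda item: item[1], reverse=True)
```

Running `oversized_images("/path/to/your/site")` gives you a ranked to-do list for compression: converting the biggest offenders to WebP or resizing them to their displayed dimensions is usually the single cheapest speed win available.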
Related reading
- What is an AI-ready website?
- How AI crawl speeds are changing
- What is schema markup and why does it matter?
If you want to see what an AI-ready site would look like for your business, get a free preview — we build the preview first, you only pay if you like it.
