Part of the Speed audit
Check your browser caching headers
When browsers re-download the same CSS and JavaScript on every visit, your pages load slower than they need to. SiteCurl checks up to 5 static resources for cache headers.
No signup required. Results in under 60 seconds.
What this check does
SiteCurl finds the CSS stylesheets and JavaScript files linked on your page, then sends a HEAD request to each one. It checks the response for a Cache-Control or Expires header. If neither is present, the resource is flagged as uncached.
The check covers up to 5 static resources per page, with a 3-second timeout per request and a 4-second total time budget. For each uncached file, SiteCurl shows the URL so you know exactly which resource needs fixing.
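The check described above can be sketched in a few lines of Python. This is an illustration of the logic, not SiteCurl's actual code; the function name, timeout default, and header handling are assumptions:

```python
# A minimal sketch of the cache-header check: one HEAD request per
# resource, flagged as uncached when neither Cache-Control nor
# Expires is present. is_uncached is an illustrative name.
import urllib.request

def is_uncached(url, timeout=3.0):
    """Return True if the resource at url sends no cache headers."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        has_cache_control = resp.headers.get("Cache-Control") is not None
        has_expires = resp.headers.get("Expires") is not None
        return not (has_cache_control or has_expires)
```

A real scanner would also need to collect the linked CSS and JS URLs from the page HTML and enforce the overall time budget; this sketch covers only the per-resource header test.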
Cache headers tell the browser how long it can keep a local copy of a file. Without them, the browser downloads the same file again on every page load, even if it has not changed. This wastes bandwidth and adds load time for returning visitors.
How this shows up in the real world
Every time a visitor loads your page, the browser fetches CSS, JavaScript, fonts, and images from your server. If your server sends a Cache-Control header with a max-age value, the browser saves that file locally. On the next page load, it skips the download and uses the saved copy.
Without that header, the browser has no instructions. It either re-downloads everything or asks the server if the file changed (a conditional request). Both options are slower than reading from the local cache.
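That decision can be sketched in Python. This is a deliberate simplification that keeps only the max-age directive; real browser caches also honor no-store, no-cache, s-maxage, the Age header, and heuristic freshness:

```python
# A simplified sketch of the freshness decision a browser makes
# when it holds a cached copy of a file. Only max-age is modeled.
def max_age_seconds(cache_control):
    """Extract the max-age directive from a Cache-Control value."""
    for directive in cache_control.split(","):
        name, _, value = directive.strip().partition("=")
        if name.lower() == "max-age" and value.isdigit():
            return int(value)
    return None

def can_reuse(cache_control, age_seconds):
    """True if a copy stored age_seconds ago may be served from cache."""
    max_age = max_age_seconds(cache_control)
    return max_age is not None and age_seconds < max_age
```

With no Cache-Control at all there is no max-age, so can_reuse is always False: exactly the "no instructions" case the paragraph describes.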
The impact grows with page count. A visitor who browses 5 pages on your site downloads the same CSS file 5 times if caching is missing. With caching, they download it once. On slow connections, this difference is measured in seconds.
Most modern hosting platforms set cache headers on static assets by default. But custom server configs, CDN misconfigurations, or dynamically served assets can break this. The only way to know is to check the actual response headers.
Why it matters
Returning visitors are your most valuable traffic. They already know your site and are more likely to convert. If your pages load slowly on repeat visits because of missing cache headers, you lose the visitors who already trust you.
Cache headers also reduce server load. Every uncached request hits your origin server. On a traffic spike, thousands of redundant downloads can slow everything down. Caching pushes that work to the browser, keeping your server free for requests that actually need fresh data.
Search engines factor page speed into rankings. Google's Core Web Vitals measure how fast your page becomes usable. Caching directly improves Largest Contentful Paint on repeat visits by eliminating extra downloads.
Who this impacts most
E-commerce sites with product catalogs take the biggest hit. Shoppers browse multiple pages in a session. Without caching, they re-download shared assets on every product page. That slows down the browsing experience and increases bounce rates.
Content sites and blogs have a similar pattern. Readers who click through to a second or third article should not wait for the same stylesheet to download again. Caching makes page-to-page navigation feel instant.
Agencies managing client sites should check caching as part of every launch review. A fresh site with no cache headers performs well on the first visit but poorly for every visit after. That gap shows up in analytics as high bounce on return traffic.
How to fix it
Step 1: Identify uncached resources. Run a SiteCurl scan and look at the cache headers finding. It lists the specific CSS and JS files missing cache headers. Note each URL.
Step 2: Add Cache-Control headers on your server. For Nginx, add this to your server block for static files: location ~* \.(css|js|woff2|png|jpg|webp)$ { add_header Cache-Control "public, max-age=31536000"; }. For Apache, add to .htaccess: Header set Cache-Control "public, max-age=31536000" inside a FilesMatch block.
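Written out as they would appear in the config files, the two snippets from step 2 look like this. The file-extension list is an example to adjust to your own assets, and the Apache version requires mod_headers to be enabled:

```nginx
# Nginx: long-lived caching for static assets, inside the server block
location ~* \.(css|js|woff2|png|jpg|webp)$ {
    add_header Cache-Control "public, max-age=31536000";
}
```

```apache
# Apache (.htaccess): requires mod_headers
<FilesMatch "\.(css|js|woff2|png|jpg|webp)$">
    Header set Cache-Control "public, max-age=31536000"
</FilesMatch>
```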
Step 3: Use fingerprinted filenames. If your build tool adds a hash to file names (like app-3f8a2b.css), you can set a one-year max-age safely. When the file changes, the hash changes, and browsers fetch the new version.
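Steps 2 and 3 combine naturally: pick the max-age based on whether the filename carries a build hash. A hypothetical sketch, where the fingerprint pattern (a dash plus six or more hex characters before the extension) is an assumption you should match to your build tool's output:

```python
# Sketch: choose a max-age from the filename, per step 3.
import re

FINGERPRINTED = re.compile(r"-[0-9a-f]{6,}\.(?:css|js)$")

def choose_max_age(filename):
    """One year for fingerprinted assets, one day for the rest."""
    if FINGERPRINTED.search(filename):
        return 31536000  # hash changes on deploy, so long caching is safe
    return 86400         # no fingerprint: cache for a day only
```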
Step 4: Check your CDN settings. If you use Cloudflare, Fastly, or another CDN, verify it passes cache headers from your origin. Some CDNs strip or override them. Check the response headers from the CDN, not just your origin server.
Step 5: Handle dynamic assets differently. Files that change often (like a config file loaded via JS) should use a shorter max-age or no-cache with an ETag. This lets the browser check for updates without a full re-download.
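The revalidation path in step 5 can be sketched with Python's standard library. The function name and ETag values are illustrative; the key behavior is that a 304 response tells the client its cached copy is still valid without re-sending the body:

```python
# Sketch of conditional revalidation for assets that change often.
# The client keeps the file plus its ETag; on the next request it
# sends If-None-Match, and a 304 reply means "reuse your copy".
import urllib.request
import urllib.error

def revalidate(url, etag, timeout=3.0):
    """Return (changed, current_etag) for a cached copy with this ETag."""
    req = urllib.request.Request(url, headers={"If-None-Match": etag})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return True, resp.headers.get("ETag")  # new body downloaded
    except urllib.error.HTTPError as err:
        if err.code == 304:
            return False, etag                     # cached copy still valid
        raise
```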
Common mistakes when fixing this
Setting max-age too short. A max-age of 300 (5 minutes) means an active visitor's browser can re-download the same file up to 12 times per hour. For CSS and JS that change only on deploy, use 31536000 (one year) with fingerprinted file names.
Caching HTML pages too aggressively. Cache-Control on static assets is great. On HTML pages, it can serve stale content. Keep long max-age values for CSS, JS, and images. Use short or no cache for HTML documents.
Forgetting to test after CDN changes. Your origin server may have the right headers, but a CDN misconfiguration can strip them. Always check the final response headers as seen by the browser, not just your server config.
Using Expires instead of Cache-Control. Expires works, but it uses an absolute date that can drift. Cache-Control with max-age is relative and more reliable. If you set both, Cache-Control takes priority.
How to verify the fix
After adding cache headers, run another SiteCurl scan. The finding should show zero uncached resources. You can also verify by hand: open your browser dev tools (F12), go to the Network tab, reload the page, and click on a CSS or JS file. Look for the Cache-Control header in the response headers.
To confirm caching works, reload the page a second time and check the Size column in the Network tab. Cached files show '(disk cache)' or '(memory cache)' instead of a byte count.
The bottom line
Cache headers are a one-time server config change that speeds up every repeat visit. Without them, browsers re-download the same files over and over. Add Cache-Control: public, max-age=31536000 to static assets with fingerprinted file names, and your returning visitors get a noticeably faster experience.
Example findings from a scan
2 of 4 static resources lack cache headers
CSS file /assets/styles.css missing Cache-Control header
All checked static resources have cache headers
Frequently asked questions
What are cache headers?
Cache headers are instructions your server sends to the browser. They say how long the browser can keep a local copy of a file before downloading it again. The two main headers are Cache-Control and Expires.
How many resources does SiteCurl check?
SiteCurl checks up to 5 CSS and JavaScript files per page, with a 3-second timeout per request. It sends a HEAD request to each one and checks for Cache-Control or Expires headers.
What is a good max-age value for static assets?
For CSS and JS files with fingerprinted names (a hash in the filename), use 31536000 (one year). The hash changes when the file changes, so browsers always get the latest version. For files without fingerprints, use a shorter value like 86400 (one day).
Can I check caching without signing up?
Yes. The free audit checks your home page for caching issues as part of a full seven-category scan. No signup required.
Do cache headers affect SEO?
Not directly. But page speed is a ranking factor, and caching improves speed for returning visitors. Faster pages also tend to have lower bounce rates, which indirectly helps rankings.
What is the difference between Cache-Control and Expires?
Cache-Control uses a relative duration (seconds from now). Expires uses an absolute date. Cache-Control is preferred because it does not depend on clock synchronization between server and browser. If both are set, Cache-Control wins.
Check your site speed now