Part of the Speed audit
Check your page's DOM size
A bloated DOM slows down rendering, JavaScript, and scrolling. SiteCurl counts every HTML element on your page and flags when the number gets too high.
No signup required. Results in under 60 seconds.
What this check does
SiteCurl counts every HTML element (node) on your page. A div, a p, a span, an img tag: each one is a node. The total count tells you how complex your page structure is.
Pages with fewer than 1,500 nodes pass. Between 1,500 and 3,000 nodes triggers a warning. Over 3,000 nodes is flagged as critical. These thresholds match what Google's Lighthouse uses as guidance for DOM size.
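The thresholds above can be expressed as a small helper function. This is an illustrative sketch of the classification logic, not part of any SiteCurl API:

```javascript
// Classify a DOM node count using the thresholds described above,
// which follow Google Lighthouse guidance. Function name and labels
// are illustrative only.
function classifyDomSize(nodeCount) {
  if (nodeCount < 1500) return "pass";
  if (nodeCount <= 3000) return "warning";
  return "critical";
}

console.log(classifyDomSize(412));  // "pass"
console.log(classifyDomSize(2847)); // "warning"
console.log(classifyDomSize(4200)); // "critical"
```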
The check runs on every page in your scan. Some pages may be lean while others (like a long product listing or a page built with a drag-and-drop builder) can have thousands of extra nodes.
How this shows up in the real world
The DOM (Document Object Model) is the browser's internal representation of your page. Every HTML tag becomes a node in this tree. The browser uses this tree for everything: applying CSS rules, calculating layout, running JavaScript queries, handling scroll events, and painting pixels on screen.
When the tree is small, all of this is fast. When it grows past 1,500 nodes, each operation takes measurably longer. CSS selectors have more elements to match against. Layout calculations involve more boxes to position. JavaScript calls like querySelectorAll search through more nodes.
The problem is worst on mobile devices. Phones have less RAM and slower processors than laptops. A page with 3,000 nodes that feels fine on a desktop can stutter and lag on a mid-range phone. Scrolling becomes janky, tap responses feel delayed, and the page uses more battery power.
Page builders (WordPress Elementor, Wix, Squarespace) are the most common source of DOM bloat. They wrap every content block in multiple layers of divs for styling and layout. A simple text section that could be one p tag becomes 5 or 6 nested divs.
Why it matters
DOM size directly affects three Core Web Vitals metrics. A large DOM increases Largest Contentful Paint (the browser has more to process before showing content), Interaction to Next Paint (JavaScript takes longer to respond to clicks), and Cumulative Layout Shift (more elements means more layout recalculations).
Memory usage goes up with node count. Each node takes memory. On phones with 2-4 GB of RAM shared across all apps, a heavy page can push the browser to drop cached data, causing jank and reloads.
Google flags excessive DOM size in Lighthouse audits and has confirmed that page experience signals (including speed) affect search rankings. A page that lags on mobile is less likely to rank well in mobile search results.
Who this impacts most
Sites built with page builders (Elementor, Divi, Beaver Builder) almost always exceed 1,500 nodes. The visual editor adds wrapper divs, spacing elements, and style containers that are invisible to the site owner but real to the browser.
E-commerce sites with long product listings on a single page are another common case. A page showing 50 products with images, prices, ratings, and buttons can easily hit 3,000+ nodes.
Landing pages with many sections (hero, features, testimonials, pricing, FAQ, footer) add up faster than most people expect. Each section is typically 100-300 nodes. Eight sections plus a header and footer can push past the warning threshold.
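The arithmetic above can be sketched as a quick node-budget estimate. The per-section range (100-300 nodes) comes from this article; the allowance for header and footer is an assumption for illustration:

```javascript
// Rough node-budget estimate for a sectioned landing page.
// perSectionLow/High reflect the typical 100-300 node range per section;
// `chrome` is an assumed allowance for the header and footer combined.
function estimateNodes(sections, perSectionLow = 100, perSectionHigh = 300, chrome = 300) {
  return {
    low: sections * perSectionLow + chrome,
    high: sections * perSectionHigh + chrome,
  };
}

// Eight sections plus header and footer:
console.log(estimateNodes(8)); // { low: 1100, high: 2700 }
```

Even at the low end of the range, a busy landing page sits close to the 1,500-node warning threshold, and at the high end it clears it comfortably.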
How to fix it
Step 1: Find the heaviest sections. Open your browser dev tools (F12), go to the Elements tab, and look for deeply nested structures. Right-click a section and choose 'Inspect' to see how many wrappers it uses.
Step 2: Remove extra wrapper divs. Page builders add extra divs for spacing and alignment that CSS alone can handle. If you have access to the template code, flatten nested structures. Replace <div><div><div><p>Text</p></div></div></div> with <p>Text</p> and use CSS for spacing and alignment.
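The flattening in Step 2 looks like this in practice. The class names are hypothetical, and the exact wrappers your builder emits will differ:

```html
<!-- Before: builder output, three wrapper divs around one paragraph -->
<div class="section-wrap">
  <div class="row">
    <div class="col">
      <p>Text</p>
    </div>
  </div>
</div>

<!-- After: one node; move spacing and alignment into a CSS rule -->
<p class="section-text">Text</p>
```

The same margin, padding, and alignment the wrappers provided can be applied directly to the remaining element with a single CSS rule, turning four nodes into one.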
Step 3: Lazy-load off-screen content. For long pages, do not render sections below the fold until the user scrolls to them. Use the content-visibility: auto CSS property or JavaScript intersection observers to load content on demand.
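A minimal sketch of the content-visibility approach from Step 3; the class name is hypothetical. Note that this skips rendering work for off-screen sections but does not remove nodes from the DOM, so it complements rather than replaces the other steps:

```css
/* Skip style, layout, and paint work for sections until they near the
   viewport. contain-intrinsic-size reserves an estimated height so the
   scrollbar does not jump; 800px is a placeholder, not a required value. */
.below-fold-section {
  content-visibility: auto;
  contain-intrinsic-size: auto 800px;
}
```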
Step 4: Paginate or virtualize long lists. Instead of showing 100 products on one page, show 20 and add pagination. For infinite-scroll designs, use virtual scrolling that only keeps visible items in the DOM.
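The pagination idea in Step 4 reduces node count because only the current page's items are ever rendered. A minimal sketch, with illustrative names rather than a specific library API:

```javascript
// Return only the items for one page, so the DOM holds `pageSize`
// product cards instead of the full list.
function paginate(items, pageSize, pageNumber) {
  const start = (pageNumber - 1) * pageSize;
  return items.slice(start, start + pageSize);
}

const products = Array.from({ length: 100 }, (_, i) => `Product ${i + 1}`);
const page1 = paginate(products, 20, 1);
console.log(page1.length); // 20
console.log(page1[0]);     // "Product 1"
```

With 100 products and a page size of 20, each rendered page contributes roughly a fifth of the nodes the full list would, at the cost of a pagination control.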
Step 5: Audit your page builder settings. Some page builders have a 'clean output' or 'optimized markup' option that reduces wrapper elements. Check your builder's performance settings before rebuilding manually.
Common mistakes when fixing this
Adding more sections to improve content depth. More content is good for SEO, but more DOM nodes are bad for speed. If you need more text, add it to existing sections instead of creating new ones.
Hiding elements with CSS instead of removing them. display: none hides an element visually, but it still exists in the DOM and still consumes memory. If content is not needed on the page, remove it from the HTML entirely.
Using divs for everything. Swapping a single div for a semantic element like section, article, or nav does not reduce the node count by itself, but replacing a stack of nested divs with one semantic element does. Fewer wrappers means fewer nodes.
How to verify the fix
After making changes, run another SiteCurl scan. The DOM size finding shows the updated node count. You can also check in Chrome DevTools: open the Console tab and run document.querySelectorAll('*').length to get the current node count on any page.
Compare the count before and after your changes. Aim for under 1,500 nodes. If you cannot get below that, focus on keeping the most interactive pages (forms, product listings, checkout) as lean as possible.
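If you want a quick count outside a browser (for example, in a build script), a rough approximation can be taken from the raw HTML. This is a sketch, not a full parser, and the browser console method above remains authoritative:

```javascript
// Rough element count from an HTML string: counts opening and
// self-closing tags, each of which corresponds to one element.
// A regex is an approximation -- comments, doctypes, and closing tags
// are skipped, but unusual markup (e.g. ">" inside attribute values)
// can throw the count off.
function roughElementCount(html) {
  const openTags = html.match(/<[a-zA-Z][^>]*>/g) || [];
  return openTags.length;
}

console.log(roughElementCount("<div><p>Hi</p><img src='x.png'></div>")); // 3
```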
The bottom line
Every HTML element on your page costs processing time, memory, and rendering speed. Pages under 1,500 nodes render quickly on all devices. Pages over 3,000 nodes cause visible lag on phones. Flatten nested structures, remove hidden elements, and paginate long lists to keep your DOM lean.
Example findings from a scan
DOM has 2,847 nodes (recommended under 1,500)
DOM has 412 nodes, page structure is lean
DOM is very large (4,200 nodes, over 3,000)
Frequently asked questions
What is the DOM?
The DOM (Document Object Model) is the browser's internal map of your page. Every HTML tag is a node in this map. The browser uses it to apply styles, calculate layout, and respond to user actions. More nodes means more work for the browser.
How many DOM nodes is too many?
SiteCurl flags a warning at 1,500 nodes and a critical issue at 3,000 nodes. These thresholds align with Google Lighthouse guidance. Most well-built pages fall between 500 and 1,200 nodes.
Why does my page builder create so many nodes?
Page builders like Elementor and Divi wrap every content block in multiple divs for layout and styling. A section you see as one block may be 5 or 6 nested elements in the HTML. Check your builder's performance or clean-output settings to reduce this.
Can I check DOM size without signing up?
Yes. The free audit includes DOM size as part of a full seven-category scan of your home page. No signup required.
Does DOM size affect SEO?
Indirectly, yes. Google uses page experience signals including Core Web Vitals in its ranking algorithm. A large DOM slows down rendering and interaction metrics, which can affect how your page performs in search results.
How do I count DOM nodes manually?
Open your browser dev tools (F12), go to the Console tab, and type document.querySelectorAll('*').length. This returns the total number of elements on the page.
Check your site speed now