Technical SEO Essentials
Technical SEO explained for service businesses. Covers site speed, mobile optimisation, structured data, and common technical issues that hurt rankings.
Technical SEO is the foundation that makes everything else work. You can have brilliant content and a strong backlink profile, but if Google cannot crawl your site properly, or if your pages take ten seconds to load, you will not rank.
The good news is that most technical SEO issues are fixable, and you do not need to be a developer to understand them. This article covers what matters most for service business websites.
Why Technical SEO Matters
Search engines need to find your pages, read them, and understand what they are about. Technical SEO ensures nothing gets in the way of that process.
A technically sound website loads quickly, works properly on mobile devices, is secure, and makes it easy for search engines to crawl and index content. Technical problems create friction that hurts your rankings.
Technical issues also affect user experience. Slow pages frustrate visitors. Broken links waste their time. Poor mobile experience sends them to competitors. Technical SEO is not just about search engines; it is about making your site work properly.
Site Speed: Why Milliseconds Matter
Page speed is a confirmed Google ranking factor. Slow sites rank worse than fast sites, all else being equal. But more importantly, slow sites lose visitors. Research consistently shows that conversion rates drop significantly as page load time increases.
According to Google, as page load time goes from one second to three seconds, the probability of bounce increases by 32%. From one to five seconds, bounce probability increases by 90%. Speed matters.
Measuring speed
Google PageSpeed Insights is the starting point. Enter your URL and Google analyses both mobile and desktop performance. It gives you a score from 0 to 100 and specific recommendations for improvement.
Focus on the Core Web Vitals metrics that Google highlights:
Largest Contentful Paint (LCP) measures how quickly the main content loads. Aim for under 2.5 seconds.
Interaction to Next Paint (INP) measures responsiveness when users interact with your page. Aim for under 200 milliseconds.
Cumulative Layout Shift (CLS) measures visual stability, whether elements jump around as the page loads. Aim for under 0.1.
GTmetrix provides similar analysis with a different presentation. Sometimes its recommendations are more actionable than PageSpeed Insights.
Common speed issues and fixes
Unoptimised images are the most common culprit. A photograph straight from a camera might be 5MB. That same image optimised for web might be 200KB with no visible quality loss.
Compress images before uploading. Tools like TinyPNG, Squoosh, or ImageOptim reduce file sizes dramatically. Use modern formats like WebP where supported. Resize images to the dimensions they will actually display; do not upload a 4000-pixel-wide image to display at 800 pixels.
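As a minimal sketch of the resize-then-compress step, here is a Python example using the Pillow library; the helper name `resize_for_web`, the 800-pixel target, and the quality setting are illustrative choices, not fixed rules:

```python
from io import BytesIO
from PIL import Image  # Pillow: pip install Pillow

def resize_for_web(img, max_width=800, quality=80):
    """Downscale an image to max_width and re-encode as JPEG.

    Returns the compressed bytes. A multi-megabyte camera original
    typically comes back at a small fraction of its size.
    """
    if img.width > max_width:
        new_height = round(img.height * max_width / img.width)
        img = img.resize((max_width, new_height))
    buf = BytesIO()
    img.convert("RGB").save(buf, format="JPEG", quality=quality, optimize=True)
    return buf.getvalue()
```

Batch tools like TinyPNG or Squoosh do the same job through a web interface; the point is that resizing happens before upload, not in the browser.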
Too many HTTP requests slow pages down. Each file, image, script, and stylesheet requires a separate request. Combine files where possible. Remove unnecessary plugins and scripts.
No caching means browsers download the same files every time someone visits. Caching stores files locally so repeat visits load faster. Most web hosts offer caching options, or plugins can handle it for WordPress sites.
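If your site runs on Apache, for example, browser caching can be enabled with a few mod_expires lines in a .htaccess file. This is a sketch; the lifetimes are illustrative and should match how often your files actually change:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change once published
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/webp "access plus 1 year"
  # CSS and JavaScript change more often
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```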
Slow hosting limits how fast your site can be. Budget shared hosting often has slow servers. For a business website, quality hosting is worth the modest extra cost. Look for hosts with SSD storage and good server response times.
Render-blocking resources are scripts and stylesheets that must load before the page can display. Moving non-essential scripts to load later (deferring) improves initial load time.
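Deferring a script is often a one-attribute change in the HTML (the file path here is illustrative):

```html
<!-- Render-blocking: parsing stops while this downloads and runs -->
<script src="/js/widget.js"></script>

<!-- Deferred: downloads in parallel, executes after the HTML is parsed -->
<script src="/js/widget.js" defer></script>
```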
Third-party scripts like chat widgets, analytics, social media embeds, and advertising code all add load time. Audit what you actually need. Each script has a cost.
Quick wins for speed
Start with images. Compressing and properly sizing images often delivers the biggest improvement with the least effort.
Enable caching if you have not already.
Remove plugins or scripts you are not actually using.
Check your hosting. If your server response time is slow (over 600ms), consider upgrading.
Use a content delivery network (CDN) if you have visitors from multiple regions. CDNs serve files from locations closer to visitors.
Mobile-First Indexing
Google uses mobile-first indexing, meaning it primarily uses the mobile version of your site for ranking and indexing. If your site does not work well on mobile, you have a problem.
What mobile-first means
When Google crawls your site, it sees the mobile version. If content exists on your desktop site but not your mobile site, Google may not see it. If your mobile site is slower or has less content, that is what Google evaluates.
This does not mean desktop does not matter, but mobile is the priority. Your mobile experience should be at least as good as desktop.
Testing mobile-friendliness
Google retired its standalone Mobile-Friendly Test and the Mobile Usability report in Search Console in late 2023. Lighthouse, built into Chrome DevTools, still audits mobile usability and flags specific issues such as small tap targets and missing viewport tags.
Manual testing on actual devices is valuable. Load your site on a phone. Is text readable without zooming? Can you tap buttons without hitting the wrong one? Does content fit the screen without horizontal scrolling?
Common mobile issues
Text too small requires users to zoom to read. Body text should be at least 16 pixels.
Tap targets placed too close together make it hard to tap the right button or link. Buttons should be at least 48 pixels tall with space between them.
Content wider than screen creates horizontal scrolling. Use responsive design that adapts to screen width.
Intrusive interstitials are popups that cover content on mobile. Google penalises pages where popups make content hard to access on mobile.
Unplayable content such as Flash does not work on mobile, or anywhere else since Adobe discontinued Flash at the end of 2020. Use HTML5 video or YouTube embeds.
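The foundation for avoiding most of these issues is responsive markup. A responsive page starts with the viewport meta tag and CSS that adapts to screen width; the sizes below are illustrative:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  body { font-size: 16px; }               /* readable without zooming */
  img  { max-width: 100%; height: auto; } /* no horizontal scrolling */
</style>
```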
HTTPS and Security
Your site should use HTTPS, not HTTP. This is non-negotiable in 2024.
HTTPS encrypts data between the browser and your server. Google confirmed HTTPS as a ranking signal back in 2014, and browsers now mark HTTP sites as "not secure", which scares visitors away.
If you are still on HTTP, get an SSL certificate. Many hosts offer free certificates through Let's Encrypt. The process is usually straightforward; your host's support can guide you.
Once HTTPS is active, ensure all internal links and resources use HTTPS. Mixed content (loading some resources over HTTP on an HTTPS page) causes security warnings.
Set up redirects from HTTP to HTTPS so old links and bookmarks still work.
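On Apache, for example, the HTTP-to-HTTPS redirect is a few lines of mod_rewrite in .htaccess. This is a common pattern, not the only way; many hosts provide a control-panel setting that does the same thing:

```apache
RewriteEngine On
RewriteCond %{HTTPS} off
# 301 = permanent redirect, preserving the requested path
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```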
Crawlability and Indexing
If Google cannot find and crawl your pages, they cannot rank. Several factors affect crawlability.
Robots.txt
The robots.txt file tells search engines which parts of your site they can and cannot crawl. It lives at yoursite.com/robots.txt.
A misconfigured robots.txt can accidentally block important pages or your entire site. Check that you are not blocking pages you want indexed.
Common robots.txt issues include blocking CSS and JavaScript files (which prevents Google from rendering pages properly) and leaving leftover development rules in place, such as a blanket Disallow: / that blocks the whole site.
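A typical safe robots.txt for a small business site might look like this; the WordPress admin path is just an example, so adjust it for your own CMS:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml
```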
XML sitemaps
An XML sitemap lists all the pages you want search engines to index. It helps Google discover pages, especially on larger sites or sites with complex navigation.
Submit your sitemap through Google Search Console. Most CMS platforms generate sitemaps automatically; you just need to tell Google where it is.
Keep your sitemap updated as you add or remove pages.
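A sitemap is a simple XML file. A hypothetical two-page example, following the sitemaps.org protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/boiler-repair/</loc>
    <lastmod>2024-04-12</lastmod>
  </url>
</urlset>
```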
Internal linking
Google discovers pages by following links. If a page has no internal links pointing to it, Google may not find it. This is called an orphan page.
Ensure important pages are linked from your navigation, your homepage, or related content pages. A logical site structure with clear internal linking helps Google understand and crawl your site.
Checking indexing status
Google Search Console shows which of your pages are indexed and reports any issues. The Page indexing report (formerly called Coverage) identifies pages with errors, pages excluded from indexing, and valid indexed pages.
If important pages are not indexed, investigate why. Common reasons include: blocked by robots.txt, marked as noindex, duplicate content, low quality content, or crawl errors.
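A page excluded by noindex carries a meta tag like this in its head. Note that Google must be able to crawl the page to see the tag, so do not also block the page in robots.txt:

```html
<meta name="robots" content="noindex">
```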
Structured Data and Schema Markup
Structured data is code that helps search engines understand what your content is about. It can also enable rich results: enhanced search listings with additional information.
Schema for service businesses
LocalBusiness schema tells Google you are a local business and provides structured information about your location, hours, services, and contact details.
Service schema describes specific services you offer.
Review schema can enable star ratings to appear in search results (though Google has become stricter about which reviews qualify).
FAQ schema can make FAQ content appear directly in search results as expandable questions.
Implementing schema
You can add schema manually as JSON-LD code in your page headers. This is the cleanest approach but requires some technical knowledge.
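A minimal LocalBusiness example in JSON-LD, placed in the page's head. All names, addresses, and hours below are placeholders to replace with your real details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Ltd",
  "url": "https://www.example.com",
  "telephone": "+44 20 7946 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "postalCode": "SW1A 1AA",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Fr 08:00-18:00"
}
</script>
```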
Plugins can handle schema for WordPress sites. Yoast SEO, Rank Math, and Schema Pro all offer schema functionality.
Google's Structured Data Markup Helper walks you through creating schema by tagging elements on your page.
Test your schema with Google's Rich Results Test to ensure it is valid and eligible for rich results.
Do not over-optimise
Only use schema that accurately represents your content. Marking up content that does not exist, or using schema types that do not apply, can result in manual actions from Google.
Schema is about helping search engines understand your content, not tricking them into showing rich results.
Common Technical Issues and Fixes
Duplicate content
Duplicate content is the same or very similar content appearing at multiple URLs. This confuses search engines about which version to rank and dilutes your signals.
Common causes include: www and non-www versions of your site both accessible, HTTP and HTTPS versions both live, URL parameters creating multiple versions of pages, and printer-friendly versions.
Fix with canonical tags, which tell Google which version is the original. Use 301 redirects to send duplicate URLs to the canonical version.
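A canonical tag is a single line in the head of each duplicate version, pointing at the preferred URL (the URL here is illustrative):

```html
<link rel="canonical" href="https://www.example.com/services/boiler-repair/">
```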
Broken links
Broken links (404 errors) frustrate users and waste crawl budget. Google Search Console reports crawl errors including 404s.
Fix broken links by either restoring the missing page, redirecting to a relevant existing page, or updating the link to point somewhere valid.
Redirect chains
A redirect chain is when one redirect leads to another, which leads to another. Each hop adds latency and dilutes link equity.
Ideally, redirects should go directly from old URL to final destination. Audit your redirects and fix chains.
Thin content pages
Pages with little useful content may not be indexed or may be seen as low quality. This includes pages with just a few sentences, boilerplate pages, and auto-generated pages with little unique content.
Either improve thin pages with more useful content, consolidate them with related content, or remove them and redirect to relevant pages.
Technical SEO Tools
Google Search Console is essential and free. It shows how Google sees your site, reports issues, and tracks search performance.
Google PageSpeed Insights analyses speed and Core Web Vitals.
Screaming Frog crawls your site like a search engine and identifies technical issues. The free version handles up to 500 URLs. The paid version offers unlimited crawling and additional features.
Ahrefs and SEMrush include site audit tools that identify technical issues alongside their other SEO features.
GTmetrix and WebPageTest provide detailed performance analysis.
Technical SEO Audit Checklist
Run through this periodically to catch issues:
Site loads over HTTPS with HTTP redirecting properly.
Site works well on mobile (check with a Lighthouse audit and on real devices).
PageSpeed score is acceptable (aim for 70+ on mobile, though higher is better).
Core Web Vitals are in the green.
Robots.txt is not blocking important pages.
XML sitemap exists and is submitted to Search Console.
Important pages are indexed (check Search Console coverage).
No critical crawl errors in Search Console.
Structured data is valid (test with Rich Results Test).
No broken links on key pages.
Page titles and meta descriptions are unique across the site.
Images have alt text.
Internal linking connects important pages.
No duplicate content issues (check for canonical tags where needed).
Technical SEO is not exciting, but it is foundational. A site with technical problems is fighting with one hand tied behind its back. Fix the basics, and your content and link building efforts will be far more effective.