
Five steps to deliver better technical SEO services to your clients

Mastering the technical SEO of a site is both an art and a science. Take it from me, a content strategist at heart: technical SEO requires a balance of knowledge, dedication and grit to do well. And for many, taking on such technical matters can be both daunting and complicated.

But no matter how code-heavy and cumbersome technical SEO may seem, understanding the core concepts is within reach for most search marketers. Yes, it helps to have HTML chops or a developer on hand to implement scripts and such. However, the idea of providing top-level technical SEO services should not be as intimidating as it is for many agencies and consultants.

To help you dial in the technical side of your SEO services, I have shared five starting points. These steps reflect the 80/20 of technical SEO and much of what I've picked up from my code-savvy colleagues over the past decade.

1. Verify Google Analytics, Tag Manager and Search Console; define conversions

If you maintain ongoing SEO engagements, it is crucial to have Google Analytics, or an equally capable web analytics platform, set up. Additionally, setting up Google Tag Manager and Google Search Console gives you more technical SEO capabilities and greater insight into a site's health.

In addition to verifying a site on these platforms, you will also want to define KPIs and conversion points. This can be as simple as tracking organic traffic and form submissions, or as advanced as setting up five different conversion goals, such as form submissions, store purchases, PDF downloads, Facebook tracking, email signups, and so on. In short, without some form of conversion tracking, you are essentially flying blind.

set goals in Google Analytics

Determining how to measure the success of a site is essential for providing high-quality SEO services. From a technical point of view, both Google Search Console and Analytics can provide critical insights to help you make continuous improvements. These include crawl errors, duplicate metadata, toxic links, bounce pages, and drop-offs, to name just a few.
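To give a sense of what conversion tracking can look like beyond the Analytics interface, here is a minimal Python sketch that records a form-submission conversion event through the GA4 Measurement Protocol. It assumes a GA4 property; the measurement ID, API secret, client ID and event name shown are placeholders, not values from this article.

```python
import json
import urllib.request

# Placeholders: substitute your own GA4 measurement ID and API secret.
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "your-api-secret"

def send_conversion(client_id: str, event_name: str = "form_submission") -> int:
    """Send a single conversion event to GA4 via the Measurement Protocol."""
    endpoint = (
        "https://www.google-analytics.com/mp/collect"
        f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}"
    )
    payload = {
        "client_id": client_id,  # anonymous identifier for the visitor
        "events": [{"name": event_name, "params": {"form_location": "contact_page"}}],
    }
    request = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # a 2xx status means the hit was received

# Example: send_conversion("555.123456789")
```

In practice, most clients will be served perfectly well by goals and events configured directly in the Analytics and Tag Manager interfaces; the point is simply that every conversion you care about should be recorded somewhere you can report on it.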

2. Implement structured data markup

Implementing structured data markup has become an integral part of technical SEO. Having been a hot topic at Google in recent years, more and more search marketers are embracing ways to deploy structured data markup, or Schema, for their clients. In turn, many CMS platforms are now equipped with simple plug-ins and developer capabilities to implement Schema.

In essence, Schema is a unique form of markup developed to help webmasters better communicate a site's content to search engines. By tagging certain elements of page content with Schema types (e.g., Review, AggregateRating, LocalBusiness, Person, etc.), you help Google and other search engines better interpret and display that content to users.
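To make that concrete, here is a minimal Python sketch that generates a LocalBusiness JSON-LD block of the kind a plug-in or developer might embed in a page's HTML. The business details are purely hypothetical, and a real implementation would pull them from the client's own data.

```python
import json

# Hypothetical business details used purely for illustration.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Coffee Roasters",
    "url": "https://www.example.com/",
    "telephone": "+1-555-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main Street",
        "addressLocality": "Atlanta",
        "addressRegion": "GA",
        "postalCode": "30303",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "132",
    },
}

# Wrap the markup in the script tag that structured data lives in.
json_ld_snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(business, indent=2)
    + "\n</script>"
)
print(json_ld_snippet)
```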

Google's structured data testing tool

This markup can improve your site's visibility with features such as rich snippets, extended meta descriptions, and other enhanced listings that can offer a competitive advantage. Within Google Search Console, not only is there a handy validation tool to help assess a site's markup, but the platform also logs any structured data errors.

3. Review link toxicity regularly

It should be no secret by now that poor-quality links pointing to a site can hamper its rankings. Moreover, a site that has obviously built links manually using keyword-stuffed anchor text runs a high risk of being de-indexed or removed from Google entirely.

If you're coming back from 10 years ago, to a time when you built a few (hundred?) sketchy links to your site, consider assessing the site's link toxicity. Toxic links coming from spammy sources can really ruin your credibility as a trusted site. That is why it is important to identify and disavow links that may be interfering with your rankings.

technical seo tool for checking for toxic links

(Not only does the Backlink Audit Tool in SEMrush make it easy to locate potentially toxic links, it also helps you take the necessary steps to remove or disavow certain links.)

If there is one SEO variable you sometimes have no control over, it's backlinks. New, spammy links can come out of nowhere, leaving you pondering existential questions about the internet. Regularly checking in on a site's backlinks is critical diligence in maintaining a healthy site for your SEO clients.
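Once you have flagged the worst offenders, whether in SEMrush or any other backlink tool, the disavow file Google accepts is just plain text. Here is a minimal Python sketch of generating one; the referring domains listed are hypothetical.

```python
# Hypothetical referring domains flagged as toxic during a backlink audit.
toxic_domains = [
    "spammy-directory.example",
    "cheap-links.example",
    "anchor-text-farm.example",
]

# Google's disavow file format: comment lines start with "#",
# and each rule is either a full URL or a "domain:" entry, one per line.
lines = ["# Disavow file generated after backlink audit"]
lines += [f"domain:{domain}" for domain in sorted(set(toxic_domains))]

with open("disavow.txt", "w", encoding="utf-8") as handle:
    handle.write("\n".join(lines) + "\n")
```

The resulting disavow.txt can then be submitted through Google's disavow links tool, ideally only after genuine removal requests have failed.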

4. Consistently monitor the health, speed and performance of the site

GTmetrix is an industry-standard tool for efficiently identifying a site's technical bottlenecks. With this tool, you can uncover important insights about a site's speed, health and overall performance, along with useful recommendations for fixing such problems.

Undoubtedly, site speed has become a notable ranking factor, reflecting Google's mission to offer search users the best possible experience. As such, fast-loading sites are rewarded, while slow-loading sites are unlikely to realize their full SEO potential.

Pagespeed Insights score as a technical SEO tool

In addition to GTmetrix, Google PageSpeed Insights and web.dev are a couple of additional tools that help improve a site's speed and performance. Like the GTmetrix and SEMrush recommendations, these tools provide easy-to-digest guidance backed by a thorough analysis of a number of variables.

The page speed improvements suggested by these tools can range from compressing images to minimizing redirects and server requests. In other words, some developer experience can be useful here.
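If you prefer to pull these numbers programmatically rather than through the web interface, the PageSpeed Insights API exposes the same Lighthouse data. Here is a minimal Python sketch; it assumes the public v5 endpoint and an example URL, and heavier usage may require an API key.

```python
import json
import urllib.parse
import urllib.request

def pagespeed_score(url: str, strategy: str = "mobile") -> float:
    """Fetch the Lighthouse performance score (0 to 1) from the PageSpeed Insights API."""
    endpoint = (
        "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
        + urllib.parse.urlencode({"url": url, "strategy": strategy})
    )
    with urllib.request.urlopen(endpoint) as response:
        data = json.load(response)
    return data["lighthouseResult"]["categories"]["performance"]["score"]

# Example: print(pagespeed_score("https://www.example.com/"))
```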

A final core aspect of maintaining optimal site health is keeping crawl errors to an absolute minimum. Although quite easy to check, regularly repairing 404 errors and correcting crawl optimization issues can elevate your technical SEO services. These capabilities are available in SEMrush's Site Audit Tool.

technical SEO site audit tool

(The Site Audit Tool's intuitive crawl report analysis makes error resolution a seamless process. Users can easily find broken links, error pages, insufficient titles and metadata, and other details to improve site health and performance.)
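Even without a full crawler, a quick spot-check for error pages is easy to script. Here is a minimal Python sketch that reports non-200 responses for a handful of internal URLs; the URL list is hypothetical and would normally come from a sitemap or crawl export.

```python
import urllib.error
import urllib.request

# Hypothetical internal URLs pulled from a sitemap or crawl export.
urls_to_check = [
    "https://www.example.com/",
    "https://www.example.com/about-us/",
    "https://www.example.com/old-landing-page/",
]

for url in urls_to_check:
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            status = response.status
    except urllib.error.HTTPError as error:
        status = error.code  # e.g. 404 for a broken page
    except urllib.error.URLError as error:
        status = f"unreachable ({error.reason})"
    if status != 200:
        print(f"{url} -> {status}")
```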

5. Canonicalize pages and audit robots.txt

If there is one problem that is virtually unavoidable, it's discovering multiple versions of the same page, or duplicate content. As a rather hilarious example, I once came across a site with five iterations of the same "about us" page:

  • https://site.com/about-us/
  • https://www.site.com/about-us/
  • https://www.site.com/about-us
  • https://site.com/about-us
  • http://www.site.com/about-us

To a search engine, the above looks like five separate pages, all with exactly the same content. This causes confusion, or worse, makes the site look spammy or thin with so much duplicate content. The solution is canonicalization.

Because canonical tags and duplicate content have been major topics of discussion, most plug-ins and CMS integrations are equipped with canonicalization capabilities to keep your SEO in check.

yoast plug-in for wordpress to create canonical URLs, a technical SEO tool

(In this image, the very popular Yoast SEO plug-in for WordPress offers a Canonical URL field under the gear icon. This simple functionality makes it easy to define the desired canonical URL for a given page.)
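If you want to verify canonicalization from the outside rather than inside the CMS, a quick check of what each duplicate variant declares is straightforward. Here is a minimal Python sketch using only the standard library; it reuses the hypothetical variant URLs from the example above and assumes each one actually resolves.

```python
from html.parser import HTMLParser
import urllib.request

class CanonicalParser(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag on a page."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "link" and attributes.get("rel") == "canonical":
            self.canonical = attributes.get("href")

# The duplicate variants from the "about us" example above.
variants = [
    "https://site.com/about-us/",
    "https://www.site.com/about-us/",
    "https://www.site.com/about-us",
    "https://site.com/about-us",
    "http://www.site.com/about-us",
]

for url in variants:
    parser = CanonicalParser()
    with urllib.request.urlopen(url) as response:
        parser.feed(response.read().decode("utf-8", errors="replace"))
    print(f"{url} -> canonical: {parser.canonical}")
```

Ideally, every variant reports the same canonical URL; anything else is worth flagging for the developer.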

Similarly, the robots.txt file is a communication tool designed to indicate which parts of a website should not be processed or crawled. Here, certain URLs can be disallowed, preventing search engines from crawling and indexing them. Because the robots.txt file is often updated over time, certain folders or content on a site may end up unintentionally blocked from crawling and indexing. It is therefore wise to audit a site's robots.txt file to make sure it aligns with your SEO goals and to prevent future conflicts.
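A simple way to sanity-check those rules is Python's built-in robots.txt parser. Here is a minimal sketch; the site and the paths being tested are hypothetical.

```python
import urllib.robotparser

# Hypothetical site whose robots.txt rules we want to audit.
robots = urllib.robotparser.RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()

paths_to_check = [
    "https://www.example.com/blog/",
    "https://www.example.com/wp-admin/",
    "https://www.example.com/landing-page-you-want-ranked/",
]

for url in paths_to_check:
    allowed = robots.can_fetch("Googlebot", url)
    print(f"{url} -> {'crawlable' if allowed else 'blocked by robots.txt'}")
```

If a page you want ranked comes back as blocked, that is exactly the kind of conflict this audit is meant to catch.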

technical tool for SEO performance for a robots.txt file

Finally, keep in mind that not all search engine crawlers are created equal. There is a good chance that disallowed pages will still be crawled, but it is unlikely they will be indexed. If the robots.txt file marks certain URLs as off-limits, you can generally trust that the content at those URLs won't be counted as thin or duplicate content when search engines assess your site.

Tyler Tafelsky is a Senior SEO Specialist at Captivate Search Marketing, based in Atlanta, Georgia. Tyler has been active in the industry since 2009 and offers extensive experience in the search marketing profession, including technical SEO, content strategy and PPC advertising.
