Often we hear how daunting SEO can be, even for a senior marketing manager. SEO seems like such an abstract concept because there are so many layers and factors that can affect performance. It's not as simple as pay-per-click, where you can literally pay for higher placement if you're not seeing the results you expected (okay, it's not quite that simple, but you know what we're getting at).
One of the ways we like to introduce business owners to SEO is by presenting the service as three (equally important) pillars that will move the needle to grow the business. These three pillars are Authority, Content, and our favorite: Technical SEO, which is frequently referred to as on-site SEO.
On-Site Search Engine Optimization includes any task that is completed within the framework of your actual website. This also includes content optimization like the implementation of focus keywords or title tag updates. Read on to learn how easy it is to perform a technical SEO audit.
It is a long-standing digital marketing myth that meta descriptions are a ranking factor, but the truth is, they really only influence a user's decision to click through to the website from the SERP.
Quick Technical SEO Audit Tasks That Will Help You Rank on Google
There are specific areas that our strategists focus on when analyzing a website's HTML. Not to be a stereotypical SEO, but when it comes to prioritizing the most important tactics: "it depends."
Whether your website operates as an affiliate marketing opportunity or an e-commerce sales driver, technical SEO tactics can make or break your visibility online. Our SEO strategists rely on these quick but effective tactics to outrank the competition and help grow our clients' businesses.
Run a Website Crawl
Even if your website doesn't have many pages, it is still imperative to frequently have crawlers (also referred to as robots or bots) crawl your website to avoid a drop in traffic or visibility due to technical issues and crawl errors.
Use crawling tools that simulate how search engine bots like Googlebot and Bingbot see your site to find errors including dead pages, broken links, bad indexing, hacking, server accessibility issues, and downtime. Links break, and images and pages get removed, which is why we crawl and review the website every week.
Tools that we like for this tactic are Screaming Frog, SEMRush, and DeepCrawl.
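Under the hood, every crawler starts with the same basic step: fetching a page's HTML and extracting the links on it so they can be followed and checked. Here's a minimal sketch of that link-extraction step using only Python's standard library. To be clear, this is an illustrative toy, not how Screaming Frog or the other tools above actually work at scale:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag fed to the parser."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all anchor hrefs found in a raw HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links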
Index The Correct Version of Your Website
On occasion, website updates and server changes result in the wrong version of a website being served (HTTP instead of HTTPS), so it is important to manually verify that the correct version is being indexed. If both the HTTP and HTTPS versions are indexed, the URLs could be considered duplicate content. Search engines despise duplicate content, so you could potentially end up with a manual action.
Duplicate content issues aren't as widespread these days, as Google's bots get smarter about understanding the purpose of different content, but duplicates could still trigger a manual action if you do not properly run a site audit and review similar content.
Manual actions are penalties served by search engines like Google when their webmaster guidelines are violated. They cause websites to lose placement on the search engine results pages.
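One quick way to spot scheme and subdomain duplicates is to normalize every indexed URL to a single canonical form and look for collisions. The sketch below assumes (as a simplification) that the HTTPS, non-www version is your canonical one; swap that policy for whatever your site actually uses:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Force the https scheme and strip a leading 'www.' so that
    http/https and www/non-www variants collapse to one URL."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return urlunsplit(("https", host, parts.path or "/", parts.query, ""))

def duplicate_scheme_variants(indexed_urls):
    """Group indexed URLs by canonical form; any group with 2+ entries
    is a duplicate-content candidate to clean up with 301 redirects."""
    groups = {}
    for url in indexed_urls:
        groups.setdefault(canonicalize(url), []).append(url)
    return {canon: urls for canon, urls in groups.items() if len(urls) > 1}
```

Feed this the URLs reported as indexed (for example, from a `site:` search export or a crawl) and fix each flagged group with a permanent redirect to the canonical version.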
Make The Most Out of Your Crawl Budget
Crawler bots are busy. There are over 1 billion websites online right now. These robots are expected to crawl every URL on the internet so don’t waste their time with unnecessary crawls of useless pages.
Make your important pages indexable (make sure they aren't accidentally blocked by noindex tags or robots.txt), deindex PPC landing pages, and keep your number of pages to a minimum by reorganizing similar content into what we like to refer to as "Power Pages", i.e. long-form, authoritative, comprehensive, and informational content.
Use noindex tags on pages that are less important so Google's bots don't use up your crawl budget on content that will never reach the first page of Google, or on pages that shouldn't even be found via organic search (think: your privacy policy or contact us page).
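If your templates are generated programmatically, you can encode the indexing policy above as a simple lookup so it's applied consistently. The page types below are hypothetical examples matching the advice in this section; adjust the set for your own site:

```python
# Page types we choose to keep out of the index (illustrative policy,
# mirroring the advice above -- tune this for your own site).
NOINDEX_TYPES = {"ppc-landing", "privacy", "contact", "thank-you"}

def robots_meta(page_type):
    """Return the robots meta tag a template of this type should emit."""
    content = "noindex, follow" if page_type in NOINDEX_TYPES else "index, follow"
    return f'<meta name="robots" content="{content}">'
```

Note the `follow` directive is kept even on noindexed pages so crawlers can still pass through any links on them.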
Track Keyword Rankings, Impressions, Queries
Analyze trends in rankings, impressions, and queries via Google Search Console to identify optimization opportunities and indexation issues. If you’re tracking your keywords and noting any substantial changes, then you’d be able to tell if/when an indexation issue arises.
Did a keyword that you typically rank highly for drop off within the search engine results pages (SERPs)? Chances are you have an indexation issue, or you could potentially be a victim of one of Google's algorithm updates.
A proper SEO strategy cannot be judged without tracking organic search performance. There are several keyword-ranking and audit tools available for this type of tracking. At Digital Strike Targeted Marketing, we like SEMRush.com, Cocolyze.com, and Ahrefs.com. And don't forget these are great tools for keyword research as well.
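Once you're tracking positions over time, the "substantial change" check described above is easy to automate. Here's a hedged sketch that compares two snapshots of keyword positions (however you export them, from Search Console or a rank tracker) and flags the drops worth investigating; the five-spot threshold is an arbitrary example:

```python
def flag_ranking_drops(previous, current, threshold=5):
    """Compare two snapshots of keyword -> average position (lower is
    better) and flag keywords that dropped at least `threshold` spots
    or stopped ranking entirely -- a cue to check indexation."""
    flagged = {}
    for keyword, old_pos in previous.items():
        new_pos = current.get(keyword)
        if new_pos is None:
            flagged[keyword] = "no longer ranking"
        elif new_pos - old_pos >= threshold:
            flagged[keyword] = f"dropped from {old_pos} to {new_pos}"
    return flagged
```

Running this against weekly exports gives you an early-warning list instead of discovering a drop months later in a traffic report.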
Evaluate Your Organic Search Traffic Trends
Throughout the month, you should be evaluating traffic trends in Google Analytics to analyze overall search engine performance from a traffic volume standpoint. If there are specific web pages that are seeing an increase in traffic (specifically from organic search), it should go without saying that you’re doing something right.
If you're not seeing the volume you expected, check for technical SEO issues on important pages first. And if you're not the webmaster, your company should consult a digital marketing agency that specializes in SEO.
Improve Internal Linking
Ensure web crawlers know which page is the authority on a given topic, and improve that authority through internal linking opportunities. Many SEO agencies fail with link building campaigns by not considering the importance of internal links and proper anchor text usage.
If you want a particular page to rank highly in the search engine rankings, you need to mold its authority by strategically placing internal links that point from other highly authoritative pages on your website to similar topical content. Boosting your homepage's authority is a great first step for increasing organic traffic.
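A simple way to audit internal linking is to build a map of which pages link to which, then count inbound links per page. Pages you care about that have few inbound internal links are your first candidates for new links from strong pages. A minimal sketch, assuming you already have the site's link graph from a crawl:

```python
def inbound_internal_links(link_graph):
    """Given a mapping of page -> list of pages it links to, count how
    many internal links point at each page (self-links excluded)."""
    counts = {page: 0 for page in link_graph}
    for source, targets in link_graph.items():
        for target in targets:
            if target in counts and target != source:
                counts[target] += 1
    return counts
```

Sorting the result ascending surfaces under-linked pages; cross-reference that list against the pages you most want to rank.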
Update Page Title, Meta Descriptions, and H1 Tags
When the opportunity arises, improve clickability via Title Tags and Meta Descriptions while improving keyword importance via H1 Tag recommendations.
The days of seeing huge traffic increases just from updating metadata are long gone, but on-page SEO updates can still make or break a website's ability to rank.
Keep in mind, meta descriptions are not a factor that Google looks at for rankings, but by testing different verbiage within your metadata, you can find the right mix that increases organic search traffic by using persuasive copy to entice users to click through to your website.
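Before testing verbiage, it helps to catch the basic failures: missing metadata and text long enough to be truncated on the SERP. Google's actual limits are pixel-based, so the character counts below are rough, commonly cited heuristics, not exact rules:

```python
# Rough character heuristics approximating Google's pixel-based
# truncation limits -- treat the exact numbers as approximate.
TITLE_MAX = 60
META_DESC_MAX = 155

def metadata_issues(title, meta_description):
    """Flag metadata likely to be truncated or left empty on the SERP."""
    issues = []
    if not title:
        issues.append("missing title")
    elif len(title) > TITLE_MAX:
        issues.append(f"title may truncate ({len(title)} chars)")
    if not meta_description:
        issues.append("missing meta description")
    elif len(meta_description) > META_DESC_MAX:
        issues.append(f"description may truncate ({len(meta_description)} chars)")
    return issues
```

Run this over every page's metadata from a crawl export and fix the flagged pages first; then A/B test wording on the pages that matter most.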
Site Speed / Page Load Time Tests
It should come as no surprise that website speed is considered an important ranking factor. A slow page load typically signals a poor user experience.
There are several tools across the internet that will test site speed and provide a list of opportunities to improve. We prefer to use Google-produced tools such as PageSpeed Insights or Test My Site.
You should evaluate load times both independently and against the competition to determine if recommendations for improving speeds are necessary, or something you should prioritize.
Optimize User Experience
Aside from website page speed, there are several other factors that search engines note as ranking factors, such as mobile-friendliness, usability, accessibility, and efficiency.
In an effort to make this as clear as possible, keep it simple and you will be rewarded with rankings. Google provides several tools to increase your scoring on the Core Web Vitals report which will tell you if your website provides a good or poor user experience.
The better user experience, the higher your website will rank on the SERPs.
In May 2020, Google announced Core Web Vitals as a new way to judge the user-friendliness of a website. Due to the global pandemic, Google pushed the rollout of these changes to 2021, with the very polite announcement that it would give at least six months' warning before these metrics affect your visibility in the SERPs.
This report pulls data from the CrUX report (Chrome User Experience Report) which considers field data based on real-world Chrome users’ behavior and experience.
“Core Web Vitals are a set of real-world, user-centered metrics that quantify key aspects of the user experience. They measure dimensions of web usability such as load time, interactivity, and the stability of content as it loads (so you don’t accidentally tap that button when it shifts under your finger – how annoying!).”- Google Webmasters Blog, 5/2020.
The Core Web Vitals report ranks your website’s status as Good, Needs Improvement, or Poor for the following metrics:
- Largest Contentful Paint (LCP) – the time from when the user requests the page (clicks the link or types in the URL) until the largest content element is rendered on screen. This is typically a hero image or video on the homepage. Google recommends your page's largest content show up in less than 2.5 seconds to be in good standing.
- First Input Delay (FID) – the time between a user's first interaction with your page (e.g. clicking a link or tapping a button) and the moment the browser is able to begin processing that interaction. Google recommends less than 100 milliseconds.
- Cumulative Layout Shift (CLS) – a measure of how much the visible content unexpectedly shifts around while the page loads. Think of a button that jumps just as you go to tap it.
If a page's layout is shifting while a user is trying to interact with it, that is considered a poor user experience. Google scores the total amount of shifting, with higher scores meaning more movement, and recommends a score of less than 0.1.
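The Good / Needs Improvement / Poor buckets can be expressed directly as the thresholds Google publishes for each metric (LCP in seconds, FID in milliseconds, CLS as a unitless score), which makes it easy to classify your own field data:

```python
# Google's published Good / Needs Improvement / Poor thresholds for
# the three Core Web Vitals: LCP in seconds, FID in milliseconds,
# CLS as a unitless layout-shift score.
THRESHOLDS = {
    "LCP": (2.5, 4.0),
    "FID": (100, 300),
    "CLS": (0.1, 0.25),
}

def rate_vital(metric, value):
    """Classify a measured value into Google's three CWV buckets."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= poor:
        return "Needs Improvement"
    return "Poor"
```

This mirrors how the Core Web Vitals report buckets your pages, so you can apply the same labels to raw measurements from lab or field tools.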
How To Prioritize Technical SEO Audit Action Items
Technical SEO requires several tools and a lot of patience to comb through every page and every line of code on your website. And to be honest, it’s complicated.
This means you will have to pay not only for the tools and time, but also for the knowledge and years of experience the SEO has put in to quickly assess your website or troubleshoot issues.
Age of Website
So frequently, we hear new clients complain about our initial suggestion: "time for a new website". We promise we're not just saying that to sell you a new website; that's slimy, and that's not Digital Strike.
But the fact of the matter is that websites these days are different. Technology has changed. You’d be surprised how starting over can improve your current website’s performance.
How much time do you have before a big launch? Are you testing a new product? Do you have the months available to allow for implementation and testing? Typically, technical updates can be quick fixes and you can start seeing results within the next one to three months, but (yeah, we're gonna say it) it depends.