SEO Best Practices and Useful Tools

SEO Is an Ever-Evolving Discipline: Here Are Some SEO Best Practices That Continue to Be Relevant, Along with Some Useful Resources

Of all the digital marketing disciplines that exist today, none seem to quite hold the aura of mystery that surrounds Search Engine Optimization, or SEO. This is no doubt due in part to its ever-changing nature — a widely accepted tactic that worked seamlessly to drive a webpage up the Google search results only a year ago might have already lost its edge, while some tactics not yet even in the digital marketing mainstream are already proving themselves in practical application.

Despite this aura of mystery, SEO remains, at its core, a knowable process — and, of course, an ongoing one — that can be mastered with a flexible approach and an open mind. Let’s take a look at some of the best SEO practices that continue to prove their efficacy and are likely to do so for some time to come. These are certainly worth focusing on; after all, the appeal of free, organic traffic isn’t likely to lose its luster anytime soon.

Strive for Topical Authority — But Keep Relevance in Mind

According to Search Engine Journal, topical authority “is the perceived authority (a website) has over a subject or area of a subject.” Most other SEO resources view the term in a similar fashion. Moz, the well-known maker of SEO and data management tools, uses Domain Authority as its barometer, while SEMRush employs an Authority Score to gauge roughly the same thing: how well a given website has established itself as the “go-to” source on a given topic.

There seems to be an increasing emphasis on topical authority in securing a high ranking on the results pages, and that’s a good thing. As it has evolved over the years, Google has consistently placed a premium on a website’s longevity, provided the site in question continues to add new and relevant content, and the topical authority paradigm incentivizes a website to maximize the helpful, informative experience it provides to its visitors. At the same time, it disincentivizes churning out sheer volumes of content presented in an illogical, unhelpful manner that doesn’t conform to logical search intent, something that has at times been achieved through less-than-above-board SEO practices.

In striving for topical authority, providing a comprehensive array of content on a given topic is essential, but attempting a jack-of-all-trades approach that’s centered on sheer volume usually won’t work. For example, a CPA firm targeting local, medium-sized businesses won’t be well-served by generating content about the history of accounting — such a direction won’t match its core prospects’ search intent and will even seem incongruous when positioned alongside more relevant posts about allowable deductions and tax minimization strategy.


When Doing Keyword Research, Volume Shouldn’t Be the Primary Objective

Trying to rank for keywords based solely on their search volume is often a fool’s errand, considering how much competition there usually is at the upper end of the spectrum, and even if you do ascend to the top of the search results heap, it might not yield the qualified traffic you’re after.

Instead, it makes a lot more sense to strategize keywords based on buyer intent. In other words, take some time to put yourself in the head of your prospect and consider the search terms they might use when researching their challenges in your niche. The goal should be to provide helpful content for prospects at various stages of the buying process, from the prospect who might be seeking out general information at the very start of their journey, to the more informed prospect who has already done due diligence and is looking to distill the compiled information down to a level where a purchase, or other definitive decision, can be made.

The buyer journey is a dynamic process, and so, too, must be the approach to providing helpful information that will ultimately be rewarded by Google. For example, a prospect at the very start of figuring out how best to outfit their garage gym and pandemic-proof their workout routine might search for “garage gym essentials”, which will likely yield a variety of space-efficient fitness equipment designed to maximize the often-limited square footage that’s endemic to garage gyms.

This is as it should be, since a prospect using such a search term probably isn’t going to be all that concerned with head-to-head equipment comparisons, or the pros and cons of a given product, just yet. At this stage, an SEO-savvy equipment retailer can provide helpful content centered on the most useful and space-efficient components and, if that retailer carries an extensive product line, can appeal to a number of buyer personas by creating some content focused on hardcore CrossFit fanatics and other content for less intense prospects who might just be looking for equipment that will help them get — or stay — in shape.

However, after a certain amount of research, that prospect will have moved further down the marketing funnel while also gaining knowledge that will prompt them to undertake increasingly more refined searches, such as “the best functional trainer for a home gym”. Creating content that will rank for searches of this sort will require a different, more specific approach and should be focused on articles elaborating on the benefits of the various functional trainers included in that retailer’s product line.

Focus on the User Experience

Back in June of 2021, Google released an algorithm update that put a higher premium on the user experience, including favoring pages that had a fast load time and were mobile friendly. This emphasis isn’t likely to change anytime soon — internet users overall expect ever more rapid results, and a greater percentage of searches are being made via mobile with every passing year.

Performing a website audit to determine whether any unnecessarily large or uncompressed images are being used can work wonders here, since slow page load times are often attributable to overly large image files. Also, while the vast majority of websites are hosted by companies that employ state-of-the-art, rapidly responding servers, some aren’t; the hosting provider can, at times, be the culprit where slow page load times are a problem.
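
To illustrate, here is a minimal sketch of that kind of image audit in Python. It simply walks a local assets folder and flags image files above a size budget; the folder path, size threshold, and extension list are placeholder assumptions to adapt to your own site, and it’s a starting point rather than a substitute for a full performance audit.

```python
import os

# Placeholder values for illustration; point this at your own site's assets
# directory and pick whatever file-size budget makes sense for your pages.
ASSETS_DIR = "public/images"
MAX_BYTES = 300 * 1024  # flag anything over roughly 300 KB
IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif", ".webp"}

def find_oversized_images(root: str, max_bytes: int) -> list[tuple[str, int]]:
    """Return (path, size) pairs for image files larger than max_bytes."""
    oversized = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in IMAGE_EXTS:
                path = os.path.join(dirpath, name)
                size = os.path.getsize(path)
                if size > max_bytes:
                    oversized.append((path, size))
    return sorted(oversized, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    for path, size in find_oversized_images(ASSETS_DIR, MAX_BYTES):
        print(f"{size / 1024:.0f} KB  {path}")
```

Anything the script flags is a candidate for compression or for conversion to a more efficient format such as WebP.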

Also, many website owners and administrators spend a lot of time attempting to minimize their bounce rate — the percentage of visitors who arrive at a site and then depart without viewing additional pages. Not only is this not an important factor where SEO is concerned, but it’s also an imperfect metric for determining whether a given page is satisfying visitor demands. For example, a page may be so informative that the visitor sees no need to navigate elsewhere on the site. A website’s bounce rate also depends largely on its purpose: on average, blog sites have bounce rates between 70 and 90%, while service sites trend much lower.

There are indications that, in the coming years, Google will put an increasing emphasis on user satisfaction when determining its rankings. This will include metrics that focus on engagement. Among them are considerations that monitor whether or not visitors are clicking through to the links provided on the page, as well as factoring in time spent on the page. This is also a welcome development, as these aren’t aspects of a webpage that lend themselves to artificial manipulation — they represent the antithesis of clickbait.

For content providers, this means spending more time and effort perfecting article titles and meta descriptions, so that visitors arrive knowing exactly what the content will offer them. Making sure the meta description includes an appreciable number of directly relevant keywords will also help here.
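
As one way to keep an eye on this, here is a rough sketch that pulls a page’s title and meta description and flags anything missing or overly long. It assumes the third-party requests and beautifulsoup4 packages are installed, the URL is a placeholder, and the character limits are common rules of thumb rather than official cutoffs.

```python
import requests
from bs4 import BeautifulSoup

# Rules of thumb only; search engines truncate snippets by pixel width,
# so these character counts are rough guides, not hard limits.
TITLE_MAX = 60
DESCRIPTION_MAX = 155

def audit_snippet(url: str) -> None:
    """Flag a missing or overly long title and meta description for one page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta["content"].strip() if meta and meta.get("content") else ""

    if not title:
        print("Missing <title>")
    elif len(title) > TITLE_MAX:
        print(f"Title is {len(title)} chars (aim for ~{TITLE_MAX} or fewer)")

    if not description:
        print("Missing meta description")
    elif len(description) > DESCRIPTION_MAX:
        print(f"Description is {len(description)} chars (aim for ~{DESCRIPTION_MAX} or fewer)")

if __name__ == "__main__":
    audit_snippet("https://example.com/")  # placeholder URL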

There are also instances where a webpage may prematurely drive away traffic despite providing stellar content. Aesthetics matter when maximizing the amount of time a visitor spends on a page. Making an article as thorough as possible shouldn’t come at the expense of the white space that makes it easier to follow, nor should off-the-wall fonts be used just for the sake of differentiating that article from other available content.

There’s a reason why you won’t find an endless array of fonts being used in top-performing content — a lot of fonts are just too difficult to follow for any length of time.

Make Sure Your Existing Content and Site Structure Are as Good as They Can Be

More than a few creators generate new content shaped by what the competition is currently ranking well for and little else. This is to be expected, but far too much of it simply emulates what’s already out there without improving upon it, offering a different perspective, or providing a more immersive overall experience.

Generating new content is, beyond question, vitally important — Google will eventually penalize a site whose content remains dormant over an extended period. But putting in the extra effort to make sure existing content is as good as it can be will often help with rankings. This might include taking a second look to make sure that each page receives an appropriate number of internal links from other pages on the site, that the heading hierarchy is properly organized and doesn’t include duplicate H1 tags, and that meta descriptions are as good — and accurate — as they can be.
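
For the heading-hierarchy piece in particular, a short script can catch duplicate H1 tags and skipped heading levels before they accumulate. The sketch below is one illustrative approach, again assuming requests and beautifulsoup4 are installed and using a placeholder URL.

```python
import requests
from bs4 import BeautifulSoup

def audit_headings(url: str) -> None:
    """Flag duplicate H1 tags and skipped heading levels on a single page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    headings = soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])
    levels = [int(tag.name[1]) for tag in headings]

    h1_count = levels.count(1)
    if h1_count != 1:
        print(f"{url}: found {h1_count} H1 tags (expected exactly one)")

    # A jump from, say, H2 straight to H4 suggests a disorganized hierarchy.
    for previous, current in zip(levels, levels[1:]):
        if current > previous + 1:
            print(f"{url}: heading level jumps from H{previous} to H{current}")

if __name__ == "__main__":
    audit_headings("https://example.com/blog/some-post/")  # placeholder URL
```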

And, of course, topical relevance shouldn’t be overlooked. Does the content consistently stay on point, and provide useful information centered on its intended niche, or does some of it seem superfluous and not have an obvious purpose, other than to add to the sheer number of pages on the site?

Google has rapidly evolved over the years, so it’s easy to lose sight of any fallibilities it might have. One of these is its continued inability — or unwillingness, depending on perspective — to delve deeply into a site’s structure to rank what would otherwise be useful content that might be buried deep within its confines. Google has, on a number of occasions, gone on record saying that the site structure level at which a given piece of content resides isn’t a ranking factor in and of itself, but even as recently as early this year, Databox reported that Google’s spiders — the instruments it uses to crawl a site and determine its content — require an easily navigated, user-friendly site structure to perform at their best.

Also, over time a website will almost always expand, and this can lead to two or more of its pages vying for position in the rankings because both have been unintentionally optimized for the same, or nearly the same, keyword phrases, a problem often called keyword cannibalization. In essence, these pages can siphon “SEO power” and the resulting traffic from each other. Making it a habit to review webpage content at regular intervals, then adjusting the keyword phrases being targeted, can go a long way toward alleviating this problem — so can making sure that URL structures aren’t so similar that the pages interfere with each other where traffic is concerned.
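
One lightweight way to surface candidates for this kind of overlap is to compare the keyword sets each page targets and flag pairs that share too much. The sketch below uses a hypothetical keyword map and an arbitrary overlap threshold purely for illustration; a real audit would also look at which queries each page actually ranks for.

```python
from itertools import combinations

# Hypothetical mapping of URLs to the keyword phrases each page targets;
# in practice this might come from a spreadsheet or your CMS.
page_keywords = {
    "/garage-gym-essentials/": {"garage", "gym", "essentials", "equipment"},
    "/home-gym-equipment/": {"home", "gym", "equipment", "essentials"},
    "/best-functional-trainer-home-gym/": {"functional", "trainer", "home", "gym"},
}

OVERLAP_THRESHOLD = 0.5  # flag pairs sharing at least half of their keywords

def jaccard(a: set[str], b: set[str]) -> float:
    """Proportion of keywords two pages have in common."""
    return len(a & b) / len(a | b)

for (url_a, kw_a), (url_b, kw_b) in combinations(page_keywords.items(), 2):
    overlap = jaccard(kw_a, kw_b)
    if overlap >= OVERLAP_THRESHOLD:
        print(f"Possible cannibalization ({overlap:.0%} overlap): {url_a} vs {url_b}")
```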

Site Security is an Absolute Must

There was once a time when an SSL (Secure Sockets Layer) certificate and the https prefix that comes with it were seen as nice to have, but not a necessity, unless financial transactions were taking place on the site. That time has passed. The SSL protocol (implemented today as its successor, TLS) creates an encrypted link between the web server and the web browser and makes visitors feel safer when arriving at a site — any site. While Google announced that it has removed its Safe Browsing standard as a ranking factor, the consensus is that there is still a correlation between a site having an SSL certificate and enjoying higher rankings.
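
Verifying that a site presents a valid certificate, and seeing how close it is to expiring, is easy to script. The sketch below uses only Python’s standard library to open a TLS connection and report the certificate’s expiry date; the hostname is a placeholder, and an invalid or expired certificate will cause the handshake to raise an error rather than print a result.

```python
import socket
import ssl
from datetime import datetime, timezone

def check_certificate(hostname: str, port: int = 443) -> None:
    """Connect over TLS and report when the site's certificate expires."""
    context = ssl.create_default_context()  # verifies the chain and hostname
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()

    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
    )
    days_left = (expires - datetime.now(timezone.utc)).days
    print(f"{hostname}: certificate valid, expires {expires:%Y-%m-%d} ({days_left} days)")

if __name__ == "__main__":
    check_certificate("example.com")  # placeholder hostname
```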

Even relatively well-maintained sites can be susceptible to malware. Sites are often vulnerable when third-party plug-ins are installed and not consistently updated — it doesn’t take much time for an outdated plug-in to become an open door for bad actors. Regularly logging in to the hosting account to update plug-ins and run malware scans isn’t an absolute guarantee that a site won’t ever be infected, but it lessens the odds substantially.

Also, because a lot of malware is designed to divert SEO influence to illegitimate sites, the longer malware is allowed to fester, the more damage it can cause.

Pop-Ups May Have Their Place, But . . .

Marketers have long used pop-ups to encourage conversions — most often in the form of email list signups — and while they have their uses, they should be deployed with plenty of restraint on mobile sites. Google punishes mobile sites that feature pop-ups it deems intrusive and has been doing so for about five years now.

What’s Google’s idea of “intrusive”? It’s pretty broad, actually. If a pop-up covers so much territory that it forces the visitor to close it to consume content, it’s considered intrusive. The barometer for this intrusion is generally considered to be anything more than 15% of the viewable screen territory — that’s not much real estate on most mobile devices.

There are some workarounds. Setting a delay for a pop-up to appear will usually diminish most of its negative SEO consequences, and Google doesn’t currently penalize exit intent pop-ups at all.


Backlinks Matter . . .  A Lot

Backlinks are links from outside domains to pages on a given site and, when attained honestly, they confer authority on the content they point to. They’re an often-overlooked aspect of SEO, probably because backlinks are generally tedious and labor-intensive to secure. Yet they’re vitally important — so much so that in most cases a webpage with a substantial number of legitimate backlinks from authoritative sites can outrank a competitor that has seemingly more direct relevance to a search query.

Performing proper competitive analysis to determine which sites are linking to competitors is the first step toward harnessing the power of backlinks. From there, committing to an outreach program to relevant sites — usually through email — to get those sites to link to desired content can substantially boost content rankings, especially when those sites have a high authority score of their own.

Submit a Sitemap

On the whole, Google is able to accurately crawl and assess most properly constructed sites, but submitting a sitemap to Google Search Console removes any guesswork in this area. And while Google represents the vast majority of search traffic, doing the same thing with Bing Webmaster Tools is also a good idea.
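
A sitemap is just an XML file that follows the sitemaps.org protocol, and generating one can be as simple as the sketch below, which writes a sitemap.xml from a hard-coded list of placeholder URLs; in practice the list would come from a CMS or a site crawl.

```python
import xml.etree.ElementTree as ET
from datetime import date

# Placeholder URLs; in practice these would be pulled from a CMS or a crawl.
pages = [
    "https://example.com/",
    "https://example.com/blog/garage-gym-essentials/",
    "https://example.com/blog/best-functional-trainer-home-gym/",
]

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls: list[str], path: str = "sitemap.xml") -> None:
    """Write a minimal sitemap.xml containing the given URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = date.today().isoformat()
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    build_sitemap(pages)
```

The resulting file can then be submitted in Google Search Console and Bing Webmaster Tools, and referenced from robots.txt with a Sitemap: line so that crawlers can find it on their own.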

SEO Tools

Developing and maintaining an effective SEO strategy doesn’t happen by accident. What we might be convinced are valuable keyword phrases may in reality be used less often than expected and might not even signify actual buyer intent. Researching what prospects are searching for and then delivering content that solves their challenges is a far better way to go.

Here are just a few free and freemium resources that can help with that endeavor. There are dozens more out there to choose from, but the tools on this list will provide invaluable assistance and insight for SEO strategy.

Google Auto-Fill, Google “People Also Ask” and More

Entering the beginning of a keyword phrase and seeing how Google finishes it may seem pretty low-tech, but it’s effective. After all, Google bases its auto-fill predictions on real searches that closely match what has been typed so far.
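
Those predictions can even be collected programmatically. The sketch below queries the unofficial suggest endpoint that many keyword tools rely on; because it’s undocumented it may change, be rate-limited, or return region-specific results, so treat it as an exploratory aid rather than a stable API. It assumes the requests package is installed.

```python
import requests

def autocomplete_suggestions(seed: str) -> list[str]:
    """Fetch Google's autocomplete suggestions for a seed phrase.

    Uses an unofficial, undocumented suggest endpoint; it may change or be
    rate-limited at any time, so results should be treated as indicative only.
    """
    response = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()[1]

if __name__ == "__main__":
    for suggestion in autocomplete_suggestions("garage gym "):
        print(suggestion)
```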

The “People Also Ask” box, along with the related searches Google places at the bottom of the results page, provides an often-comprehensive list of related queries to gear future SEO efforts toward.

Google Trends and Keyword Planner are also valuable free resources.


Answer the Public

Entering search terms on this site can unearth a world of possibilities, including related search queries and their variations. As an added bonus, Answer the Public’s animated home page is borderline transfixing and worth checking out on its own. 

Screaming Frog

Rather than assisting with keyword research, UK-based Screaming Frog unearths internal issues that might negatively affect a site’s SEO standing, including orphan pages that no other internal pages link to, pages that load particularly slowly (usually due to bloated images), missing or duplicated heading tags, and ineffective meta titles and descriptions.

It’s free for sites with fewer than 500 URLs to crawl. Past that threshold, Screaming Frog’s paid version is required, and it also includes advanced features such as the ability to save audits.

Ubersuggest

Created by renowned digital marketer Neil Patel, Ubersuggest is a great resource for both keyword suggestions and competitive analysis. It will provide an often-comprehensive list of websites that are currently linking to the competition, the relative authority of those websites, along with particular pieces of content that have generated substantial engagement. Ubersuggest’s free version does limit the number of searches you can run in a given period, whereas its paid counterparts don’t.

SEMRush

The paid version of SEMRush is considered a top-level SEO resource, but there is a free version as well. As you’d expect, it has its limits on the number of pages that can be crawled in a single audit, among other things, but by SEMRush’s own account it’s a good fit for freelancers with relatively modest marketing analytics needs.


The SEO universe is rapidly changing, so it lends itself to an experimental approach. We’re curious, what direction — or directions — do you think it will take in the future? We’d love to hear from you.
