Learn about fundamental concepts and how to leverage the power of SEO to get more traffic, leads, and sales.
Think of the Internet as a spider-web of interlinked websites and webpages. Search engines follow all of the links that exist between different websites to form their view of the web. This is called “crawling.” During their crawl, search engines analyze over 200 factors to develop a holistic view of every website they encounter. When a user performs a search, they receive hundreds of thousands of results to their query in the form of Search Engine Results Pages (SERPs).
SEO is the combination of on-page optimizations (optimizing website code, page content, information hierarchy, and meta data) and off-page optimizations (acquiring clout through publicity) that improve a website's keyword ranking positions in a search engine's index. This is achieved by understanding and leveraging what we know about the way search engines, like Google, crawl and index the web.
Optimizing website code, on-page content, information hierarchy, and meta data improves the way website users and search engines understand and interact with a website. Ensuring the website is optimized is the first step towards increasing keyword rankings and organic website sessions.
The website’s information architecture, the hierarchy of webpages in relation to the overall site, as well as their interrelatedness, is expressed through URL structuring and breadcrumbs. Having a clear and simple URL structure is essential in creating an environment for both users and search engines that is easy to understand.
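For illustration, a hypothetical example.com site might express its hierarchy like this, with the URL path and the breadcrumb trail telling the same story:

https://www.example.com/services/seo/local-seo/
Home > Services > SEO > Local SEO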
Search engines crawl and index the Internet using automated scripts, often referred to as search engine robots or spiders. Unlike humans, these spiders crawl the web by following every link they encounter, within sites and between sites. They then use this interlinking information to form a map of the Internet and the relationships between sites. A mini-map is also formed for the pages within a site. The most authoritative websites are those with a relatively large number of links pointing towards them.
As pages with higher authority across the Internet link out to smaller sites, those smaller sites inherit some of that authority. This is because the link is passing “equity.” Links have always represented the single most important subset of "off-site elements" in terms of gaining rankings. The more linked-to a page becomes, the more authority it garners. Interlinking is a way a webmaster can redistribute the link equity a particular page is receiving. Thus, the equity a homepage has received can be directed to deeper, value-generating pages, enabling those profit pages to rank.
Search engines are influenced by a brand's clout, the online engagement around it, and the quality of those conversations. This engagement can greatly impact a website's visibility in organic search. In addition, the sentiment of the conversations, positive or negative, can also affect what appears in search for specific queries. Off-page optimization leverages brand communities, influencers, thought leaders, and content publishers, in addition to their audiences, to build brand awareness and strengthen brand recall. By building the right kind of online engagement for a brand, its website can benefit from an increase in organic search performance.
SEO is important when launching a new website, migrating a website to a new host or CMS, targeting multiple languages or regions, or when it's important to forecast demand.
SEO has proven, measurable results. With web analytics, marketers are able to effectively and efficiently set and measure target KPIs. Compared to traditional media, SEO provides a relatively low-cost alternative for businesses to reach large audiences.
Given that search engines allow users to quickly find products and services they are interested in, companies are more effectively allocating their marketing budgets to a more qualified audience. Due to its characteristically low customer acquisition costs, SEO has one of the best returns on investment. This, coupled with the fact that there aren’t any associated cost-per-click fees, means that SEO can dramatically reduce a business' customer acquisition costs.
The value proposition is a statement that summarizes why a consumer should buy a product or use a service. This statement should convince a potential consumer that one particular product, service, or brand will add more value or solve a problem better than other similar offerings. Keyword research, audience segmentation, and positioning will improve a marketer's understanding of consumer behaviour and will help her to formulate value propositions. If this information is used correctly, the number of website visitors that complete her specified, value-generating actions will improve.
A good SEO strategy has the ability to increase organic search visits to a website. In most cases, though, quality is better than quantity. The good news is that SEO can help with traffic relevancy too. When pages are optimized to target a more relevant audience by focusing on more targeted keywords, overall traffic may decrease as relevance improves. As visitors become more relevant, website conversion rates tend to improve.
Relying on a single source of traffic can make organizations sensitive to external market conditions that are outside of their control. The Internet is in a constant state of flux. Relying on social media referrals to drive visits to your website can result in dramatic drop-offs in traffic when websites like Facebook and Twitter alter their algorithms. Investing too heavily in pay-per-click can increase your customer acquisition costs, making it increasingly difficult to compete. These are just two examples of the risks associated with relying too heavily on one channel to generate value. Like diversifying your investment portfolio, increasing the number of channels that drive traffic to your website will reduce the risk that your organization will be harmed by an unpredictable marketing environment.
The interlinking between a website’s pages is an extremely important ranking factor for internal pages in Search. The following discussion includes the factors that should be considered when designing website navigation, and the resulting information architecture of a website, to ensure search engine crawlers can locate and index all important website pages.
Ensure that all drop-down menus, particularly the main navigational menu, contain crawlable HTML links in the source code. Ensure no-follow directives are removed and that the navigation functions in environments where JavaScript and Flash are disabled.
If a browser does not render a website’s drop-down menus when JavaScript is disabled, search engines may not be able to efficiently crawl the website.
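As a minimal sketch (the URLs and class name here are illustrative), a drop-down menu whose items remain plain HTML anchors stays crawlable even with JavaScript disabled:

<!-- Drop-down menu built from plain, crawlable anchor links -->
<nav>
  <ul>
    <li>
      <a href="/services/">Services</a>
      <ul class="dropdown">
        <li><a href="/services/seo/">SEO</a></li>
        <li><a href="/services/content-marketing/">Content Marketing</a></li>
      </ul>
    </li>
    <li><a href="/about/">About</a></li>
  </ul>
</nav>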
Use breadcrumbs to specify a page’s location in the website’s hierarchy. In addition, a single URL and breadcrumb path must be chosen and used for pages that appear under multiple categories in a navigational menu.
Search engines will use a website's breadcrumbs, along with its URL structure, navigational menus, and HTML sitemap, to better understand the website and its information architecture. Conflicting information creates confusion for users and search engine algorithms.
Implementing the appropriate structured data mark-up alongside breadcrumbs also helps search engines better present their results. This enriches the user experience and can increase a website's prominence in search results. This is particularly important for resource pages and blog posts.
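As a sketch, breadcrumbs can be marked up with schema.org's BreadcrumbList vocabulary in JSON-LD (the page names and example.com URLs are hypothetical):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://www.example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "What Is SEO?" }
  ]
}
</script>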
Always link to a single version of any given URL. When linking to internal pages, decide whether your website's URLs should use http or https, include www or not, and end with a trailing slash or not.
Those three choices produce eight unique URL variants. Using example.com for illustration, remember to choose just one:

http://example.com/page
http://example.com/page/
http://www.example.com/page
http://www.example.com/page/
https://example.com/page
https://example.com/page/
https://www.example.com/page
https://www.example.com/page/
Keep internal linking consistent so that search engine crawlers encounter as few redirects as possible, conserving your crawl budget.
Ensure that your website has an HTML sitemap that is dynamically generated, and place a link to it in the footer of the website. HTML sitemaps make it possible for all important, live pages on a website to be a single click away from your home page. The HTML sitemap also further communicates the website’s information architecture and the relationship between pages. The sitemap design should communicate a consistent message with the main navigational menu, as well as the website’s URL structure and breadcrumbs.
Best practices dictate that the sitemap should automatically update to include any newly created pages and their respective URLs. When a page “expires,” or is removed from the website for whatever reason, it should also be removed from the HTML sitemap.
The HTML sitemap also allows users to quickly see all important website pages in the event they are unable to navigate the site effectively. In addition, the HTML sitemap places all links to important pages one level away from any point of entry. This both facilitates website crawls and improves link equity distribution from the home page.
Note that the HTML sitemap is not to be confused with the XML sitemap that is submitted to Google through Google Search Console.
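A minimal sketch of what such a sitemap page might contain (the sections and URLs are illustrative):

<!-- Simplified HTML sitemap: every important page one click from here -->
<h1>Sitemap</h1>
<h2>Services</h2>
<ul>
  <li><a href="/services/seo/">Search Engine Optimization</a></li>
  <li><a href="/services/content-marketing/">Content Marketing</a></li>
</ul>
<h2>Blog</h2>
<ul>
  <li><a href="/blog/what-is-seo/">What Is SEO?</a></li>
</ul>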
As discussed, URL structure conveys important information to search engines like Google. It’s important that existing URLs, for any website, change as little as possible. This avoids losing any page authority and keeps header response code errors to a minimum.
When creating URLs, it's important to avoid duplicate content, prevent crawl and indexing issues, and keep them relevant to users.
Including important keywords in URL paths helps both users and search engines, but avoid creating excessively long URLs. Note that over-optimizing keywords in URLs can lead to a poor user experience and can negatively impact rankings.
Ensure links are up-to-date and return “200 OK” header response codes. Sending crawlers through multiple 301 redirects, or to too many non-existent pages that return a “404 Page Not Found” header response code, negatively impacts crawl budget and link equity distribution.
If a website does send a user to a “404 Page Not Found” page, it's important that the experience is as enjoyable as possible and that users can quickly find their way to another relevant page. For this, it is recommended to create a custom 404 page template that is well-structured, informative, and pleasing to the eye.
Search engines grab meta information from a page and serve it to their users in search results. Page Titles are the large, typically blue pieces of text in search results, and are used by search engine algorithms to rank pages for specific keywords.
Effectively written Meta Descriptions, shown as grey text on search result pages, entice searchers to click. You can view this as “marketing text,” or an “elevator pitch.” Well-written Meta Descriptions explain what the user will gain by visiting the page and what makes it unique or superior in value to the alternatives (the value proposition), and contain a call to action. Note that Meta Descriptions are not used by search engine algorithms to rank pages, and search engines may modify your Meta Description as they see fit.
Page Titles and Meta Descriptions are elements located in the <head> section of each webpage. The Page Title is especially important because it is the primary indicator of relevance to the search and it is the first thing that a user sees on a search engine results page (SERP).
Page Titles and Meta Descriptions should read naturally, entice the audience to click on your result, and effectively target the right keywords for the page.
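As a sketch (the agency name and copy are invented for illustration), both elements live in the <head> of the page:

<head>
  <title>Inbound Marketing Agency in Houston, TX | Example Agency</title>
  <meta name="description" content="Example Agency helps Houston businesses grow with data-driven inbound marketing. Get your free consultation today.">
</head>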
W3.org defines HTML Headings as follows:
“HTML defines six levels of headings. A heading element implies all of the font changes, paragraph breaks before and after, and any white space necessary to render the heading. The heading elements are H1, H2, H3, H4, H5, and H6 with H1 being the highest (or most important) level and H6 the least…Use the DIV element together with headings elements when you want to make the hierarchical structure of a document explicit.”
Headings should not be used merely to format text in a certain way. Their primary purpose is to structure the page's content hierarchically. H3, H4, H5, and H6 headings are not absolute requirements and should only be used where they make sense. Headings should be unique from page to page and reinforce the key themes of the page.
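For instance, a page like this one might be structured as follows (the indentation is added only to show nesting, and the headings themselves are hypothetical):

<h1>What Is SEO?</h1>
  <h2>On-Page Optimization</h2>
    <h3>Page Titles</h3>
    <h3>Meta Descriptions</h3>
  <h2>Off-Page Optimization</h2>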
An alt attribute should be assigned to each image rendered on the page. In the event the web browser cannot load the image, this alternative text will be displayed.
Alt attributes are used by search engine algorithms to analyze the relevance and importance of images to the page. These attributes also increase the likelihood that the image will appear in image-based search results.
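A minimal example (the file name and description are illustrative):

<img src="/images/houston-seo-team.jpg" alt="SEO team reviewing keyword rankings in the Houston office">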
Social sharing isn't only an important ranking factor; social has become a driver of brand awareness. Controlling the messages your business serves to social audiences is an important consideration here. By implementing the Open Graph protocol on your website, you ensure the consistency of your brand messaging across platforms that recognize it (Facebook, Twitter, and LinkedIn).
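A sketch of Open Graph tags for a blog post, placed in the <head> (all values are illustrative):

<meta property="og:type" content="article">
<meta property="og:title" content="What Is SEO?">
<meta property="og:description" content="Learn how search engines crawl, index, and rank the web.">
<meta property="og:url" content="https://www.example.com/blog/what-is-seo/">
<meta property="og:image" content="https://www.example.com/images/what-is-seo.jpg">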
Structured data mark-up, or schema, is a standard way to annotate content so that machines can understand it. When your website includes structured data mark-up, search engines like Google may use that data to index your content better. Schema can also lead to presenting your website in a more prominent way in search results, and better integrate it into Google’s increasingly rich search experience.
The accepted vocabulary for rich snippets can be found on schema.org. The mark-up can be implemented through Microdata, JSON-LD, or RDFa; JSON-LD is the recommended implementation.
Complete documentation for schema may be found here: https://schema.org/docs/documents.html
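As a sketch, JSON-LD mark-up for a blog article sits in a script tag on the page (the headline, publisher, and date are invented for illustration):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What Is SEO?",
  "author": { "@type": "Organization", "name": "Example Agency" },
  "datePublished": "2024-01-15"
}
</script>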
Links aside, arguably the most important ranking factor is website content. Sites that offer superior content to that of their competitors generally rank higher for competing terms. When developing content for your site, it's equally important to manage content duplication. Duplicate content sends a strong low-quality signal to search engines, but it isn't always intentional. In fact, content duplication (the same, or similar, content across multiple URLs on a single domain) is often created unintentionally.
To effectively manage duplicate content, it's important that a primary URL is defined for each unique piece of content (website page). The process of canonicalization is how we manage duplicate content across a website's URLs.
The canonicalization process directs search engines to the page that is officially recognized as true and genuine. It is typically done in two ways: with a canonical tag that points every duplicate URL to the primary version, or with a 301 redirect.
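A sketch of the first approach, assuming a hypothetical example.com page (the tag goes in the <head> of every duplicate or parameterized variant):

<!-- Point all duplicate variants of this content to the primary URL -->
<link rel="canonical" href="https://www.example.com/services/seo/">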
For website pages you do not want search engines to index (as is the case for pages or templates that are under development), you can use the meta noindex tag.
The meta noindex tag should be placed in the <head> section of every page you would like to keep out of a search engine’s index and looks like this:
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
Up to this point, duplicate content has been referred to as an on-site problem, where multiple pages (URLs) on the same website display the same or very similar content. The discussion will now turn to duplicate content across websites, which is equally important.
If your site’s content finds its way onto another website, and that website has more authority, it may outrank your website for related search terms. This is why, when advertising your business and its services, it’s important to create unique content for those advertisements. Taking or repurposing existing website content for use on another website can negatively impact the quality assessment of your website.
Another culprit of duplicate content is the trailing slash in URLs. If there isn't a way to avoid the duplicate content caused by trailing slashes, ensure that all internal links point to a single version (slashed or unslashed -- just pick one!). Once this is done, apply a 301 redirect from all non-trailing-slash URLs to trailing-slash URLs (or vice-versa).
An XML sitemap, not to be confused with the HTML sitemap discussed in an earlier section, should be periodically submitted to Google and Bing using Google Search Console and Bing Webmaster Tools.
These sitemaps give webmasters a way to list all of a website's crawlable URLs in a machine-readable format. The purpose of the XML sitemap is to suggest how the website should be crawled and indexed, and it helps search engines discover new content. XML sitemaps differ from traditional HTML sitemaps in that they are not a page on the website that users will access.
It's important to understand that XML sitemaps do not affect a page's rankings; rather, they affect its presence in the search engine index.
Multiple standards exist for producing an XML sitemap; however, Google recommends using the standard put forth by Sitemaps.org. More information about XML sitemaps can be found in the Google help center.
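A minimal sitemap following the Sitemaps.org protocol looks like this (the URL and date are illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services/seo/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>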
The following basic rules should be followed when creating and maintaining XML sitemaps: include only canonical, indexable URLs that return a “200 OK” response; update the sitemap as pages are added or removed; keep each file under 50,000 URLs and 50 MB (splitting into multiple sitemaps with a sitemap index file if necessary); and reference the sitemap's location in your robots.txt file.
Search engines now assess a page's load time and have incorporated the analysis into their algorithms. Load times should be reduced as much as possible by avoiding the use of large image files and heavy code.
There are two ways to render mobile-friendly websites without creating a separate mobile URL: responsive web design, where the same HTML adapts to the user's screen size, and dynamic serving, where the server returns different HTML depending on the requesting device.
Google strongly recommends that developers create a responsive website that adapts to the size of the user's screen. Developers can then signal to browsers that the website will adapt to all devices by including the following meta tag in the <head> section of all of its webpages:
<meta name="viewport" content="width=device-width, initial-scale=1">
Remember that search engines like Google need to experience your website the same way a user would. Serving different content to “Googlebot” (a practice known as cloaking) can lead to an algorithmic search engine penalty.
The web is becoming a more secure place, which is why Google has made a push to prioritize secure websites and content. This means that the HTTPS protocol is used as a ranking signal in its search algorithm.
Most webmasters know that HTTPS gives website users added security and privacy. What many people don't know is that referral data is stripped when traffic passes from an https site to a site that does not use https; this traffic shows up in your analytics as “direct” traffic. By switching to https, you regain that lost referral data in Google Analytics.
Google has also made it clear that HTTPS is a ranking factor, so switching to HTTPS may result in a modest “boost” in rankings.
As with everything, implementing HTTPS is not as straightforward as one might think. The first thing to keep in mind is that HTTPS may decrease page speed, which is why it is so important to optimize your website for speed first. In addition, all links, including canonical tags, image src links, and JavaScript references, will need to be changed over to HTTPS. If left unchanged, some browsers will trigger a “page is insecure” warning that may turn away many users. Finally, if a migration strategy is not well-planned, switching to HTTPS may increase the number of internal 301 redirects, which decreases crawl efficiency.
Visit the following resource pages to learn more about HTTPS:
Although often overlooked, word count is an important on-page consideration. Pages with low word counts do not give Google much to work with; when a page contains more substantive text, search engines typically understand its purpose better, and there is evidence suggesting they may spend more time crawling the website.
Google My Business is an integral part of Local SEO and should be treated as such. The Google My Business panel appears to the right of a search results page when you type in a business-related query. If your business relies on local clients, you need to optimize your Google My Business listing. This involves registering your business with the service.
It is important to maintain total uniformity in your NAPW (business name, address, phone number, website address) across your Google My Business pages and local citation sources.
Find out more about Google My Business here: https://yoast.com/google-my-business/
When linking to other websites from your own website, it is important that the links reach the desired URL. It’s also important that you link to authoritative sources that preserve the integrity of your brand.
A strong backlink profile is important to maintaining and improving rankings for your website. There are countless ways to gain and attract good backlinks; the problem is that there are just as many ways to attract spammy, low-quality backlinks. Search engines like Google do not like these types of links and will penalize you for having them. Many so-called “marketers” resort to black-hat tactics such as buying links from link farms and spamming directories, message boards, and the comments sections of blogs. This is detrimental to your business and erodes the long-term, positive ROI that SEO provides.
For more info about backlinks, check out this MOZ post.
Local citations are like the online version of the Yellow Pages; they are digital directory listings that help people locate your business. For multi-location firms that rely on local business, local SEO should be an integral part of the SEO strategy.
Local SEO includes optimizing and maintaining your Google My Business listing, keeping local citations accurate and consistent, and ensuring your website is optimized for each region it serves.
Google serves local search results from its Google My Business platform. Here, businesses are able to list their business locations and contact information on Google Maps and dedicated search listing pages. When a user performs a search that is local in nature, search engines will typically serve these local results, usually alongside a map. On-page and off-page optimizations ensure that these local business listings have the best chance of appearing in organic search results when local results are triggered.
Search engines also return standard organic results for queries they deem to be relevant to the searcher's intent and region. Ensuring your multi-regional website is optimized for organic search ensures that search engines rank the most relevant page for each region. This means that if someone located in Houston, TX searches for “inbound marketing agency,” search engines should return results for inbound marketing agencies located in or near Houston. In addition, multi-regional website optimization ensures that the searcher is directed to the most relevant page for their region or language.
Learn more about local citations in this blog post.
Social signals refer to the social media profiles owned by your brand and how complete those properties are. Social signals are a ranking factor, so it's important not to disregard them. Additionally, social media is another channel that can be used to attract a qualified, engaged audience.
Over the years, many so-called “SEO professionals” have spammed the Internet with poor-quality content and links, under the guise of search engine optimization. As search engine algorithms have become more sophisticated, they have become almost immune to these unethical practices. Known as black-hat search engine optimizers, these individuals are oftentimes difficult to discern from true SEO practitioners.
A company that participates in “black-hat” SEO not only runs the risk of not seeing a return on its investment, but also risks algorithmic and manual search engine penalties that can cause organic rankings and search traffic to drop. In addition, the worst offenders are permanently banned from appearing in Google search. Yes, the largest search engines employ teams of web spam specialists that manually penalize websites each and every day.
Some of the most common spamming techniques include keyword stuffing, cloaking (serving different content to search engines than to users), buying links from link farms, and spamming directories, message boards, and blog comments.