Search Engine Optimization (SEO) is the practice of increasing organic traffic to a website. The main purpose of SEO is to improve a website's ranking within a search engine's index so that it appears near the top of a search engine results page (SERP). Search engines (e.g., Google, Bing, Yahoo) use crawlers (i.e., spiders) that go hunting for any and all pages they can find throughout the Internet. The pages they find are indexed within the search engine and fed through a continuously updated algorithm that tries to match each page against search queries as well as possible, using weighted variables drawn from the page and its domain (Moz). The exact algorithm is unknown and closely guarded by the powers that be (namely Google), but enough experts have analyzed enough data to point us in the right direction as to what goes into the ranking process and what will impact a website. As the locally ranked #1 SEO experts in Waco, Texas, we’re going to outline what you need to know about SEO and how to use it to your advantage.
Now, before we really dig deep into SEO, it’s important to point out that SEO is not an exact science, for the simple reason that the algorithms used by the search engines are not made public. Even so, you’ll find plenty of articles stating things like, “7 aspects of SEO that will chart you right to spot #1,” or, “The 12-Step Plan to Perfecting Your Website’s SEO,” but there is no exact or sure-fire way to make your website jump ranks. That said, there are plenty of things you can do, and keeping up with all of them will keep you in the race for the long haul.
Evolution of SEO variables
For a while, Google released major updates to their algorithms a couple of times a year and would release “guides” to the changes and what to focus on without actually giving away specifics of how each change played into website rankings. Many of their updates address loopholes that allow for quicker ranking increases, but each update that closes a loophole also turns those methods into penalized variables. For example, keyword stuffing is the practice of repeating primary keywords, keyword variations, and even unrelated keywords throughout a page’s content, embedded code, meta descriptions, and headings in order to boost its ranking purely through the rate of keyword usage and the resulting traffic. When that loophole was closed, the algorithm began ranking the practice negatively, meaning any site that uses keyword stuffing is now penalized and drops rank accordingly.
Major algorithm updates include Pigeon, Hummingbird, and Penguin (there might be a pattern in their names…). In 2015, a Google employee was asked on Twitter about some suspected updates. The employee confirmed that their algorithm received updates, and that they actually make multiple “updates” to individual lines of code each day as minor adjustments are needed. The reporter asked for a name for the updates. As a joke, the employee said all of those updates are collectively called “Fred,” and that any other unnamed updates should be assumed to be named “Fred” as well. The joke caught on and has held since. So, under the current “Fred” algorithm, SEO specialists have a number of agreed-upon variables impacting SERP rankings. In June of 2013, Moz surveyed 128 specialists about their perceived weighted rankings of 9 SEO categories (for Google), and their averaged responses were recorded as follows:
- Domain-Level & Link Authority Features (20.94%)
- This includes the “quantity of links to the domain, trust/quality of links to the domain, domain-level PageRank, etc.”
- Page-Level Link Features (19.15%)
- “PageRank, TrustRank, quantity of links, anchor text distribution, quality of link sources, etc.”
- Page-Level Keywords & Content Features (14.94%)
- “TF*IDF, topic-modeling scores on content, content quantity/relevance, etc.”
- Page-Level, Keyword-Agnostic Features (9.8%)
- “Content length, readability, uniqueness, loadspeed, etc.”
- Domain-Level Brand Features (8.59%)
- “Offline usage of brand/domain name, mentions of brand/domain in news/media/press, entity association, etc.”
- User/Usage & Traffic Query Data (8.06%)
- “Traffic/usage signals from browsers/toolbars/clickstream, quantity/diversity/CTR of queries, etc.”
- Social Metrics (7.24%)
- “Quantity/quality of tweeted links, Facebook shares, Google +1s, etc.”
- Domain Level Keyword Usage (6.98%)
- “Exact-match keyword domains, partial-keyword matches, etc.”
- Domain-Level & Keyword-Agnostic Features (5.21%)
- “Domain name length, TLD extension, domain HTTP response time, etc.”
(credit: Moz.com 2019)
Top SEO Categories
Based on Moz’s 2013 survey findings, 3 categories account for over 55% of the weighted SEO variables impacting website rankings. The first and second categories focus on links: quantity and quality of outbound links within page content, quantity and quality of backlinks (other websites linking to yours), and internal linking (page content referencing other pages within your domain). The third category focuses on keywords and quality content. This should be reassuring; it means you don’t have to be a coding expert to rank well in search engine queries. Even the fourth category, which accounts for almost 10% of the pie, focuses on content quality.
Black Hat SEO vs. White Hat SEO
Backing up slightly, there are two schools of thought on approaching SEO: black hat and white hat. HBO’s Westworld has its characters choose whether they are going to be the good guys (white hats) or the bad guys (black hats), and that is basically what the naming gets at here. Black hat techniques look to exploit loopholes in order to rank quickly. The issue is that when an update comes along that closes the loophole, it penalizes those who actively use it. White hat techniques follow the written rules verbatim (Google’s Webmaster Guidelines) and let the search engine rank you over time. The issue here is that it can take a while to see the effects, but you’re less likely to get penalized down the line by future updates. Now, nothing is ever textbook perfect, so most of us end up in “grey hat” territory, where a mix of approaches is used and rules get bent without actually being broken. Any company that says it has never been penalized is a company that hasn’t been in business longer than six months. Even if you’re a white-hat purist, the rules evolve and eventually you will get hit.
There are any number of Black Hat techniques out there, but some of the more prevalent ones include keyword stuffing, content scraping, doorway pages, link schemes, link farms, “creating pages with little or no original content, cloaking, sneaky redirects, hidden text or links… loading pages with irrelevant keywords, creating pages with malicious behavior” (e.g., phishing, viruses, trojans), and “abusing rich snippets markup” (Neil Patel, OptinMonster, and Google’s Webmaster Guidelines).
Cognitive SEO outlines some Black Hat SEO techniques and what they look like:
- Hidden Text: Text content visitors cannot see but is readable by search engines. Hidden text is defined as spam by all major search engines.
- Link Farms: Websites that hyperlink to all other sites in their group and tend to be automatically created. Link Farming is identified as spam by all major search engines.
- Keyword Stuffing: Meta tags and/or page content will be overloaded with keywords.
- Scraping: Plagiarizing content from other websites.
- Paid Links: Buying links on websites with the sole intent of increasing rankings.
- Doorways: Pages built specifically to spam search engine indexes; they target specific keyword phrases and automatically redirect visitors elsewhere.
- Cloaking: Different content is shown to the search engines than is shown to visitors for the purposes of artificially increasing ranking.
They also outline some Grey Hat SEO techniques and what they look like:
- Three-way links: Reciprocal links between multiple sites and pages.
- Article spinning: Rewriting existing articles to avoid plagiarism and maintain high-quality content that is keyword relevant.
- Buying old or expired domains: The act of purchasing others’ domain names that already have high domain authority and search engine rankings.
- Google Bombs and Googlewashing: The practice of mass-linking to a page with a chosen anchor phrase so that it ranks for search terms unrelated to its actual content. This tends to be used for comical or satirical purposes.
There are a few basics that directly impact website rankings, like page load times (including image and video load times) and crawlability, but the big-ticket items revolve around quality: quality of visitor experience, quality of content and topic relevance, quality of site structure, and quality of page and file names.
Crawlability, or the ability of search engine spiders to find each page within your domain, can be addressed through site structure and site mapping. If a page exists but nothing within the domain links to it, spiders can’t index it, and that page effectively won’t exist to search engines (regardless of how well-written and optimized it might be). Site structures (mapping) should look like a pyramid, or like the “Pay It Forward” model (Moz).
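The orphan-page problem above can be sketched in a few lines of Python. This is an illustrative model, not a real crawler: the site map and page names below are hypothetical, and the function simply walks an internal link graph from the homepage the way a spider would, reporting any page that is never reached.

```python
from collections import deque

def find_orphan_pages(link_graph, root="/"):
    """Breadth-first walk of an internal link graph (page -> set of
    linked pages), mimicking how a crawler discovers pages starting
    from the homepage. Pages never reached are effectively invisible
    to search engines."""
    seen = {root}
    queue = deque([root])
    while queue:
        page = queue.popleft()
        for linked in link_graph.get(page, ()):
            if linked not in seen:
                seen.add(linked)
                queue.append(linked)
    return set(link_graph) - seen

# Hypothetical site: /orphan links out, but nothing links to it.
site = {
    "/": {"/about", "/blog"},
    "/about": {"/"},
    "/blog": {"/blog/post-1"},
    "/blog/post-1": {"/blog"},
    "/orphan": {"/"},
}
print(find_orphan_pages(site))  # → {'/orphan'}
```

Note that `/orphan` has an outbound link to the homepage, yet still can’t be found: crawlability depends on inbound paths, which is exactly why pyramid-style site structures work.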
File Naming and Extensions
Images and links should be well-labeled, meaning URL paths (everything after the domain name and slash) are readable and not a jumble of random alphanumerics and symbols. In fact, domain names themselves matter to some extent, even if less so than most other aspects covered here. A domain name (e.g., websitemasters.com) that is too long, unreadable, mixes alphanumeric characters, or contains multiple dashes will rank lower than a short, concise, easily read domain name with a known top-level domain (e.g., .com, .net, .co) (Moz). Search-engine-friendly URLs that are short and specific tend to rank better than those that are excessively long or include unreadable strings (Search Engine Journal).
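Those readability rules can be turned into a quick sanity check. The thresholds and regex below are our own illustrative heuristics (search engines publish no exact cutoffs): slugs should be modest in length, lowercase words, single hyphens, no mixed alphanumeric jumble.

```python
import re

def is_friendly_slug(slug, max_len=60):
    """Heuristic check for a search-engine-friendly URL slug.
    The rules (lowercase words, single hyphens, modest length)
    are illustrative guidelines, not official limits."""
    if len(slug) > max_len:
        return False
    if "--" in slug:  # multiple consecutive dashes
        return False
    # only lowercase words separated by single hyphens
    return bool(re.fullmatch(r"[a-z]+(-[a-z]+)*", slug))

print(is_friendly_slug("seo-basics-guide"))  # True
print(is_friendly_slug("x7Qz--94_tmp"))      # False
```

A check like this is easy to drop into a publishing workflow so unreadable slugs never go live in the first place.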
Conversion Rate Optimization (CRO or CR)
The conversion rate is the ratio of visitors who complete a goal; CRO is the practice of improving that ratio. A goal can be signing up for email subscriptions, reading an entire article, adding products to a cart, completing a purchase, etc. (Moz).
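The arithmetic behind the conversion rate is simple; a minimal sketch (the numbers below are hypothetical):

```python
def conversion_rate(conversions, visitors):
    """Conversion rate as a percentage:
    goal completions divided by total visitors."""
    if visitors == 0:
        return 0.0
    return 100 * conversions / visitors

# e.g., 30 completed purchases out of 1,200 visitors
print(conversion_rate(30, 1200))  # → 2.5
```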
Local SEO
Local SEO is specific to businesses that deal in face-to-face services, have a physical location, or serve a defined geographic region (Moz).
Click-Through Rate (CTR) and Dwell Time
High CTRs can be achieved through quality titles, descriptions, meta descriptions, headings, and URLs (Moz). While CTR is important, so is dwell time, meaning how long a visitor sits on a page. “80% of marketers say video has increased dwell time on their website,” according to Hubspot research. Adding images helps with dwell time as well, not to mention that it chunks information, making it easier to consume. Buzzfeed is really good at using images, memes, and GIFs to increase dwell time and chunk their information.
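CTR itself is just clicks divided by impressions (how many times your result was shown). A minimal sketch with hypothetical numbers:

```python
def click_through_rate(clicks, impressions):
    """CTR as a percentage: clicks on a search result
    divided by the number of times it was shown."""
    if impressions == 0:
        return 0.0
    return 100 * clicks / impressions

# e.g., 45 clicks from 1,500 SERP impressions
print(click_through_rate(45, 1500))  # → 3.0
```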
“Chunking” is actually a cognitive psychology term and educational practice where information is parsed out into smaller pieces. Research in educational psychology shows that chunking student assignments, readings, and tests leads to higher rates of information retention and higher test scores.
Keyword Research
Tools exist specifically for researching keywords, such as Google’s Keyword Planner, Moz’s Keyword Explorer, and social media (e.g., Pinterest). The general idea is to search your topic’s overarching theme or keyword, along with sub-topics in a variety of phrasings, then see what turns up and possibly search any resulting terms for additional branching. Search Engine Journal suggests starting with keyword research prior to writing content, as it will help direct written content toward rankability.
Using your main keyword in the first paragraph is recommended, specifically within the first 100 words. However, a word of caution: keyword stuffing can be flagged by search engine algorithms if the same term appears too frequently or doesn’t read fluently in context. So, if the keyword does not naturally fit within the first 100 words, don’t force it. Search engine algorithms are increasingly intuitive (machine learning has come a long way with fluent language and grammar), so don’t think you’ll outsmart the system by using your primary keyword(s) a dozen times in the first couple of paragraphs.
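Both checks above, keyword placement in the first 100 words and keyword frequency, are easy to automate. The functions and sample text below are our own illustration; search engines publish no official density threshold, so treat any cutoff you pick as a rough stuffing alarm, not a rule.

```python
import re

def keyword_in_first_100_words(text, keyword):
    """True if the (single-word) keyword appears in the first 100 words."""
    words = re.findall(r"[a-z']+", text.lower())
    return keyword.lower() in words[:100]

def keyword_density(text, keyword):
    """Percentage of words that are the keyword; a rough stuffing alarm."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100 * hits / len(words)

sample = "SEO basics matter. Learn SEO by writing for readers, not crawlers."
print(keyword_in_first_100_words(sample, "SEO"))  # True
print(round(keyword_density(sample, "seo"), 1))   # 18.2
```

On a real page the density would be far lower; a value this high on a full article would be a strong hint to vary your wording.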
Semantically related keywords, or variations of your main keyword, are highly desirable in SEO for the simple reason that using them is good writing. Have you ever had an English teacher say something like, “You need more variation in how you start your sentences”? Read a third grader’s attempt at three paragraphs and you’ll know what I mean: repetitive, simple sentences. Variety is key here as well, and it demonstrates relevance to ranking algorithms.
Don’t worry too much about it though. It’ll happen naturally as you write. If you are concerned about hitting those related keywords throughout your writing, there are tools to help spark some ideas, like LSI Graph.
Relevant Headings and Subheadings
Your main keyword should be at the start of your title tag, and Search Engine Journal suggests trying to “keep it under 60 characters.” Moz has a title tag tool to help with this. Meta descriptions should be under 160 characters and include the targeted keyword along with a brief, informative page description. Meta descriptions are no longer used as a ranking factor in Google’s algorithm, but they are still important for CTR, since they typically appear in SERPs.
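Those two length limits are simple to enforce automatically. A minimal sketch (the sample title and description are hypothetical, and the 60/160 figures are the commonly cited display guidelines, not hard rules; Google actually truncates by pixel width, not character count):

```python
def check_snippet_lengths(title, meta_description,
                          title_limit=60, meta_limit=160):
    """Flags title tags and meta descriptions that exceed the
    commonly cited display limits. Returns a list of problems
    (empty means both fields fit)."""
    problems = []
    if len(title) > title_limit:
        problems.append(f"title is {len(title)} chars (limit {title_limit})")
    if len(meta_description) > meta_limit:
        problems.append(f"meta description is {len(meta_description)} chars "
                        f"(limit {meta_limit})")
    return problems

print(check_snippet_lengths(
    "SEO Basics: How Search Engines Rank Your Site",
    "A plain-language introduction to how search engines crawl, "
    "index, and rank web pages, and what you can do about it."))  # → []
```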
Optimized H1 tags involve the targeted keyword with key modifiers directly related to the content. This can technically be the same as the title tag without penalty. As for H2 tags (and any other headings), they are key to establishing hierarchy within your content and, subsequently, weighted value within search engines. Clear sections in logical order are now part of the “quality” content factors. Using variations of the primary keyword can also increase the relevance and value of your headings. For example, if “New York City” is the primary phrase, secondary variations might include “NYC,” “Big Apple,” or even “The City that Never Sleeps.”
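One way to keep that heading hierarchy honest is a quick automated check for level jumps (an `<h4>` directly under an `<h2>`, for instance). The snippet below is an illustrative sketch using a simple regex, not a full HTML parser, and the sample page is hypothetical.

```python
import re

def heading_order_issues(html):
    """Finds heading-level jumps (e.g., an h2 followed directly by
    an h4) that break the content hierarchy search engines read."""
    levels = [int(m) for m in re.findall(r"<h([1-6])", html, re.I)]
    issues = []
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            issues.append(f"h{prev} jumps to h{cur}")
    return issues

page = "<h1>New York City</h1><h2>Getting Around NYC</h2><h4>Subway</h4>"
print(heading_order_issues(page))  # → ['h2 jumps to h4']
```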
Links: Internal, Outbound, and Backlinks
Internal Links help both users and search engines navigate your site more effectively. If you mention something you have another page on, go ahead and link it.
Outbound links help establish relevancy and quality content to the ranking algorithms. Don’t be afraid to link out to other sites, especially those with higher Domain Authority rankings. Linking to cited sources or relevant subsidiaries helps search engines to better understand the relevance and specifics behind your content, likely increasing your ranking in the process.
As for backlinks, they’re great if you can get them. Any other highly ranked domain that links to you as a reference or source will effectively add to your authority and ranking strength.
Quality Content
Saving the most important for last: quality content. The value of your content directly impacts your SERP rankings. Always ask, “Why this content?” It needs to have a purpose, from earning external links to educating a target population. Without the content, you don’t have anything. You can optimize every other aspect of your website, but if there’s no purpose or content, it won’t get ranked. Whatever the content is, it needs to be relevant to your purpose and goals. It needs to be well-written, utilizing varied sentence structure and proper grammar. Also, don’t forget to include links, semantically related keywords, headings and subheadings, title tags, and meta descriptions.
If you need help with your SEO, have questions about SEO, or are thinking of ways to improve your current marketing plan, go ahead and schedule a FREE consultation with us by clicking here.