
200+ Google Ranking Signals for eCommerce and Content Sites running WordPress or Shopify

Whether you are running a content site on WordPress or an e-commerce storefront on Shopify, there are 200+ ranking factors that Google takes into account.

This list is an aggregation of the ranking signals from all around the web.

As a former search engineer in Silicon Valley, I’ve expanded on these signals to explain from a technical perspective what to look out for in a deep dive. 

For those running an e-commerce storefront on Shopify or Magento, I’ve added a special icon next to the signals that have a high impact value.

Likewise, for sites running a blog on WordPress, there is a special icon next to the signals to be aware of.


Domain Factors

  • Domain Age
    • Age used to be a factor. Many domains that are for sale because of their age have existing backlink profiles that have aged, giving the domain its equity. Today, it’s less about how old the domain is and more about the value of the content. Remember, Google’s goal is to answer a user’s query intent with the right answer in the fastest time. An aged domain will not outperform another domain that provides more in-depth content for a query intent.
  • Keyword Appears in Top Level Domain
    • Today, the weight of a keyword in the top-level domain no longer gives the boost it used to. For long-tail keywords with very low difficulty, it can help with ranking above other low-ranked domains.
  • Keyword As First Word in Domain
    • The position of the keyword in the domain name gives a slight boost. For different search intent keywords, the position can play a role, especially if the query contains a branded keyword.
  • Domain Registration Length
    • Registration length, measured from today’s date to the domain’s expiration date, plays a role in domain authority ranking.
    • Spammy sites will generally register for 1 year as they churn through domains, whereas legitimate sites will purchase a domain for several years in advance.
  • Keyword in Subdomain
    • Keywords in subdomains can also play a role in ranking. The boost is more visible for low-difficulty, long-tail keywords.
  • Domain History
    • A domain’s history can be carried over to a new owner even if the domain expired. That history can be good or bad, which makes a purchase risky if the previous owner used spammy techniques such as buying links on link farms, publishing low-quality content, or neglecting the site until it suffered code injections.
    • Google wants to protect its users from sites that may have been used for phishing, spreading malware, or that were hacked through neglected maintenance. The easiest way for Google to do that is to lower the trust score while still indexing these sites.
  • Exact Match Domain (EMD)
    • EMD was a simple algorithmic way for Google to boost brand-specific search intent queries. However, as Google improved its ability to use Natural Language Processing (NLP) to read content like a human would, it can now understand content much better. Today, valuable content trumps an EMD when it comes to weighting.
  • Public vs Private WhoIs
    • This is a low-weight ranking signal, as many commercial sites use private WHOIS not because they have anything to hide, but because many domain name registrars offer it for free.
    • You may get preferential treatment for a public WHOIS if multiple signals from the site add up to a positive trust factor.
  • Penalized WhoIs Owner
    • If a domain has been tagged as spammy, Google can cross-reference that WHOIS record against other records and IP blocks owned by the same registrant. If enough domains by the same owner show similar spammy patterns, you could be penalized.
  • Country TLD Extension
    • A country-specific TLD helps Google rank a site for a specific locale. If you have a local business in another country, it may make sense to register a country-specific TLD.
    • The drawback is that the domain will not rank as well in global search indexes.
    • Google has indexes specific to each country and/or region, and the user’s query intent, browser language, and location of search origin all play into which Google index the query is passed to.

Page-Level Factors

  • Keyword in Title Tag
    • Today, keywords in the title tag are not as important as they once were. It’s still a relevancy signal; however, valuable content and structure trump keywords in the title tag.
  • Title Tag Starts with Keyword
    • Google today can take a search query from the user and measure how far into the title tag the keywords appear.
    • To emulate user behavior, Google tries to mimic users who click a link because they read it from left to right.
    • Hence a study by Moz shows that title tags with keywords that start earlier in the title rank better than those with keywords towards the end.
  • Keyword In Description Tag
    • Google does use the description tag as a ranking signal. The construct also allows Google to surface pieces of the meta description for search queries where it’s unable to surface the context from the page content itself.
    • This in turn drives up the click-through rate (CTR) for a link on the Google search results page.
  • Keyword Appears in H1, H2, H3, H4, and H5 Tags
    • H[1-6] are structured markup tags for headings
    • Structured markup helps not only to organize content for users to absorb information but also for search engines to understand the page just as humans would.
    • There have been studies showing that content outlined with heading tags, with keywords in those headings, ranks higher than content whose headings lack keywords.
  • TF-IDF
    • Short for term frequency-inverse document frequency.
    • Google takes a user’s search intent query and passes it to a process known as a query planner.
    • The purpose of the query planner is to understand the intent of the query based on a multitude of information such as the user’s location, the Google site they are on, IP address, device, and more.
    • From that, it can determine whether the user is looking for a specific sports team’s latest score or general information about the team.
    • Once it knows what the intent is and which indexes it needs to retrieve documents from, it gathers the documents into what is known as a collection.
    • Next, it analyzes the search query again and uses a more advanced version of TF-IDF to weight each document and return a score of how important that document is to the query.
    • A naive TF-IDF would let spammy pages keyword-stuff their way to a higher ranking; see the sketch below.
    • Google, however, has tuned TF-IDF to use the score as just one signal for ranking the value and relevancy of pages in a collection to the search intent.
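To make the idea concrete, here is a minimal, hypothetical TF-IDF sketch in Python (a toy version of the classic formula with one common smoothing variant, not Google’s actual implementation). Note how the keyword-stuffed document wins on raw score, which is exactly why a raw TF-IDF value can only ever be one signal among many.

```python
import math
from collections import Counter

def tf_idf(term, doc, collection):
    """Toy TF-IDF: how often the term appears in one document, scaled by
    how rare the term is across the whole collection (smoothed idf)."""
    tf = Counter(doc)[term] / len(doc)
    df = sum(1 for d in collection if term in d)
    idf = math.log((1 + len(collection)) / (1 + df)) + 1
    return tf * idf

# A tiny "collection" of tokenized documents retrieved for a query.
docs = [
    "matte lipstick shades for fall".split(),
    "lipstick lipstick lipstick buy lipstick".split(),   # keyword-stuffed page
    "how to apply matte lipstick evenly and smoothly".split(),
]
for d in docs:
    print(f"{' '.join(d):48s} -> {tf_idf('lipstick', d, docs):.3f}")
```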
  • Content Length
    • In the past, you could get by with 500-word pieces of content.
    • Today, now that the web and user behavior have matured, Google has determined that more valuable pieces of content generally run longer and let the reader go deep into a topic.
    • Web users would now rather read one high-quality, in-depth piece of content than bounce between multiple medium-quality pieces that don’t go in depth.
    • Additionally, in-depth content provides more breadth in keyword coverage, which leads to more discoverability and backlinks.
    • And with more backlinks and more search volume, a page strengthens other ranking signals such as click-throughs and dwell time.
    • As a result, content averaging around 2,000 words tends to hold the top 5 positions.
  • Table of Contents
    • Google tries to mimic user behavior as much as possible when it analyzes a piece of content’s structure.
    • Humans want to see a 2,000-word piece structured with appropriate H1 and H2 tags and to quickly jump to any part of the page through a table of contents.
    • Google uses the table of contents to understand the paths to each of these sections of content.
    • An added bonus is that the table of contents can surface as part of Google’s sitelinks, which enhances your page’s listing on the search results page. That in turn takes up more real estate and drives more click-throughs.
  • Keyword Density
    • As part of TF-IDF, Google uses a more advanced algorithm to score a page. However, if a keyword is overused or the content is not readable by a human, Google can now flag the page as spammy in Google Search Console.
    • Google’s machine learning NLP tries to mimic humans: if a human would judge a page as spammy keyword stuffing, so can Google’s NLP.
  • Latent Semantic Indexing Keywords in Content (LSI)
    • In the early days, Google and other search engines used LSI to better understand the purpose of content.
    • A page may contain keywords with more than one meaning. Is the content talking about “cars” the vehicles, the Disney animated movie, CARS (Canadian Association for Rally Sports), CARS (Canadian Aviation Regulations), or the band The Cars?
    • Once Google has processed the content of the page, it understands the overall topical scope of the page and adds that as part of the meta information for the document.
    • Clear, concise content around keywords helps Google understand and rank the page higher for relevant search intent queries.
  • LSI Keywords in Title and Description Tags
    • LSI keywords in meta tags are low-weight signals that do not trump LSI found in the content.
    • Nonetheless, they help a user scanning the search results page see which “cars” the title tag that Google indexes and surfaces is referring to.
  • Page Covers Topic In-Depth
    • Content that goes deep and in-depth ranks higher by nature of having a wider keyword footprint and by being more shareable because of its value. Shareable content drives backlinks from high domain authority sites and builds page equity, which increases the page’s ranking across multiple keywords.
  • Page Loading Speed via HTML
    • The PageSpeed Insights tool by Google is to be taken with a grain of salt. The tool tries to normalize user-perceived performance into a single number; a quick way to pull that number programmatically is sketched below.
    • Your goal should be to analyze whether your site is loading spinners that block content from rendering, and to understand the “why”. Many themes and templates are beautiful, with animations and videos that are “cool” but block the user from getting the answer they came to your site for in the first place.
    • Google crawls sites in two waves. The first wave covers the server-side-rendered markup, that is, all the text it sees without executing any JavaScript.
    • Once GoogleBot crawls the content, the content goes into a queue to be indexed later.
    • The second wave is client-side rendering. This process is resource intensive because Google has to simulate a browser opening the page, wait for all the JavaScript to be fetched and executed, the CSS to render, the images to load, and wait long enough that it believes the page has fully loaded. If the page takes 25 seconds to fully load for a user, not only is that a bad user experience for a human, Google will also penalize the site for being too slow.
    • Google ultimately wants to reward sites that load fast, because when a user leaves the search results page to view content, they want the answer fast.
    • Additionally, Google’s resources are finite, so it imposes a crawl budget on every site based on its category.
    • If the site loads too slowly, it eats away at the crawl budget Google has allocated to your site. This means your content may not be indexed completely until GoogleBot has freed up enough resources to come back around.
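As a rough sketch (not an official workflow), the public PageSpeed Insights v5 API can be queried to track that normalized score over time; the URL below is a placeholder, and an API key is only needed for heavier usage.

```python
import json
import urllib.parse
import urllib.request

def pagespeed_score(url: str, strategy: str = "mobile") -> float:
    """Return the Lighthouse performance score (0-1) reported by the
    PageSpeed Insights v5 API for the given URL."""
    endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen(f"{endpoint}?{query}") as resp:
        data = json.load(resp)
    return data["lighthouseResult"]["categories"]["performance"]["score"]

print(pagespeed_score("https://example.com"))
```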
  • Page Loading Speed via Chrome
    • Some Chrome users may have opted in (by default) to report information back to Google. This is how the Google CrUX data is generated.
    • This provides RUM (Real User Monitoring) data on how well a site performs in the field.
    • It enables Google to collect data such as dwell time and bounce rates without a site having to install Google Analytics.
    • This is just one of many data points that Google samples anonymously without passing any PII.
  • Use of AMP
    • Google has been trying to push sites to use AMP. No study has shown that AMP-powered sites get an extra boost in ranking.
    • Instead, AMP is more about user experience, improving site load performance by hosting the content on Google’s edge network.
    • This improves Google’s ecosystem with a feedback loop that keeps users happy when they search on Google over slow networks.
    • If your site focuses on news, you may benefit from using AMP in the Google News vertical, where AMP content ranks higher in that search index.
    • The trade-off with AMP is that it limits some features and capabilities of the user experience that might bring more value to your website than an AMP-powered page.
  • Entity Match
    • Each search a user makes goes into a query planner. The purpose of the query planner is to understand the intent and match what sources of indexes it needs to query for that entity. 
    • For example, for a user querying “black matte lipstick”, the query planner may decide that the query has shopping intent, image intent, video intent, and search intent.
    • It then sends the request off to the shopping, image, video, and search indexes, which return the corpus of data for each index to be rendered on the search results page.
  • Google Hummingbird
    • This codename given by Google was a significant change to the algorithm in 2013. The shift placed emphasis on natural language queries and the content of individual pages of a website. 
    • Today, content writers are encouraged to optimize their site by writing naturally rather than forcing keywords. That is, provide content that delivers value and goes as deep as possible on a specific context.
    • Additionally, valuable internal linking around contextual information is encouraged: it not only keeps a visitor engaged, it’s also a pattern from which Google can identify well-thought-out navigation that will delight a visitor who lands on the site.
  • Duplicate Content
    • Duplicating content was once a strategy used by SEO experts in the early days; today Google is able to identify spammy pages within a website and alert you in Google Search Console.
    • Duplicated content tries to achieve a wider footprint in the index by creating multiple entry points, but it also fragments the link equity of those pages so that they will often not rank on page 1, which is why many experts assume it is a mark of penalization.
  • Rel=Canonical
    • In e-commerce, a site may use rel=canonical tags to identify the canonical URL that Google or Bing bots should index and surface. One use case is when a site has a mobile flavor on an “m.” subdomain that is almost identical to the primary desktop experience, usually on the “www.” subdomain.
    • To keep Google from indexing both, the mobile flavor can hint to Google via the rel=canonical tag not to index that page and to use the desktop version instead.
    • This doesn’t improve the site’s crawl budget, as Google still has to crawl the mobile site to see these tags in the first place.
  • Image Optimization
    • Images provide visual context to the end user and to the text surrounding them. A well-named image with width/height attributes, a title, alt text, and a caption improves the site for accessibility and ADA compliance.
    • When all the meta information around an image is used, Google is better able to determine the context of the image and what it is about, and index it accordingly into the Google Image Search index, which provides another way for users to discover your site.
    • This is extremely valuable for e-commerce sites catering to informational intent queries like “black matte lipstick”, where Google surfaces imagery around that query in real estate above the fold, above the organic search listings.
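A quick, hedged audit sketch (it assumes the third-party requests and beautifulsoup4 packages are installed, and example.com is a placeholder) that lists images missing alt text on a page:

```python
import requests
from bs4 import BeautifulSoup

def images_missing_alt(url: str) -> list[str]:
    """Return the src of every <img> on the page with empty or missing alt text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    missing = []
    for img in soup.find_all("img"):
        if not (img.get("alt") or "").strip():
            missing.append(img.get("src", "<inline image>"))
    return missing

print(images_missing_alt("https://example.com"))
```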
  • Content Recency
    • Google Caffeine will often bump content to the top of a SERP if it was recently updated. When a user’s query intent suggests they are looking for the most up-to-date content, fresh text in the title not only lets Google know the content was updated recently, it also signals that to the user scanning the results.
    • However, after a day or so, recently published content will drop in ranking as it begins to age.
    • As a good rule of thumb, it’s important for content creators to revisit their content every 4-6 months, identify which keywords the page ranks for and its positions, and refresh the content so that it remains relevant.
  • Magnitude of Content Updates
    • Content that is refreshed on a routine basis trains GoogleBot to come back and crawl the site often. 
    • The updates that count are those where evergreen content is refreshed, for example, content that was relevant in 2019 being updated for 2020, and where new pieces of content are added to the page.
    • Small changes like fixing typos are not a strong enough diff between two copies to signal to Google that the page was refreshed.
  • Historical Page Updates
    • The frequency of page updates plays a role. Not only has the page been “seasoned” as other sites backlink into it, building equity, but a refreshed page also lets Google know that, within the corpus or collection of documents for a given search intent query, this page may have net new information while other pages remain stale.
  • Keyword Prominence
    • The location of target keywords is important for ranking and has been shown to impact whether a page reaches page 1 of Google.
    • Many studies of page-1 content across many queries have found the target keywords within the first few hundred words.
    • The reason is that when real people click a search result listing, they scan the first few hundred words to see whether they got a general answer to their query before dwelling any longer on the page.
  • Keyword in H2, H3 Tags
    • Keywords that appear in the H2 and H3 tags are another signal of a well-structured page.
    • A human would scan these headings before diving into a paragraph of text.
    • From Google’s point of view, these tags in the HTML document help not only users but also Google to understand the page’s structure.
  • Outbound Link Quality
    • Outbound links provide an avenue for visitors to leave the site to continue their research.
    • This is a trust signal for Google that the document was researched and cites other trusted sources.
    • While a content producer may worry that off-site navigation hurts dwell time, a well-researched document will actually improve it, as users bookmark the page or come back to it later.
    • These outbound links can leak link juice, as some folks in the industry like to call it. That is, every page has a limited amount of link equity.
    • To avoid that loss of link equity, outbound links can carry a rel=nofollow attribute, which tells Google that GoogleBot should not follow the link and that it is intended only for real users to click and follow.
  • Outbound Link Theme
    • If the link is missing rel=nofollow, GoogleBot can crawl naturally to the targeted page on another domain and understand the relationship between the two pages.
    • The target page may have a more specific theme around the Disney movie “Cars”, so that together, your page and the target page score higher in topical authority for Disney’s “Cars” movie than for automobiles in general.
  • Grammar and Spelling
    • Grammar and spelling improve usability. While the indexing side of Google doesn’t place much emphasis on how good your grammar or spelling is, a real user may.
    • Depending on your target audience, if they read at a 6th or 7th grade level, content cleaned up for mistakes can improve dwell time and reduce bounce rate, which in turn improves ranking.
  • Syndicated Content
    • If the content on a page is aggregated from another source, it’s considered duplicated. Duplicated content that provides no real value beyond the original source will not rank well, or may not be indexed at all.
    • It is very difficult for an unoriginal piece of content to beat the piece it copies from, because the original has aged in the index and built link equity. When both documents are returned for a search intent query, the ranking algorithm scores and weights the two pages, and the winner will be the one that has been around for a while.
  • Mobile Friendly Update
    • “Mobilegeddon” was a code name for an update that happened on April 21, 2015. 
    • Google learned that user behavior was shifting from desktop to mobile.
    • A search query on mobile is usually about answering a question fast, not about research-paper-style digging as on a desktop.
    • This means sites that are mobile friendly are ranked higher than pages that are not when Google analyzes a collection or corpus of documents in the mobile index.
  • Mobile Usability
    • Google can now analyze a page to determine whether it renders in a mobile-friendly way. Is the text large enough to read on a mobile screen? Are buttons and actions far enough apart, with enough breathing room, that users can navigate easily? Do the text and navigation resize without bleeding over the viewport and causing large amounts of white space?
  • “Hidden” Content on Mobile
    • Many themes use a concept of “adaptive” design. This means that on mobile, the desktop navigation or fly-out menus may be hidden or not visible.
    • In these cases, hidden content will not be penalized, but it doesn’t add value either. However, if a large amount of text appears to be shoved in and hidden, it can look spammy.
    • Therefore, it’s recommended that content of real value be visible.
    • Likely candidates for non-visible content are things like accordion sections or navigation fly-out menus.
  • Helpful “Supplementary Content”
    • Tools embedded in the page that provide solutions to problems are another key signal that the document isn’t purely informational.
    • Such tools are valuable to users, so Google weights them as a positive signal.
  • Content Hidden Behind Tabs
    • Content hidden behind tabs, accordions, or fly-out menus may not be indexed because the content is sometimes driven dynamically by JavaScript.
    • In the past, GoogleBot crawled and indexed in one wave only: server-side-generated markup.
    • Today, GoogleBot can also crawl and index client-side-rendered content. However, the site must perform and load quickly enough for GoogleBot to discover the semantic anchor links or buttons that reveal these hidden areas.
    • If your hidden content is not being indexed, the first thing to check is whether the markup that reveals it is an <a> tag. <a> (anchor) tags hint to Google that the element is clickable, and Google’s client-side renderer will attempt to click it and observe the changes in markup that follow. A quick way to check what the first crawl wave can see is sketched below.
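As a rough check (assuming the third-party requests package; the URL and phrase are placeholders), you can verify whether copy hidden behind a tab or accordion is at least present in the server-rendered HTML, and therefore visible to the first crawl wave without any JavaScript execution:

```python
import requests

def in_server_rendered_html(url: str, phrase: str) -> bool:
    """True if the phrase appears in the raw HTML returned by the server,
    i.e. before any client-side JavaScript runs."""
    html = requests.get(url, timeout=10).text
    return phrase.lower() in html.lower()

print(in_server_rendered_html("https://example.com/product", "free shipping on orders over $50"))
```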
  • Number of Outbound Links
    • Dofollow links leak page juice, or PageRank, because relationship strength moves from the source document to the target.
    • To keep a great user experience without that cost, it’s recommended to use rel=nofollow on outbound links to avoid the loss of link equity, or link juice.
  • Multimedia
    • Embedding images, videos, and/or tools enhances the page and is another positive signal.
    • Multimedia content not only improves the usability of the page, it increases dwell time, which is picked up by the browser or Google Analytics.
    • Additionally, when a user clicks back, the difference in time between the first click to your page and the next click can be used to determine whether the content answered the user’s query or whether they moved on to the next page in the SERP to get it answered.
  • Number of Internal Links Pointing to Page
    • Internal links are a sign of a well-built content silo. That is, a collection of content formed around a topic gives the site specific topical authority.
    • Additionally, internal links improve dwell time, as real users can expand their research by going broader or deeper into the content the site links to internally.
    • Internal links release trapped equity and strengthen the relationship to the target page, letting both real users and Google know that the target page has value.
  • Quality of Internal Links Pointing to Page
    • Each page has a PageRank score. While Google claims PageRank is no longer used in its public form, each page nonetheless has a score defined by multiple positive and negative signals.
    • A new page without any internal links from high-quality internal pages will have a low score compared with a page receiving trapped link equity from a page with a high authority score.
  • Broken Links
    • Broken links are wasteful from the point of view of crawl budget.
    • Google has finite CPU, memory, and bandwidth that it can spend spidering a site. Hard broken links that return 404 show a site has been neglected. Soft 404s, pages that return a 200 response code but appear to be generic pages like “product not found” or “out of stock”, are also treated as wasteful.
    • The ratio of links that lead nowhere versus real pages with real value is another signal that factors into the domain quality score.
  • Reading Level
    • Reading level may or may not be a direct ranking signal for Google.
    • There are studies showing a correlation with content written at a 6th or 7th grade reading level, but correlation is not causation.
    • Pages that are easy to consume and not too technical can improve bounce rate and dwell time, which in turn affect ranking.
  • Affiliate Links
    • Affiliate links are fine if there is substance to the content around those links. That is, if a site is merely loaded with links, Google may not rank it as well as another affiliate site that researches an affiliate product thoroughly.
  • HTML errors/W3C validation
    • Invalid markup makes it hard for GoogleBot to understand semantically what a site is about. It can burn crawl budget trying to parse the content and build queues of text to analyze later for indexing and ranking.
    • For example, many developers choose to make a <div> or <span> tag clickable to reveal more information. This is semantically incorrect, as the purpose of these tags is not to be clicked.
    • Whether it is GoogleBot’s server-side or client-side indexing that discovers these tags, they are not queued to be clicked later, and so text that is surfaced to the end user this way cannot be seen or followed by GoogleBot.
  • Domain Authority
    • High domain authority sites are generally ranked higher than those without it. Building domain authority requires backlinks from other high domain authority sites.
    • Domain authority is also topical, meaning you cannot simply purchase an expired domain and change the content to rank for another topic. The reason is that you will lose valuable backlinks that help Google understand the theme of the source and target pages.
    • Building skyscraper content or large lists on a topic can help build domain authority, as it naturally gets other sites to link to the page.
  • Page’s PageRank
    • Each page has its own score. This doesn’t mean that a page with a low score cannot rank on page 1 of Google; it just means it will be really hard, or the page won’t stay on page 1 for long.
    • A PageRank-style score can be thought of simply as the number of unique high-authority domains and unique high-authority pages with dofollow links pointing to a page.
  • URL Length
    • Long URLs can appear spammy when surfaced on Google’s SERP. Real users tend to avoid pages whose URL looks spammy and go to the next link on the SERP that looks more trusted, creating a negative feedback loop.
    • Accordingly, studies show that URLs with fewer characters have a tendency to rank higher than longer URLs.
  • URL Path
    • Structured pages that are one or two levels deep carry a small positive signal compared with pages buried deep down the breadcrumb trail.
  • Human Editors
    • Humans do review a site when needed. Sometimes GoogleBot may flag a false positive in Google Search Console, reporting that a page appears to be spamming the search engine or has been hacked for content injection.
    • You can submit the URL for review; only a human can evaluate it and remove the penalty.
  • Content silos
    • Content silos are a category of pages that have topical authority. If the pages are collectively similar in nature, all of them can receive a positive ranking signal for search intent queries on that topic.
  • Cloud Tags
    • Tags provide additional context and clues for GoogleBot to discover more related content.
    • Tags are a simple way to categorize sets of content on a site
  • Keyword in URL
    • This is another relevancy signal which provides a small benefit.
    • It adds value because the keywords are sometimes surfaced in different Google indexes to the end user. And when an end user sees a keyword in the URL, they are more inclined to click on it.
  • URL bread crumbs
    • Breadcrumbs are a semantic way for Google and users to understand what a page is about. For an e-commerce site such as Macy’s that sells everything from beauty products to household items, breadcrumbs assist with categorizing the page. Is the page about bedding for the bedroom or, more specifically, a wok for the kitchen?
    • This provides better ranking for specific user queries.
  • References and Sources
    • In the early days, Google was the go-to index for research papers with cited, trusted sources.
    • Today it still is. Trust quality scores, domain authority, and page authority all work together in symphony to rank content.
  • Bullets and Numbered Lists
    • Lists are great for usability, letting users scan for what they are looking for. This improves time on page, which is a ranking factor.
  • Priority of Page in Sitemap
    • Sitemaps are one way for Googlebot to discover content to crawl and index quickly; a minimal entry with a priority hint is sketched below.
    • However, if a page is not reachable naturally from the root of the domain, Google can also rank the page lower, as it may consider such pages doorways intended to create a larger footprint in the index.
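For reference, here is a small standard-library sketch that emits one sitemap <url> entry with a priority hint (the URL, date, and priority are hypothetical values):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

urlset = ET.Element(f"{{{NS}}}urlset")
url = ET.SubElement(urlset, f"{{{NS}}}url")
ET.SubElement(url, f"{{{NS}}}loc").text = "https://example.com/guides/matte-lipstick"
ET.SubElement(url, f"{{{NS}}}lastmod").text = "2020-02-01"
ET.SubElement(url, f"{{{NS}}}priority").text = "0.8"

print(ET.tostring(urlset, encoding="unicode"))
```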
  • Too Many Outbound Links
    • Outbound links such as directory listings can be negative because they reduce time on page, drive higher bounce rates, and leak link equity, and pages constructed this way are generally of poor quality because they don’t answer the query that brought the user there in the first place.
  • UX Signals From Other Keywords Pages Rank For
    • Content silos rank well because they focus on a topic that can be expanded on and explored in depth.
    • This builds thematic concepts around a collection of pages, where the sets of keywords those pages rank for help lift the ranking of every page in the topic.
    • It also signals that the content creator has deep knowledge of the topic.
  • Page Age
    • Google prefers fresh content and will place it at the top of a SERP. Over time, the age of a page, sometimes called seasoning, will drop the ranking through a decay algorithm.
    • To counter page decay, content should be reviewed and refreshed every 4-6 months, not just for typos and grammar, but for new or outdated information on the topic.
  • User Friendly Layout
    • Now that Google has rolled out mobile-first indexing for new domains, a fast and friendly user layout should take precedence.
    • Understanding what core content should render above the fold, whether on desktop or mobile, is an important ranking factor.
    • If you can deliver the core value the user clicked through for with high-quality content above the fold, the trickle-down effect of other ranking factors kicks in.
    • This is extremely important when selecting a theme: one that is too flashy, with animations, spinners, and poorly built widgets, can delay the page from loading fast enough, especially on mobile. That can cause a user to exit the page when connected over a slow cellular network or on a low-end smartphone.
  • Parked Domains
    • Parked domains provide little to no value, so they are generally never found on page 1 of any search query.
  • Useful Content
    • Quality vs. usefulness is a topic that comes up often. In the early days of Google, indexed research papers were high quality because they were written by professionals. Often, though, they are too hard to read, which detracts from their usefulness; they are too academic in nature for the average person who clicks the first link on the SERP to fully understand. That negative feedback loop is what Google wants to avoid if it wants people to keep using its search engine.
    • A page that delivers useful information, from which anyone can get immediate gratification and value, creates a positive feedback loop that in turn lifts the other on-page ranking factors.

Site-Level Factors

Content Provides Value and Unique Insights
Content that doesn’t provide value or unique insights is not necessarily penalized; it simply does not rank as high. As the web and its users mature, building high-quality pages with valuable information and content should be first and foremost.

Contact Us Page
A Contact Us page that is part of the overall site architecture gives Google context on whether the site is trustworthy and whether it should be included as a local listing. Imagine you are a lawyer or a doctor’s office: having a Contact Us page shows legitimacy and that you are catering to a local community. With the address information found on the Contact Us page, Google can determine, when a user’s intent query is “dentist near me”, how to triangulate the user’s geolocation and surface relevant dentist offices in their area.

Domain Trust/TrustRank
Trust is another important factor. Google continues to monitor sites during its crawls for any signs that a site might be phishing for information or has previously been hacked, whether other authoritative sites link to it, and more. The worst thing for Google’s ecosystem and feedback loop would be to ignore trust and surface a page where a visitor coming from a SERP may be exposed or vulnerable because the website has a low trust factor.

Site Architecture
There are two primary reasons for a well-structured site: first, to create topical, themed content silos so that Google understands what the site is about; second, Googlebot should be able to crawl the site naturally without needing a sitemap. If it cannot crawl naturally through semantic markup, it may flag content that it discovered inorganically via the sitemap which a normal user would not be able to reach.

In the early days, ghost pages or doorway/hallway pages were a technique used by black-hat SEOs to spam the search index with a larger footprint and rank for keywords through keyword stuffing. This gaming forced GoogleBot to work harder to crawl pages, which is why crawl budgets are enforced and ghosted pages are penalized at both the page and domain authority level.

Site Updates
Website updates have different impacts depending on which index the content is added to. Google News prefers freshness, and you generally see immediate ranking value before the decay factor kicks in.

Same goes for new content pieces.

While Google denies that update frequency is part of its core algorithm, there have nonetheless been studies showing that refreshed content does reach the top for long-tail keywords.

Presence of Sitemap
Sitemaps help Google discover pages as they are added or removed, but they are not a factor for optimizing ranking.

Site Uptime
GoogleBot again has finite resources, and there is a queue of workers that sends a bot out in two waves to crawl both server-side and client-side rendered content. If the site is not available at the time of the crawl, Google has to skip it and go to the next site in the queue, scheduling a time in the future to come back.

If a site is constantly inaccessible, it can hurt the ranking: it is deemed a bad user experience for a page to keep ranking on page 1 only for a user to click the link and find the site down.

The resulting bounce rate is factored in as a signal, further penalizing the page from staying ranked.

Server Location
The location of the server matters for how quickly Google can crawl the site. If a site is hosted in Europe and the bot is coming from North America, every request must hop across the ocean, adding latency and performance issues. This eats away at the crawl budget in terms of how long Google will wait for a page to load.

Additionally, the IP address block helps Google decide, the next time around, which GoogleBot location to crawl from so it is closer to the origin host and crawling is faster.

The IP block also helps identify which index the content should be ranked in. For example, if an IP block clearly belongs to a data center in Turkey, the document may be more relevant to users in Turkey searching for content.

SSL Certificate
SSL certificates are cheap or free to get. Google wants the web browsing experience to be safe and wants to avoid man-in-the-middle attacks, especially as users migrate to mobile-first browsing on public Wi-Fi.

A site without a certificate is factored into the trust score of whether the site is trustworthy, and it can be ranked lower than another site that took security seriously.

Recently, Google Chrome 80, released in early February, began enforcing new SameSite and Secure cookie rules. Cookies used in a cross-site context must be marked Secure, which requires an SSL certificate, for the site to continue functioning properly; otherwise those cookies will not be set, which could impact site functionality and/or conversions.
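As a minimal sketch of what those cookie attributes look like (using only the Python standard library, Python 3.8+ for SameSite support; the cookie name and value are hypothetical):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["cart_id"] = "abc123"            # hypothetical session value
cookie["cart_id"]["secure"] = True      # only sent over HTTPS
cookie["cart_id"]["httponly"] = True    # not readable from JavaScript
cookie["cart_id"]["samesite"] = "None"  # allow cross-site use (e.g. embedded checkout)

# The Set-Cookie header value a server would emit:
print(cookie["cart_id"].OutputString())
```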

Terms of Service and Privacy Pages
Terms of Service and Privacy pages are two important pages every site should have as legislation such as GDPR and CCPA comes into force. Many U.S. states are drafting general privacy regulation, and many are waiting to see how CCPA enforcement impacts businesses before fast-following.

In light of CCPA, Google has shifted much of its business to be categorized as a data processor rather than a marketer.

Many of Google’s strategies across consumer touchpoints are becoming privacy-first, and sites that are in compliance will have a higher trust factor and a positive ranking signal.

E-A-T, or Expertise, Authoritativeness, Trustworthiness, was at the core of Google’s August 1 algorithm update. Having these two pages will improve E-A-T.

Duplicate Meta Information On-Site
Many e-commerce and content sites on Shopify or WordPress will generate a standard meta title and description for every new product or blog page.

As a content producer, writing a unique meta title and description helps Google semantically understand what the page is about.

And when the page is indexed, it’s the title tag that is surfaced on the SERP, so it should be unique and specific to what the page is about.

If duplicate title tags are generated, the site can be considered spammy and ranking is affected; a quick audit sketch follows.
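Here is a hedged audit sketch (it assumes the third-party requests and beautifulsoup4 packages, and the product URLs are placeholders) that flags duplicate <title> tags across a list of URLs, for example pulled from your sitemap:

```python
from collections import defaultdict
import requests
from bs4 import BeautifulSoup

def find_duplicate_titles(urls):
    """Map each <title> text to the URLs that share it, keeping only duplicates."""
    seen = defaultdict(list)
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.get_text(strip=True) if soup.title else ""
        seen[title].append(url)
    return {title: pages for title, pages in seen.items() if len(pages) > 1}

print(find_duplicate_titles([
    "https://example.com/products/matte-lipstick",
    "https://example.com/products/gloss-lipstick",
]))
```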

Breadcrumb Navigation
Breadcrumbs provide a hierarchical way to understand the relationship between pages. We generally call this a taxonomy: it takes a generic root-level page and narrows it down to a very specific piece of content.

These breadcrumbs are also surfaced as sitelinks on Google’s SERP, which takes up more real estate and improves your site’s visibility against other listings.
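To make the breadcrumb trail explicit to Google, it can also be marked up as schema.org BreadcrumbList structured data. Below is a minimal sketch that emits the JSON-LD for a hypothetical category path (the names and paths are placeholders); the output would be embedded in a <script type="application/ld+json"> tag:

```python
import json

crumbs = ["Home", "Kitchen", "Cookware", "Woks"]   # hypothetical taxonomy
base = "https://example.com"

breadcrumb_ld = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {
            "@type": "ListItem",
            "position": i + 1,
            "name": name,
            "item": f"{base}/" + "/".join(c.lower() for c in crumbs[1 : i + 1]),
        }
        for i, name in enumerate(crumbs)
    ],
}
print(json.dumps(breadcrumb_ld, indent=2))
```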

Mobile Optimized
In 2019, Google announced mobile-first indexing for new sites. Older sites that were once crawled with a desktop crawler are now being switched to mobile-first as well.

Google now surfaces warnings and/or errors in Google Search Console if a site is not optimized for mobile.

Additionally, sites should be mindful of the plugins or apps they add: they may improve conversion, but they can also cannibalize visitor traffic and degrade the signals that let Google know the page loads quickly.

This is extremely important for mobile sites that are slow to load and render.

YouTube
The video widget takes up a large portion of the SERP’s real estate, which helps discoverability.

Newly published videos are pushed to the first few positions of the video carousel for a search term before their ranking begins to decay.

Shopping
E-commerce sites should take heed of the shopping widget on Google. For buyer intent queries, the shopping widget is surfaced above the organic search results.

Google owns the shopping section, and it is in their best interest to keep people on the Google experience as long as possible without sending them off to another domain.

To list your shop, you must send Google a feed of your site’s catalog.

Site Usability
When it comes to usability, it’s most important to understand the information architecture (IA) of a site. Many themes are too rich and don’t help users who are confused about how to navigate the site understand what to do next.

Frustration sets in, and that can hurt your bounce rate, dwell time, and page views, which are all signals to Google of a bad user experience.

Bad user experiences are a poor feedback loop for Google, and Google will penalize sites for them.

Favicons
Favicons are now displayed on the SERP, which gives a site its brand identity. Brand identity can be a powerful way to improve CTR. However, there is no correlation showing that favicons improve ranking.

Use of Google Analytics and Google Search Console
Through Google Analytics, Google can mine more specific behavioral information about how users interact with the site. Google Search Console simply reports data back to you.

Since Google Analytics is just one of many analytics sources Google can use to understand how people behave on your site, it doesn’t hurt to leave it out.

For those running AdWords against the same targeted pages, it is important not to bid so high that paid clicks cannibalize organic traffic.

That not only potentially lowers your margin, it also affects the CTR performance of a high-quality page.

User Reviews / Site Reputation
User reviews, or user-generated content (UGC), provide value to visitors making informed buying or research decisions. Studies done by e-commerce SEO teams have found that reviews do not improve ranking.

Accessibility (ADA Compliance)
Accessibility, or ADA compliance, improves a site for disabled users; screen-reader support is important.

Besides the risk of getting sued for not being ADA compliant, sites that are in compliance are seen to rank slightly better.

Backlink Factors

Linking Domain Age

An aged domain can carry valuable domain authority and backlinks that help new content rank faster off its existing backlink profile. However, if a domain has been expired for too long, all of that historical information is reset.

# of Linking Root Domains

The number of unique, highly authoritative root domains linking in with dofollow links has been correlated with higher ranking positions in the top 10.

# of Links from Separate C-Class IPs

Link farms generally work with multiple domains hosted on the same server, or from the same hosting provider within the same C-class IP block. Natural link building comes from separate C-class IP blocks, which helps as a ranking signal.

# of Linking Pages

Each linking page carries a specific amount of page equity and domain authority. Links from different pages on the same domain have an effect on ranking in the SERP.

Backlink Anchor Text
In the early days of SEO, the text encapsulated in an anchor tag provided information about what the linked page might be about. While today this is not as important as the value of the target content, it is nonetheless still a signal that can improve ranking.

Over-stuffing the anchor text with keywords can also send a negative signal.

Don’t overthink this one.

Alt Tag (For Image Links)
Alternate text gives screen readers context about what an image shows, from an accessibility standpoint. Google leverages it, combined with the file name and the context around the image, to understand more about the image and rank it in image search.

Taking care to add alt tags is another signal that positively impacts ranking.

Links from .edu or .gov Domains
While .edu and .gov links used to carry weight in the early days of Google, when such documents were assumed trustworthy, today backlinks from these TLDs are not as strong a signal anymore.

There are plenty of cases where sites are ranking just fine without these TLDs.

Authority of Linking Page
In the early days, PageRank was the score Google used to determine the strength and authority of a page. A high-authority page linking inbound to your site improves your page’s rank and authority.

Authority of Linking Domain
Each domain has a domain rating and trust factor. Backlinks from unique domains with a high domain rating or score have been correlated with improved page ranking and are thus a ranking signal.

Links from Competitors
This one is debatable, because links from competitors give Google clues about what the page is about and the niche the site is in. That provides thematic or topical understanding of the pages so they are ranked against the correct search intent queries.

Links from “Expected Websites”
Some experts believe that links garnered from other sites in your industry build up trust.

Links from Link Farms
This black-hat technique may work and get your content ranked quickly, but if you are playing the long game, link farms are generally easy to detect, and Google can easily penalize every site linked from them.

It’s best to avoid link farms or questionable sites that are irrelevant to your site.

Guest Posts
Guest posts are an easy way to get sites to link back to you. However, Google is now able to pick up clues on guest posts that explicitly call themselves out as a “guest post”, and they don’t provide as much link strength as they once did.

Large-scale guest posting can put you in a penalty box, as many of those sites have questionable content or copy that reads differently from post to post.

Links from Ads
Links from ads are generally nofollow by default. Google can usually discern which links come from an ad due to the nature of the redirects and patterns used by many advertising networks.

Homepage Authority
Receiving backlinks from a homepage is valuable, as many root-level pages have large amounts of PageRank equity they can pass on.

Nofollow Links
Google does not follow nofollow links because they explicitly tell Googlebot not to follow, crawl, or index those links. Therefore, Google does not establish a relationship between the two pieces of content, and no equity leaks across.

Diversity of Link Types
Organic backlink profiles are diverse: some links come from blog posts, guest posts, comments, forums, and more. If a site receives too many backlinks from comments, it can be considered spam even with a nofollow attribute.

“Sponsored” or “UGC” Tags
Anchor links with rel="sponsored" or rel="ugc" may be used as a "hint" for crawling and indexing, and as a "hint" for ranking, per Moz.

Contextual Links
Links discovered in context are more valuable than standalone links. Imagine a human reading a piece of great content with a hyperlink inside it versus a hyperlink floating alone on a page: which looks more legitimate and has the higher probability of a click-through? Google is able to understand the context in and around anchor tags better than links without any context.

Excessive 301 Redirects to Page
301 redirects help pass along link equity. However, too many redirects create a bad user experience due to the extra round trips, and link equity can be lost on each hop. Therefore, review your links to make sure they take the fewest possible hops from point A to point B; a quick way to inspect a chain is sketched below.
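As a rough sketch (assuming the third-party requests package; the URL is a placeholder), this prints the redirect chain for a URL so hops that could be collapsed into a single 301 stand out:

```python
import requests

def redirect_chain(url: str):
    """Return the (status_code, url) hops followed before the final response."""
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [(r.status_code, r.url) for r in resp.history]
    hops.append((resp.status_code, resp.url))
    return hops

for status, hop_url in redirect_chain("http://example.com/old-page"):
    print(status, hop_url)
```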

Internal Link Anchor Text
Many major retail sites such as Macys.com and Walmart.com have link blocks designed for the purpose of passing trapped link equity to the pages that need it most. While many SEO experts say internal links are weighted differently than external links, the real problem is over-building internal links, which dilutes a page’s link equity or PageRank.

Thus other experts have found that a “striking distance” SEO strategy is a better use of internal linking than a related-posts widget.

Link Title Attribution
The link title attribute is another signal used for ADA compliance and accessibility. It therefore provides a relevancy signal for users who may be disabled.

Country TLD of Referring Domain
Links from country-specific TLDs will often help you rank better in those countries’ indexes, assuming the indexes are not shared. For example, Google.com is shared between the U.S. and Mexico in terms of indexing; the different result sets depend on the browser’s preferred language, the originating IP, and whether the user landed on Google.com.mx or Google.com.

Link Location in Content
Links discovered early in the content carry a stronger signal than links positioned later in the document.

Link Location on Page
Links can be found in different parts of a page’s layout. Many WordPress sites have links in the side rails; those carry less strength than links found in the primary body, front and center.

Linking Domain Relevancy
Backlinks from sites with the same topical authority are more valuable than links coming from unrelated sites.

Page-Level Relevancy
Links from pages whose content is relevant to the page they link to are more valuable than links from unrelated pages.

Keyword in Title
If a site anchors a link to your page with keywords in both the anchor text and in your title, the signal is a strong ranking factor.

Positive Link Velocity
Links built with natural link velocity often see higher rankings. This ties into Google’s decay algorithm, which re-evaluates how relevant content is around trending or seasonal queries.

Negative Link Velocity
Backlinks can be lost after a domain expires or is migrated. Lost backlinks can negatively impact page ranking.

Links from “Hub” Pages
Hub pages are directory-style pages. Directories such as Yelp are in their own classification, so receiving a backlink from Yelp can categorize your site as a local listing and make it appear differently for locale-specific queries.

Link from Authority Sites
High-authority sites with a high domain rating and page score pass on more link juice, boosting not only the target page but also the target domain.

Linked to as Wikipedia Source
Links from Wikipedia are nofollow, but the byproduct is that other sites use Wikipedia for research and will generally link out to the sites cited there, passing on link equity.

Contextual text
Contextual text surrounding a backlink helps Google better understand what the piece of content it links to is about. This is why inline links are better than floating links.

Backlink age
Backlinks age with a decay factor. The longer a backlink stays live, the better the page ranks.

Links from Real Sites vs “Splogs”
In the early days, link farms would use sites such as Del.icio.us or Blogspot to quickly build authority for new link farms or sites. Today, Google is smart enough to discount these as valuable link sources and instead emphasizes natural linking from real sites.

Natural Link Profile
A black-hat SEO will tend to get overzealous and build links too fast. It only takes a few link farms getting caught by Google for everyone receiving backlinks from them to be penalized. Instead, focus on a natural linking strategy that comes from well-written content of real value.

Reciprocal Links
Private link farms will generally ask members to participate in reciprocal links. Since these link exchanges span different niches, getting caught is much easier, and they should be avoided at all costs if you plan to have your site around for multiple years.

User Generated Content Links
UGC links, such as those found in forums and comments, are not of high value compared with links found within content pieces.

Links from 301
301 redirects help GoogleBot crawl and index content faster. With each hop, some link juice is lost. There is generally no way around it other than avoiding excessive redirects and not sunsetting pages without a real reason.

Schema.org Usage
Structured data and microformats are another way for Google to learn more about a page and to index specific sets of data differently. An e-commerce site may want pricing and reviews served next to its listing on the SERP, whereas an events site may want to index events that can go stale quickly.

Each of these structured data types is rendered on the SERP differently and takes up more real estate than a standard listing, so you receive higher click-throughs, more dwell time, and eventually higher ranking as a result of this positive feedback loop. A minimal product markup sketch follows.
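As a minimal sketch of schema.org Product markup (the product name, price, and rating values are hypothetical), the JSON-LD below would be embedded in a <script type="application/ld+json"> tag on the product page so pricing and review data can appear alongside the listing:

```python
import json

product_ld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Black Matte Lipstick",
    "image": ["https://example.com/images/black-matte-lipstick.jpg"],
    "description": "Long-wear black matte lipstick.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "18.00",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "132",
    },
}
print(json.dumps(product_ld, indent=2))
```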

TrustRank of Linking Site
Examples of trustworthy sites are The New York Times or The Washington Post. When these sites link to you, their trust is assumed to pass on to your site, giving your domain credibility.

Number of Outbound Links on Page
PageRank, sometimes called link juice, is finite. With each hop, a little link juice is lost as it is passed to a target page. Therefore, limit the number of outbound links or use rel=nofollow to avoid the loss of link juice while still maintaining a great user experience for your visitors.

Forum Links
Google has caught on to forum spamming for links. Most forum links are rel=nofollow, which passes no link equity.

Word Count of Linking Content
Longer content is more valuable because it expands the keyword footprint, topical authority, and depth and breadth, engages users to spend more time, and attracts more backlinks. All of these positive signals stem from the higher word count of the linking content. Aim for a minimum of 2,000-word posts.

Quality of Linking Content
Many sites may charge $5-$50 for a backlink. These are cheap, simple links because there is no content to write. They are generally the first to get flagged by Google, compared with links from sites that pay someone to write a 500-1,000 word piece.

Sitewide Links
In some link-building schemes, links are placed in footers sitewide. In these cases, the link equity is deduplicated and consolidated as a single link.
