
<<First Name>>,

These are strange times we are in. I’m sure you’re having trouble (like I am) focusing on work with the constantly changing news. And that's part of the reason this newsletter is so delayed.

I think this current situation is a good reminder of how to maintain your mental health as things change (as they do in digital marketing as well). Things like:

  1. Plan and control what you can.
  2. Stay agile and keep learning.
  3. Hone your persuasion skills, as ultimately decisions are made by people.
  4. Take digital-free breaks to regulate your stress (ideally outside) or adopt other self-care activities.
  5. Help others if you can (I got a chance to help one of my clients pivot to online concerts; check them out on Thursday nights).

In my personal life I’ve been using those same skills to plan for schools being closed, having supplies on hand, and persuading family members to stay safe. I’ve also taken digital breaks outside. I highly encourage all of you who can to get outdoors. Here's a photo from my family’s National Botanical Gardens outing last weekend.
 


As a marketer who focuses on digital - you've got this.

Whether it’s COVID-19 or keeping your marketing chops and persuasion skills up to date. 

If you’re interested in learning more about how you can hone your interpersonal skills, you’ll want to sign up for my new podcast. My co-host and I interviewed successful digital marketers about how they handled the “people” stuff of digital marketing and (at times) led the organizational change that made their digital marketing successful.

Go ahead and sign up now to be notified when it launches.

OK. To help you stay on top of digital marketing changes from the last month, here are the essential bits that you might have missed: 

Google Announcements and Updates

Google will switch entirely to mobile-first indexing by September 2020

Here’s a link to Google’s announcement. Google is now sending notifications to sites that haven’t been switched over, warning that they need to move now or potentially face indexing and ranking issues.

If you aren’t sure whether you’ve been switched over, you can check the Settings page in GSC:

A change to Google Images

Google Images is now showing icons on desktop that provide useful information, indicating whether images lead to pages with products for sale, recipes, or video content. You can see more here:

Google is ignoring paid links and might get rid of nofollow  

John Mueller hinted at how Google handles paid links in a recent Google Webmaster Hangout. And from Pubcon, you can see Google also hinted that they might get rid of nofollow altogether:

Your takeaway:

Start using Google’s UGC and sponsored link identifiers if your site has that type of content. Google's instructions are here. And don’t buy links. This has been my recommendation to my clients since I started in SEO...
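If you want a quick way to audit a page for external links that are missing these attributes, here is a minimal sketch (my own illustration, not Google tooling). It assumes requests and beautifulsoup4 are installed, and the URL is just a placeholder:

    # Flag external links on a page that carry no rel qualifier at all.
    from urllib.parse import urlparse

    import requests
    from bs4 import BeautifulSoup

    PAGE = "https://www.example.com/blog/post/"  # placeholder: the page to audit

    html = requests.get(PAGE, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    site = urlparse(PAGE).netloc

    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if host and host != site:                      # only external links
            rel = set(a.get("rel") or [])              # e.g. {"nofollow", "sponsored"}
            if not rel & {"nofollow", "sponsored", "ugc"}:
                print(f'{a["href"]} -> no rel qualifier ({a.text.strip()[:40]!r})')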

Changes to change of address tool

Google has made some improvements to the Change of Address tool, a critical tool to use if you’re changing domains (even just from HTTP to HTTPS).

Your takeaway:

Familiarize yourself with the tool if you have any upcoming domain moves.

Signs that Google is ranking based on entity metrics

I’ve been watching this for the past year or so, and now Bill Slawski has surfaced a new patent entitled “Ranking search results based on entity metrics,” another indicator that Google might be using entity signals in its ranking algorithm. The patent hints that Google is using machine learning to calculate the probability of user intent based on both user language AND tone. Here’s the blurb from the patent:

“An entity type associated with the search result is determined, wherein the entity type is determined at least in part from the knowledge graph. A weight is determined for each metric of the plurality of metrics based at least in part on the entity type.”  

I've been helping my clients improve their entity understanding in Google's eyes. Just hit reply to this email if you're interested in learning more.

Ranking factors for Google Discover:

Here's another great share from Pubcon about how to get more Google Discover traffic. NOTE: if you DO have traffic from Google Discover, the only way to see that traffic is in Google Search Console.

Googlebot does not click “load more” buttons

Google has shared that if the “load more” button is powered by JavaScript (and I shared ways that you can load this in CSS in last month’s newsletter), then Googlebot won’t click it and won’t see the content underneath.

Via @johnmu: Googlebot will not click a "load more" button. In the past, Google might try to trigger that, but it's expensive. Instead, Google uses frame expansion to render the page on a very long viewport. It'll do that once & see what loads, then index what it can see in that viewport/frame. So if your content is outside of that viewport/frame, they won’t index it.

Here’s the spot in a Hangout video where John Mueller explains this:

Keep in mind that EVEN if the content is indexed, it probably won’t rank as well as content that is visible on page load. 
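If you're wondering what the crawl-friendly alternative looks like, here is a rough sketch (my own example, not Google's guidance verbatim) that serves "load more" content as real paginated URLs joined by plain anchor links, which Googlebot can follow without clicking anything. Flask and the item list are stand-ins:

    # Serve paginated listing pages linked with a plain <a href> "next" link
    # instead of a JavaScript-only "load more" button.
    from flask import Flask, request

    app = Flask(__name__)
    ITEMS = [f"Article {i}" for i in range(1, 101)]    # stand-in content
    PER_PAGE = 10

    @app.route("/articles/")
    def articles():
        page = max(int(request.args.get("page", 1)), 1)
        start = (page - 1) * PER_PAGE
        items = "".join(f"<li>{item}</li>" for item in ITEMS[start:start + PER_PAGE])
        # A crawlable link to the next page replaces the JS-only "load more" button.
        next_link = (f'<a href="/articles/?page={page + 1}">More articles</a>'
                     if start + PER_PAGE < len(ITEMS) else "")
        return f"<ul>{items}</ul>{next_link}"

    if __name__ == "__main__":
        app.run()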

New Review Snippet reporting from Google

Google announced new reports within Google Search Console to help you with your review snippets. It highlights any errors there might be in the structured data.  There is also new support for review snippets in the rich result testing tool.

New Episodes of Search Console Training

Google has released a ton of new training around how to use the Google Search Console. Here’s what they announced over the past month: 

What Rich Results are and how to improve them

The session covers:

  1. Monitor and optimize your site’s appearance and performance with Google rich results
  2. Determine errors in your structured data
  3. Validate changes on your site through communication with Google

Click here to watch it on YouTube
 
Improving your AMP implementation 

XML Sitemaps

The session talks about XML sitemaps. You don’t need to worry about a sitemap if you have a small site, but you should listen to the training if your site meets any of the following criteria:

  1. Your site is really large

    • A sitemap will help Google prioritize the URLs to crawl

  2. Your pages are isolated or not well linked to each other

    • A sitemap might help Google find those pages

  3. Your site is new, or has a lot of quickly changing content (e.g., a news site)

    • A sitemap will help Google discover your content

Keep in mind that GSC will only show the sitemaps that have been submitted, not XML sitemaps discovered via robots.txt.
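If you want to see which sitemaps a site advertises in its robots.txt (since GSC won't list them unless they were also submitted), a tiny standard-library sketch like this will do; the domain is a placeholder:

    # List the Sitemap: entries a site declares in its robots.txt file.
    from urllib.request import urlopen

    DOMAIN = "https://www.example.com"   # placeholder: the site to check

    with urlopen(f"{DOMAIN}/robots.txt", timeout=10) as resp:
        robots = resp.read().decode("utf-8", errors="replace")

    for line in robots.splitlines():
        if line.lower().startswith("sitemap:"):
            print(line.split(":", 1)[1].strip())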

Bing Announcements and Updates

Using Bing to understand searcher’s intent

Look at how Bing has pivoted their results! They are blatantly adding filtering to help searchers focus their searches...also great insight for you as you're selecting keywords and writing content. See the screenshot below:

Bing is starting to use Turing Natural Language Generation (T-NLG)

It's similar to Google’s BERT, and the SEO community has said that there’s nothing you can do to “optimize” for it. However, it seems as though Google will release a curriculum, according to this tweet:

The goal for T-NLG is to write human-like summaries for a wide range of text documents -- the blog post announcing this says emails, blog posts, Word documents, Excel sheets, and PowerPoint presentations can all benefit from T-NLG. They hope that it will also improve the fluency of chatbots and digital assistants.

Here’s the tweet announcing it:

Tips for Bing Featured Snippets

As usual, Bing needs clearer signals than Google - note the specific guidance around using HTML paragraph signals (<p>).

Technical SEO

Building a Robots.txt file that works

What a great guide! A few reminders for me (and some new information):

  1. If you ever call out Googlebot specifically in the file, you then need to duplicate ALL of the rules under the Googlebot line, or Googlebot will NOT follow them (once it finds its own group, it ignores the generic rules).

  2. The priority of the rules is not determined by their order, but by the character length of the rule.

  3. When you have two rules with the same length and opposite behavior (one allowing crawling and the other disallowing it), the less restrictive rule applies.

This is why you should always test your robots.txt before releasing it, either with the tool within Google Search Console or by running a crawl with the new custom robots.txt file in your favorite crawling tool.
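To make those precedence rules concrete, here is a toy sketch of the matching logic (my own simplification: it ignores wildcards and is nothing like Google's real parser):

    # Toy model of robots.txt precedence: the longest matching rule wins,
    # and on a tie between allow and disallow, the allow (less restrictive) wins.
    RULES = [                      # hypothetical rules from a user-agent: Googlebot group
        ("disallow", "/shop/"),
        ("allow", "/shop/sale/"),
        ("allow", "/p"),
        ("disallow", "/private"),
    ]

    def is_allowed(path: str) -> bool:
        matches = [(len(rule), verb) for verb, rule in RULES if path.startswith(rule)]
        if not matches:
            return True                     # no matching rule: crawling is allowed
        best = max(length for length, _ in matches)
        verbs = {verb for length, verb in matches if length == best}
        return "allow" in verbs             # tie between allow and disallow: allow wins

    for path in ("/shop/sale/shoes", "/shop/cart", "/private/notes", "/page"):
        print(path, "->", "allowed" if is_allowed(path) else "blocked")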

503 + 429 codes = slows the crawl rate, might result in pages being removed

Gary Illyes says Google will slow down crawling for pages returning those codes, and if the codes persist for a longer period of time, those pages can get removed from the index.

Safari to start only accepting HTTPS certificates valid for 13 months or less

Starting September 1, Safari will only accept HTTPS certificates with a validity period of 13 months or less.

Long-lived security certificates used to be available, but Safari will no longer accept them.

Michael Karg (in a tweet) explained why:

The challenge is that if there’s no valid certificate on the old domain then users will see a certificate warning instead of being redirected:

So, of course, the next question is how long to keep the certificate valid on the old domain.

John Mueller recommends keeping it valid indefinitely so that the domain doesn’t get grabbed by spammers.

Here’s the tweet with this answer:
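If you want to check when a certificate actually expires without waiting for a browser warning, a small standard-library sketch like this works; the hostname is a placeholder:

    # Print the expiry date of a host's HTTPS certificate.
    import socket
    import ssl
    from datetime import datetime, timezone

    HOST = "www.example.com"   # placeholder: the domain you're auditing

    ctx = ssl.create_default_context()
    with socket.create_connection((HOST, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            cert = tls.getpeercert()

    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
    days_left = (expires - datetime.now(timezone.utc)).days
    print(f"{HOST} certificate expires {expires:%Y-%m-%d} ({days_left} days from now)")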

How to deal with expired products

Here's a great infographic on how to deal with discontinued products on an eCommerce site. 

Additionally, Wil Reynolds kicked off a conversation on Twitter suggesting that eCommerce sites give shoppers an option to be notified when a product comes back in stock. It seems as though Zappos, Lulus and Tortuga Backpacks already do this.

Google's behavior after merging websites

John Mueller clarified in a recent Hangout (see below) that when you merge multiple websites into a single domain, or split a website onto multiple domains, it takes Google much longer to process, and the outcome is more unpredictable than simply moving a website to a new domain.

Here are my notes from the video:

  • A move like that takes a lot longer for Google to process and it’s hard to determine what will happen at the end.
  • With the merge, it creates (in Google’s eyes) a new site. It might be that the final state is much different than the individual sites you merged together.
  • There’s no real “trick” to handling this. Take it page by page and migrate it to the new bigger website.

Crawl issues might impact Hreflang pickup

Gary Illyes clarified that hreflang annotations aren’t processed until all pages in the set are crawled and indexed: for the annotations to be picked up, every linked page needs to be crawled and indexed. This means that you *really* need to ensure that Googlebot can effectively crawl all of those linked pages.
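A quick way to sanity-check this is to pull the hreflang alternates from one page and confirm each linked URL at least returns a 200. Here is a rough sketch (my own; it assumes requests and beautifulsoup4, and the start URL is a placeholder):

    # Check that every hreflang alternate linked from one page responds with a 200.
    import requests
    from bs4 import BeautifulSoup

    START = "https://www.example.com/en/"   # placeholder: any page in the hreflang set

    soup = BeautifulSoup(requests.get(START, timeout=10).text, "html.parser")

    for link in soup.find_all("link", hreflang=True):
        if "alternate" not in (link.get("rel") or []):
            continue                        # only rel="alternate" hreflang tags
        url = link["href"]
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
        print(f'{link["hreflang"]:>8}  {status}  {"OK" if status == 200 else "CHECK"}  {url}')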

What is the lift on implementing structured data markup?

Great numbers you can use for your benchmarking efforts from Abbey Hamilton’s presentation at SMX West:

  • Review markup can create a +3.6% increase in CTR
  • Article markup in the health niche can result in a 16% increase in clicks
  • Recipe markup resulted in an incredible 101% increase in sessions

You can see the full deck here.

Python script to find all mixed content warnings shown to your site visitors

If you are interested in playing with this, it seems as though Josh will DM you the script.
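To be clear, the sketch below is not Josh's script, just a minimal stand-in showing the idea: flag http:// resources referenced from an https page, which is the classic mixed-content trigger. It assumes requests and beautifulsoup4, and the URL is a placeholder:

    # Flag insecure (http://) resources referenced from an https page.
    import requests
    from bs4 import BeautifulSoup

    PAGE = "https://www.example.com/"   # placeholder: the page to check

    soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")

    for tag in soup.find_all(["img", "script", "iframe", "link", "source", "video", "audio"]):
        for attr in ("src", "href"):
            url = tag.get(attr, "")
            if isinstance(url, str) and url.startswith("http://"):
                print(f"Mixed content: <{tag.name} {attr}={url}>")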

Lighthouse now has CMS specific guidance

It has had WordPress advice for months, but now has actionable guidance for React, Angular, Magento, and AMP. I’m patiently awaiting Drupal guidance for my clients…

Last mod date in your XML sitemap file

This should *not* be set to when the sitemap was generated; instead, it should be specific to each page and indicate the last time the page’s content was changed.
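For a static site, this is easy to automate. Here is a minimal sketch (my own example; the build directory and domain are placeholders) that writes a sitemap whose lastmod comes from each page's own file modification time rather than from the build time:

    # Build a sitemap whose <lastmod> reflects each page's file modification time.
    from datetime import datetime, timezone
    from pathlib import Path

    SITE_ROOT = Path("public")                 # placeholder: the build output directory
    DOMAIN = "https://www.example.com"         # placeholder: the live domain

    entries = []
    for page in sorted(SITE_ROOT.rglob("*.html")):
        lastmod = datetime.fromtimestamp(page.stat().st_mtime, tz=timezone.utc)
        loc = f"{DOMAIN}/{page.relative_to(SITE_ROOT).as_posix()}"
        entries.append(f"  <url>\n    <loc>{loc}</loc>\n"
                       f"    <lastmod>{lastmod:%Y-%m-%d}</lastmod>\n  </url>")

    sitemap = ('<?xml version="1.0" encoding="UTF-8"?>\n'
               '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
               + "\n".join(entries) + "\n</urlset>\n")
    Path("sitemap.xml").write_text(sitemap)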

Content

Google is “white labeling” a set of health sites

I saw this while working on various clients in February, and now I’ve discovered a Google patent that adds credence to what I’ve been telling my clients, which is this:

It might now be impossible to rank for certain health queries: it seems as though Google has a “white-labeled” set of sites it uses for health-related search results and is only pulling from that set of sites (the client we were looking at was in the pharma industry).

Here’s the patent that Bill Slawski discovered that speaks to this phenomenon: “Google Using Website Representation Vectors to Classify with Expertise and Authority.” In it, Bill writes about how a recent Google patent describes how Google could use something called Website Representation Vectors to classify certain features found on websites.

Here are the crib notes from the patent:

  • Google can use Neural Networks to understand certain patterns in websites so that they can classify those sites as an authority within domains like health and finance, etc.
  • They can also determine the degree of expertise based on the topic. 
  • Google can use terms from a searcher’s query to understand which expertise group of websites to draw from.
  • How a website is classified can be determined from many factors including text or images from the website, links, and more. 

Note that this patent was filed in August of 2018, around the same time as the Medic Update.

The patent also notes that Google can determine the level of expertise based on many factors including content on the page, links on the page (perhaps links to the page), etc.

To add support to the idea that Google may be using this in search, check out an example of how Google surfaces the E-A-T of your authors by displaying a Knowledge Panel:

Common content problems that lead to algorithm update issues

Be aware of these, as when the algorithms change they might lead to ranking issues:

Making sure that evergreen content is NOT a blog was not on my audit list, but it is now!

Google prefers HTML over PDFs but is testing mobile screenshots of PDFs in search

Note that Google said at Pubcon Florida in 2019 that they strongly prefer an HTML version over a PDF. So, if you have the ability to create HTML versions, it may help, but maintaining a PDF shouldn’t stop it from being indexed.

Featured Snippets are also being generated for local queries

Importance of site hierarchy

Full disclosure: I’m sharing this with you so that I also have it documented for a future audit.

Here’s the question from a Google Hangout:

And John’s answer:

Internal linking is really important. It is almost the best way to understand the context of individual pages within your website. It is a lot easier for us to understand that this is the hierarchy of your pages — these are the higher-level pages, the more important pages, and the less important pages. All of that is something that is really important for us to be able to pick up on. It is something that is reflected in a lot of places in search.

For example, something that I saw recently: If you have a really flat hierarchy in that you have all of your pages linked from all of your other pages, then we can’t really tell which of these pages are more important than others — which can lead to situations like a site link on one of your pages shown on your search result that points to a completely unrelated page.

From our point of view, we think, “All of these pages are all equally important and they are all related because they are all linking to each other. Taking any random page and using that as a site link is something that could be reasonable.” Whereas if you have a clear hierarchy, it is a lot easier for us to understand that this part of the website belongs together and this other part belongs here and that this is the higher-level part. All of this is a lot clearer to understand.

With that said, the number of clicks to reach a page from your homepage is something that helps us to kind of understand how that page fits in, but it is not necessarily that we would say it is bad if it is above three or four or five. The important part is really that you have a clear hierarchy, and when you have that hierarchy, the actual number of clicks is less important.

That said, if you have something that you really want to highlight on your website and you kind of say, “This is my favorite or best-selling product,” or, “This is something completely new I want to promote,” then having that linked more prominently within your website definitely makes sense because then we can understand that you really care about this new thing or unique thing and we can show it appropriately and more visibly in search.

- John Mueller, February 7, 2019

Voice SEO

Power of Q&A  for voice

From Greg Gifford at Pubcon:

Q&A is a direct feed into Google's entity database and it serves voice search answers. DO NOT NEGLECT IT.

I agree.

Google Assistant can read any page out loud 

Google announced that the Assistant can now read the content of any article out loud. If for some reason you don’t want to use this feature, you can use the nopagereadaloud tag:

Developers can also add this read-out-loud option to their mobile apps by using Actions on Google.

Local SEO

Google ordered to reveal the identity of the person who left a bad review

An Australian court has ordered Google to provide the details of an anonymous user who defamed a Melbourne dentist with their not-so-kind review. Previously, people could leave reviews anonymously. If orders like this spread beyond Australia, it would be huge for both small businesses and multi-location businesses.

Social Media Updates

FTC is cracking down on non-disclosed influencer marketing

Here’s the FTC’s statement. Make sure to disclose if you’re using an agent who is promoting you or if you have influencers who promote your content.

SEO Measurements

How to pull ALL of  your data out of Google Search Console

Learn how to pull all of your search analytics data out of Google Search Console and set yourself up for serious analysis. (All code is included and linked on GitHub.)
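If you'd rather not wait to read the tutorial, here is a rough sketch of the core Search Analytics API call (my own example, not the linked code). It assumes google-api-python-client and google-auth are installed, you have a service account key, and that service account has been added as a user on the Search Console property; the paths and site URL are placeholders:

    # Page through Search Analytics rows for one month, 25,000 rows at a time.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SITE = "https://www.example.com/"          # placeholder: your verified property
    creds = service_account.Credentials.from_service_account_file(
        "gsc-key.json",                        # placeholder: service account key file
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    service = build("webmasters", "v3", credentials=creds)

    rows, start_row = [], 0
    while True:
        resp = service.searchanalytics().query(siteUrl=SITE, body={
            "startDate": "2020-02-01",
            "endDate": "2020-02-29",
            "dimensions": ["query", "page"],
            "rowLimit": 25000,
            "startRow": start_row,
        }).execute()
        batch = resp.get("rows", [])
        rows.extend(batch)
        if len(batch) < 25000:
            break
        start_row += 25000

    print(f"Pulled {len(rows)} rows; first row: {rows[0] if rows else 'none'}")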

De-duping search results might roll out to other search features

Last month we mentioned that if you are in the Featured Snippet spot, you will no longer also rank at position #1, and it seems that behavior might extend to other search features in the future:

Using Python to scrape People Also Ask

Screaming Frog released instructions a few months ago for how to use its tool to scrape the visible PAAs (the four shown by default) for data analysis (important if you’re trying to write for a Featured Snippet), and this SEO has taken that a step further with instructions for how to scrape all of the PAAs using Python.

New SERP eye-tracking research from Nielsen Norman Group

I have been using their eye-tracking research on SERPs for years in my training presentations and they have done new research showing that the search result changes we’ve all seen have definitely impacted how people engage with the result pages.

Searchers now follow a pinball pattern (no longer an F-shaped pattern)

“Because search-results pages are now so inconsistent from query to query, users are often forced to assess the page before digging in and making a selection. That means that the layout of a SERP can determine which links get visibility and clicks.

The inconsistency in SERP layout means that users have to work more to process it than they used to. It may be that search engines try to encourage people to explore more than just the first result. Yet, people are fairly fast in choosing a search result— we found that users spent an average of 5.7 seconds considering results before they made their first selection (with a 95% confidence interval from 4.9 to 6.5 seconds).” - Nielsen Norman Group

This means that “rank tracking,” where the top result “wins” the most clicks, is broken. The current pattern of user behavior is to look at a variety of places on the SERP to determine which element answers the question. You need to know which features draw the most attention on the SERPs you are trying to compete on, and most SEO tools don’t measure that.

Danny Sullivan from Google agrees that rank tracking is an outdated model of SEO measurement

Here’s how Google measures “position” in Google Search Console:

“Most of the search result elements have a numeric position value describing their position on the page. The following desktop diagram shows a search results page with four blue link sections (1, 3, 4, 5), an AMP carousel (2), and a Knowledge Panel card (6). Position for each is calculated from top to bottom on the primary side of the page, then top to bottom on the secondary side of the page (in right-to-left languages, the right side is the primary side), as shown here” - Google

“The location of the result elements on the page can vary depending on the device type, search features, and the screen size, but the general rule is the same: position is calculated top to bottom, left to right (or right to left for RTL languages). Note that this method of calculation might change in the future.

All links within a single element have the same position. For example, in the previous diagram, all items in the AMP carousel have position 2; all links in the “blue link” block at position 5 have position 5; all links in the knowledge card have position 6, and so on.” - Google

The Nielsen Norman Group was also able to see that this is impacting clicks across search positions (though their data doesn’t break down CTR by SERP feature).

Moving forward we need to start measuring absolute rankings: the position on the page that includes the ads and Featured Snippets. Hopefully we will get this in the SEO tools soon. For now, you need to look at the SERPs and annotate your reporting.

New Tools and Resources

Pentest subdomain finder

A free tool to spot any subdomains associated with the domain you’re evaluating (any other subdomain finder works for this, too).

Stay healthy and disconnect when you can to stay calm during these weird times. If you’re like me and trying to work with little ones at home, this is a great playlist to try to block them out. Or you can use the Noisli app to run coffee shop sounds.


If you need help during this time of craziness, feel free to book my free 15-minute consult.

Stay safe!

Best,

Katherine Watier Ong
WO Strategies LLC
www.wostrategies.com

Facebook
Twitter
Website
Pinterest
LinkedIn
YouTube
Copyright © 2020 WO Strategies LLC, All rights reserved.

Our mailing address is:
WO Strategies LLC, 6933 Tyndale St, McLean, VA 22101

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.

Email Marketing Powered by Mailchimp