November Google SEO Office Hours

Tuesday, November 29, 2022

Thanks to everyone who submitted questions for the November edition of the Google SEO office hours! In this episode, you'll hear answers from folks on the Google Search team: Gary Illyes, Lizzi Sassman, John Mueller, Alan Kent, and Duy Nguyen. You can check out the full recording on our YouTube channel, and we're also publishing the transcript of the questions for easier reference in this blog post.

We've been experimenting with how we do office hours over the past few months, so you may see us trying variations over time. The goal is to keep the overhead of publishing low and to make it easier for other people on the Google Search team to contribute answers. We're currently aiming to publish this audio-only format on a monthly cadence.

Want to submit your question for the December edition of Google SEO office hours? The form is open and you can submit a question for next month's edition. We recommend focusing on general questions related to Google Search and SEO; if you have a specific question about your website, the Google Search Central Forum is the best place to go for site-specific help.

How did you like the new setup? Drop us a note with the Send Feedback button on this blog post, or leave a comment on the YouTube video!


Can we use multiple values in one schema markup field using comma separation?

Lizzi: Abhishek is asking, "Can we use multiple values in one schema markup field using comma separation? For example, gtin="value1, value2"?"

Well, as always, you should check the documentation for the particular feature, because some of the guidance can differ from feature to feature. But in general, it's good markup practice to specify one value per field. In the case of gtin, there should really only be one value, since this is a unique product identifier. If you're specifying both a GTIN and an ISBN, use the gtin property for one and the isbn property for the other, so that we know which value applies to which property.
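
As a rough illustration, here's a minimal JSON-LD sketch with made-up identifier values, assuming a book that's also sold as a product (the isbn property belongs to schema.org's Book type, so the item is multi-typed here). The point is simply that each identifier gets its own field rather than a comma-separated list:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": ["Product", "Book"],
      "name": "Example Book",
      "gtin13": "9783161484100",
      "isbn": "9783161484100"
    }
    </script>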

What are the options to use the disavow feature in Search Console for domain properties?

John: Pierre asks: "The disavow feature in Search Console is currently unavailable for domain properties. What are the options then?"

Well, if you have domain-level verification in place, you can verify the prefix level without needing any additional tokens. Verify that host and do what you need to do. Also, keep in mind that disavowing random links that look weird, or that some tool has flagged, is not a good use of your time. It changes nothing. Use the disavow tool for situations where you actually paid for links and can't get them removed afterwards.
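
For reference, if you do end up needing it, the disavow tool takes a plain text file with one entry per line, in the documented format shown below; the sites here are placeholders:

    # Paid links we could not get removed
    https://spam.example.com/paid-links/post123.html
    domain:link-network.example.com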

How bad is the helpful content update for sites accepting guest posts for money?

Duy: Hello, this is Duy recording for SEO Office Hours. Latha asked, "How bad is the helpful content update for sites accepting guest posts for money?"

Our systems can identify sites with low-value, low-quality content, or content created just for search engines. Sites accepting guest posts for money without carefully vetting the content and links risk ranking lower in Search results, not just because of the helpful content update, but also because of the other systems that we already have in place.

What to do when Google does not detect the canonical tag correctly?

John: "What to do when Google does not detect the canonical tag correctly?"

So taking a step back, canonicalization is based on more than only the link rel="canonical" element. When Google discovers URLs that are significantly alike, our systems try to pick one URL that best represents the content. For that we take into account not only the link rel="canonical", but also redirects, sitemaps, internal links, external links, and more. If you have strong feelings about which URL should be used, then make sure that all of your signals align. Keep in mind that canonicalization is mostly about which URL is shown. It's not something that affects the ranking of the content.
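
As a sketch of aligning those signals, suppose you prefer the parameter-free URL (the URLs here are placeholders). You'd add a link rel="canonical" element to the head of the duplicate pages:

    <!-- On https://example.com/shoes?sessionid=123 -->
    <link rel="canonical" href="https://example.com/shoes">

You'd then also list only https://example.com/shoes in your sitemap and use that URL in internal links, so all the signals point the same way.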

Is Google more likely to crawl and index the pages if the content is short?

Gary: "If a site has a directory covering on each topic, which isn't searched for very often, is Google more likely to crawl and index the pages if the content is short, and so it's cheaper to store it in the index."

This is an interesting question. The length of the content doesn't influence how often we crawl a URL or whether we index it, and it also doesn't contribute to the crawl rate of a URL pattern. Niche content can also be indexed; it's not in any way penalized. But generally, content that's popular on the internet, for example because many people link to it, gets crawled and indexed more easily.

Could dynamic sorting of listings be a reason for not indexing product images?

Alan: Paul asked, "Could dynamic sorting of listings be the reason for not indexing product images?"

It is unlikely that dynamic sorting of listings would be the reason for product images not being indexed. Your product images should be referenced from your product description pages, so we know the product the image is for. If needed, you can create a sitemap file or provide a Google Merchant Center feed so Google can find all your product pages without depending upon your listings page.
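
As a sketch of the sitemap option, Google's documented image sitemap extension lets you associate product images with their pages directly; the URLs below are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>https://example.com/products/blue-widget</loc>
        <image:image>
          <image:loc>https://example.com/images/blue-widget.jpg</image:loc>
        </image:image>
      </url>
    </urlset>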

Is there a timeframe for the site migration?

Lizzi: Sergey is asking, "Is there a timeframe for the site migration? We're migrating a large site to a new domain at the moment. After four months, there are still no signs of the new domain getting the SERP positions of the old one. So what should we do?"

With a big change like this, it's totally normal to see ranking fluctuations, especially while you're still in the middle of the migration. There's no set timeframe for when things will settle down. Also keep in mind that moving only one section of your site is not necessarily indicative of a whole-site move when it comes to Search, and if you're still moving things around, you're going to continue to see fluctuations. It's hard to say what you should do next without seeing the site itself, and for site-specific help we definitely recommend posting in the forums, so people can see your specific situation and give more specific advice.

Could the use of HTTP/3 improve SEO because it improves performance?

John: Flavio asks, "Could the use of HTTP/3, even indirectly, improve SEO, perhaps because it improves performance?"

Google doesn't use HTTP/3 as a factor in ranking at the moment. As far as I know, we don't use it in crawling either. In terms of performance, I suspect the gains users see from HTTP/3 would not be enough to significantly affect the Core Web Vitals, which are the metrics we use in the page experience ranking factor. While making a faster server is always a good idea, I doubt you'd see a direct connection with SEO from using HTTP/3 alone, similar to how you'd be hard-pressed to find a direct connection from using a faster kind of RAM in your servers.
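
If you do want HTTP/3 for its own sake, here's a minimal sketch of what enabling it might look like, assuming nginx 1.25+ built with QUIC support; the certificate paths are placeholders:

    server {
        # QUIC (HTTP/3) and TCP (HTTP/1.1, HTTP/2) on the same port
        listen 443 quic reuseport;
        listen 443 ssl;
        http2 on;

        ssl_certificate     /etc/ssl/example.com.crt;
        ssl_certificate_key /etc/ssl/example.com.key;

        # Tell browsers that HTTP/3 is available on port 443
        add_header Alt-Svc 'h3=":443"; ma=86400';
    }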

Why does Google keep using backlinks as a ranking factor?

Duy: Andrea asked, "Why does Google keep using backlinks as a ranking factor if link building campaigns are not allowed? Why can't Google find other ranking factors, ones that can't be easily manipulated like backlinks?"

There are several things to unpack here. First, backlinks as a signal have a much less significant impact compared to when Google Search first started out many years ago. We have robust ranking signals, hundreds of them, to make sure that we are able to rank the most relevant and useful results for all queries. Second, for link building campaigns, which are essentially link spam according to our spam policy, we have many algorithms capable of detecting unnatural links at scale and nullifying them. This means that spammers or SEOs spending money on links truly have no way of knowing whether the money they spent on link building is actually worth it, since it's quite likely they're simply wasting money building spammy links that our systems nullified as soon as we saw them.

Does it matter if the vast majority of anchors for internal links are the same?

John: Sam asks, "Does it matter if the vast majority of anchors for internal content links are the same?"

Well, this is fine. It's normal, even. When it comes to menus, they're usually everywhere, and even products, when they're linked within an e-commerce site, are usually linked with the same link text all the time. That's perfectly fine, and there's nothing you really need to do there in terms of SEO.

If you add the website schema, do you add software application schema too?

Lizzi: Anonymous is asking, "If you add the latest website schema to your home page, do you still add software application or organization schema too? Google updated its schema markup documentation by adding website schema for brands, but it doesn't mention what happens with organization or software application schema."

Well, it really depends, sorry to give that answer. These are different features. If your site is about a software application, then sure, you can also add SoftwareApplication structured data. Just make sure that you nest everything so that there's one WebSite node on the home page, and not multiple WebSite nodes. That's really the key with this.
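
As a sketch of keeping a single WebSite node, one option is to group the pieces in one JSON-LD @graph on the home page; the names and URL here are placeholders, and SoftwareApplication has its own required properties documented separately:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@graph": [
        {
          "@type": "WebSite",
          "name": "Example Brand",
          "url": "https://example.com/"
        },
        {
          "@type": "SoftwareApplication",
          "name": "Example App",
          "applicationCategory": "DeveloperApplication",
          "operatingSystem": "Windows"
        }
      ]
    }
    </script>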

Is an excessive number of noindex pages an issue for discovery or indexing?

Gary: Chris is asking, "How much of an issue for Google is excessive number of noindex pages, and whether it will affect discovery and indexing of content if the volume is too high?"

Good question. noindex is a very powerful tool that search engines support to help you, the site owner, keep content out of their indexes. For this reason, it doesn't carry any unintended effects when it comes to crawling and indexing. For example, having many pages with noindex will not influence how Google crawls and indexes your site.
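
For reference, noindex can be set either in the page's HTML or in an HTTP response header; both forms below follow the documented syntax:

    <!-- In the page's <head> -->
    <meta name="robots" content="noindex">

    # Or, for non-HTML resources, as an HTTP response header:
    X-Robots-Tag: noindex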

Is it a problem if the URL and the page don't use the same language?

Alan: Yasser asked, "If the URL doesn't contain characters of the same language used in the page, would that affect the site's ranking?"

From an SEO point of view, there's no negative effect if the URL is in a different language than the page content. Users, on the other hand, may care, especially if they share the URL with other people.

How should creators respond to sites that scrape and spam?

Duy: Kristen asked, "How should content creators respond to sites that use AI to plagiarize their content, modify it, and then outrank them in search results?"

Scraping content, even with some modification, is against our spam policy. We have many algorithms to go after such behaviors and demote sites scraping content from other sites. If you come across sites that repeatedly scrape content and still perform well in Search, please feel free to report them to us with our spam report form, so that we can further improve our systems, both in detecting the spam and in ranking overall.

Is it true that Google rotates indexed pages?

Lizzi: Ridwan is asking, "Is it true that Google rotates indexed pages? Because pages on the site I'm working on rotate in and out of the index. For example, page A is indexed Monday through Thursday, but not indexed Friday through Sunday."

Okay, real quick: no, this is not true. We are not rotating the index based on days of the week.

Should we keep an eye on the ratio between indexed and non-indexed pages?

John: Anton asks, "Should we keep an eye on the ratio between indexed and non-indexed pages in Search Console in order to better recognize possibly wasted crawl budget on non-indexed pages?"

No, there is no magic ratio to watch out for. Also, for a site that's not gigantic, say with fewer than a million pages, you really don't need to worry about the crawl budget of your website. It's fine to remove unnecessary internal links, but for small to medium-sized sites, that's more of a site-hygiene topic than an SEO one.

How do we enable the Discover feature?

Alan: Joydev asked, "How do we enable the Discover feature?"

You do not need to take any action to make your content eligible for Discover; we do that automatically. Google, however, uses different criteria to decide whether to show your content in Discover versus search results, so getting traffic from Search is not a guarantee that you'll get traffic from Discover.

Does having many noindex pages linked from spammy sites affect crawl budget?

Gary: Sam is asking another noindex-related question: "A lot of SEOs are complaining about having millions of URLs flagged as 'Excluded by noindex' in Google Search Console, all due to nonsense internal search pages linked from spammy sites. Is this a problem for crawl budget?"

noindex is there to help you keep things out of the index, and as we said previously, it doesn't come with unintended negative effects. If you want to ensure that those pages, or more specifically their URLs, don't end up in Google's index, continue using noindex and don't worry about crawl budget.

Is it thin content if I break down a long article into pieces?

Lizzi: Lalindra is asking: "Would it be considered thin content if an article covering a lengthy topic was broken down into smaller articles and interlinked?"

Well, it's hard to know without looking at that content, but word count alone is not indicative of thin content. These are two perfectly legitimate approaches: it can be good to have a thorough article that deeply explores a topic, and it can be just as good to break it up into smaller, easier-to-understand pieces. It really depends on the topic and the content on each page, and you know your audience best. So I would focus on what's most helpful to your users, and on making sure you're providing sufficient value on each page, whatever the topic might be.

Is it true that having lots of 404 pages can stop crawling and processing?

Gary: Michelle is asking: "HTTPS reports help center documentation says lots of