Unsolved Google Search Console Still Reporting Errors After Fixes
-
Hello,
I'm working on a website that had become bloated with content. We deleted many pages and set up redirects to newer pages, and we also resolved a large number of 400 errors on the site.
I also removed several ancient sitemaps that listed content deleted years ago, which Google was still crawling.
According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails.
What could be going on here? How can we resolve these errors in GSC?
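One way to sanity-check the fixes independently of GSC, before requesting validation again, is to script a status check over the URLs that were removed or redirected. Below is a minimal sketch under that assumption; the example.com addresses are placeholders for your own list of old URLs, and it simply reports the first-hop status code and redirect target for each.

```python
# Hypothetical sketch: verify that removed/redirected URLs now return the
# status codes you expect (301/308 to a new page, or 404/410 if gone for good)
# before clicking "Validate fix" in Search Console again.
import requests

OLD_URLS = [
    "https://example.com/deleted-page-1",  # placeholder URLs
    "https://example.com/deleted-page-2",
]

for url in OLD_URLS:
    # Don't follow redirects, so we see the first hop's own status code.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    print(f"{resp.status_code}  {url}  ->  {target}")
```

If every old URL already answers with the intended 301 or 410, the remaining delay is almost certainly on Google's recrawl schedule rather than on the site itself.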
-
Hi! Google Search Console often lags like this, so I wouldn't pay much attention to it. If you know that everything is correct on the website, then you don't need to worry just because Search Console is still reporting issues.
-
In this case, it's likely that Googlebot crawled your site before you fixed the errors and hasn't yet recrawled to detect the changes. While you wait, it helps to use crawling tools such as Ahrefs or Screaming Frog to audit your website both before and after you make changes. Once you have them in place, take screenshots of the findings before and after fixing the issues and send those to your client so they can see the improvements that have been made.
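If both audits export to CSV, the before/after comparison can also be captured as a simple diff rather than screenshots alone. A minimal sketch is below; the file names and the "Address"/"Status Code" column headers are assumptions based on a typical crawler export, so adjust them to whatever your tool actually produces.

```python
# Hypothetical sketch: compare a "before" and "after" crawl export and list
# URLs whose status code changed, e.g. 400 -> 301 after the cleanup.
import csv

def load_status(path):
    # Assumed columns: "Address" (URL) and "Status Code"; adjust to your export.
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Address"]: row["Status Code"] for row in csv.DictReader(f)}

before = load_status("crawl_before.csv")  # placeholder file names
after = load_status("crawl_after.csv")

for url, old_code in sorted(before.items()):
    new_code = after.get(url, "not crawled")
    if new_code != old_code:
        print(f"{url}: {old_code} -> {new_code}")
```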
To give you an example, I recently encountered a similar issue while working with a medical billing company named HMS USA LLC. After running some SEO audits and making various fixes, the GSC errors cleared, but it took a few attempts to get it right, as the changes weren't detected on the first recrawl.
Hopefully, this information is useful and helps you understand why your GSC issues may still be showing up after being fixed. Good luck!
-
Hi,
We have had a similar problem before. We are an e-commerce company with the brand name VANCARO. As you know, user experience is very important for an e-commerce company, so we take the problems reported by GSC very seriously. But sometimes GSC updates can be delayed, so you may need to wait and observe a little longer. I can also share another tool with you: https://pagespeed.web.dev/. Hope it can help you.
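If it helps, those PageSpeed numbers can also be pulled programmatically through the public PageSpeed Insights API (v5) instead of the web UI. A minimal sketch, assuming the standard v5 runPagespeed endpoint and the usual Lighthouse response layout; for anything beyond occasional use you would add an API key ("key") parameter.

```python
# Minimal sketch: query the PageSpeed Insights API (v5) for a URL's
# Lighthouse performance score.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

params = {"url": "https://example.com/", "strategy": "mobile"}  # placeholder URL
data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()

# Lighthouse reports category scores on a 0-1 scale.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```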
Related Questions
-
Duplicate content homepage - Google canonical 'N/A'?
Hi, I redesigned a client's website and launched it two weeks ago. Since then, I have 301 redirected all old URLs in Google's search results to their counterparts on the new site. However, none of the new pages are appearing in the search results, and even the homepage has disappeared. Only old site links are appearing (even though the old website has been taken down), and GSC states: Page is not indexed: Duplicate, Google chose different canonical than user. However, when I try to understand how to fix the issue and see which URL it is claiming to be a duplicate of, it says: Google-selected canonical: N/A. It says that the last crawl was only yesterday - how can I possibly fix it without knowing which page it says it's a duplicate of? Is this something that just takes time, or is it permanent? I would understand if it was just Google taking time to crawl and index the pages, but it seems adamant that it's not going to show any of them at all. (One thing worth ruling out is a redirect chain on the old URLs; see the sketch after this question.)
Technical SEO | goliath910 -
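For the question above, here is a minimal sketch of the redirect-chain check mentioned there (the URL is a placeholder): it follows one old URL and prints every hop, so you can confirm each legacy URL reaches its new counterpart in a single 301.

```python
# Hypothetical sketch: follow an old URL's redirect chain and print each hop,
# to confirm it reaches the new page in one 301 (no chains or loops).
import requests

old_url = "https://example.com/old-page"  # placeholder

resp = requests.get(old_url, allow_redirects=True, timeout=10)
for hop in resp.history:
    print(f"{hop.status_code}  {hop.url}")
print(f"{resp.status_code}  {resp.url}  (final)")
```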
Backlinks on Moz not on Google Search Console
Moz is showing thousands of backlinks to my site that are not showing up in Google Search Console - which is good, because those links were created by some spammer in Pakistan somewhere. I haven't yet submitted a disavow report to Google for well over 10K links because the list keeps growing every day with new backlinks that have been rerouted to a 404 page. I have asked Google to clarify, and they put my question on their forum for an answer, which I'm still waiting for - so I thought I'd try my luck here. My question: if Moz does not match Google Search Console, and backlinks are important to results, how valid is the ranking that Moz creates to tell me how I'm doing against the competition and whether I'm improving? If the goal is to get Google to pay attention, and I use Moz to help me figure out how to do that, how can I do it if the backlink information differs by literally over 10,000 backlinks created by some spammer doing odd things? They've included the URL from their deleted profile on my site along with hundreds of other URLs, including Moz.com, and are posting them everywhere with their preferred anchor text. Moz's ranking counts the thousands of spam backlinks I can't get rid of, while Google ignores or disavows them. So aren't the rankings, data, and graphs apples and oranges? How can I know what my site's real strength is, and whether I'm improving, if the data doesn't match? (A quick way to quantify the gap between the two exports is sketched after this question.) Complete SEO Novice, Shannon Peel
Link Building | MarketAPeel
Brand Storyteller, MarketAPeel -
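As referenced in the question above, here is a minimal sketch for quantifying the gap between the two link datasets. It assumes each tool's linking URLs have been exported to a plain text file with one URL per line; the file names are placeholders.

```python
# Hypothetical sketch: compare two backlink exports (one linking URL per line)
# and report which links appear in only one dataset.
def load_links(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}

moz_links = load_links("moz_backlinks.txt")  # placeholder file names
gsc_links = load_links("gsc_links.txt")

print(f"Only in Moz: {len(moz_links - gsc_links)}")
print(f"Only in GSC: {len(gsc_links - moz_links)}")
print(f"In both:     {len(moz_links & gsc_links)}")
```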
Google Search Console - Excluded Pages and Multiple Properties
I have used Moz to identify keywords that are ideal for my website and then optimized different pages for those keywords, but unfortunately rankings for some of the pages have declined. Since I am working with an ecommerce site, I read that having a lot of Excluded pages in Google Search Console was to be expected, so I initially ignored them. However, some of the pages I was trying to optimize are listed there, especially under the 'Crawled - currently not indexed' and 'Discovered - currently not indexed' sections. I have read this page (link: https://moz.com/blog/crawled-currently-not-indexed-coverage-status) and plan on focusing on Steps 5 & 7, but wanted to ask if anyone else has had experience with these issues. Also, does anyone know if having multiple properties (https vs http, www vs no www) can negatively affect a site? For example, could a sitemap from one property overwrite another? Would removing one property from the Console have any negative impact on the site? (A quick check of how the host variants redirect is sketched after this question.) I plan on asking these questions on a Google forum, but I wanted to add them to this post in case anyone here had any insights. Thank you very much for your time,
Forest
SEO Tactics | ForestGT -
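On the multiple-properties point, the sketch referenced in the question above simply requests all four host variants of the homepage and reports where each one ends up, so you can confirm they consolidate to a single canonical origin. The example.com domain is a placeholder.

```python
# Hypothetical sketch: confirm that http/https and www/non-www variants all
# redirect to one canonical origin (so only one property really serves content).
import requests

VARIANTS = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/",
]

for url in VARIANTS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    print(f"{url} -> {resp.url} ({resp.status_code})")
```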
Unsolved Almost every new page becomes Discovered - currently not indexed
Almost every new page that I create becomes Discovered - currently not indexed. It started a couple of months ago; before that, all pages were indexed within a couple of weeks. Now there are pages that have not been indexed since the beginning of September. From a technical point of view, the pages are fine and acceptable to Googlebot. The pages are in the sitemap and have content - basically texts of 1,000+ or 2,000+ words. I've tried adding new content to pages and even transferring content to a new page with a different URL, but that way I managed to index only a couple of pages. Could it be that until September of this year, I hadn't added new content to the site for several months? Has anyone encountered a similar problem? Please help, I am already losing heart.
Product Support | roadlexx -
Canonicalization, does it still index
If I have two pages that are identical but on different domains that our team manages, and we place a rel=canonical tag on the page we prefer/should display, will the page that doesn't have the canonical tag still be indexed and show in SERPs? (A quick way to check what each page actually declares is sketched after this question.)
Technical SEO | kroe10 -
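As referenced in the question above, here is a minimal sketch that fetches each of the two pages (placeholder URLs) and prints the canonical URL it declares, using Python's standard HTML parser. It only verifies what the pages say; whether Google honors the hint is still up to its own canonical selection.

```python
# Hypothetical sketch: fetch a page and print the canonical URL it declares,
# so you can verify which of the two duplicate pages points where.
from html.parser import HTMLParser
import requests

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")

for url in ["https://site-a.example/page", "https://site-b.example/page"]:  # placeholders
    parser = CanonicalFinder()
    parser.feed(requests.get(url, timeout=10).text)
    print(f"{url} -> canonical: {parser.canonical}")
```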
Strange Crawl Report
Hey Moz Squad, I have kind of a strange case. My website locksmithplusinc.com has been around for a couple of years. I have had all sorts of pages and blogs that may have ranked for a certain location a long time ago and were deleted so I could speed up the site and consolidate my efforts. I mention that because I think it might be part of the problem. When I crawled my site on Moz just three weeks ago, I had over 23 crawl report issues: duplicate pages, missing meta tags, the regular stuff. But now, all of a sudden, a Moz crawl report comes up with zero issues. So I ran another crawl report in Google Analytics and this is what came up. I'm very confused, because none of these URLs are even URLs on my site. Maybe people are searching for this stuff, clicking on broken links that are still indexed, and getting a 404 error? What do you guys think? Thank you guys so much for taking a shot at this one.
Technical SEO | Meier0
132 pages reported as having Duplicate Page Content but I'm not sure where to go to fix the problems?
I am seeing “Duplicate Page Content” coming up in our reports on SEOMOZ.org. Here’s an example:
http://www.ccisolutions.com/StoreFront/product/williams-sound-ppa-r35-e
http://www.ccisolutions.com/StoreFront/product/aphex-230-master-voice-channel-processor
http://www.ccisolutions.com/StoreFront/product/AT-AE4100.prod
These three pages are for completely unrelated products. They are returning “200” status codes but are being identified as having duplicate page content. It appears they are all serving the home page, but it’s an odd version of the home page because there’s no title. I would understand if these pages 301-redirected to the home page because they were obsolete products, but it’s not a 301 redirect. The referring page is listed as: http://www.ccisolutions.com/StoreFront/category/cd-duplicators. None of the 3 links in question appear anywhere on that page. It’s puzzling. We have 132 of these. Can anyone help me figure out why this is happening and how best to fix it? (A quick way to compare what these URLs actually serve is sketched after this question.) Thanks!
Technical SEO | danatanseo -
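As referenced in the question above, here is a minimal sketch that fetches the three flagged URLs (quoted from the question) and prints each one's status code, a short hash of the response body, and its title, which makes it obvious whether they are all serving the same title-less homepage template.

```python
# Hypothetical sketch: fetch the flagged URLs and compare their <title> and a
# hash of the response body to see whether they really serve identical content.
import hashlib
import re
import requests

URLS = [
    "http://www.ccisolutions.com/StoreFront/product/williams-sound-ppa-r35-e",
    "http://www.ccisolutions.com/StoreFront/product/aphex-230-master-voice-channel-processor",
    "http://www.ccisolutions.com/StoreFront/product/AT-AE4100.prod",
]

for url in URLS:
    resp = requests.get(url, timeout=15)
    match = re.search(r"<title>(.*?)</title>", resp.text, re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else "(no title)"
    digest = hashlib.md5(resp.text.encode("utf-8")).hexdigest()[:10]
    print(f"{resp.status_code}  {digest}  {title!r}  {url}")
```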
Why is Google not picking up my META description? Google itself populates the description. How can I control these search snippets?
Technical SEO | greyniumseo0