
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hey all! Someone I work with recently redirected one of their site pages via Squarespace. They used Squarespace-provided code to make a 301 redirect. Following this, the primary keywords for the page that was redirected to have dropped pretty significantly. Any Squarespace pros out there who can help me figure out what's going on?

    | kelseyworsham
    0

  • We have thousands of Missing Description issues, but most of them are account/login pages, e.g. /customer/account/ etc. We tried to de-index them through the Configuration using the instructions here - https://docs.magento.com/user-guide/marketing/search-engine-robots.html But they're still appearing as issues in the Site Crawl. Even without the Site Crawl issue, we don't really want these to appear in the SERPs. Does anybody know how to properly de-index these login pages in Magento? Thank you!

    | LASClients
    0
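
    For reference, a minimal sketch of the robots.txt rules this usually involves (path taken from the question; adjust to your store):

      # Block crawling of Magento customer account paths
      User-agent: *
      Disallow: /customer/account/

    Note that robots.txt only blocks crawling. Pages that are already indexed generally need to serve a noindex robots meta tag, and must remain crawlable long enough for Google to see it, before they drop out of the index.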

  • Hey experts, my website's page speed is not improving. I used the WP Rocket plugin, but I am still seeing the errors Reduce unused CSS, Properly size images, and Avoid serving legacy JavaScript to modern browsers. I have used many plugins for speed optimization but am still facing these errors. I optimized the images manually in Photoshop, but I am still facing image size issues. Since the Google Core Web Vitals update, my keyword positions have dropped due to slow speed. Please guide me on how to increase the page speed of my website: https://karmanwalayfabrics.pk Thanks

    | frazashfaq11
    0

  • Regarding backlinks, I'm wondering which is more advantageous for domain authority and Google reputation: Option 1: More backlinks including a lot of spammy links Option 2: Fewer backlinks but only reliable, non-spam links I've researched this topic around the web a bit and understand that the answer is somewhere in the middle, but given my site's specific backlink volume, the answer might lean one way or the other. For context, my site has a spam score of 2%, and when I did a quick backlink audit, roughly 20% are ones I want to disavow. However, I don't want to eliminate so many backlinks that my DA goes down. As always, we are working to build quality backlinks, but I'm interested in whether eliminating 20% of backlinks will hurt my DA. Thank you!

    | LianaLewis
    1

  • We're seeing a couple of temporary redirects. One for http pointing to https. Another for /checkout pointing to /checkout/cart. We don't have an internal dev, so we're not sure how to remove these. Would anyone know? I've set up the 301s, but they're not overriding, and I'm still seeing the issues in the crawl. Thanks in advance for your help!

    | LASClients
    0
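
    For reference, on an Apache server the permanent HTTP-to-HTTPS redirect is often handled in .htaccess with something like the sketch below (an assumption; Magento stores on nginx or behind a load balancer configure this elsewhere):

      RewriteEngine On
      RewriteCond %{HTTPS} off
      RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]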

  • magento 302

    Hi all, I've been assigned a site in Magento. After the first crawl, we found almost 15k 302 redirects. A sample URL ends with this /stores/store/switch/?SID=qdq9mf1u6afgodo1vtvk0ucdpb&___from_store=default&___store=german&uenc=aHR0cHM6Ly9qdWljeWZsdXRlcy5jb20vP19fX3N0b3JlPWdlcm1hbg%2C%2C And they are currently 302 redirecting to the homepage as well as other main pages and, it seems, product pages. Some of these point to account pages where customers log in. It's probably best for me to de-index those, so no issues there. But I'm worried about the 302 redirects to public pages. The extension we have installed is SEO Suite Ultimate by MageWorx. Does anyone have experience with this specifically, and how did you fix it? Thanks, JC

    | LASClients
    0

  • This one has sort of been asked already, but I cannot find an answer. When we evaluate a new SEO client, previously with Majestic we would review the root domain vs. the subdomain (www) to see which had the higher Trust Flow and Citation Flow, and if there was a major difference, adjust the Google-indexed domain to the higher-performing one. Is there a way to do this with Moz? Domain Authority and Subdomain Authority are always returning the same DA for me. Thanks in advance.

    | practiceedge1
    0

  • Hi. We recently created a Christmas category page on our eCommerce website (christowhome.co.uk). Earlier today, I Googled ‘Christow Christmas Silhouette Lights’ (Christow being the name of our website and Christmas silhouette lights being one of the sub-categories we recently created). I was curious to see how the page appeared in search. Bizarrely, the page appeared multiple times in the results (if you click on the link above, it should show you the search results). As you can see, multiple meta titles and descriptions have been created for the same page. This is something that is affecting a number of our Christmas category pages, and I don't quite understand why it has happened. We recently added filters to the category. Could the filters be responsible? Any idea how I can prevent this from happening, and how I can stop Google indexing these weird replica pages? Many thanks, Dave

    | Davden
    0

  • crawl errors 4xx error

    I have a client who sells highly technical products and has lots and lots (a couple of hundred) of PDF datasheets that can be downloaded from their website. But in order to download a datasheet, a user has to register on the site. Once they are registered, they can download whatever they want (I know this isn't a good idea, but it wasn't set up by us and is historical). On doing a Moz crawl of the site, it came up with a couple of hundred 401 errors. When I investigated, they are all pages where there is a button to click through to get one of these downloads. The Moz error report calls the error "Bot verification". My questions are:
    Are these really errors?
    If so, what can I do to fix them?
    If not, can I just tell Moz to ignore them or will this cause bigger problems?

    | mfrgolfgti
    0

  • Hi, I'm trying to mark up products for a site that does not show prices. Is there any way to mark up a product price when the business model is: 1. The customer calls or contacts the shop. 2. The shop gives a price quote based on the level of detail and finish of the product. 3. There is no base or top price. Thanks in advance!

    | plahpoy
    0
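
    For reference, schema.org does not require an offers property on Product, so a price can simply be omitted. A minimal JSON-LD sketch (product name and brand are hypothetical):

      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Custom Oak Table",
        "description": "Made to order; pricing on request.",
        "brand": { "@type": "Brand", "name": "Example Shop" }
      }
      </script>

    Be aware that Google's product rich results generally expect at least one of offers, review, or aggregateRating, so markup without a price may validate yet not produce a rich snippet.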

  • I have two websites that are hosted in the same CMS. Rather than having two separate robots.txt files (one for each domain), my web agency has created one which lists the sitemaps for both websites, like this:
    User-agent: *
    Disallow:
    Sitemap: https://www.siteA.org/sitemap
    Sitemap: https://www.siteB.com/sitemap
    Is this ok? I thought you needed one robots.txt per website which provides the URL for the sitemap. Will having both sitemap URLs listed in one robots.txt confuse the search engines?

    | ciehmoz
    0
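
    For reference, the conventional setup is one robots.txt per host, each pointing to its own sitemap, along the lines of this sketch (URLs taken from the question):

      # Served at https://www.siteA.org/robots.txt
      User-agent: *
      Disallow:
      Sitemap: https://www.siteA.org/sitemap

    Search engines generally tolerate Sitemap lines that point at another host, but the per-host version is the safer, more common arrangement.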

  • Hey guys, can anyone help me find broken outbound links on my website using Moz? Does Moz have this function?

    | rogerdam
    0

  • I have two sites that are hosted in the same CMS. Rather than having two separate robots.txt files (one for each domain), my web agency has made one which lists the sitemaps for both sites, similar to this:

    | eulabrant
    0

  • How do I fix a 404 redirect chain? I can't seem to find the answer, and I'm worried about it affecting my SEO. Any help would be great!

    | sammecooper
    0

  • I have tons of links that I added redirects to after creating my company's new website. Is it bad to have all these 301s? How do I permanently redirect those links? Also, Google Search Console is telling me I have 1,000+ excluded links. Is this bad? Will it negatively affect me? Is this something to do with my sitemap? Any help would be greatly appreciated 🙂

    | sammecooper
    0
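
    For reference, having many 301s is not inherently harmful; a 301 is the standard way to make a redirect permanent. On Apache, a permanent redirect for a single URL looks like this sketch (paths hypothetical):

      Redirect 301 /old-page https://www.example.com/new-page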

  • Our website for my olansi company in London, China has hundreds of pages dedicated to every service we provide to local areas in China. The total number of pages is approximately 100. Google caters pretty well to long-tail searches when it indexes all these pages, so we usually get a fair amount of traffic when this happens. However, Google occasionally drops most of our indexed pages from search results for a few days or weeks at a time - for example, Google is currently indexing 60 pages, while last week it was back at 100. Can you tell me why this happens? When these pages don't display, we lose a lot of organic traffic. What are we doing wrong? Site URL: https://www.olanside.com

    | sesahoda
    0

  • Curious if anyone else is having this problem. I have, for example, a page that is listed in Search Console as having a CLS of .44 - it is listed as a "CLS issue." The same page rendered in LightHouse shows 0 for field data CLS and 0.02 for lab data (both in the "green"). It has been over a month since I made updates to the page to improve CLS. I tried to submit a validation in Search Console, but "validation failed." I'm not sure what else to fix on the page when LightHouse data shows it as in the green! I have the same issue with other pages as well.

    | LivDetrick
    0

  • seo 4xx error error fix

    Hello! I have a new blog that is only 1 month old, and I already have over 3000 4xx errors, which I've never had on my previous blogs. I ran a crawl on my site, and it's showing my social media links as being indexed as pages. For example, my blog post link is:
    https://www.thebloggersincentive.com/blogging/get-past-a-creative-block-in-blogging/
    My site is then creating a link like the one below:
    https://www.thebloggersincentive.com/blogging/get-past-a-creative-block-in-blogging/twitter.com/aliciajthomps0n
    But these are not real pages, and I have no idea how they got created. I then paid someone to index the links because I was advised to by Moz, but it's still not working. All the errors are the same: it's indexing my Twitter account and my Pinterest. Can someone please help? I'm really at a loss with it.

    | thebloggersi
    0
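
    A common cause of URLs like this (an assumption, not a diagnosis) is a link written without its scheme, which browsers and crawlers resolve relative to the current page:

      <!-- Resolves to .../get-past-a-creative-block-in-blogging/twitter.com/... -->
      <a href="twitter.com/aliciajthomps0n">Twitter</a>

      <!-- Absolute URL, resolves correctly -->
      <a href="https://twitter.com/aliciajthomps0n">Twitter</a>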

  • serp favicon

    Hi, I have a website where the favicon is not showing in the Google mobile SERPs. The default icon (a world icon) appears instead. This is the tag I have placed in the head section of the website: <link rel="shortcut icon" href="/favicon.ico" /> The size of the favicon is 48x48, and it appears correctly in the browser tab. I've checked that the Google robot can crawl it, and in the server logs I can see requests from the "Google Favicon" user-agent. Has anyone had this same problem? Any advice?

    | dMaLasp
    0
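
    For reference, Google's favicon documentation accepts several rel values; one commonly suggested variant of the tag (same file, assuming it really is 48x48) is:

      <link rel="icon" href="/favicon.ico" sizes="48x48">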

  • We are implementing a new design, removing some pages, and adding new content. The task is to correctly map and redirect old pages that no longer exist.

    | KnutDSvendsen
    0

  • MOZ Community, I am trying to gauge both the potential upside and downside of buying a few (relatively long) URLs that encompass some new keywords that are surfacing in our industry and creating permanent redirects to our branded website. [This wasn't my idea!] These URLs haven't previously had any content or owners, so their domain authority is low. Will Google still ding us for this behavior? I hope not, but I worry that there might be some penalty for having a bunch of redirects pointing at our site. I have read that Google will penalize you for buying content-rich sites with high DA and redirecting those URLs to your site, but I am unclear about this other approach. It seems like a fairly mundane (and fruitless) play. I tried to explain that we won't reap any SEO rewards for owning these URLs (if there is no content), but that wasn't really heard. Thanks for any resources or information you can share!

    | ColleenHeadLight
    0

  • As in the title, we have a site with around 40k pages, but around a third of them are showing as "Indexed, not submitted in sitemap" in Google Search Console. We've double-checked the sitemaps we have submitted and the URLs are definitely in the sitemap. Any idea why this might be happening? Example URL with the error: https://www.teacherstoyourhome.co.uk/german-tutor/Egham Sitemap it is located on: https://www.teacherstoyourhome.co.uk/sitemap-subject-locations-surrey.xml

    | TTYH
    0

  • I added the script for star snippets on my website, but it doesn't work on my posts. You can see it at this URL: https://dlandroid.com/lucky-patcher/ When I search in Google for my custom keyword "Lucky patcher apk", my competitor shows with star snippets in the SERP, but my site doesn't show snippet stars.

    | hongloanj
    1
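
    For reference, star snippets are driven by AggregateRating structured data on the page itself. A minimal JSON-LD sketch (the type and all values here are hypothetical; for software, Google also expects fields like operatingSystem and applicationCategory):

      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "SoftwareApplication",
        "name": "Lucky Patcher",
        "operatingSystem": "Android",
        "applicationCategory": "UtilitiesApplication",
        "aggregateRating": {
          "@type": "AggregateRating",
          "ratingValue": "4.5",
          "ratingCount": "120"
        }
      }
      </script>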

  • I used HTTP status code 410 for some low-quality pages on my site to redirect to the homepage. Is this useful to improve my homepage authority?
    My website is: Nitamoshaver.com

    | ghorbanimahan
    0
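
    For reference, a URL returns either a 410 or a redirect, not both, so it is worth confirming which one is actually being served. On Apache, the two options look like this sketch (paths hypothetical):

      # Page is gone, no replacement
      Redirect gone /low-quality-page

      # Page has a close replacement
      Redirect 301 /low-quality-page https://nitamoshaver.com/relevant-page/

    A 410 passes no authority anywhere, and Google tends to treat blanket redirects of thin pages to the homepage as soft 404s, so neither approach is likely to boost homepage authority by itself.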

  • For our product page, we want to be able to show the pricing in the local currency of the visitor. I discussed this with our web developer and he said that we can create country-specific pages, so one for UK, Australia, etc. I am afraid that this solution might hurt our SEO as Google might see this as duplicated content. What are your thoughts about this? The website runs on WordPress.

    | Maggie.Casas
    0

  • OK, been trying to piece together what is best practice for someone I'm working with, so here goes; Website was redesigned, changed urls from url a to url b. 301's put in place. However, the new url structure is not optimal. It's an e-commerce store, and all products are put in the root folder now: www.website.com/product-name A better, more organized url structure would be: www.website.com/category/product-name I think we can all agree on that. However, I'm torn on whether it's worth changing everything again, and how to handle things in terms of redirects. The way I see things, it would result in a redirect chain, which is not great and would reduce link equity. Keeping the products in the root moving forward with a poor structure doesn't feel great either. What to do? Any thoughts on this would be much appreciated!

    | Tomasvdw
    0
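
    For reference, a second URL change does not have to create a chain if the existing redirects are updated to point straight at the final URLs, along the lines of this sketch (Apache, paths from the question):

      # Instead of /product-name -> old redirect -> /category/product-name,
      # send the original URL to the final destination in one hop:
      Redirect 301 /product-name https://www.website.com/category/product-name

    Each old URL, from both the first and the second structure, would 301 directly to its final destination.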

  • Hey Mozers! Moz Crawl tells me I am having an issue with my WordPress category - it is returning a 5XX error and I'm not sure why. Can anyone help me determine the issue? Crawl Issues and Notices for: http://www.refusedcarfinance.com/news/category/news We found 1 crawler issue(s) for this page. High Priority Issues: 1 5XX (Server Error). 5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.

    | RocketStats
    0

  • I have been trying to filter this traffic out of my Google Analytics data, since it all seems to be related to spam traffic. I have had multiple instances wherein, using this filter:
    Custom Filter - Exclude - Browser Size - ^\(not set\)$
    traffic seems to filter out appropriately - but then the filter ceases working. Looking at a new site with Browser Size = (not set) traffic, the filter preview doesn't appear to work either. Am I implementing the filter incorrectly? How do I filter this traffic out of GA data successfully? If I use the exact same method using RegEx in Google Data Studio, the filter works perfectly.

    | fuelmedical
    1
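
    For reference, the Browser Size value is literally the string "(not set)", so the parentheses must be escaped in the filter pattern:

      ^\(not set\)$

    An unescaped ( ) pair is treated as a regex group and will not match the literal text.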

  • I have a client that has a HUGE website with thousands of product pages. We don't currently have a sitemap.xml because it would take so much processing power to generate one. I have thought about creating a sitemap for just the key pages on the website, but didn't want to hurt the SEO of the thousands of product pages. If you have a sitemap.xml that only includes some of the pages on your site, will it negatively impact the other pages that Google has indexed but that are not listed in the sitemap.xml?

    | jerrico1
    0
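
    For reference, large sites usually solve this with a sitemap index that splits the URL set into several smaller files, along the lines of this sketch (domain and filenames hypothetical):

      <?xml version="1.0" encoding="UTF-8"?>
      <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <sitemap><loc>https://www.example.com/sitemap-key-pages.xml</loc></sitemap>
        <sitemap><loc>https://www.example.com/sitemap-products-1.xml</loc></sitemap>
      </sitemapindex>

    A partial sitemap is generally understood not to penalize unlisted pages; it just gives search engines less help finding them.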

  • Hello, first, sorry for my bad English; it isn't my first language.
    I have a website with 13 years of history and activity. 5 months ago we received a warning message from our domain provider, which would seize our domain because of sanctions (I live in Iran), and they have seized many Iranian domains since then. I therefore decided to quickly change my domain to another address so I could save my website as much as possible before they took my domain...
    I moved my website successfully to a new domain address and did everything necessary for a good domain move (301s for all links, changed the template, and so on). I also used the "Change of Address Tool" provided in Google Search Console so Google knows my new domain address and changes all of my links...
    Unfortunately, 90% of my traffic comes from Google, so we depend heavily on organic traffic.
    Since I changed my domain address my traffic has been declining, and now I have only 30% of the traffic left from Google compared to my old domain 5 months ago. (I also had some SEO troubles recently which could have worsened this decline.)
    Fortunately, my old domain wasn't seized by the domain provider, and I successfully transferred it to another provider recently, so there is no danger for my old domain anymore.
    My question is: should I move my website back to my old domain (cancel the Google "Change of Address Tool" and use it again to move the new domain back to the old one)? My old domain has more than 13 years of history and many backlinks from those 13 years, and until now I cannot get good rankings with new posts on the new domain; sometimes Google doesn't even index my new articles for several days, but my old domain still ranks well (I tested a new article on the old domain to see how it performs; it was not very good, but I think it still ranks better than my new domain).
    My top pages and categories were redirected successfully and still rank well on Google at the new domain address, and haven't been affected negatively. My main problem is new posts that do not rank well, or don't even get indexed for several days!
    I don't know what to do now. Are 5 months not enough for Google to completely transfer all domain scores from my old domain to the new one? Will all the scores of my old domain even transfer to the new domain eventually? What about the many backlinks pointing to the old domain (90% of which I cannot change or ask to be changed to my new address)? Will the backlink scores pointing to the old domain transfer to the new domain? On the other hand, I fear moving my site back to the old domain because I don't know how Google would behave; would all my SEO scores and rankings come back after I move back to the old domain? Also, as far as I know, after 6 months of using the Google "Change of Address Tool" I cannot cancel the domain change anymore, so I have roughly 1 month to decide whether to cancel the move...
    Please, if anyone could help or guide me on what to do, it would be life-saving for me, because my whole income and my family depend on my website... 😞

    | Milad25
    0

  • My client currently has a main website on a URL and an eCommerce site on a subdomain. The eCommerce site is currently not mobile friendly and has images that are too small and problematic - I believe it negates some of the SEO work we do for them. I had to turn off Google Shopping ads because the quality score was so low. That being said, they are rebuilding the shopping cart on a new platform that will be mobile friendly, BUT the images are going to be tiny until they are slowly replaced over several months. Would you keep the shopping cart on a subdomain, or make it part of the main website URL? Could it negatively impact the progress we have made on the main site's SEO?

    | jerrico1
    0

  • We are in a bit of a tricky situation since a key top-level page with lots of external links has been selected as a duplicate by Google. We do not have any canonical tag in place. Now this is fine if Google passes the link juice towards the page they have selected as canonical (an identical top-level page)- does anyone know the answer to this question? Due to various reasons, we can't put a canonical tag ourselves at this moment in time. So my question is, does a Google selected canonical work the same way and pass link juice as a user selected canonical? Thanks!

    | Lewald1
    0
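
    For reference, a user-declared canonical is just a link element in the head, along these lines (URL hypothetical):

      <link rel="canonical" href="https://www.example.com/preferred-page/" />

    Google's documentation describes signals as being consolidated to whichever canonical it settles on, declared or auto-selected, but a declared canonical is only a hint either way.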

  • Hi, I'm hoping someone can provide some insight. I Google-searched "citizenpath" recently and found that all of our sitelinks have identical text. The text seems to come from the site footer. It isn't using the meta descriptions (which we definitely have) or even a Google-dictated snippet from the page. I understand we don't have "control" of this. It's also worth mentioning that if you search for a specific page like "contact us citizenpath" you'll get a more appropriate excerpt. Can you help us understand what is happening? This isn't helpful for Google users or CitizenPath. Did the Google algorithm go awry, or is there a technical error on our site? We use up-to-date versions of WordPress and Yoast SEO. Thanks!

    | 123Russ
    0

  • Hi, the service area pages created on my Shopify website have not been indexed by Google for a long time. I tried indexing the pages manually and also submitted the sitemap, but the pages still don't seem to get indexed.
    Thanks in advance.

    | Bhisshaun
    0

  • Hi there, We upgraded our webshop last weekend, and our Moz crawl on Monday found a lot of errors we are trying to fix. I am having some communication problems with our webmaster, so I need a little help. We have extremely long category page URLs. Does anyone have a guess as to what kind of mistake our webmaster could have made:
    https://site-name.pl/category-name?page=3?resultsPerPage=53?resultsPerPage=53 .... It keeps repeating the string ?resultsPerPage=53, exactly 451 times, as if there were some kind of loop. Thanks in advance for any kind of hint 🙂
    Kind regards,
    Isabelle

    | isabelledylag
    0
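
    One plausible cause (an assumption, not a diagnosis) is pagination code that appends its parameter to the full current URL with ? on every render instead of replacing the existing value, so each pass adds another ?resultsPerPage=53. A correctly formed URL would use a single ? and & separators:

      https://site-name.pl/category-name?page=3&resultsPerPage=53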

  • Absolutely no idea what is going on. All of our category / subcategory and other support pages are indexed and cached as normal, but suddenly none of our product pages are cached, and all of the product / offer schema snippets have been dropped from the serps as well (price, review count, average rating etc). When I inspect a product detail page url in GSC, I am either getting errors or it is returned as a soft 404. There have been no recent changes to our website that are obvious culprits. When I request indexing, it works fine for non-product pages, but generates the "Something went wrong
    If the issue persists, try again in a few hours" message for any product page submitted. We are not SEO novices. This is an Angular 7 site with a Universal version launched back in October (new site, same domain), and until this strange issue cropped up we'd enjoyed steady improvement of rankings and GSC technical issues. Has anyone seen anything like this? We are seeing rapid deterioration in rankings overnight for all product detail pages due to this issue. A term / page combination that ranked for over a decade in the top 10 lost 10 places overnight... There's just no obvious culprit. Using chrome dev tools to view as googlebot, everything is kosher. No weird redirects, no errors, returns 200 and page loads. Thank You

    | jamestown
    0

  • As of June 1, doctor pages on our website that say "No ratings are available yet" are being soft-404ed in our Google Search Console. We suspect the issue is that wording, due to this post: https://www.contentkingapp.com/academy/index-coverage/faq/submitted-soft-404/ Just wondering if anyone with more expertise than me on 404s or local SEO can validate that it is likely this issue. Some examples:
    https://www.nebraskamed.com/doctors/neil-s-kalsi
    https://www.nebraskamed.com/doctors/leslie-a-eiland
    https://www.nebraskamed.com/doctors/david-d-ingvoldstad

    | Patrick_at_Nebraska_Medicine
    0

  • Our company implemented Google Shopping for our site for multiple countries, currencies, and languages. Every combination of language and country is accessible via a URL path, and for all site pages, not just the pages with products for sale. I was not part of the project. We support 18 languages and 14 shop countries. When the project was finished, we had a total of 240 language/country combinations listed in our rel alternate hreflang tags for every page and 240 language/country combinations in our XML sitemap for each page, and canonicals are unique for every one of these pages. My concern is with duplicate content. Also, I can see odd language/country URL combinations (like a country with a language spoken by a very low percentage of people in that country) being crawled, indexed, and appearing in SERPs. This uses up my crawl budget on pages I don't care about. I don't think it is wise to disallow URLs in robots.txt that we are simultaneously listing in the XML sitemap. Is it true that Google Shopping requires an XML sitemap and rel alternate hreflang entries for every language/country combination?

    | awilliams_kingston
    0
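
    For reference, hreflang annotations only need to list the alternates that actually exist, and they can live either in head link tags or in the XML sitemap. A minimal link-tag sketch (URLs hypothetical):

      <link rel="alternate" hreflang="en-GB" href="https://www.example.com/en-gb/" />
      <link rel="alternate" hreflang="de-DE" href="https://www.example.com/de-de/" />
      <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />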

  • Here’s a situation I’ve been puzzling over for some time. The situation:
    Please consider an international website targeting 3 regions. (The real site has more regions, but I simplified the case for this question.) There is no default language. The content for each regional version is meant for that region only. The website.eu page is dynamic. When there is no region cookie, the page is identical to website.eu/nl/ (because the Netherlands is the most important region). When there is a region cookie (set by a modal), there is a 302 redirect to the corresponding regional homepage. What we want:
    We want regional Google to index the correct regional homepages (e.g. website.eu/nl/ on google.nl) instead of website.eu.
    Why? Because visitors surfing to website.eu sometimes tend to ignore the region modal and therefore browse the wrong version.
    For this, I set up canonicals and hreflangs accordingly (screenshots omitted). The problem:
    It’s been 40 days since the above hreflangs and canonicals were set up, but Google is still ranking website.eu instead of the regional homepages, as Search Console’s report for website.eu shows.
    Any ideas why Google doesn’t respect our canonical? Maybe I’m overlooking something in this setup (the combination of hreflangs and canonicals might be confusing)? Should I remove the hreflangs on the dynamic page, because there is no self-referencing hreflang? Or maybe it’s because website.eu has gathered a lot of backlinks over the years, whereas the regional homepages have far fewer, which might be why Google chooses to ignore the canonical signals? Or maybe it’s a matter of time and I just need to wait longer? Note: I’m aware the language subfolders (e.g. /be_nl) are not according to Google’s recommendations. But I’ve seen similar setups (like adobe.com and apple.com) where the regional homepage shows up fine. Any help appreciated!

    | dmduco
    0

  • Hi! We are trying to rank https://windowmart.ca for various local search terms. Our head office is in Edmonton, where we try to rank https://windowmart.ca/edmonton-windows-doors/ for such terms as "windows Edmonton", "replacement windows Edmonton", "windows and doors Edmonton", as well as others. The website was the leader in its niche for around 2 years. Then we had some server-related issues, moved to a new server, and connected the CDN NitroPack, which really improved our Google speed test results. Recently we noticed that our rankings started to drop. Do you know if NitroPack can negatively affect local SEO rankings? Thank you!

    | vaskrupp
    0

  • Hi All, Sorry for what's about to be a long-ish question, but tl;dr: Has anyone else had experience with a 301 redirect at the server level between HTTP and HTTPS versions of a site in order to maintain accurate social media share counts? This is new to me and I'm wondering how common it is. I'm having issues with this forced redirect between HTTP/HTTPS as outlined below and am struggling to find any information that will help me to troubleshoot this or better understand the situation. If anyone has any recommendations for things to try or sources to read up on, I'd appreciate it. I'm especially concerned about any issues that this may be causing at the SEO level and the known-unknowns. A magazine I work for recently relaunched after switching platforms from Atavist to Newspack (which is run via WordPress). Since then, we've been having some issues with 301s, but they relate to new stories that are native to our new platform/CMS and have had zero URL changes. We've always used HTTPS. Basically, the preview for any post we make linking to the new site, including these new (non-migrated pages) on Facebook previews as a 301 in the title and with no image. This also overrides the social media metadata we set through Yoast Premium. I ran some of the links through the Facebook debugger and it appears that Facebook is reading these links to our site (using https) as redirects to http that then redirect to https. I was told by our tech support person on Newspack's team that this is intentional, so that Facebook will maintain accurate share counts versus separate share counts for http/https, however this forced redirect seems to be failing if we can't post our links with any metadata. (The only way to reliably fix is by adding a query parameter to each URL which, obviously, still gives us inaccurate share counts.) This is the first time I've encountered this intentional redirect thing and I've asked a few times for more information about how it's set up just for my own edification, but all I can get is that it’s something managed at the server level and is designed to prevent separate share counts for HTTP and HTTPS. Has anyone encountered this method before, and can anyone either explain it to me or point me in the direction of a resource where I can learn more about how it's configured as well as the pros and cons? I'm especially concerned about our SEO with this and how this may impact the way search engines read our site. So far, nothing's come up on scans, but I'd like to stay one step ahead of this. Thanks in advance!

    | ogiovetti
    0

  • Hi, I recently encountered a very strange problem.
    One of the pages I published on my website ranked very well in the top 5 for a couple of days; then the page completely vanished. No matter how directly I search for it, it does not appear in the results. I checked GSC and everything seems to be normal, but when checking Google Analytics, I find it strange that there has been no data for the page since it disappeared, and it also does not show up in the 'active pages' section no matter how many different computers I keep it open on. I have checked as far as page 9 and used a couple of keyword tools, and it appears nowhere! It didn't have any backlinks, but it was unique and high quality. I have checked that the page does still exist and is still readable. Has this happened to anyone before? Any thoughts would be gratefully received.

    | JoelssonMedia
    0

  • Re: Are long URLs bad for SEO? Does a long domain name count against the character count, and is that bad for SEO as well? Here is an example: "https://kesslerfoundation.org/press-release/kessler-team-tests-regenerative-approach-preventing-osteoarthritis-after-knee-injury". This is over 35 characters; however, does the count begin after or before the domain name?

    | cesaromar1973
    0

  • We speak Persian, and everyone searches in Persian on Google. But I read in some sources that the URL should be in English. Please tell me which language to use for writing URLs.
    For example, here are two versions: 1) https://ghesta.ir/blog/how-to-become-rich/
    2) https://ghesta.ir/blog/چگونه-پولدار-شویم/

    | ghesta
    0

  • Hey Mozzers! I received a duplicate content notice from my Cycle7 Communications campaign today. I understand the concept of duplicate content, but none of the suggested fixes quite seems to fit. I have four pages with HubSpot forms embedded in them. (Only two of these pages have showed up so far in my campaign.) Each page contains a title (Content Marketing Consultation, Copywriting Consultation, etc), plus an embedded HubSpot form. The forms are all outwardly identical, but I use a separate form for each service that I offer. I’m not sure how to respond to this crawl issue: Using a 301 redirect doesn’t seem right, because each page/form combo is independent and serves a separate purpose. Using a rel=canonical link doesn’t seem right for the same reason that a 301 redirect doesn’t seem right. Using the Google Search Console URL Parameters tool is clearly contraindicated by Google’s documentation (I don’t have enough pages on my site). Is a meta robots noindex the best way to deal with duplicate content in this case? Thanks in advance for your help. AK

    | AndyKubrin
    0
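
    For reference, a robots meta noindex is indeed the usual tool for thin, near-duplicate utility pages. The tag goes in the head of each form page, and the page must stay crawlable so Google can see it:

      <meta name="robots" content="noindex, follow">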

  • What is your favorite tool for getting a report of URLs that are not cached/indexed in Google & Bing for an entire site? Basically I want a list of URLs not cached in Google and a separate list for Bing. Thanks, Mark

    | elephantseo
    2
