Footer External Links
-
I manage a collection of 3 travel websites with DAs in the 50s and 60s, and have launched a new site which currently has a DA of 35.
Currently there are no links between the sites in the footer. Adding a footer link to the new site from the existing, stronger sites would be relevant to users of those sites.
How would Google treat such footer links? We would be talking about a link from 5,000+ pages across 3 sites.
Would it help us on the new site? Damage the new site?
Could it negatively impact the currently strong websites? Thanks
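For reference, a minimal sketch of what a sitewide cross-site footer block might look like, with rel="nofollow" shown as one commonly discussed option for keeping the links useful to users without passing sitewide link equity. The site names and URLs below are placeholders, not the poster's actual properties.

```typescript
// Hypothetical sketch: rendering a cross-site footer block for the network.
// Site names/URLs are placeholders; rel="nofollow" is one option often discussed
// for sitewide footer cross-links, not a confirmed recommendation.
interface NetworkSite {
  name: string;
  url: string;
}

const networkSites: NetworkSite[] = [
  { name: "New Travel Site", url: "https://new-travel-site.example" },
  { name: "Established Site A", url: "https://site-a.example" },
  { name: "Established Site B", url: "https://site-b.example" },
];

// Returns the footer link HTML; toggle `nofollow` to decide whether the
// 5,000+ pages pass link equity or simply provide navigation for users.
function renderNetworkFooter(sites: NetworkSite[], nofollow: boolean): string {
  return sites
    .map((site) => {
      const rel = nofollow ? ' rel="nofollow"' : "";
      return `<a href="${site.url}"${rel}>${site.name}</a>`;
    })
    .join(" | ");
}

console.log(renderNetworkFooter(networkSites, true));
```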
Related Questions
-
Unsolved Capturing Source Dynamically for UTM Parameters
Does anyone have a tutorial on how to dynamically capture the referring source to be populated in UTM parameters for Google Analytics? We want to syndicate content and be able to see all of the websites that provided referral traffic for this specific objective. We want to set a specific utm_medium and utm_campaign but have the utm_source be dynamic and capture the referring website. If we set a permanent utm_source, it would appear the same for all incoming traffic. Thanks in advance!
Technical SEO | peteboyd
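One approach sometimes used for this (a hedged sketch, not an official Google Analytics feature): run a small client-side script on the destination pages, before the analytics tag fires, that copies the referring hostname into utm_source when no source is already present. The utm_medium and utm_campaign values below are placeholders, not the poster's real ones.

```typescript
// Hypothetical client-side sketch: if a visit arrives without utm_source,
// derive one from document.referrer so analytics attributes the syndicating site.
// Must run before the analytics tag so the rewritten URL is what gets recorded.
(function tagReferrerAsUtmSource(): void {
  const url = new URL(window.location.href);
  if (url.searchParams.has("utm_source") || !document.referrer) {
    return; // already tagged, or direct traffic
  }
  const referrerHost = new URL(document.referrer).hostname;
  if (referrerHost === window.location.hostname) {
    return; // ignore internal navigation
  }
  url.searchParams.set("utm_source", referrerHost); // dynamic source
  url.searchParams.set("utm_medium", "syndication"); // assumed fixed medium
  url.searchParams.set("utm_campaign", "content-syndication"); // assumed fixed campaign
  // Rewrite the URL in place without reloading the page.
  window.history.replaceState(null, "", url.toString());
})();
```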
How to index e-commerce marketplace product pages
Hello! We are an online marketplace that submitted our sitemap through Google Search Console 2 weeks ago. Although the sitemap was submitted successfully, out of ~10,000 links (we have ~10,000 product pages), only 25 have been indexed. I've attached images of the reasons given for not indexing (attachments: gsc-dashboard-1, gsc-dashboard-2). How would we go about fixing this?
Technical SEO | fbcosta
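Without the GSC screenshots it's hard to say which non-indexing bucket the pages fall into, but a spot check of a sample of product URLs often narrows it down. A rough diagnostic sketch (Node 18+ with global fetch; the URL list is a placeholder):

```typescript
// Hypothetical diagnostic: spot-check sample product URLs for the usual
// "discovered/crawled but not indexed" culprits before blaming the sitemap itself.
const sampleUrls: string[] = [
  // e.g. "https://example-marketplace.com/product/123" -- placeholders, not real URLs
];

async function auditUrl(url: string): Promise<void> {
  const res = await fetch(url, { redirect: "manual" });
  const html = res.status === 200 ? await res.text() : "";
  const noindex = /<meta[^>]+name=["']robots["'][^>]+noindex/i.test(html);
  const canonical = html.match(
    /<link[^>]+rel=["']canonical["'][^>]+href=["']([^"']+)["']/i
  )?.[1];
  console.log({
    url,
    status: res.status, // 3xx/4xx/5xx pages won't be indexed
    noindex, // robots meta blocking indexing
    canonicalMismatch: canonical !== undefined && canonical !== url, // canonical points elsewhere?
  });
}

Promise.all(sampleUrls.map(auditUrl)).catch(console.error);
```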
Traffic from External Sites
I'm hoping I came to the right forum. Is there a way in Moz to view a prospective ad source's traffic?
Digital Marketing | EHCarrera2022
Unsolved Orphaned unwanted urls from the cms
Hi, I am working on quite an old CMS, and there are a bunch of URLs that don't make any sense:
https://www.trentfurniture.co.uk/products/all-outdoor-furniture/all-outdoor-furniture/1
https://www.trentfurniture.co.uk/products/all-chairs/all-chairs/1
https://www.trentfurniture.co.uk/products/all-industries/all-chairs/1
https://www.trentfurniture.co.uk/products/all-chairs/all-industries/1
https://www.trentfurniture.co.uk/products/all-chairs/banqueting-furniture/1
https://www.trentfurniture.co.uk/products/all-chairs/bar-furniture/1
https://www.trentfurniture.co.uk/products/all-chairs/bentwood-furniture/1
For example, there are no internal links to them and, fortunately, not much traffic at all. But I can't see in the CMS why they are being generated. I've checked the HTML code for a reason, but all I can think of is the structure... something odd the CMS writes?
Does anyone have any ideas, please? And would I redirect all of these? I'm just thinking there could be a better solution than redirects, since there are no links or traffic; ideally the devs would solve why they are being generated in the first place. Unfortunately, I get very slow responses from the devs as a third-party company, hence asking on here ;0) (Some of those URLs are indexed too.) Thanks in advance.
Technical SEO | MattHopkins
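If the devs can't stop the CMS generating them, one pattern-based option is to decide per URL whether a 301 to a sensible category or a 410 is appropriate. A sketch only: the category mappings are placeholders and the real CMS routing layer isn't known.

```typescript
// Hypothetical routing helper for the auto-generated /products/<a>/<b>/1 URLs:
// 301 to a real category where a sensible target exists, otherwise 410.
type Disposition =
  | { action: "redirect"; status: 301; location: string }
  | { action: "gone"; status: 410 }
  | { action: "passthrough" };

// Placeholder mapping from duplicated path segments to their canonical category pages.
const canonicalCategory: Record<string, string> = {
  "all-outdoor-furniture": "/outdoor-furniture",
  "all-chairs": "/chairs",
  "banqueting-furniture": "/banqueting-furniture",
};

export function disposeOrphanUrl(path: string): Disposition {
  const match = path.match(/^\/products\/([^/]+)\/([^/]+)\/1\/?$/);
  if (!match) return { action: "passthrough" }; // not one of the orphan patterns
  const target = canonicalCategory[match[2]] ?? canonicalCategory[match[1]];
  return target
    ? { action: "redirect", status: 301, location: target }
    : { action: "gone", status: 410 }; // no sensible target: tell crawlers it's gone on purpose
}

// Example: disposeOrphanUrl("/products/all-chairs/banqueting-furniture/1")
// -> { action: "redirect", status: 301, location: "/banqueting-furniture" }
```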
Unsolved Duplicate LocalBusiness Schema Markup
Hello! I've been having a hard time finding an answer to this specific question so I figured I'd drop it here. I always add custom LocalBusiness markup to clients' homepages, but sometimes the client's website provider will include their own automated LocalBusiness markup. The codes I create often include more information. Assuming the website provider is unwilling to remove their markup, is it a bad idea to include my code as well? It seems like it could potentially be read as spammy by Google. Do the pros of having more detailed markup outweigh that potential negative impact?
Local Website Optimization | GoogleAlgoServant
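For what it's worth, one hedged idea when the provider's block can't be removed: give the richer LocalBusiness node the same @id the page already uses (or at least a stable one), so the two blocks are more likely to be read as describing one entity rather than two competing businesses. All values below are placeholders.

```typescript
// Hypothetical sketch of the richer LocalBusiness JSON-LD the poster describes adding.
// Every value here is a placeholder, not a real client.
const localBusinessSchema = {
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  // Reusing a single stable @id helps duplicate blocks be treated as the same node.
  "@id": "https://example-client.com/#localbusiness",
  name: "Example Client Co.",
  url: "https://example-client.com/",
  telephone: "+1-555-0100",
  address: {
    "@type": "PostalAddress",
    streetAddress: "123 Placeholder St.",
    addressLocality: "Springfield",
    addressRegion: "IL",
    postalCode: "62701",
  },
  openingHours: ["Mo-Fr 09:00-17:00"],
};

// Serialize into the <script type="application/ld+json"> tag injected into the page.
const jsonLd = `<script type="application/ld+json">${JSON.stringify(localBusinessSchema)}</script>`;
console.log(jsonLd);
```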
Unsolved Have we been penalised?
Hey Community, we need help! Have we been penalised, or is there some technical SEO issue that is stopping our service pages from being properly read? Website: www.digitalnext.com.au. In July 2021, we suffered a huge drop in coverage for both short and long-tail keywords. We thought this could have been because of link spam, Core Web Vitals, or a core update around that time period. SEMrush: https://gyazo.com/d85bd2541abd7c5ed2e33edecc62854c
GSC: https://gyazo.com/c1d689aff3506d5d4194848e625af6ec There is no manual action within GSC, and we have historically ranked page 1 for super competitive keywords. After waiting some time thinking it was an error, we then took the following actions: launched a new website; rewrote all page content (except blog posts); ensured each page passes Core Web Vitals; submitted a backlink detox; removed a website that was spoofing our old one; introduced a strong pillar-and-cluster internal link structure. After 3 months of the new website, none of our core terms have come back and we are struggling for visibility. We still rank for some super long-tail keywords, but this is the lowest visibility we have had in over 5 years. Every time we launch a blog post it does rank for competitive keywords, yet the old keywords are still completely missing. It almost feels like any URLs that used to rank for core terms are being penalised. So, I am wondering whether this is a penalisation (and by which algorithm), or whether there is something wrong with the structure of our service pages that stops them from ranking. Look forward to hearing from you.
Steven
Technical SEO | StevenLord
Fake Links indexing in Google
Hello everyone, I have an interesting situation occurring here, and I'm hoping someone has seen something of this nature or can offer some advice. We recently installed WordPress on a subdomain for our business and have been blogging through it. We added the Google Webmaster Tools meta tag, and I've noticed an increase in 404 links. I brought this up to our server admin, and he verified that a lot of IPs were pinging our server looking for these links that don't exist. We've combed through our server files and nothing seems to be compromised. Today, we noticed that when you search site:ourdomain.com in Google, the WordPress subdomain shows hundreds of these fake links that return a 404 page when you visit them. Just curious if anyone has seen anything like this: what it may be, how we can stop it, and whether it could negatively impact us in any way. Should we even worry about it? Here's the link to the Google results: https://www.google.com/search?q=site%3Amshowells.com&oq=site%3A&aqs=chrome.0.69i59j69i57j69i58.1905j0j1&sourceid=chrome&es_sm=91&ie=UTF-8 (the odd links show up on pages 2-3+)
Technical SEO | mshowells
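One thing worth verifying (a rough sketch; the URL list is a placeholder) is that the fabricated paths return a genuine 404 or 410 status rather than a soft 404, since hard errors generally drop out of the index on their own.

```typescript
// Hypothetical check (Node 18+): confirm the fake paths return a real 404/410
// rather than a 200 "not found" page (a soft 404), which can linger in the index.
const suspectUrls: string[] = [
  // e.g. "https://blog.example.com/made-up-path" -- placeholders, not the real URLs
];

async function checkStatus(url: string): Promise<void> {
  const res = await fetch(url, { redirect: "manual" });
  const hardError = res.status === 404 || res.status === 410;
  console.log(
    `${url} -> ${res.status} ${hardError ? "(good: hard error)" : "(check: possible soft 404 or redirect)"}`
  );
}

Promise.all(suspectUrls.map(checkStatus)).catch(console.error);
```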
What should I do about bad links?
Hi, my blog is http://www.dota2club.com/ and I have many bad links pointing to it. What should I do about that, and how? I started guest blogging 10 days ago, but my bad links from before are hurting my blog. Please help 🙂 Thank you!
Technical SEO | wolfinjo
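If the links really are harmful, the commonly used route is a disavow file uploaded through Google Search Console's disavow tool. A small sketch that builds one in the documented format; the domains and URLs listed are placeholders, not real link sources.

```typescript
// Hypothetical sketch: generate a disavow.txt in the format Google's disavow tool
// accepts ("#" comment lines, "domain:" lines for whole domains, bare URLs for single pages).
import { writeFileSync } from "node:fs";

const badDomains = ["spammy-directory.example", "link-farm.example"]; // placeholders
const badUrls = ["https://some-site.example/paid-links-page.html"];   // placeholders

const lines = [
  "# Disavow file generated " + new Date().toISOString().slice(0, 10),
  ...badDomains.map((d) => `domain:${d}`),
  ...badUrls,
];

writeFileSync("disavow.txt", lines.join("\n") + "\n", "utf8");
console.log(`Wrote disavow.txt with ${lines.length - 1} entries`);
```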