
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Is there any SEO benefit for links shared in Facebook feeds or wall posts?

    | NinjaTEL3
    0

  • Hello all, All our category pages www.pitchcare.com/shop are linked to from every product page via the sidebar navigation, which results in every category page having over 1700 links with the same anchor text. I have noticed that the category pages don't appear to be ranked when they most definitely should be. For example, http://www.pitchcare.com/shop/moss-control/index.html is not ranked for the term "moss control"; instead another of our deeper pages is ranked on page 1. A previous SEOmoz article, "Excessive Internal Anchor Text Linking / Manipulation Can Trip An Automated Penalty on Google", says:
    I recently had my second run-in with a penalty at Google that appears to punish sites for excessive internal linking with "optimized" (or "keyword stuffed anchor text") links. When the links were removed (in both cases, they were found in the footer of the website sitewide), the rankings were restored immediately following Google's next crawl, indicating a fully automated filter (rather than a manual penalty requiring a reconsideration request). Do you think we may have triggered a penalty? If so, what would be the best way to tackle this? Could we add nofollow to the links on the product pages? Cheers, Todd

    | toddyC
    0
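
    For reference, a minimal sketch of what the proposed nofollow would look like on a sidebar link (the URL is taken from the question; whether this is the right fix is exactly what is being asked):

      <!-- Sidebar category link with rel="nofollow" added, as the question proposes.
           Note that nofollowed internal links drop rather than redistribute link
           equity, so trimming or varying the sitewide anchors may work better. -->
      <li><a href="/shop/moss-control/index.html" rel="nofollow">Moss Control</a></li>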

  • In 2008 we performed an experiment which showed some seemingly random behaviour by Google (indexation, caching, PageRank distribution). Today I put the results together, analysed the data we had, and got some strange results which hint at the possibility that Google purposely throws in a deviation from normal behaviour here and there. Do you think Google randomises its algorithm to prevent reverse engineering and enable chance discoveries, or is it all a big load-balancing act which produces quasi-random behaviour?

    | Dan-Petrovic
    0

  • I have a site that is successful on the SERPs for a certain geography, let's call it City A (I'm sure you can't tell what it is from my username). I'm moving to a new city in another state, so I will be building my business in this area (City B). Should I create a new domain for City B with CityBWebsiteDesign.com, or should I create a sub-domain called CityB.BrandableCompanyName.com and just redirect CityBWebsiteDesign.com to that URL for offline marketing purposes only? My current website BrandableCompanyName.com has some authority with Google. Would it be better to build something on the sub-domain and get some sort of cross-benefit, or are there really no benefits to be had between sub-domains? The benefit of going with CityBWebsiteDesign.com would be having a keyword-rich URL, but I would basically be starting from zero with building authority. Specific experience you've had with this or cited examples would be great for the discussion! Thanks,
    Jared

    | JaredDetroit
    0

  • By paying you guys each month, will you be making my website more visible and accessible, or will you only point out the mistakes I should fix?

    | vacksah
    0

  • For some reason, our URLs are set to change from "www.apprenda.com/ANYTHING" to "apprenda.com/ANYTHING", yet these register as different pages. We have rankings in SEOmoz Pro for terms where our homepage shows up 6th on Google, but SEOmoz says it's not on the first page because it's checking against apprenda.com and not www.apprenda.com. Also, it seems that pages with trailing slashes register differently than those without. Should we be doing something about that, such as making sure all pages get rewritten either with or without the trailing slash? For instance, this URL: http://apprenda.com/saasgrid/features/multi-tenancy/ and this URL: http://apprenda.com/saasgrid/features/multi-tenancy are really the same page, yet in our analytics they register as different pages with their own stats, etc. What should we do in our particular case, and how can we get this fixed? I really appreciate the help, and thanks in advance! Jesse

    | ApprendaPlatform
    0
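
    A minimal .htaccess sketch of the two rewrites the question describes, assuming Apache with mod_rewrite; the domain is taken from the question and the exact rules would need testing before use:

      RewriteEngine On

      # 301 the bare domain to the www host.
      RewriteCond %{HTTP_HOST} ^apprenda\.com$ [NC]
      RewriteRule ^(.*)$ http://www.apprenda.com/$1 [R=301,L]

      # 301 URLs without a trailing slash to the slashed version,
      # skipping real files such as images and CSS.
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteRule ^(.*[^/])$ /$1/ [R=301,L]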

  • I want to specify a rel=canonical link for each category page. How can I do that without changing the code (just from the admin section)? The filters and search sorting are creating duplicate content on the site with their parameters. If there is a way, please specify the method; I want to avoid hours of work on a script like this. Thanks.

    | oneticsoft
    0
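
    For reference, the tag each filtered or sorted URL would need to emit in its <head>, pointing at the clean category URL. The URL below is illustrative, and many carts expose this through a plugin or admin setting rather than a code change:

      <link rel="canonical" href="http://www.example.com/category/seeds/" />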

  • I'm using a forum plugin called Simple Press, and the rest of my site is looking good, with only a few minor errors due to a long URL. Anyway, the only 4 major errors I have are these. These 3 links have no titles, so is there somewhere I can give them titles, or add a rel=nofollow? /index.php?sf_ahah=acknowledge /index.php?sf_ahah=permissions /index.php?sf_ahah=tags And then the 3 above plus this one: http://www.societyforethicsand…..?xfeed=all have no meta description associated with them. So, is there somewhere I can add the meta description for all 4? I have spoken to support, and it turns out the first 3 links with no titles are AJAX content for pop-ups. Instead of waiting for them to work out how to resolve this issue, does anyone know how to stop them coming up as major errors?

    | CosmikCarrot
    0
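
    One low-effort option while waiting on the plugin authors: a robots.txt sketch that keeps those AJAX endpoints out of the crawl entirely (the path pattern is taken from the question; Google honours the wildcard shown):

      User-agent: *
      # Block the Simple Press AJAX endpoints that have no titles or descriptions.
      Disallow: /*?sf_ahah=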

  • Hi all, For a project I'm working on there will be an opportunity to have a number of websites link back to our main site. Rather than giving out a straightforward text link, I'm more interested in building and handing out some kind of widget which is topical to both us and the websites giving us links. Although I do a fair bit of web development with various technologies, I have never played around with building widgets that can pull in data feeds from our database, etc. Does anyone have any good recommendations of tutorials covering this area, or alternatively any companies offering this kind of widget-building service? Thanks in advance, Darren

    | DarrenAtkinson
    0

  • I have a particular page which shows primary contact details as well as "additional" contact details for the client. Given that I do not want Google to misinterpret the focus of the page away from the primary contact details, which of the following three options would be best? 1. Place the "additional" contact details (with maps) in JavaScript, Ajax or similar to suppress them from being crawled. 2. Leave the "additional" contact details alone but emphasize the primary contact details by placing them in rich snippets/microformats. 3. Do nothing and allow Google to crawl the page with all contact details. Thanks, Phil

    | AU-SEO
    0
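
    A minimal hCard (microformat) sketch for option 2, marking up only the primary contact details; all values below are placeholders:

      <div class="vcard">
        <span class="fn org">Example Pty Ltd</span>
        <div class="adr">
          <span class="street-address">1 Example Street</span>,
          <span class="locality">Sydney</span>
          <span class="region">NSW</span>
          <span class="postal-code">2000</span>
        </div>
        <span class="tel">+61 2 9000 0000</span>
      </div>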

  • We carry a few brands that have special foreign characters, e.g., Kühl, Lolë, but do search engines recognize special Unicode characters? Obviously we would want to spend more energy optimizing keywords that potential customers can type with a keyboard, but is it worthwhile to throw in some encoded keywords and anchor text for people that copy-paste these words into a search? Do search engines typically equate special characters to their closest English equivalent, or are "Kuhl", "Kühl" and "K&uuml;hl" three entirely different terms?

    | TahoeMountain40
    0
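
    For context, the raw UTF-8 character and its HTML entity are equivalent once the page is parsed, so those two are not separate keywords; whether engines also fold "ü" to "u" is the open part of the question. A sketch:

      <meta charset="utf-8">
      <p>Kühl jackets</p>        <!-- raw UTF-8 character -->
      <p>K&uuml;hl jackets</p>   <!-- named entity; identical text after parsing -->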

  • Trying to figure out how best to optimize the timing of new content, including blogs and other on-page content.

    | AaronSchinke
    0

  • If a user runs a search that returns no results, and the server returns a 204 (No Content), will Googlebot treat that as the rough equivalent of a 404 or a noindex? If not, then it seems one would want to noindex the page to avoid low quality penalties, but that might require more back and forth with the server, which isn't ideal. Kurus

    | kurus
    0
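
    If the empty-results page ends up returning 200 rather than 204, a robots meta tag in its <head> is the usual way to keep it out of the index without an extra round trip with the server; a minimal sketch:

      <meta name="robots" content="noindex">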

  • Hello All, I am optimizing three websites for a services-based company in the South Jersey area. Of course, within South Jersey there are certain counties, cities and towns I would like to show up for. For example: Pool Cleaning, South Jersey Pool Cleaning, Cherry Hill NJ Pool Cleaning, Burlington County NJ Pool Cleaning, Voorhies NJ Pool Cleaning. Do I need to create a page on my websites for every possible county, city and town I want to rank for? This would entail creating thousands of pages targeting these geographic keywords. I have seen other similar sites just list all the counties, cities and towns they service in the footer, and it seems to work. Of course, this would be beneficial for any business that is looking to rank not only in its home base but in a predetermined radius around its home base as well. Thanks so much, Bill

    | wparlaman
    0

  • We're looking for a company that can help us optimize our google product feed.  Does anyone have any recommendations or suggestions? Thanks!

    | eric_since1910.com
    0

  • I caught a dropped domain with a nice keyword but a poor reputation. It used to have some malware on the site, and WOT (a site review tool available in Chrome, among others) has very negative reviews tied to the site. I guess that Google has to have records about that as well, because Chrome used to prompt a warning when I entered the site. My question is: how long will the bad reputation last if I build a legitimate website there?

    | zapalka
    0

  • Hi, I just wanted to get some clarification on whether Google would penalize your site if you had many links coming from a questionable site. We've been struggling with rankings for years, even though we have one of the oldest sites in the industry with a good link profile and the site is well optimized. I was looking through Webmaster Tools and noticed that one website links to us over 100,000 times, all to the home page. The site is www.vietnamfuntravel.com. When I looked at the site it seems that they operate a massive link exchange; I'm not sure what the history is and why they link to us so much, though. Is there any chance that this could impact us negatively? If so, what would be the best way to deal with the situation? I could ask them to take the links down but can't guarantee they would do it quickly (if at all). Would blocking their domain from our .htaccess file have the desired effect?

    | Maximise
    0
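
    For clarity on the .htaccess idea: "blocking their domain" there normally means refusing visits that arrive with that referrer, as sketched below (assuming Apache with mod_rewrite). It turns away the referred traffic but does not remove the links from Google's link graph:

      RewriteEngine On
      # Return 403 Forbidden to traffic referred from the linking site.
      RewriteCond %{HTTP_REFERER} vietnamfuntravel\.com [NC]
      RewriteRule .* - [F]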

  • What advice do you have for achieving verification of Google Places for a client? I have a client at the moment, and I tried getting the call sent through; I'm not sure what happened, but a couple of tries at this did not work. I've tried the postcard way and I'm still waiting. Do I need to be more patient in Australia for this verification postcard? Is there a way I can verify the info myself? Note: I have set up a separate email address, apart from their business email, to handle a lot of the link building, but this is different to their business email, which Google uses.

    | iSenseWebSolutions
    0

  • My site has faceted navigation that allows shoppers to filter category page results by things like brand, size, price range, etc. These pages 302 redirect to the same page they came from, which already includes a canonical meta tag. I added the rel="nofollow" attribute to the facet links and added the line "Disallow: /category_filter/" to robots.txt. One of our SEO consultants told me that this is likely diluting the potency of the page's link juice, since it is divided among all the page's links, including the links I am instructing crawlers to disregard. Can anybody tell me whether I am following best practices for links that redirect to the same page?

    | TahoeMountain40
    0

  • Hello All! Our site uses dynamically generated pages. I was about to begin the process of optimising our product category pages www.pitchcare.com/shop. I was going to use internal anchor text links from some high-ranking pages within our site, but each of the product category pages already has 1745 links! Am I correct in saying that internal anchor text links only work up to a certain point (maybe 10 or so links), so any new internal anchor text links will count for nothing? Thanks Todd

    | toddyC
    0

  • If I have 2 domains with different content that are on the same topic, and each one lives on its own IP address, what could be the result if I do a permanent redirect of just one internal page from one domain to the counterpart page on the other? What if I use rel=canonical instead of a 301? Thank you!

    | kolio_kolev
    0

  • We are switching one of our sites to a Magento site and don't want to lose current rankings. What are the best practices for this? It's the same domain, but the deep pages will change URLs.

    | DavidKonigsberg
    0

  • Hi, I'd like to ask you what I should do in my situation. I've shortened my URLs from something like this: domain.com/module/action/type/id/keyword to this: domain.com/keyword. After the 301s, the SERPs refreshed and positions stayed the same (yeah, lucky me :). After 2 days I got some high-PR links (4 and 5). After 8 days my new URL disappeared for one keyword. It has now been 6 days... I've removed these links and still no results. So the question is: what should I do? Remove the new URL and replace it with the old one, or get new links?

    | sui
    0

  • I have recently come across several bloggers that have been trying to formulate the best concise definition of SEO. What one sentence definitions have you used / seen? Avoid run-ons. Tweetable, even better.

    | Gyi
    0

  • Let's suppose that I want to rank for the keyword "hotels". If I put this keyword in ALL of the link anchor texts, then Google will very likely penalize the site. My question is: how many keyword variations should I use in anchors (provided I want to rank for just one keyword, i.e. "hotels")? Would one keyword variation be okay, and is it fine to use the main keyword in 80% of anchors and the keyword variation(s) in just 20% of anchor texts, such as: hotels 80%, cheap hotels 20%? Note: I do not want to rank for "cheap hotels", I just want to use it as an anchor variation of my desired keyword "hotels". Thanks!

    | RightDirection
    0

  • I'm curious if anyone here running a large, complex, dynamic site has used the Apache mod_rewrite module to simplify their site's URLs by rewriting them in a standard format. The chief use of this module for SEO purposes would be to aid in canonicalization and reduce duplicate content. For example, you could easily convert all of your ALL CAPS or MixedCase URLs to lower case, change all "/index.html" URLs to just point to "/", change all word separators to hyphens, and so on. Any server-side ninjas out there with stories to tell? 🙂

    | jcolman
    0
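
    Two of the canonicalization rewrites mentioned, as an Apache sketch; note the lowercase map has to be declared in the server or vhost config rather than .htaccess, and the rules below are illustrative:

      # In the server/vhost config:
      RewriteMap lc int:tolower

      # In the vhost or .htaccess:
      RewriteEngine On

      # 301 /index.html (and /section/index.html) to the bare directory URL.
      RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]

      # 301 any URL containing uppercase letters to its lowercase form.
      RewriteCond %{REQUEST_URI} [A-Z]
      RewriteRule ^(.*)$ /${lc:$1} [R=301,L]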

  • Hi, I have way too many 302 redirects. How can I bulk change these to 301s? I have started in cPanel, but I could be old by the time I finish.

    | freedomelectronics
    1
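
    For reference, cPanel normally writes the redirects it creates into the .htaccess file, as mod_rewrite or mod_alias rules depending on the version, so a bulk find-and-replace of the status code in that file is usually faster than redoing each one in the panel. Illustrative forms only:

      # mod_rewrite form - change the R flag from 302 to 301:
      RewriteRule ^old-page$ http://www.example.com/new-page [R=301,L]
      # mod_alias form - change the status code:
      Redirect 301 /old-page http://www.example.com/new-page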

  • Hello, I have what I think is a noob question. I have a medium-sized website and need to put it into maintenance for the next 2 months, and afterwards activate a completely new site. My client asked me to do this because the same people who run the constant flow of information on the site are the ones who are going to develop the new site, so he wants to just close it down for a while. So... what are the steps for doing this with minimum impact on any SEO advances made these past months? How do I tell the search engines, "Hey, just under maintenance for a while... then... I'm back in the game, but this is my new structure, and the old one should go here"?

    | daniel.alvarez
    0
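
    The standard way to tell crawlers a site is temporarily down is a 503 with a Retry-After header, sketched below for Apache 2.4 with mod_rewrite and mod_headers (paths are illustrative). Note that 503 is intended for short outages; for a two-month closure, a lightweight holding page that stays live is generally kinder to rankings:

      ErrorDocument 503 /maintenance.html
      RewriteEngine On
      # Serve 503 for everything except the holding page itself.
      RewriteCond %{REQUEST_URI} !^/maintenance\.html$
      RewriteRule ^ - [R=503,L]
      Header always set Retry-After "86400"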

  • Is there any way to predict if and how organic traffic would change if we successfully added some high-quality links to our website? Quantifying link value would help us plan how much time and effort we should spend on quality link building. I understand that the more good links we get, the better. But beyond that, I am looking for some methodology/data/formulas that would help decide if links are worth pursuing. Here is an example: let's say we acquired 20 high-quality links from PR 0-5 pages of some trusted websites of PR 6-8, and let's say that these pages also link to 10-20 other websites. Would such a campaign be of some direct value to our ecommerce website of PR 6? My question is limited to how high-quality links improve overall Google search traffic to the website. I am not interested in calculating the value of individual keywords - most of our search traffic comes from the long tail. I am also not interested in how to estimate referral traffic - both seem much easier topics to tackle. But how would I be able to measure the value of, say, 1 link from a PR 8 site on a PR 3 page, when there are 10 other external links on that page?

    | Quidsi
    0

  • If my site has just five content pages, instead of 25 or 50, then will it get penalized by Google for a given moderately competitive keyword?

    | RightDirection
    0

  • Prob the most n00b question of all, but once I understand this I will be able to research on my own from here: If a search engine produces results by the keywords from individual website posts/pages, then how are the keywords I choose for my homepage so important if the general homepage meta-tag keywords are essentially ignored by the search engines? Should I repeat my primary keywords on EVERY post, in addition to the ones that relate to that individual post or am I misunderstanding something fundamental? My new site is http://splatterMUSIC.com and I want to be at the top of the results for anyone wanting to watch music vlogs, album reviews, music lessons, funny music-related videos, new non-major label music videos, and all kinds of other concert footage, etc.

    | SEOsolver
    0

  • Hi there! Some doubts are confusing my head and I need some assistance from you to get on the right track. I'll explain my situation and want to hear what you would really recommend for medium/long-term permanent results. 1 - I have a PR2 (.com.br) domain; 2 - I'm talking about low/medium-competition micro-niche keywords; 3 - I got all the pages I want indexed (I have a well SEO-constructed website with internal link building); 4 - If a keyword has average competition, I'll start out ranking on page #3 of the SERPs; for a few low-competition keywords I start on page #1; 5 - I do a little whitehat link building, 1 or 2 backlinks on authority sites, and then about 15 days later I come up to page #1, generally at position 9/10. And then I get stuck 🙂 No more authority sites where I can get backlinks... I do some posts on the company Twitter/Facebook pages, but they are nofollow, so I don't really know if this can help (I never see a SERP result). I did some "blackhat" stuff to see if it really works: I can say for sure the "profile backlinks" that we can buy from some sites don't work (maybe it's just me). I can't see them in Webmaster Tools, and my rankings haven't changed since I bought a pack of 100 links to test (the links are working, I checked them one by one). Maybe the problem is the domains, because my site is .com.br and I'm buying .com profile links. I guess Google considers backlinks from .com.br more valuable for my site. Back to whitehat: I wrote some articles and posted them the right way, of course on .com.br article sites, got them indexed and can see the backlinks in Webmaster Tools, but no change in the SERPs (maybe this is a long-term result and I'm not seeing it yet). I'm really "scratching my hand" to do some blackhat stuff, but I don't want to lose what I have already done... I heard a lot about ScrapeBox but don't feel comfortable spamming a lot of blogs. I really want long-term permanent results (my sites are totally whitehat/corporate sites). Can you expert guys give me some pointers on where I need to "walk" now to improve my SERPs? I have never reached #1 and want to try to rank there at least once to understand how it can be done... I'm now thinking of paying someone to rewrite 20 copies of an article and put them up on some sites, to see if 20 can improve something. But I'm still not confident, because it will cost about $100 for a good writer to do it in my language. Maybe I can do better things with 100 bucks. I guess I followed the path correctly: internal SEO -> got indexed -> backlinks from authorities -> new article backlinks to me (is that okay at this stage or not?) -> (what next?). I know SEO is hard, never-ending work, but what I'm trying to get clear in my head is the path of the work (if a right path really exists). Every word will be appreciated. What can you suggest I try now? (Please give me a hint to see SERP results 🙂 If I feel that something worked, no matter what it costs me, I'll pay for the work happily.) Sorry if I'm a little confusing; English isn't my first language. Thanks.

    | azaiats2
    0

  • I've read several blogs discussing how including more than one H1 per page is a serious no-no. However, what is the most effective heading tag to use for your global navigation system? Or should it not be a heading tag at all?

    | calin_daniel
    0
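
    For reference, global navigation generally does not need any heading tag; a plain list (wrapped in the HTML5 <nav> element where available) leaves the single h1 free for the page's own topic. A minimal sketch:

      <nav>
        <ul>
          <li><a href="/">Home</a></li>
          <li><a href="/products/">Products</a></li>
          <li><a href="/contact/">Contact</a></li>
        </ul>
      </nav>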

  • Hi guys, I have been persevering with this ranking for some time now and thought you might be able to help, or direct me to where I can get help. I am learning a lot through SEOmoz but I am still very green. Basically, on the 20th of December we jumped up to a 2nd-place listing and then dropped back down on the 17th of January 2011. The site is http://mlb.broomeaccommodation.com.au and the search term is 'Broome Accommodation'. As you can see, it is a considerable drop and is really affecting our bookings and sales figures. I have attached a link to a screen capture of the problem: http://exitforward.com/kimberleyaccomm/seomoz.png Interested to hear your thoughts and get some help on this frustrating matter. Kind regards Bodie Czeladka

    | Bodie
    0

  • If a site removes thousands of pages in one day, without any redirects, is there reason to think Google will penalize the site for this? I have thousands of subcategory index pages. I've figured out a way to reduce the number, but it won't be easy to put in redirects for the ones I'm deleting. They will just disappear. There's no link juice issue. These pages are only linked internally, and indexed in Google. Nobody else links to them. Does anyone think it would be better to remove the pages gradually over time instead of all at once? Thanks!

    | Interesting.com
    0
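
    If the deleted subcategory pages share a recognisable path, one option is to answer for them with 410 Gone rather than a plain 404, which signals that the removal is deliberate; an Apache sketch with a hypothetical path:

      # Return 410 Gone for the retired subcategory index pages.
      RedirectMatch gone ^/subcategories/.*$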

  • It's all in the title, really. One of our clients came up with errors in a server header check, so I pinged the site and it times out. The hosting company has told them that it's because they're blocking ICMP requests and that this doesn't affect SEO at all... but I know that sometimes pinging posts, etc. can be beneficial, so is this correct? Thanks, Steve.

    | SteveOllington
    0

  • Suppose I've got a blog about cooking and another about computers. What's the best architecture for SEO: mysite.com/cooking-blog and mysite.com/computers-blog, OR cooking-blog.mysite.com and computers-blog.mysite.com?

    | marcelocustodio
    0

  • Hi people! When you try to access http://www.ufam.edu.br you are redirected to http://portal.ufam.edu.br. Here's the code: <title>UFAM - Universidade Federal do Amazonas</title> What are the implications of this for SEO? Won't the juice be passed? Isn't it better to pass juice using a 301 redirect?

    | marcelocustodio
    0
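
    For comparison, a server-side 301 from the www host to the portal host would look like the sketch below (Apache with mod_rewrite; hosts taken from the question), and it is generally the most reliable way to pass link equity compared with a client-side redirect:

      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^www\.ufam\.edu\.br$ [NC]
      RewriteRule ^(.*)$ http://portal.ufam.edu.br/$1 [R=301,L]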

  • Hi, Basically my WordPress site consists of 4 parent categories, each of them having 50+ subpages (or child pages). Some of these child pages have other child pages of their own. This creates URLs (or slugs) looking like this: mydomain.com/sunflower-seeds/exotic-sunflower-seeds/new-york-seed-shops I found a plugin that 301 redirects a page to any given URL. This way I can make www.mydomain.com/sunflower-seeds/exotic-sunflower-seeds/new-york-seed-shops redirect to mydomain.com/new-york-seed-shops, only with the purpose of having shorter links and the keyword appearing 'earlier' in the URL. Another option to shorten the URLs is to shorten the parent page slugs, something like this: mydomain.com/seeds/exotic/new-york. The only downside is that the keyword for the parent pages themselves won't appear in the slug anymore. The keywords will still be there in the title, description and H1 tag, but not in the URL. So if my keyword is sunflower seeds, instead of having www.mydomain.com/sunflower-seeds I'll have www.mydomain.com/seeds. What would you do?
    1: Leave the URLs as they are - really long URLs where keywords don't come 'early' in the URL: mydomain.com/sunflower-seeds/exotic-sunflower-seeds/new-york-seed-shops.
    2: Make child pages 301 redirect - short URLs, but the URL structure is lost (parent pages don't show in the URL): mydomain.com/new-york-seed-shops.
    3: Use shorter slugs for the parent pages - to me this looks like the best option, but then I'll lose my keywords in the slugs for the parent pages: mydomain.com/seeds/exotic/new-york.
    4: Yes, there's even a 4th option. Keeping my current slugs (mydomain.com/sunflower-seeds/exotic-sunflower-seeds/new-york-seed-shops), I could use the plugin to redirect every child page (for example new-york-seed-shops) to something like mydomain.com/seeds/exotic/new-york-seed-shops. But I don't know if it's okay to redirect to different parent pages with WordPress... I'd also like to know if backlinks to 301-redirected pages count for the page they redirect to, and if they have the same impact/strength as links that point directly to the page. Thanks in advance

    | ZeroGrav
    1
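
    On options 2 and 4: a single pattern rule can 301 the long child URLs to the shorter form without a per-page plugin entry, e.g. the Apache sketch below (slugs taken from the question). As for the last point, links to a URL that 301s are generally credited to the destination, though opinions differ on whether a small amount of equity is lost along the way.

      RedirectMatch 301 ^/sunflower-seeds/exotic-sunflower-seeds/(.*)$ http://www.mydomain.com/$1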

  • Hi, I'm working on the SEO for a site and I'd love to get the additional page links (sitelinks) under our main site listing in Google. Any ideas as to how we can achieve this? Thanks,

    | Prongo
    0

  • Hi, What is the quickest way to move up the rankings for a particular, not hugely competitive keyword? Thanks,

    | Prongo
    0

  • What is a systematic way of doing keyword research? I have been optimizing some sites, keeping in mind the competition and relevance, but I am still not happy with the keywords selected. Can you tell me a standard, internationally accepted way of selecting keywords for any site - a systematic way which will work for any kind of website? Please also give me a list of free tools I can use for it. Thanks

    | ShashankGupta
    0

  • Hi there, First Q&A question 🙂 So I understand the problems caused by having a few secure pages on a site. A few links to the https version of a page and you have duplicate content issues. While there are several posts here at SEOmoz that talk about the different ways of dealing with this issue with respect to secure pages, the majority of this content assumes that the goal of the SEO is to make sure no duplicate https pages end up in the index. The posts also suggest that https should only be used on login pages, contact forms, shopping carts, etc. That's the root of my problem. I'm facing the prospect of switching to https across an entire site. In light of other https-related content I've read, this might seem unnecessary or overkill, but there's a valid reason behind it. I work for a certificate authority, a company that issues SSL certificates, the cryptographic files that make the https protocol work. So there's an obvious need for our site to "appear" protected, even if no sensitive data is being moved through the pages. The stronger push, however, stems from our membership of the Online Trust Alliance. https://otalliance.org/ Essentially, in the parts of the internet that deal with SSL and security, there's a push for all sites to utilize HSTS headers and force sitewide https. PayPal and Bank of America are leading the way in this initiative, and other large retailers/banks/etc. will no doubt follow suit. Regardless of what you feel about all that, the reality is that we're looking at a future that involves more privacy protection, more SSL, and more https. The bottom line for me is: I have a site of ~800 pages that I will need to switch to https. I'm finding it difficult to map the tips and tricks for keeping the odd pesky https page out of the index to what amounts to a sitewide migration. So, here are a few general questions. What are the major considerations for such a switch? Are there any less obvious pitfalls lurking? Should I even consider trying to maintain an index of http pages, or should I start work on replacing (or having Googlebot replace) the old pages with https versions? Is that something that can be done with canonicalization, or would something at the server level be necessary? How is that going to affect my page authority in general? What obvious questions am I not asking? Sorry to be so long-winded, but this is a tricky one for me, and I want to be sure I'm giving as much pertinent information as possible. Any input will be very much appreciated. Thanks, Dennis

    | dennis.globalsign
    0
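
    The two server-level pieces of the sitewide move, sketched for Apache with mod_rewrite and mod_headers: a blanket 301 from http to https, plus the HSTS header once the https version is stable. Canonical tags and internal links would also need to point at the https URLs; the values below are illustrative:

      RewriteEngine On
      # 301 every http request to its https equivalent.
      RewriteCond %{HTTPS} off
      RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

      # Tell browsers to use https for the next year (HSTS).
      Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"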

  • Hi Guys, If you have a baby of a domain which is only a few months old, what should be the priority for getting established once all the on-site work has been done? I know the directories are not as important as they used to be, but is there a top list that should be worked through steadily to get the new site set up? Kind regards

    | ao.com
    0

  • I am creating a process and strategy for publishing videos and would like some feedback on whether it is worth syndicating the videos on multiple third-party sites, either manually or via TubeMogul. Currently the plan is to put each video on our main site, create a video sitemap, and submit it to Webmaster Tools so that hopefully the videos on our site show up in the SERPs. Then wait a week and syndicate on third-party sites such as YouTube, Vimeo, Viddler, etc. Are there any negative side effects? My thinking is that some people may only see a video if it is on YouTube and the other niche sites, because they may search those sites for videos rather than Google/Bing. Your thoughts?

    | elephantseo
    0
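
    For reference, a minimal entry of the kind a video sitemap would contain; all URLs and text below are placeholders:

      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
              xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
        <url>
          <loc>http://www.example.com/videos/widget-demo</loc>
          <video:video>
            <video:thumbnail_loc>http://www.example.com/thumbs/widget-demo.jpg</video:thumbnail_loc>
            <video:title>Widget demo</video:title>
            <video:description>A short demonstration of the widget in action.</video:description>
            <video:content_loc>http://www.example.com/media/widget-demo.mp4</video:content_loc>
          </video:video>
        </url>
      </urlset>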

  • I recently redesigned a website that is now in WordPress. It was previously on some odd, custom platform that didn't work very well. The URLs for all the pages are now more search engine friendly and more concise. The problem is that Google now has all of the old pages and all of the new pages in its index. This is a duplicate content problem, since the content is the same. I have set up a 301 redirect from every old URL to its new counterpart. I was going to do a URL removal request in Webmaster Tools, but it seems I need those pages to return a 404 and not a 301 to do that. Which is better for getting the old URLs out of the index: 404 them and file a removal request, or 301 them to the new URLs? How long will it take Google to find these 301 redirects and keep just the new pages in the index?

    | DanDeceuster
    0

  • Hi, On our website Vliegtickets.nl we currently target one combination of keywords, "flights + destination". We are working on a new website and are rewriting texts for the launch. Our idea is to target the combination "flights + destination" again, but also target other combinations. Our intention is to have a first introduction text targeted at "flights + destination" and have lower parts of the text (descriptions) targeted at the long tail, using keywords like flight / fly to / cheap tickets to, etc. Our page will be divided like this: 1. introduction text, max 150 words (h1 + h2), targeted at "flights + destination"
    2. box 2: city guide snippets / content in the context of the destination, targeted at the city name
    3. box 3: targeted at new generic keywords + combinations with the destination. What is your point of view on box 3? Is it the right way to target a broader range of keywords? Should we use these long-tail keywords (fly to / flights / cheap flights...) also in the meta title / meta description / strong keywords, etc.? Or will it be sufficient if we target one combination and use those synonyms at a lower density? Or is it best to keep the focus on one main combination and put other keyword combinations on other pages? Best regards, Vliegtickets.nl

    | vliegticketsnl
    0

  • Hello all, Think, for instance, of a comparison site for cars, motorbikes, etc., where you have dozens of brands and types of cars and motorbikes, like diesel vs petrol, 4x4 vs sport, etc. In one part of your site you review them in detail, explaining everything. You also have a database with hundreds of models and several specs like top speed, length, engine, etc., so you can automatically create an info page for each of these hundreds of models. How would you make both of them live together on your website? If you add the review to the automated articles, then you would have an inconsistency, as you cannot manually review all the products. On the other hand, doing it separately will lead to very, very similar post titles and URLs (review vs automated versions). In my particular case, I only had the reviews until now, and my site is developed in WordPress. I had all the post URLs directly below the home (mysite.com/review-of-car-x-of-brand-y), and now I am going to add the automated ones. I'm thinking of setting the automated ones up as WP custom post types, with URLs like mysite.com/cars/description-of-car-x-of-brand-y. But I still have the problem with categories, tags, etc. Well, it is a long question, but what do you think about this?

    | antorome
    1

  • I'm looking to get some ideas on restructuring existing content for geo-targeting. Example: a Botox page. This is a hypothetical situation: a laser cosmetics clinic in Atlanta trying to rank for "Atlanta Botox". The existing content is general information about Botox procedures. The problem is editing the content to add "Atlanta" to the H1 tag and page copy. I'm wondering if there are some techniques to make the edits flow better? My idea is to add a geo page for each procedure, but I'm wondering if this might interrupt or confuse users in the navigation funnel. Your thoughts? Thanks!

    | 190west
    0

  • I manage a site that has a home page authority of 69 and an overall domain authority of 63. To improve domain authority, would it help to remove some of the pages that have 0 page authority? There are over 1,000 pages on this site, and I always thought that the more pages you have, the better (generally). But does it actually hurt the site to have pages that Google perceives as having 0 page authority, or does this have no bearing? Any insight would be greatly appreciated.

    | DiscoverBoating
    0
