Can you provide the domain URL in question?
Also, please share an example of what you are experiencing so we can try to replicate it.
By default the system should no longer index system URLs, but if this is happening for your site we'll need to investigate further.
Sidney: www.stringstalker.com is the domain. When a customer searches for "spypoint b47" in Google, the URL shown in the results takes them to the stringstalker.partnersite.com website.
Sent on behalf of LittleBigJoel
I reviewed the notes from our team, and they state that trial sites shouldn't be indexed. In your case, as it's a live site, you'll want to follow the note below.
"Note that trial sites will not be indexed by search engines. Should you want to disable search engine indexing after upgrade, please use a custom robots.txt file or upload an empty sitemap.xml.
If you'd like to use a custom sitemap.xml file, simply upload it to your site. If this file exists, it will be served instead of the BC-generated sitemap.xml."
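For anyone unsure what that custom robots.txt would contain: a blanket block, which asks every compliant crawler to stay out of the entire site, is just two lines. (This is only a sketch of the standard robots.txt syntax; use it only if you really do want no search engine indexing at all after upgrade, as the quoted note describes.)

```
User-agent: *
Disallow: /
```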
Let me know how this goes for you.
Sidney, can you pass this on to the devs again? I do not think what they put in place is working.
I have not had this happen to us yet, but since the update I have seen several people who still have their URLs indexed.
Will do Liam!
Either myself or someone on the team will respond with some more details on this occurrence.
The current version available in production has disabled indexing for trial sites by automatically serving a robots.txt file. See this new trial site URL for example:
Once the site is upgraded, we stop serving the robots.txt file, and customers will have to create their own robots.txt file and upload it to their site root.
The .businesscatalyst system URL is still appearing in Google for the site www.williambaycottages.com.au. The duplicate of the site is affecting its SEO.
We did a search:
How can we ensure that all these URLs are removed? I contacted support, who told me to reindex the site, which I have done, but the problem is still occurring.
The client has threatened to go and get a WordPress site, so I am keen to resolve this issue before they do!
I sent the information to Marius, but I found the issue that is likely getting people's staging URLs indexed by Google, and it is because of the sitemap.xml:
1. If you duplicate an existing site, the sitemap.xml that now shows in FTP comes over with it. Even if you take the new site live and make changes during its development, the system thinks that the sitemap.xml it sees is one you have manually added, so it does not trigger the sitemap.xml updates with new URLs etc.
If you have not figured this out, you do not know about it; you get things indexed and then have to work to get them removed.
2. In the other case, if you have a website which you activate (pay for) so the CMS can be used to develop and test things like payment forms, eCommerce etc., the system now turns on the sitemap.xml (as it is not manual under the new UI).
But it will generate it with the development URL. This is OK, as you should not have linked to the site openly and Google should not see it.
If you take the site live and change to the new domain URL, the sitemap will keep rendering with the old development URL.
If you delete the sitemap.xml in the hope it will generate a new one, that also does not work (I did this on two sites, checked the next day, and no sitemap.xml had been generated).
I knew I could go to the old UI, or use the URL in the admin, to access the old SEO page, and I saw that the sitemap.xml was off. It was not until I turned that back on and waited until the next day that I saw the sitemap.xml created, and with the correct links.
If you are not aware of these (and most won't be; you do not expect this) and you do things like register your sitemap.xml with Google Webmaster Tools, or Google finds it naturally, you are going to run into a variety of issues, from Google thinking you have duplicate content to it registering the URLs you do not want it to.
So with this in mind, there need to be some improvements to avoid and address these issues; the changes you guys made are not 100% just yet.
@jsjs2012 if you have had this, you need to make sure things like the above have not occurred, and then do things like tell Google Webmaster Tools about the development URLs and that you do not want them to show.
That, and ensure you are not linking to those URLs anywhere on the site (this happens more than you think; people believe they have not, but they have). If you give it a few weeks, Google gets the idea: the pages first drop in ranking below your actual content, and then they do not show up at all.
The site in question has been live for a long time, so the XML sitemap has been up and running, and it is also not showing any of the system URLs in it.
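As a quick sanity check on a claim like that, a sitemap can be parsed with a few lines of standard-library Python to confirm that none of its <loc> entries point at the system domain. This is only a sketch; the example.com.au URLs below are placeholders, not the real sitemap.

```python
import xml.etree.ElementTree as ET

# Sitemap elements live in the sitemaps.org namespace.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the <loc> values from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

# Placeholder sitemap standing in for the real file.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com.au/</loc></url>
  <url><loc>http://www.example.com.au/contact</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
# Flag any entry that leaks the system domain.
leaks = [u for u in urls if "businesscatalyst.com" in u]
print(urls)   # → ['http://www.example.com.au/', 'http://www.example.com.au/contact']
print(leaks)  # → []
```

An empty `leaks` list means the sitemap itself is not advertising the system URL to search engines.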
As a comparison - head to google and search the following: site:williambaycottages.businesscatalyst.com
This shows all the pages on the website that google recognises with the system URL.
To compare with another site without the system URLs, do the same search but for:
This site does not show any system URLs, which is what we would like for the William Bay Cottages site.
Can you shed any light on this matter?
Your site went live before the changes the BC team made to prevent this, then.
As to how it happened: since we have had only one case (where the client shared the development link on a public site, which Google grabbed the link from and then crawled the site), it is normally a result of the site being linked somewhere so Google can crawl it.
So, the first step is to ensure this is not coming from the site itself. You need to run a crawl (there are lots of tools out there) over the links on the site and ensure that there is no absolute URL linking to that development domain. If there is, once Google is in there it will treat all relative links as belonging to that development domain and index the URLs as such.
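For the core of that crawl check, here is a minimal standard-library Python sketch: scanning a page's HTML for absolute href/src links that point at a given development domain (the mysite.businesscatalyst.com domain below is just a placeholder). A real check would fetch and walk every page of the site, but the matching logic is the same.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class DevLinkFinder(HTMLParser):
    """Collects href/src attributes whose host matches the development domain."""
    def __init__(self, dev_domain):
        super().__init__()
        self.dev_domain = dev_domain
        self.hits = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                host = urlparse(value).netloc
                # Relative links have an empty netloc, so they never match.
                if host.endswith(self.dev_domain):
                    self.hits.append((tag, value))

def find_dev_links(html, dev_domain="mysite.businesscatalyst.com"):
    """Return all absolute links in `html` that point at `dev_domain`."""
    finder = DevLinkFinder(dev_domain)
    finder.feed(html)
    return finder.hits

# Placeholder page: one safe relative link, two leaking absolute links.
page = """
<a href="/about">About</a>
<a href="http://mysite.businesscatalyst.com/shop">Shop</a>
<img src="http://mysite.businesscatalyst.com/img/logo.png">
"""
print(find_dev_links(page))
```

Any hit it reports is an absolute link that would let Google associate your content with the development domain.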
If that is all good and fixed up, do you have Webmaster Tools set up?
We have been through the entire site to ensure there is no link with the system URL; there was one last week, which we fixed. We then reindexed the site. I do have Webmaster Tools, and the site was submitted long ago. Is there another step I should take now?
Yes, it looks great when you search for William Bay Cottages, but this is not the issue. We are searching for (and trying to remove from the Google results) all the pages that Google has indexed that contain the Business Catalyst URL. To search for these you need to put the following into Google:
One thing everyone needs to understand: whatever you do, the fix won't happen overnight; it may not even happen over a few days. When this has happened to you, fixing it takes time.
Going live with a site and building fully fledged SEO can take up to six months; SEO takes time.
This may work (someone correct me if I am wrong): add the site to Google's Webmaster Tools, i.e. mysite.businesscatalyst.com, and verify it within Google. Once that is done, you can remove that site from Google, and the problem should be fixed. I had my site doing this in the beginning; it was a bad link. I fixed it, then removed the site from Google's search index.
Thanks Lynda, I have tried this, so fingers crossed this is the answer! Thanks again.
Is anything more being done to fix this issue?
Your suggestions are not going to stop the trial site domain from being indexed. Can this please be escalated, as it's a very big problem with many of our sites. One client in particular has paid an SEO company to do work on the site, but they are unable to complete their work because of duplicate content issues.
This site has 40 results in Google for the trial site domain: Google "site:hillslodge.pebbledesign.com"
I noticed that the above williambaycottages.businesscatalyst.com is still being indexed, so the Webmaster Tools solution obviously has not worked.
The trial URL needs to be completely removed from Google's index. Not redirected, removed.
No, the Webmaster Tools approach did not work; none of the suggestions have. My client is also paying hefty fees for SEO, and the dual domain name is also affecting their SEO results. Sadly, they don't plan to renew with BC as of December, when their annual hosting expires, and will be transferring to WordPress hosting. I was told that the BC updates in September would remedy this issue, but sadly nothing has changed and we will lose a client due to this.
Did you not see the update and check the dates on these posts, Klaye?
Go into the domain manager and click the development URL and you will see a new option in there to prevent it being indexed.
It will take time for Google to start ignoring it if it has already picked it up, but the option has been implemented for a little while now.
That's a shame. We too have lost BC clients due to little things like this.
I've thought of a workaround and may put it into action unless BC can resolve the issue. I'll use our client for this example:
- Change the trial URL in the partner portal from hillslodge.pebbledesign.com to hillslodge2.pebbledesign.com (or something similar).
- Create an A Record for hillslodge.pebbledesign.com inside our partner hosting account and point it to our other hosting solution (Media Temple).
- Within that hosting account, add a robots.txt and disallow it from being indexed.
Obviously this is a bit of a stuff around, but I'm pretty sure it will work. Hopefully it won't affect the rankings of the main domain, as hillslodge.pebbledesign.com has a 301 redirect to hillslodge.com.au.
It would be much easier if BC could just resolve this issue themselves as this is not a time effective solution if we need to do it for all our clients.
Liam, yes this has been implemented for a while now and the redirect is working, but the domain is still being picked up, just Google "site:hillslodge.pebbledesign.com" and you can see.
I implemented this change a long time ago too, and Google is still recognising the system URL. It's been a few months now; I would assume the changes would have been recognised by now.
Agree with the above; it's been long enough for Google to stop indexing the trial domain. Something else needs to be done. Will this be looked into, or should I try to implement my workaround above and see if it works?
Have you tried the new disavow feature?
Google thinks your links are stronger at that URL, and it is likely that the site still has links to that development URL.
I can honestly say I never had issues before unless clients linked the development URL, and with the new tool it sorts things out.
This finally appears to be working, but PDFs are not being redirected: https://www.google.com.au/search?q=site%3Ahillslodge.pebbledesign.com&oq=site%3Ahill&sugexp=chrome,mod=0&sourceid=chrome&ie=UTF-8
Is this going to be resolved?