I've got a client whose system and secure domains are ranking very high on Google. My SEO advisor has suggested that a key way to eliminate these URLs from Google is to disallow the content through robots.txt. Given BC's unique handling of system and secure domains, I'm not sure this is even possible, as any disallow rules I've seen or used before have targeted directories rather than absolute URLs, and I haven't seen any mention of this possibility elsewhere. Any help or advice would be great!
Under Site Manager > Pages, when accessing a specific page, you can open the SEO Metadata section and tick “Hide this page for search engines”.
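For reference, that option corresponds to the standard robots noindex directive in the page's head. A minimal sketch of the equivalent markup, assuming the checkbox injects the usual robots meta tag (I haven't verified exactly what BC generates):

    <meta name="robots" content="noindex">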
Aside from this, using the robots.txt file is indeed an efficient way of instructing search engine robots which pages are not to be indexed.
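On the directories-versus-absolute-URLs concern from the original question: Disallow rules in robots.txt are matched against the URL path on the host that serves the file, so you block paths (individual pages or whole directories), not full http://... URLs. A minimal sketch, with hypothetical paths standing in for whatever secure/system pages are actually being indexed:

    User-agent: *
    Disallow: /secure-area/
    Disallow: /some-system-page.html

Keep in mind that robots.txt is read per host, so for these rules to cover the secure/system domain the file would need to be served on that domain as well; whether BC serves the same robots.txt on worldsecuresystems.com is worth confirming.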
Alex, if you read a bit further into Mike's post there is a key detail he mentions, and also look at the date of the post.
That post was from June. He mentions secure, i.e. worldsecuresystems, which is a separate domain for your secure content. Since that time the engineers have made updates to prevent these pages from showing up in search engines.