1 Reply Latest reply on Jul 8, 2008 8:42 AM by bldrbldr

    deep-linking SEO problem

      It used to be that you could create multiple entry points to a single SWF using a sitemap. For instance, you could define a URL such as http://www.example.com/store?product=100 and have that URL return HTML content with the description of the product for search-engine consumption, and then load the SWF, which navigates to that particular product. However, after last week's announcement, Googlebot will happily press every link button in the app and read the text of everything that's reachable from this URL. If you have properly designed this entry point, it will be indexed using the text from your entire site! In fact, every entry point will be indexed with exactly the same content: all the text of your entire site. Does the crawler stop when the 'state' of the page is changed (i.e. when the value for a bookmark changes to represent a state change)?
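For reference, here is a minimal sketch of the entry-point pattern described above: the server returns crawler-readable HTML for a product URL and embeds the SWF with the same product id as a deep-link parameter. All names here (PRODUCTS, render_entry_point, store.swf) are illustrative assumptions, not from the original post.

```python
# Hypothetical sketch: one HTML entry point per product URL.
# The plain HTML text is what the search engine indexes; the SWF
# receives the same product id via flashVars and navigates to it.

PRODUCTS = {
    "100": ("Blue Widget", "A sturdy blue widget for everyday use."),
}

def render_entry_point(product_id):
    """Return the HTML for /store?product=<id>."""
    name, description = PRODUCTS[product_id]
    return f"""<html>
<head><title>{name}</title></head>
<body>
  <!-- Plain HTML the crawler can index -->
  <h1>{name}</h1>
  <p>{description}</p>
  <!-- The SWF gets the same deep link via flashVars -->
  <object data="store.swf" type="application/x-shockwave-flash">
    <param name="flashVars" value="product={product_id}"/>
  </object>
</body>
</html>"""
```

The problem described in this thread is that once the crawler executes the SWF itself, the per-product HTML above no longer determines what gets indexed for that URL.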

      How can we get around this?