As other answers say, Google's crawler (and, I believe, those of other search engines) does not interpret JavaScript. You should also not try to differentiate by user-agent or the like, at the risk of having your site downgraded or blocked for presenting different content to users than to robots. Rather, offer some (perhaps minimal) level of content to visitors who have JavaScript blocked for whatever reason (including the case where the reason is "being a robot";-). That, after all, is the very reason the noscript tag exists: to make it very, very easy to offer such a "minimal level of content" (or more than minimal, if you so choose;-) to non-users of JavaScript!
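As a minimal sketch of what that can look like (the element ids, script name, and links here are illustrative assumptions, not anything prescribed by the question), the script-built content and the noscript fallback live side by side in the same page:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Product list</title>
</head>
<body>
  <!-- Filled in by script for JavaScript-enabled visitors -->
  <div id="products"></div>
  <script src="load-products.js"></script>

  <!-- Only rendered when JavaScript is off, so crawlers and
       no-script users still get real, crawlable content -->
  <noscript>
    <ul>
      <li><a href="/products/widget">Widget</a></li>
      <li><a href="/products/gadget">Gadget</a></li>
    </ul>
  </noscript>
</body>
</html>
```

Since the noscript markup is plain HTML served to everyone, there's no user-agent sniffing involved, and the links inside it are followed by crawlers just like any other links on the page.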