In my last post I described a method of building Silverlight web sites that search engines can crawl. This matters because traditionally all of a Silverlight site's content lives inside the XAML, where crawlers can't see it. With the method described, all the content remains in the HTML.
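Since the last post only summarizes the method, here is a rough sketch of what such a page might look like. The element names, IDs, and file paths are illustrative assumptions, not the exact markup from that post; the key idea is simply that the content a crawler needs sits in plain HTML, with the Silverlight plug-in layered on top:

```html
<!-- Illustrative sketch only: names, IDs, and paths are assumptions -->
<body>
  <!-- Crawlable content: search engines index this HTML directly -->
  <div id="content">
    <h1>Product Overview</h1>
    <p>All of the site's real copy lives here, in plain HTML.</p>
  </div>

  <!-- The Silverlight plug-in renders the rich UI on top of that content -->
  <object data="data:application/x-silverlight-2,"
          type="application/x-silverlight-2" width="100%" height="100%">
    <param name="source" value="ClientBin/MySite.xap" />
  </object>
</body>
```

The Silverlight application can then read that same HTML at runtime (for example through the `HtmlPage` DOM bridge in `System.Windows.Browser`), so the content is authored once and served both to crawlers and to users running the plug-in.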
The question remains, though: why should I care? The first and most obvious answer is that if you don't, how will anyone ever find your content?
The second answer is a little more complicated. When you write a commercial web application, you want to increase its visibility in search engines because paid search is another way of getting people to your site. What does one have to do with the other, you ask? If your site performs well in the search engine, the cost of your paid search placement goes down: the closer your site is to #1 for your keyword terms, the less you have to pay for targeted paid search.
I’ll describe more benefits of using this method of creating your Silverlight sites in future posts, but if any of this one doesn’t make sense, please comment and I’ll try to explain it better next time.