So you’ve downloaded the SDK, and you’ve finally installed Visual Studio 2008. You crack it open and get your “Hello World” project running. Now you’re ready to dive headfirst into a complete Silverlight site.
Getting buy-in from the various organizations will be your first hurdle. They want to know how well the site will perform in the search engines. You don’t really care about that; what you care about is delivering to the web with the highest possible fidelity to what your designer created.
Search Engine Optimization will be the engine that drives traffic to your site, but if the search engine’s spider cannot crawl your site then your site may not even exist. So what are you to do?
The official line from Microsoft is that they are working with the major search engines to help them crawl XAML files. After all, XAML is just XML, and XML is plain text. However, with Silverlight 2 all of those XAML files will most likely be packaged up into a XAP file. A XAP is essentially a ZIP archive, and I doubt search engines will take the time to unzip every XAP just to index its contents.
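To see for yourself what a crawler would be up against, here is a quick sketch: since a XAP is just a ZIP archive, Python’s standard zipfile module can open one and list the XAML packaged inside (the file name passed in is whatever your build produced; nothing Silverlight-specific is assumed beyond the ZIP container).

```python
import zipfile

def list_xaml_entries(xap_path):
    """List the .xaml files packaged inside a Silverlight XAP.

    A XAP is an ordinary ZIP archive, so the standard zipfile
    module can read it directly.
    """
    with zipfile.ZipFile(xap_path) as xap:
        return [name for name in xap.namelist()
                if name.lower().endswith(".xaml")]
```

Point this at any XAP and you’ll see the XAML entries a search engine would have to extract before it could index a single word of your content.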
Well, if you think about it, what do search engines index really well? If you guessed HTML pages, you’d be right. If your HTML page contains nothing but content, it is pure gold for search engines.
Not to get off track here, but I am a big proponent of the Semantic Web. I won’t go so far as to say you should be writing nothing but RDF triples, but I do believe in using elements that make sense. If you are writing a paragraph of text, enclose it in a <p> tag. If you want to emphasize something, enclose it in an <em> tag. Careful use of IDs and classes also makes your HTML much more readable and more friendly to search engines.
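As a small illustration of what I mean (the content and the id/class names here are purely made up), a content-only page fragment might look like this:

```html
<div id="article" class="post">
  <p>Silverlight gives you pixel-perfect fidelity to the design.</p>
  <p>Search engines, however, index <em>content</em>, not plug-ins.</p>
</div>
```

Every word in that fragment is visible to a spider, and the element names tell it exactly what role each piece of text plays.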
Although creating the XSLT will take a bit of work, the benefits go beyond just SEO. I’ll investigate this further in a brief series of posts over the next couple of days.
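To give a taste of what that XSLT might look like, here is a rough sketch under the assumption that your content lives as semantic XHTML and gets transformed into Silverlight text elements (the mapping of <p> to TextBlock and <em> to an italic Run is my own illustrative choice, not a prescribed one):

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:x="http://www.w3.org/1999/xhtml">

  <!-- Each XHTML paragraph becomes a wrapped TextBlock -->
  <xsl:template match="x:p">
    <TextBlock TextWrapping="Wrap">
      <xsl:apply-templates/>
    </TextBlock>
  </xsl:template>

  <!-- Emphasis maps to an italic Run inside the TextBlock -->
  <xsl:template match="x:em">
    <Run FontStyle="Italic"><xsl:apply-templates/></Run>
  </xsl:template>

</xsl:stylesheet>
```

The same semantic source then serves two masters: the raw XHTML is what the spiders index, and the transformed output is what Silverlight renders.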