Friday, February 25, 2011

Web Crawlers | Search Engine Robots | Search Engine Spiders

A web crawler (also known as a web spider or web robot) is a program or automated script that browses the internet looking for web pages to process. Many applications, most notably search engines, crawl websites every day in order to find up-to-date data. Most web crawlers save a copy of each visited page so they can index it later.
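
To make the idea concrete, here is a minimal, illustrative crawler sketch in Python (standard library only). It is not how any real search engine is built; the seed URL and the page limit are just placeholders:

```python
# Minimal illustrative crawler sketch -- not how any real search engine
# is implemented; the seed URL and limits below are placeholders.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed, max_pages=10):
    """Fetch pages breadth-first, keep a copy of each, and follow links."""
    queue, seen, saved = deque([seed]), {seed}, {}
    while queue and len(saved) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip pages that fail to load
        saved[url] = html  # a real crawler would store this copy for indexing
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return saved


if __name__ == "__main__":
    pages = crawl("https://example.com/")  # placeholder seed URL
    print(f"Saved {len(pages)} page(s)")
```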

.:: Flash Content and Web Spiders ::.

Flash is cool; in fact, it's much cooler than HTML. It's dynamic and cutting edge. Unfortunately, search engine spiders use trailing-edge technology.

Remember: a search engine spider is roughly equivalent to a version 2.0 web browser. Spiders simply can't interpret newer technologies, such as Flash. So even though that Flash animation may amaze your visitors, it's invisible to the search engines. If you're using Flash to add a bit of spice to your site, but most of your pages are written in standard HTML, this shouldn't be a problem.
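
To get a feel for what such a simple, text-only spider actually sees, here is a small Python sketch run against a made-up HTML snippet; the parser pulls out the plain text, and the embedded .swf movie contributes nothing it can index:

```python
# A crude approximation of what a text-only spider extracts from a page.
# The sample markup is made up; note that the embedded .swf movie
# contributes no indexable text at all.
from html.parser import HTMLParser

SAMPLE_PAGE = """
<html><body>
  <h1>Welcome to our site</h1>
  <p>Plain HTML text that a spider can read and index.</p>
  <object type="application/x-shockwave-flash" data="intro.swf"
          width="600" height="400"></object>
</body></html>
"""


class TextOnlySpider(HTMLParser):
    """Collect only the plain text a simple crawler would index."""
    def __init__(self):
        super().__init__()
        self.text = []

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())


spider = TextOnlySpider()
spider.feed(SAMPLE_PAGE)
print(spider.text)
# ['Welcome to our site', 'Plain HTML text that a spider can read and index.']
# Everything inside intro.swf is invisible to this kind of crawler.
```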


But if you've created your entire site in Flash, you've got a serious problem getting it into the engines, and you will have to take additional steps to optimize the Flash content for search.

Some clients want to develop Flash sites, so handling a Flash site can be a tough job and a real challenge for SEO!

That is why a page built entirely with PHP is not really a problem for SEO: by the time you (or a spider) request a PHP page, the server has already converted it to plain HTML.
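
As a rough sanity check of that point, the short Python sketch below requests a hypothetical PHP-driven URL and inspects what comes back; a spider making the same request simply receives an ordinary text/html response, never the PHP source:

```python
# A quick way to confirm that a server-side (.php) page reaches the client
# as ordinary HTML. The URL is a made-up placeholder; swap in a real page.
from urllib.request import urlopen


def fetch_preview(url):
    """Return the Content-Type header and the first bytes of the body."""
    response = urlopen(url, timeout=10)
    content_type = response.headers.get("Content-Type", "")
    preview = response.read(200).decode("utf-8", "replace")
    return content_type, preview


if __name__ == "__main__":
    # Hypothetical PHP-driven page; replace with a real URL to try it.
    content_type, preview = fetch_preview("https://example.com/index.php")
    print(content_type)  # typically "text/html; charset=UTF-8"
    print(preview)       # rendered HTML markup, not PHP source code
```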

I have never built Flash-based websites; I make my sites attractive with the help of CSS. Still, it would be good for search engine spiders to be updated to handle these newer technologies.


Source: http://www.googlecommunity.com/forum/