Search Engine Blekko Releases New Update

The search engine Blekko originally came out as a response to the power Google held over the market. Their complaint was that Google wasn’t doing enough to provide fresh, spam-free content, was invading web users’ privacy, and was withholding too much information about its indexing methods. Now, in line with their original proclamation, the latest update to their web crawler focuses squarely on fresher, more relevant content and enhanced SEO data.

Founding of Blekko

Blekko was founded in 2010 by Richard Skrenta, ironically thought of by many as the inventor of what we know today as the computer virus. Blekko stood by the idea that a search engine is a tool of the community: that all information involved in its compilation should be released, and that spam should be excluded completely because it contributes nothing to the user.

SEO Data & Blekko

One of the big draw-cards for Blekko from an industry perspective, of course, is the range of SEO data they provide, free for the asking, and their latest update has really brought that to the forefront: SEO data will now be continuously refreshed. Webmasters have always had access to SEO information about any webpage that appears on the SERP (rather than just their own, as on Google) by clicking a link just below the result. It provides tools such as:

* The link profile
* Crawl data
* Duplicate content detection
* Direct and indirect website comparisons
* Anchor text data
* A list of pages indexed

This SEO information applies to Google only to the extent that the two ranking algorithms are the same, so think again before you go crazy optimising your page with Blekko analytics in the hope that it ranks better in Google. That said, Blekko is ruthless in policing spam, even using human editors to inspect flagged sites individually, so getting a thumbs-up from Blekko makes it immensely unlikely that your site will be penalised for spamming by Google.
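To give a sense of what one of those tools involves, duplicate content detection is commonly built on comparing overlapping word "shingles" between pages. The Python sketch below is a minimal illustration of that general technique only; it is an assumption for illustration, not Blekko’s published method.

```python
def shingles(text, k=5):
    """Split text into overlapping k-word shingles (a common dedup technique)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(doc_a, doc_b, k=5):
    """Jaccard similarity of two documents' shingle sets, from 0.0 to 1.0."""
    a, b = shingles(doc_a, k), shingles(doc_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Two pages scoring above a chosen threshold would be flagged as duplicates.
page_one = "fresh spam free content is the goal of every search engine index"
page_two = "fresh spam free content is the goal of every modern search index"
print(f"similarity: {similarity(page_one, page_two):.2f}")
```

A production system would hash the shingles and use sketching tricks to compare billions of pages cheaply, but the underlying idea is the same.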

Spam Free Fresh Content

Blekko’s crawler, Scoutjet, goes through around 100 million web pages per day and maintains an index of just over 5 billion pages, built literally from nothing. (Rumour has it that Google’s current index could be as big as 1 trillion web pages.) Blekko say that quality assurance is the reason for keeping the index comparatively small: providing the freshest spam-free content is incredibly challenging when an index grows too large, because the search engine can no longer give individual attention to anomalies and must rely entirely on the automated indexing procedure and user reports.
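A quick back-of-the-envelope calculation, using only the figures quoted above, shows why the smaller index is easier to keep fresh:

```python
# Refresh-cycle arithmetic using the figures quoted in this article.
index_size = 5_000_000_000   # ~5 billion pages in Blekko's index
crawl_rate = 100_000_000     # ~100 million pages crawled per day

print(f"Full index refresh: ~{index_size / crawl_rate:.0f} days")  # ~50 days

# At the same crawl rate, the rumoured 1-trillion-page index would take
# around 10,000 days to revisit in full.
rumoured_index = 1_000_000_000_000
print(f"1-trillion-page index: ~{rumoured_index / crawl_rate:,.0f} days")
```

In other words, at its current crawl rate Blekko can revisit its entire index roughly every 50 days, a cycle that would be impossible at the scale Google is rumoured to operate at.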

Summary of Improved Features

* Top websites are updated each hour.
* Crawl and index will be continuously updated.
* SEO data will be continuously updated.
* Index size is now over 5 billion pages, with a crawl rate of 100 million pages per day.
* Continuous updates for backlink counts.