I'm developing a website in ASP.NET, but I have a problem.

I've written a class that can crawl a given web site (e.g. thenextweb.com) for its links, content, and images.

Now I want to store this indexed data in my table *Crawlr_Data*.

I would like the crawler to run every half hour and update the table with any new links it finds, so they are saved in the database.

How do I run the crawler in the background and update the database? What technology (web services, WCF) should I use, or is there anything else in Visual Studio I can use, so that when I host the website the crawler keeps running and updating the table?

Please suggest. Thanks!

There are two ways to do this with the Microsoft stack.

  1. Create a service that runs on the server. Have the service itself manage when it wakes up and crawls.

  2. Create a console application that performs the crawl. Run the console application as a scheduled task using Windows Task Scheduler, as often as you like.
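The heart of option 2 is a small program that fetches a page, extracts links, and upserts them into the table so reruns don't create duplicates. Here is a minimal sketch of that logic in Python (in practice this would be a C# console app against SQL Server); the table name `Crawlr_Data` comes from the question, but the column name, the use of SQLite, and the sample HTML are all assumptions for illustration:

```python
import re
import sqlite3

def extract_links(html):
    """Pull href values out of raw HTML with a simple regex.
    A real crawler would use a proper HTML parser."""
    return re.findall(r'href="([^"]+)"', html)

def upsert_links(conn, links):
    """Insert links into Crawlr_Data, skipping ones already stored.
    INSERT OR IGNORE is SQLite's upsert shortcut; SQL Server would
    use MERGE or an IF NOT EXISTS guard instead."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS Crawlr_Data (url TEXT PRIMARY KEY)"
    )
    for url in links:
        conn.execute(
            "INSERT OR IGNORE INTO Crawlr_Data (url) VALUES (?)", (url,)
        )
    conn.commit()

if __name__ == "__main__":
    # In the real console app this HTML would come from an HTTP
    # request to the target site (e.g. thenextweb.com).
    sample_html = (
        '<a href="https://example.com/a">A</a> '
        '<a href="https://example.com/b">B</a>'
    )
    conn = sqlite3.connect(":memory:")
    upsert_links(conn, extract_links(sample_html))
    count = conn.execute("SELECT COUNT(*) FROM Crawlr_Data").fetchone()[0]
    print(count)  # 2
```

Because the primary key makes the insert idempotent, Task Scheduler can safely rerun the program every 30 minutes without duplicating rows.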

I suppose there are other ways to do it, so saying there are only two isn't entirely accurate; there are also third-party programs that will do it for you. I expect most if not all of them are implemented as a service. You could also write a program that runs on the server as neither a console application nor a service, but that is generally a bad idea.
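For option 1, "the service itself manages when it wakes up" usually boils down to a timer loop on a background thread. A hedged sketch of that loop, in Python rather than the C# `ServiceBase` pattern a real Windows service would use; `run_periodically` and its `max_runs` parameter are invented here so the example can terminate:

```python
import time

def run_periodically(task, interval_seconds, max_runs=None):
    """Call task() every interval_seconds. A real Windows service
    would loop until it receives a stop request; max_runs exists
    only so this sketch can finish."""
    runs = 0
    while max_runs is None or runs < max_runs:
        task()
        runs += 1
        if max_runs is not None and runs >= max_runs:
            break
        time.sleep(interval_seconds)
    return runs

if __name__ == "__main__":
    # A real service would pass interval_seconds=1800 (half an hour)
    # and the crawl function as the task.
    calls = []
    done = run_periodically(lambda: calls.append(1), 0.01, max_runs=3)
    print(done)  # 3
```

The trade-off versus option 2 is that the service owns its own schedule (and survives logouts), but you take on service installation and lifecycle handling that Task Scheduler would otherwise give you for free.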