Is it feasible to run a lot of websites from a single code base?

For instance, I have one site that derives its connection string from the domain name or subdomain name. So, depending on which domain/subdomain is hitting the site, it returns content that is stored in a database specifically for that site.
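A minimal sketch of that lookup, in Python rather than ASP.NET and with hypothetical host names and connection strings, might look like this: the incoming request's Host header is normalized and mapped to a per-site connection string.

```python
# Hypothetical mapping of host names to connection strings.
# In an ASP.NET app this would typically live in configuration.
CONNECTION_STRINGS = {
    "example.com": "Server=db1;Database=ExampleMain;",
    "blog.example.com": "Server=db1;Database=ExampleBlog;",
}

FALLBACK = "Server=db1;Database=Fallback;"


def connection_string_for(host: str) -> str:
    # Normalize: drop any port suffix and lower-case the host name,
    # since Host headers are case-insensitive and may include a port.
    host = host.split(":")[0].lower()
    return CONNECTION_STRINGS.get(host, FALLBACK)
```

The same idea carries over directly to ASP.NET, where the host is read from the request and used to select a connection string at the start of each request.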

What kinds of issues might occur from doing this, particularly with ASP.NET?

It's perfectly acceptable.

Just note that anybody can alter the domain name you receive (as long as you've set up a host header for it), so make sure you don't go off and build something like '' while relying only on the domain name for security (you'd be mad to, obviously).
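Since the client controls the Host header, one common safeguard is to check it against a whitelist of hosts you actually serve, and reject anything else, rather than treating an unrecognized host as meaningful. A small sketch with hypothetical host names:

```python
# Hypothetical whitelist of hosts this application is configured to serve.
ALLOWED_HOSTS = {"example.com", "blog.example.com"}


def is_known_host(host: str) -> bool:
    # Strip any port and compare case-insensitively. A spoofed or
    # unexpected Host header simply fails the check; it is never
    # used as an authentication or authorization decision.
    return host.split(":")[0].lower() in ALLOWED_HOSTS
```

The point is that the host selects *which* site's content to serve, never *whether* the caller is trusted.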

I see no problem with it.

It works and it's proven. See DotNetNuke for just one example of this.

A request comes in. Regex/character-match the domain name. Load the settings for that domain (base path to images, CSS, config, pages and so forth) and off you go.
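That per-request flow can be sketched as follows (Python for illustration; the patterns, hosts, and paths are made up). Each incoming host is matched against a list of regexes, and the first match supplies that site's settings:

```python
import re

# Hypothetical per-domain settings, keyed by a compiled regex.
# Order matters: the first matching pattern wins.
SITE_SETTINGS = [
    (re.compile(r"^(www\.)?example\.com$"),
     {"images": "/sites/example/img", "css": "/sites/example/css"}),
    (re.compile(r"^[^.]+\.blogs\.example\.com$"),
     {"images": "/sites/blogs/img", "css": "/sites/blogs/css"}),
]


def settings_for(host: str) -> dict:
    # Normalize the host, then walk the pattern table.
    host = host.split(":")[0].lower()
    for pattern, settings in SITE_SETTINGS:
        if pattern.match(host):
            return settings
    raise LookupError(f"no site configured for {host!r}")
```

In practice the settings table would be loaded from a database or config file, and the resolved settings cached per host so the matching cost is paid once, not on every request.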

The gotcha to watch for is if the application is both a) storing data in memory and b) using the same application space. So if, for example, you want to serve up two different blogs and you want the data to be resident in memory (if, say, your back-end store is XML and you don't want to parse the XML on every request), then you'll need to make sure that ASP.NET sees each call as a separate application (both of which can point to the same file-system folder and therefore use the same files).

I ran into this exact situation when coding a multi-blog data provider for BlogEngine.NET. It uses a single code base to serve up different blogs based on the requested URL. However, since BlogEngine.NET holds its data in memory, the data provider won't work unless IIS is configured so that each blog is its own application.