Does anyone have experience with importing and exporting huge collections (from a database, in this case) with .NET?

Using the database's own export facilities isn't feasible, because multiple database backends are supported and I need a platform-independent export/import format.

The problem is that the XmlSerializer and DataContractSerializer classes read the data all at once during deserialization; since the data sets can get very large, this is not feasible. Are there solutions that build on the existing serialization infrastructure but support iterative reading from the files?
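To illustrate the iterative-reading idea, here is a minimal sketch in Python using `xml.etree.ElementTree.iterparse`, which yields elements as they are parsed instead of building the whole tree. It is only an analogue of what you would do in .NET (there, the streaming building block would be `XmlReader` rather than `XmlSerializer`); the element names and record shape are made up for the example.

```python
import io
import xml.etree.ElementTree as ET

# A large XML document with many <record> elements. In practice this
# would be a file on disk; a StringIO stands in so the sketch is runnable.
xml_data = io.StringIO(
    "<records>"
    + "".join(f"<record id='{i}'><name>item{i}</name></record>" for i in range(1000))
    + "</records>"
)

def iter_records(source):
    """Yield one record at a time instead of loading the whole document."""
    for event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == "record":
            yield {"id": elem.get("id"), "name": elem.findtext("name")}
            elem.clear()  # drop the processed subtree to keep memory flat

count = sum(1 for _ in iter_records(xml_data))
print(count)
```

The key point is that memory usage stays proportional to one record, not to the whole data set, which is what makes very large exports workable.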


Have a look at FileHelpers. I have used that library before, and it read and validated (via attributes attached to my import class members) about 25k records in a couple of seconds.
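For readers unfamiliar with the declarative style the answer describes, here is a rough Python analogue: FileHelpers drives parsing from a record class annotated with .NET attributes, and below a dataclass's field types play the same role, with records read and converted one at a time. The `Order` type and its fields are invented for the example, not part of FileHelpers.

```python
import csv
import io
from dataclasses import dataclass, fields

# Hypothetical record type standing in for a FileHelpers import class.
# In FileHelpers the file layout is declared with .NET attributes; here
# the dataclass field annotations drive per-field type conversion.
@dataclass
class Order:
    order_id: int
    customer: str
    amount: float

def read_orders(lines):
    """Read and convert records iteratively, one row at a time."""
    for row in csv.reader(lines):
        typed = [f.type(value) for f, value in zip(fields(Order), row)]
        yield Order(*typed)

data = io.StringIO("1,alice,9.99\n2,bob,15.00\n")
orders = list(read_orders(data))
print(len(orders), orders[0].customer)
```

Because `read_orders` is a generator, records can be processed as they arrive instead of materializing the whole file, which matches the question's requirement for iterative reading.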