I'm using a DataContext to store data in a local database, and I've discovered that performance can get very slow as the number of records grows.

The table holds about 6000 records.

If I insert one record and call SubmitChanges(), the SubmitChanges() call takes about 1.x seconds.

Is there any way to improve the performance, or is this simply a limitation?


I haven't tried it myself, but try not calling SubmitChanges() after each insert. Perform all 6000 record inserts and then call SubmitChanges() only once at the end. The DataContext keeps track of all the pending changes you've made.

var _db = new YourDbContext();

for (int i = 0; i < 6000; i++)
{
    FoobarObj newObject = new FoobarObj()
    {
        Name = "xyz_" + i.ToString()
    };

    _db.FoobarObjects.InsertOnSubmit(newObject);
}

_db.SubmitChanges();

One second seems far too long; considerably longer than I've measured. In a trivial test I just did:

using (var dc = new MyDc(@"isostore:/stuff.sdf"))
{
    dc.MyDatas.InsertAllOnSubmit(
        Enumerable.Range(0, 6000).Select(i => new MyData { Data = "Hello World" }));
    dc.SubmitChanges();
}

I can insert at a rate of 1–2 items per millisecond in batches of 6000, and that rate stayed fairly steady as the data size continued to grow. If I change it to smaller batches (say 5 items each), it drops to around 10 ms per item, because there's a lot of overhead involved in initializing the DataContext, which dominates the execution time.
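If you want to reproduce the measurement, a rough timing sketch looks like the following. (`MyDc` and `MyData` are the same hypothetical LINQ to SQL types as in the snippet above, so the absolute numbers will depend on your actual schema and device.)

```csharp
using System;
using System.Diagnostics;
using System.Linq;

// Time one large batch: 6000 inserts tracked by the DataContext,
// flushed to the database with a single SubmitChanges() call.
var sw = Stopwatch.StartNew();
using (var dc = new MyDc(@"isostore:/stuff.sdf"))
{
    dc.MyDatas.InsertAllOnSubmit(
        Enumerable.Range(0, 6000).Select(i => new MyData { Data = "Hello World" }));
    dc.SubmitChanges();   // one round trip for the whole batch
}
sw.Stop();
Console.WriteLine("ms per item: {0:F2}", sw.ElapsedMilliseconds / 6000.0);
```

Running the same loop with a fresh DataContext and a SubmitChanges() per small batch should show the per-item cost climbing toward the ~10 ms figure, which is the per-context overhead rather than the insert itself.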

So something else must be going on. Can you give some details about what you're inserting? Perhaps a code sample which demonstrates the issue end to end?