I have a free library designed to support large collections efficiently. I've tested it on my own PC with 24 GB of RAM, but I want to try it on a much bigger system, with as much as 1 TB. Most hosting solutions don't offer that kind of memory, and I only need access for short periods of time.

One thing I have tested is a collection of 500 million objects with 12 fields each, where the total GC time stays below 0.11 seconds. I have another test that stores 128 billion elements of just one bit each.
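Since the library itself isn't named here, this is only a minimal sketch of what a 128-billion-bit test could look like, assuming a hand-rolled chunked bit set in Java (java.util.BitSet can't index past 2^31 bits, so the long index has to be split across chunks). It needs a large heap, e.g. -Xmx20g, since 128 billion bits is 16 GB:

    // Hypothetical chunked bit set; not the actual library from the question.
    public class ChunkedBitSet {
        private static final int CHUNK_BITS = 1 << 30;   // 2^30 bits (128 MB) per chunk
        private final long[][] chunks;

        public ChunkedBitSet(long totalBits) {
            int chunkCount = (int) ((totalBits + CHUNK_BITS - 1) / CHUNK_BITS);
            chunks = new long[chunkCount][CHUNK_BITS / 64];   // eager 16 GB allocation
        }

        public void set(long index) {
            chunks[(int) (index / CHUNK_BITS)][(int) ((index % CHUNK_BITS) / 64)]
                |= 1L << (index % 64);
        }

        public boolean get(long index) {
            return (chunks[(int) (index / CHUNK_BITS)][(int) ((index % CHUNK_BITS) / 64)]
                & (1L << (index % 64))) != 0;
        }

        public static void main(String[] args) {
            long totalBits = 128_000_000_000L;            // 128 billion bits = 16 GB
            ChunkedBitSet bits = new ChunkedBitSet(totalBits);
            for (long i = 0; i < totalBits; i += 1_000_003) {  // touch a sparse sample
                bits.set(i);
            }
            System.out.println(bits.get(1_000_003L));     // prints: true
        }
    }

Keeping the bits in a few hundred large long[] chunks is presumably also what keeps GC cheap in a test like this: the collector only sees a handful of array references instead of billions of objects.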

The library/test itself is small, so I don't need much beyond a large amount of main memory.

Do you have any suggestions for how I could do this testing without buying my own server with 96 or 192 GB?

EC2 has high-memory instances with up to 68.4 GB each, and they charge by the hour. Granted, that's not 100 GB of memory in a single box, but if you stack a few of them together... (see the range-splitting sketch below).
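For the "stack a few of them" idea, here's a hedged sketch of splitting one logical collection across instances by contiguous index ranges. The node count and element total below are illustrative assumptions, not EC2 figures:

    // Hypothetical range sharding across instances; numbers are made up.
    public class RangeSharding {
        static final int NODES = 2;                        // e.g. two 68.4 GB instances
        static final long TOTAL_ELEMENTS = 128_000_000_000L;
        static final long PER_NODE = (TOTAL_ELEMENTS + NODES - 1) / NODES;

        // Map a global element index to the instance that owns it...
        static int nodeFor(long index)     { return (int) (index / PER_NODE); }
        // ...and to the local offset within that instance's own collection.
        static long localIndex(long index) { return index % PER_NODE; }

        public static void main(String[] args) {
            long i = 100_000_000_000L;                     // element #100 billion
            System.out.printf("node %d, local offset %d%n", nodeFor(i), localIndex(i));
        }
    }

Each instance then only has to hold PER_NODE elements in its own heap, so a few mid-sized instances can stand in for one very large machine.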

You should take a look at Amazon EC2 or Google App Engine.