Right now I have to load huge amounts of data from a database into a Vector, but when I load 38000 rows of data, the program throws an OutOfMemoryError exception. What can I do about this?

I think there might be a memory leak in my program. What are good techniques to identify it? Thanks.

Give your JVM more memory (usually via -Xmx/-Xms) or don't load all of the data into memory.

For many operations on huge amounts of data there are algorithms which don't need access to all of it at once. One class of such algorithms is divide and conquer algorithms.
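As a rough sketch of the idea (the RowSource interface, chunk size, and column layout are made up for illustration), an aggregate over a huge table can be computed by recursively splitting the row range and only ever holding one small chunk in memory:

    import java.util.List;

    public class ChunkedAggregator {

        private static final int CHUNK_SIZE = 1000;

        // Hypothetical accessor that fetches only the rows in [from, to) from the database.
        interface RowSource {
            List<double[]> loadRows(int from, int to);
        }

        // Divide and conquer: split the row range until a chunk is small enough,
        // process it, and let it be discarded before touching the next chunk.
        static double sumColumn(RowSource source, int from, int to, int column) {
            if (to - from <= CHUNK_SIZE) {
                double sum = 0;
                for (double[] row : source.loadRows(from, to)) {
                    sum += row[column];
                }
                return sum; // this chunk becomes garbage-collectable here
            }
            int mid = from + (to - from) / 2;
            return sumColumn(source, from, mid, column)
                 + sumColumn(source, mid, to, column);
        }
    }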

If you must have all the data in memory, try caching commonly appearing objects. For example, if you are looking at employee records and they all have a job title, use a HashMap while loading the data and reuse the job titles already found. This can dramatically lower the amount of memory you're using.
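A minimal sketch of that caching idea, assuming a hypothetical TitleCache helper and placeholder Employee/ResultSet code:

    import java.util.HashMap;
    import java.util.Map;

    /** Reuses a single String instance for each distinct job title seen while loading. */
    public class TitleCache {
        private final Map<String, String> cache = new HashMap<String, String>();

        public String canonical(String title) {
            String existing = cache.get(title);
            if (existing == null) {
                cache.put(title, title);   // first time we see this title: remember it
                return title;
            }
            return existing;               // duplicate: reuse the instance already stored
        }
    }

    // Usage while reading rows (Employee and rs are placeholders for your own code):
    //   employee.setJobTitle(titleCache.canonical(rs.getString("job_title")));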

Also, before you do anything at all, use a profiler to see where memory is being wasted, and to check whether things that could be garbage collected still have references floating around. Again, String is a common example: if, for instance, you are using the first 10 chars of a 2000-char string and you have used substring instead of allocating a new String, what you actually have is a reference to a char[2000] array, with two indices pointing at 0 and 10. Again, a huge memory waster.
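For the substring case specifically, the classic workaround on older JVMs (where String.substring shared the parent's backing char array) was to copy the slice through the String constructor. A small illustration, with buildTwoThousandCharString standing in for wherever the big string really comes from:

    String huge = buildTwoThousandCharString();      // placeholder for the real source

    // On older JVMs this keeps a reference to the full 2000-char backing array:
    String leaky = huge.substring(0, 10);

    // This copies just the 10 characters, letting the big array be garbage collected:
    String compact = new String(huge.substring(0, 10));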

You can try increasing the heap size:

 java -Xms<initial heap size> -Xmx<maximum heap size>

The default is

java -Xms32m -Xmx128m

Do you really need to have such a large object stored in memory?

Depending on what you have to do with that data, you might want to split it into smaller chunks.

Load the data section by section. This will not let you work on all the data at the same time, but you won't have to change the amount of memory given to the JVM.
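A hedged sketch of section-by-section loading over JDBC, using made-up table and column names and LIMIT/OFFSET paging (the exact paging syntax depends on your database):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;

    public class PagedLoader {

        private static final int PAGE_SIZE = 5000;

        // Table and column names are invented for the example; adjust to your schema.
        public void processAll(Connection conn) throws SQLException {
            String sql = "SELECT id, payload FROM big_table ORDER BY id LIMIT ? OFFSET ?";
            int offset = 0;
            while (true) {
                List<String> page = new ArrayList<String>(PAGE_SIZE);
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    ps.setInt(1, PAGE_SIZE);
                    ps.setInt(2, offset);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            page.add(rs.getString("payload"));
                        }
                    }
                }
                if (page.isEmpty()) {
                    break;                 // no more rows
                }
                process(page);             // work on this chunk only
                offset += page.size();     // the chunk can now be garbage collected
            }
        }

        private void process(List<String> page) {
            // ... whatever you need to do with each chunk ...
        }
    }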

Maybe optimize your data classes? I've seen a case where someone was using Strings in place of native datatypes such as int or double for every class member; that alone produced an OutOfMemoryError when storing a relatively small number of data objects in memory. Also check that you are not duplicating your objects. And, of course, increase the heap size:

java -Xmx512M (or whatever you deem necessary)
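To illustrate the point about data classes (the field names here are invented), compare an all-String layout with one that uses primitives:

    // Wasteful: every field is a String, so each value drags in a separate object
    // plus a char[] even when it is really just a number.
    class MeasurementAsStrings {
        String sensorId;
        String timestamp;
        String value;
    }

    // Leaner: primitives are stored inline in the object, with no extra allocations.
    class Measurement {
        int sensorId;
        long timestamp;   // e.g. epoch millis
        double value;
    }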

You can run your code with a profiler to understand how and why the memory is being eaten up. Debug your way through the loop and watch what is being instantiated. There are any number of them: JProfiler, Java Memory Profiler, see the list of profilers here, and so on.

Let your program use more memory or, better yet, rethink the strategy. Do you really need that much data in memory?

I know you are trying to read the data into a vector - otherwise, if you were trying to display it, I would have suggested NatTable. It is designed for reading large amounts of data into a table.

I believe it could prove useful for other readers here.

Use a memory-mapped file. Memory-mapped files can basically grow as large as you want, without hitting the heap. It does require that you encode your data in a decoding-friendly way. (For example, it might make sense to reserve a fixed size for every row in your data, so that you can quickly skip a number of rows.)
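A small sketch of that approach with java.nio, using a made-up file name and a fixed two-field record layout so that row i sits at offset i * ROW_SIZE:

    import java.io.RandomAccessFile;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;

    public class MappedRows {

        // Fixed record layout (invented for the example): one long id + one double value.
        private static final int ROW_SIZE = Long.BYTES + Double.BYTES;

        public static void main(String[] args) throws Exception {
            int rows = 38000;
            try (RandomAccessFile file = new RandomAccessFile("rows.dat", "rw");
                 FileChannel channel = file.getChannel()) {

                MappedByteBuffer buffer =
                        channel.map(FileChannel.MapMode.READ_WRITE, 0, (long) rows * ROW_SIZE);

                // Write each row directly at its offset; apart from the buffer object
                // itself, none of this data lives on the Java heap.
                for (int i = 0; i < rows; i++) {
                    int offset = i * ROW_SIZE;
                    buffer.putLong(offset, i);
                    buffer.putDouble(offset + Long.BYTES, i * 0.5);
                }

                // Because every row has a fixed size, skipping straight to row 25000 is trivial.
                int offset = 25000 * ROW_SIZE;
                long id = buffer.getLong(offset);
                double value = buffer.getDouble(offset + Long.BYTES);
                System.out.println(id + " -> " + value);
            }
        }
    }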

Preon lets you deal with that easily. It's a framework that aims to do for binary encoded data what Hibernate has done for relational databases, and JAXB/XStream/XmlBeans for XML.