[OpenMap Users] reading very large shapefiles/dbf files

From: C S <usmsci_at_yahoo.com>
Date: Mon, 4 Oct 2010 09:24:33 -0700 (PDT)

Hi all,

   I have some code that currently reads in shapefiles/dbf files using the OpenMap API. I have run into a bottleneck reading very large shapefiles. The system reading the shapefiles uses Java web services, and it doesn't have enough memory to load and stream multiple files at once — sometimes not even one very large file.

I am currently using the below code to read in the files:

import com.bbn.openmap.layer.shape.ShapeFile;
import com.bbn.openmap.dataAccess.shape.DbfTableModel;
import com.bbn.openmap.dataAccess.shape.input.DbfInputStream;

ShapeFile shapeFile = null;
DbfInputStream dbfFile = null;
DbfTableModel dbfTable = null;

.....
.....

shapeFile = new ShapeFile(shpInfile);
dbfFile = new DbfInputStream(dbfInputStream);
// DbfTableModel reads every DBF record into an in-memory table up front.
dbfTable = new DbfTableModel(dbfFile);

Is there a better way to read the shapefiles and/or dbf files a block at a time to conserve memory at run time? My thought is that I could read one section into memory, do some calculations, then read the next block over the previous block's memory, and so on.
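For the dbf side, the block-at-a-time idea above can be sketched without OpenMap at all, since the DBF layout is a fixed 32-byte header followed by fixed-length records: read the header, then stream records through one reused buffer so memory use stays constant. This is only an illustrative sketch of the technique — the class name `DbfBlockReader`, the `RecordHandler` callback, and the `syntheticDbf()` test helper are all hypothetical, not part of the OpenMap API, and it ignores field decoding:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;

public class DbfBlockReader {

    /** Callback invoked once per record; the byte buffer is reused between calls. */
    public interface RecordHandler {
        void handle(int index, byte[] record);
    }

    /**
     * Streams DBF records one at a time through a single reused buffer,
     * so memory use is constant regardless of file size.
     * Returns the record count declared in the header.
     */
    public static int streamRecords(InputStream in, RecordHandler handler) {
        try (DataInputStream d = new DataInputStream(in)) {
            byte[] head = new byte[32];
            d.readFully(head);                  // fixed 32-byte DBF header
            int recordCount = le32(head, 4);    // bytes 4-7: number of records
            int headerLen   = le16(head, 8);    // bytes 8-9: total header length
            int recordLen   = le16(head, 10);   // bytes 10-11: length of one record
            d.skipBytes(headerLen - 32);        // skip field descriptors + terminator
            byte[] record = new byte[recordLen];
            for (int i = 0; i < recordCount; i++) {
                d.readFully(record);            // overwrite the previous record
                handler.handle(i, record);
            }
            return recordCount;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    /** Builds a minimal in-memory DBF (one 10-char field, two records) for testing. */
    public static byte[] syntheticDbf() {
        int headerLen = 32 + 32 + 1;            // header + one field descriptor + 0x0D
        int recordLen = 1 + 10;                 // delete flag + field data
        byte[] b = new byte[headerLen + 2 * recordLen + 1];
        b[0] = 0x03;                            // dBASE III, no memo
        b[4] = 2;                               // two records (little-endian int)
        b[8] = (byte) headerLen;
        b[10] = (byte) recordLen;
        b[32] = 'N';                            // field name "N"
        b[43] = 'C';                            // field type: character
        b[48] = 10;                             // field length
        b[64] = 0x0D;                           // header terminator
        for (int r = 0; r < 2; r++) {
            int off = headerLen + r * recordLen;
            for (int j = 0; j < recordLen; j++) b[off + j] = ' ';
        }
        b[b.length - 1] = 0x1A;                 // end-of-file marker
        return b;
    }

    static int le16(byte[] b, int off) {
        return (b[off] & 0xFF) | ((b[off + 1] & 0xFF) << 8);
    }

    static int le32(byte[] b, int off) {
        return (b[off] & 0xFF) | ((b[off + 1] & 0xFF) << 8)
             | ((b[off + 2] & 0xFF) << 16) | ((b[off + 3] & 0xFF) << 24);
    }

    public static void main(String[] args) {
        int n = streamRecords(new ByteArrayInputStream(syntheticDbf()),
                (i, rec) -> System.out.println("record " + i + " len " + rec.length));
        System.out.println("total records: " + n);
    }
}
```

Since only `recordLen` bytes are ever held at once, a multi-gigabyte dbf costs no more memory than a tiny one; the handler does the per-record calculations and discards each record before the next is read.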

Any help on this matter is appreciated. Thanks!

-cws

--
[To unsubscribe to this list send an email to "majdart_at_bbn.com"
with the following text in the BODY of the message "unsubscribe openmap-users"]
Received on Mon Oct 04 2010 - 12:25:46 EDT

This archive was generated by hypermail 2.3.0 : Tue Mar 28 2017 - 23:25:09 EDT