What is the use case for loading 400,000 records into memory at one time? I honestly can't think of a single scenario where that would be a good idea. Upgrading RAM and increasing the amount of memory allocated to the JVM for ColdFusion may help, but without knowing how much data is actually contained in each record (number of columns, datatypes, etc.) it is impossible to say for certain.
If you elaborate more on what you are doing with all of that data, people might be able to offer some constructive suggestions on how to accomplish it more efficiently.
I have to agree with what Carl said. I can't fathom a reason to load that many records at once. Also, why Access? You could easily convert it to a real DB like MySQL, which handles large datasets much more easily.
Thanks for the answer.
This is the first time I have worked with so many records, so I did not think about it.
I did as usual and loaded the whole table.
It worked the first time (strangely). Then I checked one field of all the records in one loop,
and ran a small process when the data in that field was not correct.
That worked, with an acceptable processing time.
I had to do it again, because I had not checked the right field,
and that is when I got this memory error message.
Then I thought about how it works,
and I understood that it loads the whole table (all the data) into memory.
So now I load only the ID and the wanted field,
and in the loop (which I now do in steps, with startrow and endrow), if the field is not correct, I run a new query to load the full record (all fields) for the small process, which needs all the fields to make a copy of them.
It seems to work,
but very, very slowly.
I wait and wait; I am not sure it is working correctly, but I hope so.
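In case it helps the discussion, a batched loop like the one described might look roughly like this in CFML. This is only a sketch: the datasource, table, and column names are hypothetical, and the "not correct" test is a placeholder for whatever check applies to the field.

```cfml
<!--- Load only the ID and the field to check; table/column names are hypothetical --->
<cfquery name="qKeys" datasource="myAccessDSN">
    SELECT id, fieldToCheck FROM myTable ORDER BY id
</cfquery>

<!--- Walk the result in batches (startrow/endrow style) to limit memory use --->
<cfset batchSize = 1000>
<cfloop from="1" to="#qKeys.recordCount#" index="startRow" step="#batchSize#">
    <cfset endRow = min(startRow + batchSize - 1, qKeys.recordCount)>
    <cfloop from="#startRow#" to="#endRow#" index="row">
        <!--- Placeholder test: replace with whatever "not correct" means here --->
        <cfif NOT len(trim(qKeys.fieldToCheck[row]))>
            <!--- Fetch the full record only when the correction is needed --->
            <cfquery name="qFull" datasource="myAccessDSN">
                SELECT * FROM myTable
                WHERE id = <cfqueryparam value="#qKeys.id[row]#" cfsqltype="cf_sql_integer">
            </cfquery>
            <!--- ... run the small correction process on qFull here ... --->
        </cfif>
    </cfloop>
</cfloop>
```

The key point of this shape is that only two narrow columns are held in memory at once, and the wide per-record query runs only for the rows that actually need fixing.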
I started the same job on Windows 7 with CF9 and 4GB of RAM.
With the new way of processing, it takes around 1 hour to process 10,000 records.
I do it step by step in the loop (startrow and endrow),
and I wait until the end to check the full result.
It is not easy to see whether it is processing correctly or doing nothing.
On Windows 7, I can open the .mdb file (despite the lock) and see the progress.
On Windows XP, I can refresh Windows Explorer and see the .mdb file size increasing.
(XP and 7 do not work the same way.)
I will let you know the results in the two environments.
Thanks again for your attention and advice.
The PC has only 1GB of RAM.
If I upgrade the memory to 2GB, should that help?
Yes, it will very likely help. However, doubling the RAM is not enough on its own. I would also go along with what Carl says. The out-of-memory error suggests that the amount of memory available to the Java Virtual Machine (JVM) that powers ColdFusion is low in these circumstances. In fact, one can guess that it is only a few hundred MB.
I am on MySQL and, just days ago, pulled in 10,000,000 five-column records in a test. MS Access is not as optimized as specialist relational database management systems like MySQL and SQL Server. Its inefficiencies cost extra memory. That would explain why your system comes to a standstill after swallowing 200MB. So, after increasing the RAM, you should increase the amount of memory allocated to the JVM to a value an order of magnitude higher than 200MB. Let us say, 1024MB.
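For reference, on a standard ColdFusion server install the JVM heap ceiling is set by the -Xmx argument in jvm.config (the exact path and the other arguments on the line vary by CF version and edition; the values below are illustrative only):

```
# {cf_root}/runtime/bin/jvm.config  (path varies by CF version/edition)
# -Xms = initial heap, -Xmx = maximum heap; restart the ColdFusion service after editing
java.args=-server -Xms512m -Xmx1024m
```

The same value can also be changed in the ColdFusion Administrator under Server Settings > Java and JVM (Maximum JVM Heap Size).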
Better still if you go with pagination.
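Since Jet SQL (Access) has no LIMIT/OFFSET, one common pagination idiom uses nested TOP queries. A sketch, with hypothetical datasource/table/column names, fetching the third "page" of 1000 rows and assuming a unique id column to order by:

```cfml
<!--- Skip the first 2000 ids with a nested TOP subquery, then take the next 1000 --->
<cfquery name="qPage" datasource="myAccessDSN">
    SELECT TOP 1000 *
    FROM myTable
    WHERE id NOT IN (SELECT TOP 2000 id FROM myTable ORDER BY id)
    ORDER BY id
</cfquery>
```

This is less convenient than LIMIT/OFFSET in MySQL or PostgreSQL, which is one more argument for migrating off Access.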
I want to elaborate on something BKBK said:
MS Access is not as optimized as specialist relational database management systems like MySQL and SQL Server. Its inefficiencies cost extra memory. That would explain why your system comes to a standstill after swallowing 200MB. So, after increasing the RAM, you should increase the amount of memory allocated to the JVM to a value an order of magnitude higher than 200MB. Let us say, 1024MB.
ColdFusion cannot communicate directly with databases stored in Microsoft Access .MDB format at the Java level (via JDBC drivers); instead, ColdFusion uses a JDBC-ODBC bridge to talk to an ODBC datasource on your server (or workstation). ODBC then uses the MS Jet database engine to interact with your .MDB file. So, lots of moving pieces. And with MS Jet there is no database engine running as a service (as there is with MySQL, MSSQL, PostgreSQL, Oracle, or most other relational database systems). The MS Jet engine cannot cache execution plans, store recordsets in cache memory, keep an effective transaction log, handle more than a few simultaneous users, or perform many of the other optimizations that a modern relational database system has. It is therefore inherently less efficient and performant than a modern relational database, which is why it is generally considered poor practice to use a Microsoft Access database as the back end of a web application. You would be better served by migrating that data into one of the previously mentioned database systems (MySQL and PostgreSQL are open source, so you don't have to buy them to use them).
Now onto other issues. You talk about doing some looping to process batches of records (and I'm assuming some UPDATE queries are involved). Usually there are better ways to tackle this using more efficient query design (and especially if you can utilize one of those great relational database systems mentioned previously). Can you describe what you are doing to the records in your table, and why? We might be able to offer some alternatives if we understand what you need to accomplish.
Cheers, Carl. What a delight to read!
Thanks for all.
This data comes from an Excel file which contains more than 400,000 lines.
I converted it to Access for easier manipulation with CF (to my knowledge).
I need to build a management system with it, to create/update/delete records via a form.
(And more than that, but I can't explain it all.)
One of the outputs must be an Excel file too.
I have often used Access for small sites, so I am very familiar with it;
it is very easy and flexible,
and there is a single file to manipulate (copy it, and you get everything).
Since there will be a single user on this,
I will keep Access as far as I can, if possible.
The job I ran was to eliminate duplicate records from the original Excel file.
I write a record into a new table, and write the next one only if it is not already present in the new table.
A nicer solution for eliminating duplicates may exist, but I do not know it.
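For what it's worth, duplicates can often be removed in a single SQL statement rather than row by row. A sketch, with hypothetical datasource/table/column names, using Jet SQL's SELECT ... INTO (make-table query):

```cfml
<!--- Copy only distinct rows into a new table in one statement --->
<cfquery datasource="myAccessDSN">
    SELECT DISTINCT col1, col2, col3
    INTO newTable
    FROM importedTable
</cfquery>
```

Note that SELECT DISTINCT compares every listed column, so list exactly the columns that define a "duplicate" for your data.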
The job has now finished on Windows 7 with CF9.
So I think the next processes will be simpler (lighter).
Thanks again for any advice on DB manipulation.
Easy for a small amount of data; large amounts need some thinking.
The job has now finished on Windows 7 with CF9.
Chapeau, Pierre! That is the bottom line. Everything else is optional and can wait till next time.