I have been having this ongoing battle with large files (3 MB).
Basically each file is a comma-delimited text file where every
line holds information that needs to be checked against the
database and then either updated or inserted.
Right now I loop through every line in the file and use
CFQUERY to do the database check and insert/update.
But with the files being 3 MB, it is slow and times out. The
customer is not happy.
For the database check I use "IF EXISTS (....) UPDATE ....
ELSE INSERT ...."
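For reference, the per-row check described above would look roughly like this in T-SQL (the table and column names here are made up for illustration):

```sql
-- Hypothetical table/columns; this runs once per line of the file,
-- so every row costs a network round trip plus two statements.
IF EXISTS (SELECT 1 FROM Customers WHERE CustomerID = @CustomerID)
    UPDATE Customers
    SET    Name  = @Name,
           Email = @Email
    WHERE  CustomerID = @CustomerID
ELSE
    INSERT INTO Customers (CustomerID, Name, Email)
    VALUES (@CustomerID, @Name, @Email)
```

With thousands of lines in a 3 MB file, that per-row cost is what eats the request timeout.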
The ISP is using CF Server MX 6 and SQL Server 2000. I cannot
use CFFILE for security reasons; they created their own CFX tag to
accommodate uploads. So trying to access the file via SQL Server
2000 probably isn't possible.
Is there a better way to deal with large files and trying to
get them into the database?
If you can't access the file via SQL Server, then it's going
to be difficult. Row-by-row inserts are slow. It would be faster to
import the data into a staging table, then update/insert the main
table, and finally truncate the staging table.
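The staging-table pass might look roughly like this (Staging, Main, and the key/column names are placeholders):

```sql
-- 1) Update rows that already exist in the main table
UPDATE m
SET    m.Name  = s.Name,
       m.Email = s.Email
FROM   Main m
JOIN   Staging s ON s.CustomerID = m.CustomerID

-- 2) Insert rows that don't exist yet
INSERT INTO Main (CustomerID, Name, Email)
SELECT s.CustomerID, s.Name, s.Email
FROM   Staging s
WHERE  NOT EXISTS (SELECT 1 FROM Main m WHERE m.CustomerID = s.CustomerID)

-- 3) Clear the staging table for the next file
TRUNCATE TABLE Staging
```

Two set-based statements replace thousands of per-row round trips, which is where the speedup comes from.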
SQL Server has a few tools for importing CSV files. I don't
know which (if any) you have access to:
BCP (can use it command line or via xp_cmdshell)
DTS package (may be restricted)
Linked Server (sp_addlinkedserver; requires one-time setup by the dba)
OpenDatasource/OpenRowset (ad-hoc querying; requires ad-hoc access to be enabled)
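As a sketch, a BCP load via xp_cmdshell might look like this (the database, server name, login, and file path are all placeholders, and xp_cmdshell rights are often locked down on shared hosting):

```sql
-- Bulk-load the uploaded CSV into the staging table.
-- -c = character mode, -t, = comma field terminator
EXEC master..xp_cmdshell
    'bcp MyDB.dbo.Staging in "D:\uploads\data.csv" -c -t, -S MYSERVER -U myuser -P mypass'
```

The same bcp command can also be run from the command line if you have shell access to the server.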
Given the current restrictions, I'm assuming DTS is out.
Having the dba establish a linked server might be a viable option,
or possibly BCP.
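If the dba allows ad-hoc querying instead, reading the CSV directly might look like this on SQL Server 2000 (the directory path is a placeholder, and the Jet text driver typically needs a schema.ini file in that directory to describe the columns):

```sql
-- Ad-hoc query against a CSV file via the Jet text driver;
-- the result can feed an INSERT INTO Staging SELECT ...
SELECT *
FROM   OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                  'Text;Database=D:\uploads\;HDR=YES',
                  'SELECT * FROM data.csv')
```

Either way, once the rows are in the staging table, the set-based update/insert takes over.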