You can access a Connection object with:

theServiceFactory = createObject("java", "coldfusion.server.ServiceFactory");
// Create the connection object simply by passing the DSN name
con = theServiceFactory.getDataSourceService().getDataSource("datasourcename").getConnection();
My server admin doesn't want me to use the ServiceFactory. What are alternative methods of getting the Connection object?
PS: I need the Connection object to perform an Oracle batch insert.
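One option that avoids the ServiceFactory entirely is going through java.sql.DriverManager, though you lose ColdFusion's connection pooling and have to supply credentials yourself. A rough sketch, assuming the Oracle JDBC driver is on ColdFusion's classpath; the JDBC URL, user, and password below are placeholders:

```cfml
<cfscript>
// Make sure the Oracle driver class is registered with DriverManager
createObject("java", "java.lang.Class").forName("oracle.jdbc.OracleDriver");

// Static getConnection() call on DriverManager -- substitute your own
// host, SID, user, and password
driverManager = createObject("java", "java.sql.DriverManager");
con = driverManager.getConnection(
    "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password"
);

// ... use the connection ...

// This connection is NOT pooled by ColdFusion, so close it yourself
con.close();
</cfscript>
```

Whether your admin considers this better than the ServiceFactory is another question, since it bypasses the datasource definition entirely.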
Why aren't you just using the datasource directly?

<cfquery name="queryname" datasource="datasourcename">
    <!--- Oracle batch insert code --->
</cfquery>
If you are stuck inside cfscript for some reason, try the query function: http://help.adobe.com/en_US/ColdFusion/9.0/CFMLRef/WSe9cbe5cf462523a0693d5dae123bcd28f6d-7ffb.html
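For reference, the script-based approach that link describes is the CF9 Query component. A minimal sketch, with the datasource, table, and column names purely illustrative:

```cfml
<cfscript>
// CF9 script-equivalent of cfquery/cfqueryparam
q = new Query();
q.setDatasource("datasourcename");
q.setSQL("INSERT INTO tablename (column1) VALUES (:val1)");
q.addParam(name="val1", value="some value", cfsqltype="cf_sql_varchar");
q.execute();
</cfscript>
```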
Because that's not how a batch insert works; it's a series of method calls on the connection:

local.sql = "INSERT INTO...VALUES(?,?,?,?)";
local.sqlStatement = local.connect.prepareStatement(local.sql);
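To fill out the pattern those two lines start: the standard JDBC batching calls are setString()/addBatch() per row and one executeBatch() at the end. A sketch of the full loop, where local.records and its value1..value4 fields are illustrative stand-ins for your actual data:

```cfml
<cfscript>
local.sql = "INSERT INTO tablename (column1, column2, column3, column4) VALUES (?,?,?,?)";
local.sqlStatement = local.connect.prepareStatement(local.sql);
local.connect.setAutoCommit(false);  // commit once at the end, not per row

for (local.i = 1; local.i <= arrayLen(local.records); local.i++) {
    local.row = local.records[local.i];
    local.sqlStatement.setString(1, local.row.value1);
    local.sqlStatement.setString(2, local.row.value2);
    local.sqlStatement.setString(3, local.row.value3);
    local.sqlStatement.setString(4, local.row.value4);
    local.sqlStatement.addBatch();  // queue the row instead of executing it
}

local.sqlStatement.executeBatch();  // send all queued rows in one round trip
local.connect.commit();
local.sqlStatement.close();
</cfscript>
```

The speedup comes from executeBatch() sending the whole batch to Oracle at once instead of one network round trip per INSERT.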
Which is just a series of insert statements, right? I think the same thing could be accomplished by doing something like:
<cfquery name="queryname" datasource="datasourcename">
    INSERT INTO tablename (column1, column2, column3, column4)
    VALUES
    <cfloop from="1" to="#whatever#" index="i">
        ( <cfqueryparam cfsqltype="cf_sql_varchar" value="#value1#" />
        , <cfqueryparam cfsqltype="cf_sql_varchar" value="#value2#" />
        , <cfqueryparam cfsqltype="cf_sql_varchar" value="#value3#" />
        , <cfqueryparam cfsqltype="cf_sql_varchar" value="#value4#" /> )
        <cfif i LT whatever>,</cfif>
    </cfloop>
</cfquery>
<sarcasm>Gee, why didn't I think of that?!?</sarcasm>
To paraphrase a quote from a Star Trek movie, "Watch as your server dies." Around 4,000 records in, 20 minutes later, performance grinds to a halt as heap space runs out and you have to restart the server.
The Oracle batch insert code above inserts 30,000 records in under 60 seconds. Yes, thirty thousand inserts in less than a minute.
Also, Oracle has a 1,000-row limit on that style of INSERT INTO, and I didn't want to run the query 30 times. Batch insert is MUCH MUCH faster and has no fixed limit. I stopped testing at 5,000 records per batch; people routinely use 10,000-record batches.