2 Replies Latest reply on Oct 18, 2006 4:29 AM by eeWhatzUp

    Using cfexecute with big files

      I'm trying to create image files and save them for later use, using a third-party CGI (called getimage) that generates temporary image files without saving them on the server.

      To do this, I wrote a script that makes a file containing the 'curl' execution line, then used <cfexecute> to run it.
      Here is my code...

      <!--- opening tags reconstructed from the surviving attributes; the rest of the curl line was lost --->
      <cffile action="write" file="/mydirectory/myfilename" output="curl -u user:pass">

      <cfexecute name="/mydirectory/myfilename" output="/mydirectory/myfilename.sit" timeout="120"></cfexecute>

      The .sit file is made by a program called StuffIt, something like the PKZIP program.

      This code works nicely with image files smaller than 200MB.
      However, if the image file is larger than 200-300MB, <cfexecute> times out no matter how high I set the 'timeout' attribute.
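      One common workaround for a long-running command is to have the generated wrapper script launch the transfer in the background with nohup, so <cfexecute> returns immediately instead of blocking until the download finishes. Below is a minimal sketch of that pattern; the paths are the ones from the post, the URL is unknown and passed as an argument, and "sleep 5" stands in for the real curl line purely for illustration:

```shell
#!/bin/sh
# Sketch of a backgrounded wrapper script. In the real version, the
# backgrounded command would be the curl line from the post, e.g.:
#   nohup curl -u user:pass -o /mydirectory/myfilename.sit "$1" >/dev/null 2>&1 &
# Here "sleep 5" is a stand-in so the timing behavior can be observed.
start=$(date +%s)
nohup sleep 5 >/dev/null 2>&1 &   # long-running transfer, detached
end=$(date +%s)
elapsed=$((end - start))
echo "script returned after ${elapsed}s; transfer continues in background"
```

      The trade-off is that the caller no longer knows when the file is ready; the page would need to poll for the .sit file's existence (or check its size) before using it.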

      Is this a limitation of <cfexecute>, or is there another way of running the 'curl' command from a web environment?

      I thank everyone in advance... :-)