
Large file uploads and Request Throttle Memory.

  • November 30, 2022
  • 1 reply
  • 493 views

I have some questions regarding large file uploads and memory usage. Maybe somebody has some more info on this.

I am working with CF 2021 and IIS 10.

I just finished troubleshooting a problem with large file uploads in a multipart form post. Anything larger than 200 MB was failing with an internal server error. It turns out that setting the maximum post size is not enough; I also have to set the Request Throttle Memory setting to a value larger than the maximum file size I want to upload.

This is really problematic because CF actually seems to load the file into memory as it uploads, and it holds that memory for the entire duration of the upload. Is this really the way uploads have to work in CF? It seems very wasteful. Is there any other way I can allow large file uploads without potentially tying up so much memory?


zeejayy (Author)
Inspiring
November 30, 2022

I guess I should have waited a bit longer before posting this. Looks like I can chunk my uploads and it will probably solve my issue. I will leave this up in case anyone has other suggestions.

Charlie Arehart
Braniac
November 30, 2022

If you mean that you still think CF ties up memory equal to the size of an uploaded file, no, that should not be the case as far as I know. That was a problem up until an update in CF7, about 15 years ago. Do you have some evidence that indicates it's doing that?

 

If you're only speculating, is the bottom line that you feel you cannot upload a large file? If so, what size file fails? What is the error (preferably a detailed error message from one of CF's logs, if none is shown onscreen)? And what are your values for those throttle settings? (I can concur that they've always been very poorly understood and documented.)

/Charlie (troubleshooter, carehart.org)
zeejayy (Author), Correct answer
Inspiring
November 30, 2022

Hi Charlie, thanks for the reply.

 

I was actually able to see the IIS worker process, in Task Manager, using increasing amounts of memory during the upload and maxing out at the total post size, including the file. I know this type of thing was supposed to be fixed a long time ago, so I am not sure what exactly was going on.

 

I am seeing this when posting a multipart form with a file and using cffile with action="upload" to save it to a directory. If the file size is over the value set in Request Throttle Memory, I get a generic 500 error instead. I wasn't able to find any info in the logs relating to it.

 

I have never had a need to upload very large files directly via POST in CF before, despite being a CF developer since version 4.5. As such, I have never needed to mess with the post size limit or the Request Throttle Memory.

 

Despite all of this, I have already found a better way to do it using a chunked uploader. This way the max post size stays well under the default limit and I don't need to fuss with anything there.
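(Editor's note: the chunked-uploader approach above works by splitting the file into fixed-size parts and sending each part as its own POST, so no single request body ever approaches the server's post-size limit. A minimal sketch of the client-side plan, with an illustrative 10 MB chunk size that is not a CF setting:)

```javascript
// Plan the fixed-size parts for a chunked upload: each entry is one
// POST whose body is file bytes [start, end). The last chunk may be
// smaller than chunkSize.
function planChunks(fileSize, chunkSize = 10 * 1024 * 1024) {
  const chunks = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    const end = Math.min(start + chunkSize, fileSize);
    chunks.push({ index: chunks.length, start, end });
  }
  return chunks;
}

// In a browser you would then loop over the plan and send each slice,
// e.g. fetch(uploadUrl, { method: "POST", body: file.slice(start, end) }),
// while the server appends each part to the target file in order.
const plan = planChunks(450 * 1024 * 1024); // a 450 MB upload
console.log(plan.length); // 45 chunks of at most 10 MB each
```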