Would really appreciate help with solving something that I am sure is pretty straightforward.
I've got a form with a variable number of form fields that I need to submit to a separate ColdFusion page, which records the results in a database. Basic stuff. The number of form fields can grow well beyond the number of fields the server will allow, and I don't want to raise that limit for security reasons.
The form contains fields along the lines of:
...where the number could be considerable depending upon the condition that generated the list of people, and as a result, with 3x the number of form fields per person, it has been known to push it over the limit.
So my question is, is there a way of passing a large number of form fields safely? I've read that JSON, or a combination of jQuery and JSON might be an option, but I'm finding examples difficult to come by. I've had no experience of using JSON.
What would be incredibly useful is an example of reading the form fields into a suitable format, passing the field values to the page being submitted to, and extracting them on that page so I can write them to a database.
Sorry if this is really basic, but it's not been an issue before, and I'm a bit stumped as to how to solve it.
Many thanks in advance
Ben, while someone may well offer a specific answer to your request (of how to handle a "large number" of form fields differently), I will share instead that concern over raising that admin limit, while reasonable, can be over-stated.
First, note that the risk of raising it "too high" is not one that enables a break-in but a denial-of-service attack (and one which was not unique to CF), as explained in this post.
And perhaps more important, note as well that .NET had chosen a limit of 1,000, while Tomcat (which underlies CF) had chosen a limit of 10,000. So why did Adobe choose 200? No explanation was ever given when this decision was made in 2012.
So should you really stress over finding a way to stay under that low limit? I'm just not sure it's worth the bother.
And if this issue is indeed only about a potential denial of service attack, do most sites really need to worry about that?
Of course when it comes to security matters, some will be far more cautious than others. And I'm not here making any specific recommendations. I'm just sharing these observations, for an interested person to consider and research more.
And any new info on the whole matter would of course be welcome, as it may come to light.
But sure, if someone may offer an alternative way to pass your data and not hit this limit, it may well be of value to others as well.
Thanks for your reply. I was aware of the DoS risk, but the issue is what to set the limit to. The scope of the data has vastly widened due to unforeseen circumstances. Whereas something like 200 or 300 would have been fine under the original remit, there's nothing to stop the data requiring 100x that, for example. Obviously the whole application should be rethought and done properly, but that's not an option in my timescales. Management are of the opinion that this change of scale shouldn't be an issue, so I've not got any time to do it; I'm looking (hoping!) for a quick fix!
Yeah, raising the limit by 200x (to 40,000) would seem to be pushing things... again, if you're really concerned about a DoS attack. If you're not, then perhaps you can use this info to decide whether the effort (to rewrite) would be worthwhile.
Or again perhaps someone may have a better solution for you. I wasn't trying to assert there isn't one; it was simply that I didn't want to leave you and your question hanging here, if those other considerations may be helpful.
Appreciate your response. DoS is a consideration. Don't want to do something that makes it vulnerable in that respect, so don't want to increase the field limit. A proper re-write isn't possible either. Caught between a rock and a hard place...
There are a couple of things going on here. First, if you have that many form fields and want to pass them all to the server, you're not going to find a more terse way to do it than just using regular form fields. JSON doesn't require less room to describe the same data - it requires more!
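Dave's point can be checked directly: serializing the same fields both ways, the JSON string comes out longer than the URL-encoded form body, because JSON adds quotes, braces, and colons around every key and value. A quick sketch (the field names here are made up for illustration, since the original form's names weren't posted):

```javascript
// Hypothetical example fields: three fields for one person, as in the form described above.
const fields = {
  person1_status: "present",
  person1_note: "none",
  person1_date: "2023-01-01",
};

// How a regular form POST encodes the data (application/x-www-form-urlencoded).
const formEncoded = new URLSearchParams(fields).toString();

// The same data as JSON: every key and value gains quotes, plus braces and colons.
const asJson = JSON.stringify(fields);

console.log(formEncoded.length, asJson.length); // JSON is the longer of the two
```

So switching encodings alone doesn't buy anything; what helps is that a JSON payload arrives as one field (or one request body) rather than thousands of individual form fields.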
Third, honestly, if you have a legitimate form that big, it's ok to increase the limit that's there just to define the size of a legitimate form. That's really all it is, and that's why you can change it. These constraints are movable exactly for this reason. Of course, that does make your server slightly more vulnerable to denial of service attacks, but that's the price of having a big form, and it's still just "slightly more vulnerable" - even if you increase it by 100x in most cases. Can you believe that for years we didn't have any limit at all?
Anyway, it seems like you can either just increase the form size or rewrite your form, so you should let your manager know that.
Dave Watts, Eidolon LLC
3. Rethink your user interface for managing this amount of data. Should the submission of so many form fields be considered a single transaction?
4. Use a spreadsheet to lay out your data, then upload the spreadsheet and let CF process it.
Thanks for the replies (Dave and Michael). Yes, it seems an absurd potential number of fields, I know. I'd come to the conclusion that I need to try to do a per-group (per-person) update - basically your option 1, Michael. Just trying to figure that out as well.
My initial thoughts coincide with yours. I would, on that page, write code to collect the fields into a data-interchange format such as JSON or XML. The JSON or XML file is then uploaded using something like:
<form method="post" action="uploadFiles.cfm" enctype="multipart/form-data">
  <input name="fileContents" type="file"><br>
  <input name="submit" type="submit" value="Upload File">
</form>
<form method="post" action="savedata.cfm">
  <input type="hidden" name="jsondata">
  <input name="submit" type="submit" value="Save" onclick="packData()">
</form>
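The `packData()` handler referenced by the second form isn't shown, so here is a minimal sketch of what it might do: collapse the flat per-person fields into one JSON string for the single hidden `jsondata` field. The `field_id` naming pattern (e.g. `status_123`) is an assumption, since the original field names weren't posted:

```javascript
// Group flat form fields like "status_123" / "note_123" into one object per person id.
// The "field_id" pattern is an assumption; adapt the regex to the real field names.
function packData(fields) {
  const people = {};
  for (const { name, value } of fields) {
    const match = name.match(/^(\w+)_(\d+)$/); // e.g. "status_123" -> ["status", "123"]
    if (!match) continue;
    const [, field, id] = match;
    (people[id] = people[id] || { id })[field] = value;
  }
  // One JSON string to place in the single hidden "jsondata" input.
  return JSON.stringify(Object.values(people));
}

// In the browser, the field list would come from the form itself, e.g.:
//   const fields = [...new FormData(document.forms[0])].map(([name, value]) => ({ name, value }));
//   document.querySelector('input[name="jsondata"]').value = packData(fields);
```

On the receiving ColdFusion page, `deserializeJSON(form.jsondata)` turns that one field back into an array of structs, so only one form field ever counts against the server limit.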
Many thanks for the suggestions.
I modified the page to update each person individually, rather than all of the people in the category at once, using Ajax. I did it in a way that meant that after each person's details were updated, they dropped off the to-do list by refreshing the list (again, using Ajax). This works well on the one hand, but as I expected, if the list of people is quite long (which is how we got to this problem in the first place), reloading the remaining records in the list takes a bit of time and slows things down. I'm currently trying to figure out whether I can group the records into logical, smaller, more manageable groups that will never come close to the form-field limit, and do the updates that way.
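One way to do the grouping just described: split the records into fixed-size batches on the client, so each POST stays well under the field limit, and send the batches one after another instead of one request per person. A sketch, where the `/updatePeople.cfm` endpoint and the batch size are assumptions:

```javascript
// Split an array into batches of at most `size` items.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Send each batch as one small request; awaiting each POST means the server
// never has to handle more than one batch's worth of data at a time.
async function submitInBatches(people, size) {
  for (const batch of chunk(people, size)) {
    await fetch('/updatePeople.cfm', { // assumed endpoint name
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(batch),
    });
  }
}
```

Compared with one Ajax call per person, this cuts the number of round trips (and list refreshes) by the batch size, e.g. a batch of 50 people with 3 fields each is only 150 fields' worth of data per request.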
Your description suggests there is manual intervention somewhere. That makes things awkwardly unscalable and slow.
Dynamically collecting user details of any size and passing them to the server is a common use-case, so solutions abound - just Google. Whichever method you decide to use, you will get a scalable solution and better performance by automating the entire process.
I'm not sure how I could automate the entire process when manual intervention is required. Details need to be recorded for each of the returned records. There is no option other than to type them in, and then for the user to confirm that they are correct before submitting them.
Ah, I think I understand what you mean. I am talking about the server process. Whereas you are apparently doing both the client task and the server task together.
You should separate the responsibility between client and server.
client (the manual tasks):
type in the details of the returned records
confirm that the details are correct
submit the records
server (the tasks to automate):
receive the records from each client (from a maximum of possibly 100,000 clients)
validate each client record
aggregate, process, store the records
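The server tasks above reduce to: parse the uploaded payload, reject malformed records, and aggregate the rest for storage. In ColdFusion the parsing step is `deserializeJSON()`; the validate-and-aggregate logic itself is sketched here in JavaScript, with the required field names (`id`, `status`) being assumptions about the real records:

```javascript
// Check one incoming record; the required fields here are assumptions.
function isValidRecord(rec) {
  return typeof rec === 'object' && rec !== null &&
    typeof rec.id === 'string' && rec.id.length > 0 &&
    typeof rec.status === 'string' && rec.status.length > 0;
}

// Split a parsed payload into records to store and records to reject,
// so one bad record doesn't abort the whole batch.
function processPayload(records) {
  const valid = [];
  const rejected = [];
  for (const rec of records) {
    (isValidRecord(rec) ? valid : rejected).push(rec);
  }
  return { valid, rejected };
}
```

Keeping validation on the server (even if the client also validates) matters here, because the client-side form is exactly the part being reworked.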