Currently I am reading data stored in a .file in the format (abc|def|ghi|jkl|....|) into an array, and displaying the required data from the array as report data.
Data is appended to the .file every 20-30 minutes, so the file keeps growing, which in turn makes the report load slowly because the entire file has to be read every time.
Now I need an approach to store the data from the .file in an Oracle database, so that I can display the data from the database and reduce the load time.
Any suggestions, please?
Ben Nadel wrote a function to convert a CSV file into a query. You can treat your file as a CSV with "|" as the delimiter. Create another process (perhaps a scheduled task) to read the file into a query, build INSERT statements, and dump the data into the database.
http://www.bennadel.com/blog/501-Parsing-CSV-Values-In-To-A-ColdFusion-Query.htm
If your file has a header, scroll down to the comments at the bottom; someone posted modified code that handles a header row.
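The scheduled task could look roughly like the sketch below. This is only an outline: the parsing function name (CSVToQuery), the datasource, the table, and the column names are placeholders, so check Ben Nadel's post for the exact function signature and substitute your own names.

```cfml
<!--- Scheduled task (sketch): load the pipe-delimited file into Oracle. --->
<!--- Assumes a CSVToQuery() function as in Ben Nadel's post; --->
<!--- myOracleDSN, report_data, and the column names are placeholders. --->
<cfset fileData = fileRead(expandPath("./report.file")) />
<cfset qReport = CSVToQuery(fileData, "|") />

<cfloop query="qReport">
    <cfquery datasource="myOracleDSN">
        INSERT INTO report_data (col_a, col_b, col_c)
        VALUES (
            <cfqueryparam value="#qReport.column1#" cfsqltype="cf_sql_varchar" />,
            <cfqueryparam value="#qReport.column2#" cfsqltype="cf_sql_varchar" />,
            <cfqueryparam value="#qReport.column3#" cfsqltype="cf_sql_varchar" />
        )
    </cfquery>
</cfloop>
```

Using cfqueryparam keeps the inserts safe from malformed values and lets Oracle reuse the statement across loop iterations.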
I am already using Ben Nadel's function to read the CSV file into a query and then display it from the query; everything works fine.
But as I said, the contents of my file are updated every 20-30 minutes, i.e., data is constantly appended to the .file. How can a new task identify which lines were newly appended, so that only those rows are inserted into the database?
This might help. Treat the whole file as a list with CRLF as the delimiter.
http://stackoverflow.com/questions/18047172/count-number-of-lines-a-file-has-in-coldfusion
Then save the number of the last line you read. When the next process runs, start reading from the last line + 1.
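The "save the last line read" idea could be sketched like this, assuming the high-water mark is kept in a small state file (lastLine.txt here is a made-up name, and the parse/INSERT step is left as a comment to be filled in with the CSVToQuery logic from earlier in the thread):

```cfml
<!--- Incremental import (sketch): only process lines appended since the last run. --->
<!--- File paths and the state-file name are placeholders for illustration. --->
<cfset filePath  = expandPath("./report.file") />
<cfset statePath = expandPath("./lastLine.txt") />

<!--- High-water mark: the number of lines already imported (0 on first run) --->
<cfset lastLine = fileExists(statePath) ? val(fileRead(statePath)) : 0 />

<!--- Split the file into lines on LF; trim handles a trailing CR per line --->
<cfset allLines = listToArray(fileRead(filePath), chr(10)) />

<cfloop from="#lastLine + 1#" to="#arrayLen(allLines)#" index="i">
    <!--- Parse trim(allLines[i]) on "|" and INSERT that row here --->
</cfloop>

<!--- Save the new high-water mark for the next run --->
<cfset fileWrite(statePath, arrayLen(allLines)) />
```

One caveat with this approach: if the report process can append to the file while the import runs, it is safest to capture arrayLen(allLines) once at the start (as above) so that rows appended mid-run are picked up cleanly on the next pass.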