RE: Easiest way to read from a file in Online
We have done this a couple of times using different scenarios. I believe scenario 1 would work just fine; we did exactly what you want to achieve with it just a couple of months ago, but with a lot more lines, so it works well.
The first and most important thing: don't initialize it from Service, since it will often run into the 60-second limit or the memory limit.
Even with the "save the last ID to an extratable" will take alot of time due to the initilization of the DocumentAgent, set it as a schedule task with the time lock and so forth.
(Might be ok if you are able to run it manually, but for a file that would take multiple hours of runtime it would cause a bit of a hazzard)
Anyway, here are two scenarios we have built that work just fine when importing data into Service.
1. The console app.
Just build a simple console app that reads X thousand rows from your file, then sends them as a REST call to a CRMScript that handles them and returns an OK. When that OK hits the console app, it takes the next X thousand.
Works fine; we have imported hundreds of thousands of lines of company data this way.
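To make that concrete, here is a minimal sketch of the batching loop (Python purely for illustration; the endpoint URL, the plain-text "OK" reply and the batch size of 2000 are my assumptions, not anything SuperOffice-specific):

```python
import csv
import requests

ENDPOINT = "https://example.com/crmscript-endpoint"  # hypothetical CRMScript endpoint URL
BATCH_SIZE = 2000  # "X thousand" rows per call; tune it to stay well under the script limits

def send_batch(rows):
    """POST one batch to the CRMScript and wait for its OK before continuing."""
    response = requests.post(ENDPOINT, json={"rows": rows}, timeout=120)
    response.raise_for_status()
    if response.text.strip() != "OK":  # assuming the script answers with a plain "OK"
        raise RuntimeError(f"Unexpected reply: {response.text[:200]}")

with open("companies.csv", newline="", encoding="utf-8") as f:
    batch = []
    for row in csv.reader(f):
        batch.append(row)
        if len(batch) == BATCH_SIZE:
            send_batch(batch)   # the next batch is only read after the OK comes back
            batch = []
    if batch:                   # send the remainder
        send_batch(batch)
```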
2. A more advanced scenario. I don't believe you need it, but someone else might find this thread some day: the customer wants to send an email into Service with massive amounts of data that will hit one of the previously mentioned limits.
What we do here is the following:
a) An email filter that puts these emails into a separate category and runs a script.
b) The script reads the base64 data from the attached file and sends it to an Azure service we built, together with parameters for how many rows should be sent per batch and the callback URL.
When the Azure service receives the base64, it starts making REST calls back to Service, similar to scenario 1 (see the sketch below).
Works very well, but requires a bit more work.
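A rough sketch of what the Azure side does once it has received the payload (names and parameters are assumptions, and the real service would of course sit behind an HTTP endpoint rather than a plain function):

```python
import base64
import requests

def process_upload(payload_b64: str, batch_size: int, callback_url: str) -> None:
    """Decode the base64 file content and push it back to the callback URL in batches,
    waiting for each batch to be acknowledged before sending the next one."""
    lines = base64.b64decode(payload_b64).decode("utf-8").splitlines()
    for start in range(0, len(lines), batch_size):
        batch = lines[start:start + batch_size]
        response = requests.post(callback_url, json={"rows": batch}, timeout=120)
        response.raise_for_status()
        if response.text.strip() != "OK":  # same plain "OK" convention as in scenario 1
            raise RuntimeError(f"Callback rejected batch starting at line {start}")
```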