What is the right way to load heavy data from a text file to the DB?

Hi All,
I have a text file which may contain around 2 lakh (200,000) rows of data. I have to fetch that file over FTP and dump the data into the DB, and I am planning to run the process asynchronously in a Timer.
Is there any better way to handle a heavy data dump?
Hi Pramod,

A Timer (or a couple of them, to separate the FTP download and the data-processing tasks) is probably the best way. You could also use BPT, but with that option you can't override the maximum activity duration (for automatic activities, BPT is better suited to short-running work, under 5 minutes).

Depending on your estimate for each task, remember to configure the timer timeout (the default is 20 minutes). If the data-loading part is going to take a long time, consider splitting its execution across multiple timer runs:
- Process the file items, resuming execution after the last processed item.
- After 20 (or so) minutes, check whether you have reached the end of the current time slot; if so, launch the timer again and stop.

Your timer should process the data in chunks, and be able to resume work automatically from where the last run stopped.
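For illustration, here's a minimal sketch of that chunk-and-resume pattern (Timer logic in OutSystems is built visually, so this Python pseudocode only shows the control flow; the helper names get_bookmark, save_bookmark, load_rows, insert_rows, and relaunch_timer are all hypothetical placeholders for your own data-access and timer logic):

```python
import time

TIME_BUDGET_SECONDS = 15 * 60   # stay well under the 20-minute timer timeout
BATCH_SIZE = 1000               # rows processed per iteration

def timer_action(get_bookmark, save_bookmark, load_rows, insert_rows, relaunch_timer):
    started = time.monotonic()
    last_id = get_bookmark()                 # last row id processed in a previous run

    while True:
        rows = load_rows(after_id=last_id, limit=BATCH_SIZE)
        if not rows:                         # nothing left: the import is complete
            return

        insert_rows(rows)                    # commit this batch to the database
        last_id = rows[-1].id
        save_bookmark(last_id)               # resume point survives a crash or timeout

        # If the time slot is nearly used up, schedule another run and exit
        # instead of risking the platform killing the timer mid-batch.
        if time.monotonic() - started > TIME_BUDGET_SECONDS:
            relaunch_timer()
            return
```

Because the bookmark is saved after every batch, a relaunched (or crashed and restarted) timer simply picks up from the last committed row.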

PS: I found out that 10 lakh = 1 million, thanks. :)
If the data is fairly simple but large in volume, you can also opt for a bulk import.
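The idea there is to insert rows in large batches instead of one at a time. A rough sketch of that approach (using Python's standard sqlite3 module purely as a stand-in; the table name and columns are made up, and in practice you'd use your database's native bulk-load facility):

```python
import csv
import sqlite3

def bulk_load(csv_path: str, db_path: str, batch_size: int = 5000) -> None:
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS imported (col1 TEXT, col2 TEXT)")

    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        batch = []
        for row in reader:                   # assumes each CSV row has >= 2 columns
            batch.append(row[:2])
            if len(batch) >= batch_size:
                # One round-trip per batch instead of per row.
                conn.executemany("INSERT INTO imported VALUES (?, ?)", batch)
                conn.commit()
                batch.clear()
        if batch:                            # flush the final partial batch
            conn.executemany("INSERT INTO imported VALUES (?, ?)", batch)
            conn.commit()
    conn.close()
```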

There's also a presentation from NextStep 2011 that has some examples of timers processing data in incremental chunks: here.

Thanks Paulo,
I read about Timers in detail and think using a Timer is the best option; I also now understand why setting the Timeout is important.
Now I'm looking at the slides on processing data in chunks.