Best Practice Method for Importing Data

STREAM(MyFile) before all the MyFile ADDs and FLUSH(MyFile) after them will speed the import up big time. In this Updating TPS post it went from a couple of hours to 15 minutes.

While the file is STREAMed it will be locked to other users and threads. Using LOGOUT/COMMIT instead will not lock the file until the COMMIT, but it is a bit more complicated and should have error handling.
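A minimal hand-coded sketch of that LOGOUT/COMMIT alternative, for contrast with the STREAM/FLUSH approach used in the rest of this post. MyFile and CSVFILE match the names used below, AddFailed is just a local flag I made up, and both files are assumed to already be open:

AddFailed  BYTE(0)                       !Set if any ADD fails inside the transaction

  SET(CSVFILE)                           !Start reading the CSV from the top
  LOGOUT(1, MyFile)                      !Start the transaction; 1 = lock timeout in seconds
  IF ERRORCODE()
     Message('LOGOUT failed: ' & ERROR())
  ELSE
     LOOP
        NEXT(CSVFILE)
        IF ERRORCODE() THEN BREAK.       !End of the CSV file
        MyFile.Data01 = CSVFILE.Data01
        MyFile.Data02 = CSVFILE.Data02
        ADD(MyFile)
        IF ERRORCODE()
           AddFailed = 1
           BREAK
        END
     END
     IF AddFailed
        ROLLBACK                         !Undo every ADD made since LOGOUT
     ELSE
        COMMIT                           !All the ADDs are written here, in one shot
     END
  END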

I would have the Form call a Process template that reads the CSV and adds to the TPS;
that way you get a Progress window with a Cancel button. The Form would have this code:

Access:MyFile.UseFile()             !Be sure file is open 
MyRecordsBefore = RECORDS(MyFile) 
FREE(DebugQ) 
MyFileLoadFromCSVProcess(CsvFileName, CsvRecords, DebugQ)
DISPLAY  !Show process result DebugQ on form window
Message('MyFile Added records: ' & RECORDS(MyFile) - MyRecordsBefore & |
        '||Read CSV Records: ' & CsvRecords & |
        '||DebugQ Fails: ' & RECORDS(DebugQ),'MyFileLoadFromCSV run')

Create a Process template that reads the CSV and ADDs to MyFile. Prototype it like this to pass in the file name and get back a count and the debug queue. This assumes DebugQ is simply one string field.

MyFileLoadFromCSVProcess(STRING CsvFileName, *LONG OutCsvRecords, *QUEUE DebugQ)
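The calls above assume locals roughly like these on the Form (a sketch; the string sizes are guesses, and the single-field queue is what lets DebugQ = 'some text' followed by ADD(DebugQ) work in the loop further down):

CsvFileName      STRING(260)    !Full path of the CSV file to import
CsvRecords       LONG           !CSV lines read, filled in by the process procedure
MyRecordsBefore  LONG           !MyFile record count before the import
DebugQ           QUEUE          !One entry per failed CSV line
Msg                STRING(255)  !The single string field
                 END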

After the files are opened, call STREAM(MyFile). IMO it is better to put that code into the embed for After First Record Read, as shown below.
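In that embed it is only a couple of lines; a sketch (the user string is optional and assumes the template's standard ?Progress:UserString control):

?Progress:UserString{PROP:Text} = 'Adding CSV records to MyFile'
DISPLAY
STREAM(MyFile)      !Buffer the writes; nothing is forced to disk until FLUSH(MyFile) at the end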

In the Activity, for each CSV record, add your MyFile record. In your DebugQ entry I added some info so you know which record failed.

  CsvLineNo += 1                         !Local LONG counting CSV lines read
  IF Access:MyFile.PrimeRecord()
     DebugQ = 'Line ' & CsvLineNo & ' Prime Error - ' & CSVFILE.Data01
     ADD(DebugQ)
  ELSE
     MyFile.Data01 = CSVFILE.Data01
     MyFile.Data02 = CSVFILE.Data02
     IF Access:MyFile.Insert() <> Level:Benign
        DebugQ = 'Line ' & CsvLineNo & ' Insert Error - ' & CSVFILE.Data01
        ADD(DebugQ)
     END
  END
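For reference, a sketch of how the CSVFILE read by that loop might be declared with the BASIC driver (only Data01 and Data02 appear above; the field sizes and pointing NAME() at CsvFileName are my assumptions):

CSVFILE   FILE,DRIVER('BASIC'),NAME(CsvFileName)   !BASIC driver parses comma-separated lines
Record      RECORD
Data01        STRING(50)                            !No PRE(), fields referenced as CSVFILE.Data01
Data02        STRING(50)
            END
          END

In a real app this file would normally live in the dictionary so the Process template can use it as its primary (From) file.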

Before the files are closed:

?Progress:UserString{PROP:Text} = 'Flushing MyFile Adds'
DISPLAY                 !There may be a pause, especially with COMMIT, so tell the user
FLUSH(MyFile)           !Write the buffered ADDs and release the STREAM lock
OutCsvRecords = CsvLineNo

To make the progress display smooth, it's good to adjust the records read per cycle from the default of 25.
For a BASIC file the Process template works in BYTEs, not records, so 25 means roughly 25 bytes, which is probably one CSV record per Timer event and will be very slow. To fix it, estimate your average record length, e.g. 100 bytes, multiply that by 25, and enter that (2500) as the records per cycle.
