We have a couple of very large data tables that are imported each night via the CLI tool; these tables and their corresponding text files contain over 5 million records.
As we understand it, each night the CLI tool gets the most current data, and we need to overwrite all existing data using the new files. The issue is that these files are taking a very long time to parse and write to the database.
Is there any option other than truncating the tables and rewriting all of the lines? (Sometimes, out of the 5 million records, only a few hundred have actually been added.)
For reference, I am referring to the "quiz_question_answer_dim and _fact" files.
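In case it helps clarify what we're hoping for: if the nightly exports are deterministic, something like the sketch below is what we have in mind — diff tonight's file against last night's and load only the rows that changed. (This is just an illustration, not the CLI tool's actual behavior; `delta_rows` is a hypothetical helper, and it keys rows on their full text, whereas a real importer would key on the table's primary key so that updated rows are caught too.)

```python
def delta_rows(old_rows, new_rows):
    """Yield rows present in the new export but absent from the old one.

    Keys each row on its full line text (an assumption for this sketch);
    a real importer would key on the table's primary key instead.
    """
    seen = set(old_rows)          # every row from last night's file
    for row in new_rows:          # stream tonight's file
        if row not in seen:       # only rows that are new or changed
            yield row
```

With something like this, the few hundred new rows could be upserted rather than rewriting all 5 million.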