Hi,
I have set Additional Options to TransactionSize=100 and Batch Size to 50 in the Job's advanced section. My understanding is that Transaction Size controls how often rows are committed, so in my case, once 100 records have been cached, those records should be committed to the database.
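To make my expectation concrete, here is a minimal sketch in Python with pyodbc of the commit cadence I understand TransactionSize=100 to imply. The connection string, table, columns, and source_rows() iterator are placeholders for illustration, not the tool's actual internals:

```python
import pyodbc

# Placeholder connection string and names, for illustration only.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
cursor = conn.cursor()

buffer = []
for row in source_rows():  # hypothetical source iterator
    buffer.append(row)
    if len(buffer) == 100:  # TransactionSize=100
        cursor.executemany(
            "INSERT INTO dest_table (col1, col2) VALUES (?, ?)", buffer
        )
        conn.commit()  # each 100-row transaction becomes durable here
        buffer.clear()

if buffer:  # flush the final partial batch
    cursor.executemany(
        "INSERT INTO dest_table (col1, col2) VALUES (?, ?)", buffer
    )
    conn.commit()
```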
Unfortunately, for one of the source tables, the job is not saving any records to the Microsoft SQL Server destination table. With verbose logging turned on, I can see that 10,000+ records have been cached, because the data from those records appears in the log file.
Other tables are being populated successfully; only this one is not. When the job errors out at roughly the 20,000th row, it ends with no data at all in my destination table.
How do I make the job commit the data it has cached to the table before the error is thrown?
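In other words, I am expecting behavior like the sketch below, where each committed batch survives even if a later batch fails. The names are again placeholders (source_rows() is a hypothetical iterator), and the batching helper is just for illustration:

```python
import itertools
import pyodbc

# Placeholder connection string and names, for illustration only.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
cursor = conn.cursor()

def batches(rows, size):
    """Yield lists of at most `size` rows from an iterable."""
    it = iter(rows)
    while batch := list(itertools.islice(it, size)):
        yield batch

try:
    for batch in batches(source_rows(), 100):  # source_rows() is hypothetical
        cursor.executemany(
            "INSERT INTO dest_table (col1, col2) VALUES (?, ?)", batch
        )
        conn.commit()  # rows committed here should survive a later failure
except pyodbc.Error:
    conn.rollback()  # only the in-flight batch is rolled back
    raise
```

If the tool instead wraps the entire load in a single transaction that gets rolled back on error, that would explain why the destination table ends up empty.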