EXCEEDED_ID_LIMIT: there are '600000' or more records deleted in the specified time period. Please retry getDeleted() call using a shorter time period.

  • 24 May 2023
  • 1 reply


I am using cdata sync and getting the error “EXCEEDED_ID_LIMIT: there are '600000' or more records deleted in the specified time period. Please retry getDeleted() call using a shorter time period.”

I have seen other posts suggest using a full rather than a partial copy, but I am not sure that is something CData Sync does. Is there a way to get past this error in CData Sync or Salesforce?


Best answer by Elizabeth G 7 June 2023, 15:59




Thanks for reaching out. Typically, a technical support ticket is the best way to diagnose and resolve this issue, and I noticed you already opened one with our team. Their answer is copied below, but for future reference, please continue in the support ticket, as the agent will be able to assist you further there.

The error message "EXCEEDED_ID_LIMIT: there are '600000' or more records deleted in the specified time period" is thrown due to a Salesforce API limitation: "If a getDeleted() call returns more than 600,000 records, the exception EXCEEDED_ID_LIMIT is returned" (as stated in the API documentation here: SOAP API Call Limits | Salesforce Developer Limits and Allocations Quick Reference | Salesforce Developers).
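The retry advice embedded in the error message can be automated: when a window yields EXCEEDED_ID_LIMIT, split it in half and fetch each half separately. A minimal sketch of that idea, assuming a hypothetical `fetch_deleted(start, end)` callable that wraps the actual getDeleted() call and raises `ExceededIdLimit` when the window covers too many deletions (both names are illustrative, not part of any real client library):

```python
from datetime import datetime, timedelta


class ExceededIdLimit(Exception):
    """Raised when a window covers 600,000 or more deleted records."""


def get_deleted_chunked(fetch_deleted, start, end):
    """Fetch deleted-record data for [start, end), recursively halving
    the window whenever the Salesforce 600k limit is exceeded."""
    try:
        return fetch_deleted(start, end)
    except ExceededIdLimit:
        mid = start + (end - start) / 2
        if mid == start or mid == end:
            raise  # window can no longer be split
        return (get_deleted_chunked(fetch_deleted, start, mid)
                + get_deleted_chunked(fetch_deleted, mid, end))
```

With a fetcher that can only handle one hour at a time, a four-hour window would be transparently split into four one-hour calls.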

You can find more details about this limitation in the following documentation: getDeleted() | SOAP API Developer Guide | Salesforce Developers

On that page, Salesforce advises replicating frequently, so that no single replication ever has to cover a chunk of more than 600k deleted records. This is a best practice that makes sense if you are going to be manipulating large volumes of data like this over short periods of time.

1- One solution is to use the "Truncate Table Data" or "Drop Table" option, as you mentioned, to perform a full copy, as doing so guarantees a full replication of the table in the destination. The goal is to avoid the getDeleted() API call for retrieving deleted data from Salesforce, and instead use the queryAll() | SOAP API Developer Guide | Salesforce Developers API call to retrieve all of the data, since queryAll() does not appear to have the same restriction. To learn more about these options, kindly check the queryAll() documentation linked above.

Another approach is to ensure that Sync never queries a time interval containing more than 600k deleted records. There are a few ways to achieve this, but note that some have drawbacks:

2- Use a small replicate interval
Setting your replicate interval to something small (for example, 1 hour) ensures that the job is never "looking" at more than one hour's worth of data at a time. This solves the problem as long as you did not delete more than 600k rows in any single table within any given hour. You can set the Replicate Interval and Replicate Interval Unit in the advanced job settings, either at the job level or the task level. You can read more about these properties here: CData Sync - Advanced Job Options
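The effect of a small replicate interval can be pictured as pre-splitting the overall replication range into fixed-size slices before any getDeleted() call is made, so each call only ever sees one slice. A sketch of that slicing logic (the function name and the one-hour default are illustrative, not part of CData Sync):

```python
from datetime import datetime, timedelta


def replicate_windows(start, end, interval=timedelta(hours=1)):
    """Yield consecutive (window_start, window_end) pairs, each no wider
    than `interval`, that together cover [start, end)."""
    cursor = start
    while cursor < end:
        window_end = min(cursor + interval, end)
        yield cursor, window_end
        cursor = window_end
```

Each yielded window would then be replicated on its own, keeping every getDeleted() query under the 600k-deletion ceiling as long as no single window's deletions exceed it.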

3- Skip deletes entirely 
You can set Deletion Behavior=Skip Delete in the advanced job settings to entirely bypass the query we send to fetch those >600k deleted rows. Note, though, that deleted records would then remain in your destination table with no indication that they were deleted in Salesforce.