Solved

Scripted flow cleanup

  • 13 September 2023
  • 3 replies
  • 73 views

Userlevel 3

I was wondering if anyone had a good solution for purging files from all of a flow's connectors. Maybe a custom script where I could pass in a WorkspaceId and a ConnectorId and have it delete files from the Send/Receive directories.

My initial thought: is it possible to delete files across all connectors within any given flow?

I have a few CSV connectors that download 5K+ rows, split each row into its own file, and pass the data into a SQL stored procedure.

Because of the split, that single CSV becomes a new XML file for each line. As it travels down the flow, I end up with 5K+ XML files in my Send/Receive directories. Usually I get an Arc notification that the size of X directory has exceeded the recommended size.

Right now, I use the interface to click Delete All from both Input/Output tabs to keep folders clean.

 


Best answer by James B 14 September 2023, 16:26


3 replies

Userlevel 6

There are some simple solutions here. Arc has a cleanup routine that runs nightly at midnight to clean up files in the Logs, Send, and Sent folders that are older than the interval specified on the Settings (cogwheel) -> Advanced tab.

The button on that tab will run the cleanup manually, and there is an operation that you can call from ArcScript to trigger the nightly job (along with the other tasks it runs, like the certificate check):

 

<arc:call op="appDailyTask" />

 

That said, the intention here is to clean up old logs, copies of Sent items, and unsent files that have been stuck for longer than the interval. The expectation is that, in a healthy flow:

 

  • files aren’t accumulating in the Send folders; they are processed and moved along
  • files aren’t accumulating in the Receive folders; they are either passed on to the next connector in the flow, or, if the connector is the terminal point in the flow (i.e., it doesn’t flow into anything), a backend process is picking them up.

 

 

For the files in the Send folder: are you expecting those to fail and remain, or is the volume of files simply larger than what gets processed each cycle? If you have a connector where Send files are queuing up, you may want to enlarge the working set for each automation cycle by setting Automation -> Max Files to a higher value (say, from 100 to 1000 after the split) so the application works on that queue longer.

 

If files are stuck in the Send folder because they are failing to process, the cleanup routine will eventually pick them up, but you probably want to design your flow so that files you know would fail are filtered out or directed down a different path instead.

 

If you’re deleting files from the Receive folder of a connector, the best way to address that is likely to add an additional connector to the end of the flow to deal with them.

 

If you want to drop processed files off on the file system, using the File connector as a terminal point in the flow will take those files outside of Arc (you’d still be expected to manage the path you point the files to yourself).

 

If you are outputting files that you don’t want to keep (say, for example, you terminate your flow with a REST call whose response you don’t need to process), consider adding a Script connector after the terminal point in the flow and leaving its script unconfigured. An empty Script connector will “process” the file without producing output, essentially deleting it.

 

Userlevel 3

James,

Thanks for your response. I am already cleaning up old files via the advanced settings.

I’m not having any issues processing the large quantities of files; I just want to be able to purge the Send/Receive directories across all connectors of the flow (perhaps as a last step in the flow?).

 

I think you gave me my solution: the File connector. I can specify the logical path to the files and have the connector delete them after processing.

 

Thanks!

Userlevel 1

I noticed that the cleanup options are not perfect:

  • For some workspaces I have a very large number of files being processed daily; they need to be cleaned up more frequently
  • The cleanup does not take all of the files into account

 

To handle this, a PowerShell script runs every night and deletes all remaining files; a rough sketch is below.
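
For reference, it does roughly the following. This is a minimal sketch: the Arc data directory path and the one-day retention window are placeholders, so adjust both for your own install (the script simply assumes each connector keeps its Send/Receive folders somewhere under that directory).

# Sketch only: $arcDataDir is a placeholder for wherever your install
# keeps each connector's Send/Receive folders.
$arcDataDir = "C:\ProgramData\CData\Arc\data"
$cutoff = (Get-Date).AddDays(-1)

# Find every Send/Receive folder under the data directory and remove
# files older than the cutoff.
Get-ChildItem -Path $arcDataDir -Recurse -Directory |
    Where-Object { $_.Name -in @("Send", "Receive") } |
    ForEach-Object {
        Get-ChildItem -Path $_.FullName -File -Recurse |
            Where-Object { $_.LastWriteTime -lt $cutoff } |
            Remove-Item -Force
    }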
