Hi,

 

I have set up an inbound integration to listen to a folder and then import the data from a .csv file.

Sometimes it’s fine, but other times I’m seeing the data get duplicated.

In FSM, going to Integration » Integration Monitor » Monitor Instances, I can see the duplicated instances.

Example: [screenshot of the duplicated instances in Monitor Instances]

Looking at the logs, it’s almost as if the file is still hanging around: it does get moved to the processed location, but perhaps not quickly enough, so the listener is picking it up again.
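To illustrate the race I suspect (a rough Python sketch only, not how FSM works internally; import_csv, the INBOX/PROCESSED layout and the .processing suffix are all made up), a listener that atomically claims a file before importing it would skip a second event for the same file instead of processing it twice:

import os
import shutil

INBOX = r"\\ServerLocation\aw_purchase_supplier_receipt\Pilkingtons"
PROCESSED = os.path.join(INBOX, "processed")

def import_csv(path):
    """Placeholder for the actual CSV import step."""

def handle_event(filename):
    """Handle one watcher event for a .csv dropped in INBOX.
    Claiming the file with an atomic rename first means a second
    event for the same file fails the rename and is skipped,
    rather than the data being imported twice."""
    src = os.path.join(INBOX, filename)
    claimed = src + ".processing"
    try:
        os.rename(src, claimed)  # atomic claim on the same volume
    except FileNotFoundError:
        return  # another event already claimed or moved the file
    import_csv(claimed)
    shutil.move(claimed, os.path.join(PROCESSED, filename))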

 

Timestamp: 2024-02-16T16:51:25.5157945+00:00
Message: Error during complete operation: Could not find file '\\ServerLocation\aw_purchase_supplier_receipt\Pilkingtons\Pilks_2_Posted_240213_193554_04.csv'.
Machine: ServerLocation
Process Name: w3wp
Thread Name: FSMServerIntegrationMonitor~FolderWatcherPool
Message Source: AW_PURCHASE_SUPPLIER_RECEIPT_PILKS

 

This example showed as an exception; however, the data got processed twice, as per the screenshot above, and that’s confirmed by the fact that I have double the number of records in the database that I’m expecting.
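An idempotency guard along these lines would catch this kind of double import; the Python below is only a rough sketch of the idea (import_csv and the sqlite tracking table are made up, not FSM functionality):

import os
import sqlite3

def import_csv(path):
    """Placeholder for the actual CSV import step."""

db = sqlite3.connect("imported_files.db")  # hypothetical tracking store
db.execute("CREATE TABLE IF NOT EXISTS imported (filename TEXT PRIMARY KEY)")

def import_once(path):
    """Import a file only if its name has not been seen before,
    so a second pickup of the same file becomes a no-op."""
    name = os.path.basename(path)
    try:
        db.execute("INSERT INTO imported (filename) VALUES (?)", (name,))
        db.commit()
    except sqlite3.IntegrityError:
        return False  # filename already recorded - skip the duplicate
    import_csv(path)
    return True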

 

Has anyone else experienced this, and if so, is there a fix or alternative process to try?

 

Cheers, Ady

Hi @AdrianEgley,

Were you able to resolve this problem? If not, is it only happening for a specific inbound integration, or is it common to all the other inbounds? And did you check for any BR or configuration that could be duplicating the inbound process?


Hi @SAMLK,

It is still happening, and it’s something I have noticed in the past: false negatives at the server log level, i.e. the log referencing a file that no longer exists because it has already been processed.

 

However, to get around this, instead of the folder listener always being on, I have configured it to check every ‘x’ minutes, and that appears to stop any duplication of files attempting to be processed. I don’t have any BRs in place for integrations like this; it is a straightforward import of a dataset.
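In effect the schedule behaves like a poll that only picks up files once they have settled. A rough Python sketch of the idea (WATCH_DIR, the settle check and import_and_archive are all hypothetical, not the FSM implementation):

import glob
import os
import time

WATCH_DIR = r"\\ServerLocation\aw_purchase_supplier_receipt\Pilkingtons"
POLL_MINUTES = 5  # stands in for the 'x' minutes in the schedule

def import_and_archive(path):
    """Placeholder for the import and move-to-processed step."""

def stable_csv_files(directory, settle_seconds=10):
    """Return .csv files whose size has stopped changing.
    Sampling the size twice skips files that are still being
    written or moved - the case a continuous listener trips over."""
    sizes = {path: os.path.getsize(path)
             for path in glob.glob(os.path.join(directory, "*.csv"))}
    time.sleep(settle_seconds)
    return [path for path, size in sizes.items()
            if os.path.exists(path) and os.path.getsize(path) == size]

while True:
    for path in stable_csv_files(WATCH_DIR):
        import_and_archive(path)
    time.sleep(POLL_MINUTES * 60)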

 

Ady



Hi @AdrianEgley 

Are you able to replicate the same problem in another environment (Test or Dev)? If you can, I’d suggest you open a support incident for this.


@SAMLK - yeah, I’ve already got a ticket open.

I was hoping someone else out there had experienced similar issues.

 

However, I have put it on a schedule to check the folder, and it appears to have stopped the files being duplicated.

