
Afternoon,

Has anyone created a data migration job that imports a file from a specified file location on a daily basis? If so, could you share how you did it?

Scope:

This migration job will pick up a file from a specified location, import it, and then archive the file upon completion. The job will run daily at 11 pm.

I can create the data migration job to import the file using the usual manual methods; however, I haven't figured out how to automate the pickup from a location.

We’re currently using IFS Apps 10; however, we’re looking to move to the cloud in the near future.

API to be triggered: PROJECT_TRANSACTION_API

Thanks!

Ryan

Hi @RyanK,

Please have a look at the IFS Technical Documentation for IFS Applications 10:

https://docs.ifs.com/techdocs/

The necessary information can be found on this page:

https://docs.ifs.com/techdocs/foundation1/040_administration/260_data_management/050_data_migration/002_migration_types/010_file_migration/020_create_table_from_file/

...and within that page, please search for “Create table from file - OnServer”.

I also recommend using two migration jobs for this purpose:

  • First migration job (CREATE_TABLE_FROM_FILE), which loads your file
  • Second migration job (MIGRATE_SOURCE_DATA), which fetches the data from the first migration job. Set the rule CONNJOB to Active and enter the name of your first migration job in the Rule Value field. Afterwards you can schedule the second migration job.

Prerequisite: the first migration job must use File Location OnServer and a fixed file name.
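If you prefer to define the daily 11 pm run at the database level rather than through the client's scheduling screens, a plain Oracle DBMS_SCHEDULER job can trigger it. A minimal sketch only: Start_Second_Mig_Job and MIG_JOB_2 below are hypothetical placeholders, not an IFS API; you would wrap your installation's actual migration-job start routine.

-- Sketch only: nightly 23:00 trigger via standard Oracle DBMS_SCHEDULER.
-- Start_Second_Mig_Job is a hypothetical wrapper around the actual
-- migration-job start routine; MIG_JOB_2 is a placeholder for the name
-- of your second migration job.
BEGIN
   DBMS_SCHEDULER.CREATE_JOB(
      job_name        => 'DAILY_FILE_IMPORT',
      job_type        => 'PLSQL_BLOCK',
      job_action      => 'BEGIN Start_Second_Mig_Job(''MIG_JOB_2''); END;',
      start_date      => SYSTIMESTAMP,
      repeat_interval => 'FREQ=DAILY;BYHOUR=23;BYMINUTE=0;BYSECOND=0',
      enabled         => TRUE);
END;
/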

/Jens


@RyanK Another approach that should work is to prepare two migration jobs as @Jens describes. In the method of the second migration job, you then define that migration job 1 must run before it starts. Migration job 2 is the one that needs to be scheduled.

To make sure that you process each file only once, I would delete the file on the server once it has been read (there is an option for this in the rules). Bear in mind, though, that if the file could not be processed correctly, you won't have a backup. So instead of deleting, a rename (giving the file a date-and-time extension or something along those lines, see the documentation) keeps a backup. That last option is nice, but you will need a monitor function to remove files that are no longer needed now and then (for example, files older than two months). This could be a server service task, as sketched below.
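As an illustration of such a housekeeping routine, here is a minimal sketch of a database-level scheduler job that removes archived files older than roughly two months. The archive path is an assumption (point it at the directory your rename rule writes to), and depending on your Oracle version, external jobs may additionally require a credential (DBMS_SCHEDULER.CREATE_CREDENTIAL).

-- Sketch only: weekly cleanup of archived migration files older than 60 days.
-- The path /u01/ifs/migration/archive is an assumption; use the directory
-- your rename rule actually writes the processed files to.
BEGIN
   DBMS_SCHEDULER.CREATE_JOB(
      job_name            => 'CLEAN_MIGRATION_ARCHIVE',
      job_type            => 'EXECUTABLE',
      job_action          => '/bin/sh',
      number_of_arguments => 2,
      start_date          => SYSTIMESTAMP,
      repeat_interval     => 'FREQ=WEEKLY;BYDAY=SUN;BYHOUR=1',
      enabled             => FALSE);   -- arguments must be set before enabling
   DBMS_SCHEDULER.SET_JOB_ARGUMENT_VALUE('CLEAN_MIGRATION_ARCHIVE', 1, '-c');
   DBMS_SCHEDULER.SET_JOB_ARGUMENT_VALUE('CLEAN_MIGRATION_ARCHIVE', 2,
      'find /u01/ifs/migration/archive -type f -mtime +60 -delete');
   DBMS_SCHEDULER.ENABLE('CLEAN_MIGRATION_ARCHIVE');
END;
/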

