
Our Finance team wants to see invoice data in our Tableau reports. It seems like a simple request, but we're unclear on the best (or any relatively simple and stable) way to make invoice data from IFS Cloud automatically available to Tableau. A replication delay of a few hours would be fine.

Before switching to IFS, we used the ETL service FiveTran to copy tables from our on-site MRP's database to GCP BigQuery and reported from there. It worked well for us. We're learning there isn't really a way to expose our IFS Cloud tables to FiveTran in the same fashion, but we're still hoping to replicate select tables to BigQuery without a lot of intermediate infrastructure to maintain.

We’re aware of IFS Connect, but don’t see how it can be used to reliably keep an external data set in sync with an IFS table.

I see a feature called Access Views. Is it relatively simple to keep a SQL Server table in sync with an Access View?

Any advice would be appreciated.  Thank you!

Hi,

 

Access Views are definitely a candidate.

They are usually used to extract data from IFS into a BI system. You might need a license for the IFS BI component depending on exactly what you'll do; please check with your sales contact.

 

Access Views are read-only, but it sounds like that's all you need. They are generated on top of Information Sources, usually as an online view unless the amount of data is extensive. Reading is usually done with the IFSINFO user.

If you are on managed cloud at 23R2 or older, you'll need a VPN to access the database. In 24R1 there is a hosted BI option as well, but I'm not sure that is a fit for you if you're just looking to integrate with FiveTran and/or BigQuery. Please share your version if you'd like more thoughts on this.


Would it help if a .csv file were created and exported on a schedule to a server location where the third-party system can pick it up for loading? If that's feasible, 'File Out' migration jobs can be created and scheduled. You may also want to consider incremental loads by using the 'where' clause on the migration job so that only new data is loaded.
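To illustrate the incremental side of that, here is a rough Python sketch of consuming such a scheduled .csv export and keeping only rows newer than the last watermark. The column names (INVOICE_ID, LAST_MODIFIED) are made up for the example, not actual IFS columns; the same filter expressed as the migration job's 'where' clause would be something like `last_modified > :watermark`.

```python
import csv
import io
from datetime import datetime

def rows_newer_than(csv_text, watermark):
    """Yield rows whose (hypothetical) LAST_MODIFIED column is after the watermark."""
    reader = csv.DictReader(io.StringIO(csv_text))
    for row in reader:
        modified = datetime.fromisoformat(row["LAST_MODIFIED"])
        if modified > watermark:
            yield row

# Tiny fake export to show the idea:
sample = (
    "INVOICE_ID,LAST_MODIFIED\n"
    "1001,2024-05-01T08:00:00\n"
    "1002,2024-05-02T09:30:00\n"
)
new_rows = list(rows_newer_than(sample, datetime(2024, 5, 1, 12, 0)))
# Only invoice 1002 is newer than the watermark
```

After each successful load you would persist the maximum LAST_MODIFIED seen as the new watermark, so the next run only picks up rows added since.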



Thank you for this, it sounds promising! We do already have an Access View defined for the data we require. We're not yet on 24R1, so we'll look into setting up a VPN.

The documentation (Index - Technical Documentation For IFS Cloud) seems to indicate Access Views are something only SQL Server can use. So I expect we can't connect directly with Tableau or FiveTran; we'd need to run SQL Server somewhere, have it connect to IFS Cloud via VPN, and replicate the data available from the Access View. Our BI tools would then access the data from that SQL Server. Do you know of any documentation or guides for accessing/replicating the Access View once we have a VPN connection established?



A scheduled 'File Out' export could be a solution for us, thank you for the suggestion!

To test this, I'm trying to follow the documentation here: https://docs.ifs.com/techdocs/22r2/030_administration/050_data_management/050_data_migration/030_migration_types/010_file_migration/030_create_output_file/#write_data_to_a_server_file

 

However, it doesn't seem to let me specify a file name. When I type in TEST_FILE_NAME.csv I see a warning "Could Not Find Unique..." and the field remains blank.

And when I try to execute the job, it says "No details found".

Please let me know if you have any guidance for resolving these.  Thank you!




Hi,

We have faced the same challenge. We have all our analytics in Snowflake and wanted to move IFS data there, and were promptly guided to the standard self-hosted BI setup in our Azure tenant, with the added complexity of SQL Server instances. For our needs this was slow and costly.

We have since realized there is a far more flexible option: direct Oracle access through the VPN tunnel to IFS Cloud. Our setup is now a simple VM running in Azure with an integration runtime for Oracle, using Data Factory to push relevant data to Snowflake when needed. I highly recommend this setup if you have no need for the functionality the self-hosted BI solution brings.
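As a loose illustration of the staging step in a setup like this (not the actual pipeline above; the chunk size and columns are arbitrary), rows pulled over the Oracle connection can be written out as CSV chunks for whatever bulk loader sits downstream, such as a Data Factory copy activity or Snowflake's COPY INTO:

```python
import csv
import io

def to_csv_chunks(rows, columns, chunk_size):
    """Split a list of row dicts into CSV file contents of at most chunk_size rows."""
    chunks = []
    for start in range(0, len(rows), chunk_size):
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=columns)
        writer.writeheader()
        writer.writerows(rows[start:start + chunk_size])
        chunks.append(buf.getvalue())
    return chunks

rows = [{"id": i, "amount": i * 10} for i in range(5)]
chunks = to_csv_chunks(rows, ["id", "amount"], chunk_size=2)
# 5 rows at chunk_size=2 produce 3 chunk files
```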

Kind regards,
/Magnus



Hi @bcardellini 

Here is another way to do it, which we use every day at CVE through our ETL software TALEND:

* We query the IFS Cloud 22R1 platform using the IFS Cloud standard APIs.
* We store the results of those queries in our own dedicated MS SQL Server database.
* Our Power BI dashboards then connect to that dedicated MS SQL Server database as their data source.
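For anyone sketching that first step: the IFS Cloud standard APIs are OData-style projections, and paging through a collection can be illustrated in a few lines of Python. The URLs below are placeholders (not verified IFS endpoint names), and `get_json` stands in for a real authenticated HTTP call:

```python
def fetch_all(base_url, get_json):
    """Follow @odata.nextLink until the collection is exhausted.

    get_json is any callable that takes a URL and returns the decoded
    OData payload (e.g. a thin wrapper around an HTTP client with auth).
    """
    url = base_url
    rows = []
    while url:
        payload = get_json(url)
        rows.extend(payload.get("value", []))
        url = payload.get("@odata.nextLink")  # absent on the last page
    return rows

# Fake two-page response to show the control flow:
pages = {
    "https://example/Invoices": {
        "value": [{"InvoiceId": 1}],
        "@odata.nextLink": "https://example/Invoices2",
    },
    "https://example/Invoices2": {"value": [{"InvoiceId": 2}]},
}
rows = fetch_all("https://example/Invoices", pages.get)
# rows contains both pages: InvoiceId 1 and 2
```

The fetched rows would then be upserted into the staging database, which is the part an ETL tool like TALEND handles for you.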


Hi @bcardellini 

Have you checked out MS SQL Server's Linked Server settings?
You have to install an Oracle client on a BI server that hosts an MS SQL database (your DWH).
Look into the ASFU (Oracle/IFS) licence. You will find that you are allowed to use third-party tools such as ODBC to read (read only!) your own Oracle data.
That could be the predefined IFS IALs (views), standard IFS views (which you can see in your System Info), or you can fetch "raw" table data and use the ETL approach of filling extr_IFS_xxx_TAB tables in your MS SQL DWH database. Why it is a good idea to use "raw" data is something we can discuss in detail.

Transform the data into DIM_ and FACT_ tables. Create your own cubes (multidimensional and tabular) for all the front ends. Use TALEND or other tools to enhance and automate the ETL process.
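As a toy illustration of that DIM_/FACT_ split (the column names are made up for the example, not the real extr_IFS_xxx_TAB layout), raw extracted rows can be split into a dimension table with surrogate keys and a fact table that references them:

```python
def to_star_schema(raw_rows):
    """Split raw rows into DIM_CUSTOMER-style and FACT_INVOICE-style tables."""
    surrogate = {}            # natural key -> surrogate key
    dim_rows, fact_rows = [], []
    for row in raw_rows:
        key = row["CUSTOMER_ID"]
        if key not in surrogate:
            surrogate[key] = len(surrogate) + 1
            dim_rows.append({
                "customer_sk": surrogate[key],
                "customer_id": key,
                "name": row["CUSTOMER_NAME"],
            })
        fact_rows.append({
            "customer_sk": surrogate[key],
            "invoice_id": row["INVOICE_ID"],
            "amount": row["AMOUNT"],
        })
    return dim_rows, fact_rows

raw = [
    {"CUSTOMER_ID": "C1", "CUSTOMER_NAME": "Acme", "INVOICE_ID": 1, "AMOUNT": 100.0},
    {"CUSTOMER_ID": "C1", "CUSTOMER_NAME": "Acme", "INVOICE_ID": 2, "AMOUNT": 50.0},
]
dims, facts = to_star_schema(raw)
# One DIM row for customer C1, two FACT rows for its invoices
```

In practice this transform would run inside the DWH (T-SQL, TALEND jobs, etc.); the sketch just shows the shape of the result.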

This solution runs on on-premise installations as well as cloud installations.
A few steps are needed to create a VPN tunnel to your cloud Oracle DB, and you can certainly use an Azure environment as well.
 

HTH

Michael

