Question

Supply chain order analysis table

  • 7 March 2024
  • 2 replies
  • 51 views

Hi everyone,

This has already been asked in a couple of other questions, but without a definitive solution.

Our business need is to have a complete and updated view of the supply chain for customer orders.

This could be provided by the supply_order_analysis_tab, but we know it is a temporary table, populated only when a user manually triggers it from a customer order line. This means that not all order lines are included, and the chains that are included might not be up to date.

I have created SQL queries using demad_ref_code and order_ref, and it works, but only to some extent.
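
Roughly, the idea is to walk the demand/supply references from one order line to its supplies with a recursive query. A simplified sketch of what I mean (the table name order_supply_link is illustrative, not a real object; demad_ref_code and order_ref are the columns mentioned above):

-- Walk the chain downstream from one starting reference.
WITH chain (order_ref, lvl) AS (
  SELECT l.order_ref, 1
  FROM   order_supply_link l
  WHERE  l.demad_ref_code = :start_ref          -- the customer order line we start from
  UNION ALL
  SELECT l.order_ref, c.lvl + 1
  FROM   order_supply_link l
  JOIN   chain c ON l.demad_ref_code = c.order_ref   -- each supply becomes the next demand
)
SELECT * FROM chain;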

Does anyone have the same need? Is there any solution, out-of-the-box or customized, a workaround, or any other known way?

We tried to trigger the tree generation procedure for all order lines and persist the result in a custom table, but that is obviously too heavy in terms of required resources and fails. Could an incremental solution work? What would it need to consider?
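
To make "incremental" concrete, I mean something like re-running the generation only for order lines changed since the previous run, e.g. (the refresh-log table below is a placeholder of mine, not an IFS object, and the timestamp column should be verified on your version):

-- Hypothetical incremental pass: only order lines touched since the last run.
SELECT col.order_no, col.line_no, col.rel_no
FROM   customer_order_line col
WHERE  col.rowversion > (SELECT last_run_ts
                         FROM   my_refresh_log               -- placeholder bookkeeping table
                         WHERE  job_name = 'SUPPLY_CHAIN_REFRESH');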

We are now on App9 and have just started the project to move to IFS Cloud. From our first contact with Cloud, it seems that the behavior is the same: can anyone already on Cloud confirm?

Thanks for any feedback!

David

2 replies

It does behave the same in Cloud, and for the same reasons mentioned in other threads (and as you have seen yourself). It is a dynamic table generated per user, based on their origination point into the tree. Doing this as a static table across multiple companies and sites, for hundreds of thousands of orders and possibly millions of order lines (it would have to contain all order history as well, even for closed orders), would require a massive amount of server resources to run and maintain.

Hi David,

I had similar problems with temporary tables, but was able to solve them the “old” ETL way (see the sketch after this list):

  1. Fetch/extract the data from Oracle into MS SQL Server.
  2. Transform it, e.g. into a FACT_CustomerOrderLine table.
  3. Transform the DIM_ tables.
  4. Create (once) a (Sales) cube.
  5. Load the data from the star schema in the DWH into the cube.
  6. Do this every night.
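
The nightly fact load itself can be as simple as this (a minimal T-SQL sketch, assuming the rows extracted from Oracle have already been staged into a stg_customer_order_line table; all object names here are illustrative):

-- Rebuild the fact table from the staged Oracle extract (names illustrative).
TRUNCATE TABLE dbo.FACT_CustomerOrderLine;

INSERT INTO dbo.FACT_CustomerOrderLine (order_no, line_no, part_no, qty, load_date)
SELECT s.order_no,
       s.line_no,
       s.part_no,
       s.qty,
       CAST(GETDATE() AS date)    -- tag each nightly run
FROM   dbo.stg_customer_order_line s;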

Performance?
It should be very good, as long as you avoid fetching through the APIs.
That is not always possible, but to give an example:

Instead of using MCCOM_SYSTEM_EVENT_API.Get_Description, just
use the “underlying” table and left outer join it.
API “results” will mostly come back with a 4000-character width :-(
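
Something like this (the underlying view name and its columns here are assumptions, so verify them against your schema):

-- Slow: one PL/SQL call per row, returned as a 4000-character value.
SELECT t.event_id,
       MCCOM_SYSTEM_EVENT_API.Get_Description(t.event_id) AS event_desc
FROM   some_source_table t;                -- placeholder source table

-- Faster: left outer join the underlying view directly.
SELECT t.event_id,
       mse.description AS event_desc
FROM   some_source_table t
LEFT OUTER JOIN mccom_system_event mse     -- assumed underlying view
       ON mse.event_id = t.event_id;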

[Screenshot: the time needed to create the FACT_CustomerOrderLine table in our DWH.]

So, around 5,000 records per second.
At that rate, even ten million order lines load in roughly half an hour.
And the BI server, with its 64 GB of RAM, is not that oversized!

So I agree with Shawn, as long as you stay within the Oracle DB.
Using copied data in MS SQL Server is not always this simple, but you can do whatever you want with your data, whenever you want.

Ahhh, just another example.

Why do we keep an IPIS (inventory part in stock) snapshot for every day?
You could (if you want) see every timestamp from go-live up to this night/morning.
But we also take the deltas (the plus and minus of every part in every location) and get the “streams” of parts flowing in and out of all the locations.
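
The delta part is then just a comparison of two consecutive snapshots, roughly like this (T-SQL, all object names illustrative):

-- Quantity deltas per part and location between yesterday's and today's
-- daily IPIS snapshots. FACT_Ipis and its columns are illustrative names.
DECLARE @today     date = CAST(GETDATE() AS date);
DECLARE @yesterday date = DATEADD(day, -1, @today);

SELECT COALESCE(t.part_no, y.part_no)         AS part_no,
       COALESCE(t.location_no, y.location_no) AS location_no,
       COALESCE(t.qty_onhand, 0)
         - COALESCE(y.qty_onhand, 0)          AS qty_delta   -- plus = inflow, minus = outflow
FROM  (SELECT * FROM dbo.FACT_Ipis WHERE snapshot_date = @today)     AS t
FULL OUTER JOIN
      (SELECT * FROM dbo.FACT_Ipis WHERE snapshot_date = @yesterday) AS y
  ON  y.part_no     = t.part_no
  AND y.location_no = t.location_no;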

HTH

Best wishes

Michael