I just went through a successful upgrade from IEE to IFS Cloud 23R2, deployed on-prem.
I have the provided IFS_INVENTORY, IFS_SALES, IFS_CRM, and IFS_PRODUCTION_PERFORMANCE models deployed for Data Mart and scheduled.
I scheduled my Data Mart Refresh Category, Data Load, and Model Processing in a database task chain; these execute successfully, and I see logs for both the data and process runs.
For some reason I don't see any data in SQL Server past the day I did the initial data load, even though the data is fresh in the MVT and BI tables.
Can someone assist with this? Please let me know if more information is needed from me.
Hi
You mentioned that this is an upgrade from IEE to IFS Cloud. In IEE we had cubes; in IFS Cloud we have Tabular models. The frameworks are different, so it is not a direct upgrade from Apps10/IEE. Have you completed all the resource creation and environment setup tasks correctly?
If the above was completed successfully:
What is the status of the load runs? Can you see new load runs created under your data load definition from the scheduled jobs? Are all the data sources showing as successful in the latest load?
If you manually trigger a data load run from the definition, does that transfer new data to the SQL Server side?
Best regards,
Subash
I currently have 4 years of data successfully transferred using the same Data Mart Categories, data load definitions, and process definitions.
- Fresh status in all Data Mart Categories
  - Stale on fast-moving facts within categories, but the timestamp matches the last job
- Success status on all runs of the data load definition
  - Log has no errors
- Success status on all runs of the process definitions
  - Log has no errors
Running manually produces the same results as above and did not transfer new data to SQL Server.
Hi
Log in to the Oracle DB as IFSINFO and query the relevant access views (the _BI views) for the facts and dimensions you are having issues with, and check whether you can see the data. The data in the _BI views is what gets loaded to the SQL Server side.
Then log in to the DB as IFSAPP and query the _DM and _OL views to check what data is visible there. If the _DM view is missing something compared to the _OL view, there is an issue with the Data Mart refresh.
It is also better to check the SQL Server tables directly, because the SQL Server views can sometimes filter out data through various conditions.
This is why I suspect the issue is at the SQL Server transfer step.
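As a rough sketch of the comparison (the fact name and date column below are illustrative; substitute the ones you are troubleshooting):

```sql
-- As IFSINFO: the access view that feeds the SQL Server load
SELECT MAX(date_created) FROM FACT_INVENT_TRANS_HIST_BI;

-- As IFSAPP: Data Mart snapshot vs. online view
SELECT MAX(date_created) FROM FACT_INVENT_TRANS_HIST_DM;
SELECT MAX(date_created) FROM FACT_INVENT_TRANS_HIST_OL;

-- If _OL is ahead of _DM, the Data Mart refresh is at fault;
-- if _DM and _BI are current but SQL Server is stale, the transfer is at fault.
```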
I dug into it more today. It turns out that I can see up-to-date values in SQL Server. For example, the `dbo.FACT_INVENT_TRANS_HIST_TM` table is fresh.
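Roughly what I ran to confirm it (the timestamp column is just illustrative; I used whatever date key the fact carries):

```sql
-- Quick check in SSMS that the landed fact table is current
SELECT MAX(date_created) AS latest_row,
       COUNT(*)          AS total_rows
FROM dbo.FACT_INVENT_TRANS_HIST_TM;
```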
But I do not see those updated values through Analysis Services.
I think it's promising that I see them in SQL Server, but I'm still not sure why the model isn't being processed.
Hi
If the Oracle _BI views and the SQL Server tables and views are updated with the latest data, we know that the Data Mart refresh and the data load to SQL Server are working as expected.
Is the model process definition created with the process mode set to “Full”? Try a “Full” refresh on the model directly using SSMS.
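As a minimal sketch, the equivalent TMSL you can run in an XMLA query window in SSMS (assuming the deployed database is named IFS_INVENTORY; check the actual name on your Analysis Services instance):

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "IFS_INVENTORY" }
    ]
  }
}
```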
Are you using an IFS-delivered sample Power BI report or a report of your own to connect? Could it be that the latest data is not visible due to some page-level or report-level filters?
Hey
I actually do see the model update when doing a “Full” refresh in SSMS! I am using Tabular Editor 3 to validate the data at the moment, so there is no visual to consider right now, but I will note that for later.
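For reference, this is roughly the DAX I run in Tabular Editor 3 to check freshness (the table and column names are illustrative, not the actual model's):

```dax
// Latest date visible in the processed model
EVALUATE
ROW ( "LatestDate", MAX ( 'Inventory Transaction History'[Date Created] ) )
```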
My Process Definitions in IFS are using “Default”. Is that my only issue?
Update
While this worked in my test environment, my prod environment is giving an error when attempting to process “Full” in SSMS.
The error:
Failed to save modifications to the server. Error returned: 'COM error: Microsoft.Data.Mashup; The given credential is missing a required property. Data source kind: SQL. Authentication kind: UsernamePassword. Property name: Password.
I’ve tried to compare things between test and prod. I changed the value for the dbo login to Administrator, like it is in test, and restarted the DB, but I am still getting the error. The consultants had their own user account in there for prod; they also set this up and would not share all the details so that I could manage it myself…so I’m a bit at a loss.
Where can I check the values it is passing for credentials?
Hi
Yes, I think the data not getting updated is related to the refresh mode “Default”. It is always good to use “Full” mode.
The username and password are set on the data source properties during the tabular model deployment.
Are you sure that the model you are trying to process is a tabular model deployed from the IFS Cloud application?
Can you try deploying the tabular model again and then retry the processing?
The data source credentials are set from the SSIS package during the model deployment and the values are taken from the SSIS catalog configuration.
Go to Integration Services Catalogs > SSISDB > [your folder name].
You can check the SQL DB name and username from this configuration; the password is a hidden field.
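If redeployment is not possible right away, one workaround is to re-stamp the credential on the model's data source yourself via TMSL. This is only a rough sketch; script your existing data source from SSMS first to get the exact names, as every value below is a placeholder, and the password always has to be typed back in because it is never scripted out:

```json
{
  "createOrReplace": {
    "object": { "database": "IFS_INVENTORY", "dataSource": "SQL/sqlhost;IFS_INVENTORY" },
    "dataSource": {
      "type": "structured",
      "name": "SQL/sqlhost;IFS_INVENTORY",
      "connectionDetails": {
        "protocol": "tds",
        "address": { "server": "sqlhost", "database": "IFS_INVENTORY" }
      },
      "credential": {
        "AuthenticationKind": "UsernamePassword",
        "kind": "SQL",
        "path": "sqlhost;IFS_INVENTORY",
        "Username": "dbo_login",
        "Password": "********"
      }
    }
  }
}
```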
Best regards,
Subash
Thanks
I have my prod environment working properly now. I was correct in changing the user credentials for SQL and SSAS; I matched the values that are in the YAML file. The final thing I did was run the environment setup again. I believe the SSAS user was the only one with incorrect credentials.
The interesting thing is that these values haven't changed and should have been in place when the consultants did the initial environment setup and model deployment prior to go-live…not certain what went wrong there, but it's spilled milk now.
As the environment setup was already done, at go-live I followed the documentation (IFS Analysis Models - Environment Parameter - Technical Documentation For IFS Cloud) and manually updated the environment setup status using the Reset Status command. Had I actually re-run the setup at go-live instead, I would have received errors and been able to fix the credentials before spending days transferring data to SQL Server.
Thank you for your help. I will mark an earlier post as the solution for this question.
Depending on how long a “Full” process definition takes, I may need to create process definitions per table instead of per database and handle the FACTs and DIMs differently; something like the sketch below is what I have in mind. In your experience, do you always do “Full”, and what has been your downtime with this method?
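A rough per-table TMSL sequence (database and table names are illustrative; TMSL also offers lighter refresh types such as "dataOnly" and "add" that could handle large facts incrementally):

```json
{
  "sequence": {
    "maxParallelism": 2,
    "operations": [
      {
        "refresh": {
          "type": "full",
          "objects": [ { "database": "IFS_INVENTORY", "table": "Dim Site" } ]
        }
      },
      {
        "refresh": {
          "type": "full",
          "objects": [ { "database": "IFS_INVENTORY", "table": "Inventory Transaction History" } ]
        }
      }
    ]
  }
}
```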
Hi
Glad you were able to solve the issue.
I only have experience with test data sets for “Full” refresh. With those, a full refresh takes a few minutes at most, or less than a minute in most cases, so I am not in a position to comment on the time taken for an actual customer production data set.
BR,
Subash