Topics started by durette
We’re upgrading from IFS Applications 9 to IFS Cloud. Along the way, the scripts drop the Oracle accounts for our end users. The application is missing a lot of reporting capabilities, and without the SQL Query Tool, we’d like to give read-only access to the database so certain users can get their work done. How do you recommend we go about granting read-only database access? From an ASFU licensing perspective, can I use two different usernames for the same named user, to keep the FND user and Oracle user segregated?
This feels like a n00b question... I'm writing a report in SQL. What is the best JOIN clause from the inventory transaction history (INVENTORY_TRANSACTION_HIST) to the customer order invoices (CUSTOMER_ORDER_INV_JOIN)? I want to see the inventory transaction history so I can see all the cost elements on an invoiced order line. I need a solution capable of handling multiple deliveries on the same line. The closest I've gotten so far is to use the proximity of the invoice and delivery dates, but that doesn’t feel completely right. I don't think I need to handle credits (yet).
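One common approach (a sketch only, not a confirmed answer): if the invoice lines carry the same order/line/release keys as the inventory transactions, plus a delivery number, joining on the delivery number handles multiple deliveries per line without relying on date proximity. The column names below (SOURCE_REF1..4 mapped to order/line/release/delivery, and the OESHIP transaction code) are assumptions about the schema and should be verified against your version:

```sql
-- Hedged sketch: join on the order line keys plus the delivery number,
-- so each delivery on a line pairs with its own invoice row.
-- SOURCE_REF1..4 and DELIV_NO are assumed column names -- verify locally.
SELECT ith.*,
       inv.invoice_no
FROM   inventory_transaction_hist ith
JOIN   customer_order_inv_join    inv
  ON   inv.order_no   = ith.source_ref1
 AND   inv.line_no    = ith.source_ref2
 AND   inv.release_no = ith.source_ref3
 AND   inv.deliv_no   = ith.source_ref4   -- delivery-level key, if present
WHERE  ith.transaction_code = 'OESHIP';   -- hypothetical shipment filter
```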
Today with Apps 9, we have multiple DEV environments hosted on a single application server and on a single database server. We keep them separated with different database service names and different port numbers. (Production has its own set of servers, as does our final UAT environment.) When we move to a Remote deployment of IFS Cloud, is this still going to be feasible? Will we be able to host multiple nonproduction instances on a single server? ...and would we still want to?
We are in the process of migrating our Exchange 2013 on-prem email environment to Exchange Online. We currently use IFS 9 with multiple IFS Sales and Marketing 8 GSS instances to synchronise between SaM and our sales teams' appointments, tasks, and contacts. These have been and continue to work between our IFS environment and our on-prem Exchange mailboxes.

Our Exchange admin used one of our IFS testing environments, checked GSS is working as expected on-prem, then migrated the master account (and its mailbox) and a test user with its mailbox into AzureAD and Exchange Online. He followed the instructions to successfully configure the Impersonation role and apply this to the master account. Both the master and test accounts have Exchange licenses applied to them, and basic authentication is enabled. He logged in to Outlook online successfully as the master account to confirm it is working and accessible with the correct password. While running ConnectionTester from the GSS server console
The 21R2 documentation now makes a reference to a “File Storage Service” repository type. How does this work? How do I set this up? How do end users use it?
https://docs.ifs.com/ifsclouddocs/21r2/CreateAndMaintainDocument/ActivityEnterRepositoryLocation.htm
@Mathias Dahl(?)
I want to create a custom LU that I can link to multiple kinds of objects (customers, suppliers, orders, invoices… whatever). It has a column CF$_PARENT_OBJKEY that I’m using to hold the OBJKEY for the linked object.

When I create a custom tab in IEE, OBJKEY isn’t in the dropdown for the Parent Key, but it doesn’t complain when I enter it manually. (I confirmed a nonsensical field name isn’t allowed, proving that OBJKEY is passing some kind of validation.) Ironically, OBJKEY is one of the allowed fields on the Child Key. However, even with this column set up as the link, the custom tab doesn’t seem to be filtering.

We’d like to pilot business processes around this now in IEE, but we’ll definitely want it for our upgrade to Cloud that’s already in progress. Can I set up an OBJKEY custom tab link in IEE? Can I set up an OBJKEY custom tab link in Aurena?
We use LDAP for user authentication against Active Directory. We've found that when a user's password contains a space or a period, IFS won't let them log in, even when their password is correct.
1. What is the full list of characters that aren't allowed?
2. Is there a way to configure this to work?
We’re looking to upgrade from IFS Applications 9 to IFS Cloud. We’re currently using IFS Sales and Marketing, but with this upgrade, we’ll move to Embedded CRM.

Our products are carefully selected for each customer and are often highly engineered. We’re a global company offering different product lines manufactured in each facility, but our product lines are similar enough for an untrained person to potentially select the wrong one. When a business opportunity comes in, some engineering work is necessary before we know which site’s products would best suit the customer’s needs. We don’t want to do this engineering work until we have some reasonable justification for the cost of this work, but we do want to enter each business opportunity into IFS as soon as we can. Our best work-around on the table so far is to create a company called “CRM” with a single site called “CRM” where we can dump all our opportunities. This ensures the Business Opportunity doesn’t convey any more information t
We use Active Directory with a Windows domain that is older than 2016, so OpenID Connect isn’t available to us natively yet. If we don’t upgrade our domain before we upgrade to IFS Cloud, is it still possible to use AD to authenticate our users? Is there a third-party tool that we can use to relay requests and expose our LDAP endpoint as the OpenID Connect protocol? I think my question is essentially asking the same thing as this one, but some extra details in the context of IFS could be very useful to me here, please.
Active Directory as OpenID provider? - Stack Overflow
Keycloak is mentioned here as an option; has anyone here used it?
If I have a Custom Entity, and if I create a Generated Projection from that, how do I expose that projection through the Integration cluster (int) to enable basic authentication? In the Projection Configuration, I'm only seeing the "Users" and "ExternalB2B" categories enabled with no option to change these from inside Aurena.
It's often useful to know which table contains the data you're seeing in the system. This is helpful both for knowing where to get data for a report and for knowing which LU's API to call to make a change.

IFS Enterprise Explorer has the System Info tab for this, but this doesn't always show these details, particularly under the Solution Manager. What we can do instead is make a change inside the application--in the relevant screen and field that we want to learn about--then look inside the database to see what recently got changed.

Step 1:
BEGIN
   dbms_stats.flush_database_monitoring_info;
END;
/

Step 2: Make your change in the application.

Step 3:
-- (Again)
BEGIN
   dbms_stats.flush_database_monitoring_info;
END;
/

Step 4:
-- As app owner, e.g. IFSAPP
SELECT *
FROM   user_tab_modifications
WHERE  timestamp = (SELECT MAX(timestamp) FROM user_tab_modifications);

Between steps 1 and 3, you'll want to work quickly to ensure only your changes get captured, either in a nonproduction environment or during
How do Aurena image controls work? In the Aurena page designer, I see an option to set a control as type “image”. How does this work? Since BLOBs aren’t supported as custom fields, I tried filling in this field with some SVG text to see if my web browser would like that.

<?xml version="1.0" encoding="utf-8"?>
<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100" viewBox="0 0 100 100">
<circle cx="50" cy="50" r="40" stroke="black" stroke-width="3" fill="red" />
</svg>

Aurena puts in a placeholder image no matter what I seem to set this field to… How is this supposed to work? Is this improved in IFS Cloud?
Certain IT users need access to log in to a nonproduction environment as any arbitrary user. We use LDAP authentication against Active Directory for all our environments. By resetting the user's database password, though, we can allow the system to fail LDAP authentication, then pass database authentication.

Under certain circumstances I have yet to fully understand, sometimes logging in as a user this way causes their AD account to get locked. Even after we log out of IFS as that user, the system still attempts AD authentication repeatedly. When this happens, the only solution I have yet discovered is to restart the middleware.

Is this preventable? When it happens, is there a less severe solution than restarting the middleware?
If I take our order backlog as of tonight at midnight, subtract from that our backlog as of this morning at midnight, then add today's revenue, I will get today's net bookings.

Σ Bookings = Δ Backlog + Revenue

This calculation method accurately handles both order line cancellations and intentional shipping differences. Does anyone else here do this?
What are you using to get a daily snapshot of your backlog?
How are you handling different time zones?
Is your solution easy to maintain/change?
Is it durable?
Is it simple and trustworthy?
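The identity above can be sketched in SQL. The table and column names here (a nightly BACKLOG_SNAPSHOT table and a DAILY_REVENUE source) are hypothetical placeholders for whatever snapshot mechanism you use, not IFS tables:

```sql
-- Net bookings for today = (tonight's backlog - this morning's backlog)
--                          + today's revenue.
-- backlog_snapshot and daily_revenue are assumed, illustrative tables
-- populated by your own nightly snapshot job.
SELECT (tonight.backlog_value - morning.backlog_value)
       + rev.revenue_value AS net_bookings
FROM   backlog_snapshot tonight,
       backlog_snapshot morning,
       daily_revenue    rev
WHERE  tonight.snapshot_date = TRUNC(SYSDATE) + 1   -- tonight at midnight
AND    morning.snapshot_date = TRUNC(SYSDATE)       -- this morning at midnight
AND    rev.revenue_date      = TRUNC(SYSDATE);
```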
With Enterprise Application Search dropped from Aurena, I came up with a way to crudely replicate this feature using just straight relational Oracle, with no add-ins.

This scans every leftmost column of every index, using both primary and secondary indexes. It only returns results where the table has a _PK index associated with it. I excluded ROWKEY to speed it up a bit, but it’s still somewhat slow. As well as scanning common indexes, it also scans any UPPER() function-based indexes you might have. It’s otherwise case-sensitive.

The system load on a scan this big is fairly risky, so it only does an exact match. Users typically don’t understand why HelloWorld% is so much faster than %HelloWorld, and I didn’t want to spend time parsing the expression for safety. I recommend using this for specific searches. Don’t use an internal company ID, for example.

You might implement this with a SQL Quick Report like this:
SELECT * FROM TABLE(c_enterprise_search_api.get_search_results('&SEARCH_STR
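The driving query behind a scan like this might look as follows. This is a hedged sketch of the approach described above (leftmost index columns, tables with a _PK index, ROWKEY excluded), not the poster's actual c_enterprise_search_api code; the dynamic per-column search it would feed is omitted:

```sql
-- Candidate columns to search: the leftmost column of every index,
-- restricted to tables that have a _PK index, skipping ROWKEY.
-- Run as the application owner; each row would then drive one dynamic
-- SELECT ... WHERE <column> = :search_str (or UPPER(<column>) for FBIs).
SELECT ic.table_name,
       ic.index_name,
       ic.column_name
FROM   user_ind_columns ic
WHERE  ic.column_position = 1
AND    ic.column_name <> 'ROWKEY'               -- excluded for speed
AND    EXISTS (SELECT 1
               FROM   user_indexes pk
               WHERE  pk.table_name = ic.table_name
               AND    pk.index_name LIKE '%\_PK' ESCAPE '\');
```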
We have an External Files background job that calls External_File_Utility_API.Execute_Batch_Process2. It runs every half hour or so to look for invoices. When it doesn’t find the file, it throws a background job error.

ORA-20110: ExternalFileUtility.File MY_DIRECTORY\MY_FILE.CSV not exist

We’re getting more serious about managing our background job errors, and this one is generating quite a bit of noise. I’m trying to improve the signal-to-noise ratio with the ultimate goal of making every error actionable. Is there a way to configure it to fail silently or gracefully when the file isn’t there, rather than throw this ugly error?
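One possible workaround, sketched below under assumptions: schedule a small custom wrapper instead of the standard call, and swallow only the "file not exist" error (ORA-20110) so every other failure still surfaces. The procedure name and the single-parameter signature shown here are illustrative, not the installed Execute_Batch_Process2 signature; match the real parameter list in your environment:

```sql
-- Hedged sketch of a quiet wrapper. c_execute_batch_quiet_ and the
-- simplified parameter list are assumptions for illustration only.
CREATE OR REPLACE PROCEDURE c_execute_batch_quiet_ (
   file_template_id_ IN VARCHAR2 ) IS
BEGIN
   External_File_Utility_API.Execute_Batch_Process2(file_template_id_);
EXCEPTION
   WHEN OTHERS THEN
      IF SQLCODE = -20110 THEN
         NULL;   -- no file this interval: treat as a normal, silent outcome
      ELSE
         RAISE;  -- anything else remains a real, actionable error
      END IF;
END c_execute_batch_quiet_;
/
```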
Constantly changing business requirements put pressure on IT teams to change code frequently. In today’s globally integrated environments, however, the idea of a quiet maintenance downtime period is becoming obsolete. Thus, it’s necessary to be able to make changes on a hot running system.

Everything I've read about avoiding ORA-04068 says to not use package globals. However, IFS requires module_ and lu_name_, so how can we get around this? The source code for DICTIONARY_SYS reads the values of these globals right from the source code. The globals have to be on lines 2 through 6, but nothing says they can't be commented out!

Here is a demonstration of this trick at work. Package "one" uses globals, and package "two" uses fake globals inside a comment. Both get picked up in the sweep for the dictionary cache, but only package "one" gets invalidated on a second run.

With package globals:
CREATE OR REPLACE PACKAGE c_test_one_api AUTHID DEFINER IS
module_ CONSTANT VARCHAR2(25) := 'FNDBAS';
l
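For contrast, a sketch of what the "fake globals" variant (package "two") could look like. This is an illustration of the commented-out trick described above, not the poster's exact code; the lu_name_ value is a placeholder, and the comment is positioned so the declarations still fall within lines 2 through 6 where DICTIONARY_SYS scans:

```sql
-- Illustrative only: globals exist as text for the dictionary sweep,
-- but compile to nothing, so recompiling doesn't hit ORA-04068 state.
CREATE OR REPLACE PACKAGE c_test_two_api AUTHID DEFINER IS
/*
module_  CONSTANT VARCHAR2(25) := 'FNDBAS';
lu_name_ CONSTANT VARCHAR2(25) := 'CTestTwo';  -- placeholder LU name
*/
PROCEDURE Do_Nothing_;
END c_test_two_api;
/
```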