Hi,

We have implemented various custom projections and have a customer requirement to store the request and response payloads in a custom log table. Some payloads can be quite large, so we have set the table fields to be of CLOB type.

Is there an easy way to convert the request and response payloads (which are PL/SQL record types) into JSON stored as a CLOB? Currently, the only way we know of is to build the JSON from scratch using the values retrieved from the payloads. Is there a utility, package, or some other method that can easily do the conversion?

Thanks!
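For reference, this is roughly what the from-scratch approach looks like today. The record type and field names below are hypothetical placeholders, not our actual payload definitions:

```plsql
DECLARE
   -- Hypothetical record type standing in for a projection payload
   TYPE request_rec IS RECORD (
      order_no VARCHAR2(20),
      quantity NUMBER
   );
   rec_  request_rec;
   json_ CLOB;
BEGIN
   rec_.order_no := 'WO-1001';
   rec_.quantity := 3;

   -- Manual JSON construction, field by field
   json_ := '{"OrderNo":"' || rec_.order_no ||
            '","Quantity":' || TO_CHAR(rec_.quantity) || '}';

   -- json_ would then be inserted into the CLOB column of the log table
END;
```

(We did look at Oracle's built-in JSON_OBJECT_T API, which can build JSON programmatically and emit a CLOB via to_clob(), but as far as we can tell it still requires putting each field individually rather than converting a whole record in one call.)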
Hi all,

We have developed many custom projections, and a client has a requirement where they will send a unique ID in the request header when executing a call, which we need to store for logging purposes.

Is there a way to retrieve or manipulate the request headers from inbound calls to our custom projections?

Thanks.
Hi all,

We are developing multiple custom projections in IFS Cloud and have a requirement to add a custom field and value to the response headers. Is it possible at all to manipulate the response headers sent back when a custom projection is executed?

Thanks,
Manuel
Hi,

I noticed an error when trying to manually send a dataset load from IFS Cloud to PSO. I would have to delete certain Location Region records beforehand to be able to send a load successfully. Is there a way to manually send a load successfully without having to delete the Location Region records?

Thanks in advance!
Hi,

I'm currently developing a custom projection, and this is how I have currently defined the structure for the response:

   structure AppointmentDetailsResponse {
      attribute Latitude Number;
      attribute Longitude Number;
      attribute DatasetId Text;
      attribute Technician Structure(TechnicianResponse);
      attribute AdditionalRegion List<Structure(AdditionalRegion)>;
   }

I'm noticing a few odd behaviors. When I add an extra attribute to this AppointmentDetailsResponse structure (e.g. SomeField), like this:

   structure AppointmentDetailsResponse {
      attribute Latitude Number;
      attribute Longitude Number;
      attribute DatasetId Text;
      attribute SomeField Text;
      attribute Technician Structure(TechnicianResponse);
      attribute AdditionalRegion List<Structure(AdditionalRegion)>;
   }

and hard-code a value to it in the logic:

   response_.some_field := 'TEST';
   RETURN response_;

I get this error when I test the projection:

   { "error": { "code": "DAT
Hi all,

Is there a way to create or manipulate a Scheduling Type (SLA) for a Work Task? I know that the default Scheduling Type comes from what is configured in the Scheduling Type and Scheduling Activity Type screens and is set in the Scheduling Dataset. We're looking for a way to either override the existing Work Task Scheduling Type or create a new one, and be able to set a Start and End date for it -- all without having to use Contracts.

Thanks!
Hi,

I have created and released a Work Task from IFS Cloud, and it displays as Allocated in PSO. However, when trying to manually commit it in PSO, I get an error message and the Activity remains Allocated. I also see some broadcast warnings in the Events. Does anybody know what could be causing this issue?

Thanks!
Hi,

We are looking into different options for migrating data from external systems into IFS Cloud. It is my understanding that the recommended approach is to use the Data Migration Manager component to import data into IFS Cloud via flat files.

We are expecting to import a very large number of records into IFS and are looking for an alternative to the flat-file import method due to its performance impact. Are there any other options for migration (e.g. an external database connection)? If not, are there plans to improve this feature in the near future?

Any insights will help. Thanks!
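For context, in plain Oracle terms (outside IFS) one alternative to row-by-row flat-file loading is an external table, which lets the database read a file in place and bulk-insert with a single INSERT...SELECT. We don't know whether anything like this is supported or permitted in an IFS Cloud environment; the names below are placeholders:

```plsql
-- Hypothetical staging definition, assuming direct database access
CREATE TABLE stg_customers_ext (
   customer_id   VARCHAR2(20),
   customer_name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
   TYPE ORACLE_LOADER
   DEFAULT DIRECTORY data_dir           -- placeholder directory object
   ACCESS PARAMETERS (
      RECORDS DELIMITED BY NEWLINE
      FIELDS TERMINATED BY ','
   )
   LOCATION ('customers.csv')           -- placeholder file name
);
```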
Hi,

I'm looking into the Data Migration Manager component and am looking for a way to migrate data from an external database into IFS Cloud. The online documentation mentions that, when defining a Migration Project, there is a section where you can connect to an environment using a Database Link, which will be created by Cloud Operations in the application.

Does anybody have a guide on how exactly this Database Link would be created or defined?

If I wanted to connect to an external database for data migration, is using the Data Migration Manager the correct process? If so, how does this process work? If not, what is the correct process?

Any insights would be very much appreciated. Thanks!
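My current understanding is that, in plain Oracle, a database link is defined and used like this (all names are placeholders; per the documentation, in IFS Cloud the link itself would be created by Cloud Operations rather than by us):

```plsql
-- Hypothetical link to a legacy source database
CREATE DATABASE LINK legacy_src
   CONNECT TO migration_user IDENTIFIED BY "secret"
   USING 'legacy_tns_alias';   -- TNS alias or full connect descriptor

-- Remote tables are then queried with the @link suffix
SELECT COUNT(*) FROM customers@legacy_src;
```

What I'm unsure of is how this maps onto the Migration Project setup, i.e. where the link name is entered and what privileges the migration job needs on the remote side.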
Hi all,

I've been trying to test some of the calls related to Work Tasks using the WorkTasksHandling projection and the JtTaskSet entity, but they always return empty results even though there are existing Work Tasks in the system. Executing a GET call on the JtTaskSet entity returns count = 0 and an empty value array, and a GET on a single record using an existing Work Task ID returns "Resource not found".

However, other projections do return results; for example, a GET call on the PersonInfoSet entity of the PersonHandling projection returns records.

For the Client User I'm testing with (IFS_POSTMAN), the permission sets associated with the WorkTasksHandling projection are granted directly to the user.

Let me know if I'm missing anything and if anybody has any insights.