IFS Digitalization CollABorative: Tech Talk Session with Andrew Lichey, VP, Platform Product Management at IFS

Date of Meeting: 18 October 2023 10:00 AM US Eastern Time 

 

Andrew Lichey Background:

  • VP Platform Product Management
    What does platform mean? It means that I'm really focusing on the technologies that we use to build our applications, the frameworks that we have, and the introduction of new technologies, making them available to our teams to embed within the different business areas of our software. That includes everything from the cloud platforms that we run on, to the introduction of new technologies and the normalization of those technologies, like AI and analytics.

 

Andrew Lichey Presentation:

Slide: IFS CollABorative Digitalization

  • So, the purpose of this call is to cover what's new since the beginning of 2023: what we've done in those releases and what new features we want you to be aware of, things that you can take advantage of now, just to give a little bit of context.

Slide: Legal Disclaimer

  • This presentation contains only general information on the basis of the current situation and does not constitute a warranty or binding obligation. IFS assumes no responsibility for any of the outlines set forth herein or with respect to the future development of the IFS business.
  • Consequently, this statement of direction should not be used, relied upon or treated as a specific warranty or undertaking of such nature by IFS.

Slide: Agenda

  • So I broke this session down into essentially four parts. First, we'll look at some of the new things that were introduced in 23R1. I think everyone on the call is familiar, but I'll just cover it again: we do releases twice a year, a late-spring release and a late-fall release. 23R1 was our first release of the year; 23R2 is our second, which is coming out shortly. We used to have a much longer development cycle, and we went to a twice-a-year cycle so we can get innovation to you, our customers, faster and help reduce the friction of taking upgrades. So, after I cover 23R1 and 23R2, we'll give a little glimpse of what we're looking at in the future, in the 2024-25 time frame.

Slide: Optimizing People, Assets, and Services (23R1)

  • There was a lot in 23R1, and I have a lot on this slide. Don't worry, I'm not going to go through it bullet point by bullet point, but I want to put this up so that you can see all the different things that may be contextual to your business. 23R1 went out, I think, in May this year, so there's a lot of new functionality.

Slide: Connect Global Operations (23R1)

  • But what I'm going to be focusing on specifically is some of the platform investments that we made, with the goal of helping global or local organizations improve how quickly they can upgrade and how quickly they can install new versions, but also how they can use IFS Cloud to get more of the behaviour that they're looking for.
  • So, there are five sections here under platform: deploy IFS Cloud faster, improve workflow controls, simplify integration APIs, save time managing IT access, and ensure system uptime and efficiency. I'm really going to be focusing on the first three here today, deep diving into those areas.

Slide: Key Features

  • So, if we start with the first: we introduced a little while ago a new tool called the IFS Data Migration Manager. If you're not familiar with it, the Data Migration Manager's purpose is to help you move large volumes of data from other systems, or from other data stores that you have, into IFS Cloud. We put a lot of work and investment into this, both to help new customers who are going through a data migration from another system, and also for existing customers. One of the interesting things we're seeing with a lot of customers is that their data is spread out across their data estate. So IFS Cloud, even being their ERP, isn't necessarily the owner of all of that data; they may have other ERP systems in place, or they may be capturing IoT data in a historian, things like that, that they want to move into IFS Cloud at some point, but move in a batch way rather than through real-time integrations. That's one of the things the Data Migration Manager supports. The Data Migration Manager is ultimately replacing existing legacy tools, like the Data Migration Tool, and we had some regional solutions as well. Those, while fit for purpose, were very dated, and as such really had trouble keeping up with the expectations that you would have on data migrations going forward. So, we made that investment in building a new Data Migration Manager.
  • I'm just going to touch on some of the key features that we have. We talk about metadata validation: the ability, prior to doing a migration, to validate your data on an ongoing basis, so that you don't need to move all the data at one time, like scheduling it over a weekend and doing all your validations then. You can do your validations as you go, as you prep the data for big moves. We also talk about deduplication of data and harmonizing data from multiple data sources, bi-directionally.
  • Basic data validation is typically used when you want to validate incoming data against existing data in the system. Say you're loading data from a historian, from another ERP system, or from a best-of-breed system you have, and you want to make sure it has the right connections to the data in IFS Cloud. You can do that validation, and that enrichment, as part of the process.
  • Transformation is another one. Obviously, any kind of data migration involves changing the format of data from whatever format it's currently in into what IFS Cloud expects, and that is built into the Data Migration Manager.
  • Then we have pre-deployment and post-deployment processes. Pre-deployment processes are things that you do to the data before you move it into IFS Cloud; that could be enrichment, validations, any kind of action you want to perform against that data before moving it. Then we have post-deployment processes: now that you've put this data in IFS Cloud, what do you want to follow that transaction up with? Maybe it's calling certain APIs, maybe it's syncing this data with other systems, things like that. So this gives you the ability to validate, enrich, and transform that data both before and after processing it in the system. And this is bidirectional, so you can use the Data Migration Manager not only to import large volumes of data, but also to extract large volumes of data to other systems you have, to a data lake, and those kinds of things. A minimal sketch of the pre/post-deployment pattern follows below.
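    For illustration only, here is a minimal Python sketch of the pre/post-deployment pattern described above. The record shape, the rules, and the load step are hypothetical stand-ins for what you would configure; this is not the Data Migration Manager's actual API.

```python
# Illustrative sketch only: the record shape, rules, and load step are
# hypothetical, not the Data Migration Manager's actual API.
from dataclasses import dataclass

@dataclass
class PartRecord:
    part_no: str
    site: str
    qty: int

def pre_deploy_validate(rec: PartRecord) -> bool:
    # Pre-deployment: reject records that would fail basic rules.
    return bool(rec.part_no.strip()) and rec.qty >= 0

def pre_deploy_enrich(rec: PartRecord) -> PartRecord:
    # Pre-deployment: normalize formats before the move.
    rec.part_no = rec.part_no.strip().upper()
    return rec

def post_deploy(rec: PartRecord) -> None:
    # Post-deployment: follow-up actions, e.g. calling other APIs or
    # syncing the newly loaded record to downstream systems.
    print(f"synced {rec.part_no} to downstream system")

def migrate(records: list) -> None:
    for rec in records:
        if not pre_deploy_validate(rec):
            print(f"rejected: {rec}")
            continue
        rec = pre_deploy_enrich(rec)
        # ... the record would be loaded into the target system here ...
        post_deploy(rec)

migrate([PartRecord(" mp-100 ", "STO01", 5), PartRecord("", "STO01", -1)])
```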

Slide: Key Features

  • We have a lot of tables in IFS Cloud, 6,000 plus, and to make it easy for people to run these projects we have templates. These templates come with predefined schemas for all the tables in the system, as well as some predefined mappings, so you don't have to start from scratch every time you're looking to load data from another system into IFS Cloud or vice versa. Starting with a template greatly reduces the amount of time and effort needed to put a migration together.
  • Project Analyzer looks at the changes you've made in your different environments, determines what their state is, and makes recommendations for how to take the next step.
  • Smart mapping looks at your existing data and at the way that data is defined in another system, and then helps you do the data mapping within IFS Cloud. It uses some AI to make inferences from things like the metadata that exists in those systems, or the table names and column names themselves, to make recommendations; a conceptual sketch of the idea follows after this list.
  • So like I said, we put a lot of effort into the Data Migration Manager. This does replace existing tools that we have. It is not only for new customer onboarding; increasingly commonly, we're seeing the need to migrate data between environments on an ongoing basis. This could be a result of acquisitions, where you acquire new companies and want to move data from their existing systems into your master IFS Cloud system. This could be exporting data into a data lake or a data factory for AI, machine learning, or data analytics, things like that; it works bidirectionally.
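    To give a flavour of what smart mapping amounts to conceptually, here is a sketch that scores source column names against target column names with simple string similarity. The real feature uses AI inference over metadata, which this deliberately does not reproduce; the column names are invented.

```python
# Conceptual sketch of name-based mapping suggestions. The real Smart
# Mapping feature uses AI inference over metadata; this only shows the
# flavour of the idea with simple string similarity.
from difflib import SequenceMatcher

source_columns = ["CUST_NO", "CUST_NAME", "ADDR_LINE1"]
target_columns = ["CUSTOMER_ID", "CUSTOMER_NAME", "ADDRESS_1", "COUNTRY"]

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def suggest_mapping(sources, targets):
    # For each source column, recommend the best-scoring target column.
    return {
        src: max(((t, similarity(src, t)) for t in targets), key=lambda p: p[1])
        for src in sources
    }

for src, (tgt, score) in suggest_mapping(source_columns, target_columns).items():
    print(f"{src} -> {tgt} (score {score:.2f})")
```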

Slide: Workflows

  • Another thing that we've been putting a lot of work into is our business process automation capability within IFS Cloud, something we commonly refer to as workflows. If you haven't used them before, the purpose of workflows is to allow you to automate activities within the application. One potential use case: if you have a user going through the same six steps as part of a process over and over again, and they need to follow those six steps to make sure all the data is loaded, all the data is validated, and the processes are done correctly, that's a perfect example of something we can automate down to a single step. Other examples would be incoming integrations that need to perform additional transactions. Another good one is IoT: if an IoT reading comes in and it's an event or a fault, and we know that fault is going to require a task to be scheduled, a technician to be sent out, parts to be ordered, we can automate all that processing within a workflow.
    The goals behind workflow are, number one, to free up users to do higher-value tasks and activities within the system; essentially, reduce the amount of paperwork your users have to do in their daily jobs so they can add more value. Number two, to increase the consistency of these transactions, so the same things happen time and time again. That's critically important not only for ensuring they're done correctly, but also later on, when you're doing process analytics or looking at applying machine learning; it's important that you have consistency in the way your data is recorded and your transactions are processed. And tied to that is data quality: making sure all the data coming through accurately reflects when these transactions occurred and all the interactions with the system. By automating all that, you remove the possibility of human error.
    A great example: I was talking to a customer yesterday, a manufacturing customer who wants to automate the start and stop of shop orders within IFS Cloud based upon IoT readings they're getting from their MES system. The equipment on their factory floor records the amperage used by each machine, and more energy use indicates that a shop order has started. They can send that data into IFS Cloud and build a workflow to update the status of their shop orders and work orders, making sure that data is always recorded in a timely fashion, in the same way, and accurately. One of the challenges they have today is that it's a very manual process and users commonly forget it; as a result they're updating their shop orders either too late or too early, which makes it really hard for them to calculate their OEE. So again, the primary benefits of workflow are freeing up users to do higher-value tasks, increasing the consistency of transactions in the application, and improving the quality of the data that you have.
    We've been investing a lot in this, and one of the things we did in 23R1 was add the ability to launch workflows in different ways. Prior to 23R1, the only way to launch a workflow was in response to a transaction happening in the system: you put an event on a data element, so you could say, for example, any time a shop order is updated, or any time a task is updated, call this workflow. Or you could do it from an API call; all of our workflows have REST APIs automatically generated for them, so you can call them that way (see the sketch after this list).
  • But in 23R1, we added the ability to call workflows based on interactions in the Web client. So, when your users are in there clicking a button, selecting a menu item, or changing the value of a field, you can launch a workflow based upon that. Now, why would you want to? One common use case is additional validations: you have custom validations that are critical to your business but aren't built into IFS Cloud, and you want to make sure they're enforced every time, so you can build a workflow to do that validation and associate it with a button click or a menu item.
  • You can do data enrichment: if you need to make some additional updates to the data before a process is kicked off, you can do that from there. And then you can do entirely custom processes. If there's some process you want to be able to run in the Web client that's not currently supported, you could build it using a workflow and then call it from a button click or a menu item that you add to the screen in the web UI.
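    Since every workflow gets an auto-generated REST API, launching one from an integration amounts to an HTTP call. Below is a hedged sketch of that idea; the URL shape, the token handling, and the payload fields are hypothetical placeholders, not the actual IFS Cloud endpoint design.

```python
# Hypothetical illustration of launching a workflow through its
# auto-generated REST API. The URL shape, token handling, and payload
# fields are placeholders, not the actual IFS Cloud endpoint design.
import requests

BASE_URL = "https://example-tenant.ifs.example/workflows"  # placeholder
TOKEN = "..."  # an OAuth bearer token obtained out of band

def start_workflow(name: str, payload: dict) -> dict:
    """POST the triggering data to the workflow's generated endpoint."""
    resp = requests.post(
        f"{BASE_URL}/{name}/start",
        json=payload,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# e.g. the MES scenario above: an amperage reading from the factory floor
# triggers the workflow that updates the shop order status.
print(start_workflow("UpdateShopOrderStatus", {"machineId": "M-17", "amps": 42.5}))
```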

Slide: Entity Service APIs

  • The next thing I wanted to bring up is the entity service APIs. This is really important, and it's a big part of our API strategy going forward. Before 23R1, the vast majority of our APIs, whether standard APIs or premium APIs, were originally designed to support Web client interactions. The things your users do in the Web client are all driven off of the APIs that we have, and as such those APIs are very procedural in nature: I click a button to post a purchase order, and that button has an API behind it which delivers all the logic for posting a purchase order. What you couldn't do, though, was update the data directly. Instead, you would call a process, and that process would update whatever data is contextual to it, but you couldn't update a single work order, a single task, those kinds of things, directly. That's what the entity service APIs now allow you to do. The entity service APIs are by definition CRUD, so they allow you, at an individual entity level, to create new records, update existing records, or delete records. All of the same security is enforced, and all the same data relationships, business policies, and business rules are enforced; this just gives you much more granular control over which data is updated and when. One important point here is that it's still composable: even though you are now updating individual table rows, you can compose multiple updates together to create more complex transactions.
  • I look at the entity service APIs as the twin of another investment we had previously made, called the Query Designer. Where the entity service APIs give you really tight control over updating data, the Query Designer does the same for querying data out of the system. If you're not familiar with it, the Query Designer allows you to build APIs for complex queries: you go in, define a complex query in the system, and it generates an API that you can call over and over again to get that data. It's extremely flexible, so you can query any data, apply any filters, and things like that. Used together with the entity service APIs, you can create really complex processes that are atomic, which you can use in things like workflows, in automation as a response to an inference made by ML, in your integrations, in bulk data processing, and in master data management; all those different areas. A hedged sketch of the combined pattern follows below.
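    For illustration, here is a hedged sketch of the query-then-update pattern, assuming OData-style endpoints. The paths, entity names, and field names are hypothetical placeholders; they show the shape of the idea, not the exact IFS Cloud API surface, and a real composition would group the calls into one atomic transaction.

```python
# Hedged sketch of the query-then-update pattern, assuming OData-style
# endpoints. The paths, entity names, and fields are hypothetical
# placeholders; a real composition would group these in one transaction.
import requests

BASE = "https://example-tenant.ifs.example/api"  # placeholder host
HEADERS = {"Authorization": "Bearer ..."}        # token obtained out of band

# 1. Query: a saved Query Designer query exposes its own generated API.
tasks = requests.get(
    f"{BASE}/queries/OpenHighPriorityTasks", headers=HEADERS, timeout=30
).json()

# 2. Update: entity service APIs are CRUD, so each record is patched
#    individually at row level, with the usual security and rules enforced.
for task in tasks:
    requests.patch(
        f"{BASE}/entities/Task({task['id']})",
        json={"status": "Dispatched"},
        headers=HEADERS,
        timeout=30,
    ).raise_for_status()

# 3. Compose: a follow-up create forms part of the same logical process.
requests.post(
    f"{BASE}/entities/WorkOrderNote",
    json={"text": "Auto-dispatched by integration"},
    headers=HEADERS,
    timeout=30,
).raise_for_status()
```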

Slide: 23R2

  • So, those were some of the key platform things I wanted to bring up on 23R1, which again we released in May of this year. Now we have 23R2 coming up shortly, and I'm in kind of an awkward position here, honestly, from a timing perspective, because we haven't done all of our announcements and shared everything that's going to be in 23R2, but I didn't want to not share anything here.

Slide: Business Process Automation (Workflow) 23R2

  • So, I have a couple of key features within the platform area that I wanted to talk about, because they tie together with the ones we covered in 23R1. I talked about workflows and how we've been investing heavily in them; that's a big part of where we're going. One thing I didn't mention was why workflows specifically, over other ways of automating transactions. Our goal with workflow is to enable you, the customer, to do these types of things yourselves. We've intentionally built workflows with a what-you-see-is-what-you-get visual editor, so that as your business changes and you want to change the system, you have the power to do that without coming back to IFS to update custom APIs or custom event actions. We're continuing that investment in 23R2, and we'll continue it going forward.
  • Some of the things we've added in 23R2: number one, I just talked about the entity service APIs. We added the entity service APIs in 23R1, and we're embedding the ability to call and consume them within workflow in 23R2. I really can't overstate how important this is for workflow. I've always said we can put any investment we want into workflow, but the real power in workflow comes from the APIs available to it, because the APIs are what we use to get the data, make the decisions, and perform the actions.
  • So, in 23R2, we've added the entity service APIs, which opens up a million new opportunities for us and for you in creating automations. The way I look at it, prior to the entity service APIs, when you were building a workflow you were choosing an API to call from a list of pre-built APIs that all did defined things. If what you needed wasn't one of those predefined APIs, you had to build a custom API. If one of those APIs did 80% of what you wanted but not 100%, you had to build a custom API. The entity service APIs, by giving you more granular control, let you deliver all the logic you want from that workflow in an evergreen fashion. So when you go to upgrade, you're not going to need to go back and update custom APIs or PL/SQL event actions, things like that. You're just using the entity service APIs, which are evergreen by nature. When you upgrade, the workflows you built in 23R1 are going to work in 23R2; when you upgrade from 23R2 to 24R1, those same workflows are going to keep working.
  • We also added the ability to track execution logs. We've had workflows for a while, and one of the bits of feedback we've gotten is that any time you do automation, people are nervous about ensuring the automation is doing exactly what they intended. That's where the execution logs come in: they allow you to go in and see, OK, when did the workflow run? Under what context? What data was used in the workflow? What decisions were made, and what were the outcomes of those decisions? It gives you a really detailed view, so you can be confident the workflow is working the way you intended it to.
  • Another thing we're doing in 23R2 is adding a lot of advanced examples for workflows. We've always had examples in workflows, but before 23R2 our primary target was new users, people who hadn't used workflows before, giving them basic guidance on how to start using them. Now we've had workflows in the software for a couple of releases, and we're getting feedback that people are using workflows today but might not be using all the advanced features available to them. That's what this is targeting: we're trying to show off what workflows can do and give you practical examples of real business cases that you can use when building your own workflows.

Slide: Future

  • I want to talk a little bit about the future. Again, this is looking beyond 23R2, at the 2024-2025 time frame, and this is what we're currently intending to do; it's not a guarantee of what we're going to deliver, but I did want to give you some insight into where we're going. A lot of the same information is going to show up elsewhere: we're working right now on updating our statement of direction as well as our roadmap, so in a way this is a bit of a preview of both, and you're going to see this reinforced, in more detail, in those documents very soon. I think the roadmap is planned to come out in the next couple of weeks, and we're targeting December for the statement of direction. The difference between the two is that the roadmap looks one to three releases out and gives a very detailed view of what investments we're making, while the statement of direction is aimed more at C-suite users, to help them understand at a macro level where we are going with the software, so they can be comfortable it's aligned with their business objectives over the next two to three years.

Slide: Future (2024-25)

  • So again, there's a bunch here that we're doing within the platform area.
  • I'm not going to go into all of them, but I did highlight some that I wanted to talk about here. The first one is IFS.ai. I know last session you had Bob De Caux on here; Bob is our head of AI, and he talked a lot about what we're doing, so I'm not going to rehash all that, other than to reinforce that we are looking at introducing AI in any place in the system where it adds value for you as the customer. We're looking at embedding AI within the processes that we currently have. We're not looking to be an AI vendor, and we're not looking to be an AI platform where you come and consume our AI services directly. What we're looking at is having AI embedded within the processes in the application, so that you get the benefits of AI without having to do the work of defining the algorithms and mapping them into our processes. This is an area we're very serious about, across the whole application; Bob gave you a lot more information last time, but I just wanted to reinforce that.
  • Segregation of duties. This is something we've been asked about for a long time. Some people summarize it as role-level security; it's actually more than that. It's the ability to control who has visibility into, and who has access to, specific sets of data in the system, at a more granular level than something like a company or a site. Segregation of duties allows you to say this data belongs to this organization, and hence only users from that organization will be able to see that data or update it. And those two things are separate: you may allow some people to see data but not update it, things like that. That is one of the investments currently ongoing.
  • Another one is document automation. Document automation covers a number of things: the generation of static documents by the application, operational reporting, and dynamic documents like tax statements, commonly used for regulatory compliance and other purposes. It's really any time we're generating a document, for whatever reason. In all honesty, this is an area we haven't invested in as much as we would have liked over the last couple of releases, so now we're looking at a catch-up process. The ways we generate documents in the application currently differ depending on the use case, so we're looking at normalizing that: having a single way to generate documents, and allowing you to generate, customize, and configure those documents going forward. That is a big part of this. We recognize this is not going to be a situation where we introduce the new document automation and immediately shut off the old one, because many of you have many years' worth of reports and documents you've built up over time. So this is going to be an incremental, evolutionary experience to get you to the new version of document automation.
  • Data archiving is a really important one, especially as more and more customers come into our Cloud and we manage their environments. It's important that we make sure those environments are performant and scalable, and one way of doing that is by having a standard data archiving process. Like document automation, we already have a data archiving solution in IFS Cloud, but it's something we really haven't invested in, and as such it's not where it needs to be, so this is something we're looking at going forward as well.
  • Analytics as a service. This is a really exciting one, I think, and one that many of you will get excited about, because today, when you look at your BI and analytics platforms, you're managing those yourselves. Say you're using Power BI, or whatever stack you're using: you're deploying that analytics stack, you're managing it, and that requires a lot of skills and dedicated users, and it obviously requires hardware and expense. Our goal here is to offer that entire analytics stack as a service provided by IFS. We're starting with Cloud-based delivery, but this is something we're going to do in multiple iterations across multiple releases, starting in the Cloud and then looking at how we deploy it as a service to our on-premise or air-gapped customers. In addition to managing the BI stack itself, it includes all-new analysis models and things like that. I think that's really exciting and potentially a game changer for you as you go forward with BI.
  • Another thing I'll mention here is that we are very much a Microsoft shop when it comes to analytics; a lot of our analytics today is based on Power BI. That's still the direction we're going, but we're also incorporating some of the new capabilities in Microsoft Fabric. If you're not familiar with Fabric, it's Microsoft's next-generation platform for analytics. It doesn't replace Power BI; it includes Power BI, along with a number of other things we've been investing in over the years, such as Microsoft Purview, which is a data catalog and a tool for data governance and data management. So all those things are coming together, and at the same time we're adding value by putting all the IFS bits on the front end and making sure it's tightly integrated.
  • Here's another great one: AI workflow automation. I've talked to a lot of customers, and workflow is very near and dear to my heart. I've been involved in workflow-related work for many years, even prior to working on IFS Cloud, and one of the consistent things I've always heard (ten years ago I could have had this discussion with you and said the same thing) is that people love the idea of workflow. It's going to save them a lot of time, but there is an initial investment that they need to make.

Slide: Example (Bashar Khaldun)

  • So, this is an example that visualizes that. I have a person here, Bashar Khaldun; Bashar is an operations manager at a distribution centre in Abu Dhabi.
  • Tell me if this sounds familiar: Bashar's got 10 employees, but he's got the workload for 14, 16, 18 employees; way more work to do than he's got employees to do it. He's got no additional headcount in the plan, and worse yet, the business has come to him and said, hey, we're selling these new contracts, and the amount of work you're going to have is going to go up by 50%, not in a year but maybe in a month. So it's not something he can address with headcount, because he's going to need to do it soon; he needs to figure out a way to increase throughput. He wants to use IFS business process automation, recognizing that it's going to save his workforce a lot of time. Like I said, maybe today his 10 employees do the work of 16; by using business process automation, you could take that workload from 16 users down to 8 by automating a lot of the activities. The challenge is that takes an investment, right? It takes an investment to understand which processes could be automated, to actually build the workflows, to test them, and to get them rolled out. So in this case, because of the impending demand on him, he wants to use business process automation, but he can't make that investment right now, because he'll miss his SLAs for the next month, or two, or three if he takes resources off to work on it. This is where AI workflow automation comes in.

Slide: Engineering Value Proposition – Targeting audiences, Value & differentiation

  • So here, what we're saying is that the AI is going to go through and find opportunities to automate processes in the application for you. It's going to make recommendations on which workflows it would recommend you use, and then give you the ability to say yes or no: generate this workflow for me, put it in place. Again, the whole point of workflows is to free up users to do more, either to focus on more strategic tasks or to get more work done. This is another step in that direction: we're freeing up users from having to define what workflows need to be built. It's something that applies to every customer we have, and something we're excited to look at. And this goes back to something I talked about before, about the collection of data and having good, consistent data quality, because that's exactly how something like this will work. It's going to look at process analytics; it's going to look at captured data about how your users are working in the system, and if it finds the same users, or users with similar roles, doing the same tasks over and over again, it will identify that as an opportunity for automation and then build that automation for you (a conceptual sketch of the idea follows below). I'm excited about this one; I'm excited about everything workflow. That's mainly what I wanted to go through.
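    To make the idea concrete, here is a conceptual Python sketch of mining repeated action sequences from a user event log as candidates for automation. It illustrates the principle only; it is not IFS's actual process-analytics implementation, and the event names are invented.

```python
# Conceptual sketch only: mining repeated action sequences from a user
# event log as candidates for automation. Not IFS's actual implementation.
from collections import Counter

# A toy event log of one user's actions in chronological order.
events = ["open_order", "validate", "post", "print",
          "open_order", "validate", "post", "print",
          "open_order", "validate", "post", "email"]

def repeated_sequences(log, length=3, min_count=2):
    # Count every run of `length` consecutive actions and keep the
    # sequences that recur often enough to be worth automating.
    ngrams = Counter(tuple(log[i:i + length]) for i in range(len(log) - length + 1))
    return [(seq, n) for seq, n in ngrams.items() if n >= min_count]

for seq, n in repeated_sequences(events):
    print(f"candidate automation: {' -> '.join(seq)} (seen {n} times)")
```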

Slide: Future (2024-25)

  • There are a number of other things on here. One I'll mention is our integration accelerators. If you're not familiar, we are partnered with Boomi, an integration-platform-as-a-service vendor. A number of years ago, back around 2018, I looked at how IFS was delivering integrations globally and found that there really was no consistent approach; everything was ad hoc, and as a result almost every integration project was coming in over budget and over time. So we selected Boomi as an integration partner to help us with that. Their cloud-based platform allows you to build integrations visually, through a workflow-type designer, which really cuts down on the time and effort to build integrations and to update them going forward. As part of that effort, we've been building a series of what we call integration accelerators: integration starting points between us and other systems.
  • So, in 23R1 we introduced a Salesforce integration accelerator specifically targeting service industries. If you want to give your sales and service teams a 360-degree view of the customer, that's what this integration accelerator was designed to do: make sure the salespeople using Salesforce see the same information about the customer that your service people see, and vice versa. What equipment do they have? Who are the contacts? What jobs do they have, available and historical? Things like that, to help both groups do more.
  • Now we're working on an SAP integration accelerator, specifically focused on enterprise asset management, where we see a lot of deployments of both IFS Cloud and SAP: IFS Cloud delivering the EAM portion, and SAP doing the finance and other pieces. This integration accelerator is focused on making that easy. There are a number of advantages to these accelerators: the business processes are mapped, the data is mapped, and there's also the simple fact that we're eliminating the complexity of the touch points. As an integrator, you don't need to know how to authenticate your messages or how to do error handling; all of that is built into the accelerators. And we are looking at other accelerators; this is something we want to continue to develop, and I would really value your feedback on places where integrations between IFS Cloud and other systems would be beneficial.
  • I also have on here a lot of investments we're making around the experience part of our application. A lot of these are based on AI: using AI to improve how you search the application, using AI to offer a conversational experience within some of our processes, things like that.

 

Questions / Answers / Feedback / Responses:

  • Q: We are not on the Cloud yet, but we are starting the project to move to the Cloud, and I'm wondering how we can get to know all the details behind all the nice information here. I often think the consultants we meet are not catching up with all the news; that's a question for my account manager, whom I'll meet tomorrow. But one specific question concerning segregation of duties, something we have wished for for many, many years: does that mean you can have different permission sets for users in different companies? So in one company you are a full user, but at the same time, in another company in the system, you just have viewer access rights to part master data. Is that within the scope of segregation of duties: different access depending on which company you have access to?
    A: Yes, it does. I was just taking a note, because I like to reinforce these requirements with customer feedback when we talk about them internally.
    Q: Good, because I've been wishing for that for many, many years. So, when is that planned? 24-something?
    A: Yeah, 24-something.
    Q: And so, if we go live autumn next year, will we have it available then?
    A: Likely
    Q: And will the segregation also cover some functions in IFS today that require access to the other side, even if we don't want users to be there? Like replicating products and parts: to use the assortment and replicate a part, you need to have access to the other side, and we don't want that; they can make mistakes, and today they have full access to the other side as well. Is that also in the context of segregation of duties? Or maybe it's too detailed.
  • A: I would need to understand more detail to be really confident in my answer, but I would say yes, based on the fact that we can differentiate between types of access: read-only access versus update access.
    F: I've seen that with replicating parts: you're sitting on the operations side and you replicate the part to the sales side, so that they can use it and buy from the operations side. But I don't want the operations guys to have access to the sales side; they should just replicate the part. Today that's not possible: you need to have full access to the other side. Another is the RMA, Return Material Authorisation. We use direct deliveries, so the customer can enter an RMA on the sales side, but to be able to handle that from operations you need access to both sides, which is only there for the system to work. I would like that to be seamless; the process should allow you to do it.
  • R: I know it's not exactly what you're asking for, but I talked about the Data Migration Manager and moving data around, and one of the investments we're looking at in that area is being able to move data between companies, and potentially between sites as well. Now, that's going to replicate the data, which isn't exactly what you're looking for, but I wanted to mention it.
    F: Yes. So, sometimes we have to create a report that we just show to users, because we cannot let users in some countries have access to certain data.
    F: Customers and suppliers are global, but there are still some parameters which are not global; that is what I wish to have an option for. And there are some things that are supposed to be global, but since we're a global company, the same customer is used on many continents and at many sites, and they destroy the data for each other. So make it completely company-unique: customer and supplier data. I'm happy to support you or explain what I mean in detail. There has been lots of improvement from our old IFS 2003 to IFS 10 or 11, but there's still a way to go.
  • F: We acquire quite a lot of companies. They're not so big, they're small, and having a company is quite expensive in any country, in the US, in China, everywhere. So instead of creating a new legal entity, we add a new site to an existing legal entity, with the brand for that site, so they have their own site data but share services; finance is a common function. But there are some limitations. Simple things like the logo: the logo is at company level. I wish we had the possibility of a logo at site level, so there's a path for you: add a logo to the site and then I can have it there.
    As it is now, we have to have different layouts in the reporting engine, and then we need to maintain all the layouts for every change we do. So that is a wish from my side in the same context.
  • Q: So, I'm not familiar with IFS Cloud to the level I would want to be at this point; I would have more questions, most of which would probably be very stupid, right? But about the workflow automation: we see and hear a lot about the capabilities. What I haven't personally experienced or seen yet is where exactly it sits. Does it sit at the server level? At the client level? At a scheduled-process, cron-job kind of level? At what level can I implement these workflow automations? Because I see what you're saying, and I get excited: you can start folding the API messages into a workflow, and you're practically eliminating the need for a product such as make.com or Power Automate, or mimicking some of that capability at the very least. That could be super beneficial to the end user: I have a power user in the system who identifies "I do this 10 times a day, let me just automate that for myself." But there's a significant risk with that as well, right? If they do something wrong, or they set it up to run every 10 seconds, you could really cripple the system. And I'm also thinking from a business-rule point of view: in IFS FSM today, I have all of my business rules and scheduled processes calling both business rules and automated XML scripts. How do I envision where this sits? Maybe the answer is "all of the above": business rules, scheduled processes, and user-facing automation workflows.
    A: So, probably not surprising given my background with FSM: there are a lot of similarities between what we do with business rules in FSM and what we do with workflows in IFS Cloud. We really didn't have anything like business rules in IFS Cloud before workflows, and given how much of a differentiator business rules were in FSM, and how much they drove our sales success there, it's something we definitely wanted within Cloud. You mentioned a couple of other systems out there, like Power Automate; there are a lot of competitors in this space, but we've looked at this as a commoditized feature of the application. I believe this is something people expect to have within an enterprise system; they expect workflow-like functionality and are not looking to buy it separately. And even if they do buy it separately, the big advantage of our workflow over something like Power Automate is that it's tightly integrated: when you're selecting which APIs you want to use in an automation, it loads in all the parameters, things like that. From a user-experience perspective, it's much better.
    You talked about concerns around running it and getting aberrant behaviour out of a workflow. That's in part what I was touching on before with the new logging and tracing we've added to IFS Cloud: as part of your testing, you can see exactly when it's being run and exactly what it's doing, to make sure it's behaving the way you expect. Now, at the same time, we all know you can test for days and days on end, and some user is going to use it in a way you never expected. So the workflows in the system are all versioned: you can activate a new version, revert to a previous version, or activate and deactivate workflows entirely. When unexpected behaviours happen, you have a lot of control over the next step: is it just reverting to the previous version? Is it shutting the workflow down entirely? Things like that.
    Now, as far as where workflows run: the actual execution of workflows all happens on the server. Like I said, workflows can be initiated from interactions in the client, but the execution runs on the server, using the data from the transaction or interaction that started it, and then calling APIs to get any additional information that's needed or to perform any actions you want to happen as a result of the workflow.
    Q: OK, will I be able to use workflow to help with, for example, the import of data into our system? Data migration, I get that, I mean it's super exciting, but in my business, being a B2B service provider, we face a lot of bulk work. Hey, a customer has just asked us to update all their sites. Hey, we just got a new customer on board and we need to load in their sites. There's a whole lot of bulk work being done right now, and we are funnelling it through my IT support team or my BI team, because I don't trust users to import; I just don't, right? With the import capabilities in FSM specifically, you quite frankly have the ability to cancel 1.3 million tasks by accident, just because you didn't filter your import Excel document carefully enough. Would I be able to use these workflows to add some kind of automation, some checks that, say, validate some data on the import prior to executing it, and a level of security that would allow my users to do something I wouldn't trust them to do today?
    A: You could do that. One of the major use cases of workflows is adding additional validations to transactions. Those could be transactions a user performs in the system: a user types a value on a screen, tabs out, and you want to perform a custom validation; you could do a workflow for that right there. You could also do it for importing mass amounts of data. Is it the right tool for exactly what you're describing? I would suggest we first look at doing those additional validations in the Data Migration Manager itself; there are a lot of capabilities already there. At the end of the day, the two things work together as well: if for some reason you have a validation that's so complex it can't be delivered through the Data Migration Manager, you can build a workflow that's called from the Data Migration Manager to do that validation.
    R: For example, site addresses: I could hit an API to validate that what I'm putting into the system is what I want, or that a tracking number is valid, something like that.
    A: Yes
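    For illustration, here is a hedged Python sketch of the pattern just confirmed: validating each row against an external API before committing the import. The validation endpoint and field names are hypothetical placeholders, not part of any IFS or third-party API.

```python
# Hedged sketch: validate each row against an external API before
# committing the import. The endpoint and field names are hypothetical.
import requests

def address_is_valid(address: str) -> bool:
    # Hypothetical external validation service.
    resp = requests.get(
        "https://validator.example/check", params={"q": address}, timeout=10
    )
    return resp.ok and resp.json().get("valid", False)

def validate_before_import(rows):
    accepted, rejected = [], []
    for row in rows:
        (accepted if address_is_valid(row["site_address"]) else rejected).append(row)
    return accepted, rejected

# Only rows that pass validation are handed to the actual import step.
ok, bad = validate_before_import([{"site_address": "1 Main St, Springfield"}])
print(f"{len(ok)} rows ready to import, {len(bad)} rejected")
```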
    R: OK, so again, I conceptually understand it, at least at a very high level, and I can't wait to get my hands on it. So, I'm excited.
    A: Yeah, like I said, business rules were probably one of the three biggest differentiators for FSM and why FSM was as successful as it was. The one downside with business rules is that they have a high bar for learning and are not visual. So we want all the power that business rules have, but we also want to make it visual and easier for people to learn and start using.
    R: Right, yeah, I'll back you up on that one big time. We implemented NetSuite for finance, and I asked for two extra validations: I don't want users to be able to change this field to something if that other value isn't something else, and I want to make sure this field always has seven digits in it. They came back with, "Well, that's client scripting; user scripting takes 36 hours to develop." It blew my mind. It takes me two seconds in FSM.

 

If you are an IFS Customer and would like to watch the recording, please email jessica.foon@ifs.com

A copy of the slides can be found in the attachments section below.

 

Next Meeting: 8 November 2023 10:00 AM US Eastern Time
IFS Digitalization CollABorative - Think Tank Session TBD

If you are an IFS Customer and you do not have the next meeting invitation to this CollABorative and would like to join, please click here to fill out the form
