Question

BPA Workflow - result set exceeds row limit for the Query Collection

  • October 29, 2024
  • 6 replies
  • 255 views

Marcel.Ausan
Ultimate Hero (Partner)

I have a Workflow where I want to calculate Inventory Accuracy. To do this, I need to read all the lines in CountResultsHandling → Reference_CountingReportLine for the current year.

When I inspect the WF, I get the below error:

The result set size exceeds either the row limit or the character limit for the Query Collection.

What’s the maximum number of rows allowed in the Query Collection? Any advice on how I could work around this limitation?

Of course, I could filter the Collection further down to a smaller time period, but my customer wants the WF to take the data for the full current year into account.

 

@kamnlk any suggestions would be appreciated.

 

6 replies

  • Do Gooder (Employee)
  • 2 replies
  • October 30, 2024

Hi,

You will encounter this error message when the read query exceeds either the row limit or the character limit.
Here the Query Collection row limit is configured as 10,000 rows and the character limit as 2,097,152 characters (2 MB).

Workarounds to reduce the row count
1) Add Parameters
2) Define Filters
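As an illustration of the filter approach, a date-restricted read query could look like the following sketch (the entity set and attribute names here are illustrative assumptions, not confirmed IFS identifiers):

```javascript
// Sketch: narrow the read query with a date filter so the result set
// stays under the 10,000-row limit. Entity and attribute names are
// hypothetical examples.
const from = "2024-01-01";
const query =
  "Reference_CountingReportLine" +
  `?$filter=LastCountDate ge ${from}`;
console.log(query);
```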

 

 

Workaround to reduce the column count (and thus the character count)
1) Use $select

You can learn more about $select by referring to the following resource.

 

IFS API - Technical Documentation For IFS Cloud
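For illustration, a projection with $select might be built like this (the attribute names are hypothetical examples, chosen only to show the shape of the query):

```javascript
// Sketch: project only the columns needed for the calculation so the
// response stays under the 2,097,152-character limit. Attribute names
// are hypothetical examples.
const columns = ["PartNo", "CountedQty", "QtyOnhand", "LastCountDate"];
const query = "Reference_CountingReportLine?$select=" + columns.join(",");
console.log(query);
```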

 

Thank you.

Dinelka


Marcel.Ausan
  • Author
  • Ultimate Hero (Partner)
  • 1268 replies
  • October 30, 2024

Thanks @dinelka. I just checked: there are 4,908 rows fetched into the Collection, so I’ll probably use $select to specify only the columns I need for my calculation. Thanks a lot!

 


Marcel.Ausan
  • Author
  • Ultimate Hero (Partner)
  • 1268 replies
  • October 30, 2024

@dinelka it seems I’ve hit another limitation: I can’t use Parameters and a Filter at the same time.

I would need the filter so I don’t go above 10,000 rows, and I also need the $select parameter so I don’t exceed the character limit of 2,097,152 (2 MB).

Anyway, it seems strange to me that fewer than 5,000 rows would exceed a 2 MB character limit, doesn’t it?

 


  • Hero (Employee)
  • 132 replies
  • October 30, 2024

This is a validation on the workflow side.

Option 1:

If possible, you can break the “Get count Results” API task into multiple tasks using filters, and then join the results into a single collection inside the workflow using a script task:

 

var ArrayList = Java.type('java.util.ArrayList');
var list = new ArrayList();
// Merge the partial result sets read by the separate API tasks
list.addAll(Reference_CountingReportLine_Set1);
list.addAll(Reference_CountingReportLine_Set2);
execution.setVariable("YearCountingReportLine", list);

As a downside, you won’t be able to use $select here due to the validation mentioned above.


 

Option 2:

Since you are interested in the year of the data, you could create a custom attribute on the CountingReportLine entity that extracts the year from the LastCountDate attribute.

That attribute is then available inside the workflow as a new attribute, and you can use it to filter the data while also using $select.

Add Custom Attribute page configuration

 

reference: 020 add edit readonly attribute - Technical Documentation For IFS Cloud
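As a sketch of what option 2 enables, once a custom year attribute exists, the read can filter and project at the same time, staying under both limits (the attribute name Cf_CountYear and the selected columns below are assumptions for illustration):

```javascript
// Sketch: filter on an assumed custom year attribute (Cf_CountYear)
// while projecting only the needed columns. All names here are
// hypothetical examples, not confirmed IFS identifiers.
const year = 2024;
const query =
  "Reference_CountingReportLine" +
  `?$filter=Cf_CountYear eq ${year}` +
  "&$select=PartNo,CountedQty,QtyOnhand";
console.log(query);
```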

 


dsj
  • Ultimate Hero (Partner)
  • 880 replies
  • June 19, 2025

Hi ​@kamnlk 

Usage of $select saved my day. Thanks a lot for that!

To further limit the result set, I tried to use $top in the parameters, but it looks like it fetches all the records regardless. Do you know if it’s not supported, or do we need to use different syntax?

Here’s my API call

 

Regards,

Damith


Forum|alt.badge.img+9
  • Hero (Employee)
  • 132 replies
  • June 20, 2025

Hi ​@dsj 

Glad to hear it helped you 😀

Workflows currently support only $select and $skip. You may have to read in the data and trim it using a script task, and then possibly clean up the other data from the execution.
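A minimal sketch of that trim-in-a-script-task idea, in plain JavaScript (the rows variable and the limit of 100 are arbitrary stand-ins for the workflow’s collection variable and your desired $top value; in a real script task you would write the result back with execution.setVariable):

```javascript
// Sketch: since $top is not honoured, read the (projected) result set
// and trim it in a script task instead. "rows" stands in for the
// collection variable the workflow exposes; 100 is an example limit.
const rows = Array.from({ length: 250 }, (_, i) => ({ id: i }));
const top = rows.slice(0, 100); // emulates $top=100
console.log(top.length);
```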

ref: IFS API - Technical Documentation For IFS Cloud



