I’m wondering if anyone has a technical explanation of the factors that affect output speed when using an output channel from an overview screen. We (my users and I) see wide variation in output speed depending on time of day, system load, user connection speed, user terminal load, and so on, all of which makes sense and is broadly correlative. However, I often notice that the same output will run at only dozens or hundreds of rows per second one time, and at thousands of rows per second another. Sometimes the output is steady and runs to completion with no pauses; other times it stalls, with pauses lasting anywhere from a few minutes to tens of minutes.
I’m aware that creating Quick Reports to reduce the number of columns significantly reduces the output time, but I’d like to be able to give all users some expectation of a row-count threshold beyond which an output becomes impractically slow.
I’ve been running tests with anywhere from 25K to over 70K rows of data from Customer Order, Customer Order Lines, and even Transaction History, just to measure the output time. During peak system load, a 25K-row output can take more than half an hour; on a weekend it is somewhat shorter. But here is the effect I can’t explain: take an output that is significantly slow the first time, say half an hour or more. If I run the same output a second time, without re-running the search, it completes in just a few minutes.
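For concreteness, this is roughly the kind of timing test I mean, reduced to a plain database fetch so the output layer is taken out of the picture. This is only a sketch, not my actual test harness: it assumes an Oracle backend and the python-oracledb driver, and the connection details, table name, and row limit are all placeholders.

```python
# Hypothetical timing harness: fetch the same result set twice in a row and
# compare elapsed times, mimicking the first-run / second-run behavior above.
# Assumes an Oracle backend and the python-oracledb driver; the query,
# credentials, and DSN below are placeholders, not real values.
import time
import oracledb

QUERY = "SELECT * FROM customer_order_line WHERE rownum <= 25000"  # placeholder

def timed_fetch(conn, sql):
    """Run the query, fetch every row, and return (row_count, seconds)."""
    start = time.perf_counter()
    with conn.cursor() as cur:
        cur.arraysize = 1000  # fetch in batches to reduce network round trips
        cur.execute(sql)
        rows = cur.fetchall()
    return len(rows), time.perf_counter() - start

with oracledb.connect(user="appowner", password="...", dsn="dbhost/service") as conn:
    for attempt in ("first run", "second run"):
        n, secs = timed_fetch(conn, QUERY)
        print(f"{attempt}: {n} rows in {secs:.1f}s ({n / secs:.0f} rows/s)")
```

In my case the second run is dramatically faster even though the SQL and the data are identical, which is the behavior I’m asking about.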
What determines the initial output speed?
And why is a subsequent output of the same data a factor of 10-50x faster?