Epicor suffers from a very annoying limitation with ALL of its business objects that prevents them from updating more than one record per service call. This results in a terrible waste of bandwidth and terrible SOAP or REST service performance in real use cases. For example, let’s say I need to adjust the discount percentage, based on some calculation, for every detail row in a 100-row quote. I have to call GetByID once, then a BO method to update the discount percentage on each row, and then .Update() on each row. For every one of those BO calls except the first, I have to transmit the complete dataset serialized to XML and receive a response containing an updated copy of the complete dataset. Let’s say the serialized dataset weighs 10KB: 100 rows × 2 calls per row × 2 datasets per call × 10KB = 4MB of network transfers for a simple discount percentage update. It gets worse, though: this isn’t a single 4MB transfer, but 200 separate calls that each carry their own network latency. Now combine that with, say, 40 clients operating at once, and you can easily see how performance will suffer and network congestion will occur.
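To make the chattiness concrete, here’s a minimal sketch of that per-row pattern against the REST layer. The host, credentials, the method name OnChangeDiscountPercent, and the discount value are illustrative assumptions, and the JSON shapes ("returnObj", "parameters"/"ds") follow the general pattern of Epicor’s REST v1 responses but may differ by version; the point is simply that every call after GetByID ships the full dataset in both directions.

```python
import requests

BASE = "https://epicor.example.com/api/v1/Erp.BO.QuoteSvc"  # hypothetical host
AUTH = ("manager", "password")  # substitute real credentials or an API key

def call(method: str, body: dict) -> dict:
    """POST one BO method; every call carries a full dataset both ways."""
    r = requests.post(f"{BASE}/{method}", json=body, auth=AUTH)
    r.raise_for_status()
    return r.json()

# 1 call: fetch the complete quote dataset once.
ds = call("GetByID", {"quoteNum": 1234})["returnObj"]

for i in range(len(ds["QuoteDtl"])):      # 100 rows -> 200 further round trips
    row = ds["QuoteDtl"][i]
    row["DiscountPercent"] = 5.0          # stand-in for the real calculation
    row["RowMod"] = "U"                   # flag this row as modified

    # Hypothetical field-change method, then Update(): two full-dataset
    # round trips per row -- the 200-call, ~4MB pattern described above.
    ds = call("OnChangeDiscountPercent", {"ds": ds})["parameters"]["ds"]
    ds = call("Update", {"ds": ds})["parameters"]["ds"]
```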
The easy solution would be to have the .Update() method take in a dataset containing all modified rows and update every record in one call. It appears Epicor tried to accomplish this with the .UpdateExt() methods, but those are unusable: they don’t trigger BPMs and they use a different dataset type that none of the other BO methods accept. I just don’t understand why we can’t have a real transactional update method on every single BO: GetByID once, modify all required records, Update once, and if anything fails, roll back the transaction. This would instantly alleviate countless performance issues with Epicor, and I see no difficulty at all in doing this with Entity Framework, which it already uses…
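For contrast, here’s what that wished-for batched Update() would look like from the client side, reusing the call() helper and the same illustrative assumptions as the sketch above. To be clear, this is the hypothetical API I’m asking for, not something the current BOs support:

```python
# GetByID once, mark every row, Update once.
ds = call("GetByID", {"quoteNum": 1234})["returnObj"]

for row in ds["QuoteDtl"]:
    row["DiscountPercent"] = 5.0   # stand-in for the real calculation
    row["RowMod"] = "U"            # every modified row flagged in one dataset

# One transactional Update() for the whole batch: two round trips total
# instead of 201, with a server-side rollback if any row fails.
ds = call("Update", {"ds": ds})["parameters"]["ds"]
```

The same 100-row job drops from roughly 4MB across 200 latency-bound calls to two calls moving about 40KB.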