Is there a way for the Epicor API (using BAQs) to accept a batch of records instead of just one at a time? Outgoing (from Epicor) it works that way, since multiple records can be retrieved, but incoming (to Epicor) it only seems to process one record.
We want to pass data back and forth between Epicor and Salesforce via Mulesoft, which by default batches up requests (a good thing in general). We send the data from Epicor, then get back a success or failure status for each record that was sent. However, we have had to apply those status updates one at a time in order for Epicor's BAQs to accept them via the v1 REST API. It functions, but it could be much better.
Typically it won't be a HUGE number of records, but we have run into a few cases where 50+ records were processed. Batches of 200 were intended, though we'd rarely hit 200 within 15 minutes. It has happened, though, that over 600 records were involved in total, and it took far longer than we'd expected.
That's not the main concern; I just wanted to confirm that we have to do it one PATCH at a time and can't send an array of records. The consultants we have on the Salesforce side think reliability would improve if we could update many records at once.
You could do this in a number of ways, just not with the v1 API BAQ PATCH method.
It's been a bit, but I believe you can add a custom BPM on the update's execution that consumes a BAQ parameter containing a delimited list of your records, then runs the update action on each record after parsing the list.
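That pattern might look something like this, sketched in Python purely for illustration (the BPM itself would be C# inside Epicor, and the delimiters and field names here are arbitrary assumptions):

```python
# Hypothetical sketch of the delimited-parameter idea: the client packs many
# records into one BAQ parameter string, and a BPM parses it back out.
# The '~' and '|' delimiters and the field names are arbitrary choices.
records = [
    {"OrderNum": "1001", "SyncStatus": "Success"},
    {"OrderNum": "1002", "SyncStatus": "Failed"},
]

# Client side: one record per '~' group, fields separated by '|'
param_value = "~".join(
    f'{r["OrderNum"]}|{r["SyncStatus"]}' for r in records
)

# BPM side (C# in Epicor): split the parameter back into records,
# then run the update action once per parsed record
parsed = [
    dict(zip(("OrderNum", "SyncStatus"), group.split("|")))
    for group in param_value.split("~")
]
```

The trade-off is that you own the encoding: any delimiter you pick has to be a character that can never appear in the data itself.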
Another alternative is to use a Function instead of the BAQ service.
Here you can send a JSON blob with all your updates to the Function and do the updates in one call. Not sure what kind of response Salesforce expects, but you could pass back some indicator of success. I'm pretty sure there's an example here on the list. Oh wait, here's one from @Aaron_Moreng:
I would do this in a function. Or if you can use v2 api, use a UBAQ Custom Action.
But IF you are limited to the v1 API, and ODATA, you have a couple of options.
PATCH: You can serialize a JSON payload, put it in a character field of sufficient size, and do a custom update. I believe x(1000) is really nvarchar(max), so I'd make a calculated field and serialize my data into that field.
In the pre-processing of the BPM, deserialize it from that field, and populate the dataset for the base method, or roll your own update.
You cannot use a parameter on PATCH.
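A minimal sketch of that PATCH round trip, shown in Python for illustration (the pre-processing BPM itself would be C#, and the field names here are made up):

```python
import json

# Hypothetical sketch: serialize the whole batch of updates into one
# character/calculated field on the uBAQ row being PATCHed.
# "Calculated_PayloadJson" is an invented field name for illustration.
updates = [
    {"OrderNum": 1001, "SyncStatus": "Success"},
    {"OrderNum": 1002, "SyncStatus": "Failed"},
]
patch_row = {
    "Calculated_PayloadJson": json.dumps(updates),  # the x(1000)/nvarchar(max) field
    "RowMod": "U",
}

# In the pre-processing BPM (C# in Epicor), deserialize and loop:
batch = json.loads(patch_row["Calculated_PayloadJson"])
for rec in batch:
    pass  # populate the base method's dataset, or roll your own update
```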
GET: You can do the same thing but use a parameter; then you have to deal with that parameter on regular GETs (custom logic to ignore it, etc.).
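To sketch the Function route mentioned earlier, a client-side call might look like the following. Everything here is an assumption for illustration: the library and function names, the URL, and the parameter shape are not a real Epicor endpoint.

```python
import json

# Hypothetical client-side sketch: send every status update to an Epicor
# Function in one JSON blob instead of one PATCH per record.
updates = [
    {"OrderNum": 1001, "Status": "Success"},
    {"OrderNum": 1002, "Status": "Failed", "Message": "duplicate record"},
]

# One Function parameter carrying the whole batch as a JSON string
body = json.dumps({"updatesJson": json.dumps(updates)})

# The actual call (commented out; server, library, and auth are placeholders):
# import requests
# resp = requests.post(
#     "https://server/api/v2/efx/MYCOMPANY/SyncLibrary/UpdateSyncStatus",
#     headers={"Content-Type": "application/json", "X-API-Key": "..."},
#     data=body,
#     auth=("user", "pass"),
# )
# The Function could return a per-record success indicator for Mulesoft to
# hand back to Salesforce.
```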