The custom code is a LINQ query that updates a field on the table…
// Description: Calculates the total net weight of a shipment
// Sum of (selling ship qty * part net weight) over every detail line on the pack
var calcWeight = (from shipDtl in Db.ShipDtl
                  join part in Db.Part
                    on new { shipDtl.Company, shipDtl.PartNum }
                    equals new { part.Company, part.PartNum }
                  where shipDtl.Company == Session.CompanyID
                     && shipDtl.PackNum == PackNum
                  select shipDtl.SellingInventoryShipQty * part.NetWeight)
                 .ToList()
                 .Sum();

TotalWeight = Convert.ToDecimal(calcWeight);

using (var txScope = IceContext.CreateDefaultTransactionScope())
{
    // Stamp the calculated total onto every detail line of the pack
    foreach (var shipDtl in (from shipDtlRow in Db.ShipDtl
                             where shipDtlRow.Company == Session.CompanyID
                                && shipDtlRow.PackNum == PackNum
                             select shipDtlRow))
    {
        shipDtl.ShippedWeight_c = TotalWeight;
    }

    Db.Validate();
    txScope.Complete();
}
This is working correctly, but I keep getting these errors.
This method will fire when any part of the shipment changes, and it might be only the ShipHead record that’s changed.
The tt tables contain only changed records, so in that case there won’t be any ttShipDtl records.
Presumably nothing needs to happen in that case, so the easiest way to solve it is an early condition for the ShipDtl table, such as “there is at least one changed row” or “there is at least one added row”.
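If you’d rather handle it in the custom code itself instead of a condition widget, a guard along these lines would do the same job. This is only a sketch, assuming the directive exposes the changed rows as ttShipDtl with the usual RowMod markers:

```cs
// Skip the whole recalculation if this call didn't touch any ShipDtl rows.
// In the tt tables RowMod is "A" for added and "U" for updated rows.
bool anyDetailChanged = ttShipDtl.Any(r => r.RowMod == "A" || r.RowMod == "U");
if (!anyDetailChanged)
{
    return;   // only ShipHead (or nothing relevant) changed, so nothing to do
}
```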
You’re right, they’re only in pre-processing. My habit is to do tests there and enable the post-processing directive if the tests pass.
If you want to skip the extra pre-processing directive, then a simple “number of rows in the designed query” would do the same thing, and the query only needs to contain the ttShipDtl table with some field.
Is it the first thing in the workflow, before you set the value of your variable? If you’re testing that there are rows in the ttShipDtl table then it shouldn’t be able to complain afterwards that there aren’t any …
Stepping back and thinking holistically here, rather than about just the immediate problem …
I’d do the actual work in post-processing, because otherwise it’s messy trying to work out what the current values of everything in the object actually are, even though it’s more straightforward to set new values within the object at the pre-processing stage. By post-processing, the values are all set and ready to read.
I’d be checking that something has changed in the ShippedWeight_c column of at least one ShipDtl record, otherwise you’re wasting the processing. That’s easier done pre-processing. If there is a change, you enable post-processing, and the post-processing directive’s first condition checks whether it’s been enabled.
Then I’d get the values from Db.ShipDtl as you have done, and that should all work.
I’d also refresh dsHolder as a last stage, so the user doesn’t get “row has been modified” errors.
How would there be a change in ShippedWeight_c in pre-processing if the data hasn’t been created yet by the post-processing? Or would I move the custom code to pre-processing? I may sound blonde, but I want to get my head around this…
You’d check for changes in the fields you’re using to calculate it - sorry to be obscure! So looking at your code, that’s going to be SellingInventoryShipQty. If that hasn’t changed for any record, there’s no work to do and the whole lot can be skipped.
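In pre-processing that check could look something like this. It’s only a sketch: it assumes the changed rows come through as ttShipDtl, and it borrows Checkbox01 from the BPM call context purely as an example flag for the post-processing directive’s first condition to test:

```cs
// Pre-processing: only flag the recalculation when a shipped quantity actually changed.
bool qtyChanged = false;

foreach (var ttRow in ttShipDtl.Where(r => r.RowMod == "A" || r.RowMod == "U"))
{
    // Compare the incoming value with what is currently stored for that pack line
    var dbRow = Db.ShipDtl.FirstOrDefault(d =>
        d.Company == ttRow.Company &&
        d.PackNum == ttRow.PackNum &&
        d.PackLine == ttRow.PackLine);

    if (dbRow == null || dbRow.SellingInventoryShipQty != ttRow.SellingInventoryShipQty)
    {
        qtyChanged = true;   // new line, or quantity differs: there is work to do
        break;
    }
}

// Example flag only: the post-processing directive's first condition would check this.
callContextBpmData.Checkbox01 = qtyChanged;
```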
I’m assuming this is firing because of users working on shipments in the client? If so, a BPM which alters the record they’re working on means the client’s version isn’t the latest any more, so if they change something else and then save, they’ll get an error saying so.
The fix is something like this after your code has updated the records:
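Roughly, you push the value you’ve just written back into the rows being returned to the client, so their copy matches the database. Assuming the changed rows come through as ttShipDtl and the UD column is exposed on them, something like:

```cs
// Sketch: copy the freshly calculated weight back into the returned rows so the
// client's dataset reflects what the directive just wrote to the database.
foreach (var ttRow in ttShipDtl.Where(r => r.PackNum == PackNum))
{
    ttRow.ShippedWeight_c = TotalWeight;
}
```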
For future reference, it’s cleaner using this same object to do your updates, too, rather than Db.Validate, and then this follows on naturally. But your existing code will work fine, I think, so that’s just a side note.