Set a global BPM variable via async custom code

Hello!

If I create a BPM variable and set it in async code block 1, can I then access that same BPM variable (with the new value set in async block 1) in async code block 2 of the same BPM?

Yes, the CallContext BPM values persist between BPMs, even between Method and Data BPMs. Sometimes this is used to pass data from one BPM to another… just be VERY careful to document what you do… if you use Checkbox01 for one thing and another programmer uses it for a different purpose, you can get yourself in trouble (personal experience).
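As a rough illustration of that first approach — a hedged sketch assuming Epicor C# custom code blocks, where `callContextBpmData` is in scope and `Checkbox01`/`Character01` are the generic CallContext BPM data fields:

```csharp
// In the first directive's custom code block:
// mark that this call has been pre-processed.
// Checkbox01 is a generic CallContext BPM data field; document
// what it means here so another directive doesn't reuse it.
callContextBpmData.Checkbox01 = true;
callContextBpmData.Character01 = "set by the first directive";

// In a later directive (Method or Data) in the same call chain:
if (callContextBpmData.Checkbox01)
{
    // The value set earlier in the call is still visible here.
    var note = callContextBpmData.Character01;
}
```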
Another “trick” is to simply pass data from one BPM to another inside the dataset itself. You can create a User Defined field in the dataset, and then pass data through that variable.

Hi Tim, thanks for your suggestions! I’ll have to look into creating my own UD field, but I think I misstated something. Let me lay this out:

I have 1 BPM with 2 async code widgets and 1 sync code widget. I need the second async routine to run after the first async code widget has run (and populated my global variable).

1: BPM Variable of type DataSet:

Overview of flow:

The rows get added to the ds just fine in async code block 1, but when I attempt to access the same ds in async code block 2 (the last one), I get no rows.

Some thoughts:

  • Is async code block 2 executing in parallel with block 1, or is it waiting for the first block to finish? I tried testing this, but everything happens so fast I’m not sure.
  • Is it just the nature of async BPMs that a BPM dataset cannot be shared across multiple async code blocks?

The reason:
I’m creating shipments using the BOs in code block 1, but the changes do not get “committed” to the DB until after the code block finishes. Is there a way to force this within the code block? I need the changes in the DB before code block 2 can run.

Should I just serialize the ds into a string and pass that in CallContext to another directive and do the work there? I worry about the maximum size of an nvarchar.
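For what it’s worth, the serialization itself is straightforward with standard .NET `DataSet` XML round-tripping. A minimal sketch (assumptions: `myShipmentDs` is a hypothetical stand-in for the BPM dataset variable, and `Character01` for whatever CallContext field would carry the payload — which, as noted, is a limited-length nvarchar):

```csharp
using System.Data;
using System.IO;

// Serialize the dataset to an XML string, including the schema
// so the receiving directive can rebuild it faithfully.
string payload;
using (var sw = new StringWriter())
{
    myShipmentDs.WriteXml(sw, XmlWriteMode.WriteSchema);
    payload = sw.ToString();
}

// Check the length before stuffing it into an nvarchar-backed field;
// a full shipment dataset can easily exceed the column's limit.
callContextBpmData.Character01 = payload;

// In the receiving directive, rebuild the dataset:
var restored = new DataSet();
using (var sr = new StringReader(callContextBpmData.Character01))
{
    restored.ReadXml(sr, XmlReadMode.ReadSchema);
}
```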

Thanks!

I’m a little curious why you’re executing these dependent tasks as async vs. sync. In my opinion, it’s introducing a ton of opportunity for ambiguity about whether the actual task completes before it moves on to the next (dependent) task.

Hey Aaron,

This is to address a possible scenario where, if the app server crashes/reboots while an async task is running, the process restarts fresh and generates new pack numbers; all pack numbers created in the routine before the server reboot/crash no longer exist. The issue is that in async block 1 I call an API in the loop, sending the pack numbers and other data as the “order” to the warehouse, and this happens before the packs are actually “committed” to the DB. If the server were to crash/reboot during this process, we would send duplicate orders to the warehouse.

This pattern might lend itself to writing some data to a UD table to hold it and then be able to retry in the event of an outage. Hard to say without knowing all the details!


I was leaning that way, but I really didn’t wanna use up a whole UD table for this lol

Yeah, I hear you. You could set up a SQL table and your own API to write to it if that’s easier :wink:
I don’t know, but I bet there is a way to create UD tables? Perhaps?

I ended up creating a UD field on ShipHead that flags when a newly added shipment is ready for the API call (this is set to true in my first async block, since that block creates the customer shipments), then a standard Data Directive that looks for this flag (is true) and sends to our API asynchronously. By that point, the shipment records (pack numbers) have been fully committed to the database. :slight_smile:
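For anyone landing here later, a rough sketch of that final pattern — field and variable names are illustrative (`ReadyForAPI_c` is a hypothetical UD column, `shipDs` a hypothetical CustShip dataset), and the exact UD-column access syntax depends on your Epicor version:

```csharp
// In the async code block that creates the shipments:
// after the CustShip BO calls succeed, set the UD flag on each head.
foreach (var head in shipDs.ShipHead)
{
    head["ReadyForAPI_c"] = true; // hypothetical UD column, set by name
}

// A Standard Data Directive on ShipHead then fires only after the
// shipment rows are committed. Its condition checks the flag, e.g.
//   "the ReadyForAPI_c field of the changed row is equal to true",
// and its action calls the warehouse API and clears the flag so the
// same order is never sent twice.
```

The nice property of this design is that the Data Directive only ever sees committed rows, so a mid-process server crash can no longer produce duplicate warehouse orders.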
