The problem that is difficult to parameterize is that on the presses the MtlPart is a coil, and because of the limitations of the machinery, once production starts on a coil the ENTIRE coil must be consumed; it is not possible to swap coils partway through.
The MtlPart has UOM = KG, and we have the weight in KG of each coil.
The expectation is that Jobs can be scheduled in such a way that they consume the full KG of the coil. We are not clear which way to go.
One alternative we evaluated is adapting the Adjust Global Scheduling Order menu to calculate the consumption of each type of coil by each Job, and then changing the Scheduling Order before launching Global Scheduling.
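As a rough illustration of that idea, here is a minimal standalone sketch (plain Python, not Epicor code) that groups jobs by the coil they consume, proposes a contiguous scheduling order per coil, and reports how much of each coil the group would leave unconsumed. Every job number, coil name, and weight below is made up for the example; in a real implementation the data would come from a BAQ over the job material records.

```python
from collections import defaultdict

# Hypothetical job records: (job number, coil part, KG required).
jobs = [
    ("J-100", "COIL-A", 120.0),
    ("J-101", "COIL-A", 380.0),
    ("J-102", "COIL-B", 250.0),
    ("J-103", "COIL-B", 240.0),
]
coil_weights_kg = {"COIL-A": 500.0, "COIL-B": 500.0}

# Group the jobs by the coil part they consume.
by_coil = defaultdict(list)
for job_num, coil, kg in jobs:
    by_coil[coil].append((job_num, kg))

# Build a proposed scheduling order that keeps jobs sharing a coil contiguous,
# and report how much of each coil the group would leave unconsumed.
schedule_order, seq = [], 10
for coil, group in by_coil.items():
    consumed = sum(kg for _, kg in group)
    remainder = coil_weights_kg[coil] - consumed
    print(f"{coil}: {consumed} KG planned, {remainder} KG left on the coil")
    for job_num, _ in group:
        schedule_order.append((seq, job_num))
        seq += 10

print(schedule_order)
```

The remainder printed for each coil is exactly the gap you would then have to fill by pulling in future jobs or increasing a job quantity.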
We use the lot sizing parameters on the parts that consume these materials to have Epicor write jobs for these parts in quantities that consume exactly "one" of said material.
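The arithmetic behind that lot size is just the coil weight divided by the material consumed per piece; the figures below are hypothetical.

```python
# Hypothetical figures: a 500 KG coil and a part that consumes 2.5 KG per piece.
coil_weight_kg = 500.0
kg_per_piece = 2.5

# Lot size that consumes exactly one coil; this is the quantity the lot sizing
# parameters would need to steer each job toward.
lot_size = coil_weight_kg / kg_per_piece
print(lot_size)  # 200.0 pieces per job
```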
This is something we thought about as well, but the problem is that manufacturing such a large quantity of a single part to consume the coil is neither cost-effective nor efficient.
Is it that you can run as many jobs as you like off of one coil, but the problem is when the last job you are going to run does not consume the exact remaining quantity of the coil you are using?
Yes.
There are different types of coils (with different characteristics), and each part uses certain coils. The problem is that once a coil is placed in the machine it must be completely consumed; it cannot be used partially. We therefore need to group all of the jobs for parts that consume the same coil. Very often that grouping does not line up with the schedule, and the grouped jobs still do not consume the coil completely, so we then have to pull in jobs from the future or increase the quantity to be manufactured for some of the parts.
There is also "batching", where the scheduler will group jobs based on input material.
I might be missing the real issue also, but I'll say that we use coils of wire in our process too - but we've taken the time to make the configurator calculate the exact time and material requirement "per unit/piece" of output. That means that for a given order/job qty, the configurator rules will set the material and operation times to be exactly what our standard is. And the scheduler will do its thing. If you work backwards and want to input a coil/roll size/weight and determine the exact run time and unit output that will occur - that should be easy enough (just the reverse math).
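If it helps, that "reverse math" is just dividing the coil weight by the per-piece material standard and then multiplying by the per-piece run time. A tiny sketch with made-up standards:

```python
# Hypothetical per-unit standards, as a configurator rule might set them.
kg_per_piece = 1.6        # material consumed per piece
minutes_per_piece = 0.75  # run time per piece
coil_weight_kg = 400.0    # weight of the coil/roll being loaded

# Work backwards from the coil size to the output and run time it implies.
pieces_from_coil = coil_weight_kg / kg_per_piece
run_time_hours = pieces_from_coil * minutes_per_piece / 60.0

print(f"{pieces_from_coil:.0f} pieces, {run_time_hours:.2f} hours of run time")
```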
Interestingly enough we do have a very similar restriction in a different area of our manufacturing operation.
We print stripes of laminate. These parts have various widths, but a group of jobs will get printed concurrently on a roll of substrate that has a fixed width every time.
The pool of available jobs/parts changes on a daily basis and is largely unpredictable. As a result, we have a mostly custom solution that presents people with a list of jobs they need to batch together. They use certain criteria to determine which jobs to put together, to consume a roll of substrate as efficiently as possible, and assign an internal "batch id" to these jobs that we use for the duration of the process to know these jobs are meant to be run concurrently.
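For what it's worth, the core of that selection step can be approximated with a simple first-fit-decreasing pass over the open jobs, batching by width until the roll is full. This is only a sketch with invented job widths, and it ignores the other criteria the users apply.

```python
# Invented open jobs with the width (mm) each needs on the roll; the real pool
# changes daily and other criteria besides width also apply.
jobs = [("J-1", 120), ("J-2", 300), ("J-3", 450), ("J-4", 200), ("J-5", 180)]
roll_width = 600

# First-fit-decreasing: sort jobs by width and drop each one into the first
# batch that still has room, opening a new batch (and batch id) when none does.
batches = []  # each entry: [batch_id, remaining_width, [job numbers]]
for job_num, width in sorted(jobs, key=lambda j: j[1], reverse=True):
    for batch in batches:
        if width <= batch[1]:
            batch[1] -= width
            batch[2].append(job_num)
            break
    else:
        batches.append([f"BATCH-{len(batches) + 1}", roll_width - width, [job_num]])

for batch_id, remaining, members in batches:
    print(batch_id, members, f"{remaining} mm of roll width unused")
```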
Interestingly similar. This is the kind of process that I think could sit behind an updatable dashboard that refactors the scheduling order into batches. Might be an interesting project for us to work on.
I have tried to get this to fit into Epicor neatly, but so far I haven't figured out how.
In our situation, we have other factors beyond efficient material consumption that determine what can be grouped together, which complicates things further. We have the relevant factors set up as attributes on the parts, which the users can then filter by to make the process a bit easier. I still think more enhancements can be made, like displaying a running tally of the total width of the selected jobs to compare with the width of the available substrate, etc.
Ultimately, though, I would just like these jobs to be grouped and scheduled automatically.
Yep - us too. It could be four factors in combination that make the best batch group. And we do the same - we present the line supervisor with a list of open jobs in the next 30 days so they can batch as needed, while being careful not to overload inventory or create a WIP storage problem in our limited space.
Anyway, let's keep in touch if we think of anything interesting and share what we find!
As this is moving toward batching - I thought I would give you a link to an approach I put out here a while back.
Take a look and we could discuss more…