OK, this looks like a problem specific to the Live environment.
I copied Live over to Pilot and ran the mfg lead time calculation again.
Results:
Pilot - the mfg lead time calculation took exactly 5 hours.
Live - the mfg lead time calculation started at 6:43 PM yesterday, apparently went into an error state at 3:43 AM, but is still showing as executing as of 9:08 AM.
I ran an analysis to see whether any individual parts were taking too long to calculate. What stands out is simply that Pilot calculates more parts per minute than Live, even though both are running exactly the same process.
Pilot averages around 47 parts per minute.
Live averages around 14 parts per minute, and the per-minute rate is quite erratic.
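As a rough sanity check on those numbers, here is a back-of-the-envelope sketch. It assumes both environments calculate roughly the same set of parts and uses only the averages quoted above:

```python
# Back-of-the-envelope check: does the throughput gap alone explain the run times?
# Assumption: Live and Pilot calculate roughly the same number of parts.
pilot_rate = 47   # parts per minute (observed Pilot average)
live_rate = 14    # parts per minute (observed Live average)
pilot_hours = 5   # Pilot run time

total_parts = pilot_rate * 60 * pilot_hours          # ~14,100 parts in a full run
live_hours_expected = total_parts / live_rate / 60   # expected Live run time

print(f"Approx. parts in a full run: {total_parts:,}")
print(f"Expected Live run time at {live_rate}/min: {live_hours_expected:.1f} h")
# -> roughly 16.8 h, i.e. well past the 9 h mark (6:43 PM to 3:43 AM) at which the
#    task errored. That would be consistent with Live simply running much slower
#    overall rather than hanging on one particular part.
```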
The only other thing that stands out is an error in the Windows Server Event Viewer logs for the ICE Task Agent at exactly 3:43 AM, the same time the process goes into an error state in the System Monitor.
The error message is:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="EpicorTaskAgent3.2.600.0" />
    <EventID Qualifiers="0">215</EventID>
    <Level>2</Level>
    <Task>0</Task>
    <Keywords>0x80000000000000</Keywords>
    <TimeCreated SystemTime="2025-04-10T17:43:46.584545200Z" />
    <EventRecordID>7170168</EventRecordID>
    <Channel>Epicor ICE Task Agent Service</Channel>
    <Computer>PLPE10SVR.preformed.com.au</Computer>
    <Security />
  </System>
  <EventData>
<Data>"ERP102600_LIVE": A communication error occurred trying to run task ID 892426 for agent "print" on the application server (User: "ParasS", Task Description: "Manufacturing Lead Time Calculation"). If this continues to happen investigate if you need to increase the receive and send timeouts in your web.config. Error details: System.ServiceModel.CommunicationException: The socket connection was aborted. This could be caused by an error processing your message or a receive timeout being exceeded by the remote host, or an underlying network resource issue. Local socket timeout was '23:59:59.9990000'. ---> System.IO.IOException: The read operation failed, see inner exception. ---> System.ServiceModel.CommunicationException: The socket connection was aborted. This could be caused by an error processing your message or a receive timeout being exceeded by the remote host, or an underlying network resource issue. Local socket timeout was '23:59:59.9990000'. ---> System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags) at System.ServiceModel.Channels.SocketConnection.ReadCore(Byte[] buffer, Int32 offset, Int32 size, TimeSpan timeout, Boolean closing) --- End of inner exception stack trace --- at System.ServiceModel.Channels.SocketConnection.ReadCore(Byte[] buffer, Int32 offset, Int32 size, TimeSpan timeout, Boolean closing) at System.ServiceModel.Channels.SocketConnection.Read(Byte[] buffer, Int32 offset, Int32 size, TimeSpan timeout) at System.ServiceModel.Channels.DelegatingConnection.Read(Byte[] buffer, Int32 offset, Int32 size, TimeSpan timeout) at System.ServiceModel.Channels.ConnectionStream.Read(Byte[] buffer, Int32 offset, Int32 count) at System.Net.FixedSizeReader.ReadPacket(Byte[] buffer, Int32 offset, Int32 count) at System.Net.Security.NegotiateStream.StartFrameHeader(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest) at System.Net.Security.NegotiateStream.ProcessRead(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest) --- End of inner exception stack trace --- at System.Net.Security.NegotiateStream.ProcessRead(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest) at System.Net.Security.NegotiateStream.Read(Byte[] buffer, Int32 offset, Int32 count) at System.ServiceModel.Channels.StreamConnection.Read(Byte[] buffer, Int32 offset, Int32 size, TimeSpan timeout) --- End of inner exception stack trace --- Server stack trace: at System.ServiceModel.Channels.StreamConnection.Read(Byte[] buffer, Int32 offset, Int32 size, TimeSpan timeout) at System.ServiceModel.Channels.SessionConnectionReader.Receive(TimeSpan timeout) at System.ServiceModel.Channels.SynchronizedMessageSource.Receive(TimeSpan timeout) at System.ServiceModel.Channels.TransportDuplexSessionChannel.Receive(TimeSpan timeout) at System.ServiceModel.Channels.TransportDuplexSessionChannel.TryReceive(TimeSpan timeout, Message& message) at System.ServiceModel.Channels.SecurityChannelFactory`1.SecurityDuplexChannel.TryReceive(TimeSpan timeout, Message& message) at System.ServiceModel.Dispatcher.DuplexChannelBinder.Request(Message message, TimeSpan timeout) at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout) at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, 
ProxyOperationRuntime operation) at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message) Exception rethrown at [0]: at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg) at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type) at Ice.Contracts.RunTaskSvcContract.RunTask(Int64 ipTaskNum) at Ice.Proxy.Lib.RunTaskImpl.RunTask(Int64 ipTaskNum) in C:\_Releases\ICE\RL10.2.600.0FW\Source\Shared\Contracts\Lib\RunTask\RunTaskImpl.cs:line 155 at Ice.TaskAgentCore.ServiceCaller.<>c__DisplayClass34_0.<RunTask_RunTask>b__0(RunTaskImpl impl) at Ice.TaskAgentCore.ImplCaller.RunTaskImplCaller`1.<>c__DisplayClass4_0.<Call>b__0(TImpl impl) at Ice.TaskAgentCore.ImplCaller.RunTaskImplCaller`1.Call[TResult](Func`2 doWork, ExceptionBehavior communicationExceptionBehavior, ExceptionBehavior timeoutExceptionBehavior) at Ice.TaskAgentCore.ImplCaller.RunTaskImplCaller`1.Call(Action`1 doWork, ExceptionBehavior
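The error text itself points at the receive/send timeouts in web.config, i.e. the receiveTimeout/sendTimeout attributes on the WCF binding in the application server's web.config. The snippet below is illustrative only; the binding name is a placeholder and the actual bindings Epicor uses will differ, so it just shows which attributes the message is referring to:

```xml
<!-- Illustrative only: binding name is a placeholder, and the real binding
     section in the Epicor application server's web.config will look different.
     The point is the receiveTimeout / sendTimeout attributes. -->
<system.serviceModel>
  <bindings>
    <netTcpBinding>
      <binding name="ExampleTaskAgentBinding"
               receiveTimeout="23:59:59"
               sendTimeout="23:59:59" />
    </netTcpBinding>
  </bindings>
</system.serviceModel>
```

That said, the logged local socket timeout is already '23:59:59.9990000' (just under 24 hours) while the task failed roughly 9 hours in, so the "connection was forcibly closed by the remote host" may point at something other than a simple timeout.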