I have just recently been tasked with going through code on BPMs to try and figure out why certain ones are taking so long. I had originally run diagnostics on some server logs and found 2 that are taking anywhere from 20-80 seconds to run. The code was originally written for us by a company that we no longer have a relationship with so reaching out to them to try and optimize the code is not on the table.
The code is doing multiple joins from a tt table and I believe this is what is causing the slowdown. I am not sure at this point whether it would be easier to completely redo the code or to salvage and rewrite what we have. The only problem is I haven't been working with Epicor that long and am not very proficient in writing C# yet.
I will paste the code below to see if there are any ideas as to which way I should be going with this.
//string messageText = string.Empty;
foreach (var tt in (from ttrow in ttShipTo select ttrow))
{
    double TotalBillToDate = 0, TotalOrderToDate = 0;
    foreach (var st in (from strow in Db.ShipTo
                        where strow.Company == tt.Company && strow.CustNum == tt.CustNum && strow.ShipToNum == tt.ShipToNum
                        select strow))
    {
        int CurrentOrder = 0;
        foreach (var i in (from irow in Db.InvcDtl
                           join hrow in Db.InvcHead on new { irow.InvoiceNum, irow.Company } equals new { hrow.InvoiceNum, hrow.Company }
                           where
                               irow.Company == tt.Company &&
                               irow.CustNum == tt.CustNum &&
                               irow.ShipToNum == tt.ShipToNum
                           orderby irow.InvoiceNum
                           select new { irow.Company, irow.InvoiceNum, hrow.DocInvoiceAmt, hrow.CreditMemo }))
        {
            if (CurrentOrder == i.InvoiceNum) continue;
            TotalBillToDate += Convert.ToDouble(i.DocInvoiceAmt);
            CurrentOrder = i.InvoiceNum;
        }
        foreach (var i in (from irow in Db.OrderHed
                           join drow in Db.OrderDtl on new { irow.Company, irow.OrderNum } equals new { drow.Company, drow.OrderNum }
                           join rrow in Db.OrderRel on new { drow.Company, drow.OrderNum, drow.OrderLine, tt.ShipToNum } equals new { rrow.Company, rrow.OrderNum, rrow.OrderLine, rrow.ShipToNum }
                           where irow.Company == tt.Company && irow.ShipToCustNum == tt.CustNum
                           orderby irow.OrderNum
                           select new { rrow.ShipToNum, rrow.OrderNum, DocTotalCharges = irow.DocTotalCharges + irow.DocTotalTax, irow.DocTotalMisc }))
        {
            if (CurrentOrder == i.OrderNum) continue;
            TotalOrderToDate += Convert.ToDouble(i.DocTotalCharges + i.DocTotalMisc);
            CurrentOrder = i.OrderNum;
        }
        st["TotalBilledToDate_c"] = Convert.ToDecimal(TotalBillToDate);
        st["TotalOrderedToDate_c"] = Convert.ToDecimal(TotalOrderToDate - TotalBillToDate);
        tt["TotalBilledToDate_c"] = Convert.ToDecimal(TotalBillToDate);
        tt["TotalOrderedToDate_c"] = Convert.ToDecimal(TotalOrderToDate - TotalBillToDate);
    }
}
At a glance, the first thing that screams at me is the continue statements.
They are a code smell. If you are doing an if-and-continue, that means you just pulled all of that data back from the db into the app server only to throw it away. I'd probably look at adding the condition to the where clause:
from irow in Db.InvcDtl
join hrow in Db.InvcHead on new { irow.InvoiceNum, irow.Company } equals new { hrow.InvoiceNum, hrow.Company }
where
    irow.Company == tt.Company &&
    irow.CustNum == tt.CustNum &&
    irow.ShipToNum == tt.ShipToNum
    // Something like this → && irow.InvoiceNum != CurrentOrder
orderby irow.InvoiceNum
select new { irow.Company, irow.InvoiceNum, hrow.DocInvoiceAmt, hrow.CreditMemo }))
{
    // Then this can be deleted → if (CurrentOrder == i.InvoiceNum) continue;
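One wrinkle with moving the filter into the where clause: CurrentOrder is reassigned on every iteration, so a literal `!= CurrentOrder` can't reproduce the running "skip repeated InvoiceNum" logic. What the loop is really doing is summing each invoice header amount once, which a grouped query expresses directly. Here is a minimal, self-contained sketch of that shape using mock data (the record type and values are stand-ins for the InvcDtl/InvcHead join, not the real Epicor tables):

```csharp
using System;
using System.Linq;

class InvoiceDemo
{
    // Hypothetical stand-in for the InvcDtl/InvcHead join result.
    record InvcLine(int InvoiceNum, decimal DocInvoiceAmt);

    static void Main()
    {
        var lines = new[]
        {
            new InvcLine(100, 50m),
            new InvcLine(100, 50m),  // second detail line, same header amount
            new InvcLine(101, 25m),
        };

        // Count each InvoiceNum's header amount once, instead of
        // skipping duplicates with if/continue inside a loop.
        double totalBillToDate = (double)lines
            .GroupBy(l => l.InvoiceNum)
            .Sum(g => g.First().DocInvoiceAmt);

        Console.WriteLine(totalBillToDate);  // 75
    }
}
```

Against the real Db context a GroupBy like this should translate into a single grouped SQL statement, so the de-duplication happens server-side instead of on the app server.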
The next thing would be pulling extra columns that are not needed. Company and CreditMemo are not needed in the first query; ShipToNum is not needed in the second. Nitpicking, but I like to trim my queries against SQL as much as possible.
The other oddity I see is that you are looping over a collection of ttShipTo rows. How many duplicates do you expect versus unique data in those inner queries? Whenever I look at looping, I always worry I am retrieving the same data for every loop. I don't think that is applicable to the data you have here, but it always factors into my thinking. It may be more appropriate to retrieve all the lookup data in one shot and then iterate over it.
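One way that "retrieve it all in one shot" idea could look, sketched with mock in-memory data (all names and values here are hypothetical, not the real Epicor schema): run one grouped query up front, put the per-ship-to totals in a dictionary, then each tt row becomes a lookup instead of a round trip.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class PrefetchDemo
{
    // Hypothetical stand-in for the invoice join result.
    record Invoice(int CustNum, string ShipToNum, decimal Amt);

    static void Main()
    {
        var invoices = new[]
        {
            new Invoice(1, "MAIN", 100m),
            new Invoice(1, "MAIN",  50m),
            new Invoice(1, "WEST",  25m),
        };

        // One grouped query up front instead of one query per ttShipTo row.
        Dictionary<(int, string), decimal> billedByShipTo = invoices
            .GroupBy(i => (i.CustNum, i.ShipToNum))
            .ToDictionary(g => g.Key, g => g.Sum(i => i.Amt));

        // Each tt row then does a cheap in-memory lookup.
        Console.WriteLine(billedByShipTo[(1, "MAIN")]);
        Console.WriteLine(billedByShipTo[(1, "WEST")]);
    }
}
```

Against the real database this trades N small queries (one per ttShipTo row) for a single grouped query, which is usually the bigger win than any per-query tuning.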
I have not even considered the query itself or the joins and how they align with indexes - that's often an issue.
@knash that same issue does apply to custom code. You should NEVER EVER NEVER join to a ttTable. Weren't you at our talk? Did ya fall asleep? LoL
In this case, though, the joins are not to the ttTable; they are using a primitive within the ttRecord as a where clause, and that is OK.
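To illustrate the distinction, here is a hedged sketch of the two shapes with mock data (types and values are hypothetical stand-ins; with in-memory lists both run fine, but the comments describe what a real LINQ provider would do with each):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class JoinShapeDemo
{
    record TtRow(string Company, int CustNum);
    record ShipToRow(string Company, int CustNum, string ShipToNum);

    static void Main()
    {
        var tt = new List<TtRow> { new("EPIC", 1) };           // in-memory tt rows
        var dbShipTo = new List<ShipToRow>                     // stand-in for Db.ShipTo
        {
            new("EPIC", 1, "MAIN"),
            new("EPIC", 2, "WEST"),
        };

        // Bad shape: joining the Db table to the in-memory tt collection.
        // A real provider cannot ship tt to SQL, so the table data comes
        // back to the app server and the join happens in memory.
        var joined = from s in dbShipTo
                     join t in tt on new { s.Company, s.CustNum }
                                  equals new { t.Company, t.CustNum }
                     select s.ShipToNum;

        // OK shape: only primitives from the tt row appear in the where
        // clause; they become SQL parameters and filtering stays server-side.
        var t0 = tt[0];
        var filtered = from s in dbShipTo
                       where s.Company == t0.Company && s.CustNum == t0.CustNum
                       select s.ShipToNum;

        Console.WriteLine(string.Join(",", joined));
        Console.WriteLine(string.Join(",", filtered));
    }
}
```

Both shapes return the same rows; the difference is only visible in where the work happens once a real database is behind the query.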
Here are the slides @asmar - there was a LOT more covered in the conversation/Q&A part; unfortunately I don't have that. Someone recorded it but they haven't sent it to me yet. E10Help Panel.pdf (1.3 MB)
"So it appears the assumption I made was correct: joining to the class object causes the 'join' to occur in memory, post SQL query, so it would yield a much slower result. Good stuff!" - Jose
You need to always be aware that joining to an in memory structure is problematic. Sorry I did not review your question to dig it out.
I run into this in internal reviews as well. I still have not looked at your complete scenario, but one of the things folks do a lot is push the ttrows (or some subset of columns) into the db as a "temp table" and join against it. Very quick then, even with millions of db records in PartTran and similar tables. The push of the data into the temp table and the cleanup of records afterwards is many times faster than the "join to tt".
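For the curious, a rough sketch of the SQL that temp-table approach would run, built as a string so the shape is visible in one place. The table and column names (InvcDtl, InvcHead, DocInvoiceAmt, etc.) come from the code in this thread; everything else (the #ttKeys table, parameter names) is hypothetical:

```csharp
using System;

class TempTableSketch
{
    static void Main()
    {
        // Hypothetical SQL: push the tt keys in once, join server-side, clean up.
        string sql = @"
CREATE TABLE #ttKeys (Company NVARCHAR(8), CustNum INT, ShipToNum NVARCHAR(14));

-- one INSERT per ttShipTo row (or SqlBulkCopy for larger sets)
INSERT INTO #ttKeys VALUES (@Company, @CustNum, @ShipToNum);

-- inner SELECT DISTINCT mirrors the loop's 'count each invoice once' logic
SELECT x.ShipToNum, SUM(x.DocInvoiceAmt) AS TotalBillToDate
FROM (SELECT DISTINCT k.ShipToNum, h.InvoiceNum, h.DocInvoiceAmt
      FROM #ttKeys k
      JOIN Erp.InvcDtl d  ON d.Company = k.Company
                         AND d.CustNum = k.CustNum
                         AND d.ShipToNum = k.ShipToNum
      JOIN Erp.InvcHead h ON h.Company = d.Company
                         AND h.InvoiceNum = d.InvoiceNum) x
GROUP BY x.ShipToNum;

DROP TABLE #ttKeys;";
        Console.WriteLine(sql.Contains("JOIN") ? "has join" : "no join");
    }
}
```

The point is that the join against the tt keys happens entirely inside SQL Server, so only the small aggregated result travels back to the app server.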
â Bart
I went ahead and made some of the changes that you outlined and the code was still going quite slow. I think I am going to have to dive into this code and really break it down to figure out why it was created this way. In my opinion it is the code itself and we may have to think about using other methods to get the information as there has to be an easier/more efficient way that might not even require custom code.
The code was originally dictated by our credit department to our consultant at the time. It is meant to go through all the ship-tos for the current customer, grab their total ordered to date, and give us the total outstanding balance for the currently selected ship-to. We are going to be talking with the credit department to hopefully clean up the naming convention, as it does not accurately reflect what it is supposed to represent.
I don't know if the code is somehow populating all the custom fields in the Erp.ShipTo_UD table for every ship-to for all customers at once. We have the same data fields used in dashboards that populate the information in seconds, yet when pulling up a customer in Customer Maintenance it hangs, sometimes up to 3 minutes in extreme cases.
It has definitely been fun trying to go into code written by other people, trying to figure out everything they've created/touched. We have had a total of 3 different consulting companies doing things for us, and my team has been brought into the fold with Epicor and is now tasked with cleaning it all up as much as possible. I am going to have some further discussions with my team and see if there is something we can come up with.
Also thanks for the pdf of the E10 Help Panel Jose! I attended it this year at insights and wanted to get those slides to have as a reference in addition to the notes I took.
Agree with @hkeric.wci on that prototyping approach. I would be curious how much data is in each query you are looking at. There is always the time someone has some outlier number of records that was not expected and throws this all off. Doing some breakdowns on how many records are in ttShipTo, then the first query, then the second, etc would be interesting to see.
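A minimal sketch of that breakdown idea, with mock collections standing in for ttShipTo and the first inner query (all names and sizes here are hypothetical): count each stage and time it, so an outlier row count jumps out immediately.

```csharp
using System;
using System.Diagnostics;
using System.Linq;

class BreakdownDemo
{
    static void Main()
    {
        // Mock stand-ins for ttShipTo and the first inner query's result.
        var ttShipTo = Enumerable.Range(0, 3).ToList();
        var invoiceQuery = Enumerable.Range(0, 1200).ToList();

        var sw = Stopwatch.StartNew();
        int ttCount = ttShipTo.Count;       // how many tt rows drive the outer loop?
        int invCount = invoiceQuery.Count;  // how many rows does query #1 return?
        sw.Stop();

        // In a BPM you would surface these via an info message or the
        // server log rather than Console.
        Console.WriteLine($"ttShipTo rows: {ttCount}, invoice rows: {invCount}, ms: {sw.ElapsedMilliseconds}");
    }
}
```

If one of those counts comes back in the tens of thousands when you expected dozens, that stage is where to focus first.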
You can also turn on the SQL and DB Perf traces in appserver.config to get some numbers on how long it takes to execute different items.
@IanCastellanos If you just download LINQPad (I would recommend the $99 paid version) and then open those files in the above attachment, you should be ready to go with minimal configuration. You are welcome, no worries - let us know if we can help.
Got LINQPad 4 all set up and have run into what I think is a big problem. It seems that there is no ttShipTo in our databases, which makes me wonder what the code is actually doing, unless I am missing something. Which could be the case, since Epicor is still new to me.