What can we rely on to not change in future versions when developing custom software?

Our primary Epicor developer has been writing almost all of his solutions outside of Epicor in node.js, exposing a web interface for users and interacting with Epicor data via direct SQL access. REST APIs are used for calculated fields only.

We are now talking about upgrading, and one consideration was moving to the cloud. We were told we wouldn't have direct SQL access, which would be a major issue because we would have to rewrite the last year or two of his code to use another method for accessing Epicor data. I'm hoping the company decides to standardize on REST APIs (no direct DB access), provided we have some guarantee the REST APIs won't change in the future. I assume this would abstract us away from the database implementation (e.g. E9 used a Progress database and E10 uses Microsoft SQL Server), so if it changed again in the future it would have no impact on our code…

You can build BAQs and Functions and then expose them via REST. Those really shouldn't change. You created them, after all.
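For example, pulling rows from one of your own BAQs is just a GET against the BAQ service. A rough sketch in node.js (the server URL, company, BAQ name, credentials, and API key are all placeholders, and the exact v2 path can vary by install, so check your server's REST help/Swagger page):

```typescript
// Rough sketch: read rows from a user-defined BAQ over Kinetic REST v2.
// Everything below (server, company, BAQ name, credentials) is a placeholder.
const BASE = "https://kinetic.example.com/server/api/v2/odata/MYCOMPANY";

async function getBaqRows(baqName: string): Promise<unknown[]> {
  const res = await fetch(`${BASE}/BaqSvc/${baqName}/Data`, {
    headers: {
      Authorization: "Basic " + Buffer.from("user:password").toString("base64"),
      "X-API-Key": process.env.EPICOR_API_KEY ?? "",
      Accept: "application/json",
    },
  });
  if (!res.ok) throw new Error(`BAQ call failed: ${res.status}`);
  const body = (await res.json()) as { value: unknown[] };
  return body.value; // OData wraps the result set in "value"
}

getBaqRows("MyCompany_OpenOrders").then(rows => console.log(rows.length, "rows"));
```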

2 Likes

You will never have a guarantee that things won't change. If you're going to customize, you have to be willing and able to adjust and fix as things change. The only way to avoid changes entirely is to stop upgrading.

REST calls are abstracted enough that things shouldn't change very often, but it's never going to be 100% static. It just isn't possible to improve the product and add features while never changing anything.

2 Likes

I would recommend using Functions, as you have access to the DB even in the cloud. You are not able to create stored procedures, views, and other database objects, but you can read and write to the DB through a Function.
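Calling such a Function from your node.js layer is just a POST against the Functions route. A rough sketch (library name, function name, payload shape, and credentials are placeholders for whatever your Function actually defines; verify the exact /efx/ path on your server's Swagger page):

```typescript
// Rough sketch: invoke an Epicor Function over REST v2.
// "MyLib", "UpdateReleaseDate", and the payload are placeholders for
// whatever library/function/signature you actually created.
const BASE = "https://kinetic.example.com/server/api/v2/efx/MYCOMPANY";

async function callFunction(library: string, fn: string, payload: object) {
  const res = await fetch(`${BASE}/${library}/${fn}`, {
    method: "POST",
    headers: {
      Authorization: "Basic " + Buffer.from("user:password").toString("base64"),
      "X-API-Key": process.env.EPICOR_API_KEY ?? "",
      "Content-Type": "application/json",
    },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`Function call failed: ${res.status}`);
  return res.json(); // response properties mirror the Function's output parameters
}

callFunction("MyLib", "UpdateReleaseDate", { orderNum: 12345, newDate: "2024-06-01" })
  .then(out => console.log(out));
```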

Yes, even my widget-based BPMs break on upgrade. Nothing is guaranteed, but some things are more likely to break than others.

Thanks. I know a lot of REST APIs include a version in the endpoint (e.g. /api/v1/x/y/z), so if they need to make a breaking change they release a new version (e.g. /api/v1.1/x/y/z) and keep the old one. I didn't know if Epicor had any policies like this.

Sounds like using the REST APIs instead of direct DB access is the lower-risk option, but we have no guarantees or policies from Epicor regarding backward compatibility.

Thanks!!!

I am definitely Team API. Amazon Web Services was only able to grow and adapt as they did because of the apparent Jeff Bezos API Mandate.

The way people view this problem has a lot to do with when they learned to program. In my first paid programming gig, I worked with a payroll system on an IBM mainframe. Y'all will love this. The way you ran payroll was to mount two tapes. The source tape held the state of the payroll from the previous run. You ran the program, which would read the source tape, apply maintenance changes, and calculate payroll. The updated records were written to the new tape, and that new tape became the input for the next run.

:exploding_head:

Problem with the payroll? Reuse the source tape and run it again. :person_shrugging:

Eventually, the data on tapes moved into databases. Most of us here learned to read and write databases and have a tough time imagining there is any other way to do things. The challenges with databases are keeping our code in sync with the schema and keeping access secure. In the pre-Internet era, the security risks were smaller, so not a lot of attention was paid to them. As for schema synchronization, we came up with tools like Entity Framework that generate code from schemas or create databases from code.

But as clients got further away from the database servers (remote sites, the Internet), things got more complicated. When people interact with the database directly, all the business logic has to live in the client, and any change to that logic requires updating every client around the enterprise. That can be a PITA for organizations of any size. There is also a performance penalty, because you have to move data out to the client to do the work and then back in to update the database. On a LAN this wasn't too bad, but with remote sites it can be painful - mostly to users, because devs are usually closer to the database. :wink:

And then there's security. Sure, people can enable TLS on SQL connections, but let's face it: nobody does, because that means certificate maintenance, proper coding at the client, etc. Also, the only authentication is usernames and passwords or Windows Auth (if you're lucky), and authorization requires work too - making sure the schema actually restricts users to the right columns, tables, views, and stored procedures. Again, most people just leave it wide open. I don't even want to think about how many ODBC connections are out there to Kinetic databases, wide open to anyone with Excel or MS Access. :worried:

Programmers graduating today are used to dealing with APIs. The only time they access a database directly is when they are writing an API. And even then, they try to abstract away the details of the particular database in case they want to move from one to another, like SQL Server to Postgres or MySQL. Microsoft does not grant direct access to the Exchange Data Store or the hard drives behind Azure Blob Storage. Why do we grant direct access? :thinking: With the growing trend of Domain Driven Design and Vertical Slice Architecture, the database is no longer the center of the world - which is a completely foreign idea to the large crop of N-Tier coders out there today.

Most of the developers here on this list use APIs as if they were direct database calls: they call the REST endpoints for Part, Sales Order, etc. directly, which means the business logic is probably still in the client. A handful of coders here primarily use just a few endpoints, most notably Epicor Functions and the BAQSvc. The Functions contain all the business logic, so if there is a breaking change in any of the Kinetic services, there is only one place to fix it and no new clients to roll out. And even when a new client is needed, API versioning lets older clients keep operating, with reduced functionality, until the new one is rolled out.

To take this a step further, one could write a layer of APIs that interacts with the Epicor Functions and BAQs. This provides a few more benefits. API keys can now be hidden from the client: clients authenticate to your API, and the API retrieves the API key appropriate for that client. This allows more fine-grained authorization too, including conditional access. One can also protect the server with rate limiting to prevent denial-of-service attacks. And finally, one can add some caching to improve the performance of the API, because you're not hitting the database as often and remote users feel closer to the data when the cache is closer to them.
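A rough sketch of what that wrapper layer can look like in node.js, assuming Express (the route, BAQ name, credentials, and the crude in-memory cache and rate limiter are all placeholders; a real implementation would use proper middleware):

```typescript
// Rough sketch of a wrapper API in front of Kinetic REST, assuming Express.
// It keeps the Epicor credentials and API key server-side, adds a small
// in-memory cache, and applies a crude per-IP rate limit. Names are placeholders.
import express from "express";

const app = express();
const KINETIC = "https://kinetic.example.com/server/api/v2/odata/MYCOMPANY";
const CACHE_MS = 30_000; // serve cached BAQ results for 30 seconds
const cache = new Map<string, { at: number; body: unknown }>();
const hits = new Map<string, number>();

app.get("/orders/open", async (req, res) => {
  // Crude rate limit: 60 requests per IP per rolling minute.
  const ip = req.ip ?? "unknown";
  const n = (hits.get(ip) ?? 0) + 1;
  hits.set(ip, n);
  setTimeout(() => hits.set(ip, (hits.get(ip) ?? 1) - 1), 60_000);
  if (n > 60) {
    res.status(429).send("Slow down");
    return;
  }

  // Serve from cache when the data is fresh enough.
  const cached = cache.get("openOrders");
  if (cached && Date.now() - cached.at < CACHE_MS) {
    res.json(cached.body);
    return;
  }

  // The Epicor credentials and API key live here, never in the browser.
  const upstream = await fetch(`${KINETIC}/BaqSvc/MyCompany_OpenOrders/Data`, {
    headers: {
      Authorization: "Basic " + Buffer.from("svcUser:password").toString("base64"),
      "X-API-Key": process.env.EPICOR_API_KEY ?? "",
    },
  });
  if (!upstream.ok) {
    res.status(502).send("Kinetic call failed");
    return;
  }
  const body = await upstream.json();
  cache.set("openOrders", { at: Date.now(), body });
  res.json(body);
});

app.listen(3000);
```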

So, for all the reasons you're feeling pain now, I would recommend some level of API over direct access. Direct access is just technical debt that has to be paid eventually.

4 Likes

Really appreciate the well-thought-out response!

The problem is I'm not a decision maker, and I'm trying to get ammo to help sway the decision makers toward what I believe is the wiser technical approach. Decision makers care mostly about $, so I've found that bringing them a technical argument generally doesn't work. I have to figure out how to make it about $, but I'm not sure how to quantify it…

It all boils down to this: you can pay now, or you can pay later.

It’s probably cheaper in the long run to pay now.

1 Like

NEVER… NEVER should anyone be writing directly to the database through SQL. That goes around the business logic, and you can corrupt the database if you do things incorrectly. Using the APIs is the only way to make sure the correct business logic has been applied.
CASE IN POINT:

1. You can update the quantity or date on an order release, but if you do it in SQL, the other tables that also need updating (OrderDtl, OrderHed, and PartDtl come to mind) will not be properly updated.
2. Modifying the JobHead table will not update all the details (JobAsmbl, JobMtl, JobOper).

EVERYTHING should be done with method calls via REST (see the sketch below).
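For contrast, here is roughly what the first case looks like done properly through the SalesOrder business object instead of SQL. The order/release numbers, field names, and exact payload shapes are from memory and should be checked against your server's Swagger page; some fields also want their dedicated Change* method called before Update:

```typescript
// Rough sketch: change an order release date through Erp.BO.SalesOrderSvc
// so Epicor's business logic handles validation and the related tables.
// Server, company, credentials, and field names are placeholders.
const BASE = "https://kinetic.example.com/server/api/v2/odata/MYCOMPANY";
const HEADERS = {
  Authorization: "Basic " + Buffer.from("user:password").toString("base64"),
  "X-API-Key": process.env.EPICOR_API_KEY ?? "",
  "Content-Type": "application/json",
};

async function bo(method: string, body: object): Promise<any> {
  const res = await fetch(`${BASE}/Erp.BO.SalesOrderSvc/${method}`, {
    method: "POST",
    headers: HEADERS,
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`${method} failed: ${res.status}`);
  return res.json();
}

async function moveReleaseDate(orderNum: number, orderLine: number, relNum: number, newDate: string) {
  // 1. Pull the whole order dataset through the business object.
  const { returnObj: ds } = await bo("GetByID", { orderNum });

  // 2. Mark only the release row being changed.
  const rel = ds.OrderRel.find(
    (r: any) => r.OrderLine === orderLine && r.OrderRelNum === relNum
  );
  if (!rel) throw new Error("Release not found");
  rel.ReqDate = newDate;
  rel.RowMod = "U";

  // 3. Let the business object validate and cascade the change.
  await bo("Update", { ds });
}

moveReleaseDate(12345, 1, 1, "2024-06-01").catch(console.error);
```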

Note that in the cloud there is NO DIRECT SQL ACCESS… PERIOD. All data access must be done via API calls (both reading and writing). Anyone taking a shortcut now will be very disappointed if they ever try to move to the cloud later.

5 Likes