I am definitely Team API. Amazon Web Services was only able to grow and adapt as it did because of Jeff Bezos's now-famous API Mandate.
The way people view this problem has a lot to do with when they learned to program. In my first paid programming gig, I worked with a payroll system on an IBM mainframe. Y’all will love this. The way you ran payroll was to mount two tapes. The source tape held the state of the payroll from the previous run. You ran the program, which read the source tape, applied maintenance changes, and calculated payroll. The updated records were written to the new tape, and that new tape became the input for the next run.
Problem with the payroll? Reuse the source tape and run it again.
Eventually, the data on tapes moved into databases. Most of us here learned to read and write databases and have a tough time imagining there is any other way to do things. The challenges with databases are keeping our code in sync with the schema, and security. In the pre-Internet era, the security risks were smaller, so not a lot of attention was paid to them. As for schema synchronization, we came up with tools, like Entity Framework, that create code from schemas or create databases from code.
But as clients got further away from the database servers - remote sites, the Internet - things got more complicated. When clients interact with the database directly, all the business logic has to live in the client, so any change to the business logic requires updating every client around the enterprise. That can be a PITA in an organization of any size. There is also a performance penalty, because the data has to move out to the client to do the work and then back in to update the database. On a LAN, this wasn’t too bad, but with remote sites it can be painful - mostly for users, because devs are usually closer to the database.
And then there’s security. Sure, people can enable TLS on SQL connections, but let’s face it: nobody does. Doing it right means certificate maintenance, proper coding at the client, and so on. And the only authorization available is usernames and passwords, or Windows Auth if you’re lucky. That all requires work too - making sure the schema authorizes users to the right columns, tables, views, and stored procedures. Again, most people just leave it pretty wide open. I don’t even want to think about how many ODBC connections are out there to Kinetic databases, wide open to anyone with Excel or MS Access.
Programmers graduating today are used to dealing with APIs. The only time they would access databases directly is when they are writing an API. And even then, they try to abstract away the details of the particular database in case they want to move from one to another, like SQL Server to Postgres or MySQL. Microsoft does not grant direct access to the Exchange Data Store or the hard drives behind Azure Blob Storage. Why do we grant direct access? With the growing trend of Domain Driven Design and Vertical Slice Architecture, the database is no longer the center of the world - which is a completely foreign idea to the large crop of N-Tier coders out there today.
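To make that abstraction idea concrete, here is a minimal sketch of the pattern - hiding the storage engine behind an interface so callers never know which database is underneath. All the names (`PartRepository`, `InMemoryPartRepository`, the `PartNum` field) are illustrative, not real Epicor types:

```python
from abc import ABC, abstractmethod


class PartRepository(ABC):
    """Hypothetical abstraction over the storage engine."""

    @abstractmethod
    def get(self, part_num):
        """Return the part record, or None if it doesn't exist."""

    @abstractmethod
    def save(self, part):
        """Insert or update a part record."""


class InMemoryPartRepository(PartRepository):
    """Stand-in backend; a SQL Server or Postgres version would
    implement the same two methods against its own driver."""

    def __init__(self):
        self._rows = {}

    def get(self, part_num):
        return self._rows.get(part_num)

    def save(self, part):
        self._rows[part["PartNum"]] = part


# Swapping SQL Server for Postgres means writing a new
# PartRepository implementation; the calling code never changes.
repo = InMemoryPartRepository()
repo.save({"PartNum": "100-200", "Description": "Widget"})
print(repo.get("100-200")["Description"])  # prints "Widget"
```

The point is that the database choice becomes a detail of one class, not an assumption smeared across every client.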
Most of the developers here on this list use APIs as if they were direct database calls: they call the REST endpoints for Part, Sales Order, etc. directly, which means the business logic is probably still in the client. A handful of coders here primarily use just a few endpoints - most notably Epicor Functions and the BAQSvc. Functions contain all the business logic, so if there is a breaking change in any of the Kinetic services, there is only one place to fix it. One doesn’t have to roll out new clients - and even if one does, API versioning allows older clients to continue operating with reduced functionality until a new client is rolled out.
To take this a step further, one could write a layer of APIs that interacts with the Epicor Functions and BAQs. This provides a few more benefits. API keys can now be hidden from the client: clients authenticate to the API layer, and the layer retrieves the API key appropriate for that client. This enables more fine-grained authorization too, including conditional access. One can also protect the server with rate limiting to prevent Denial of Service attacks. And finally, one can add caching to improve performance, because the API isn’t hitting the database as often, and remote users feel closer to the data when the cache is closer to them.
So, for all the reasons you’re feeling pain now, I would recommend some level of API over direct access. Direct access is just technical debt that has to be paid eventually.