Working with a company that wants to use Power BI cloud with a replicated copy of their Kinetic public cloud database. Are others doing this?
In searching past posts, it wasn't clear to me whether this could be done. For various reasons, the data analysis team has chosen Power BI instead of the standard Kinetic tools.
Do they really need the whole database? Are they trying to do real-time reporting? If not, export the immutable stuff (PartTran, all financial postings, etc.) and sync the required master files (Parts, Customers, Suppliers, etc.) using SysRowIDs and SysRevIDs. Might be a good time to look at Microsoft Fabric if they are invested in Power BI.
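To make the sync idea concrete, here is a minimal sketch in Python, assuming the replica is reachable over ODBC. The connection strings, staging schema, and the Part columns pulled are placeholders I made up; the actual point is the SysRevID watermark, which moves whenever a row changes, with SysRowID as the stable upsert key.

```python
import pyodbc

# Placeholder connection strings - point SRC at the replicated Kinetic copy
# and DST at wherever the analytics team stages data.
SRC = ("DRIVER={ODBC Driver 18 for SQL Server};SERVER=kinetic-replica;"
       "DATABASE=KineticCopy;Trusted_Connection=yes")
DST = ("DRIVER={ODBC Driver 18 for SQL Server};SERVER=analytics;"
       "DATABASE=Staging;Trusted_Connection=yes")

def sync_changed_rows(table: str, last_rev: int) -> int:
    """Copy rows whose SysRevID moved past the watermark from the last run."""
    with pyodbc.connect(SRC) as src, pyodbc.connect(DST) as dst:
        changed = src.execute(
            f"SELECT SysRowID, SysRevID, PartNum, PartDescription "
            f"FROM Erp.{table} WHERE SysRevID > ?", last_rev).fetchall()
        cur = dst.cursor()
        for row in changed:
            # Delete-then-insert keeps the sketch short; a real job would MERGE.
            cur.execute(f"DELETE FROM stg.{table} WHERE SysRowID = ?",
                        row.SysRowID)
            cur.execute(
                f"INSERT INTO stg.{table} "
                f"(SysRowID, SysRevID, PartNum, PartDescription) "
                f"VALUES (?, ?, ?, ?)",
                row.SysRowID, row.SysRevID, row.PartNum, row.PartDescription)
        dst.commit()
    # The highest SysRevID seen becomes the watermark for the next run.
    return max((r.SysRevID for r in changed), default=last_rev)

# e.g. new_watermark = sync_changed_rows("Part", previous_watermark)
```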
Their analytics team does not want to use BAQs or other standard Kinetic tools. They want to interrogate the database directly, and because of potential query loads we don’t want them hitting any of the regular instances.
They want the whole database. This isn’t for regular operational or financial reporting. They will be analyzing trends, patterns, etc. Big data type of things.
Thanks @josecgomez - they were told they had to provide static IP addresses the cloud team would whitelist, but the analytics users could be anywhere, so that wasn't feasible.
When you connect a SQL DB to Power BI, you normally do so via a Dataflow, which sucks the data out of the DB and puts it into the Power Platform. The only thing you'd have to whitelist is the Power Platform's access to that DB, which (since I believe Epicor already hosts these DBs in Azure anyway) is literally a checkbox on their side.
If they are trying to do live queries against the DB in Power BI, I'd suggest they find new consultants or analysts, because that is super inefficient and slow in most cases, particularly for a large Epicor DB.
The Power BI part is no problem; been there, doing that. REST is easy. BAQ is awkward and annoying, but it filters out most Stack Overflow copypasta.
Something to consider… Power BI's connection wizards make it really easy to drag entire tables out of the server and cherry-pick what you actually wanted client-side. PBI training basically never teaches writing SQL, which is fine, but if you need to reference the trailing week's LaborDtl, someone needs to write SQL. A SQL skills gap among analysts can be a thorny thing to discover, even though it's not at all uncommon.
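For example, that trailing-week pull is one short query if someone can write it. A minimal sketch, assuming the replicated DB is reachable over ODBC; the server/database names are placeholders, and the LaborDtl column names are from memory, so verify them against your schema:

```python
import pandas as pd
import pyodbc

# Placeholder connection to the replicated Kinetic copy.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=kinetic-replica;DATABASE=KineticCopy;Trusted_Connection=yes")

# Filter server-side so only the trailing week of LaborDtl crosses the wire,
# instead of dragging the whole table into Power BI and trimming it there.
trailing_week = pd.read_sql(
    """
    SELECT ld.Company, ld.EmployeeNum, ld.JobNum, ld.OprSeq,
           ld.ClockInDate, ld.LaborHrs
    FROM Erp.LaborDtl AS ld
    WHERE ld.ClockInDate >= DATEADD(day, -7, CAST(GETDATE() AS date))
    """,
    conn)

print(trailing_week.groupby("EmployeeNum")["LaborHrs"].sum())
```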
There can be other good reasons, but the cost and effort of a database mirror or Microsoft Fabric deserve that level of justification.
We are using Power BI via the REST API; however, we have not queried tables directly. We filter the data we want to analyse in BAQs and then visualise it in Power BI.
Remember, Power BI is a visualisation tool, not a data analytics tool, so any data cleansing is best done in Epicor BEFORE sending it over. If your data analysts have any data analytics training, they will know any kind of analysis in Power BI is slow. Could they use Visual Studio to generate the tables, do the extract process to a local SQL database in the format they want, and then visualise from there?
I only know this because I am currently doing a data analytics apprenticeship with Power BI.
Hopefully this will help you. Connecting the REST API to Power BI is a very simple process UNTIL you get over 50,000 rows of data, and then you run into issues, hence following ETL (Extract, Transform & Load).
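On the 50,000-row issue: paging the extract usually gets around it. A minimal sketch, assuming a BAQ exposed through the Kinetic REST v1 BaqSvc endpoint; the base URL, BAQ name, credentials, and page size are all placeholders, and the auth scheme varies by deployment:

```python
import requests

# Placeholder Kinetic REST v1 BAQ endpoint and credentials.
BASE = "https://kinetic.example.com/KineticProd/api/v1/BaqSvc/MyLaborBAQ/"
AUTH = ("svc_powerbi", "********")
PAGE = 10_000  # keep each request well under the size where things fall over

def fetch_all() -> list[dict]:
    """Walk the BAQ with OData $top/$skip instead of one giant pull."""
    rows, skip = [], 0
    while True:
        resp = requests.get(BASE, params={"$top": PAGE, "$skip": skip},
                            auth=AUTH, timeout=120)
        resp.raise_for_status()
        page = resp.json().get("value", [])
        if not page:
            return rows
        rows.extend(page)
        skip += PAGE
```

Land the paged result in a local SQL database and point Power BI at that, and you have the Extract and Load halves of the ETL split described above.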
I have found using REST to be an issue for a few reasons.
A. It's slow.
B. Last time I checked, Power BI had issues caching the OData feeds. Using a SQL database was faster on the transfer of data, but also when refreshing the reports.