Thanks Jose, I was learning about that replication to Azure earlier this year and it sounded cool; it's even cooler that you're actually using it.
I’ll have to read into it more.
Do you do anything else related to data warehousing as seen in the flow below? Or is it strictly access to BAQs and the users have to build the queries every time? This could lead to a much longer thread than necessary…
A data warehouse is basically a place for long-term storage of historical data, typically rolled up via something called OLAP cubes. These cubes (or new tables) are where you roll up and store, say, total sales or total cost (different measures) grouped by date, city, territory, sales rep, etc.
So on top of OrderHed, OrderDtl, etc., you'll have something like OrderTotalsByRep, where the values are pre-calculated and stored.
It makes data analytics easier because the rollups are pre-computed instead of having to run those sums at query time.
Power BI, Grow, etc. will usually do this so they can return data faster. The “downside” is that those computations take time, so the data warehouse usually re-computes its values a few times a day; it isn't real time.
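To make the rollup idea concrete, here's a minimal sketch in Python/pandas. The table and column names (OrderDtl, SalesRep, OrderDate, ExtPrice) are just placeholders to illustrate the pattern, not the actual Epicor schema:

```python
# Minimal sketch of an OLAP-style rollup: pre-aggregate raw order lines
# into a summary table so reports don't have to re-sum rows every time.
# All table/column names here are illustrative, not real Epicor fields.
import pandas as pd

order_dtl = pd.DataFrame({
    "SalesRep":  ["Ann", "Ann", "Bob", "Bob", "Ann"],
    "OrderDate": pd.to_datetime(
        ["2024-01-05", "2024-01-20", "2024-01-07", "2024-02-02", "2024-02-11"]),
    "ExtPrice":  [100.0, 250.0, 80.0, 40.0, 60.0],
})

# The "cube": totals grouped by rep and month, computed once on a
# schedule rather than every time someone opens a report.
order_totals_by_rep = (
    order_dtl
    .assign(Month=order_dtl["OrderDate"].dt.to_period("M"))
    .groupby(["SalesRep", "Month"], as_index=False)["ExtPrice"]
    .sum()
    .rename(columns={"ExtPrice": "TotalSales"})
)

print(order_totals_by_rep)
# Ann's January rows (100 + 250) collapse into a single 350 total, etc.
```

A reporting tool then reads the small `order_totals_by_rep` table instead of scanning every order line, which is the whole speed win (at the cost of the data being only as fresh as the last rollup run).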
We also use this for our Power BI reporting, where we have reports driven from rolled-up sales figures so we can provide quick KPIs.
Epicor also has this facility built in via what they call executive queries. These BAQs generate “cubes” that can be scheduled via the System Agent to roll stuff up and store it; then you can write dashboards, etc. off those cubes.
Thanks for reaffirming what I understood so far and what I could remember from school.
Do you make use of OLAP cubes?
I think an added benefit of having these cubes is a consistent approach to reporting data, whereas if someone builds a new query every time, you potentially have people using different measures or different fields.
What I have experienced at both places I have worked is the constant back and forth over whose report is right, or questions about why we have conflicting data. I know some of this is just organizational; we need to plan it out and all be on the same page about how things are computed. But beyond that, I am wondering how to create something that users can access without having to understand queries. Maybe I am dreaming, but I feel like that's where OLAP cubes come into play.
Thanks for your thoughts and experience with this. It helps to know what others are doing and where they have found success. I think what we will always run up against is whether the user needs up-to-date data or whether they can use something that is 3 hours or 12 hours old. This generally depends on the business question.
I’ll add that even though a data warehouse would solve the queries issue, trying to train people on how to make analytical reports using one of the many tools out there is the other hurdle, but I think data warehousing would be the first step.
I too like Power BI, but I think it’s hard for some users to grasp. Excel has a lot of the data modeling tools that BI has, but not as nice. It takes a little more work. And visualizing stuff is not as easy.
Anyways, looking to start doing something like what you are doing with reporting at your place. I need to replicate and then create some central “sources of truth” for departments because right now we have to write all these queries to answer questions.
We went down that road because finance wants real-time data, not data that lags behind an hour or a day.
I was just grilled by the SQL DBA community for using CDC for it. I wasn't even trying to use Epicor's CDC, but SQL's built-in CDC.
Epicor does CDC at the Entity Framework level, not in SQL.
I learned there are better tools for auto-syncing tables and databases, including built-in on-prem-to-Azure SQL sync tools from Microsoft, with even more coming in the latest SQL Server.
I don't know if that is just an old way of thinking, because most of these DBAs started decades ago, or if it is indeed a best practice to not use CDC for this. Anyone know what Brent Ozar thinks?
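For anyone curious what the lighter-weight alternative usually looks like: one common pattern DBAs suggest instead of CDC is a high-water-mark incremental pull, where each sync only fetches rows whose version/timestamp column is past the last value you saw. Here's a minimal sketch using SQLite and made-up table/column names (real Epicor tables track changes differently):

```python
# Minimal sketch of a high-water-mark incremental sync, a common
# lighter-weight alternative to CDC. Table and column names (OrderHed,
# RowVer) are illustrative only, not the real Epicor schema.
import sqlite3

src = sqlite3.connect(":memory:")
src.execute(
    "CREATE TABLE OrderHed (OrderNum INTEGER PRIMARY KEY, Total REAL, RowVer INTEGER)")
src.executemany("INSERT INTO OrderHed VALUES (?, ?, ?)",
                [(1, 100.0, 1), (2, 250.0, 2), (3, 80.0, 3)])

def pull_changes(conn, last_seen):
    """Return rows changed since the last sync, plus the new watermark."""
    rows = conn.execute(
        "SELECT OrderNum, Total, RowVer FROM OrderHed "
        "WHERE RowVer > ? ORDER BY RowVer",
        (last_seen,)).fetchall()
    new_mark = rows[-1][2] if rows else last_seen
    return rows, new_mark

# First sync pulls everything; later syncs only pull what changed.
batch, watermark = pull_changes(src, 0)
src.execute("UPDATE OrderHed SET Total = 90.0, RowVer = 4 WHERE OrderNum = 3")
delta, watermark = pull_changes(src, watermark)
print(len(batch), len(delta), watermark)  # prints: 3 1 4
```

Worth noting: this pattern silently misses hard deletes (a deleted row simply never shows up in a pull), which is one of the reasons CDC exists in the first place.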
This ^^^ They spoke about it in my Azure Fundamentals course from Microsoft. Since cloud has taken off, I take it many people needed these tools, so they developed them…
Has anyone got a Power BI data model that they've made based off of Epicor data? I know people will have different modelling preferences/requirements, but having a base model with common calculations as they tie to the standard Epicor reports would be amazing.
These folks have; it's what we use. They basically sell you the Epicor data model with a bunch of ready-built reports. It's pretty awesome, and they will work with you to help make changes if needed (at the data model level).
When I was first coming into this field I thought, “well I’ll just do this myself,” and quickly learned that each step in that diagram flow I posted above requires a pretty extensive knowledge of the subject matter. That, and the continued maintenance of it requires a bit of time.
Thanks for that, I’m relatively new to Epicor after having come from another group (using M1).
Do you know if within EDA their calculations/measures are fully open so you can see how the model was built (e.g. measures, relationships, filtering logic, etc.)? A static/pre-built model where you can't break it open to see how the calculations work would be frustrating.
It does. I started building my own and, not to brag, but I understand the Epicor data model pretty damn well. It still took me forever; these guys spent, I believe, over 2 years getting their data model built.
We decided to go that route instead of having me continue building the data model; it was just too much work to get it done as quickly as we wanted.
James, I do know that you can see the SQL behind what they are doing in EDA and also edit and adapt it to whatever your company needs.
Since you are new to Epicor, you should note that Epicor recently acquired Grow, which is another BI tool, so I don't really know where they intend to go with EDA now.