Why this blog series exists:
When you buy an out-of-the-box data warehouse and cube solution like Jet Enterprise from Jet Reports (www.jetreports.com) or ZapBI, it usually follows the 80/20 rule: the stock solution satisfies 80% of your reporting requirements, and you'll need further customization to meet the remaining 20% of your analytics requirements.
This blog series is inspired by customization requests from clients like you.
Set a Default Value for Budget Amount
Problem: Some measures require filtering by a default value to show reasonable results.
Consider the example of a company that creates and revises a new budget each quarter. While it's perfectly reasonable to show the [Budget Amount] for March 2017 from either the 'Initial 2017 Budget' or the 'Q1 2017 Amended' budget, it would be misleading to show the sum of both budgets for March 2017. Similarly, if your organization uses a forecasting tool to calculate expected item consumption or sales, you may have multiple forecast numbers for the same item and month.
Solution: To minimize the risk of 'double counting', we'll modify [Budget Amount] to always filter by a default [Budget Name].
In this solution, we hard-code a budget into the Default Member as part of the dimension definition. For easier maintenance, we could add an [IsDefaultBudget] attribute to the [Budget] dimension in our cube and data warehouse, then reference the new column when defining the Default Member. Ideally, our source ERP system can identify the current budget; otherwise, we can implement business processes and SQL code to auto-identify the correct budget.
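As a sketch of the maintainable approach, the default member can be assigned in the cube's MDX calculation script. The dimension, hierarchy, and attribute names below ([Budget].[Budget Name], [Is Default Budget]) are illustrative and will differ per implementation:

```mdx
// Sketch: set the default budget dynamically in the cube's MDX script.
// Names are illustrative; adapt to your own dimension design.
ALTER CUBE CURRENTCUBE
UPDATE DIMENSION [Budget].[Budget Name],
    DEFAULT_MEMBER =
        // pick the single budget member flagged by [Is Default Budget]
        Tail(
            Exists(
                [Budget].[Budget Name].[Budget Name].Members,
                [Budget].[Is Default Budget].&[1]
            ),
            1
        ).Item(0);
```

With this in place, any query that doesn't explicitly select a budget is filtered to the flagged default rather than summing across all budgets.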
Note: Because we can have multiple concurrent budgets but shouldn't sum [Budget Amount] across different budgets, the measure is a semi-additive fact -- in business intelligence nomenclature, a semi-additive fact cannot be summed across all dimensions. Other common examples include balances, profit percent, average unit price, company consolidations, and most cost-allocation schemes. When customizing our data warehouses and cubes, we must be careful about where we implement such calculations to avoid reporting errors.
Non-additive facts cannot be summed at all (ex. [First Invoice Date]). Fully-additive facts are 'normal' facts that can be summed across all dimensions.
Invert the sign of Income Statement Accounts
Problem: Sometimes the dimension used affects how we want to calculate a measure.
In many ERP systems, the [Amount] or [Quantity] columns have counter-intuitive signs. Consider G/L entries, which carry negative signs for revenue and positive signs for expenses, or inventory transactions, which show negative signs for item sales and shipments and positive signs for item purchases. While the data makes sense to accountants and analysts, business users frequently appreciate seeing signs that are intuitive.
In cube-based reports, we'd like to invert the sign on [Amount] for all Income Statement accounts but not Balance Sheet accounts. Note, it's arguably 'wrong' to invert the sign in the data warehouse because that would make our [Amount] column semi-additive, and potentially cause problems for our auditors!
Solution: In the cube, use a SCOPE statement to assign a different calculation to [Amount] based on the [GL Account] dimension.
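A minimal sketch of that SCOPE statement follows. It assumes an [Income/Balance] attribute on the [G/L Account] dimension (as in standard NAV); actual dimension, attribute, and member names vary by implementation:

```mdx
// Sketch: invert the sign of [Amount] for Income Statement accounts only.
// Attribute and member names are illustrative.
SCOPE ( [Measures].[Amount] );
    SCOPE ( [G/L Account].[Income/Balance].&[Income Statement] );
        // the assignment runs in a later calculation pass, so referencing
        // [Amount] here reads the original (un-inverted) value
        THIS = -[Measures].[Amount];
    END SCOPE;
END SCOPE;
```

Balance Sheet accounts fall outside the inner scope, so they keep their original signs -- exactly the behavior the auditors expect.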
Note: in SSAS cube nomenclature we typically deal with two out of three measure types: Standard and Calculated (avoid Derived measures when possible).
A Standard measure is typically a SUM or COUNT of a column that exists on the fact table in the data warehouse: ex. sum(Amount) or sum(Quantity).
A Calculated measure is an expression that typically uses standard measures in conjunction with MDX expressions and functions. Typical examples include division (profit margin percent or avg. daily sales) as well as all business functions (Year To Date, Balance, or Previous Period).
A Derived measure is a calculation based on fields from the same fact table -- ex. Profit Margin ([Sales Amount] - [Cost Amount]). When possible, avoid derived measures and use standard measures instead -- just add the calculation as a new column in the data warehouse. Using standard instead of derived measures has two major advantages:
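To make the distinction concrete, here is a sketch of a calculated measure built from two standard measures (measure names are illustrative):

```mdx
// Sketch: a calculated measure combining standard measures.
// [Sales Amount] and [Cost Amount] are assumed standard measures.
CREATE MEMBER CURRENTCUBE.[Measures].[Profit Margin %]
AS
    IIF( [Measures].[Sales Amount] = 0,
         NULL,   // avoid divide-by-zero
         ( [Measures].[Sales Amount] - [Measures].[Cost Amount] )
           / [Measures].[Sales Amount] ),
FORMAT_STRING = 'Percent',
VISIBLE = 1;
```

Because the ratio is evaluated after aggregation, it stays correct at every level of every dimension -- something a row-by-row derived column cannot guarantee.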
That's All Folks
Remember, out of the box solutions by definition are designed to work for ALL customers. Further customization is expected, even required! Onyx Reporting is available for development services, training and mentoring as you implement minor (if not major) customizations to your Jet Enterprise implementation.
For clients not using Jet Enterprise, please contact us to find out how data warehouse automation can cut your business intelligence implementation time from weeks and months into days.
If you're anything like me, you've got multiple copies of NAV from various clients stored on your SQL server, and if you forget to write down which version of NAV each is on, that can be a pain.
Fortunately, table structure doesn't change THAAAAAT much ;) but just in case:
SELECT [NAV Database Version] = CASE
        WHEN [databaseversionno] = 40 THEN 'Navision 4 SP2'
        -- add further WHEN clauses for the versions you encounter
        ELSE 'Unknown (' + CAST([databaseversionno] AS varchar(10)) + ')'
    END
FROM dbo.[$ndo$dbproperty];
What exactly are "multiple environments"? A multi-environment infrastructure allows your business to separate BI development efforts from the live environment the organization uses for reporting and analytics.
"Why would I need it?" Development is not always a quick 5 or even 15-minute fix. Larger projects can take days, weeks, even months to complete, so you need a sandbox for the developers to 'play' in while the rest of the organization continues forward with 100% uptime on their live reporting environment. Some organizations may even choose to separate the Dev sandbox from QA (Quality Assurance) efforts, so a third Test environment may be needed!
But "do I need multiple environments?" As with any business question, 'it depends'. If your development is limited to just adding a field here or a new measure there during the initial implementation of your business intelligence project, you may be able to wiggle by without separated environments.
It may make sense to separate development from the live environment if:
It may make sense to implement a QA environment if your organization has:
Ready to get started today?
This presentation is preparation for a training series Onyx Reporting will be conducting in 2017! Join our mailing list to keep up with special offers, training opportunities and our blog series.
At two conferences in the same month, I overheard CTOs express that enterprise resource planning (ERP) systems like Dynamics NAV, GP, or Sage could only capture what happened inside a company, and that to uncover the 'why', organizations would have to consider data collected outside their ERP (Google Analytics, CRM, webstore, Mailchimp, Twitter, etc.).
For example, data analysts can build sales or marketing-spend reports based on data stored in NAV with aplomb. They can even ascertain product and customer trends, but without external data it would be difficult to determine why the organization is seeing that behavior or, more importantly, how it can influence that behavior.
This blog post will review
The Data Discovery Hub facilitates data integration by separating ETL (extract, transform, load) across three databases: the ODS (Operational Data Store), the DSA (Data Staging Area), and the MDW (Modern Data Warehouse).
In an effort to escape 'Excel anarchy', many organizations transition away from error-prone, spreadsheet-driven reporting toward governed enterprise data warehouses that are expected to provide a 'single version of the truth' for all reporting and analytics endeavors. To that end, during the implementation process, bespoke business rules and constraints are applied to the data warehouse to enforce the consistency and validity of reported data.
Although data warehouse applications (like Jet Reports' Jet Data Manager) simplify the process of applying business rules to data sets, the majority of implementations Onyx Reporting encounters do not report and monitor outlier records that fail data validation constraints. Translation: you may be truncating data and not know it. Double translation: you may be making decisions based on incomplete data! That's not to say that supporting tools within the product ecosystem don't exist -- they just frequently go underutilized.
Every Data Warehouse is built on (un)declared Business Rules
The expectation that business keys (like Customer No or G/L Account No) will uniquely identify one member of a dimension is virtually universal. Though this is a semi-obvious requirement, conflicts between the expected and actual data can arise, particularly when organizations integrate data from disparate sources (multiple companies, a legacy system, or web-based and unstructured data sources). Additionally, there can be a disconnect between expected values (as laid out by standard operating procedures) and the actual values recorded in the ERP system (ex. every sales event should be attributed to a Salesperson or Geographic Region).
To close the loop and prevent your organization from making strategic decisions based on incomplete information, system implementers must add measures and controls for monitoring and correcting records that fail data validation.
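As a sketch of what such controls might look like, the following queries surface records that violate the two (often undeclared) rules described above. All table and column names are illustrative and must be adapted to your own data warehouse:

```sql
-- Sketch: monitor records that fail common data validation rules.
-- Table and column names are illustrative; adapt to your warehouse.

-- 1) Business keys should uniquely identify one dimension member.
SELECT [Customer No], COUNT(*) AS [Member Count]
FROM dbo.[Dim Customer]
GROUP BY [Customer No]
HAVING COUNT(*) > 1;

-- 2) Every sales event should carry a Salesperson, per operating procedure.
SELECT COUNT(*) AS [Unattributed Sales]
FROM dbo.[Fact Sales]
WHERE [Salesperson Code] IS NULL OR [Salesperson Code] = '';
```

Scheduling checks like these and routing non-empty results to the data steward turns silent truncation into an actionable exception report.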
Join our mailing list to be informed when we publish new articles.
Author Jae Wilson, lead data strategist at Onyx Reporting, partners with co-author Joel Conarton, director at executive and management consultancy Catalystis, to provide strategic solutions for data-driven organizations.
Are you ready to partner with us?