ONYX REPORTING LTD.

VIDEO:  10 tips to Polish your Dashboards & Simplify Charts

28/1/2021


 
I had the opportunity to appear as a guest speaker on Janice Taylor's webinar series on dashboard design and best practices (https://www.reportsyourway.com/).  During this 15-minute presentation, I reviewed 10 easy checkpoints dashboard designers can implement to produce concise and effective dashboards.


5+ Methods for Getting Data in and out of Domo

16/4/2020


 
An IT/IS stakeholder raised a concern with me that Domo, like many BI vendors, was a one-way street: a tool where you can easily push data in but not get data out.  The underlying narrative: both business and IT stakeholders see vendor lock-in as a risk to minimize.  I assured the gentleman that, in addition to extensive dashboard creation and distribution capabilities, Domo positions itself as a data distribution hub by providing several data extraction methods suited to a broad mix of users with different usage requirements.


TLDR for Data Governance teams: My most successful enterprise clients position Domo at the end of a data lake or data warehouse pipeline and then use built-in tools to secure, monitor and distribute data to different stakeholders.  Domo's distribution methods get data into the hands of data quality and governance teams, data scientists, and business analysts in the platforms of their preference, which of course range from Tableau and Qlik to Jupyter notebooks, Office 365 products, or any other SQL or data storage platform.  IT/IS accomplish this without increasing administrative overhead by applying security measures in Domo which trickle through all the data distribution options outlined below.

TLDR for Developers: Domo facilitates systems integration, business process automation, and app development, as well as data ingestion and distribution, via SDKs and CLIs that leverage an openly accessible API framework, including Java, JavaScript, Python and R libraries.  In addition to storing data in Domo's cloud, the platform also supports a federated query model that leaves data in its source location.  Check out developer.domo.com.  I could go on for days.
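To make the developer story concrete, here is a rough Python sketch of pulling a DataSet out of Domo through the public REST API documented at developer.domo.com. The client ID, client secret, and dataset ID are placeholders, and the endpoint paths and parameters are written from memory of those docs, so treat this as a sketch to verify against the current API reference rather than production code.

```python
import io

import pandas as pd
import requests

CLIENT_ID = "your-client-id"          # placeholder: create an app at developer.domo.com
CLIENT_SECRET = "your-client-secret"  # placeholder
DATASET_ID = "your-dataset-id"        # placeholder: the DataSet GUID shown in Domo

# 1) Exchange client credentials for an access token (scope=data covers the DataSet API).
token_resp = requests.post(
    "https://api.domo.com/oauth/token",
    params={"grant_type": "client_credentials", "scope": "data"},
    auth=(CLIENT_ID, CLIENT_SECRET),
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# 2) Export the DataSet as CSV and load it into pandas for downstream analysis.
data_resp = requests.get(
    f"https://api.domo.com/v1/datasets/{DATASET_ID}/data",
    params={"includeHeader": "true"},
    headers={"Authorization": f"Bearer {access_token}", "Accept": "text/csv"},
)
data_resp.raise_for_status()

df = pd.read_csv(io.StringIO(data_resp.text))
print(df.head())
```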


Data Warehouse and Cubes Tips and Tricks - March 2017

19/3/2017


 
Solutions:  
  • Set a default budget for Budget Amount
  • Invert the sign of Income Statement Accounts.

Why this blog series exists:
When you buy an out-of-the-box data warehouse & cube solution like Jet Enterprise from Jet Reports (www.jetreports.com) or ZapBI, it usually meets the 80/20 rule: the stock solution satisfies 80% of your reporting requirements, and you'll need further customizations to meet the remaining 20% of your analytics requirements.
Requests from customers like you inspire this blog series.

Set a Default Value for Budget Amount

Problem:  Some measures require filtering by a default value to show reasonable results.  

Consider the example of the company that creates and revises a new budget each quarter.  While it's perfectly reasonable to show the [Budget Amount] for March 2017 from the 'Initial 2017 Budget' or 'Q1 2017 Amended', it would be misleading to show the sum of both budgets for March 2017.  Similarly, if your organization uses a forecasting tool to calculate expected item consumption or sales, you may have multiple forecast numbers for the same item and month.

Solution:  To minimize the risk of 'double counting', we'll modify [Budget Amount] to always filter by a default [Budget Name].
In this solution, we hard-coded a budget into the Default Member as part of the dimension definition.  For improved maintenance, we could add an [IsDefaultBudget] attribute to the [Budget] dimension in our cube and data warehouse.  Then reference the new column when defining the Default Member.  Ideally our source ERP system can identify the current budget, or we can implement business processes and SQL code to auto-identify the correct budget.
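The Default Member itself lives in the cube definition, which isn't shown in this post; to keep the double-counting risk concrete, here is a small pandas sketch with made-up budget rows showing how a naive sum across two concurrent budgets inflates March, and how filtering on the proposed [IsDefaultBudget] flag avoids it.

```python
import pandas as pd

# Illustrative fact rows: two concurrent budgets both cover March 2017.
budget_facts = pd.DataFrame({
    "Budget Name":   ["Initial 2017 Budget", "Q1 2017 Amended"],
    "Month":         ["2017-03", "2017-03"],
    "Budget Amount": [100_000, 110_000],
})

# A naive sum double-counts March: 210,000 instead of a single budget's figure.
naive = budget_facts.groupby("Month")["Budget Amount"].sum()

# Flag the default budget (the [IsDefaultBudget] attribute proposed above) and
# filter on it before aggregating -- the pandas analogue of a cube Default Member.
budget_facts["IsDefaultBudget"] = budget_facts["Budget Name"].eq("Q1 2017 Amended")
default_only = (
    budget_facts[budget_facts["IsDefaultBudget"]]
    .groupby("Month")["Budget Amount"]
    .sum()
)

print(naive.loc["2017-03"])         # 210000 -- misleading
print(default_only.loc["2017-03"])  # 110000 -- only the default budget
```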

Note:  Because we can have multiple concurrent budgets, but shouldn't sum [Budget Amount] across different budgets, the measure is a semi-additive fact -- in Business Intelligence nomenclature semi-additive facts cannot be summed across all dimensions.  Other common examples include balance, profit percent, average unit price, company consolidation, as well as most cost-allocation schemes.  When customizing our data warehouses and cubes, we must be careful where we implement the calculation to avoid reporting errors.

Non-additive facts cannot be summed (ex. [First Invoice Date]).  Fully-additive facts are 'normal' facts which can be summed across all dimensions.

Invert the sign of Income Statement Accounts

Problem:  Sometimes the dimension used affects how we want to calculate a measure.

In many ERP systems the [Amount] or [Quantity] columns have counter-intuitive signs.  Consider G/L Entries, which have negative signs for revenue and positive signs for expenses, or inventory transactions, which show a negative sign for item sales and shipments and a positive sign for item purchases.  While the data makes sense to accountants and analysts, business users frequently appreciate seeing signs that intuitively make sense.

In cube-based reports, we'd like to invert the sign on [Amount] for all Income Statement accounts but not Balance Sheet accounts.  Note, it's arguably 'wrong' to invert the sign in the data warehouse because that would make our [Amount] column semi-additive, and potentially cause problems for our auditors!

Solution:  In the cube, use a SCOPE statement to assign a different calculation to [Amount] based on the [GL Account] dimension.
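The fix described above is an MDX SCOPE assignment inside the cube script, which this post doesn't reproduce; the pandas sketch below (made-up accounts and amounts) shows the same rule applied at presentation time: flip the sign only for Income Statement rows and leave the stored, fully additive [Amount] column untouched.

```python
import pandas as pd

# Illustrative G/L entries with typical ERP signs:
# revenue negative, expenses positive, balance sheet accounts as booked.
gl_entries = pd.DataFrame({
    "GL Account":       ["Sales Revenue", "Rent Expense", "Bank"],
    "Account Category": ["Income Statement", "Income Statement", "Balance Sheet"],
    "Amount":           [-1_000, 400, 600],
})

# Invert the sign only where the account is an Income Statement account.
is_income_statement = gl_entries["Account Category"].eq("Income Statement")
gl_entries["Report Amount"] = gl_entries["Amount"].where(
    ~is_income_statement,    # keep the stored sign for Balance Sheet accounts
    -gl_entries["Amount"],   # flip it for Income Statement accounts
)

print(gl_entries[["GL Account", "Amount", "Report Amount"]])
# Sales Revenue: -1000 -> 1000, Rent Expense: 400 -> -400, Bank: 600 -> 600
```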
 Note:  in SSAS cube nomenclature we typically deal with two out of three measure types:  Standard and Calculated (avoid Derived measures when possible).

A Standard measure is typically a SUM or COUNT of a column that exists on the fact table in the data warehouse:  ex. sum(Amount) or sum(Quantity).

A Calculated measure is an expression that typically uses standard measures in conjunction with MDX expressions and functions.  Typical examples include division (profit margin percent or avg. daily sales) as well as all business functions (Year To Date, Balance, or  Previous Period). 

A Derived measure is a calculation based on fields from the same fact table -- e.g., Profit Margin ([Sales Amount] - [Cost Amount]).  When possible, avoid derived measures and use standard measures instead -- just add the calculation as a new column in the data warehouse (see the sketch after this list).  Using standard instead of derived measures has two major advantages:
  1. Users reporting directly against the data warehouse can use the predefined column.
  2. The cubes will process faster during their overnight load.
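As a minimal illustration of advantage 1, the profit calculation from the example above can be materialized once while loading the fact table, so the cube only needs a plain SUM (a standard measure). The pandas rows below are made up; in a real project this column would be added by the SQL/ETL that populates the data warehouse.

```python
import pandas as pd

# Illustrative fact rows as they might arrive in staging.
fact_sales = pd.DataFrame({
    "Sales Amount": [1_200.0, 850.0],
    "Cost Amount":  [700.0, 500.0],
})

# Materialize the calculation as a warehouse column; the cube can then expose
# SUM([Profit Amount]) as a standard measure instead of a derived measure.
fact_sales["Profit Amount"] = fact_sales["Sales Amount"] - fact_sales["Cost Amount"]

print(fact_sales)
```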

That's All Folks

Remember, out-of-the-box solutions by definition are designed to work for ALL customers.  Further customization is expected, even required!  Onyx Reporting is available for development services, training and mentoring as you implement minor (if not major) customizations to your Jet Enterprise implementation.

For clients not using Jet Enterprise, please contact us to find out how data warehouse automation can cut your business intelligence implementation time from weeks and months into days.

12 Questions for Dashboard Design (and 5 Mistakes to Avoid)

14/12/2016


 
Last week we explored 6 dashboard design tips to improve the aesthetics of existing dashboards.  In this post, we share design tips from our analysts about dashboard and OLAP cube requirements gathering as well as costly mistakes to avoid.

12 Questions to Remember

Tailor your dashboard to the audience AND the process they're trying to optimize.
  • How important is this measure to a user compared to other measures?
  • What other measures would provide comparative context?
  • Is there a sequence of measures a user would follow to drill into a question?
  • How might the answer to the previous questions differ by users at different levels of the organizational hierarchy?
Express the data in a way relevant to the audience.
  • What's the scope of your dashboard? (Company, department, individuals, a supplier)
  • What level of summarization or detail is appropriate?
  • What unit of measure is appropriate for each measure?
  • What complementary information should I include?
  • Again, how might the answers differ as scope or user changes?
Match the tool to the user. 
  • Managers and executives may prefer a dashboard that provides a non-interactive executive or strategic summary, business or data analysts may prefer to construct their own analyses against raw data straight from the data warehouse, and mid-level managers may require a live operational report.
  • Did you choose the right chart?
  • Should this be a PowerBI dashboard, a SQL query, an SSRS report, a Jet Professional Report, an Excel PivotTable?
  • How often does the data need to be refreshed?  Do operational decisions depend on activity or status changes in the last 15 minutes?  Last 3 hours?  Last 24 hours?

5 Mistakes to Avoid

Kicking off a project with an overly complex problem quickly leads to project stall or paralysis.
  • Complexity can arise when a variety of inputs are required to calculate a measure (e.g., 'true profitability' or cost allocation). A bottom-up approach of first identifying simple measures which aggregate into increasingly complex summarizations can encourage quick wins and maintain momentum.
  • If your team insists on a top-down approach, divide and conquer. Decompose complex strategic objectives, goals, or measures into composite pieces.
In your data warehouse, cubes and dashboards, avoid using metrics or abbreviations whose meanings are not immediately obvious.
  • The use of generic names like "Global Dimension 2", abbreviations or entity codes may affect adoption rates and user acceptance because they only resonate with a handful of seasoned analysts and fall flat with a broader audience.
  • In your visualizations, stave off confusion or frustration by opting for verbose titles, captions, and axis labels while avoiding legends.
KISS (Keep it simple).  Avoid clutter, non-succinct graphics, or unintelligible widgets. 
  • 3D effects, sparklines, temperature gauge charts, mapping charts, and pie charts are common visualization elements that add spice and sizzle to a dashboard; however, when poorly executed, these elements take up significant space without communicating a lot of information.
  • Ask yourself if you could express the same information with less ink, color or space.
Avoid failing to match metrics with goals.
  • "How does this dashboard aid a business process?" It's not enough to showcase the activities of a department.  Keep strategic goals or operational activities in mind. Does the dashboard provide enough context for viewers to reach an actionable conclusion?
Don't wait to get started.
  • Many are tempted to wait until development has finished before they begin laying out their dashboards, when in truth, report requirements should drive or at least inform development efforts.

Go Forth and do Great Things

Still looking for inspiration?
  • Here are some of my favorite Jet Reports from the Report Player
  • Can't create the chart you want with a PivotTable?  Use Excel's Cube Functions 
  • Need dimensions or measures added to your cubes?  Our services team has domain expertise on data warehouse and OLAP cube design for Dynamics NAV and GP.
  • Make sure to sign up for our weekly newsletter for special promotions and articles delivered straight to your inbox.  

6 Design Tips for Better Dashboards

6/12/2016


 
In his article "Choosing the Right Chart for your Data," Brian Petersen (VP of Professional Services at Jet Reports) writes: "Data is the foundation of effective business.  ... Being able to quickly read and analyze your data enables you ... to understand how a particular set or group of facts contributes to your overall success and steer your decisions proactively."
He then goes on to describe several common charts including:
  • Area Charts
  • Bar Charts
  • Scatter Charts and
  • Stacked Column Charts
Read the full article here:

For more seasoned analysts, the challenge is less about finding the right chart and more about laying out a dashboard that effectively communicates a broad range of both summarized and detailed information. As we delve into optimizing dashboards, we move away from technical or domain expertise and transition toward questions of User Experience and User Interface.

For these projects, I'll leverage knowledge gleaned from Stephen Few's guide to dashboarding -- "Information Dashboard Design" which was heavily influenced by Edward Tufte's seminal work "The Visual Display of Quantitative Information".

Charts Jump off the Page with these 6 Tips

1) "Brevity is the soul of wit" -- Do not exceed 1 page.
Any CxO will describe the perfect dashboard as an interactive report where they can see all the important information on one page. Translation: Edit. Edit. Then edit again. Examine how much excessive detail or decoration you can pare away without supplying inadequate context.

​Your final product shouldn't require scrolling, changing tabs, or (ideally) a legend.

2) How good is good? -- Provide enrichment and context through comparison.
  • Same measure at a point in the past (Last Year or Last Month)
  • Target (Budget or Forecast)
  • Relationship to future target or prior prediction (% of annual budget or forecasted budget)
  • Benchmark for norm (company average or industry standard)
  • Separate but related measure
3) Consider providing non-quantitative data
If you frame dashboards around improving a process or keeping 'two fingers on the pulse of the company', then in addition to measuring activity it may make sense to provide non-numeric data.
  • Top & bottom performers
  • Prioritized issues to investigate
  • Upcoming due dates.

4) Emphasize the important things.
By understanding how the eye travels across the page, designers can highlight, prioritize and de-emphasize.  This is particularly important when planning the placement of auxiliary elements including filters, slicers, legends and labels.
5) Maximize the "Data-Ink Ratio"
  • If 'data ink' is any line or pixel that communicates quantitative data (a line, point or bar in a chart) and 'total ink' includes all the lines used to create said chart (axes, tick marks, borders), then the data-ink ratio measures the distribution of ink used for communication versus formatting and decoration.
  • Strive to enhance data ink while reducing and deemphasizing non-data ink.
    1. Can you remove unnecessary tick marks or grid lines?
    2. Do variations in color provide additional meaning, or is the message equally clear with just one or two muted colors?
    3. Do you really need that 3-D effect or color gradients?
6) Organize information to support interpretation and application
  • Organize groups according to departments, entities or use
  • Co-locate and subtly delineate related data
  • Support meaningful comparison while minimizing meaningless comparisons

My favorite feature of Few's book was his analysis of sample dashboards, wherein he not only described the flaws in various designs but also modeled alternative ways of presenting the data.  For dashboard developers, this analysis is invaluable for sharpening our critical eye and provides inspiration for what dashboards can (or shouldn't) look like!

Bridge the Gap between Concept and Execution

For those of you using Pivot Tables or PowerBI to access data from a Jet Enterprise cube, it can be difficult to pursue the optimal dashboard layout or chart because you're constrained by the limits of the pivot table or data model.
In a previous post: Better Dashboarding with Cube Functions, Onyx Reporting presents a tutorial for converting Pivot Tables into infinitely more manipulable Cube functions.

One Book to Rule them All

Content and images from this blog post were taken from Stephen Few's Information Dashboard Design (buy it on Amazon).

Check out our blog at www.onyxreporting.com/blog or sign up for our weekly newsletter: "The Data Dump".

If you need support knocking out a batch of reports or want to customize your Jet Reports cubes to include some new comparative measures, our services team is among the best in the business.

Choosing the Right Forecasting Method

6/6/2016


 
With modern reporting tools allowing more and more companies to answer 'what did we do yesterday?', the logical progression is to predict 'what are we going to do tomorrow?'  Forecasting solutions vary in complexity, with options including elaborate third-party software, complex spreadsheets developed by internal analysts, and occasionally a shake of the good ol' crystal eight ball. I recently had a conversation with a consultant who confided in me that their client swears by XYZ solution. "How does it work?" I asked. "Does it look at bell curves and confidence intervals? A 12-month moving average? A 3-month moving average? A weighted average? Linear regression? Seasonal adjustment?" "It looks at all of that," I was assured. "How does the system know which forecast method to use?" ... Silence.
To be honest, almost any method could be a completely (in)appropriate way of forecasting the future! If you sell ice cream or agricultural pesticides, there's likely to be seasonality in your data. If you're predicting the stock market, moving averages and linear regression are common prediction methods. If you're tracking click rates after launching a new marketing campaign ... well that's another level of complicated, but to get the ball rolling, linear or exponential regression models may be a solid place to start.

Calculating Error in a Forecast

In a nutshell, cost functions evaluate a forecast method's goodness of fit by measuring the amount of error where:
Absolute Error = Absolute Value (Expected - Actual).
In the sample data set I whipped up, we have several months of Actuals and two forecast methods, 3 Month Moving Average and 5 Month Moving Average (3MA and 5MA respectively). An n-month moving average forecasts the current month by taking the average of n preceding months.  
In a future blog post, I'll share how I built these forecast models into my SSAS Sales cube using MDX.
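Until that post arrives, here is a quick pandas sketch of the definition above. The November and December actuals (729 and 846) come from this post; the earlier months are invented so that the rolling means land on the forecasts quoted below (3MA of 637 and 625, 5MA of 542 and 596).

```python
import pandas as pd

# Monthly actuals: Nov (729) and Dec (846) are from the post; earlier values
# are invented so the moving averages reproduce the post's forecast figures.
actuals = pd.Series(
    [459, 340, 765, 550, 596, 729, 846],
    index=pd.period_range("2016-06", periods=7, freq="M"),
)

# An n-month moving average forecasts a month from the mean of the n preceding
# months, so shift the rolling mean forward by one period.
ma3 = actuals.rolling(window=3).mean().shift(1)
ma5 = actuals.rolling(window=5).mean().shift(1)

print(pd.DataFrame({"Actual": actuals, "3MA": ma3, "5MA": ma5}).tail(2))
# 2016-11: Actual 729, 3MA 637.0, 5MA 542.0
# 2016-12: Actual 846, 3MA 625.0, 5MA 596.0
```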
For the month of December, 3MA forecast 625 while 5MA predicted 596. Given that actuals were 846, 3MA was clearly a better forecast because it had the least absolute error.
  • Dec 3MA Absolute Error: ABS(625 - 846) = 221
  • Dec 5MA Absolute Error: ABS(596 - 846) = 250
To avoid the possibility that December was a lucky fluke for 3MA, it would be reasonable, even expected, to compare absolute error between 3MA and 5MA for other periods.
  • Nov 3MA Absolute Error: ABS(637 - 729) = 92
  • Nov 5MA Absolute Error: ABS(542 - 729) = 187
Because 3MA's forecast is closer than the 5MA forecast two periods in a row, we might say '3MA's average absolute error is less than 5MA's.'  We've just unwittingly defined the cost function Mean Absolute Deviation (MAD) and how we use it to select 'the best' forecast.
Google tells me that cost functions "provide a criterion by which forecast methods can be compared," but I've chosen to reinterpret the definition as, "cost functions help us decide which forecasting method to use."
DISCLAIMER
To come clean, I am not a statistician. While developing a forecasting data model for a client, I was able to consult a trusted mathematician who guided me through the basics of using cost functions to evaluate and select the 'best' forecasting model. I'm writing this article with the assumption that you either have forecast models you're currently using, or are interested in adopting forecasting or budgeting tools.

Calculating the Cost Functions MAD, MAPE, and MSE
Don't go running for the hills just yet! Cost functions and abbreviations can sound intimidating, but this 5th-grade-level math will enable us to quantify and compare the error between two forecasts. The three most common cost functions are MAD (Mean Absolute Deviation), MAPE (Mean Absolute Percent Error) and MSE (Mean Squared Error).

Calculating MAD (Mean Absolute Deviation)
MAD is the sum of absolute error (also called deviations) divided by the number of periods measured.
In December and November the absolute errors for 3MA were 221 and 92 as opposed to 5MA's 250 and 187. We concluded that 3MA on average had less error.
  • 3MA MAD: [221 + 92] / 2 ≈ 157
  • 5MA MAD: [250 + 187] / 2 ≈ 219
In application, although MAD measures how much error there was, it's insufficient for a user seeking to understand the severity of that error. A MAD of 157 against a data set measured in the millions would be negligible compared to the same MAD on a data set ranging between 500 and 1,000. This desire to give MAD context leads us to the next cost function, Mean Absolute Percent Error (MAPE).
Calculating MAPE (Mean Absolute Percent Error)
MAPE differs from MAD by calculating the average absolute percent error instead of simply taking the average absolute error.
Absolute Percent Error: ABS(Expected - Actual) / Actual
  • 3MA MAPE: [221 / 846 + 92 / 729] / 2 ≈ 19%
  • 5MA MAPE: [250 / 846 + 187 / 729] / 2 ≈ 28%
Although MAPE and MAD weigh error equally, the statement summarizing MAPE -- "on average our November and December forecasts were off by about 19%" -- paints a clearer picture than MAD's "our forecasts were off by 157 on average."  For this reason, MAPE is the cost function of choice in most business analytics; however, because MAPE and MAD treat every deviation equally, they are unsuitable for forecasts that seek to minimize error per period (as opposed to overall error).
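A short sketch, using only the November and December figures above, that computes the absolute errors, MAD, and MAPE for both moving averages (the post rounds 156.5 to 157 and 218.5 to 219):

```python
actual   = {"Nov": 729, "Dec": 846}
fcst_3ma = {"Nov": 637, "Dec": 625}
fcst_5ma = {"Nov": 542, "Dec": 596}

def mad(forecast, actual):
    """Mean Absolute Deviation: average of |forecast - actual| over all periods."""
    return sum(abs(forecast[m] - actual[m]) for m in actual) / len(actual)

def mape(forecast, actual):
    """Mean Absolute Percent Error: average of |forecast - actual| / actual."""
    return sum(abs(forecast[m] - actual[m]) / actual[m] for m in actual) / len(actual)

for name, forecast in [("3MA", fcst_3ma), ("5MA", fcst_5ma)]:
    print(f"{name}: MAD {mad(forecast, actual):.1f}, MAPE {mape(forecast, actual):.1%}")
# 3MA: MAD 156.5, MAPE 19.4%
# 5MA: MAD 218.5, MAPE 27.6%
```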

Calculating MSE (Mean Squared Error)
Mean Squared Error takes the average of the squared errors.
  • GF Squared Error Period 11: (90 - 70)^2 = 400
  • GF Squared Error Period 12: (115 - 90)^2 = 625
NOTE: With MSE the unit of measurement is square units as opposed to MAD which is measured in units.
As mentioned above, MAPE is the preferred cost function because it minimizes overall error, but consider the challenge of equipping production lines across factories in different countries. Once resources have been allocated, it may be difficult to reapportion or procure additional machinery; therefore, in this instance our primary concern is not minimizing error across the entire data set so much as minimizing error at each forecast point. In other words, the ideal forecast will allow many smaller errors while avoiding a few significant errors.
In this new data set we have two forecast methods, 3MA and Growth Factor (wherein, using a crystal ball, we 'managed' to predict 1.28 times actuals). Both forecast methods have the same total deviation and MAD, but 3MA, with a max percent error of 70%, has larger swings in absolute deviation, while Growth Factor consistently has a 28% error for each forecast.
In terms of our factory example, using MSE we would have selected the forecast that evenly over-allocated equipment at a +28% error across all factories, instead of the 3MA model, which would have wide swings of over- and under-allocation.
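Since the Growth Factor dataset isn't reproduced here, the sketch below uses invented error vectors with identical MAD to show what MSE rewards: a forecast that is off by a modest, consistent amount scores far better than one that is perfect most periods but occasionally misses badly.

```python
# Two forecasts with the same MAD but very different MSE (illustrative errors,
# not the post's dataset): "steady" misses by 28 every period, "spiky" is
# perfect three periods out of four and then misses by 112.
steady_errors = [28, 28, 28, 28]
spiky_errors = [0, 0, 0, 112]

def mad(errors):
    """Mean Absolute Deviation of a list of per-period errors."""
    return sum(abs(e) for e in errors) / len(errors)

def mse(errors):
    """Mean Squared Error of a list of per-period errors."""
    return sum(e ** 2 for e in errors) / len(errors)

print(mad(steady_errors), mad(spiky_errors))  # 28.0 28.0    -- MAD can't tell them apart
print(mse(steady_errors), mse(spiky_errors))  # 784.0 3136.0 -- MSE flags the spiky forecast
```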
​
Which Cost Method Should I Choose?
Someone said "there are lies, damned lies, and statistics," and it's still uncertain whether to credit Mark Twain or a British dignitary. Go figure. As with all things business related, when faced with the question "Which cost function is best?" the answer is "It depends."  Although MAPE and MAD weigh error evenly, MAPE provides context for the severity of the error where MAD does not, making it the de facto choice in business analysis.  However, because MAPE represents percent error and 5th grade math tells us we can't divide by zero, you may have to use MAD or MSE in cases where your data set's actuals could include zeros -- consider monthly inventory demand, for example. Lastly, in instances where we're more concerned with reducing error per forecast, it may be more appropriate to use the MSE cost function.
One last thought to keep in mind before embarking on the world of forecast evaluation: not all errors are created equal! With MAD, MSE, and MAPE, we're assuming positive error and negative error are equally weighted, but consider a retail environment where running out of inventory may be more costly than carrying it. While warehouse fees or product obsolescence from overstocking may add up, overall they may cost less than a lost sale or a disappointed customer.

Visualization:  Closing the Loop

[Chart: historical demand overlaid with many competing forecast models]
Take a look at this ball of spaghetti! In a world where our reporting and analytics tool can graph historical demand and quickly generate multiple forecasting models, this mess is just a few clicks away, and though loaded with an overabundance of forecasting data, the chart is too busy and doesn't actually tell us which forecast method to use. Which option has the lowest MAPE?
[Chart: the same data pared down, highlighting the forecast with the lowest MAPE]
The above chart is halfway there insofar as it could be filtered to show us today's period and identify the forecast with the lowest MAPE (or MSE as appropriate). I can then simplify my chart to the bare essentials: actuals, expected, a trend line, and the prediction for the next period.
[Chart: simplified view showing actuals, expected values, a trend line, and the next-period prediction]

That's All Folks

With the exception of the simplified introductory datasets at the top of the blog, all the charts were generated based on MDX added to the stock cubes provided with Jet Enterprise. If you're interested in adding forecasting or forecast evaluation to your cube-based reporting environment, give me a holler at Jae@OnyxReporting.com.

My Favorite Things

18/4/2016


 
My Favorite Things -- Jae Wilson (Docs.com)

Jet Enterprise & Top 10 Items

14/4/2016


 
Jet Enterprise & Top 10 Items -- Jae Wilson (Docs.com)

100 free Jet Reports via the Report Player

8/1/2016


 
Since my previous post -- 'why Power BI won't replace a Data Warehouse', which was built on ideas shared in 'why people should stop confusing BI with Visualization' -- I've done even more thinking about why clients have unsuccessful or incomplete Business Intelligence implementations.  The good news is, it's seldom a case of bad software fit or a lack of resources or expertise so much as uncertainty around what the desired outcome should be.  It doesn't matter whether your solution is Targit, Crystal, SAP, ZapBI or Jet Enterprise: if the client doesn't start with a clear vision of what they want, then no matter what you deliver, it seems, they don't want that!
Ultimately, the root challenge is not a question of functionality or capability so much as whether the user can articulate what they want or need.  I just had a conversation with a potential client who lamented, "No one on my team except me really knows what they want."  Honestly, that degree of transparency is better than the dreaded "we just want a simple dashboard," but there's still work to be done!  
It's time to sit down with your users and map out some mock-ups of what you want your final reports to look like.  Get away from emails and lengthy paragraph-long descriptions; instead, use simplified spreadsheets with sample headers, fake data, mock-up graphs, or a list of the KPIs you want to see in your solution.  Even a seemingly obvious question like "How do you want to consume this data (spreadsheet, pivot table, web page, or mobile device)?" can have far-reaching implications for what front-end software you use to develop your reports!  I've delivered more than a handful of dashboard workshops where my opening question, "What do you want your dashboard to look like?", was met with dead silence.  #FacePalm.
If this still seems too daunting, maybe it's time to do a little window shopping--did you know that Jet Reports has more than 100 generic reports available for you to use?  I know, "these generic reports don't quite meet your company's unique needs."  But that's not our expectation!  If you can express what does and doesn't work about a generic report, then we can uncover not only what you want and need, but you'll also have ideas about what it could look like on paper!  A good Jet consultant can customize any of the Report Player reports, or even build a more robust one from scratch with all the features you want!
Unfortunately consultants are sometimes slow to promote the Report Player.  "Our end users don't like it because XYZ."  Trust me, I've heard every possible reason, from both clients and consultants, but I'll shorten my counter-argument to the following: the Report Player inspires users with glimpses of what's possible.  At the end of the day, consultants can only deliver what users ask for, and clients can only ask for what they know about or have seen before.  I like to think of the Report Player as a mini IKEA catalogue: the pictures are there to whet your appetite. Don't expect completeness; instead, let the report layouts be a source of inspiration.  I love it when a client shows me a report and says: "I love it, but can it do ..."  Now we can focus the conversation to delivering exactly what you want because you know what you're looking for.

Happy Reporting!
The Report Player -- Jae Wilson (Docs.com)

Stop buying Visualization Software

20/10/2015


 
​At the risk of being the unpopular consultant, I'm going to make a recommendation that no one wants to hear.  Please.  Stop spending money on visualization software.

If you Google "Top 10 BI Software" or go to a trade show, you'll see all sorts of visualization tools that promise to sing, dance, and make your dreams come true.  They'll make dashboards, KPIs, and pretty charts--maybe even warn you if you drop below Safety Stock.  Wait, put the credit card down, and please stop buying visualization software.

While it is fun to believe that a shiny new piece of software can solve your data problems, you probably already own a visualization tool that will meet 90% of your reporting and analytics needs!  Don't believe me?  I'll bet if you asked an analyst to give you numbers about <insert your business metric here>, after a few hours they'd come back to you with an Excel spreadsheet or a Pivot Chart or two.  Turn around and ask the Finance Department for an Income Statement report broken out by Department and Region with measures for Current Year, Previous Year and %Growth.  Again, I bet you get another spreadsheet.  And why not?  No matter how unsexy it seems, Excel is the de facto tool for business analysis, and it will probably meet 90% of your reporting needs.
​
The real question, though, is how much sweating the analyst had to go through to get you those spreadsheets.  If you look at the formulas and see a slew of SUMIF()s and VLOOKUP()s, how nervous are you about presenting to the bank, to auditors, or in meeting rooms?  The only way it could be worse is if you saw a bunch of static numbers!  Truthfully, this is where BI comes into play.  Although our eyes may start glazing over when you start talking about SSIS, ETL, star schemas, and multidimensional models, the fact of the matter is that any visualization tool will perform better when pointed at a well-designed Data Warehouse or Cube.  In fact, if you ask the vendor giving you your visualization software demo, I'll bet they'll tell you the data is coming out of a sanitized data warehouse, cube or data model that emulates a star schema.  If they're using none of these tools, they're assuming that your ERP contains clean data.  <Insert skeptical snerk here>.

So what exactly is Data Warehousing all about?
Well, the Data Warehouse is actually the end of a tumultuous story of data access and extraction from your various sources (source databases, Point of Sale systems, Google Analytics, Bloomberg CSVs, custom spreadsheets, or web-based Census Bureau data).  Once extracted, the data undergoes a rigorous application of business logic and queries to conform, validate, and clean all the disparate sources; finally, a transformation process forges the cleaned data into a Data Model that consists of navigable dimensions and fact measurements.  Voila.  Your Data Warehouse.

Sounds dry and possibly unsexy?  Consider the alternative.  Earlier this year I worked with a client who wanted a ranked dashboard that would show the amount and volume of deals that salespersons were bringing through the door.  The first draft of the report took less than an hour to build, but it was immediately sent back for revision.  "That lead conversion number is almost triple our average," I was told.  Further investigation revealed that Status could waffle between Lead and Conversion multiple times based on a combination of factors, including data entry problems, whether the data was collected via web forms or handwritten forms, and whether specific intermediary statuses were made or missed.  My report needed Olympic-quality gymnastics to extract a reasonable measurement, at the cost of report performance.

Suffice it to say, what should have been an easy report ended up taking more time than expected because there wasn't one source of clean data, and I am loath to think what would happen if the client wanted the report rolled up by region or presented in a different layout.  Every client has a story of having paid a consultant an ungodly amount of money for a report that they're only marginally comfortable with and that cannot be modified to present the data a slightly different way.  So what's the solution?  A new visualization tool?  No!  Invest in a clean data warehouse that handles all the data gymnastics for you!  Don't get lured in by a vendor's siren song of sexy graphics, KPIs and dashboards if you don't have a solid data warehouse already.  Instead, I would recommend looking around for a data warehouse automation tool.

Many of my clients have enjoyed success working with the Jet Enterprise data warehouse automation tool (http://jetreports.com/products/jet-enterprise.php), and I'd be thrilled to work with you to develop a data warehouse from scratch or customize one from the library of pre-built projects.  Once that's done, THEN we can talk about sexy visualization tools.

London, UK
jae@OnyxReporting.com
+44 747.426.1224