Whilst I agree with the sentiment, I think a lot of BI devs have a tendency to throw the baby out with the bathwater.
Decision making is a complex process. The graphs and data fed to the executive via BI are just inputs to the deep net of his brain, which has been trained on the "data" absorbed over decades of experience. The objectives themselves are not simple to model - management is a delicate balancing act between competing stakeholders. The job of the BI professional is not just to produce what he is told, but to figure out what problem the person making the request is trying to solve, and then solve it in the simplest way possible. Occasionally this requires teaching them some things.
Taking an example: imagine you have an engine vibrating normally. You want to set up an alarm that rings if the engine vibrates abnormally - specifically, if a new frequency appears in the existing signal (maybe it indicates a screw is coming loose or something). You can feed the signal as-is to your algorithm, or you can put it through an FFT, in which case the "signal" is just a set of peaks, one per frequency, and your algorithm is literally just a switch (if the peak at frequency f reaches amplitude A, trigger the alarm). The switch is orders of magnitude simpler, cognitively, than an algorithm fed the raw signal; it's also likely to be more accurate. Feature engineering is arguably the most important part of statistical learning.
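A minimal sketch of that switch, assuming a synthetic vibration signal - the sample rate, frequencies, and threshold here are all made up for illustration:

```python
import numpy as np

# Synthetic "engine" signal: a healthy 50 Hz vibration, plus (in the
# faulty case) a new 120 Hz component that the alarm should catch.
SAMPLE_RATE = 1000  # Hz (assumed)
t = np.arange(0, 1.0, 1 / SAMPLE_RATE)
healthy = np.sin(2 * np.pi * 50 * t)
faulty = healthy + 0.4 * np.sin(2 * np.pi * 120 * t)

def alarm(signal, freq=120.0, amplitude_threshold=0.2):
    """The 'switch': trigger if the FFT peak at `freq` exceeds the threshold."""
    # Normalised single-sided amplitude spectrum.
    spectrum = np.abs(np.fft.rfft(signal)) * 2 / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / SAMPLE_RATE)
    peak = spectrum[np.argmin(np.abs(freqs - freq))]
    return peak >= amplitude_threshold

print(alarm(healthy))  # False - no energy at 120 Hz
print(alarm(faulty))   # True  - the new frequency shows up as a clean peak
```

All the hard work lives in the FFT (the pre-processing); what's left over is a one-line threshold check.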
The executive is like the alarm - pre-processing the signal is your job. Executives are used to simple tools, usually univariate and linear; at a stretch, some can deal with simple polynomials. The better you pre-process the signal, the easier it becomes for the executive to make a correct decision by associating the new data with whatever decades of experience his brain was trained on.
A concrete example: let's say your CMO has asked you for voucher and new-customer numbers for the last 6 months. He's clearly trying to establish the relationship between his voucher campaigns and new customers. You can pull the vouchers and the new customers, daily/weekly/whatever, put them on a nice Tableau graph, hand it over, and forget about it... or you can confirm that the problem he is looking to solve is indeed the relationship between his campaigns and new customers.
At which point you ask: why new customers? And you find that he has a theory that gaining new customers is the best way to increase revenue, that the objective of the company is to increase revenue (due to an incoming fundraising round whose valuation is based on revenue and revenue growth), but that the company is short-term cash-flow constrained (hence looking at vouchers instead of, say, marketing spend).
Since he probably doesn't know that a model can have more than one variable, you explain that to him and brainstorm what other variables might impact revenue growth. Assuming you're trying to predict new customers per income statement dollar and new customers per cash flow dollar, you might find that online marketing spend and season are two significant variables, and that there is a significant interaction term between vouchers and certain types of marketing spend. It's now your job to explain that "formula" - standard errors of the coefficients included - to the executive, and to brainstorm what output would let him quickly check how this input has changed over time (which might just be... an alarm). You'll also have to explain the measure of fit you are using (R-squared almost always wins by virtue of being very intuitive).
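For concreteness, here's a sketch of what fitting such a model might look like with statsmodels; the dataframe, column names, and coefficients are all invented for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy weekly data - every number here is invented for illustration.
rng = np.random.default_rng(0)
n = 104  # two years of weeks
df = pd.DataFrame({
    "vouchers": rng.uniform(0, 10_000, n),
    "marketing_spend": rng.uniform(0, 50_000, n),
    "season": rng.choice(["q1", "q2", "q3", "q4"], n),
})
# Fake ground truth with a vouchers x marketing interaction baked in,
# so the fit below has something real to find.
df["new_customers"] = (
    0.05 * df["vouchers"]
    + 0.01 * df["marketing_spend"]
    + 2e-6 * df["vouchers"] * df["marketing_spend"]
    + (df["season"] == "q4") * 300
    + rng.normal(0, 100, n)
)

# In the formula, `vouchers * marketing_spend` expands to both main
# effects plus the vouchers:marketing_spend interaction; `C(season)`
# treats season as a categorical variable.
model = smf.ols("new_customers ~ vouchers * marketing_spend + C(season)", data=df).fit()
print(model.summary())  # coefficients, their standard errors, and R-squared
```

The executive never sees any of this, of course - only the handful of coefficients, their standard errors, and the R-squared that come out the other end, translated into his language.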
And of course, this is what I identify as the gap that none of these BI products can ever hope to fill. You need someone technical with a full grasp of all the data sources of the company - AND the implicit and explicit data model of the business - who also happens to be continuously involved in management discussions and at least moderately aware of the business. Most companies have a setup whereby the BI team is some sort of self-service restaurant where the executive swoops in, gets his request processed, and swoops back out. Many prefer hiring young, inexperienced BI staff because BI is seen as a cost centre, and because the way they "scale" requests is by adding headcount. One consequence of this is that the executive starts wanting a Tableau: something with all the data neatly prepared that can be dragged and dropped into the 1-dimensional models he uses to pre-process his company's data.
The upshot is that it's not that executives are stupid - it's more a case of GIGO. Without the tools required to make sense of the signals they receive, executives cannot make the right decisions even if they have the right experience and thinking. I suspect a large part of why more experienced executives seem smarter is that they have learned, over decades, to spot trends from very simple signals; for example, an experienced hedge fund manager will sniff out a fraudulent company much faster than someone who has just started, just by looking at the financial reports.