I'm someone who loves mathematics: it's perfect, since there is no such thing as maybe, approximation happens only by choice, and it is universal. Being able to play with numbers has always drawn me to cost calculation: you set up a model, you try to be as accurate as possible, and you work out what the accounting is not clearly telling you. And yet I am often frustrated by it.

My favourite cost calculation system is probably the ABC method, or Activity-Based Costing. You identify the cost drivers behind the different cost generators, so that you can best match consumption by the different business units or products. Then you build a matrix to find which cost drivers are shared between different cost generators, so you end up with fewer drivers to work with. After that, it's just a matter of knowing how much each centre consumes.
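To make the mechanics concrete, here is a minimal sketch of the allocation step in Python. All activity names, cost figures and driver volumes are invented for illustration; a real model would pull them from the accounting system.

```python
# Minimal Activity-Based Costing allocation sketch.
# All figures below are hypothetical examples, not real data.

# Step 1: total cost accumulated in each activity (cost pool),
# each with its chosen cost driver noted in the comment.
activity_costs = {
    "machine_setup": 20_000.0,   # driver: number of setups
    "quality_checks": 9_000.0,   # driver: number of inspections
    "shipping": 6_000.0,         # driver: number of shipments
}

# Step 2: total driver volume consumed across all products.
driver_volumes = {
    "machine_setup": 100,    # setups in the period
    "quality_checks": 300,   # inspections in the period
    "shipping": 150,         # shipments in the period
}

# Step 3: how much of each driver every product consumes.
consumption = {
    "product_A": {"machine_setup": 60, "quality_checks": 100, "shipping": 50},
    "product_B": {"machine_setup": 40, "quality_checks": 200, "shipping": 100},
}

def abc_allocate(costs, volumes, usage):
    """Allocate each activity's cost in proportion to driver consumption."""
    # Cost per driver unit, e.g. cost per setup.
    rates = {activity: costs[activity] / volumes[activity] for activity in costs}
    return {
        product: sum(rates[activity] * used for activity, used in drivers.items())
        for product, drivers in usage.items()
    }

print(abc_allocate(activity_costs, driver_volumes, consumption))
# → {'product_A': 17000.0, 'product_B': 18000.0}
```

Note that the two product totals sum back to the 35,000 sitting in the cost pools: the method only redistributes costs, it never creates or destroys them.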

Here is what bothers me: why do we work with this matrix at all? I understand that when we calculate on a piece of paper at school or in an exam it is faster and easier, but we live in a world of data mining and "big data", with almost limitless storage capacity, huge computing power, and software that is ever more sophisticated and can be customised at will. So why do we still approximate?

I would expect a more accurate way of evaluating costs, one that bypasses the classic reservations about the traditional methods, where we always point out that there is a margin of error; the standard example is a business unit that is profitable under one method and loss-making under another.

The old concern, that we needed a quick approximation of what to bill customers based on a simple cost analysis, is outdated. It is not much of a stretch to add a module to an ERP program that produces a quick estimate from the data. When I raised this earlier, I heard the argument that you would need good-quality data to get this result, but isn't that already required by the classic methods in the first place?

Processes are getting more and more complicated, with value chains that combine many different activities, resources and sometimes even industries, which makes a simplistic calculation built on just a handful of indicators a dangerous gamble. It might be time to revisit the way cost controlling is taught: of course we need to know the basics, and the traditional methods can still apply to small entities, but we should also consider learning how to use the data we collect and currently do not know what to do with.

I hope to start a discussion here, one that might change the point of view of some who may not be aware that there are ways to do it differently. Let's not forget: disruptive innovation is the word of the day.