Analysing Big Data

September 25, 2012

Sometimes it is important to be able to analyse huge amounts of data, something which most business intelligence systems fail to do properly.
The Danish company TARGIT has recently teamed up with EXASOL, one of the leading software companies in the world, which offers a high-performance, high-speed database that can be used in combination with the TARGIT BI Suite to quickly analyse large amounts of data and present them properly in a modern business intelligence tool. Big data is something that more and more companies are starting to see potential in as their databases grow larger and larger, making it much harder to get data out of their data warehouse fast enough for things such as real-time decision support systems and similar tools.

So instead of spending several days analysing big data, you can now, with the TARGIT BI Suite and an optimized EXASOL data warehouse, pull the data in real time and get the most out of your business intelligence software.

Normally this requires huge amounts of server power, often servers connected in parallel or supercomputers, in order to process big data in real time, but with some of the new algorithms out there, it is now possible with ordinary hardware too. One of the ways they do it is by focusing only on the important data and skipping the rest. Of course that comes with some limitations, but in roughly 95% of cases it will be enough to solve your needs, which is quite acceptable for most people working with big data and business intelligence systems like the Microsoft BI package or the TARGIT BI Suite. Just reading a blog post of great tips is not enough, though, if you want to use an advanced tool like a BI suite.
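To make the "focus on the important data, skip the rest" idea concrete, here is a minimal Python sketch of sampling-based approximation: instead of scanning every row, you scan a small random fraction and accept a small error in exchange for speed. This is only an illustration under my own assumptions; the function name, the 5% sample rate, and the generated data are hypothetical and not taken from TARGIT, EXASOL, or Microsoft.

    import random

    def approximate_average(rows, sample_rate=0.05, seed=42):
        """Estimate the average of a column by scanning only a random
        fraction of the rows instead of the whole table."""
        rng = random.Random(seed)
        sample = [value for value in rows if rng.random() < sample_rate]
        if not sample:
            return None  # sample too small to estimate anything
        return sum(sample) / len(sample)

    # Hypothetical example: one million "sales" values. The sampled
    # estimate usually lands within a fraction of a percent of the
    # exact answer while reading only about 5% of the data.
    sales = [random.uniform(10, 500) for _ in range(1_000_000)]
    print("exact  :", sum(sales) / len(sales))
    print("approx :", approximate_average(sales))

The trade-off is exactly the one described above: the answer is not exact, but for most dashboard and decision-support queries a close estimate delivered in a fraction of the time is the more useful result.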
There is still room for plenty of improvement in this niche, but as the systems get more effective and the hardware gets better, we are going to see those improvements, and within a few years big data won't be that much of a problem anymore. Unless, of course, the amount of information gathered increases faster than the hardware improves; then we will still be dealing with the big data problem. But as it looks right now, we can expect the hardware to keep up with the massive amount of information and to solve the big data problem within the next 5-7 years. Remember that in addition to better and faster hardware, the software algorithms also improve, making systems run more efficiently on current hardware.
