Companies looking to jump on the burgeoning Internet of Things (IoT) bandwagon may have to re-evaluate how they architect their solutions. Traditional best practices for gathering and analysing data, where information is stored and processed centrally, are no longer relevant, according to a recent GlobalData survey.

Successful IoT practitioners do not wait for data to coalesce within a central data warehouse before analysing and acting on it. The vast majority (74%) of analytical operations instead take place nearby or, better still, directly on the instrumented devices themselves.

Within vertical markets such as retail, where a sale can be won or lost in a matter of moments, there is no other way to make rapid-fire decisions such as which offer to display to a specific customer as he or she enters a store. Those decisions cannot wait for transient events to be uploaded to the company’s cloud.
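The kind of on-device decision described above can be sketched in a few lines. This is a minimal, hypothetical illustration: the offer names, feature weights, and functions below are assumptions for the sketch, not anything from the survey, and a real deployment would use a model trained centrally and pushed to the edge device.

```python
# Hypothetical sketch of an in-store edge device choosing an offer locally,
# with no cloud round trip. All names and weights are illustrative.

# Each candidate offer scores customer-profile features with a tiny linear model.
OFFERS = {
    "loyalty_bonus": {"visits": 0.7, "basket_avg": 0.1},
    "new_customer_discount": {"is_new": 2.0},
}

def score_offer(weights, profile):
    """Evaluate a small linear model entirely on-device."""
    return sum(weights.get(feature, 0.0) * value
               for feature, value in profile.items())

def choose_offer(profile):
    """Pick the highest-scoring offer the moment a customer is detected."""
    return max(OFFERS, key=lambda name: score_offer(OFFERS[name], profile))

# A frequent visitor gets the loyalty offer; a first-timer gets the discount.
print(choose_offer({"visits": 12, "basket_avg": 30.0, "is_new": 0}))
print(choose_offer({"visits": 0, "basket_avg": 0.0, "is_new": 1}))
```

The point of the sketch is latency: because the scoring runs on the device itself, the decision completes in microseconds rather than waiting on an upload to, and a response from, a central cloud.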

In response, cloud providers such as Microsoft are beginning to revamp their platforms to push critical IoT analytics functions, such as predictive artificial intelligence algorithms, downstream to devices. Even server manufacturers HPE and Dell EMC are targeting edge processing, with hardware built specifically to run analytics close to instrumented devices.

This preference for IoT immediacy is also reflected in the dominant usage of in-memory databases such as SAP HANA and distributed data platforms such as Apache Hadoop. Together, these two modern data stores saw more than double the usage of traditional relational databases (only 15%) among those surveyed.

At the other end of the spectrum, only 6% of successful IoT practitioners relied upon a data warehouse for storage and analysis. This isn’t to say that data warehouses have no role to play in IoT; they remain an important component of any successful analytics endeavour. But in the highly distributed world of IoT, immediate decisions can’t wait for traditional processes built on centralised data. IoT data thrives at the edge.