It’s the end of the year, and we all know what that means: experts are weighing in with their tech predictions for 2018 and beyond. This year has seen data become the lifeblood of public sector organizations. In a recent interview at NetApp Insight Berlin 2017, Mark Bregman, CTO of NetApp, shared his top predictions for 2018. As data continues to drive processes, he says, it will become more “self-aware.” Read on and watch the video below to find out what his other predictions are.

Data Becomes Self-Aware

According to Bregman, as data becomes more “self-aware,” it will carry much more of its metadata with it, giving it a new level of significance as it drives processes. In the communications world, we have already moved to packet-switched networks, where each packet carries information about how it should be handled, so we don’t need to know everything about the network it travels on. In the future, data will similarly determine which processes get executed on it. Bregman illustrates this with the example of an autonomous vehicle that gets into an accident: multiple stakeholders want access to the data for their own purposes. When data is self-aware, it can be tagged so that the data itself controls who is allowed to see which parts of it, and when, eliminating the need for a separate program or security process.
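One way to picture this idea is a record that carries its own access policy as metadata, so the data itself decides what each viewer may see. The sketch below is purely illustrative (the field names, roles, and `view` function are assumptions, not any real NetApp API):

```python
# Hypothetical sketch: a crash record from an autonomous vehicle that
# carries its own access policy as metadata. The policy travels with
# the data, so no separate security program decides who sees what.

crash_record = {
    "data": {
        "vin": "1FTFW1ET5BFA12345",
        "speed_kmh": 87,
        "gps": (52.5200, 13.4050),
        "driver_id": "D-4471",
    },
    # Metadata policy: field -> set of roles allowed to read it.
    "policy": {
        "vin": {"manufacturer", "insurer"},
        "speed_kmh": {"manufacturer", "insurer", "police"},
        "gps": {"police"},
        "driver_id": {"insurer"},
    },
}

def view(record, role):
    """Return only the fields the record's own policy grants to this role."""
    return {
        field: value
        for field, value in record["data"].items()
        if role in record["policy"].get(field, set())
    }

print(view(crash_record, "police"))   # speed and location fields only
print(view(crash_record, "insurer"))  # vin, speed, and driver fields
```

The point of the sketch is that access control lives inside the record rather than in an external system: any stakeholder asks the data itself, and the embedded policy answers.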

Virtual Machines Become Ride-Share Machines

Bregman compares the current state of the public cloud to a car rental service, with the future looking more like a ride-sharing program such as Uber. Today, users choose which cloud instances they want and pay for them as long as they are provisioned, not for the work that actually gets done. Virtual machines provisioned on webscale infrastructure will become the ride-share services of computing, where the user simply specifies the task that needs to be done.

Data Will Grow Faster than the Ability to Transport It

It’s no secret that we’re transferring more data than ever before between data centers and the cloud. With the emergence of the Internet of Things (IoT), ever more data will be generated and stored at the edge, and it will not be practical to ship all of it back to the core. Bregman believes this will not be an issue, because the necessary processing can be done at the edge. This will create a need for data lifecycle management at the edge, where data has historically just been siloed.
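The pattern Bregman describes can be sketched very simply: raw readings stay local at the edge, and only a compact summary travels back to the core. The function and sample values below are illustrative assumptions, not part of any real product:

```python
# Hypothetical edge-processing sketch: reduce a stream of raw sensor
# readings to a small summary locally, so only a few values (rather
# than the full stream) need to be shipped back to the core.

def summarize_at_edge(readings):
    """Reduce raw readings to a compact summary suitable for transport."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

raw = [21.3, 21.9, 22.4, 21.1, 35.8]  # e.g. temperature samples at the edge
summary = summarize_at_edge(raw)       # a handful of numbers instead of the stream
print(summary)
```

In practice the summary might also drive local decisions (the lifecycle management mentioned above decides what to keep, what to discard, and what to forward), but the core idea is the same: computation moves to the data rather than the data to the computation.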

Evolution of Big Data to Huge Data

Big data will become huge data, driven by the growing need for artificial intelligence and machine learning. Very large amounts of data will need to be analyzed in near real time to meet new needs, and that demand will create a feedback loop driving new architectures such as in-memory analytics. Alongside in-memory analytics, new technologies will emerge that allow us to build persistent storage and persistent memory.

Importance of Blockchain

Blockchain is alive and well today, but in the future it will be even more significant, providing a trustworthy, truly distributed, and immutable ledger for data. Because there is no central point of control, such as a centralized server, information recorded on a blockchain cannot be changed or deleted, and transactions are irreversible. Blockchain will fundamentally change the way the data centers of the future are managed.
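The property that makes such a ledger tamper-evident is hash-chaining: each block stores the hash of its predecessor, so altering any past entry breaks every link that follows. The minimal sketch below shows only that chaining (a real blockchain adds distributed consensus on top; all names here are illustrative):

```python
# Minimal sketch of the hash-chaining behind a blockchain-style ledger.
# Each block records the hash of the previous block, so tampering with
# any historical entry invalidates the rest of the chain.
import hashlib
import json

def make_block(data, prev_hash):
    """Build a block whose hash covers its data and its link to the past."""
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return {
        "data": data,
        "prev_hash": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    }

def verify(chain):
    """Recompute every hash and check each block's link to its predecessor."""
    for i, block in enumerate(chain):
        payload = json.dumps(
            {"data": block["data"], "prev_hash": block["prev_hash"]},
            sort_keys=True,
        )
        if block["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("genesis", "0")]
chain.append(make_block("transfer A->B", chain[-1]["hash"]))
chain.append(make_block("transfer B->C", chain[-1]["hash"]))

print(verify(chain))                  # True: the ledger is intact
chain[1]["data"] = "transfer A->X"    # try to rewrite history
print(verify(chain))                  # False: the chain no longer verifies
```

Because every participant can run this verification independently, no central server is needed to vouch for the ledger, which is the property Bregman points to.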