Gartner predicts that by 2020, data will be used to reinvent, digitalize or eliminate 80% of the business processes and products that existed a decade earlier.
Organizations are at a tipping point in the quest for more meaningful data. Over the years, we’ve seen platforms, tools and systems emerge that each tackle a piece of the data problem, but this has created silos of information that fail to string together the entire story in a way that empowers more strategic decision making.
From contextual analytics to accessible data analysis, we’ve compiled a list of trends affecting the world of data and how companies access and strategically use it. Here are the five trends on their way out in 2017 – stay tuned for next week when we share those that are “in.”
Out: Relying on experts for data analysis.
A natural language search interface changes the game for everyday business users who have had to rely on data experts and analysts for their time and technical skills. Instead of waiting for hours, days or even weeks, a natural language search interface allows all users to have conversations with their data – quickly – so they can gain business insights, all without writing code or knowing SQL.
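To make the idea concrete, here is a minimal sketch of how a natural language interface might map a plain-English question onto SQL. Real products use far more sophisticated parsing; the keyword tables, function names and the `sales` table here are purely illustrative assumptions.

```python
# Illustrative only: translate a simple question into SQL using a
# hand-written keyword lookup. METRICS/DIMENSIONS are made-up mappings.
METRICS = {"revenue": "SUM(revenue)", "orders": "COUNT(order_id)"}
DIMENSIONS = {"region": "region", "month": "month"}

def question_to_sql(question: str, table: str = "sales") -> str:
    words = question.lower().split()
    # Pick the first recognized metric; default to a row count.
    metric = next((METRICS[w] for w in words if w in METRICS), "COUNT(*)")
    # Collect any recognized grouping dimensions.
    dims = [DIMENSIONS[w] for w in words if w in DIMENSIONS]
    select = ", ".join(dims + [metric])
    sql = f"SELECT {select} FROM {table}"
    if dims:
        sql += " GROUP BY " + ", ".join(dims)
    return sql

print(question_to_sql("show revenue by region"))
# SELECT region, SUM(revenue) FROM sales GROUP BY region
```

The point of the sketch is the user experience: the business user types a question and gets an answer, while the SQL stays behind the scenes.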
Out: Ineffective data cleansing practices.
While enterprises invest heavily in data cleansing, the process remains largely inefficient. Too much cleansing can dilute data to the point where the process works against you, and traditional cleansing leaves only a portion of the data usable. Cognitive engines let you use as much of the original data as possible, reducing the need to invest heavily in inefficient data cleansing. This allows the enterprise to invest more wisely in partners for the more complex data cleansing.
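The trade-off can be illustrated with a toy example: an aggressive pass discards any imperfect record, while a lighter-touch pass repairs what it can and keeps far more of the original data usable. The records and field names below are invented for illustration.

```python
# Toy dataset with typical imperfections: stray whitespace, a
# non-numeric value, and a missing name.
records = [
    {"name": "Acme Corp", "revenue": "1200"},
    {"name": "  beta llc ", "revenue": "N/A"},
    {"name": "", "revenue": "300"},
]

def aggressive_clean(rows):
    # Discard any row with a missing name or non-numeric revenue.
    return [r for r in rows if r["name"].strip() and r["revenue"].isdigit()]

def light_clean(rows):
    # Repair what we can: trim names, treat bad revenue as unknown.
    # Drop only rows with no identifying information at all.
    out = []
    for r in rows:
        name = r["name"].strip()
        if not name:
            continue
        revenue = int(r["revenue"]) if r["revenue"].strip().isdigit() else None
        out.append({"name": name, "revenue": revenue})
    return out

print(len(aggressive_clean(records)), len(light_clean(records)))  # 1 2
```

Here the aggressive pass keeps only one of three records, while the lighter pass salvages two, which is the kind of yield improvement the trend is about.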
Out: Siloed data.
The clunky, complicated technology of yesterday meant that users could only access a portion of data that someone within their organization had decided was relevant for them. To enable the best decision making, enterprise search tools must effectively expand a user’s view and provide cross-business insights. New tools do a better job of breaking down information silos across the various systems in use at an organization to offer an unobstructed view of the data as a whole.
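As a rough sketch of what "breaking down silos" means in practice, a federated search fans a single query out across several sources and merges the hits, instead of confining users to one pre-selected system. The source names and documents below are invented.

```python
# Three hypothetical siloed sources, each a list of documents.
crm = [{"source": "crm", "text": "Acme renewal due in Q3"}]
tickets = [{"source": "support", "text": "Acme reported a login issue"}]
wiki = [{"source": "wiki", "text": "Beta onboarding checklist"}]

def federated_search(term, sources):
    # Run the same query against every source and merge the results.
    term = term.lower()
    return [doc for docs in sources for doc in docs
            if term in doc["text"].lower()]

hits = federated_search("acme", [crm, tickets, wiki])
print([h["source"] for h in hits])  # ['crm', 'support']
```

A single search surfaces related information from both the CRM and the support system, the cross-business view a siloed tool would miss.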
Out: Public or private clouds.
More enterprises are moving to the cloud, and more specifically, the hybrid cloud, when it comes to their data solutions. For many, the public cloud is seen as too unreliable to successfully process and transfer big data, while private clouds limit scaling and availability. For big data, the hybrid cloud offers the best of both worlds.
Out: In-memory systems.
With the cloud comes a more fully distributed approach to computing and data processing. Because today’s companies are moving towards search and data analysis at the source, in-memory systems are becoming a constraint.
Stay tuned for our next post on what’s “in” for 2017.