
One of the main challenges of the next 5 years for businesses and for any organisation will be to secure and maximise the value of their business data. Between the public cloud and the private cloud, a hybrid data fabric solution is gaining prominence.

Stored data has become more complex, now enriched with multiple types of information and formats (text, images, sound, video, photos, etc.). It is this very complexity that has driven changes in data storage.

Data that was stored internally on file servers in the 1980s and 1990s is today held in the form of data warehouses in data centres and in private, public and hybrid clouds. This dispersal is weakening businesses and forcing them to structure their data using data fabric architecture in order to optimise costs and enhance practices.

The main challenge today is to manage the growth in both the volume of data and its uses. To achieve this, businesses need to become data driven. In other words, they need to act on a now widely shared view: that enterprise data serves not just to manage a business but above all to transform it.

“Using the public cloud as a tool to access massive amounts of computing power and the private cloud to protect sensitive data.”

That being the case, how can you leverage your own data without leaving the door open for other businesses to do it in your place, at the risk of seeing them exploit and monetise your know-how faster than you can? The tech giants are a stark reminder of how real the risk is of being stripped of your data and your business.

Taking back control

In order to take back ownership of your data, it’s vital to understand how it flows within your business processes. That means knowing where it comes from, how to clean it up and how to enhance and exploit it before feeding it back into your business applications. This review of the whole data management process is referred to as the data pipeline.
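The three movements described here, knowing where data comes from, cleaning it up, then enhancing it before feeding it back into business applications, can be sketched in a few lines. The function and field names below are illustrative assumptions, not a real product:

```python
# Minimal sketch of a data pipeline: ingest raw records, clean them,
# enrich them, then hand the result back to a business application.

def clean(record):
    """Drop incomplete rows and normalise text fields."""
    if record.get("customer_id") is None:
        return None
    return {**record, "product": record["product"].strip().lower()}

def enrich(record):
    """Add derived information that business applications can use."""
    return {**record, "is_repeat_buyer": record.get("orders", 0) > 1}

def run_pipeline(raw_records):
    cleaned = [r for r in (clean(rec) for rec in raw_records) if r is not None]
    return [enrich(r) for r in cleaned]

raw = [
    {"customer_id": 1, "product": "  Ski Jacket ", "orders": 3},
    {"customer_id": None, "product": "noise"},  # incomplete: dropped by clean()
]
result = run_pipeline(raw)
```

Real pipelines add many more stages (deduplication, validation, lineage tracking), but the shape, a chain of small, auditable transformations, is the same.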

To get an idea of how the approach works, take the example of a company that has decided to move into distance selling. If its mobile e-commerce application is to be effective, all of its stock, customer purchase and promotion data first needs to be consolidated. This is a straightforward task today thanks to data warehouses.

The second step involves enhancing the data through a segmentation process. Categories and sub-categories are established, enabling the client to sort information more easily (clothing, cosmetics, household appliances, etc.). These measures also make it possible to manage customer history more accurately.
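The segmentation step amounts to mapping each product into a category and sub-category so information can be sorted more easily. A toy version, with an invented taxonomy used purely for illustration, might look like this:

```python
# Illustrative product taxonomy: product name -> (category, sub-category).
# The entries are assumptions, not a real catalogue.
TAXONOMY = {
    "ski jacket": ("clothing", "winter sports"),
    "lipstick": ("cosmetics", "make-up"),
    "kettle": ("household appliances", "kitchen"),
}

def segment(product_name):
    """Return (category, sub_category), defaulting to ('other', 'other')."""
    return TAXONOMY.get(product_name.strip().lower(), ("other", "other"))
```

In practice the taxonomy lives in a reference database and is applied during the enrichment stage of the pipeline, which is also what makes customer history easier to manage by category.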

This naturally leads on to the third stage, which consists of using artificial intelligence to optimise the commercial recommendations made to customers.

Say we’ve identified from the data that twice in the last three years a customer has bought ski equipment for his son during the “Foire aux vins”* period. The company then has every interest in recommending promotions for boys’ ski equipment in the right size.
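The ski equipment scenario can be captured as a simple rule: if a customer has bought from a category at least twice in the last three years during the same seasonal period, recommend that category again. The field names, dates and the two-purchase threshold below are assumptions for illustration; real systems would learn such rules rather than hard-code them:

```python
from datetime import date

def should_recommend(purchases, category, season, today=date(2023, 9, 1)):
    """purchases: list of dicts with 'category', 'season' and 'date' keys."""
    recent = [
        p for p in purchases
        if p["category"] == category
        and p["season"] == season
        and (today - p["date"]).days <= 3 * 365  # within the last three years
    ]
    return len(recent) >= 2

history = [
    {"category": "boys ski equipment", "season": "foire aux vins",
     "date": date(2021, 9, 10)},
    {"category": "boys ski equipment", "season": "foire aux vins",
     "date": date(2022, 9, 12)},
]
```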

Being properly aligned with customer buying patterns can generate a 2 to 4% increase in average shopping cart value.

This whole process and the data pipeline approach also have the advantage of helping to continuously improve services. When delivery delays or returns of faulty items start multiplying, a fine-grained algorithm-based analysis of the data can help determine if the reason is that parcels are being delivered outside rather than inside the building or if the issue is related to a specific product or supplier.
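A toy illustration of that fine-grained analysis: counting returned orders across several dimensions to see whether faults cluster around a product, a supplier or a delivery practice. The data and field names are invented for the example:

```python
from collections import Counter

def dominant_factor(returns, dimensions=("product", "supplier", "delivery")):
    """For each dimension, find its most frequent value among returned orders."""
    return {
        dim: Counter(r[dim] for r in returns).most_common(1)[0]
        for dim in dimensions
    }

returns = [
    {"product": "kettle", "supplier": "A", "delivery": "left outside"},
    {"product": "toaster", "supplier": "A", "delivery": "left outside"},
    {"product": "kettle", "supplier": "B", "delivery": "handed over"},
]
analysis = dominant_factor(returns)
```

Here two of the three returns share the same supplier and the same delivery practice, exactly the kind of pattern that tells you whether the issue is parcels left outside the building or a specific product or supplier.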

Cloud: the best of both worlds

However, in order to keep control over data and fully enhance its value, it must be stored in the right place. Some businesses opt for an internal solution using a private cloud and dedicated data centres. This model is sure to deliver data security, but it is static and often lags behind in terms of technology, limiting scope for development.

Others choose up-to-date platforms from providers that host data in the public cloud. But the confidentiality risks and the high data ingress and egress fees are far from insignificant, and those fees keep growing.

But there is a third way: a hybrid solution. The principle involves using the public cloud as a mobility tool (IoT/Edge) that offers massive amounts of computing power and scalability capabilities. Combined with a private cloud, operated internally or by a provider, the solution helps protect sensitive enterprise data relating to business applications. I refer to this as hybrid data fabric (HDF) architecture. It optimises storage, ensures your data is secure and forms the basis of your data pipeline.

HDF frees up your energy to better understand and innovate with your data.


* These wine promotion fairs are an annual event, generally held in France after the summer holidays.



By Yves Pellemans, Axians France CTO   
