
In an increasingly open, fragmented and virtualised environment, data control is becoming a critical issue for the various players in the property value chain. So it’s all about moving from a context of big data to a system of controlled management of smart data.

Buildings need data to be smart. Indeed, the buildings of tomorrow will be able to generate, move, store, and import huge amounts of data. But what should be done with this wealth of information? How should it be managed? Where should it be kept? Where should it be sent? In short, where should a building’s smart data be housed? In connected objects or equipment? In servers? In the cloud?

The answers to all of these questions depend, of course, on the configuration of the site (its size, whether it is a single site or spread over multiple locations), its level of technical sophistication (deployment of IoT applications, density and scope of networks), its function (head office or branch, nature of operations), and the set of players involved (the subcontractor ecosystem).

However, says Julien Delbecchi, information systems manager at VINCI Facilities (VINCI Energies), “Beyond the variety of situations, a data warehousing strategy is underpinned by three invariable factors: security, performance, and cost control. It is these three objectives that should guide choices, not logistical ease.”

But it’s not that simple. We are seeing a proliferation of on-premise and off-premise data centres and, above all, exponential growth in the number of cloud applications. The smarter a building is, the more links it accumulates between central databases, data centres, and external SaaS (Software as a Service) applications on the one hand, and the company’s business functions (accounting, payroll, shared calendars, CRM, etc.) on the other.

Shadow IT…

Worse still, these multiple connections often result in a loss of control. This is what is meant by “shadow IT”, a real blind spot in information systems, which could account for more than 30% of IT expenditure.

While it is estimated that a business generally relies on two or three major public cloud platforms (like Amazon Web Services, Microsoft Azure, or Google Cloud Platform), a study carried out in 2017 by the “Club des experts de la sécurité de l’information et du numérique” (club of information and digital security experts – Cesin) shows that the average number of SaaS applications used by companies stands at around 1,700.

“The aim is that client data remains with the client, on the premises.”

To ensure optimum building performance and to provide technical and other services that are tailored to needs, facility management providers must not only be able to connect their own BMS (building management system) and CMMS (computerised maintenance management system) to the applications of their operator and/or occupant clients, but must also have access to the most useful data. In other words, data that holds real information and operational value for various maintenance job scenarios.
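As a minimal sketch of that kind of filtering, the snippet below keeps only records that carry operational value for a maintenance scenario. The field names and the notion of what counts as “useful” are illustrative assumptions, not a description of any actual BMS or CMMS schema:

```python
def has_operational_value(record: dict) -> bool:
    """A record is useful (in this illustrative sense) if it identifies an
    asset and reports a fault or a wear indicator a technician can act on."""
    return "asset_id" in record and (
        "fault_code" in record or "runtime_hours" in record
    )

def select_useful(records: list[dict]) -> list[dict]:
    """Keep only the records worth connecting to maintenance workflows."""
    return [r for r in records if has_operational_value(r)]
```

In practice the selection criteria would come from the maintenance job scenarios themselves; the point is simply that filtering happens before data is shared between systems.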

In this nebulous multi-cloud environment, it’s about (re)taking control of data, about moving from a context of big data to a system of controlled management of smart data.

“The aim is that client data remains, to the extent possible, with the client, on the premises. The idea being to organise and structure data before moving what needs to be moved to external databases, based on predefined criteria of criticality, usefulness, and urgency,” explains Delbecchi.
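The triage Delbecchi describes, keeping data on the premises by default and moving it only against predefined criteria of criticality, usefulness, and urgency, could be sketched as follows. The thresholds and labels are illustrative assumptions, not a documented policy:

```python
from dataclasses import dataclass

@dataclass
class Record:
    payload: dict
    criticality: int  # 0 (low) to 2 (high); scale is illustrative
    useful: bool      # holds real informational/operational value
    urgent: bool      # needed off-site without delay

def route(record: Record) -> str:
    """Decide where a record lives: on the premises by default,
    pushed to an external database only when the criteria are met."""
    if not record.useful:
        return "discard"
    if record.urgent or record.criticality >= 2:
        return "external"
    return "on_premises"  # default: client data stays with the client
```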

… and edge computing

The answer may well lie in edge computing: a distributed open network of micro data centres. How does it work? Data is sent directly to a small local device, which stores, processes, and analyses the information before pushing it to the cloud or a remote data centre. “Together with the startup SpinalCom, we are building an architecture that makes it possible to connect our clients’ local centres and our own databases using generic devices. The goal is not to impose a new standard but to create a common language so that our solutions can be deployed easily, quickly, and cost-effectively from one building to another,” adds Delbecchi.
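The store-process-push pattern of such a local device can be sketched in a few lines. The batch size, the summary fields, and the in-memory stand-in for the cloud upload are all illustrative assumptions; a real gateway would transmit over a protocol such as HTTP or MQTT:

```python
import statistics

class EdgeGateway:
    """Sketch of an edge node: buffer raw sensor readings locally,
    then push only a compact summary towards the cloud."""

    def __init__(self, batch_size: int = 5):
        self.batch_size = batch_size
        self.buffer: list[float] = []
        self.pushed: list[dict] = []  # stands in for a cloud/data-centre upload

    def ingest(self, value: float) -> None:
        """Store a raw reading; process the batch once it is full."""
        self.buffer.append(value)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        """Analyse the buffered readings and push the summary, not the raw data."""
        summary = {
            "count": len(self.buffer),
            "mean": statistics.mean(self.buffer),
            "max": max(self.buffer),
        }
        self.pushed.append(summary)
        self.buffer.clear()
```

The design choice this illustrates is the one the article describes: raw data stays and is processed at the edge, and only what needs to travel actually travels.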