Forecasts estimate that by 2050, more than two thirds of humanity will live in cities. This high urban concentration creates new uses and growing expectations in the management of mobility, security, waste, communication, city operations, sustainability, and energy consumption. To address these concerns and improve the living environment, the concept of the smart city has emerged.

Cities are not smarter than they used to be; the difference lies in the way data is collected and, above all, in the way it is used. Big Data and technology are key factors in the smart city concept for conducting business and making key decisions.

Various objects (traffic lights, public transport, warning sensors, etc.) are increasingly linked together by networks to improve their efficiency. This is the Internet of Things (IoT), a set of technologies familiar from everyday connected objects.

It’s all about the data:

What enables a smart city to optimize resources, improve its inhabitants' quality of life, and create sustainable economic development is the massive amount of data available. All this data needs to be processed within an analytical ecosystem capable of integrating, analyzing, and sharing it in real time. Data is the backbone of a smart city.

Smart cities require intelligent data management that shares information, enables application developers to create new products, and uses analytics to improve the systems that form the city's foundation.

Smart cities must integrate data from all available sources. The goal is to break down data silos and bring all the data together for rich, robust analytical insights and more accurate reporting. City authorities then have the knowledge they need to make decisions and act on these analyses.

Where does the data come from?

  • City databases:
    • Geographic Information System (GIS) data, such as Esri's
    • Building HVAC data
    • Business supervision systems (lighting, parking, waste, finance, etc.)
    • Existing video surveillance systems
    • City police
  • IoT sensors
  • Third-party applications:
    • Energy suppliers (Engie, energy syndicates)
    • Weather forecasts
    • Water authorities
    • Waze
    • Telecom operators
    • Service providers for the city (bus transport, metro, etc.)

How is data collected and transformed into useful information?

The first stage is to collect the data, which requires handling not only the communication protocols used by the sensors but also those used to connect third-party systems.
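
As an illustration, here is a minimal sketch of collecting sensor readings over MQTT, one of the most common IoT transport protocols, using the paho-mqtt library (1.x callback style). The broker address and topic hierarchy are assumptions for the example, not part of any specific deployment.

```python
# Minimal MQTT collector sketch (paho-mqtt 1.x callback style).
# The broker address and topic hierarchy are assumptions.
import json
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    # Subscribe to every sensor topic once the connection is established.
    client.subscribe("city/sensors/#")

def on_message(client, userdata, msg):
    # Decode the JSON payload and hand it to the ingestion pipeline.
    reading = json.loads(msg.payload)
    print(msg.topic, reading)

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.example.com", 1883)
client.loop_forever()  # Blocking loop; processes messages as they arrive.
```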

An important phase in processing the data is putting it into context, because out of context the data does not mean much. The data must therefore be linked to other sources from the same ecosystem, providing more detail for analysis. Once cross-referenced, this data is enriched and becomes information, which, once processed by AI, produces knowledge that supports taking the right decision at the right moment. For example, based on three measures such as pedestrian flow, weather, and car park occupancy, it is possible to predict a shop's occupancy rate.
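
A toy sketch of this cross-referencing step follows, assuming the three measures have already been collected into pandas DataFrames; the column names, the synthetic values, and the linear model are purely illustrative, not a prescribed method.

```python
# Toy enrichment sketch: cross-reference three hypothetical data sources
# (pedestrian flow, weather, car park occupancy) to predict shop occupancy.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical hourly measurements, each keyed by timestamp.
pedestrians = pd.DataFrame({"ts": pd.date_range("2024-06-01", periods=24, freq="h"),
                            "flow": range(24)})
weather = pd.DataFrame({"ts": pedestrians["ts"],
                        "temp_c": [20 + h % 5 for h in range(24)]})
parking = pd.DataFrame({"ts": pedestrians["ts"],
                        "occupancy": [0.3 + 0.02 * h for h in range(24)]})

# Cross-referencing: merge the raw measurements into one enriched table.
enriched = pedestrians.merge(weather, on="ts").merge(parking, on="ts")

# Illustrative target: observed shop occupancy over the same hours.
enriched["shop_occupancy"] = 0.5 * enriched["occupancy"] + 0.01 * enriched["flow"]

# A simple model turns the enriched information into actionable knowledge.
model = LinearRegression().fit(enriched[["flow", "temp_c", "occupancy"]],
                               enriched["shop_occupancy"])
new_conditions = pd.DataFrame([[120, 22, 0.8]],
                              columns=["flow", "temp_c", "occupancy"])
print(model.predict(new_conditions))  # Predicted occupancy for new conditions.
```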

Once collected, standardized, enriched with meaning and stored, the data will play its role as “fuel” for the final applications.

Information exchange is becoming crucial. There are so many different sources that it would be wasteful and financially unsustainable not to pool all these data streams. One of the main sources is the set of existing applications, so it is important to create links between them in order to share their data, which is often already contextualized and enriched by the business expertise built into their algorithms.

Software connectors called APIs (Application Programming Interfaces) provide access between these different databases. APIs natively offer data standardization and almost unlimited storage, since each application hosts its own data. For example, to analyze air quality in Aix-en-Provence, Axians chose to connect to the data of AtmoSud (the air quality observatory of the Sud Provence-Alpes-Côte d'Azur region) rather than to its own sensors. The information collected has already been processed with expertise that only an air quality specialist could provide.
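
As a sketch of this pattern, the snippet below pulls pre-processed measurements from a third-party open-data API instead of polling raw sensors. The endpoint, parameters, and response fields are placeholders, not AtmoSud's actual API.

```python
# Sketch of consuming a third-party open-data API instead of raw sensors.
# The URL, parameters, and response fields are placeholders.
import requests

response = requests.get(
    "https://api.example.org/air-quality/measurements",  # hypothetical endpoint
    params={"city": "Aix-en-Provence", "pollutant": "NO2"},
    timeout=10,
)
response.raise_for_status()

# Assumed response shape: a list of {"station", "value", "unit"} records.
for record in response.json():
    print(record["station"], record["value"], record["unit"])
```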

What about Big Data?

Even if the data lake is fed by a wide variety of application databases, powerful systems are still needed to process all the information collected over time.

Big Data is the approach designed to allow this kind of processing in real time. It offers an alternative to traditional database and analysis solutions (e.g., a Business Intelligence platform on SQL Server).

This concept brings together a family of tools that respond to a triple challenge known as the 3V rule: a considerable Volume of data to be processed, a great Variety of information, and a certain Velocity to be achieved, i.e., the frequency at which this data is created, collected, and shared.

  • Huge amount of data to process: The astronomical amount of data generated by companies and individuals keeps growing, and only Big Data tools are capable of handling such volumes of data and information.
  • Speed of data creation, collection and sharing: Velocity is the rate at which data flows, that is, how often it is generated, captured, and shared.
    With new technologies, data is generated ever faster and over much shorter periods. Companies are forced to collect and share it in real time, and because the cycle of generating new data turns over so quickly, information rapidly becomes obsolete.
  • Variety of information: The types of data and their sources are becoming increasingly diverse, moving away from the neat, easy-to-consume structures of traditional data. These new data types cover a wide variety of content: geolocation, connections, measurements, processes, flows, social networks, text, web, images, videos, emails, books, tweets, audio recordings…
    Because of this diversity and lack of structure, integrating the data into spreadsheets or traditional database applications is increasingly complex or impossible, as the schema-on-read sketch after this list illustrates.
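
As a small illustration of the Variety challenge, here is a schema-on-read sketch using PySpark, one of the usual Big Data toolkits: heterogeneous JSON events are ingested without a fixed, predefined schema. The file path and field names are assumptions.

```python
# Schema-on-read sketch with PySpark: ingest semi-structured JSON events
# whose fields vary from record to record, something a fixed SQL schema
# handles poorly. The path and field names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("variety-demo").getOrCreate()

# Spark infers a unified schema across heterogeneous records;
# fields missing from a record simply become nulls.
events = spark.read.json("s3://city-data/events/*.json")
events.printSchema()

# Aggregate whatever geolocated events are present, per source system.
(events
 .filter(F.col("lat").isNotNull())
 .groupBy("source")
 .count()
 .show())
```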

Cities’ data management:

The management of data flows sometimes escapes political power to the benefit of digital operators. Possession of this mass of data therefore calls into question the local authority's historical legitimacy over its territory, blurs the line between the private and the public, and disrupts the rules of the city's economy and governance.

To avoid abuses, in France, Article 17 of Law No. 2016-1321 of 7 October 2016 for a Digital Republic provides a framework for data from public service delegations. From now on, when a public body, such as a local authority, chooses to delegate the management of a public service (water, waste, transport, etc.) to a company, it can require the company to provide it with the details of the data collected in this context.

In addition, the European regulation on the protection of personal data (GDPR), adopted on 27 April 2016, makes it compulsory for public bodies to appoint a data protection officer (DPO), responsible for ensuring that personal data protection provisions are implemented.

How does this apply to the field?

An Italian example:

Data are now at the center of these projects, so when the integrator in charge does not control the data, an IoT project can run into difficulties.

Axians has seen how difficult it is not to control the flow of data: to be totally dependent on the sensor provider and the data collection platform, without being able to intervene to provide quick and efficient support to the end customer. An Italian Axians team therefore decided to build its own IoT platform on the open-source ThingsBoard system. The main objectives were to manage the sensors remotely and to visualize the data in real time, the basis for deploying a smart city system.
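
For context, ThingsBoard exposes a standard MQTT device API, so a sensor (or gateway) can push telemetry with a few lines of code. The sketch below uses that public API; the host name and access token are placeholders.

```python
# Sketch: a device pushing telemetry to a ThingsBoard-based platform over
# its standard MQTT device API. Host and access token are placeholders.
import json
import paho.mqtt.publish as publish

publish.single(
    topic="v1/devices/me/telemetry",           # ThingsBoard's standard telemetry topic
    payload=json.dumps({"temperature": 21.5, "people_count": 42}),
    hostname="iot.example.com",                # placeholder platform host
    auth={"username": "DEVICE_ACCESS_TOKEN"},  # ThingsBoard authenticates by device token
)
```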

They evaluated and fully customized it to build the Axians Xsona platform, which meets the needs of IoT solutions such as the following use cases:

  • Swimming pool management solution: the customer needed a single portal to monitor visitor numbers, the energy meter, the pH, and water consumption.
  • Facilities management using BIM (Building Information Model) software: the customer wanted to monitor access to the restrooms and canteen to improve cleaning, using wireless, battery-powered sensors. In this case, Axians deployed a LoRaWAN network and the Xsona platform, used as middleware to inject data into the BIM.
  • Automation of solenoid valves for irrigation: the customer wanted to monitor soil temperature and moisture. Here, Axians mainly delivered an on-site platform (at the customer's request) that collects the data from the LoRaWAN soil moisture sensors and sends it to the Schneider BMS software; a simplified sketch of such a rule follows this list.
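
To make the irrigation case concrete, here is a simplified sketch of the kind of rule such a platform can evaluate before driving the solenoid valves. The thresholds, the reading format, and the valve-control call are hypothetical, not Axians' actual implementation.

```python
# Hypothetical irrigation rule: open the solenoid valve when the soil is
# both dry and warm enough. Thresholds and the actuator call are illustrative.
MOISTURE_THRESHOLD = 0.25  # volumetric water content below which soil is "dry"
MIN_SOIL_TEMP_C = 10.0     # avoid irrigating cold soil

def should_irrigate(reading: dict) -> bool:
    """Decide from one soil sensor reading whether to open the valve."""
    return (reading["moisture"] < MOISTURE_THRESHOLD
            and reading["soil_temp_c"] >= MIN_SOIL_TEMP_C)

def open_valve(zone: str) -> None:
    # Placeholder for the call that drives the actuator via the BMS.
    print(f"Opening irrigation valve for zone {zone}")

def on_reading(reading: dict) -> None:
    # Called for each reading the platform collects from the sensors.
    if should_irrigate(reading):
        open_valve(reading["zone"])

# Example reading as it might arrive from the platform:
on_reading({"zone": "north-field", "moisture": 0.18, "soil_temp_c": 14.2})
```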

Before this type of platform existed, it would have been impossible to offer an end-to-end solution to these customers. Its creation has opened up many use cases and requests from customers, which confirms its great potential.

Systems have long been deployed based on the needs of individual city agencies. The result is a large number of enterprise applications that are independent of each other, and cities now face issues such as:

  • Data sharing
  • Many interfaces to manage
  • The cost of maintaining them all

The main challenge is to find a centralized solution in which a single actor owns the whole system and can therefore be fully responsive and agile.

For several years, improvements in data analysis tools have enabled local authorities to raise their services to a higher level. Data processing tools are very powerful today and will continue to progress. Business intelligence contributes greatly to companies' capacity to decide and to anticipate the future. The debate around data is wide; many notions must be considered to understand its full spectrum of implications, such as cybersecurity, artificial intelligence, and data analytics.

Humans may become overloaded with the number of decisions they have to make. As the smart city develops, the number of results to analyze, and therefore the number of choices to make, grows exponentially. The objective is to automate part of these decisions via automated platforms; this is hyperautomation.
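
As a closing illustration, here is a toy sketch of what automating part of these decisions can look like: declarative rules dispatch the routine cases automatically and escalate only the rest to a human operator. All rule names, thresholds, and actions are invented.

```python
# Toy decision-automation sketch: declarative rules handle routine cases
# automatically and escalate only the ambiguous ones to a human operator.
# Rule names, thresholds, and actions are invented for illustration.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]
    action: str

RULES = [
    Rule("dim_streetlights",
         lambda e: e["pedestrian_flow"] < 5 and e["hour"] >= 23,
         "dim lighting to 30%"),
    Rule("reroute_buses",
         lambda e: e["road_congestion"] > 0.9,
         "activate alternate bus route"),
]

def decide(event: dict) -> str:
    # Apply the first matching rule; anything unmatched goes to a human.
    for rule in RULES:
        if rule.condition(event):
            return f"auto: {rule.action}"
    return "escalate to operator"

print(decide({"pedestrian_flow": 2, "hour": 23, "road_congestion": 0.4}))
```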