6G-PATH
UC-CITIES-1: Connected and Sensing city
This use case relies on a smart city platform consisting of different types of vehicles (some of them autonomous), people, and sensors deployed around the city, such as environmental and mobility sensors (e.g., video cameras, radars, lidars), all connected and sending real-time information. This platform serves as the basis for the scenarios described below.
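As an illustration only, the following Python sketch shows the kind of real-time message such sensors might publish to the platform; the SensorReading fields and the example values are assumptions made for the sketch, not the platform's actual data model.

```python
# Minimal sketch of a real-time sensor message on the city platform.
# All field names and values are illustrative assumptions.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json


@dataclass
class SensorReading:
    sensor_id: str      # e.g. a camera, radar, lidar or environmental sensor
    sensor_type: str    # "camera" | "radar" | "lidar" | "environmental" | "mobility"
    latitude: float
    longitude: float
    timestamp: str      # ISO-8601, so consumers can order events
    payload: dict       # sensor-specific measurements


def to_message(reading: SensorReading) -> str:
    """Serialise a reading for publication on the city data platform."""
    return json.dumps(asdict(reading))


if __name__ == "__main__":
    reading = SensorReading(
        sensor_id="cam-042",
        sensor_type="camera",
        latitude=41.3874,
        longitude=2.1686,
        timestamp=datetime.now(timezone.utc).isoformat(),
        payload={"vehicles_detected": 7, "pedestrians_detected": 3},
    )
    print(to_message(reading))
```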
In the high-definition video on the move scenario, people travel around the city while watching high-quality videos. The video coding and video distribution services are proactively orchestrated across the geographically distributed MEC, moving them to the users' next points of access before they arrive, so that no video disruption occurs. Data on vehicles, people, mobility, and networks feed predictions of both user mobility and network quality, anticipating the quality and location requirements of the services so that they can travel with their users.
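A minimal sketch of this proactive orchestration idea follows, assuming a simplified model in which the user's next point of access is predicted by linearly extrapolating recent positions and the video service is pre-deployed on the nearest MEC node; the MecNode names, the predictor, and the deploy callback are hypothetical, not part of the actual orchestration framework.

```python
# Sketch: predict the user's next position, pick the closest MEC node, and
# pre-deploy the video service there before the user arrives.
from dataclasses import dataclass


@dataclass
class MecNode:
    name: str
    latitude: float
    longitude: float


def predict_next_position(track: list[tuple[float, float]]) -> tuple[float, float]:
    """Naive linear extrapolation of the user's trajectory (placeholder for a
    real mobility-prediction model)."""
    (lat1, lon1), (lat2, lon2) = track[-2], track[-1]
    return (2 * lat2 - lat1, 2 * lon2 - lon1)


def nearest_node(pos: tuple[float, float], nodes: list[MecNode]) -> MecNode:
    """Pick the MEC node closest to the predicted position (squared-degree
    distance is enough for a sketch)."""
    return min(nodes, key=lambda n: (n.latitude - pos[0]) ** 2 + (n.longitude - pos[1]) ** 2)


def proactively_place(track, nodes, deploy):
    """Deploy the video service at the predicted next point of access."""
    target = nearest_node(predict_next_position(track), nodes)
    deploy("video-distribution", target)
    return target


if __name__ == "__main__":
    nodes = [MecNode("mec-north", 41.42, 2.17), MecNode("mec-south", 41.35, 2.15)]
    track = [(41.36, 2.15), (41.38, 2.16)]  # user moving north-east
    proactively_place(track, nodes, lambda svc, node: print(f"pre-deploying {svc} on {node.name}"))
```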
In the traffic avoidance scenario, a city digital twin facilitates traffic simulations, supporting decision-making based on real-time information and simulated events, and presenting the data in 2D and 3D formats.
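As a toy illustration of such what-if analysis, the sketch below uses a BPR-style volume-delay function to compare total network delay with and without rerouting traffic away from an incident; the road segments, capacities, and flows are invented numbers, not data from the digital twin.

```python
def travel_time(free_flow_min: float, flow: float, capacity: float) -> float:
    """BPR-style volume-delay function: travel time grows with flow/capacity."""
    return free_flow_min * (1 + 0.15 * (flow / capacity) ** 4)


def network_delay(roads: dict[str, tuple[float, float]], flows: dict[str, float]) -> float:
    """Total vehicle-minutes spent in the network for a given flow pattern."""
    return sum(
        flows[road] * travel_time(free_flow, flows[road], capacity)
        for road, (free_flow, capacity) in roads.items()
    )


if __name__ == "__main__":
    # (free-flow travel time in minutes, remaining capacity in vehicles/hour),
    # with a simulated incident halving the capacity of the city-centre segment
    roads = {"ring_road": (10.0, 2000.0), "city_centre": (6.0, 400.0)}
    keep_flows = {"ring_road": 1200.0, "city_centre": 900.0}     # do nothing
    reroute_flows = {"ring_road": 1700.0, "city_centre": 400.0}  # divert 500 veh/h
    print(f"keep flows : {network_delay(roads, keep_flows):.0f} vehicle-minutes")
    print(f"reroute    : {network_delay(roads, reroute_flows):.0f} vehicle-minutes")
```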
In the enhanced city services scenario, AI is used to provide better services to citizens, such as detecting stolen cars, badly parked vehicles, and cars parked in the carriageway, and controlling shared areas, in order to improve the quality of life in urban spaces. This is achieved through a Smart City deployment with MEC, sensors, a private 5G network, and connected vehicles (with V2X). This scenario also includes specific services to support emergency situations (e.g., ambulances).
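The sketch below illustrates, under assumed inputs, how two of these checks could be expressed on top of camera analytics (a stolen-vehicle watch list and a no-parking zone); the Detection fields, watch list, and zone coordinates are hypothetical placeholders rather than the use case's actual services.

```python
# Sketch: turn a single vehicle detection into city-service alerts.
from dataclasses import dataclass


@dataclass
class Detection:
    plate: str
    latitude: float
    longitude: float
    stationary_minutes: float


STOLEN_WATCHLIST = {"1234-ABC"}                    # assumed police feed
NO_PARKING_ZONE = (41.380, 41.382, 2.168, 2.171)   # lat_min, lat_max, lon_min, lon_max


def in_zone(d: Detection, zone: tuple[float, float, float, float]) -> bool:
    lat_min, lat_max, lon_min, lon_max = zone
    return lat_min <= d.latitude <= lat_max and lon_min <= d.longitude <= lon_max


def classify(d: Detection) -> list[str]:
    """Return the alerts this detection should raise."""
    alerts = []
    if d.plate in STOLEN_WATCHLIST:
        alerts.append("stolen-vehicle")
    if in_zone(d, NO_PARKING_ZONE) and d.stationary_minutes > 5:
        alerts.append("illegal-parking")
    return alerts


if __name__ == "__main__":
    d = Detection(plate="1234-ABC", latitude=41.381, longitude=2.169, stationary_minutes=12)
    print(classify(d))  # ['stolen-vehicle', 'illegal-parking']
```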