Traffic congestion anomaly detection and prediction using deep learning
- Publication Type: Working Paper
- Citation: arXiv, 2020
- Issue Date: 2020-06-23
Filename | Description | Size
---|---|---
2006.13215v1.pdf | Accepted version | 8.26 MB
This item is new to OPUS and is not currently available.
Congestion prediction represents a major priority for traffic management
centres around the world to ensure timely incident response handling. The
increasing amounts of generated traffic data have been used to train machine
learning predictors for traffic, however, this is a challenging task due to
inter-dependencies of traffic flow both in time and space. Recently, deep
learning techniques have shown significant prediction improvements over
traditional models, however, open questions remain around their applicability,
accuracy and parameter tuning. This paper brings two contributions in terms of:
1) applying an outlier detection an anomaly adjustment method based on incoming
and historical data streams, and 2) proposing an advanced deep learning
framework for simultaneously predicting the traffic flow, speed and occupancy
on a large number of monitoring stations along a highly circulated motorway in
Sydney, Australia, including exit and entry loop count stations, and over
varying training and prediction time horizons. The spatial and temporal
features extracted from the 36.34 million data points are used in various deep
learning architectures that exploit their spatial structure (convolutional
neural networks), their temporal dynamics (recurrent neural networks), or
both through a hybrid spatio-temporal modelling (CNN-LSTM). We show that our
deep learning models consistently outperform traditional methods, and we
conduct a comparative analysis of the optimal time horizon of historical data
required to predict traffic flow at different time points in the future.
Lastly, we show that the anomaly adjustment method brings significant
prediction improvements for the deep learning models in both time and space.
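The abstract does not spell out the anomaly adjustment procedure. A minimal sketch of one plausible variant, assuming a per-slot z-score test against historical statistics and median imputation (the function name `adjust_anomalies`, the threshold, and the imputation choice are illustrative assumptions, not the paper's stated method):

```python
import numpy as np

def adjust_anomalies(stream, history, z_thresh=3.0):
    """Flag and replace outlier readings in an incoming stream.

    Hypothetical sketch: each slot of the incoming stream is compared
    against the mean/std of the same slot over past days; readings more
    than `z_thresh` standard deviations away are replaced with the
    historical median. The paper's actual method may differ.

    stream  : (n_slots,) incoming readings (e.g. flow counts per slot)
    history : (n_days, n_slots) past readings for the same slots
    """
    mean = history.mean(axis=0)
    std = history.std(axis=0)
    median = np.median(history, axis=0)
    # Guard against zero variance to avoid division by zero.
    z = np.abs(stream - mean) / np.where(std > 0, std, 1.0)
    is_outlier = z > z_thresh
    adjusted = np.where(is_outlier, median, stream)
    return adjusted, is_outlier

# Example: slot 2 carries an implausible spike relative to history.
history = np.array([[100, 110, 105, 90],
                    [ 98, 112, 103, 95],
                    [102, 108, 107, 92],
                    [101, 111, 104, 93]], dtype=float)
stream = np.array([99.0, 109.0, 500.0, 91.0])
adjusted, flags = adjust_anomalies(stream, history)
# Only the spiked slot is flagged and replaced; the rest pass through.
```

Replacing with the historical median (rather than dropping the reading) keeps the input tensors dense, which matters when the adjusted streams feed fixed-shape CNN/LSTM inputs downstream.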