Review

Graph Neural Network for Traffic Forecasting: The Research Progress

1 School of Information and Communication Engineering, Beijing University of Posts and Telecommunications, Beijing 100876, China
2 School of Computer Science and Engineering & China-Singapore International Joint Research Institute, Nanyang Technological University, Singapore 639798, Singapore
3 Yanqi Lake Beijing Institute of Mathematical Sciences and Applications, Beijing 101408, China
4 China Academy of Industrial Internet, Beijing 100102, China
* Author to whom correspondence should be addressed.
ISPRS Int. J. Geo-Inf. 2023, 12(3), 100; https://doi.org/10.3390/ijgi12030100
Submission received: 18 December 2022 / Revised: 2 February 2023 / Accepted: 26 February 2023 / Published: 27 February 2023

Abstract

Traffic forecasting has been regarded as the basis for many intelligent transportation system (ITS) applications, including but not limited to trip planning, road traffic control, and vehicle routing. Various forecasting methods have been proposed in the literature, including statistical models, shallow machine learning models, and deep learning models. Recently, graph neural networks (GNNs) have emerged as state-of-the-art traffic forecasting solutions because they are well suited for traffic systems with graph structures. This survey aims to introduce the research progress on graph neural networks for traffic forecasting and the research trends observed from the most recent studies. Furthermore, this survey summarizes the latest open-source datasets and code resources for sharing with the research community. Finally, research challenges and opportunities are proposed to inspire follow-up research.

1. Introduction

Traffic forecasting is the foundation of modern transportation infrastructures and intelligent transportation systems (ITSs). It has a wide range of applications in trip planning, road traffic control, and vehicle routing [1,2,3,4,5,6]. Traffic forecasting has drawn a great amount of attention from both academia and industry in recent decades [7,8,9,10,11,12,13,14]. However, the traffic forecasting problem has not been fully resolved due to the complex spatiotemporal dependencies of traffic activities. Furthermore, developments in the Internet of things (IoT), the Internet of vehicles (IoV) and artificial intelligence (AI) techniques [15] have helped to measure and model more diverse traffic-related characteristics, allowing the design of autonomous and efficient data-driven traffic forecasting methods [16,17,18]. To gain a comprehensive understanding of the opportunities and challenges in traffic forecasting, we summarize here the recent research progress in this vibrant field to facilitate future research.
Depending on the data format used, traffic forecasting problems can be classified into different types: those based on time series data, grid data, and graph data. Among them, the earliest and most common problem formulation is time series forecasting, where historical data points are used as model input to predict future conditions [19,20,21]. Furthermore, time series forecasting problems can be divided into univariate problems and multivariate problems. For univariate time series problems, only one traffic variable is considered, such as traffic flow or traffic speed. For multivariate time series problems, multiple traffic variables are considered simultaneously. In addition to univariate and multivariate settings, time series forecasting can also be formulated as single-step forecasting and multiple-step forecasting. In single-step forecasting problems, only the value at the next time step needs to be predicted. In multiple-step forecasting problems, values at multiple future time steps are predicted.
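To make these formulations concrete, the following sketch builds supervised (input, target) pairs with a sliding window; the lookback of 12 steps, the horizon of 3 steps, and the synthetic 5-minute speed series are arbitrary illustrative choices rather than settings from any surveyed study.

```python
import numpy as np

def make_windows(series: np.ndarray, lookback: int = 12, horizon: int = 3):
    """Turn a (T, F) multivariate series into (X, y) supervised pairs.

    X has shape (N, lookback, F) and y has shape (N, horizon, F);
    setting F = 1 gives the univariate case and horizon = 1 the single-step case.
    """
    X, y = [], []
    for t in range(len(series) - lookback - horizon + 1):
        X.append(series[t:t + lookback])
        y.append(series[t + lookback:t + lookback + horizon])
    return np.stack(X), np.stack(y)

# Example: one week of synthetic 5-minute traffic speeds from 4 sensors.
speeds = np.random.rand(7 * 288, 4)
X, y = make_windows(speeds, lookback=12, horizon=3)
print(X.shape, y.shape)  # (2002, 12, 4) (2002, 3, 4)
```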
Some typical time series forecasting models include simple linear regression, autoregressive integrated moving average (ARIMA), and seasonal autoregressive integrated moving average (SARIMA). SARIMA typically outperforms ARIMA because it also captures seasonal patterns; in the transportation domain, both daily and weekly patterns are observed and useful for forecasting. SARIMA was further improved using the Kalman filter in [19], and the improved model outperformed other time series models. Empirical mode decomposition (EMD) is often used together with time series models: the time series is first decomposed into different components, and each component is then modeled with a time series model. This combination has been shown to be effective for traffic forecasting [21].
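As a concrete baseline illustration, a SARIMA model can be fitted in a few lines with the statsmodels library; the synthetic hourly series, the daily seasonal period s = 24, and the (1, 0, 1)(1, 1, 1) orders below are assumptions for demonstration, not values taken from the cited studies.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic hourly traffic volumes with a daily (s = 24) seasonal pattern.
rng = np.random.default_rng(0)
hours = np.arange(21 * 24)
volume = 200 + 80 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 10, hours.size)

train, test = volume[:-24], volume[-24:]

# SARIMA(p, d, q)(P, D, Q)_s; the orders here are illustrative defaults.
model = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 1, 1, 24))
result = model.fit(disp=False)

forecast = result.forecast(steps=24)          # multiple-step forecast
print(f"24-step-ahead MAE: {np.mean(np.abs(forecast - test)):.2f}")
```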
Although time series data are the most commonly used data format in traffic-related studies, they are insufficient because they do not consider the spatial dependence of traffic activities. To overcome this problem, two further data formats, grid data and graph data, are used. For traffic forecasting with grid data, at each time step the traffic data are aggregated over regularly divided regions of the studied urban area, each of which can be regarded as a grid cell. By aggregating the corresponding traffic variables in each cell, we obtain an intensity map that can be displayed as an image, as shown in Figure 1. In single-step traffic forecasting problems with the grid-data format, the historical grid data in a predefined lookback window are formulated as image frames and used as the input feature. The frame at the next time step is used as the prediction target.
For traffic forecasting problems in the graph format, traffic data are aggregated by specific locations or stations, which are regarded as nodes in a traffic graph. Node features are the collected traffic variables, such as traffic flow or speed. Edges can model road topological connections or spatial distances between different nodes. In single-step traffic forecasting problems with the graph format, the historical graph data in a predefined lookback window are used as the input feature. The graph in the next time step is used as the prediction target, as shown in Figure 2.
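One common recipe for turning pairwise sensor distances into edge weights is a thresholded Gaussian kernel; the sketch below illustrates this generic construction with synthetic coordinates (the threshold value and the coordinates are assumptions for illustration, not taken from any surveyed dataset).

```python
import numpy as np

def gaussian_kernel_adjacency(coords: np.ndarray, threshold: float = 0.1):
    """Build a weighted adjacency matrix from sensor coordinates.

    W[i, j] = exp(-d(i, j)^2 / sigma^2) if the weight exceeds `threshold`,
    otherwise 0, where sigma is the standard deviation of all pairwise distances.
    """
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)          # (N, N) pairwise distances
    sigma = dist.std()
    W = np.exp(-(dist ** 2) / (sigma ** 2))
    W[W < threshold] = 0.0                        # sparsify weak connections
    np.fill_diagonal(W, 0.0)                      # no self-loops here
    return W

coords = np.random.rand(5, 2)                     # 5 sensors, (x, y) positions
print(gaussian_kernel_adjacency(coords).round(2))
```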
Existing traffic forecasting methods can also be divided into different categories according to the models used, including statistical models, shallow machine learning models, and deep learning models. Each has its own scope of applicable scenarios and can be adapted to different situations [26]. Statistical models are mainly linear models such as ARIMA and SARIMA. These models are advantageous due to their low computational cost and good interpretability. However, their predictive performance is inferior to that of machine learning and deep learning models, which are better at capturing nonlinear relationships. Shallow machine learning models are represented by tree-based models such as decision trees and random forests [27]. They were the first choice in early research until the recent adoption of more accurate deep learning models, represented by modern neural networks such as convolutional neural networks (CNNs) [28] and recurrent neural networks (RNNs) [29].
These deep learning models have been proven effective for a variety of forecasting problems in the finance, energy and communications sectors [30,31,32,33,34,35,36,37,38,39,40]. Among various deep learning models, graph neural networks (GNNs) have become state-of-the-art solutions to various forecasting problems. In the financial field, a comprehensive survey of deep learning models for stock market forecasting showed that emerging GNN models received the most attention [30]. In the economic field, deep learning models have been proven effective for retail forecasting [37] and market demand forecasting [39], which are the basis of supply chain management. In the energy field, it has also been confirmed that deep learning models are becoming the main solution [31,33,34]. It is worth mentioning that external factors such as temperature and weather information have a great impact on the forecasting performance [32,38]. This observation is insightful for traffic forecasting problems, as transportation systems are also highly influenced by weather information, e.g., road traffic decreases during bad weather. For communication networks, various deep learning models have been proven to be more effective than statistical and machine learning models, such as the InceptionTime model adopted in [35] based on the time series data format and the convolutional LSTM model adopted in [36] based on the grid-data format. It is also observed that GNNs are gaining popularity in cellular traffic prediction [40]. GNNs utilize graph structures, which are common in transportation infrastructure, such as road networks and subway systems. GNNs can effectively capture interactions between nearby traffic sensors or stations, thereby improving prediction performance.
Although this survey focuses on GNN-based solutions, there are still many recent publications introducing CNN-based or decision-tree-based solutions [28,41,42,43]. As discussed in previous relevant studies [44,45], compared to CNN-based or decision-tree-based solutions, GNN-based solutions cover a wider range of applicable scenarios and achieve state-of-the-art performance. GNN-based solutions can be applied when there is a natural graph structure (such as a road network) or when an artificial graph can be constructed (such as a neighborhood graph in grid data). However, GNN-based solutions are inapplicable when such graphs are not available, e.g., when traffic data are collected from only a single loop detector. GNNs are mainly used to forecast speeds and volumes in urban networks and on freeways. However, prediction in urban networks is far more challenging than on freeways because of complex spatiotemporal traffic patterns caused by various factors, such as complex road structures, different vehicle types, and time-varying user demands.
Although there have been some surveys of deep learning for traffic forecasting problems, most of them are not GNN-focused, with only a few exceptions [46,47,48,49,50,51,52,53]. This study serves as an extension of existing GNN-relevant surveys [44,45], summarizes the latest research progress in 2022, and aims to be the latest reference manual for researchers in related fields. In this survey, a total of 118 journal papers and 30 conference papers published in 2022 are reviewed, all of which are selected from prestigious journals and conferences in transportation, computer science, and multidisciplinary fields. Each paper is reviewed in a structured manner and lessons learned are discussed to reveal research trends. Based on the surveyed studies, the latest open datasets and code resources are also collected and organized in lists. Existing research challenges are identified, and corresponding research opportunities are further suggested.
The contributions of this survey are summarized as follows:
  • This survey summarizes the latest studies on the topic of traffic forecasting with graph neural networks.
  • This survey provides the research community with up-to-date lists of open datasets and code resources.
  • This survey identifies existing research challenges and suggests corresponding research opportunities to inspire follow-up research.
The remainder of this paper is organized as follows. Section 2 is a literature review of the latest relevant studies and a discussion of recent research trends. Then, the latest lists of open datasets and code resources for the research community are presented in Section 3. Section 4 discusses research challenges and opportunities when applying graph neural networks for traffic forecasting to inspire follow-up research. The conclusion is drawn in Section 5.

2. Literature Review and Research Trends

The studies covered in this survey were all selected from prestigious journals and conferences in transportation, computer science, and multidisciplinary fields. To share incremental knowledge and avoid repetition with existing similar surveys [44,45], this section discusses only those published in 2022, a total of 118 journal papers and 30 conference papers. Source journals and conferences are listed in Table A2 and Table A3, together with the number of papers from each. The reviewed studies are summarized in Table 1. For each study, the specific traffic problem, graph type, dataset, model components (especially the GNN structures involved), and a summary of the main content are discussed. More relevant studies are tracked and updated in our GitHub repository (https://github.com/jwwthu/GNN4Traffic, accessed on 2 February 2023).
As discussed in the introduction (Section 1), traffic forecasting problems can be categorized from two perspectives, namely, by data format or by the model used. In Table 1, we provide another perspective based on transportation modes, such as road traffic, taxis, bikes, and subways. As shown in Table 1, road traffic flow and speed prediction is still the most popular traffic prediction problem in traffic-related studies. There are two possible reasons for this trend. The first is that for road traffic forecasting, open datasets and baseline models are more accessible, with well-documented processing steps and instructions, which reduces the workload of data collection and preprocessing. The second is that building graphs for road-network-related problems is more intuitive, making it more natural to use GNNs to solve road traffic flow and speed prediction problems; such problems are therefore more common within the scope of our investigation.
As shown in Table 1, there are two types of graphs listed in the graph column, static graphs and dynamic graphs. In the early research stages, static graphs were widely used because of their convenience. However, researchers realized that static graphs were insufficient to capture changes in network topology and traffic patterns. For example, traffic flow measurements and their correlations on road segments change dynamically in space and time, which is beyond the modeling capabilities of static graphs. Then, dynamic graphs were introduced. As the name implies, a dynamic graph is a graph that can evolve as new nodes or edges are added or removed. However, static graphs are still very useful when the traffic infrastructure remains unchanged for the time period considered. Therefore, some researchers use both dynamic and static graphs. The static graph is used to model the static road network, and the dynamic graph is used to consider the impact of dynamic traffic events and weather information.
Most of the datasets used in the surveyed studies are open datasets, with only a few exceptions. Among these open datasets, some, e.g., PeMS-BAY and METR-LA, have made great contributions to supporting relevant studies because they allow different models to be evaluated and compared fairly. However, problems arise when existing datasets are overutilized: GNN-based deep learning models may overfit them and generalize unreliably to other traffic scenarios and datasets. To address this potential risk, newly collected datasets are listed in Section 3 for further evaluation. Additionally, most of the surveyed studies used two or more datasets, a phenomenon worthy of further study.
For the model component part, the graph convolutional network (GCN) [54] and graph attention network (GAT) [55] are the two dominant networks used. It is difficult to go through all the GNN model details in the surveyed papers listed in Table 1. Interested readers are advised to read the original text of the surveyed papers.
The GCN pioneered the transfer of the convolution operation from Euclidean image data to non-Euclidean graph data and has achieved great success in the past few years. The basic idea of a GCN is to aggregate the features of neighboring nodes and then apply a linear transformation to the aggregated features. GCN layers can be stacked k times to capture k-hop neighbor information. However, a GCN requires the entire graph structure for training, which consumes a considerable amount of memory. The GAT, based on the attention mechanism [56], was therefore introduced as an alternative to the GCN. The main difference between GAT and GCN is the introduction of importance scores for different neighbors based on the masked self-attention mechanism. Technical details of GCN and GAT are beyond the scope of this survey, which aims to identify research trends, and can be found in relevant surveys [57,58]. Designing more effective GCN or GAT variants is still a major research direction. Fundamental theoretical breakthroughs in the GNN research community will also help in the development of new traffic forecasting methods.
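To make the aggregation idea concrete, the minimal sketch below implements the propagation rule H' = σ(ÂHW), where Â is the symmetrically normalized adjacency matrix with self-loops and W a learnable weight matrix; it is a bare-bones illustration with arbitrary dimensions, not the implementation used in any surveyed study.

```python
import torch
import torch.nn as nn

class SimpleGCNLayer(nn.Module):
    """One GCN layer: aggregate neighbor features through a normalized
    adjacency matrix, then apply a shared linear transformation."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Add self-loops and apply symmetric normalization: D^-1/2 (A + I) D^-1/2.
        a_hat = adj + torch.eye(adj.size(0), device=adj.device)
        deg_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        a_norm = deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)
        # Aggregate one hop of neighbor features, then transform.
        return torch.relu(self.linear(a_norm @ x))

# Toy example: 5 sensors with 3 features each; stacking two such layers
# would capture 2-hop neighbor information.
adj = (torch.rand(5, 5) < 0.4).float()
x = torch.rand(5, 3)
print(SimpleGCNLayer(3, 16)(x, adj).shape)  # torch.Size([5, 16])
```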
Both GCN and GAT are mainly used to capture spatial dependencies. To capture temporal dependencies, there are some classic models, e.g., the temporal convolutional network (TCN), long short-term memory (LSTM), and the gated recurrent unit (GRU). More recently, an attention-based model, the Transformer, has proven effective for capturing long-term dependencies in time series [59]. Nevertheless, as indicated in Table 1, the Transformer has been used in only a few of the surveyed studies, and there is still much room for research.
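A common pattern in the surveyed models is to apply a graph module at each time step of the lookback window and let a recurrent (or convolutional/attention-based) module handle the temporal dimension. The sketch below wires a row-normalized graph aggregation to a GRU for next-step prediction; it illustrates the pattern only, with arbitrary dimensions, and is not a reimplementation of any surveyed model.

```python
import torch
import torch.nn as nn

class GCNGRUForecaster(nn.Module):
    """Apply graph aggregation at every time step, then model the temporal
    dimension per node with a GRU and predict the next value for each node."""

    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        self.gcn_linear = nn.Linear(in_dim, hidden_dim)
        self.gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (T, N, F) historical graph signals; adj: (N, N) adjacency matrix.
        a_hat = adj + torch.eye(adj.size(0))
        a_norm = a_hat / a_hat.sum(dim=1, keepdim=True)      # row-normalized aggregation
        spatial = torch.relu(self.gcn_linear(a_norm @ x))    # (T, N, H) per-step graph conv
        per_node = spatial.permute(1, 0, 2)                  # (N, T, H)
        _, h_last = self.gru(per_node)                       # final hidden state: (1, N, H)
        return self.head(h_last.squeeze(0))                  # (N, 1) next-step prediction

x = torch.rand(12, 5, 3)                   # 12 past steps, 5 sensors, 3 features
adj = (torch.rand(5, 5) < 0.4).float()     # toy adjacency matrix
print(GCNGRUForecaster(3, 16)(x, adj).shape)  # torch.Size([5, 1])
```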
Table 1. Summary of the surveyed studies.
Study | Problem | Graph | Dataset | Model Component | Summary
[60]Road traffic flow, road traffic speedDynamic graphPeMS03, PeMS04, PeMS07, PeMS08, PeMS-BAY, METR-LAGCN, TCNDual dynamic spatial–temporal graph convolution network (DDSTGCN) is featured with a dual graph structure of traffic flow graph and its dual hypergraph to reveal more complicated latent relations.
[61]Road traffic flowDynamic graph, static graphPeMS04, PeMS08TCN, GCNSpatiotemporal adaptive graph convolutional network (STAGCN) is featured with an adaptive graph generation block to capture both the learnable long-time static road graph and the learnable short-time dynamic graph.
[62]Road traffic speed, regional bike flowDynamic graphPeMS-BAY, METR-LA, BikeNYCGAT, TCNJointGraph is featured with a network reconstructor to reconstruct the traffic graph and the ability to handle a multidataset joint training task.
[63]Metro traffic flowDynamic graph, static graphBJMF15GCN, TCNKnowledge graph representation learning and spatiotemporal graph neural network (KGR-STGNN) is featured which better captures the influence of external factors.
[64]Regional traffic flowStatic graphHaikouTaxi, ChengduTaxiGCN, TCNMultiattribute graph convolutional network (MAGCN) is featured with the consideration for area attributes and a novel matrix whose values are the functional area-based origin–destination pairs.
[65]Ride-hailing demandStatic graphRide-hailing datasets in Beijing and ShanghaiGCNThe proposed multilinear relationship GCN is characterized by multimodal coordinated representation learning and spatial feature extraction from different modalities.
[66]Road traffic flowStatic graphPeMS08, METR-LAGCN, LSTMA multiview Bayesian spatiotemporal graph neural network (MVB-STNet) is featured with a Bayesian neural network layer for handling data uncertainty with sparse and noisy data.
[67]Road traffic flowStatic graphPeMSD4, PeMSD7, PeMSD8GraphSAGE, GRUA transferable federated inductive spatial–temporal graph neural network (T-ISTGNN) is featured with the capability of cross-area traffic state forecasting when preserving the privacy of source areas.
[68]Regional taxi usageStatic graphTaxiNYCGAT, GRUA spatiotemporal heterogeneous graph attention network (STHAN) is featured with a spatiotemporal heterogeneous graph in which multiple spatial relationships and temporal relationships are modeled and metapaths are used to depict compound spatial relationships.
[69]Road traffic flowDynamic graph, static graphMETR-LA, PEMS-BAYGCN, GRUA spatiotemporal prediction framework using high-order graph convolutional network (STHGCN) is featured with a dynamic adaptive spatial graph learning module to learn the high-order dependence.
[70]Road traffic flowDynamic graphPeMS04, PeMS08GCNThe proposed CTVI+ framework uses a temporal self-attention mechanism and a multiview graph neural network for learning temporal and spatial traffic patterns.
[71]Origin–destination demandDynamic graphTaxiNYC, BikeNYC, BikeDCGAT, LSTMA temporal graph autoencoder (TGAE) is featured with a temporal network embedding framework that utilizes node representations in latent space to capture the temporal evolution of traffic networks.
[72]Regional ride-hailing demandDynamic graphUberNYC, TaxiNYCGAT, 1D-CNN, TransformerA deep multiview spatiotemporal virtual graph neural network (DMVST-VGNN) is featured with an integrated structure of GAT, 1D-CNN, and Transformer networks.
[73]Road traffic flowStatic graphPrivate dataGCN, LSTM, GANA graph convolution and generative adversarial neural network [73] is featured with a GAN structure and parallel prediction ability for multiple steps.
[74]Road traffic flow, road traffic speedDynamic graphPeMS-BayGAT, GCNA hierarchical mapping and interactive attention network (HMIAN) is featured with a hierarchical mapping structure for capturing functional zone relevance and long-distance dependence.
[75]Road traffic flowStatic graphPeMSD3, PeMSD4, PeMSD7, PeMSD8GCNThe proposed forecasting framework uses an outlier detection strategy for a real-world IoV environment.
[76]Metro passenger flowStatic graphCDmetro2018GCN, GRUA spatial–temporal multigraph convolutional wavelet network (ST-MGCWN) is featured with a graph wavelet convolution with multigraph fusion.
[77]Road traffic flowStatic graphPeMSD4, PeMSD8GCN, ConvLSTMA multidimensional attention-based spatial–temporal network (MA-STN) is featured with a multidimensional attention mechanism to capture spatial and temporal patterns.
[78]Road traffic speedStatic graphMETR-LA, PeMS-BAYGCN, TCNThe proposed approach features a multimode spatial–temporal convolution of a mixed hop diffuse ordinary differential equation (MHODE).
[79]Road traffic flowStatic graphPeMSD4, PeMSD8GCNThe proposed approach features a gated attention graph convolution model with multiple spatiotemporal channels.
[80]Road traffic speedStatic graphPeMS-BAY, METR-LAGCN, GRUThe proposed approach features a combination of time classification and GCN models.
[81]Road traffic flow, bike demand, taxi demandDynamic graph, static graphPeMSD3, PeMSD4, PeMSD7, PeMSD8, BikeNYC, TaxiNYCGCN, GRUA dual graph gated recurrent neural network (DG2RNN) is featured with a bidirectional GRU layer for learning temporal dependency and a spatial attention mechanism for learning spatial dependency.
[82]Road traffic flowStatic graphMinnesota Department of Transportation Traffic DataGCN, GRUThe proposed approach features an attribute feature unit to fuse external factors into a spatiotemporal GCN.
[83]Road traffic speedDynamic graphSeattle-Loop, METR-LAGCN, GRUA self-attention graph convolutional network with spatial, subspatial, and temporal blocks (SAGCN-SST) captures the dynamic spatial dependency with a self-attention mechanism and is robust against traffic congestion and accidents.
[84]Taxi demand, bike demandDynamic graphTaxiNYC, BikeNYCDCNN, TransformerA dynamical spatial–temporal graph neural network (DSTGNN) is featured with an inhomogeneous Poisson process to model the changing demand process and the spatial–temporal embedding network to infer the intensity.
[85]Ride-hailing demandDynamic graphTaxiNYCGCN, GRUA dynamic multigraph convolutional network with generative adversarial network (DMGC-GAN) is featured with a multigraph GCN module to learn from different dynamic OD graphs and a GAN structure to overcome the demand sparsity problem.
[86]Road traffic speedStatic graphPrivate dataGCN, GRUThe proposed approach features a GAN structure for robust data-driven traffic modeling.
[87]Road traffic flow, road traffic speedStatic graphSeattle-Loop, PeMS-BAYGAT, GANThe proposed GAT-GAN framework features a combination of first-order and high-order neighbors.
[88]Road traffic flowDynamic graphPeMSD4, PeMSD8GCN, CNNA graph and attentive multipath convolutional network (GAMCN) is featured with a novel GCN variant with road-network graph embedding and a multipath CNN module.
[89]Road traffic accidentDynamic graphNYC Open Data, PeMS-BayGCNA multiattention dynamic graph convolution network (MADGCN) is featured with multiple attention mechanisms for capturing spatial and temporal influences.
[90]Road traffic flowStatic graphPrivate dataGCN, GATThe proposed approach leverages DRL to integrate and improve GCN and GAT results.
[91]Road traffic flowStatic graphPeMS (with 97 detectors)GCNThe proposed approach features the combination of a GCN and six complex network properties.
[92]Road traffic flowDynamic graphPeMSD7, PeMSD11GCNThe proposed approach features a GCN-based data imputation module and an adaptive approach of leveraging DRL for the dynamic graph’s adjacency-matrix generation.
[93]Road traffic flowDynamic graphPeMSD4, PeMSD8GCNThe proposed CRFAST-GCN features a conditional random field (CRF)-enhanced GCN to capture the semantic similarity globally.
[94]Road traffic speedDynamic graphPeMSD8, METR-LATCN, GCNA universal framework is proposed to transform the existing one-step-ahead models to multistep-ahead models.
[95]Road traffic speedStatic graphMETR-LA, PEMS-BAYGNNThe proposed approach features a novel GNN layer with a location attention mechanism to aggregate traffic flow information from adjacent roads.
[96]Road traffic speedDynamic graphMETR-LA, PEMS-BAYDCNN, TCNSpatiotemporal sequence-to-sequence network (STSSN) is featured with an encoder-decoder structure with the joint modeling ability of spatial and temporal correlations.
[97]Road traffic flowDynamic graphPeMS, private dataGNN, LSTMAn attentive attributed recurrent graph neural network (AARGNN) is featured with the modeling of both static and dynamic factors.
[98]Road traffic flowStatic graphPeMSD4, PeMSD8GCNAn adaptive graph learning algorithm (AdapGL) is proposed to learn the complex dependencies, and the model parameters are optimized with the expectation maximization algorithm.
[99]Bike demand, taxi demandStatic graphBikeNYC, TaxiNYCGATA comodal graph attention network (CMGAT) is featured with a multiple-traffic-graph-based spatial attention mechanism and a multiple-time-period-based temporal attention mechanism.
[100]Road traffic speedDynamic graphMETR-LA, PEMS-BAYGCN, TCNAn adaptive spatiotemporal graph neural network (Ada-STNet) is featured with a dedicated spatiotemporal convolution architecture and a two-stage training strategy.
[101]Road traffic speedStatic graphPeMSD7, METR-LA, Seattle-LoopGCN, TransformerAn attention-based graph convolution network and Transformer (AGCN-T) is featured with the combination of a GCN and temporal Transformer modules.
[102]Road traffic speedDynamic graphPeMSD4, PeMSD8GCN, ConvGRUAn attention encoder–decoder dual graph convolution model with time-series correlation (AED-DGCN-TSC) is featured with the combination of a time series correlation analysis and deep learning modules.
[103]Road traffic flowDynamic graphPeMSD3, PeMSD4, PeMSD7, PeMSD8GCNAn improved dynamic Chebyshev GCN is proposed with a novel Laplacian matrix update method, the attention mechanism, and a novel feature construction method.
[104]Road traffic flowStatic graphPeMSD4, PeMSD8GCN, GLUA causal gated low-pass graph convolution neural network (CGLGCN) is featured with a causal convolution gated linear unit with less computation time and a GCN with a self-designed low-pass filter.
[105]Road traffic flowDynamic graphPeMSD4, PeMSD8GATAn attention-based spatiotemporal graph attention network (ASTGAT) is featured with multiple residual convolution and a high–low feature concatenation.
[106]Road traffic speedDynamic graphMETR-LA, PeMS-BAY, PeMS-SGCNAn attention-based dynamic spatial–temporal graph convolutional network (ADSTGCN) is featured with the combination of a dynamic adjustment module, a gated dilated convolution module, and a spatial convolution module.
[107]Road traffic flowStatic graphPeMS-LA, PeMS-BAYGCNAn attention-based spatiotemporal graph convolutional network considering external factors (ABSTGCN-EF) is featured with the combination of a GCN and attention encoder network modules and the consideration of external factors.
[108]Road traffic flowStatic graphPeMSD4, PeMSD8GCN, LSTMAn augmented multicomponent recurrent graph convolutional network (AM-RGCN) is featured with an LSTM-based temporal correlation learner that incorporates a one-dimensional convolution.
[109]Road traffic speedStatic graphTaxiSZGCN, GRUA bidirectional-graph recurrent convolutional network (Bi-GRCN) is featured with the combination of a GCN and a bidirectional GRU.
[110]Road traffic flowStatic graphPrivate dataGraphSAGE, LSTMThe proposed approach features the combination of GraphSAGE, a global temporal block, and the self-attention mechanism.
[111]Regional traffic flowStatic graphPrivate dataGCN, CNNThe proposed ConvGCN-RF features a preprocessing-encoder–decoder framework and the combination of CNN, GCN, and random forest modules.
[112]Bus demandStatic graphPrivate dataGCN, LSTMThe proposed approach features the combination of a time-dependent geographically weighted regression and graph deep learning and the consideration of dynamic-built-environment influences.
[113]Regional crowd flowStatic graphTaxiNYC, BikeNYCGAT, CNN, LSTMThe proposed approach features a semantic GAT module for learning dynamic inter-region correlations.
[114]Road traffic speedStatic graphA new open data of Seoul, South KoreaGCNA distance, direction, and positional relationship graph convolutional network (DDP-GCN) is featured with the consideration of three spatial dependencies.
[115]Road traffic flowStatic graphPeMSD3, PeMSD7, private dataDGGPThe proposed approach features novel deep graph Gaussian processes (DGGPs), which consist of the aggregation of a Gaussian process, temporal convolutional Gaussian process, and Gaussian process with a linear kernel.
[116]Road traffic flow, road traffic speedDynamic graphPeMSD3, PeMSD4, PeMSD7, PeMSD8, METR-LA, PeMS-BAYGCNA dynamic spatial–temporal adjacent graph convolutional network (DSTAGCN) is featured with the construction of a spatial–temporal graph and the integration of fuzzy systems and neural networks for uncertain relationship representation.
[117]Road traffic flow, road traffic speedDynamic graphPeMS-BAY, TaxiBJ, PeMSD4, PeMSD8GCN, GRUA dynamic spatial–temporal graph convolutional network (DSTGCN) is featured with a dynamic graph generation module with geographical proximity and spatial heterogeneity.
[118]Road traffic flowDynamic graphPeMSD3, PeMSD4, PeMSD7, PeMSD8, PeMS-SANGCNThe proposed approach features a new temporal vector CNN module and a new dynamic correlation graph construction method.
[119]Regional travel demandStatic graphTaxiNYCGCN, GRUThe proposed approach features a geographic similarity graph, functional similarity graph, and road similarity graph.
[120]Road traffic speedDynamic graphPeMS-BAY, METR-LAGCN, LSTMThe proposed EnGS-DGR model features the ensemble learning of GCN, Seq2Seq, and dynamic graph reconfiguration algorithms.
[121]Road traffic speedDynamic graphPeMS-BAY, METR-LAGCN, CNN, GRUThe embedded spatial–temporal network (ESTNet) combines multirange GCN and 3D-CNN modules for modeling spatial–temporal dependencies.
[122]Passenger OD flowStatic graphPrivate dataGCN, TCNThe proposed approach features a novel sharing-stop network to model relationships between bus passengers and various mobility patterns.
[123]Road traffic speedStatic graphPrivate dataGCN, GRUThe proposed approach features the incorporation of a wavelet transform and usage of the electronic toll collection (ETC) gantry transaction data.
[124]Road traffic flowDynamic graphPeMSD4, PeMSD8GATA fully dynamic self-attention spatiotemporal graph network (Fdsa-STG) is featured with a spatial GAT, a temporal GAT, and fusion layers to extract recent, daily, and weekly periodicity patterns.
[125]Regional traffic flowDynamic graphTaxiNYC, TaxiBJGAT, GCN, LSTMA federated deep learning based on the spatial–temporal long and short-term network (FedSTN) is featured with a recurrent long-term capture network module, attentive mechanism federated network module, and semantic capture network module to capture both spatial–temporal and semantic features.
[126]Intersection turning traffic flowStatic graphA new open data of Wuhan, ChinaGCN, GRUThe proposed approach features the modeling of turning traffic flow with a GCN and a GRU.
[127]Metro ridershipStatic graphPrivate dataGCN, LSTMAn attention-weighted multiview graph to sequence learning approach (AW-MV-G2S) is featured that learns spatial correlations from geographic distance, functional similarity, and demand pattern views.
[128]Regional traffic flowDynamic graphTaxiNYC, TaxiBJ, BikeNYCGATThe proposed approach features the multiresolution transformer network, GAT, and channel-aware recalibration residual network modules.
[129]Road traffic flowDynamic graphPeMSD3, METR-LAGATThe proposed GDFormer features a novel graph diffusing attention module to model the dynamically changing traffic flow.
[130]Road traffic speedDynamic graphPeMS-BAY, NavInfo Beijing, NavInfo ShanghaiGATThe proposed approach features a novel data-driven graph construction method.
[131]Road traffic flowDynamic graphMetro Interstate Traffic Volume Data SetGAT, LSTMA graph correlated attention recurrent neural network (GCAR) is featured with a combination of GAT, multilevel attention, and parallel LSTM modules.
[132]Road traffic speedDynamic graphQ-TrafficGATA graph sequence neural network with an attention mechanism (GSeqAtt) is featured with two attention mechanisms to capture temporal correlations and graph structures.
[133]Intersection traffic flowStatic graphQingdao Traffic DataGCNA spatial–temporal graph convolutional network (ST-GCN) is featured with an adjacent-similar algorithm and the ability to model both spatial and temporal dependencies of intersection traffic.
[134]Regional traffic speedStatic graphPrivate dataGCN, ConvLSTMThe proposed HDL4STP model features the combination of GCN, ConvLSTM, and fusion layers.
[135]Road traffic flowDynamic graph, static graphPeMSD4, PeMSD8GCN, LSTMAn improved graph convolution res-recurrent network (IGCRRN) is featured with a combination of an origin graph matrix and a data-generated embedding node matrix for spatial dependency.
[136]Bike flowStatic graphPrivate dataRelation graph networkThe proposed approach features a generalized attention mechanism to extract block features and make cross-city predictions.
[137]Subway demand, ride-hailing demandStatic graphSubwayNYC, TaxiNYCGCNA multirelational spatiotemporal graph neural network (ST-MRGNN) is featured with the multimodal demand prediction ability with multirelational GNNs.
[138]Road traffic flowStatic graphPeMSD3, PeMSD4, PeMSD7, PeMSD8, PeMS-BAYGAT, TCNA multirelational synchronous graph attention network (MS-GAT) considers multiaspect traffic data couplings and learns channel, temporal, and spatial relations with GATs.
[139]Road traffic speedDynamic graphPrivate dataGAT, CNNThe proposed HA-STGN model considers spatial–temporal heterogeneous features and contains a dynamic graph module, a time-sensitive attention mechanism, and an adaptive fusion module.
[140]Road traffic flowStatic graphPeMSD3, PeMSD4, PeMSD7, PeMSD8GCNAn adaptive graph cross-strided convolution network (AGCSCN) is featured with temporal feature extraction with a cross-strided convolution network and spatial feature extraction with an adaptive GCN.
[141]Road traffic flowStatic graphPeMSD4, PeMSD8GCN, LSTMA long-short-term-memory-embedded graph convolution network (LST-GCN) is featured with an LSTM embedding into GCNs.
[142]Road traffic speedDynamic graphDidiChengdu, METR-LAGCN, TCNA spatiotemporal adaptive gated graph convolution network (STAG-GCN) is featured with the combination of a self-attention TCN, mix-hop adaptive gated GCN, and fusion layers.
[143]Road traffic flowStatic graphPEMS03, PEMS04, PEMS07, PEMS08GCN, LSTMA memory-attention-enhanced graph convolution long short-term memory network (MAEGCLSTM) is featured with the combination of a memory attention mechanism and LSTM.
[144]Road traffic speedDynamic graph, static graphPeMS-BAYGCN, TCNA multistage spatiotemporal fusion diffusion graph convolutional network (MFDGCN) is featured with multiple static and dynamic spatiotemporal association graphs.
[145]Road traffic flowStatic graphPeMSD4, PeMSD8GCN, TCNA multihead self-attention spatiotemporal graph convolutional network (MSASGCN) is featured with the combination of a GCN, a TCN, and the multihead self-attention mechanism.
[146]Metro passenger flowStatic graphHZMF2019GCN, GAT, CNNA multitime multigraph neural network (MTMGNN) is featured with the combination of gated CNN, GCN, and GAT modules with multiple graphs.
[147]Road traffic speedStatic graphMETR-LAGCN, TCNA gated temporal graph convolution network (GT-GCN) is featured with a multistep-ahead prediction ability with GCN and gated TCN modules.
[148]Regional ride-hailing demandStatic graphPrivate dataGCN, LSTMMultigraph aggregation spatiotemporal graph convolutional network (MAST-GCN) is featured with a novel graph aggregation method.
[149]Road traffic flowStatic graphPeMSD4, PeMSD7, PeMSD8GCNThe proposed approach features a multiscale traffic prediction ability with a cross-scale GCN and temporal networks.
[150]Metro passenger flowDynamic graphPrivate dataGCN, GRUThe proposed approach proposes multifeature spatial–temporal dynamic multigraph convolutional networks for spatial and temporal connections.
[151]Road traffic speedStatic graphQ-Traffic, private dataGCN, LSTMThe proposed approach features a multifold correlation attention network to model dynamic correlations.
[152]Regional traffic flowDynamic graphTaxiNYC, BikeNYCGCN, GRUA multimode dynamic residual graph convolution network (MDRGCN) is featured with multimode dynamic GCN, GRU, and residual modules for learning cross-mode relationships.
[153]Metro passenger flowStatic graphPrivate dataGCN, GAT, LSTMA temporal graph attention convolutional neural network model (TGACN) is featured with a multigraph generation method and a new spatiotemporal feature fusion method.
[154]Road traffic speedStatic graphMETR-LA, PeMS-BAYGCN, TransformerA multiview spatial–temporal graph neural network (MVST-GNN) is featured with multiview Transformer and GCN modules.
[155]Metro flow, bus flowStatic graphPrivate dataGCNA multitask hypergraph convolutional neural network (MT-HGCN) models the correlation between different tasks with a feature-compressing unit.
[156]Regional traffic flowStatic graphTaxiSZGCN, GRUThe proposed TmS-GCN model features the combination of GCN and GRU modules.
[157]Road traffic flow, road traffic speedStatic graphPrivate dataGCN, LSTMThe proposed method features a Seq2seq GCN-LSTM framework and the usage of connected probe vehicle data.
[158]Bus passenger flowStatic graphPrivate dataGCN, LSTMThe proposed method features a bus network graph construction method and the combination of GCN and LSTM modules.
[159]Road traffic flowStatic graphPeMSD4, PeMSD8GCNThe proposed approach features the combination of graph deep learning and federated learning.
[160]Road traffic flowStatic graphPeMSD4, PeMSD7GCN, GRUA spatial–temporal attention graph convolution network on edge cloud (STAGCN-EC) is featured with the edge training approach and deep learning modules designed for low-computational-power devices.
[161]Road traffic flowStatic graphPEMSD3, PEMSD4, PEMSD7, PEMSD8GCN, TCNThe proposed approach features two semantic adjacency matrices and a dynamic aggregation method.
[162]Road traffic speedStatic graphMETR-LA, PeMS-BAYGCN, GRUA spatial–temporal upsampling graph convolutional network (STUGCN) is featured with a novel upsampling method with virtual nodes to model the global spatial–temporal correlations.
[163]Regional passenger demandStatic graphDidiCD, TaxiNYCGAT, ConvGRUThe proposed approach features the combination of GAT and ConvGRU modules.
[164]Road traffic flowStatic graphPeMSD4, PeMSD8GCN, CNNThe proposed STGMN model features the combination of a 1D CNN with channel attention and interpretable multigraph GCN modules.
[165]Metro passenger flowDynamic graphPrivate dataGCN, TrellisNetThe proposed STP-TrellisNets+ incorporates TrellisNet with graph convolution in multistep traffic prediction for the first time.
[166]Road traffic flowStatic graphPeMSD4, PeMSD8GCN, TCNA spatial–temporal global semantic graph attention convolution network (STSGAN) is featured with the usage of global geographic contextual information for urban flow prediction.
[167]Road traffic flowStatic graphPeMSD4GAT, GLUA spatiotemporal multihead graph attention network (ST-MGAT) is featured with the combination of GAT and GLU structures.
[168]Taxi demandStatic graphPrivate dataMPNNThe proposed approach features multimodal message passing and attention mechanisms.
[169]Road traffic congestionStatic graphPeMSD4GATThe proposed TCP-BAST features bilateral alternation modules with GAT, a multihead masked attention network, and temporal and spatial embedding.
[170]Road traffic flow, road traffic speed, road travel timeStatic graphTaxiBJ, METR-LA, PeMS-BAY, PeMSD4, PeMSD8GCN, GAN, GRUThe proposed approach features the combination of multigraph GCN and GAN structures.
[171]Road traffic flowDynamic graphPeMSD4, PeMSD7TCN, GCNThe proposed framework features the combination of dilated TCN, multiview GCN, and masked multihead attention modules.
[172]Road traffic speedDynamic graph, static graphMETR-LA, PeMS-BAYGCN, GRUA time-evolving graph convolutional recurrent network (TEGCRN) is featured with the combination of time-evolving and predefined graphs.
[173]Road traffic speedStatic graphSeattle-Loop, TaxiSZGCN, GRU, GANThe proposed approach features the combination of a GCN and a GAN with output distribution constraints.
[173]Road traffic speedStatic graphTaxiSZ, METR-LA, PeMS-BAYMPNN, GRUThe proposed approach features a combination of bidirectional message passing, GRU, and self-attention mechanisms.
[174]Road traffic speedDynamic graphMETR-LA, PeMS-BAYGAT, TCNThe proposed TransGAT model features an attention-based node-embedding algorithm and a gated TCN module.
[175]Regional ride-hailing demandStatic graphDidiHaikou, TaxiWuhanGCN, LSTMA multiview deep spatiotemporal network (MVDSTN) is featured with the combination of both traffic and semantic views.
[176]Road traffic flow, road traffic speedDynamic graphMETR-LA, PeMS-BAY, PeMSD4, PeMSD7TransformerAn adaptive graph spatial–temporal Transformer network (ASTTN) is featured with adaptive spatial–temporal graph modeling and local multihead self-attention.
[177]Road traffic flow, road traffic speedDynamic graphMETR-LA, PeMS-BAY, PeMSD3, PeMSD4, PeMSD7, PeMSD8GCN, TCNThe proposed approach features the neural architecture search framework for GNN and CNN modules.
[178]Road traffic speedStatic graphMETR-LA, PeMSD4GCN, TCNA spatial–temporal channel-attention-based graph convolutional network (STCAGCN) is featured with stacked dilated convolution for long-sequence modeling.
[179]Road traffic flowDynamic graph, static graphPeMSD3, PeMSD4, PeMSD7, PeMSD8GCNThe proposed approach features a cascading structure to enhance interaction and capture heterogeneity.
[180]Road traffic speedStatic graphMETR-LA, PeMS-BAY, PeMS-M, PeMSD4, PeMSD8GATA spatiotemporal graph attention network (ST-GAT) is featured with an individual spatiotemporal graph for modeling individual dependencies.
[181]Road traffic speedStatic graphMETR-LA, PeMS-BAYGCN, TCNThe proposed approach features a novel residual estimation module.
[182]Bike demandDynamic graphBikeChicago, BikeLAGNNThe approach features a novel graph generator and GNN with flow-based and attention-based aggregators.
[183]Road traffic speedDynamic graphMETR-LA, PeMS-Bay, TaxiSZGCNThe proposed approach features the decomposition of seasonal static and acyclic dynamic components for traffic prediction.
[184]Road traffic flowDynamic graphPeMSD3, PeMSD4, PeMSD7GCN, GRUAn AdaBoost spatiotemporal network (Ada-STNet) is featured with the boosting approach of stacking base models.
[185]Road traffic flow, road traffic speedDynamic graphMETR-LA, PeMS-BAY, PeMSD4, PeMSD8GCN, GRUA decoupled dynamic spatial–temporal graph neural network (D2STGNN) is featured with a decoupled spatial–temporal framework and a dynamic graph learning module.
[186]Road traffic flowDynamic graphPeMSD3, PeMSD4, PeMSD7, PeMSD8GCN, GTUA dynamic spatial–temporal-aware graph neural network (DSTAGNN) is featured with a new dynamic spatial–temporal-aware graph and a novel GNN structure.
[187]Road traffic flowStatic graphPeMSD3, PeMSD4, PeMSD7, PeMSD8GNNThe proposed approach features a first-order gradient supervision (FOGS) which uses first-order gradients for training the prediction model.
[188]Road traffic flowDynamic graphPeMSD3, PeMSD4, PeMSD7, PeMSD8GCNA spatiotemporal graph neural controlled differential equation (STG-NCDE) is featured with the incorporation of neural controlled differential equations in traffic forecasting for the first time.
[189]Road traffic flowStatic graphPeMSD4, PeMSD7GNNThis study proposes a communication-efficient federated learning approach for graph-based traffic forecasting.
[190]Road traffic flow, traffic demandStatic graphPeMSD3, PeMSD8, BikeNYC, TaxiNYCGCN, MSDRThe proposed approach is based on a graph-based multistep dependency relation (MSDR) model with the ability to learn from multiple historical time steps.
[191]Road traffic speedStatic graphDidiShenzhen, DidiChengdu, PeMS-BAY, METR-LAGCNThe proposed ST-GFSL framework features the combination of spatiotemporal traffic prediction with few-shot learning and cross-city knowledge transfer.
[192]Road traffic speedDynamic graphMETR-LA, PeMS-BAYGAT, TCNThe proposed approach features the semantic closeness relationship and traffic dynamics.
[193]Road traffic flow, road traffic speedDynamic graphMETR-LA, PeMS-BAY, PeMSD4GNNThe proposed approach enhances the performance of spatiotemporal GNNs with a pretraining model trained with very long term history data.
[194]Road traffic flowDynamic graph, static graphPeMSD4, PeMSD8GCN, GRURegularized graph structure learning (RGSL) is featured with an embedding-based implicit dense similarity matrix, a regularized graph generation method, and a Laplacian matrix mixed-up module to fuse the graphs.
[195]Road traffic flow, road traffic speedDynamic graphPeMSD8, METR-LAGCN, TCNSpatiotemporal latent graph structure learning network (ST-LGSL) is featured with a MLP-kNN-based graph generator and the combination of diffusion graph convolutions and gated TCN modules.
[196]Road traffic flowStatic graphPrivate dataGCN, GRUA spatiotemporal differential equation network (STDEN) is featured with the combination of data-driven and physics-driven approaches and the differential equation network model for modeling the spatiotemporal dynamic process.
[197]Road traffic flowDynamic graphPeMSD3, PeMSD4, PeMSD8GCN, CNN, GRUTime-aware multipersistence spatio-supragraph convolutional network (TAMP-S2GCNets) is featured with the introduction of a time-aware multipersistence Euler-Poincaré surface and a supragraph convolution model for intra- and interdependencies.
[198]Road traffic flowStatic graphPeMSD8GCN, GRU, GLUA two-stage stacked graph convolution network (ED2GCN) is featured by the stacking of a GCN, a GLU, and the attention mechanism.
[199]OD travel demandStatic graphTaxiChicago, TaxiNYCDCNN, TCNA spatial–temporal zero-inflated negative binomial graph neural network (STZINBGNN) is featured with the uncertainty quantification of the sparse travel demand with diffusion and temporal convolution networks.
[200]Road traffic speedStatic graphNAVER-Seoul, METR-LAGCNA pattern-matching memory network (PM-MemNet) is featured with a novel key–value memory structure and a pattern-matching framework.
[201]Regional traffic flowStatic graphNeurIPS Traffic4Cast Challenge DataGNNThe proposed approach features the combination of U-Net with graph learning.
[202]Road traffic speedStatic graphPeMS-BAY, METR-LAGCN, CNNThe proposed approach features a mix-hop GCN and stacked temporal attention mechanism.
[203]Road traffic flowDynamic graphPeMSD3, PeMSD4, PeMSD7, PeMSD8GCNThe proposed approach features a graph construction method for cross-time and cross-space correlations.
[204]Road traffic speedStatic graphMETR-LA, PeMS-BAYGCN, GRUThe proposed approach features a novel local context-aware spatial attention mechanism.
[205]Road traffic speedDynamic graphPeMS-BAY, private dataGCNThe proposed approach features the combination of a GCN and attention mechanism for multidimensional information aggregation.
The problems considered in Table 1 were grouped into different transportation modes, e.g., road traffic, taxis, bikes, and subways. Previous studies have also shown that joint forecasting of multimode data is beneficial [206]. GNN-based solutions are applicable and have already been used for multimode forecasting cases. In [152], a multimode dynamic residual graph convolution network (MDRGCN) model was proposed for regional taxi and bike flow forecasting, in which cross-mode relationships were learned by multimode dynamic GCN, GRU, and residual modules. In [99], a comodal graph attention network (CMGAT) was proposed for bike and taxi demand forecasting, which was based on a multiple-traffic-graph-based spatial attention mechanism and a multiple-time-period-based temporal attention mechanism. In these studies, it was demonstrated that the GNN-based joint forecasting of multimode traffic data was more effective than individual forecasts.
We also noticed that the traffic occupancy prediction problem did not appear among the studies reviewed in this paper. Some possible reasons are discussed below. Traffic occupancy is often modeled as a discrete decision variable rather than as a continuous variable such as traffic speed or volume. While GNN-based solutions have been shown to be effective in predicting continuous variables, as described in this survey, decision-tree-based models such as XGBoost and LightGBM remain powerful for making binary decisions [207,208,209]. Another possible reason is that traffic occupancy can be detected more efficiently with computer vision methods based on images or videos, in which case convolutional neural networks and Transformers still dominate [210]. A similar problem is lane-occupancy-rate prediction, which is also rare in the literature due to the high cost of collecting real-world lane-occupancy data, e.g., deploying loop detectors for each lane of large-scale road segments. For example, only simulated traffic data could be used for lane-occupancy measurement and prediction in [211].
For model evaluation and comparison, different evaluation metrics are used, e.g., the root mean square error (RMSE), mean absolute error (MAE), and mean absolute percentage error (MAPE). Forecast horizons also differ per study, such as 5, 10, 30, or 60 min; it was found that the larger the horizon, the harder the forecasting problem and the greater the observed error. Due to the different evaluation metrics and forecast horizons, it is nearly impossible to fairly compare all surveyed studies and quantify the difficulty of the available datasets. It was also found that for some common baselines, e.g., DCRNN [212], STGCN [213], and Graph WaveNet [214], the reported performance varies across studies when the training settings differ.
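For reference, the three metrics can be computed as in the plain NumPy sketch below; note that MAPE conventions differ across studies (e.g., how zero ground-truth values are masked), so reported numbers are not always directly comparable.

```python
import numpy as np

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def mape(y_true, y_pred, eps=1e-8):
    # Mask near-zero ground-truth values, a common convention for traffic data.
    mask = np.abs(y_true) > eps
    return np.mean(np.abs((y_true[mask] - y_pred[mask]) / y_true[mask])) * 100

y_true = np.array([60.0, 55.0, 0.0, 70.0])   # e.g., observed speeds (km/h)
y_pred = np.array([58.0, 57.0, 3.0, 65.0])
print(rmse(y_true, y_pred), mae(y_true, y_pred), mape(y_true, y_pred))
```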

3. New Dataset and Code Resources

This section provides up-to-date lists of open datasets and code resources for the research community.

3.1. New Datasets

Open datasets are the basis for evaluating and comparing different forecasting models [215]. As discussed in Section 2, several open datasets have been widely used in the surveyed studies, such as METR-LA, PeMS, and NYC Open Data. Despite the availability of these datasets, developing new datasets is still beneficial for two reasons. The first is the risk of deep learning models overfitting existing datasets, especially those that are relatively small compared to datasets in other domains, such as large collections of images and natural language corpora. The second is that models trained on datasets collected many years ago may suffer from data drift as traffic facilities change: the traffic patterns in the historical training data could be totally different from those in newly collected test data, and the performance of trained deep learning models can degrade significantly in such unseen cases. Therefore, we update the community with new, publicly available traffic datasets in Table 2 to facilitate future research and encourage constant updates of high-quality traffic datasets.

3.2. New Code Resources

Open-code resources facilitate the replication of published results and migration of proposed models to new problems. We summarize here the new publicly available code resources in Table 3 and list the implementation frameworks, including TensorFlow (https://www.tensorflow.org, accessed on 2 February 2023) and PyTorch (https://pytorch.org/, accessed on 2 February 2023). It is observed from Table 3 that PyTorch is more popular than TensorFlow for developing new graph neural network models in traffic forecasting research.
There are also many off-the-shelf libraries available for implementing GNNs using PyTorch or TensorFlow, e.g., PyTorch Geometric (https://pytorch-geometric.readthedocs.io/, accessed on 2 February 2023), Deep Graph Library (https://www.dgl.ai/, accessed on 2 February 2023), TensorFlow Graph Neural Networks (https://blog.tensorflow.org/2021/11/introducing-tensorflow-gnn.html, accessed on 2 February 2023), and Spektral (https://graphneural.network/, accessed on 2 February 2023). These libraries have implemented some well-known GNN variants, such as GCN and GAT, and provide the ability to define new GNN models. However, they are not designed for traffic forecasting problems. It would be more convenient to replicate those existing GNN-based traffic forecasting models with the open-code resources listed in Table 3.
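As an illustration of how little code the spatial component requires with such libraries, the following sketch defines a two-layer GCN with PyTorch Geometric's GCNConv on a toy sensor graph; the node features, graph, and dimensions are arbitrary assumptions for demonstration and do not correspond to any surveyed model.

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

class TwoLayerGCN(torch.nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, out_dim)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)   # e.g., one predicted value per sensor

# Toy graph: 4 sensors, bidirectional edges 0-1, 1-2, 2-3; 12 past readings per node.
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]], dtype=torch.long)
data = Data(x=torch.rand(4, 12), edge_index=edge_index)
model = TwoLayerGCN(in_dim=12, hidden_dim=32, out_dim=1)
print(model(data.x, data.edge_index).shape)  # torch.Size([4, 1])
```

Replicating a full traffic forecasting pipeline on top of such a layer still requires the graph construction, windowing, and training code that the repositories in Table 3 provide.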

4. Research Challenges and Opportunities

This section discusses research challenges and opportunities when applying GNNs to traffic forecasting problems in order to inspire follow-up research.

4.1. Research Challenges

Several challenges can be observed from the surveyed studies, which can be categorized into data, model, and system perspectives. From a data perspective, challenges include data quality and cold-start issues. From a model perspective, challenges include complex graph structure and model robustness concerns. From a system perspective, the real-world deployment of GNNs in transportation systems is a challenge that cannot be ignored.
The first challenge is training data quality. When utilizing graph neural networks, several data quality issues may arise. On the one hand, high-quality datasets are expensive to build, as the data collection process is time-consuming and costly. Because extreme or urgent traffic events such as traffic jams and accidents are rare, collecting comprehensive datasets is even more difficult. On the other hand, data privacy is also non-negligible if we want to create more comprehensive datasets, since most existing traffic datasets are collected from public transportation modes (e.g., taxis and shared bikes) or road sensors, rather than from private vehicles [224].
The second challenge is the cold-start problem [136] when initializing GNNs for traffic prediction. Deep learning models, including GNNs, usually require a large quantity of training data to efficiently train the model and obtain satisfactory predictions. However, data collection in the traffic field is often time-consuming and labor-intensive, for example, by installing loop detectors for traffic flow and speed information collection. The cold-start problem arises when the developed GNN models are to be used in a new area or station, especially for a growing urban network.
The third challenge is the diverse and complicated graph structures that exist in real-world traffic infrastructure. Most surveyed studies consider only dense graphs, e.g., in downtown areas or on closely connected highways, where traffic activity is intense. However, the complete traffic graph of a city may be sparse, with some nodes having no or few connections to other nodes. This real-world condition has received insufficient attention in the surveyed studies. Another limitation of the surveyed studies is that the graphs considered are relatively small, e.g., fewer than 1000 nodes. For example, the most popular PeMS datasets are a collection of subsets of a larger dataset collected from more than 40,000 individual detectors spread over a wide geographic area, since the size of the original dataset exceeds the computing capabilities of some research groups.
The fourth challenge is the robustness of GNN models. Deep learning models have long been criticized for their black-box nature, offering little or no interpretation of the predicted outcomes. This black-box problem also exists for graph neural networks, and there are few systematic methods for interpreting GNNs in traffic forecasting settings. Moreover, many anomalies or outliers are removed during data preprocessing or simply do not appear in the training dataset; when such anomalies are encountered during the testing or deployment phase, the performance of the trained GNN model degrades, leading to large deviations in model predictions. Given these risks, it is important to enhance the robustness and interpretability of GNN models to increase user confidence in them.
The fifth challenge is the real-world deployment of GNNs in transportation systems. The real-world implementation of the surveyed GNN solutions requires substantial computing, communication, and storage resources. However, most of the surveyed studies only conduct empirical evaluations on offline datasets without testing their models in real-world transportation systems. Several obstacles arise in real-world deployment. To effectively utilize graph-based structures, a centralized deployment mode is required to collect global information and compute predictions on a single server. Although deep learning models, including GNNs, can be trained offline, the online inference process still requires considerable computing and storage resources when the considered traffic graph is very large, and the communication overhead also increases as the graph grows. To achieve more efficient and safer transportation systems, complex GNN architectures may not be justified for traffic-related tasks if their marginal performance improvement fails to cover the increased computational, communication, and storage costs.

4.2. Research Opportunities

Some promising research opportunities are discussed to address the above challenges and inspire future research.
The first research opportunity is the introduction of traffic simulation tools for creating unseen complex situations as training data. Two specific directions, model-driven and data-driven approaches, can be further investigated. Model-driven approaches are based on macroscopic or microscopic traffic simulators: macroscopic tools focus on the high-level deterministic relationships among the flow, speed, and density of traffic streams, while microscopic tools model individual vehicles in detail. Data-driven approaches, in contrast, do not rely on traffic domain knowledge but generate additional samples from existing data, e.g., generative adversarial network (GAN)-based studies [85,86,87,170,225]. Regarding the black-box nature of neural networks, physics-informed neural network approaches are gaining popularity; they combine model-driven and data-driven methods and have been successfully applied in the transportation domain [226,227].
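To make the physics-informed idea concrete, the sketch below (an illustrative example of ours, not a method taken from the cited works) augments a standard data-fitting loss with a soft penalty on violations of the macroscopic identity that traffic flow equals density multiplied by speed; the model output layout and the weighting factor lam are hypothetical choices.

```python
# Illustrative physics-informed training loss: data-fitting term plus a soft penalty
# encouraging the macroscopic relation flow = density * speed on the predictions.
import torch
import torch.nn.functional as F


def physics_informed_loss(pred, target, lam: float = 0.1):
    """pred and target have shape (batch, nodes, 3): [flow, density, speed] per node."""
    data_loss = F.mse_loss(pred, target)
    flow, density, speed = pred[..., 0], pred[..., 1], pred[..., 2]
    # Residual of q = k * v; it is zero when the prediction is physically consistent.
    physics_loss = (flow - density * speed).pow(2).mean()
    return data_loss + lam * physics_loss


# Example with random tensors (2 samples, 5 road segments).
pred = torch.randn(2, 5, 3, requires_grad=True)
target = torch.randn(2, 5, 3)
physics_informed_loss(pred, target).backward()
```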
The second research opportunity is to introduce new learning schemes to traffic forecasting problems, e.g., transfer learning, meta learning, and federated learning. Transfer learning has been proven effective for transferring cross-city knowledge, which helps address the cold-start problem in new cities [191]. Meta learning has been shown to be useful for building new graph structures through efficient structure-aware learning during cross-city knowledge transfer. Privacy-preserving schemes have further been proposed in combination with transfer learning to protect sensitive information in the source domain [67]. Federated learning is another effective approach for maintaining data privacy while training effective deep learning models [159,189].
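As an illustration of the federated scheme (a simplified sketch only; real systems such as FedSTN [125] involve many additional components), the loop below averages the parameters of regional forecasting models so that only model weights, never raw traffic records, leave each region; the local training routine local_train is a hypothetical placeholder.

```python
# Simplified FedAvg sketch: each region trains a copy of the global model on its own
# private data, and only the resulting parameters are averaged on the server side.
import copy
import torch


def federated_average(global_model, client_loaders, local_train, rounds: int = 10):
    for _ in range(rounds):
        client_states = []
        for loader in client_loaders:
            local_model = copy.deepcopy(global_model)
            local_train(local_model, loader)  # placeholder: one round of local training
            client_states.append(local_model.state_dict())
        # Parameter averaging; raw data never leaves the clients.
        averaged = {
            key: torch.stack([state[key].float() for state in client_states]).mean(dim=0)
            for key in client_states[0]
        }
        global_model.load_state_dict(averaged)
    return global_model
```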
The third research opportunity is the combination of knowledge graphs built under different road conditions or for different transportation modes to establish connections among them [63]. More external data can be used when constructing traffic knowledge graphs, e.g., event calendars from social media that indicate potential traffic demand. Additionally, knowledge shared between different transportation modes, e.g., at interchange hubs, would be useful for multimodal prediction [137,152,155].
The fourth research opportunity is distributed learning for training large-scale graph neural networks for traffic forecasting [228,229]. When the application of GNNs for traffic prediction scales to larger graphs, distributed training of graph neural networks becomes necessary, and improvements in training and runtime efficiency are even more beneficial and important. A related idea is to leverage cloud computing for model training and edge computing for runtime inference [160,230] to accelerate the distributed training and inference process.
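One common building block for such scalable training is sampling-based mini-batching, sketched below with PyTorch Geometric's NeighborLoader; the synthetic graph, the small model, and all hyperparameters are placeholders, and a full distributed setup would additionally spread the sampled batches across multiple workers or machines.

```python
# Sketch of mini-batch GNN training on a large graph via neighbor sampling.
# The 10,000-node random graph stands in for a city-scale sensor network.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.loader import NeighborLoader
from torch_geometric.nn import GCNConv

num_nodes = 10_000
edge_index = torch.randint(0, num_nodes, (2, 50_000))
data = Data(x=torch.randn(num_nodes, 8), edge_index=edge_index, y=torch.randn(num_nodes, 1))

# Sample at most 10 neighbors per node for 2 hops around each seed node.
loader = NeighborLoader(data, num_neighbors=[10, 10], batch_size=512, shuffle=True)


class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(8, 32)
        self.conv2 = GCNConv(32, 1)

    def forward(self, x, edge_index):
        return self.conv2(F.relu(self.conv1(x, edge_index)), edge_index)


model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
for batch in loader:
    optimizer.zero_grad()
    out = model(batch.x, batch.edge_index)
    # Only the seed nodes (the first batch.batch_size nodes) contribute to the loss.
    loss = F.mse_loss(out[:batch.batch_size], batch.y[:batch.batch_size])
    loss.backward()
    optimizer.step()
```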
The fifth research opportunity is the Bayesian learning approach for uncertainty quantification. Uncertainty in traffic forecasting may not be as critical as in some other domains, e.g., wireless communication problems; nevertheless, it is still important to account for uncertainty in the transportation domain, where noisy or missing data can impair predictive capabilities and lead to unusual forecasts. Bayesian neural networks have been shown to be effective in dealing with data uncertainty caused by noisy or missing data in road traffic flow forecasting [66]. Another related idea is to incorporate the physical mechanisms of traffic flow dynamics as constraints, e.g., through neural controlled differential equations [188] and Poisson processes [84], to avoid unreasonable predicted values [196] and to improve model interpretability.
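A lightweight way to obtain such uncertainty estimates, shown below as an illustrative sketch rather than a method from the surveyed studies, is Monte Carlo dropout: dropout is kept active at inference time, and the spread of repeated stochastic forward passes serves as an uncertainty proxy; the small forecaster used here is a placeholder model.

```python
# Monte Carlo dropout sketch: repeated stochastic forward passes yield a point
# forecast (mean) and a simple uncertainty estimate (standard deviation).
import torch

forecaster = torch.nn.Sequential(
    torch.nn.Linear(8, 32), torch.nn.ReLU(), torch.nn.Dropout(p=0.2), torch.nn.Linear(32, 1)
)


@torch.no_grad()
def mc_dropout_predict(model, x, samples: int = 100):
    model.train()  # keep dropout layers stochastic during inference
    preds = torch.stack([model(x) for _ in range(samples)])
    return preds.mean(dim=0), preds.std(dim=0)


x = torch.randn(16, 8)  # e.g., 16 sensors with 8 historical observations each
mean, std = mc_dropout_predict(forecaster, x)  # both outputs have shape (16, 1)
```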
The sixth research opportunity is the combination of graph neural networks and reinforcement learning, which is rarely considered in the surveyed studies, with only one exception [90]. Combining the two paradigms can be mutually beneficial. For example, some relevant studies leverage reinforcement learning techniques for a more efficient graph neural network architecture search [231]. Conversely, reinforcement learning itself is useful for making optimal decisions in the traffic domain with properly designed rewards, e.g., for traffic light control and autonomous driving. There is still a large research gap in applying reinforcement learning to graph data structures [232,233].
The last but not least research opportunity is the deployment of GNNs based on cloud computing and B5G/6G communication techniques. Cloud computing can provide the required computing and storage resources, so GNN models can be trained, deployed, and updated in the cloud on a scalable infrastructure. B5G/6G communication techniques are designed to support massive machine-type communication scenarios and can be used for reliable, large-scale traffic data collection and transmission.
In summary, the first and second research opportunities are proposed to address the first and second research challenges. The third and fourth research opportunities are proposed to address the third research challenge. The fifth research opportunity is proposed to address the fourth research challenge. The last research opportunity is proposed to address the fifth research challenge.

5. Conclusions

In 2022, the number of studies applying graph neural networks to traffic forecasting grew rapidly. In this survey, we summarized the progress made by these studies and listed their targeted problems, graph types, datasets, and neural network architectures. We observed that road traffic flow and speed prediction remained the most popular traffic forecasting problems, and GNN variants such as GCN and GAT remained among the most promising solutions to them. To further motivate follow-up research, new collections of datasets and code resources were presented, and research challenges and opportunities were discussed.

Author Contributions

Conceptualization, Weiwei Jiang, Jiayun Luo, and Miao He; methodology, Weiwei Jiang, Jiayun Luo, and Miao He; software, Weiwei Jiang, Jiayun Luo, and Miao He; validation, Weiwei Jiang, Jiayun Luo, and Miao He; formal analysis, Weiwei Jiang, Jiayun Luo, and Miao He; investigation, Weiwei Jiang, Jiayun Luo, and Miao He; resources, Weiwei Jiang, Jiayun Luo, and Miao He; data curation, Weiwei Jiang, Jiayun Luo, and Miao He; writing—original draft preparation, Weiwei Jiang, Jiayun Luo, and Miao He; writing—review and editing, Weiwei Jiang, Jiayun Luo, and Miao He; visualization, Weiwei Jiang, Jiayun Luo, and Miao He; supervision, Weiwei Jiang and Weixi Gu; project administration, Weiwei Jiang and Weixi Gu; funding acquisition, Weiwei Jiang. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Fundamental Research Funds for the Central Universities.

Data Availability Statement

The data are available at https://github.com/jwwthu/GNN4Traffic, accessed on 2 February 2023.

Acknowledgments

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Abbreviation List

The abbreviations used in this manuscript are listed in Table A1 with their full names.
Table A1. Abbreviations used in this manuscript.
Abbreviation | Full Name
AARGNN [97] | Attentive attributed recurrent graph neural network
ABSTGCN-EF [107] | Attention-based spatiotemporal graph convolutional network considering external factors
ADSTGCN [106] | Attention-based dynamic spatial–temporal graph convolutional network
AED-DGCN-TSC [102] | Attention encoder–decoder dual-graph convolutional network with time series correlation
AGCN-T [101] | Attention-based graph convolution network and transformer
AGCSCN [140] | Adaptive graph cross-strided convolution network
AM-RGCN [108] | Augmented multicomponent recurrent graph convolutional network
ARIMA | Autoregressive integrated moving average
ASTGAT [105] | Attention-based spatiotemporal graph attention network
ASTTN [176] | Adaptive graph spatial–temporal transformer network
AW-MV-G2S [127] | Attention-weighted multiview graph-to-sequence learning
Ada-STNet [184] | AdaBoost spatiotemporal network
Ada-STNet [100] | Adaptive spatiotemporal graph neural network
AdapGL [98] | Adaptive graph learning
Bi-GRCN [109] | Bidirectional-graph recurrent convolutional network
CGLGCN [104] | Causal gated low-pass graph convolution neural network
CMGAT [99] | Comodal graph attention network
CNN | Convolutional neural network
CRF | Conditional random field
ConvGRU | Convolutional GRU
ConvLSTM | Convolutional LSTM
D2STGNN [185] | Decoupled dynamic spatial–temporal graph neural network
DCNN [234] | Diffusion convolutional neural network
DDP-GCN [114] | Distance, direction, and positional relationship graph convolutional network
DDSTGCN [60] | Dual dynamic spatial–temporal graph convolution network
DG2RNN [81] | Dual-graph gated recurrent neural network
DGGP [115] | Deep graph Gaussian process
DMGC-GAN [85] | Dynamic multigraph convolutional network with generative adversarial network
DMVST-VGNN [72] | Deep multiview spatiotemporal virtual graph neural network
DRL | Deep reinforcement learning
DSTAGCN [116] | Dynamic spatial–temporal adjacent graph convolutional network
DSTAGNN [186] | Dynamic spatial–temporal aware graph neural network
DSTGCN [117] | Dynamic spatial–temporal graph convolutional network
DSTGNN [84] | Dynamical spatial–temporal graph neural network
ED2GCN [198] | Two-stage stacked graph convolution network
EMD | Empirical mode decomposition
ESTNet [121] | Embedded spatial-temporal network
ETC | Electronic toll collection
FOGS [187] | First-order gradient supervision
Fdsa-STG [124] | Fully dynamic self-attention spatiotemporal graph network
FedSTN [125] | Federated-deep-learning-based on the spatial–temporal long and short-term network
GAMCN [88] | Graph and attentive multipath convolutional network
GAN | Generative adversarial network
GAT | Graph attention network
GCAR [131] | Graph correlated attention recurrent neural network
GCN | Graph Convolutional Network
GCN-GAN [73] | Graph convolution and generative adversarial neural network
GDFormer [129] | Graph diffusing Transformer
GLU | Gated linear unit
GRU | Gated recurrent unit
GSeqAtt [132] | Graph sequence neural network with an attention mechanism
GT-GCN [147] | Gated temporal graph convolution network
GTU | Gated tanh unit
HMIAN [74] | Hierarchical mapping and interactive attention network
IGCRRN [135] | Improved graph convolution res-recurrent network
ITS | Intelligent transportation systems
IoT | Internet of things
IoV | Internet of vehicles
KGR-STGNN [63] | Knowledge graph representation learning and spatiotemporal graph neural network
kNN | K-Nearest Neighbor
LST-GCN [141] | Long-short-term-memory-embedded graph convolution network
LSTM | Long short-term memory
MA-STN [77] | Multidimensional attention-based spatial-temporal network
MADGCN [89] | Multiattention dynamic graph convolution network
MAEGCLSTM [143] | Memory attention enhanced graph convolution long short-term memory network
MAGCN [64] | Multiattribute graph convolutional network
MAST-GCN [148] | Multigraph aggregation spatiotemporal graph convolutional network
MDRGCN [152] | Multimode dynamic residual graph convolution network
MFDGCN [144] | Multistage spatiotemporal fusion diffusion graph convolutional network
MG-GAN [225] | Multiple-graph-based generative adversarial network
MHODE [78] | Mixed hop diffuse ordinary differential equation
MLP | Multilayer perceptron
MPNN | Message-passing neural network
MS-GAT [138] | Multirelational synchronous graph attention network
MSASGCN [145] | Multihead self-attention spatiotemporal graph convolutional network
MSDR [190] | Multistep dependency relation network
MT-HGCN [155] | Multitask hypergraph convolutional neural network
MTMGNN [146] | Multitime multigraph neural network
MVB-STNet [66] | Multiview Bayesian spatiotemporal graph neural network
MVDSTN [175] | Multiview deep spatiotemporal network
MVST-GNN [154] | Multiview spatial–temporal graph neural network
PM-MemNet [200] | Pattern-matching memory network
RGSL [194] | Regularized graph structure learning
SAGCN-SST [83] | Self-attention graph convolutional network with spatial, subspatial, and temporal blocks
SARIMA | Seasonal autoregressive integrated moving average
ST-GAT [180] | Spatiotemporal graph attention network
ST-GCN [133] | Spatial–temporal graph convolutional network
ST-LGSL [195] | Spatiotemporal latent graph structure learning
ST-MGAT [167] | Spatiotemporal multi-head graph attention network
ST-MRGNN [137] | Multirelational spatiotemporal graph neural network
STAG-GCN [142] | Spatiotemporal adaptive gated graph convolution network
STAGCN-EC [160] | Spatial–temporal attention graph convolution network on edge cloud
STAGCN [61] | Spatiotemporal adaptive graph convolutional network
STCAGCN [178] | Spatial–temporal channel-attention-based graph convolutional network
STDEN [196] | Spatiotemporal differential equation network
STG-NCDE [188] | Spatiotemporal graph neural controlled differential equation
STHAN [68] | Spatiotemporal heterogeneous graph attention network
STHGCN [69] | Spatiotemporal prediction framework using high-order graph convolutional network
STSGAN [166] | Spatial–temporal global semantic graph attention convolution network
STSSN [96] | Spatiotemporal sequence-to-sequence network
STUGCN [161] | Spatial–temporal upsampling graph convolutional network
STZINBGNN [199] | Spatial–temporal zero-inflated negative binomial graph neural network
Seq2Seq | Sequence to sequence
T-ISTGNN [67] | Transferable federated inductive spatial-temporal graph neural network
TAMP-S2GCNets [197] | Time-aware multipersistence spatio-supragraph convolutional network
TCN | Temporal convolutional network
TEGCRN [172] | Time-evolving graph convolutional recurrent network
TGACN [64] | Temporal graph attention convolutional neural network
TGAE [71] | Temporal graph autoencoder

Appendix B. The Source Journal List

The source journals for the surveyed studies are listed in Table A2, together with the number of papers from each.
Table A2. Source journals for the surveyed papers.
Journal Name | Number of Surveyed Papers
IEEE Transactions on Intelligent Transportation Systems | 23
Information Sciences | 7
Applied Intelligence | 6
Journal of Advanced Transportation | 6
Electronics | 5
Physica A: Statistical Mechanics and its Applications | 5
ACM Transactions on Intelligent Systems and Technology | 4
Applied Sciences | 4
Knowledge-Based Systems | 4
Transportation Research Part C: Emerging Technologies | 4
Expert Systems with Applications | 3
ACM Transactions on Knowledge Discovery from Data | 2
GeoInformatica | 2
IEEE Internet of Things Journal | 2
IEEE Transactions on Knowledge and Data Engineering | 2
IET Intelligent Transport Systems | 2
ISPRS International Journal of Geo-Information | 2
Neural Computing and Applications | 2
Wireless Communications and Mobile Computing | 2
World Wide Web | 1
Applied Soft Computing | 1
Big Data | 1
Computer Communications | 1
Computers, Environment and Urban Systems | 1
Connection Science | 1
Digital Communications and Networks | 1
Digital Signal Processing | 1
Engineering Applications of Artificial Intelligence | 1
Environment, Development and Sustainability | 1
Future Generation Computer Systems | 1
IEEE Access | 1
IEEE Sensors Journal | 1
IEEE Transactions on Big Data | 1
IEEE Transactions on Neural Networks and Learning Systems | 1
IEEE Transactions on Vehicular Technology | 1
International Journal of Intelligent Systems | 1
International Journal of Machine Learning and Cybernetics | 1
Journal of King Saud University-Computer and Information Sciences | 1
Journal of Rail Transport Planning & Management | 1
Mathematics | 1
Neural Processing Letters | 1
Neurocomputing | 1
Pattern Recognition Letters | 1
Remote Sensing | 1
Sustainability | 1
Sustainable Computing: Informatics and Systems | 1
The Computer Journal | 1
Transportation Research Record | 1
Transportmetrica B: Transport Dynamics | 2

Appendix C. The Source Conference List

The source conferences for the surveyed papers are listed in Table A3, together with the number of papers from each.
Table A3. The source conferences for the surveyed papers.
Conference Name | Number of Surveyed Papers
International Joint Conference on Neural Networks (IJCNN) | 6
ACM International Conference on Information and Knowledge Management (CIKM) | 4
ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD) | 4
International Joint Conference on Artificial Intelligence (IJCAI) | 2
AAAI Conference on Artificial Intelligence (AAAI) | 2
International Conference on Learning Representations (ICLR) | 2
IEEE Symposium on Computers and Communications (ISCC) | 1
International Conference on Artificial Neural Networks (ICANN) | 1
IEEE International Conference on Data Engineering (ICDE) | 1
Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD) | 1
IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) | 1
International Conference on Very Large Databases (VLDB) | 1
International Conference on Machine Learning (ICML) | 1
IEEE Wireless Communications and Networking Conference (WCNC) | 1
International Conference on Database Systems for Advanced Applications (DASFAA) | 1
IEEE International Conference on Computer Supported Cooperative Work in Design (CSCWD) | 1

References

  1. Zlatanova, S.; Yan, J.; Wang, Y.; Diakité, A.; Isikdag, U.; Sithole, G.; Barton, J. Spaces in spatial science and urban applications—State of the art review. ISPRS Int. J. Geo-Inf. 2020, 9, 58. [Google Scholar] [CrossRef] [Green Version]
  2. Rehman, A.; Haseeb, K.; Saba, T.; Lloret, J.; Ahmed, Z. Towards resilient and secure cooperative behavior of intelligent transportation system using sensor technologies. IEEE Sens. J. 2022, 22, 7352–7360. [Google Scholar] [CrossRef]
  3. Liu, C.; Xiao, Z.; Wang, D.; Wang, L.; Jiang, H.; Chen, H.; Yu, J. Exploiting Spatiotemporal Correlations of Arrive-Stay-Leave Behaviors for Private Car Flow Prediction. IEEE Trans. Netw. Sci. Eng. 2022, 9, 834–847. [Google Scholar] [CrossRef]
  4. Wang, L.; Wang, S.; Yuan, Z.; Peng, L. Analyzing potential tourist behavior using PCA and modified affinity propagation clustering based on Baidu index: Taking Beijing city as an example. Data Sci. Manag. 2021, 2, 12–19. [Google Scholar] [CrossRef]
  5. Ahangar, M.N.; Ahmed, Q.Z.; Khan, F.A.; Hafeez, M. A survey of autonomous vehicles: Enabling communication technologies and challenges. Sensors 2021, 21, 706. [Google Scholar] [CrossRef] [PubMed]
  6. Xiao, Z.; Xiao, D.; Havyarimana, V.; Jiang, H.; Liu, D.; Wang, D.; Zeng, F. Toward accurate vehicle state estimation under non-Gaussian noises. IEEE Internet Things J. 2019, 6, 10652–10664. [Google Scholar] [CrossRef]
  7. Vlahogianni, E.I.; Karlaftis, M.G.; Golias, J.C. Short-term traffic forecasting: Where we are and where we’re going. Transp. Res. Part C Emerg. Technol. 2014, 43, 3–19. [Google Scholar] [CrossRef]
  8. Vlahogianni, E.I.; Golias, J.C.; Karlaftis, M.G. Short-term traffic forecasting: Overview of objectives and methods. Transp. Rev. 2004, 24, 533–557. [Google Scholar] [CrossRef]
  9. Wang, D.; Wang, C.; Xiao, J.; Xiao, Z.; Chen, W.; Havyarimana, V. Bayesian optimization of support vector machine for regression prediction of short-term traffic flow. Intell. Data Anal. 2019, 23, 481–497. [Google Scholar] [CrossRef]
  10. Xiao, J.; Xiao, Z.; Wang, D.; Bai, J.; Havyarimana, V.; Zeng, F. Short-term traffic volume prediction by ensemble learning in concept drifting environments. Knowl.-Based Syst. 2019, 164, 213–225. [Google Scholar] [CrossRef]
  11. Ermagun, A.; Levinson, D. Spatiotemporal traffic forecasting: Review and proposed directions. Transp. Rev. 2018, 38, 786–814. [Google Scholar] [CrossRef]
  12. Yu, R.; Li, Y.; Shahabi, C.; Demiryurek, U.; Liu, Y. Deep learning: A generic approach for extreme condition traffic forecasting. In Proceedings of the 2017 SIAM International Conference on Data Mining, SIAM, Houston, TX, USA, 27–29 April 2017; pp. 777–785. [Google Scholar]
  13. Long, W.; Xiao, Z.; Wang, D.; Jiang, H.; Chen, J.; Li, Y.; Alazab, M. Unified Spatial-Temporal Neighbor Attention Network for Dynamic Traffic Prediction. IEEE Trans. Veh. Technol. 2023, 72, 1515–1529. [Google Scholar] [CrossRef]
  14. Liu, Z.; Zhang, R.; Wang, C.; Xiao, Z.; Jiang, H. Spatial-temporal conv-sequence learning with accident encoding for traffic flow prediction. IEEE Trans. Netw. Sci. Eng. 2022, 9, 1765–1775. [Google Scholar] [CrossRef]
  15. He, M.; Gu, W.; Kong, Y.; Zhang, L.; Spanos, C.J.; Mosalam, K.M. Causalbg: Causal recurrent neural network for the blood glucose inference with IoT platform. IEEE Internet Things J. 2019, 7, 598–610. [Google Scholar] [CrossRef]
  16. Lana, I.; Del Ser, J.; Velez, M.; Vlahogianni, E.I. Road traffic forecasting: Recent advances and new challenges. IEEE Intell. Transp. Syst. Mag. 2018, 10, 93–109. [Google Scholar] [CrossRef]
  17. Xiao, Z.; Fang, H.; Jiang, H.; Bai, J.; Havyarimana, V.; Chen, H.; Jiao, L. Understanding private car aggregation effect via spatio-temporal analysis of trajectory data. IEEE Trans. Cybern. 2021. Early Access. [Google Scholar] [CrossRef] [PubMed]
  18. Xiao, Z.; Fang, H.; Jiang, H.; Bai, J.; Havyarimana, V.; Chen, H. Understanding urban area attractiveness based on private car trajectory data using a deep learning approach. IEEE Trans. Intell. Transp. Syst. 2022, 23, 12343–12352. [Google Scholar] [CrossRef]
  19. Ghosh, B.; Basu, B.; O’Mahony, M. Multivariate short-term traffic flow forecasting using time-series analysis. IEEE Trans. Intell. Transp. Syst. 2009, 10, 246–254. [Google Scholar] [CrossRef]
  20. Lippi, M.; Bertini, M.; Frasconi, P. Short-term traffic flow forecasting: An experimental comparison of time-series analysis and supervised learning. IEEE Trans. Intell. Transp. Syst. 2013, 14, 871–882. [Google Scholar] [CrossRef]
  21. Wang, H.; Liu, L.; Qian, Z.; Wei, H.; Dong, S. Empirical mode decomposition–autoregressive integrated moving average: Hybrid short-term traffic speed prediction model. Transp. Res. Rec. 2014, 2460, 66–76. [Google Scholar] [CrossRef]
  22. Zhang, J.; Zheng, Y.; Qi, D. Deep spatio-temporal residual networks for citywide crowd flows prediction. In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017; Volume 31, pp. 1655–1661. [Google Scholar]
  23. Jiang, W. TaxiBJ21: An open crowd flow dataset based on Beijing taxi GPS trajectories. Internet Technol. Lett. 2022, 5, e297. [Google Scholar] [CrossRef]
  24. Guo, S.; Lin, Y.; Feng, N.; Song, C.; Wan, H. Attention based spatial-temporal graph convolutional networks for traffic flow forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HA, USA, 27 January 2019; Volume 33, pp. 922–929. [Google Scholar]
  25. Sun, B.; Zhao, D.; Shi, X.; He, Y. Modeling global spatial–temporal graph attention network for traffic prediction. IEEE Access 2021, 9, 8581–8594. [Google Scholar] [CrossRef]
  26. Ang, K.L.M.; Seng, J.K.P.; Ngharamike, E.; Ijemaru, G.K. Emerging Technologies for Smart Cities’ Transportation: Geo-Information, Data Analytics and Machine Learning Approaches. ISPRS Int. J. Geo-Inf. 2022, 11, 85. [Google Scholar] [CrossRef]
  27. Jiang, W. Vehicle Destination Prediction with Spatial Clustering and Machine Learning. Internet Technol. Lett. 2022, e403. [Google Scholar] [CrossRef]
  28. Jiang, W.; Zhang, L. Geospatial data to images: A deep-learning framework for traffic forecasting. Tsinghua Sci. Technol. 2018, 24, 52–64. [Google Scholar] [CrossRef]
  29. Khan, Z.; Khan, S.M.; Dey, K.; Chowdhury, M. Development and evaluation of recurrent neural network-based models for hourly traffic volume and annual average daily traffic prediction. Transp. Res. Rec. 2019, 2673, 489–503. [Google Scholar] [CrossRef]
  30. Jiang, W. Applications of deep learning in stock market prediction: Recent progress. Expert Syst. Appl. 2021, 184, 115537. [Google Scholar] [CrossRef]
  31. Santhosh, M.; Venkaiah, C.; Vinod Kumar, D. Current advances and approaches in wind speed and wind power forecasting for improved renewable energy integration: A review. Eng. Rep. 2020, 2, e12178. [Google Scholar] [CrossRef]
  32. Rajbhandari, Y.; Marahatta, A.; Ghimire, B.; Shrestha, A.; Gachhadar, A.; Thapa, A.; Chapagain, K.; Korba, P. Impact study of temperature on the time series electricity demand of urban nepal for short-term load forecasting. Appl. Syst. Innov. 2021, 4, 43. [Google Scholar] [CrossRef]
  33. Shankarnarayan, V.K.; Ramakrishna, H. Comparative study of three stochastic future weather forecast approaches: A case study. Data Sci. Manag. 2021, 3, 3–12. [Google Scholar] [CrossRef]
  34. Zhao, E.; Sun, S.; Wang, S. New developments in wind energy forecasting with artificial intelligence and big data: A scientometric insight. Data Sci. Manag. 2022, 5, 84–95. [Google Scholar] [CrossRef]
  35. Jiang, W. Internet traffic prediction with deep neural networks. Internet Technol. Lett. 2022, 5, e314. [Google Scholar] [CrossRef]
  36. Jiang, W. Internet traffic matrix prediction with convolutional LSTM neural network. Internet Technol. Lett. 2022, 5, e322. [Google Scholar] [CrossRef]
  37. Sousa, M.; Tomé, A.M.; Moreira, J. Long-term forecasting of hourly retail customer flow on intermittent time series with multiple seasonality. Data Sci. Manag. 2022, 5, 137–148. [Google Scholar] [CrossRef]
  38. Jiang, W. Deep learning based short-term load forecasting incorporating calendar and weather information. Internet Technol. Lett. 2022, 5, e383. [Google Scholar] [CrossRef]
  39. Zhuang, X.; Yu, Y.; Chen, A. A combined forecasting method for intermittent demand using the automotive aftermarket data. Data Sci. Manag. 2022, 5, 43–56. [Google Scholar] [CrossRef]
  40. Jiang, W. Cellular traffic prediction with machine learning: A survey. Expert Syst. Appl. 2022, 201, 117163. [Google Scholar] [CrossRef]
  41. Zhan, X.; Zhang, S.; Szeto, W.Y.; Chen, X. Multi-step-ahead traffic speed forecasting using multi-output gradient boosting regression tree. J. Intell. Transp. Syst. 2020, 24, 125–141. [Google Scholar] [CrossRef]
  42. Feng, B.; Xu, J.; Zhang, Y.; Lin, Y. Multi-step traffic speed prediction based on ensemble learning on an urban road network. Appl. Sci. 2021, 11, 4423. [Google Scholar] [CrossRef]
  43. Li, X.; Xu, Y.; Zhang, X.; Shi, W.; Yue, Y.; Li, Q. Improving short-term bike sharing demand forecast through an irregular convolutional neural network. Transp. Res. Part C Emerg. Technol. 2023, 147, 103984. [Google Scholar] [CrossRef]
  44. Ye, J.; Zhao, J.; Ye, K.; Xu, C. How to build a graph-based deep learning architecture in traffic domain: A survey. IEEE Trans. Intell. Transp. Syst. 2022, 23, 3904–3924. [Google Scholar] [CrossRef]
  45. Jiang, W.; Luo, J. Graph neural network for traffic forecasting: A survey. Expert Syst. Appl. 2022, 207, 117921. [Google Scholar] [CrossRef]
  46. Tedjopurnomo, D.A.; Bao, Z.; Zheng, B.; Choudhury, F.; Qin, A.K. A survey on modern deep neural network for traffic prediction: Trends, methods and challenges. IEEE Trans. Knowl. Data Eng. 2020, 34, 1544–1561. [Google Scholar] [CrossRef]
  47. Boukerche, A.; Wang, J. Machine Learning-based traffic prediction models for Intelligent Transportation Systems. Comput. Netw. 2020, 181, 107530. [Google Scholar] [CrossRef]
  48. Boukerche, A.; Tao, Y.; Sun, P. Artificial intelligence-based vehicular traffic flow prediction methods for supporting intelligent transportation systems. Comput. Netw. 2020, 182, 107484. [Google Scholar] [CrossRef]
  49. Manibardo, E.L.; Laña, I.; Del Ser, J. Deep learning for road traffic forecasting: Does it make a difference? IEEE Trans. Intell. Transp. Syst. 2021, 23, 6164–6188. [Google Scholar] [CrossRef]
  50. Lee, K.; Eo, M.; Jung, E.; Yoon, Y.; Rhee, W. Short-term traffic prediction with deep neural networks: A survey. IEEE Access 2021, 9, 54739–54756. [Google Scholar] [CrossRef]
  51. Yin, X.; Wu, G.; Wei, J.; Shen, Y.; Qi, H.; Yin, B. Deep learning on traffic prediction: Methods, analysis and future directions. IEEE Trans. Intell. Transp. Syst. 2021, 23, 4927–4943. [Google Scholar] [CrossRef]
  52. Jiang, W.; Luo, J. Big data for traffic estimation and prediction: A survey of data and tools. Appl. Syst. Innov. 2022, 5, 23. [Google Scholar] [CrossRef]
  53. Jiang, W. Bike sharing usage prediction with deep learning: A survey. Neural Comput. Appl. 2022, 34, 15369–15385. [Google Scholar] [CrossRef] [PubMed]
  54. Kipf, T.N.; Welling, M. Semi-supervised classification with graph convolutional networks. In Proceedings of the International Conference on Learning Representations (ICLR ’17), Toulon, France, 24–26 April 2017. [Google Scholar]
  55. Veličković, P.; Cucurull, G.; Casanova, A.; Romero, A.; Liò, P.; Bengio, Y. Graph Attention Networks. In Proceedings of the International Conference on Learning Representations, Vancouver, BC, Canada, 30 April–3 May 2018. [Google Scholar]
  56. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, L.; Polosukhin, I. Attention is all you need. In Advances in Neural Information Processing Systems; MIT Press: Long Beach, CA, USA, 2017; Volume 30. [Google Scholar]
  57. Wu, Z.; Pan, S.; Chen, F.; Long, G.; Zhang, C.; Philip, S.Y. A comprehensive survey on graph neural networks. IEEE Trans. Neural Netw. Learn. Syst. 2020, 32, 4–24. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  58. Zhang, Z.; Cui, P.; Zhu, W. Deep learning on graphs: A survey. IEEE Trans. Knowl. Data Eng. 2020, 34, 249–270. [Google Scholar] [CrossRef] [Green Version]
  59. Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond efficient transformer for long sequence time-series forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Virtual, 8–9 February 2021; Volume 35, pp. 11106–11115. [Google Scholar]
  60. Sun, Y.; Jiang, X.; Hu, Y.; Duan, F.; Guo, K.; Wang, B.; Gao, J.; Yin, B. Dual Dynamic Spatial-Temporal Graph Convolution Network for Traffic Prediction. IEEE Trans. Intell. Transp. Syst. 2022, 23, 23680–23693. [Google Scholar] [CrossRef]
  61. Ma, Q.; Sun, W.; Gao, J.; Ma, P.; Shi, M. Spatio-temporal adaptive graph convolutional networks for traffic flow forecasting. IET Intell. Transp. Syst. 2022. Early View. [Google Scholar] [CrossRef]
  62. Kong, X.; Wei, X.; Zhang, J.; Xing, W.; Lu, W. JointGraph: Joint pre-training framework for traffic forecasting with spatial-temporal gating diffusion graph attention network. Appl. Intell. 2022, 1–18. [Google Scholar] [CrossRef]
  63. Wang, S.; Lv, Y.; Peng, Y.; Piao, X.; Zhang, Y. Metro Traffic Flow Prediction via Knowledge Graph and Spatiotemporal Graph Neural Network. J. Adv. Transp. 2022. [Google Scholar] [CrossRef]
  64. Wang, Y.; Zhao, A.; Li, J.; Lv, Z.; Dong, C.; Li, H. Multi-attribute Graph Convolution Network for Regional Traffic Flow Prediction. Neural Process. Lett. 2022, 1–27. [Google Scholar] [CrossRef]
  65. Zhang, L.; Geng, X.; Qin, Z.; Wang, H.; Wang, X.; Zhang, Y.; Liang, J.; Wu, G.; Song, X.; Wang, Y. Multi-modal graph interaction for multi-graph convolution network in urban spatiotemporal forecasting. Sustainability 2022, 14, 12397. [Google Scholar] [CrossRef]
  66. Xia, J.; Wang, S.; Wang, X.; Xia, M.; Xie, K.; Cao, J. Multi-view Bayesian spatio-temporal graph neural networks for reliable traffic flow prediction. Int. J. Mach. Learn. Cybern. 2022, 1–14. [Google Scholar] [CrossRef]
  67. Qi, Y.; Wu, J.; Bashir, A.K.; Lin, X.; Yang, W.; Alshehri, M.D. Privacy-Preserving Cross-Area Traffic Forecasting in ITS: A Transferable Spatial-Temporal Graph Neural Network Approach. IEEE Trans. Intell. Transp. Syst. 2022. [Google Scholar] [CrossRef]
  68. Ling, S.; Yu, Z.; Cao, S.; Zhang, H.; Hu, S. STHAN: Transportation Demand Forecasting with Compound Spatio-Temporal Relationships. ACM Trans. Knowl. Discov. Data (TKDD) 2022, 17, 1–23. [Google Scholar] [CrossRef]
  69. Wang, J.; Wang, W.; Yu, W.; Liu, X.; Jia, K.; Li, X.; Zhong, M.; Sun, Y.; Xu, Y. STHGCN: A spatiotemporal prediction framework based on higher-order graph convolution networks. Knowl.-Based Syst. 2022, 258, 109985. [Google Scholar] [CrossRef]
  70. Dai, S.; Wang, J.; Huang, C.; Yu, Y.; Dong, J. Dynamic Multi-View Graph Neural Networks for Citywide Traffic Inference. ACM Trans. Knowl. Discov. Data (TKDD) 2022, 17, 1–22. [Google Scholar] [CrossRef]
  71. Wang, Q.; Jiang, H.; Qiu, M.; Liu, Y.; Ye, D. TGAE: Temporal Graph Autoencoder for Travel Forecasting. IEEE Trans. Intell. Transp. Syst. 2022. [Google Scholar] [CrossRef]
  72. Jin, G.; Xi, Z.; Sha, H.; Feng, Y.; Huang, J. Deep multi-view graph-based network for citywide ride-hailing demand prediction. Neurocomputing 2022, 510, 79–94. [Google Scholar] [CrossRef]
  73. Zheng, H.; Li, X.; Li, Y.; Yan, Z.; Li, T. GCN-GAN: Integrating Graph Convolutional Network and Generative Adversarial Network for Traffic Flow Prediction. IEEE Access 2022, 10, 94051–94062. [Google Scholar] [CrossRef]
  74. Sun, J.; Peng, M.; Jiang, H.; Hong, Q.; Sun, Y. HMIAN: A Hierarchical Mapping and Interactive Attention Data Fusion Network for Traffic Forecasting. IEEE Internet Things J. 2022, 9, 25685–25697. [Google Scholar] [CrossRef]
  75. Djenouri, Y.; Belhadi, A.; Srivastava, G.; Lin, J.C.W. Hybrid graph convolution neural network and branch-and-bound optimization for traffic flow forecasting. Future Gener. Comput. Syst. 2023, 139, 100–108. [Google Scholar] [CrossRef]
  76. Xiu, C.; Sun, Y.; Peng, Q. Modelling traffic as multi-graph signals: Using domain knowledge to enhance the network-level passenger flow prediction in metro systems. J. Rail Transp. Plan. Manag. 2022, 24, 100342. [Google Scholar] [CrossRef]
  77. Xu, G.; Hu, X. Multi-Dimensional Attention Based Spatial-Temporal Networks for Traffic Forecasting. Wirel. Commun. Mob. Comput. 2022, 2022, 1358535. [Google Scholar] [CrossRef]
  78. Huang, X.; Lan, Y.; Ye, Y.; Wang, J.; Jiang, Y. Traffic Flow Prediction Based on Multi-Mode Spatial-Temporal Convolution of Mixed Hop Diffuse ODE. Electronics 2022, 11, 3012. [Google Scholar] [CrossRef]
  79. Ge, Y.; Zhai, J.F.; Su, P.C. Traffic Flow Prediction Based on Multi-Spatiotemporal Attention Gated Graph Convolution Network. J. Adv. Transp. 2022, 2022, 2723101. [Google Scholar] [CrossRef]
  80. Pan, X.; Hou, F.; Li, S. Traffic Speed Prediction Based on Time Classification in Combination With Spatial Graph Convolutional Network. IEEE Trans. Intell. Transp. Syst. 2022. [Google Scholar] [CrossRef]
  81. Zhao, J.; Chen, C.; Liao, C.; Huang, H.; Ma, J.; Pu, H.; Luo, J.; Zhu, T.; Wang, S. 2F-TP: Learning Flexible Spatiotemporal Dependency for Flexible Traffic Prediction. IEEE Trans. Intell. Transp. Syst. 2022. [Google Scholar] [CrossRef]
  82. Qi, X.; Mei, G.; Tu, J.; Xi, N.; Piccialli, F. A Deep Learning Approach for Long-Term Traffic Flow Prediction with Multifactor Fusion Using Spatiotemporal Graph Convolutional Network. IEEE Trans. Intell. Transp. Syst. 2022. [Google Scholar] [CrossRef]
  83. Zheng, G.; Chai, W.K.; Katos, V. A dynamic spatial–temporal deep learning framework for traffic speed prediction on large-scale road networks. Expert Syst. Appl. 2022, 195, 116585. [Google Scholar] [CrossRef]
  84. Huang, F.; Yi, P.; Wang, J.; Li, M.; Peng, J.; Xiong, X. A dynamical spatial-temporal graph neural network for traffic demand prediction. Inf. Sci. 2022, 594, 286–304. [Google Scholar] [CrossRef]
  85. Huang, Z.; Zhang, W.; Wang, D.; Yin, Y. A GAN framework-based dynamic multi-graph convolutional network for origin–destination-based ride-hailing demand prediction. Inf. Sci. 2022, 601, 129–146. [Google Scholar] [CrossRef]
  86. Jin, J.; Rong, D.; Zhang, T.; Ji, Q.; Guo, H.; Lv, Y.; Ma, X.; Wang, F.Y. A GAN-Based Short-Term Link Traffic Prediction Approach for Urban Road Networks Under a Parallel Learning Framework. IEEE Trans. Intell. Transp. Syst. 2022, 23, 16185–16196. [Google Scholar] [CrossRef]
  87. Xu, D.; Lin, Z.; Zhou, L.; Li, H.; Niu, B. A GATs-GAN framework for road traffic states forecasting. Transp. B Transp. Dyn. 2022, 10, 718–730. [Google Scholar] [CrossRef]
  88. Qi, J.; Zhao, Z.; Tanin, E.; Cui, T.; Nassir, N.; Sarvi, M. A Graph and Attentive Multi-Path Convolutional Network for Traffic Prediction. IEEE Trans. Knowl. Data Eng. 2022. [Google Scholar] [CrossRef]
  89. Wu, M.; Jia, H.; Luo, D.; Luo, H.; Zhao, F.; Li, G. A multi-attention dynamic graph convolution network with cost-sensitive learning approach to road-level and minute-level traffic accident prediction. IET Intell. Transp. Syst. 2022. [Google Scholar] [CrossRef]
  90. Shang, P.; Liu, X.; Yu, C.; Yan, G.; Xiang, Q.; Mi, X. A new ensemble deep graph reinforcement learning network for spatio-temporal traffic volume forecasting in a freeway network. Digit. Signal Process. 2022, 123, 103419. [Google Scholar] [CrossRef]
  91. Hu, Z.; Shao, F.; Sun, R. A New Perspective on Traffic Flow Prediction: A Graph Spatial-Temporal Network with Complex Network Information. Electronics 2022, 11, 2432. [Google Scholar] [CrossRef]
  92. Chen, Y.; Chen, X.M. A novel reinforced dynamic graph convolutional network model with data imputation for network-wide traffic flow prediction. Transp. Res. Part C Emerg. Technol. 2022, 143, 103820. [Google Scholar] [CrossRef]
  93. Diao, C.; Zhang, D.; Liang, W.; Li, K.C.; Hong, Y.; Gaudiot, J.L. A Novel Spatial-Temporal Multi-Scale Alignment Graph Neural Network Security Model for Vehicles Prediction. IEEE Trans. Intell. Transp. Syst. 2022, 24, 904–914. [Google Scholar] [CrossRef]
  94. Liu, F.; Wang, J.; Tian, J.; Zhuang, D.; Miranda-Moreno, L.; Sun, L. A Universal Framework of Spatiotemporal Bias Block for Long-Term Traffic Forecasting. IEEE Trans. Intell. Transp. Syst. 2022, 23, 19064–19075. [Google Scholar] [CrossRef]
  95. Li, Y.; Zhao, W.; Fan, H. A Spatio-Temporal Graph Neural Network Approach for Traffic Flow Prediction. Mathematics 2022, 10, 1754. [Google Scholar] [CrossRef]
  96. Cao, S.; Wu, L.; Wu, J.; Wu, D.; Li, Q. A spatio-temporal sequence-to-sequence network for traffic flow prediction. Inf. Sci. 2022, 610, 185–203. [Google Scholar] [CrossRef]
  97. Chen, L.; Shao, W.; Lv, M.; Chen, W.; Zhang, Y.; Yang, C. AARGNN: An Attentive Attributed Recurrent Graph Neural Network for Traffic Flow Prediction Considering Multiple Dynamic Factors. IEEE Trans. Intell. Transp. Syst. 2022, 23, 17201–17211. [Google Scholar] [CrossRef]
  98. Zhang, W.; Zhu, F.; Lv, Y.; Tan, C.; Liu, W.; Zhang, X.; Wang, F.Y. AdapGL: An adaptive graph learning algorithm for traffic prediction based on spatiotemporal neural networks. Transp. Res. Part C Emerg. Technol. 2022, 139, 103659. [Google Scholar] [CrossRef]
  99. Xu, H.; Zou, T.; Liu, M.; Qiao, Y.; Wang, J.; Li, X. Adaptive Spatiotemporal Dependence Learning for Multi-Mode Transportation Demand Prediction. IEEE Trans. Intell. Transp. Syst. 2022, 23, 18632–18642. [Google Scholar] [CrossRef]
  100. Ta, X.; Liu, Z.; Hu, X.; Yu, L.; Sun, L.; Du, B. Adaptive Spatio-temporal Graph Neural Network for traffic forecasting. Knowl.-Based Syst. 2022, 242, 108199. [Google Scholar] [CrossRef]
  101. Feng, J.; Yu, L.; Ma, R. AGCN-T: A Traffic Flow Prediction Model for Spatial-Temporal Network Dynamics. J. Adv. Transp. 2022, 2022, 1217588. [Google Scholar] [CrossRef]
  102. Zhao, S.; Li, X. An Attention Encoder-Decoder Dual Graph Convolutional Network with Time Series Correlation for Multi-Step Traffic Flow Prediction. J. Adv. Transp. 2022, 2022, 7682274. [Google Scholar] [CrossRef]
  103. Liao, L.; Hu, Z.; Zheng, Y.; Bi, S.; Zou, F.; Qiu, H.; Zhang, M. An improved dynamic Chebyshev graph convolution network for traffic flow prediction with spatial-temporal attention. Appl. Intell. 2022, 52, 16104–16116. [Google Scholar] [CrossRef]
  104. Xu, X.; Mao, H.; Zhao, Y.; Lü, X. An Urban Traffic Flow Fusion Network Based on a Causal Spatiotemporal Graph Convolution Network. Appl. Sci. 2022, 12, 7010. [Google Scholar] [CrossRef]
  105. Wang, Y.; Jing, C.; Xu, S.; Guo, T. Attention based spatiotemporal graph attention networks for traffic flow forecasting. Inf. Sci. 2022, 607, 869–883. [Google Scholar] [CrossRef]
  106. Zhao, J.; Liu, Z.; Sun, Q.; Li, Q.; Jia, X.; Zhang, R. Attention-based dynamic spatial-temporal graph convolutional networks for traffic speed forecasting. Expert Syst. Appl. 2022, 204, 117511. [Google Scholar] [CrossRef]
  107. Ye, J.; Xue, S.; Jiang, A. Attention-based spatio-temporal graph convolutional network considering external factors for multi-step traffic flow prediction. Digit. Commun. Netw. 2022, 8, 343–350. [Google Scholar] [CrossRef]
  108. Zhang, C.; Zhou, H.Y.; Qiu, Q.; Jian, Z.; Zhu, D.; Cheng, C.; He, L.; Liu, G.; Wen, X.; Hu, R. Augmented Multi-Component Recurrent Graph Convolutional Network for Traffic Flow Forecasting. ISPRS Int. J. Geo-Inf. 2022, 11, 88. [Google Scholar] [CrossRef]
  109. Jiang, W.; Xiao, Y.; Liu, Y.; Liu, Q.; Li, Z. Bi-GRCN: A Spatio-Temporal Traffic Flow Prediction Model Based on Graph Neural Network. J. Adv. Transp. 2022, 2022, 5221362. [Google Scholar] [CrossRef]
  110. Hu, C.; Ning, B.; Gu, Q.; Qu, J.; Jeon, S.; Du, B. Big data analytics-based traffic flow forecasting using inductive spatial-temporal network. Environ. Dev. Sustain. 2022, 1–17. [Google Scholar] [CrossRef]
  111. Yin, G.; Huang, Z.; Bao, Y.; Wang, H.; Li, L.; Ma, X.; Zhang, Y. ConvGCN-RF: A hybrid learning model for commuting flow prediction considering geographical semantics and neighborhood effects. GeoInformatica 2022. [Google Scholar] [CrossRef]
  112. Zhao, T.; Huang, Z.; Tu, W.; He, B.; Cao, R.; Cao, J.; Li, M. Coupling graph deep learning and spatial-temporal influence of built environment for short-term bus travel demand prediction. Comput. Environ. Urban Syst. 2022, 94, 101776. [Google Scholar] [CrossRef]
  113. Li, F.; Feng, J.; Yan, H.; Jin, D.; Li, Y. Crowd Flow Prediction for irregular Regions with Semantic Graph Attention Network. ACM Trans. Intell. Syst. Technol. (TIST) 2022, 13, 1–14. [Google Scholar] [CrossRef]
  114. Lee, K.; Rhee, W. DDP-GCN: Multi-graph convolutional network for spatiotemporal traffic forecasting. Transp. Res. Part C Emerg. Technol. 2022, 134, 103466. [Google Scholar] [CrossRef]
  115. Jiang, Y.; Fan, J.; Liu, Y.; Zhang, X. Deep Graph Gaussian Processes for Short-Term Traffic Flow Forecasting From Spatiotemporal Data. IEEE Trans. Intell. Transp. Syst. 2022, 23, 20177–20186. [Google Scholar] [CrossRef]
  116. Zheng, Q.; Zhang, Y. DSTAGCN: Dynamic Spatial-Temporal Adjacent Graph Convolutional Network for Traffic Forecasting. IEEE Trans. Big Data 2022, 9, 241–253. [Google Scholar] [CrossRef]
  117. Hu, J.; Lin, X.; Wang, C. DSTGCN: Dynamic Spatial-Temporal Graph Convolutional Network for Traffic Prediction. IEEE Sens. J. 2022, 22, 13116–13124. [Google Scholar] [CrossRef]
  118. Zhang, W.; Zhu, K.; Zhang, S.; Chen, Q.; Xu, J. Dynamic graph convolutional networks based on spatiotemporal data embedding for traffic flow forecasting. Knowl.-Based Syst. 2022, 250, 109028. [Google Scholar] [CrossRef]
  119. Liu, Z.; Bian, J.; Zhang, D.; Chen, Y.; Shen, G.; Kong, X. Dynamic Multi-View Coupled Graph Convolution Network for Urban Travel Demand Forecasting. Electronics 2022, 11, 2620. [Google Scholar] [CrossRef]
  120. Han, S.Y.; Zhao, Q.; Sun, Q.W.; Zhou, J.; Chen, Y.H. EnGS-DGR: Traffic Flow Forecasting with Indefinite Forecasting Interval by Ensemble GCN, Seq2Seq, and Dynamic Graph Reconfiguration. Appl. Sci. 2022, 12, 2890. [Google Scholar] [CrossRef]
  121. Luo, G.; Zhang, H.; Yuan, Q.; Li, J.; Wang, F.Y. ESTNet: Embedded Spatial-Temporal Network for Modeling Traffic Flow Dynamics. IEEE Trans. Intell. Transp. Syst. 2022, 23, 19201–19212. [Google Scholar] [CrossRef]
  122. Kong, X.; Wang, K.; Hou, M.; Xia, F.; Karmakar, G.; Li, J. Exploring Human Mobility for Multi-Pattern Passenger Prediction: A Graph Learning Framework. IEEE Trans. Intell. Transp. Syst. 2022, 23, 16148–16160. [Google Scholar] [CrossRef]
  123. Zou, F.; Ren, Q.; Tian, J.; Guo, F.; Huang, S.; Liao, L.; Wu, J. Expressway Speed Prediction Based on Electronic Toll Collection Data. Electronics 2022, 11, 1613. [Google Scholar] [CrossRef]
  124. Duan, Y.; Chen, N.; Shen, S.; Zhang, P.; Qu, Y.; Yu, S. Fdsa-STG: Fully Dynamic Self-Attention Spatio-Temporal Graph Networks for Intelligent Traffic Flow Prediction. IEEE Trans. Veh. Technol. 2022, 71, 9250–9260. [Google Scholar] [CrossRef]
  125. Yuan, X.; Chen, J.; Yang, J.; Zhang, N.; Yang, T.; Han, T.; Taherkordi, A. FedSTN: Graph Representation Driven Federated Learning for Edge Computing Enabled Urban Traffic Flow Prediction. IEEE Trans. Intell. Transp. Syst. 2022. [Google Scholar] [CrossRef]
  126. Jia, T.; Cai, C. Forecasting citywide short-term turning traffic flow at intersections using an attention-based spatiotemporal deep learning model. Transp. B Transp. Dyn. 2022, 1–23. [Google Scholar] [CrossRef]
  127. Bao, J.; Kang, J.; Yang, Z.; Chen, X. Forecasting network-wide multi-step metro ridership with an attention-weighted multi-view graph to sequence learning approach. Expert Syst. Appl. 2022, 210, 118475. [Google Scholar] [CrossRef]
  128. Zhang, X.; Xu, Y.; Shao, Y. Forecasting traffic flow with spatial–temporal convolutional graph attention networks. Neural Comput. Appl. 2022, 34, 15457–15479. [Google Scholar] [CrossRef]
  129. Su, J.; Jin, Z.; Ren, J.; Yang, J.; Liu, Y. GDFormer: A Graph Diffusing Attention based approach for Traffic Flow Prediction. Pattern Recognit. Lett. 2022, 156, 126–132. [Google Scholar] [CrossRef]
  130. James, J. Graph Construction for Traffic Prediction: A Data-Driven Approach. IEEE Trans. Intell. Transp. Syst. 2022, 23, 15015–15027. [Google Scholar]
  131. Geng, X.; He, X.; Xu, L.; Yu, J. Graph correlated attention recurrent neural network for multivariate time series forecasting. Inf. Sci. 2022, 606, 126–142. [Google Scholar] [CrossRef]
  132. Lu, Z.; Lv, W.; Xie, Z.; Du, B.; Xiong, G.; Sun, L.; Wang, H. Graph Sequence Neural Network with an Attention Mechanism for Traffic Speed Prediction. ACM Trans. Intell. Syst. Technol. (TIST) 2022, 13, 1–24. [Google Scholar] [CrossRef]
  133. Wang, H.; Zhang, R.; Cheng, X.; Yang, L. Hierarchical traffic flow prediction based on spatial-temporal graph convolutional network. IEEE Trans. Intell. Transp. Syst. 2022, 23, 16137–16147. [Google Scholar] [CrossRef]
  134. Dai, F.; Cao, P.; Huang, P.; Mo, Q.; Huang, B. Hybrid deep learning approach for traffic speed prediction. Big Data 2022. [Google Scholar] [CrossRef]
  135. Zhang, Q.; Yin, C.; Chen, Y.; Su, F. IGCRRN: Improved Graph Convolution Res-Recurrent Network for spatio-temporal dependence capturing and traffic flow prediction. Eng. Appl. Artif. Intell. 2022, 114, 105179. [Google Scholar] [CrossRef]
  136. Jiang, M.; Li, C.; Li, K.; Yang, Z.; Liu, H. Inter-Block Flow Prediction with Relation Graph Network for Cold-start on Bike-Sharing System. IEEE Internet Things J. 2022, 9, 13390–13404. [Google Scholar] [CrossRef]
  137. Liang, Y.; Huang, G.; Zhao, Z. Joint demand prediction for multimodal systems: A multi-task multi-relational spatiotemporal graph neural network approach. Transp. Res. Part C Emerg. Technol. 2022, 140, 103731. [Google Scholar] [CrossRef]
  138. Huang, J.; Luo, K.; Cao, L.; Wen, Y.; Zhong, S. Learning Multiaspect Traffic Couplings by Multirelational Graph Attention Networks for Traffic Prediction. IEEE Trans. Intell. Transp. Syst. 2022, 23, 20681–20695. [Google Scholar] [CrossRef]
  139. Xu, M.; Li, X.; Wang, F.; Shang, J.S.; Chong, T.; Cheng, W.; Xu, J. Learning to effectively model spatial-temporal heterogeneity for traffic flow forecasting. World Wide Web 2022, 1–17. [Google Scholar] [CrossRef]
  140. Li, Z.; Zhang, Y.; Guo, D.; Zhou, X.; Wang, X.; Zhu, L. Long-term traffic forecasting based on adaptive graph cross strided convolution network. Appl. Intell. 2022, 53, 1–15. [Google Scholar] [CrossRef]
  141. Han, X.; Gong, S. LST-GCN: Long Short-Term Memory Embedded Graph Convolution Network for Traffic Flow Forecasting. Electronics 2022, 11, 2230. [Google Scholar] [CrossRef]
  142. Lu, B.; Gan, X.; Jin, H.; Fu, L.; Wang, X.; Zhang, H. Make More Connections: Urban Traffic Flow Forecasting with Spatiotemporal Adaptive Gated Graph Convolution Network. ACM Trans. Intell. Syst. Technol. (TIST) 2022, 13, 1–25. [Google Scholar] [CrossRef]
  143. Qin, Y.; Zhao, F.; Fang, Y.; Luo, H.; Wang, C. Memory attention enhanced graph convolution long short-term memory network for traffic forecasting. Int. J. Intell. Syst. 2022, 37, 6555–6576. [Google Scholar] [CrossRef]
  144. Cui, Z.; Zhang, J.; Noh, G.; Park, H.J. MFDGCN: Multi-Stage Spatio-Temporal Fusion Diffusion Graph Convolutional Network for Traffic Prediction. Appl. Sci. 2022, 12, 2688. [Google Scholar] [CrossRef]
  145. Cao, Y.; Liu, D.; Yin, Q.; Xue, F.; Tang, H. MSASGCN: Multi-Head Self-Attention Spatiotemporal Graph Convolutional Network for Traffic Flow Forecasting. J. Adv. Transp. 2022, 2022, 2811961. [Google Scholar] [CrossRef]
  146. Yin, D.; Jiang, R.; Deng, J.; Li, Y.; Xie, Y.; Wang, Z.; Zhou, Y.; Song, X.; Shang, J.S. MTMGNN: Multi-time multi-graph neural network for metro passenger flow prediction. GeoInformatica 2022, 1–29. [Google Scholar] [CrossRef]
  147. Feng, H.; Jiang, X. Multi-step ahead traffic speed prediction based on gated temporal graph convolution network. Phys. A Stat. Mech. Its Appl. 2022, 606, 128075. [Google Scholar] [CrossRef]
  148. Li, C.; Zhang, H.; Wang, Z.; Wu, Y.; Yang, F. Multigraph Aggregation Spatiotemporal Graph Convolution Network for Ride-Hailing Pick-Up Region Prediction. Wirel. Commun. Mob. Comput. 2022, 2022, 9815133. [Google Scholar] [CrossRef]
  149. Wang, S.; Zhang, M.; Miao, H.; Peng, Z.; Yu, P.S. Multivariate Correlation-aware Spatio-temporal Graph Convolutional Networks for Multi-scale Traffic Prediction. ACM Trans. Intell. Syst. Technol. (TIST) 2022, 13, 1–22. [Google Scholar] [CrossRef]
  150. Zhao, C.; Li, X.; Shao, Z.; Yang, H.; Wang, F. Multi-featured spatial-temporal and dynamic multi-graph convolutional network for metro passenger flow prediction. Connect. Sci. 2022, 34, 1252–1272. [Google Scholar] [CrossRef]
  151. Sun, Y.; Jiang, G.; Lam, S.K.; He, P.; Ning, F. Multi-fold Correlation Attention Network for Predicting Traffic Speeds with Heterogeneous Frequency. Appl. Soft Comput. 2022, 124, 108977. [Google Scholar]
  152. Huang, X.; Ye, Y.; Ding, W.; Yang, X.; Xiong, L. Multi-mode dynamic residual graph convolution network for traffic flow prediction. Inf. Sci. 2022, 609, 548–564. [Google Scholar] [CrossRef]
  153. Wang, Y.; Qin, Y.; Guo, J.; Cao, Z.; Jia, L. Multi-point short-term prediction of station passenger flow based on temporal multi-graph convolutional network. Phys. A Stat. Mech. Its Appl. 2022, 604, 127959. [Google Scholar] [CrossRef]
  154. Li, H.; Jin, D.; Li, X.; Huang, H.; Yun, J.; Huang, L. Multi-View Spatial–Temporal Graph Neural Network for Traffic Prediction. Comput. J. 2022. [Google Scholar] [CrossRef]
  155. Wang, J.; Zhang, Y.; Wang, L.; Hu, Y.; Piao, X.; Yin, B. Multitask Hypergraph Convolutional Networks: A Heterogeneous Traffic Prediction Framework. IEEE Trans. Intell. Transp. Syst. 2022, 23, 18557–18567. [Google Scholar] [CrossRef]
  156. Yang, H.; Zhang, X.; Li, Z.; Cui, J. Region-Level Traffic Prediction Based on Temporal Multi-Spatial Dependence Graph Convolutional Network from GPS Data. Remote Sens. 2022, 14, 303. [Google Scholar] [CrossRef]
  157. Abdelraouf, A.; Abdel-Aty, M.; Mahmoud, N. Sequence-to-Sequence Recurrent Graph Convolutional Networks for Traffic Estimation and Prediction Using Connected Probe Vehicle Data. IEEE Trans. Intell. Transp. Syst. 2022. [Google Scholar] [CrossRef]
  158. Baghbani, A.; Bouguila, N.; Patterson, Z. Short-Term Passenger Flow Prediction Using a Bus Network Graph Convolutional Long Short-Term Memory Neural Network Model. Transp. Res. Rec. 2022, 03611981221112673. [Google Scholar] [CrossRef]
  159. Xia, M.; Jin, D.; Chen, J. Short-Term Traffic Flow Prediction Based on Graph Convolutional Networks and Federated Learning. IEEE Trans. Intell. Transp. Syst. 2022. [Google Scholar] [CrossRef]
  160. Lai, Q.; Tian, J.; Wang, W.; Hu, X. Spatial-Temporal Attention Graph Convolution Network on Edge Cloud for Traffic Flow Prediction. IEEE Trans. Intell. Transp. Syst. 2022. [Google Scholar] [CrossRef]
  161. Zhang, R.; Xie, F.; Sun, R.; Huang, L.; Liu, X.; Shi, J. Spatial-temporal dynamic semantic graph neural network. Neural Comput. Appl. 2022, 34, 16655–16668. [Google Scholar] [CrossRef]
  162. Zhang, S.; Liu, Y.; Xiao, Y.; He, R. Spatial-temporal upsampling graph convolutional network for daily long-term traffic speed prediction. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 8996–9010. [Google Scholar] [CrossRef]
  163. Dong, C.; Zhang, K.; Wei, X.; Wang, Y.; Yang, Y. Spatiotemporal Graph Attention Network modeling for multi-step passenger demand prediction at multi-zone level. Phys. A Stat. Mech. Its Appl. 2022, 603, 127789. [Google Scholar] [CrossRef]
  164. Ni, Q.; Zhang, M. STGMN: A gated multi-graph convolutional network framework for traffic flow prediction. Appl. Intell. 2022, 52, 15026–15039. [Google Scholar] [CrossRef]
  165. Ou, J.; Sun, J.; Zhu, Y.; Jin, H.; Liu, Y.; Zhang, F.; Huang, J.; Wang, X. STP-TrellisNets+: Spatial-Temporal Parallel TrellisNets for Multi-Step Metro Station Passenger Flow Prediction. IEEE Trans. Knowl. Data Eng. 2022. [Google Scholar] [CrossRef]
  166. Zhou, J.; Qin, X.; Yu, K.; Jia, Z.; Du, Y. STSGAN: Spatial-Temporal Global Semantic Graph Attention Convolution Networks for Urban Flow Prediction. ISPRS Int. J. Geo-Inf. 2022, 11, 381. [Google Scholar] [CrossRef]
  167. Wang, B.; Wang, J. ST-MGAT: Spatio-temporal multi-head graph attention network for Traffic prediction. Phys. A Stat. Mech. Its Appl. 2022, 603, 127762. [Google Scholar] [CrossRef]
  168. Liao, W.; Zeng, B.; Liu, J.; Wei, P.; Cheng, X. Taxi demand forecasting based on the temporal multimodal information fusion graph neural network. Appl. Intell. 2022, 52, 12077–12090. [Google Scholar] [CrossRef]
  169. Zhang, W.; Yan, S.; Li, J. TCP-BAST: A novel approach to traffic congestion prediction with bilateral alternation on spatiality and temporality. Inf. Sci. 2022, 608, 718–733. [Google Scholar] [CrossRef]
  170. Khaled, A.; Elsir, A.M.T.; Shen, Y. TFGAN: Traffic forecasting using generative adversarial network with multi-graph convolutional network. Knowl.-Based Syst. 2022, 249, 108990. [Google Scholar] [CrossRef]
  171. Chen, L.; Shi, P.; Li, G.; Qi, T. Traffic flow prediction using multi-view graph convolution and masked attention mechanism. Comput. Commun. 2022, 194, 446–457. [Google Scholar] [CrossRef]
  172. Mai, W.; Chen, J.; Chen, X. Time-Evolving Graph Convolutional Recurrent Network for Traffic Prediction. Appl. Sci. 2022, 12, 2842. [Google Scholar] [CrossRef]
  173. Wang, J.; Wang, W.; Liu, X.; Yu, W.; Li, X.; Sun, P. Traffic prediction based on auto spatiotemporal multi-graph adversarial neural network. Phys. A Stat. Mech. Its Appl. 2022, 590, 126736. [Google Scholar] [CrossRef]
  174. Wang, T.; Ni, S.; Qin, T.; Cao, D. TransGAT: A dynamic graph attention residual networks for traffic flow forecasting. Sustain. Comput. Inform. Syst. 2022, 36, 100779. [Google Scholar] [CrossRef]
  175. Wu, Y.; Zhang, H.; Li, C.; Tao, S.; Yang, F. Urban ride-hailing demand prediction with multi-view information fusion deep learning framework. Appl. Intell. 2022, 1–19. [Google Scholar] [CrossRef]
  176. Feng, A.; Tassiulas, L. Adaptive Graph Spatial-Temporal Transformer Network for Traffic Forecasting. In Proceedings of the 31st ACM International Conference on Information & Knowledge Management, Atlanta, GA, USA, 17–21 October 2022; pp. 3933–3937. [Google Scholar]
  177. Li, F.; Yan, H.; Jin, G.; Liu, Y.; Li, Y.; Jin, D. Automated Spatio-Temporal Synchronous Modeling with Multiple Graphs for Traffic Prediction. In Proceedings of the 31st ACM International Conference on Information & Knowledge Management, Atlanta, GA, USA, 17–21 October 2022; pp. 1084–1093. [Google Scholar]
  178. Wang, Y.; Ren, Q. Dynamic Graph Convolutional Network for Long Short-term Traffic Flow Prediction. In Proceedings of the 2022 IEEE Symposium on Computers and Communications (ISCC), Rhodes, Greece, 30 June–3 July 2022; IEEE: New York, NY, USA, 2022; pp. 1–6. [Google Scholar]
  179. Liu, Z.; Fu, K.; Liu, X. Multi-view Cascading Spatial-Temporal Graph Neural Network for Traffic Flow Forecasting. In Proceedings of the International Conference on Artificial Neural Networks; Springer: New York, NY, USA, 2022; pp. 605–616. [Google Scholar]
  180. Song, J.; Son, J.; Seo, D.H.; Han, K.; Kim, N.; Kim, S.W. ST-GAT: A Spatio-Temporal Graph Attention Network for Accurate Traffic Speed Prediction. In Proceedings of the 31st ACM International Conference on Information & Knowledge Management, Atlanta, GA, USA, 17–21 October 2022; pp. 4500–4504. [Google Scholar]
  181. Kim, D.; Cho, Y.; Kim, D.; Park, C.; Choo, J. Residual Correction in Real-Time Traffic Forecasting. In Proceedings of the 31st ACM International Conference on Information & Knowledge Management, Atlanta, GA, USA, 17–21 October 2022; pp. 962–971. [Google Scholar]
  182. Li, G.; Wang, X.; Njoo, G.S.; Zhong, S.; Chan, S.H.G.; Hung, C.C.; Peng, W.C. A Data-Driven Spatial-Temporal Graph Neural Network for Docked Bike Prediction. In Proceedings of the 2022 IEEE 38th International Conference on Data Engineering (ICDE), Kuala Lumpur, Malaysia, 9–12 May 2022; IEEE: New York, NY, USA, 2022; pp. 713–726. [Google Scholar]
  183. Shen, Y.; Li, L.; Xie, Q.; Li, X.; Xu, G. A Two-Tower Spatial-Temporal Graph Neural Network for Traffic Speed Prediction. In Proceedings of the Pacific-Asia Conference on Knowledge Discovery and Data Mining; Springer: New York, NY, USA, 2022; pp. 406–418. [Google Scholar]
  184. Sun, J.; Li, J.; Wu, C.; Tang, Z.; Wu, C. Ada-STNet: A Dynamic AdaBoost Spatio-Temporal Network for Traffic Flow Prediction. In Proceedings of the ICASSP 2022—2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Singapore, 23–27 May 2022; IEEE: New York, NY, USA, 2022; pp. 5478–5482. [Google Scholar]
  185. Shao, Z.; Zhang, Z.; Wei, W.; Wang, F.; Xu, Y.; Cao, X.; Jensen, C.S. Decoupled Dynamic Spatial-Temporal Graph Neural Network for Traffic Forecasting. Proc. VLDB Endow. 2022, 15, 2733–2746. [Google Scholar] [CrossRef]
  186. Lan, S.; Ma, Y.; Huang, W.; Wang, W.; Yang, H.; Li, P. DSTAGNN: Dynamic spatial-temporal aware graph neural network for traffic flow forecasting. In Proceedings of the International Conference on Machine Learning, ICML, Baltimore, MD, USA, 17–23 July 2022; pp. 11906–11917. [Google Scholar]
  187. Rao, X.; Wang, H.; Zhang, L.; Li, J.; Shang, S.; Han, P. FOGS: First-order gradient supervision with learning-based graph for traffic flow forecasting. In Proceedings of the International Joint Conference on Artificial Intelligence, IJCAI, Vienna, Austria, 23–29 July 2022; pp. 3926–3932. [Google Scholar]
  188. Choi, J.; Choi, H.; Hwang, J.; Park, N. Graph neural controlled differential equations for traffic forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Virtual, 22 February–1 March 2022; Volume 36, pp. 6367–6374. [Google Scholar]
  189. Zhang, C.; Zhang, S.; Yu, S.; James, J. Graph-Based Traffic Forecasting via Communication-Efficient Federated Learning. In Proceedings of the 2022 IEEE Wireless Communications and Networking Conference (WCNC), Austin, TX, USA, 10–13 April 2022; IEEE: New York, NY, USA, 2022; pp. 2041–2046. [Google Scholar]
  190. Liu, D.; Wang, J.; Shang, S.; Han, P. MSDR: Multi-step dependency relation networks for spatial temporal forecasting. In Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Washington, DC, USA, 14–18 August 2022; pp. 1042–1050. [Google Scholar]
  191. Lu, B.; Gan, X.; Zhang, W.; Yao, H.; Fu, L.; Wang, X. Spatio-Temporal Graph Few-Shot Learning with Cross-City Knowledge Transfer. In Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Washington, DC, USA, 14–18 August 2022; pp. 1162–1172. [Google Scholar]
  192. Li, P.; Fang, J.; Chao, P.; Zhao, P.; Liu, A.; Zhao, L. JS-STDGN: A Spatial-Temporal Dynamic Graph Network Using JS-Graph for Traffic Prediction. In Proceedings of the International Conference on Database Systems for Advanced Applications; Springer: New York, NY, USA, 2022; pp. 191–206. [Google Scholar]
  193. Shao, Z.; Zhang, Z.; Wang, F.; Xu, Y. Pre-training Enhanced Spatial-temporal Graph Neural Network for Multivariate Time Series Forecasting. In Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Washington, DC, USA, 14–18 August 2022; pp. 1567–1577. [Google Scholar]
  194. Yu, H.; Li, T.; Yu, W.; Li, J.; Huang, Y.; Wang, L.; Liu, A. Regularized Graph Structure Learning with Semantic Knowledge for Multi-variates Time-Series Forecasting. In Proceedings of the International Joint Conference on Artificial Intelligence, IJCAI, Vienna, Austria, 23–29 July 2022; pp. 2362–2368. [Google Scholar]
  195. Tang, J.; Qian, T.; Liu, S.; Du, S.; Hu, J.; Li, T. Spatio-Temporal Latent Graph Structure Learning for Traffic Forecasting. In Proceedings of the 2022 International Joint Conference on Neural Networks (IJCNN), Padua, Italy, 18–23 July 2022; IEEE: New York, NY, USA, 2022. [Google Scholar]
  196. Ji, J.; Wang, J.; Jiang, Z.; Jiang, J.; Zhang, H. STDEN: Towards Physics-guided Neural Networks for Traffic Flow Prediction. In Proceedings of the AAAI Conference on Artificial Intelligence, Virtual, 22 February–1 March 2022; Volume 36, pp. 4048–4056. [Google Scholar]
  197. Chen, Y.; Segovia-Dominguez, I.; Coskunuzer, B.; Gel, Y. TAMP-S2GCNets: Coupling time-aware multipersistence knowledge representation with spatio-supra graph convolutional networks for time-series forecasting. In Proceedings of the International Conference on Learning Representations, Virtual, 25–29 April 2022. [Google Scholar]
  198. Xue, Y.; Fan, X.; Huang, Y.; Zhang, X.; Wang, R. Traffic Forecasting Model Based on Two-stage Stacked Graph Convolution Network. In Proceedings of the 2022 IEEE 25th International Conference on Computer Supported Cooperative Work in Design (CSCWD), Hangzhou, China, 4–6 May 2022; IEEE: New York, NY, USA, 2022; pp. 1089–1094. [Google Scholar]
  199. Zhuang, D.; Wang, S.; Koutsopoulos, H.; Zhao, J. Uncertainty Quantification of Sparse Travel Demand Prediction with Spatial-Temporal Graph Neural Networks. In Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Washington, DC, USA, 14–18 August 2022; pp. 4639–4647. [Google Scholar]
  200. Lee, H.; Jin, S.; Chu, H.; Lim, H.; Ko, S. Learning to Remember Patterns: Pattern Matching Memory Networks for Traffic Forecasting. In Proceedings of the International Conference on Learning Representations, Virtual, 25–29 April 2022. [Google Scholar]
  201. Hermes, L.; Hammer, B.; Melnik, A.; Velioglu, R.; Vieth, M.; Schilling, M. A Graph-based U-Net Model for Predicting Traffic in unseen Cities. In Proceedings of the 2022 International Joint Conference on Neural Networks (IJCNN), Padua, Italy, 18–23 July 2022; IEEE: New York, NY, USA, 2022. [Google Scholar]
  202. Feng, Y.; Han, F.; Zhao, S. A Graph Convolutional Stacked Temporal Attention Neural Network for Traffic Flow Forecasting. In Proceedings of the 2022 International Joint Conference on Neural Networks (IJCNN), Padua, Italy, 18–23 July 2022; IEEE: New York, NY, USA, 2022; pp. 1–7. [Google Scholar]
  203. Li, S.; Ge, L.; Lin, Y.; Zeng, B. Adaptive Spatial-Temporal Fusion Graph Convolutional Networks for Traffic Flow Forecasting. In Proceedings of the 2022 International Joint Conference on Neural Networks (IJCNN), Padua, Italy, 18–23 July 2022; IEEE: New York, NY, USA, 2022; pp. 1–8. [Google Scholar]
  204. Cao, S.; Wu, L.; Zhang, R.; Li, J.; Wu, D. Capturing Local and Global Spatial-Temporal Correlations of Spatial-Temporal Graph Data for Traffic Flow Prediction. In Proceedings of the 2022 International Joint Conference on Neural Networks (IJCNN), Padua, Italy, 18–23 July 2022; IEEE: New York, NY, USA, 2022; pp. 1–8. [Google Scholar]
  205. Hu, J.; Lin, X.; Wang, C. MGCN: Dynamic Spatio-Temporal Multi-Graph Convolutional Neural Network. In Proceedings of the 2022 International Joint Conference on Neural Networks (IJCNN), Padua, Italy, 18–23 July 2022; IEEE: New York, NY, USA, 2022; pp. 1–9. [Google Scholar]
  206. Ke, J.; Feng, S.; Zhu, Z.; Yang, H.; Ye, J. Joint predictions of multi-modal ride-hailing demands: A deep multi-task multi-graph learning-based approach. Transp. Res. Part C Emerg. Technol. 2021, 127, 103063. [Google Scholar] [CrossRef]
  207. Dong, X.; Lei, T.; Jin, S.; Hou, Z. Short-term traffic flow prediction based on XGBoost. In Proceedings of the 2018 IEEE 7th Data Driven Control and Learning Systems Conference (DDCLS), Enshi, China, 25–27 May 2018; IEEE: New York, NY, USA, 2018; pp. 854–859. [Google Scholar]
  208. Chen, Z.; Fan, W. A freeway travel time prediction method based on an XGBoost model. Sustainability 2021, 13, 8577. [Google Scholar] [CrossRef]
  209. Gutmann, S.; Maget, C.; Spangler, M.; Bogenberger, K. Truck parking occupancy prediction: Xgboost-LSTM model fusion. Front. Future Transp. 2021, 2, 693708. [Google Scholar] [CrossRef]
  210. Huang, X.; Tian, X.; Gu, J.; Sun, Q.; Zhao, H. VectorFlow: Combining Images and Vectors for Traffic Occupancy and Flow Prediction. arXiv 2022, arXiv:2208.04530. [Google Scholar]
  211. Yang, S.; Ma, W.; Pi, X.; Qian, S. A deep learning approach to real-time parking occupancy prediction in transportation networks incorporating multiple spatio-temporal data sources. Transp. Res. Part C Emerg. Technol. 2019, 107, 248–265. [Google Scholar] [CrossRef]
  212. Li, Y.; Yu, R.; Shahabi, C.; Liu, Y. Diffusion Convolutional Recurrent Neural Network: Data-Driven Traffic Forecasting. In Proceedings of the International Conference on Learning Representations (ICLR ’18), Vancouver, BC, Canada, 30 April–3 May 2018. [Google Scholar]
  213. Yu, B.; Yin, H.; Zhu, Z. Spatio-Temporal Graph Convolutional Networks: A Deep Learning Framework for Traffic Forecasting. In Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, IJCAI-18, Stockholm, Sweden, 13–19 July 2018; pp. 3634–3640. [Google Scholar] [CrossRef] [Green Version]
  214. Wu, Z.; Pan, S.; Long, G.; Jiang, J.; Zhang, C. Graph WaveNet for Deep Spatial-Temporal Graph Modeling. In Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence, IJCAI-19, Macao, China, 10–16 August 2019; pp. 1907–1913. [Google Scholar] [CrossRef] [Green Version]
  215. Xu, Z.; Tang, N.; Xu, C.; Cheng, X. Data science: Connotation, methods, technologies, and development. Data Sci. Manag. 2021, 1, 32–37. [Google Scholar] [CrossRef]
  216. Gao, Y.; Zhou, C.; Rong, J.; Wang, Y.; Liu, S. Short-Term Traffic Speed Forecasting Using a Deep Learning Method Based on Multitemporal Traffic Flow Volume. IEEE Access 2022, 10, 82384–82395. [Google Scholar] [CrossRef]
  217. Axenie, C.; Bortoli, S. Road traffic prediction dataset. Zenodo 2020. [Google Scholar] [CrossRef]
  218. Hou, Y.; Chen, J.; Wen, S. The effect of the dataset on evaluating urban traffic prediction. Alex. Eng. J. 2021, 60, 597–613. [Google Scholar] [CrossRef]
  219. Braz, F.J.; Ferreira, J.; Gonçalves, F.; Weege, K.; Almeida, J.; Baldo, F.; Gonçalves, P. Road traffic forecast based on meteorological information through deep learning methods. Sensors 2022, 22, 4485. [Google Scholar] [CrossRef]
  220. Ma, H.; Zhou, M.; Ouyang, X.; Yin, D.; Jiang, R.; Song, X. Forecasting Regional Multimodal Transportation Demand with Graph Neural Networks: An Open Dataset. In Proceedings of the 2022 IEEE 25th International Conference on Intelligent Transportation Systems (ITSC), Macau, China, 8–12 October 2022; IEEE: New York, NY, USA, 2022; pp. 3263–3268. [Google Scholar]
  221. Prado-Rujas, I.I.; Serrano, E.; García-Dopico, A.; Córdoba, M.L.; Pérez, M.S. Combining heterogeneous data sources for spatio-temporal mobility demand forecasting. Inf. Fusion 2023, 91, 1–12. [Google Scholar] [CrossRef]
  222. Jiang, R.; Cai, Z.; Wang, Z.; Yang, C.; Fan, Z.; Chen, Q.; Tsubouchi, K.; Song, X.; Shibasaki, R. DeepCrowd: A deep model for large-scale citywide crowd density and flow prediction. IEEE Trans. Knowl. Data Eng. 2022. [Google Scholar] [CrossRef]
  223. Wu, Z.; Zheng, D.; Pan, S.; Gan, Q.; Long, G.; Karypis, G. Traversenet: Unifying space and time in message passing for traffic forecasting. IEEE Trans. Neural Netw. Learn. Syst. 2022. [Google Scholar] [CrossRef] [PubMed]
  224. Xiao, Z.; Xiao, H.; Jiang, H.; Chen, W.; Chen, H.; Regan, A.C. Exploring human mobility patterns and travel behavior: A focus on private cars. IEEE Intell. Transp. Syst. Mag. 2021, 14, 129–146. [Google Scholar] [CrossRef]
  225. Liu, C.; Xiao, Z.; Wang, D.; Cheng, M.; Chen, H.; Cai, J. Foreseeing private car transfer between urban regions with multiple graph-based generative adversarial networks. World Wide Web 2022, 25, 2515–2534. [Google Scholar] [CrossRef]
  226. Usama, M.; Ma, R.; Hart, J.; Wojcik, M. Physics-Informed Neural Networks (PINNs)-Based Traffic State Estimation: An Application to Traffic Network. Algorithms 2022, 15, 447. [Google Scholar] [CrossRef]
  227. Huang, J.; Agarwal, S. Physics informed deep learning for traffic state estimation. In Proceedings of the 2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC), Rhodes, Greece, 20–23 September 2020; IEEE: New York, NY, USA, 2020; pp. 1–6. [Google Scholar]
  228. Shao, Y.; Li, H.; Gu, X.; Yin, H.; Li, Y.; Miao, X.; Zhang, W.; Cui, B.; Chen, L. Distributed Graph Neural Network Training: A Survey. arXiv 2022, arXiv:2211.00216. [Google Scholar]
  229. Jiang, W.; He, M.; Gu, W. Internet Traffic Prediction with Distributed Multi-Agent Learning. Appl. Syst. Innov. 2022, 5, 121. [Google Scholar] [CrossRef]
  230. He, Q.; Dong, Z.; Chen, F.; Deng, S.; Liang, W.; Yang, Y. Pyramid: Enabling hierarchical neural networks with edge computing. In Proceedings of the ACM Web Conference 2022, Lyon, France, 25–29 April 2022; pp. 1860–1870. [Google Scholar]
  231. Wang, C.; Zhang, K.; Wang, H.; Chen, B. Auto-STGCN: Autonomous spatial-temporal graph convolutional network search based on reinforcement learning and existing research results. arXiv 2020, arXiv:2010.07474. [Google Scholar]
  232. Munikoti, S.; Agarwal, D.; Das, L.; Halappanavar, M.; Natarajan, B. Challenges and opportunities in deep reinforcement learning with graph neural networks: A comprehensive review of algorithms and applications. arXiv 2022, arXiv:2206.07922. [Google Scholar]
  233. Nie, M.; Chen, D.; Wang, D. Reinforcement Learning on Graph: A Survey. arXiv 2022, arXiv:2204.06127. [Google Scholar]
  234. Atwood, J.; Towsley, D. Diffusion-convolutional neural networks. In Advances in Neural Information Processing Systems; NIPS: Barcelona, Spain, 2016; Volume 29. [Google Scholar]
Figure 1. The grid-format traffic forecasting problem [22,23].
Figure 2. The graph-format traffic forecasting problem [24,25].
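Figure 2 illustrates the graph-format formulation, in which sensors or road segments are the nodes and edge weights encode spatial proximity. As a minimal illustrative sketch (not code released by any of the surveyed studies), the snippet below constructs such a weighted adjacency matrix with a thresholded Gaussian kernel, the heuristic popularized for highway sensor networks by DCRNN [212]; the distance matrix, bandwidth estimate, and sparsification threshold are all hypothetical placeholders.

```python
import numpy as np

def gaussian_kernel_adjacency(dist: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Weighted adjacency from pairwise sensor distances.

    Edge weights follow exp(-(d_ij / sigma)^2); weights below `threshold`
    are zeroed to keep the graph sparse (thresholded Gaussian kernel).
    """
    sigma = dist[dist > 0].std()            # bandwidth estimated from the distances
    adj = np.exp(-np.square(dist / sigma))  # Gaussian kernel weights in (0, 1]
    adj[adj < threshold] = 0.0              # drop weak, long-range connections
    np.fill_diagonal(adj, 1.0)              # keep self-loops for graph convolution
    return adj

# Hypothetical pairwise distances (km) between four loop detectors.
distances = np.array([[0.0, 1.2, 3.5, 4.0],
                      [1.2, 0.0, 2.1, 3.3],
                      [3.5, 2.1, 0.0, 1.8],
                      [4.0, 3.3, 1.8, 0.0]])
print(gaussian_kernel_adjacency(distances))
```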
Table 2. The list of new open traffic datasets.

Study | Traffic Attributes | Spatial Range | Temporal Range | Download Link (Accessed on 2 February 2023)
[114] | Aggregated taxi speed | Seoul, South Korea | 1–30 April 2018 | https://github.com/SNU-DRL/ddpgcn-dataset
[126] | Aggregated taxi flow | Wuhan, China | 1–28 July 2015 | http://ggssc.whu.edu.cn/ggsscAssets/download/AttentionModel/code_and_data.zip
HZMF2019 [146] | Aggregated metro passenger flow | Hangzhou, China | 1–25 January 2019 | https://github.com/lixus7/MTMGNN
TaxiBJ21 [23] | Aggregated taxi flow | Beijing, China | November 2012, November 2014, and November 2015 | https://github.com/jwwthu/DL4Traffic/tree/main/TaxiBJ21
[216] | Aggregated traffic flow | Beijing, China | 1 June–15 July 2009 | https://github.com/gao0628/Dataset
[217] | Aggregated traffic flow | Six intersections in an urban area | 56 days | https://zenodo.org/record/3653880#.Y20cPHZBzT6
XiAn Road Traffic [218] | Aggregated traffic flow, weather data | Xi’an, China | 1 August–30 September 2019 | https://github.com/FIGHTINGithub/Xi-an-Road-Traffic-Data
[219] | Aggregated traffic flow | Aveiro, Portugal | 2019, 2020, and 2021 | https://figshare.com/s/d324f5be912e7f7a0d21
[220] | Aggregated taxi and bike trips | New York City, USA | 2019, 2020 | https://github.com/Evens1sen/Deep-NYC-Taxi-Bike
[221] | Aggregated taxi and bike trips | Chicago, USA | 2013 to 2020 | https://github.com/iipr/mobility-demand
[222] | Citywide crowd flow | Tokyo and Osaka | 1 April–9 July 2017 | https://github.com/deepkashiwa20/DeepCrowd
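The datasets in Table 2 are generally distributed as time-stamped records aggregated per sensor, station, or region, so the main preprocessing step before training a GNN is slicing the series into input/target windows for multi-step forecasting. The sketch below shows one common way to do this; the file name, column layout, and window lengths (12 input steps, 3 output steps) are assumptions that must be adapted to the particular dataset.

```python
import numpy as np
import pandas as pd

def make_windows(series: np.ndarray, in_len: int = 12, out_len: int = 3):
    """Slice a (time, nodes) array into (input, target) pairs where
    `in_len` past steps are used to predict `out_len` future steps."""
    xs, ys = [], []
    for t in range(len(series) - in_len - out_len + 1):
        xs.append(series[t:t + in_len])
        ys.append(series[t + in_len:t + in_len + out_len])
    return np.stack(xs), np.stack(ys)

# Hypothetical file: rows are aggregation intervals, columns are sensors/regions.
df = pd.read_csv("traffic_flow.csv", index_col=0, parse_dates=True)
x, y = make_windows(df.to_numpy(dtype=np.float32))
print(x.shape, y.shape)  # (num_samples, 12, num_nodes), (num_samples, 3, num_nodes)
```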
Table 3. The list of new open-code resources.

Study | Framework | Link (Accessed on 2 February 2023)
DDSTGCN [60] | PyTorch | https://github.com/j1o2h3n/DDSTGCN
STAGCN [61] | PyTorch | https://github.com/QiweiMa-LL/STAGCN
CTVI+ [70] | PyTorch | https://github.com/dsj96/TKDD
TGAE [71] | PyTorch | https://github.com/wangqiang-codes/TGAE
GAMCN [88] | TensorFlow | https://github.com/alvinzhaowei/GAMCN
MADGCN [89] | TensorFlow, PyTorch | https://github.com/wumingyao/MADGCN
AdapGL [98] | PyTorch | https://github.com/goaheand/AdapGL-pytorch
Ada-STNet [100] | PyTorch | https://github.com/LiuZH-19/Ada-STNet
AM-RGCN [108] | PyTorch | https://github.com/ILoveStudying/AM-RGCN
DDP-GCN [114] | TensorFlow | https://github.com/SNU-DRL/DDP-GCN
GDFormer [129] | PyTorch | https://github.com/dublinsky/GDFormer
ST-GCN [133] | TensorFlow | https://github.com/Wautumn/ST-GCN
MTMGNN [146] | PyTorch | https://github.com/lixus7/MTMGNN2
TmS-GCN [156] | PyTorch | https://github.com/Joker-L0912/Tms-GCN-Py
STUGCN [161] | PyTorch | https://github.com/zsongsong/stugcn
D2STGNN [185] | PyTorch | https://github.com/zezhishao/D2STGNN
DSTAGNN [186] | PyTorch | https://github.com/SYLan2019/DSTAGNN
FOGS [187] | PyTorch | https://github.com/kevin-xuan/FOGS
STG-NCDE [188] | PyTorch | https://github.com/jeongwhanchoi/STG-NCDE
ST-GFSL [191] | PyTorch | https://github.com/RobinLu1209/ST-GFSL
STEP [193] | PyTorch | https://github.com/zezhishao/STEP
RGSL [194] | PyTorch | https://github.com/alipay/RGSL
STDEN [196] | PyTorch | https://github.com/Echo-Ji/STDEN
TAMP-S2GCNets [197] | PyTorch | https://github.com/tamps2gcnets/TAMP_S2GCNets
PM-MemNet [200] | PyTorch | https://github.com/HyunWookL/PM-MemNet
[201] | TensorFlow | https://github.com/LucaHermes/graph-UNet-traffic-prediction
TraverseNet [223] | PyTorch | https://github.com/nnzhan/TraverseNet
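Most repositories in Table 3 are written in PyTorch and, despite their architectural differences, share a common skeleton: a graph convolution aggregates information over the adjacency matrix to model spatial dependencies, and a recurrent, convolutional, or attention module models temporal dependencies before a prediction head emits the multi-step forecast. The following sketch is a generic illustration of that pattern only, not the implementation of any listed model; the layer sizes, the GRU-based temporal module, and the randomly generated adjacency are placeholders.

```python
import torch
import torch.nn as nn

class SimpleSTGNN(nn.Module):
    """Illustrative spatio-temporal block: a graph convolution per time step
    followed by a GRU over the temporal axis of every node."""

    def __init__(self, in_dim: int, hidden_dim: int, horizon: int):
        super().__init__()
        self.spatial = nn.Linear(in_dim, hidden_dim)  # W in the A X W graph convolution
        self.temporal = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, horizon)    # multi-step prediction head

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, nodes, features); adj: (nodes, nodes), row-normalized.
        h = torch.einsum("nm,btmf->btnf", adj, x)       # spatial aggregation A X
        h = torch.relu(self.spatial(h))                 # ... followed by W and a nonlinearity
        b, t, n, d = h.shape
        h = h.permute(0, 2, 1, 3).reshape(b * n, t, d)  # run the GRU independently per node
        _, last = self.temporal(h)                      # final hidden state: (1, b*n, d)
        return self.head(last.squeeze(0)).reshape(b, n, -1)  # (batch, nodes, horizon)

# Hypothetical usage with random tensors (8 samples, 12 steps, 207 sensors).
model = SimpleSTGNN(in_dim=1, hidden_dim=32, horizon=3)
x = torch.randn(8, 12, 207, 1)
adj = torch.softmax(torch.randn(207, 207), dim=1)  # placeholder row-normalized adjacency
print(model(x, adj).shape)                         # torch.Size([8, 207, 3])
```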
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
