
TransAnomaly: Video Anomaly Detection Using Video Vision Transformer




Abstract:

Video anomaly detection is challenging because abnormal events are unbounded, rare, equivocal, and irregular in real scenes. In recent years, transformers have demonstrated powerful modelling abilities for sequence data. Thus, we attempt to apply transformers to video anomaly detection. In this paper, we propose a prediction-based video anomaly detection approach named TransAnomaly. Our model combines the U-Net and the Video Vision Transformer (ViViT) to capture richer temporal information and more global context. To make full use of the ViViT for prediction, we modified it to support video prediction. Experiments on benchmark datasets show that the addition of the transformer module improves anomaly detection performance. In addition, we calculate regularity scores with sliding windows and evaluate the impact of different window sizes and strides. With proper settings, our model outperforms other state-of-the-art prediction-based video anomaly detection approaches. Furthermore, our model can perform anomaly localization by tracking the location of patches with lower regularity scores.
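
The abstract mentions regularity scores computed with sliding windows over the prediction output. As a rough illustration only, the sketch below derives PSNR-based, min-max normalised regularity scores from predicted and ground-truth frames and smooths them over a sliding temporal window. The PSNR choice, function names, and window/stride semantics are assumptions common to prediction-based video anomaly detection, not TransAnomaly's published procedure.

import numpy as np

def psnr(pred, gt, max_val=1.0):
    # Peak signal-to-noise ratio between a predicted and a ground-truth frame.
    mse = np.mean((pred - gt) ** 2)
    return 10.0 * np.log10(max_val ** 2 / (mse + 1e-8))

def regularity_scores(pred_frames, gt_frames, window=16, stride=4):
    # Frame-level regularity scores, min-max normalised to [0, 1];
    # lower values indicate more likely anomalies.
    errors = np.array([psnr(p, g) for p, g in zip(pred_frames, gt_frames)])
    scores = (errors - errors.min()) / (errors.max() - errors.min() + 1e-8)

    # Illustrative sliding-window aggregation: average the per-frame scores
    # inside each window to suppress single-frame noise.
    windowed = [scores[s:s + window].mean()
                for s in range(0, len(scores) - window + 1, stride)]
    return scores, np.array(windowed)

Under this sketch, window ranges whose aggregated score falls below a threshold would be flagged as anomalous, and the same scoring applied per patch would support the localization described in the abstract.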
Published in: IEEE Access (Volume 9)
Page(s): 123977 - 123986
Date of Publication: 30 August 2021
Electronic ISSN: 2169-3536
