Vol. 3 No. 9 (2024)
Articles

Multi-Scale Temporal Deep Learning with Transformers for Microservice Backend Anomaly Detection

Published 2024-12-30

How to Cite

Shao, C. (2024). Multi-Scale Temporal Deep Learning with Transformers for Microservice Backend Anomaly Detection. Journal of Computer Technology and Software, 3(9). Retrieved from https://ashpress.org/index.php/jcts/article/view/270

Abstract

This paper addresses the practical needs of backend anomaly detection in microservice architectures by proposing a multi-scale Transformer detection framework built on distributed trace data. The framework is designed to reliably identify anomalous behavior in scenarios with complex call paths, mixed feature scales, and significant noise fluctuations. The method takes link-level temporal observations as its core input, parsing raw traces into alignable time-window sequences and constructing a unified feature representation from dimensions such as latency statistics, throughput intensity, error indicators, and call topology. At the model level, a multi-scale sequence construction and fusion mechanism preserves both short-term local perturbations and long-term background dependencies; a Transformer encoder then models cross-time and cross-feature associations to obtain a more discriminative latent representation. At the scoring level, the encoded representation is mapped to a window-level anomaly score, supporting the ranking of anomaly risks and the generation of alerts. Comparative experimental results show that the framework achieves superior overall performance across multiple evaluation metrics, demonstrating stronger anomaly detection capability and better stability. It is suitable for link monitoring and backend operational-risk identification tasks in cloud-native microservice systems.
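The first stage the abstract describes, parsing traces into alignable time windows with latency, throughput, and error features, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation; the `Span` record and the specific per-window statistics (mean/max latency, span count, error rate) are assumptions standing in for whatever feature set the framework actually uses.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Span:
    """A simplified trace span: start time, measured latency, error flag."""
    start_ms: int
    latency_ms: float
    is_error: bool

def windowed_features(spans, window_ms=1000):
    """Bucket spans into fixed-size time windows and compute a
    feature dict per window: latency statistics, throughput
    (span count), and error rate."""
    buckets = {}
    for s in spans:
        buckets.setdefault(s.start_ms // window_ms, []).append(s)
    feats = []
    for w in sorted(buckets):
        group = buckets[w]
        lats = [s.latency_ms for s in group]
        feats.append({
            "window": w,
            "lat_mean": mean(lats),
            "lat_max": max(lats),
            "throughput": len(group),
            "error_rate": sum(s.is_error for s in group) / len(group),
        })
    return feats
```

A real pipeline would also encode topology features (e.g. call-path depth or service fan-out) per window, which are omitted here for brevity.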
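The multi-scale sequence construction and fusion mechanism can be illustrated with a simple pooling-and-concatenation scheme: coarser views average over longer spans (long-term background), the finest view keeps raw windows (short-term perturbations), and all views are aligned and fused per window. The scale set `(1, 2, 4)` and the mean-pool/repeat-upsample choices are assumptions for illustration, not the paper's exact mechanism.

```python
import numpy as np

def multiscale_views(x, scales=(1, 2, 4)):
    """x: (T, d) window-feature sequence. For each scale s, average-pool
    non-overlapping blocks of s windows, then upsample back to length T
    by repetition so that short-term detail and long-term context stay
    aligned; fuse by concatenating along the feature axis."""
    T, d = x.shape
    views = []
    for s in scales:
        pad = (-T) % s
        xp = np.concatenate([x, np.repeat(x[-1:], pad, axis=0)]) if pad else x
        pooled = xp.reshape(-1, s, d).mean(axis=1)   # (ceil(T/s), d)
        up = np.repeat(pooled, s, axis=0)[:T]        # back to (T, d)
        views.append(up)
    return np.concatenate(views, axis=1)             # (T, d * len(scales))
```

The fused `(T, d * len(scales))` sequence is what a Transformer encoder would then consume to model cross-time and cross-feature associations.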
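Finally, mapping encoded window representations to ranked anomaly scores can be sketched with a simple stand-in head. The robust z-score below is a hypothetical placeholder for the learned scoring layer described in the abstract; only the overall shape (per-window score, descending ranking for alert output) reflects the stated design.

```python
import numpy as np

def window_anomaly_scores(h):
    """h: (T, d) encoded window representations. Stand-in scoring head:
    robust z-score of each window against the sequence median (scaled by
    the median absolute deviation), averaged over feature dimensions."""
    med = np.median(h, axis=0)
    mad = np.median(np.abs(h - med), axis=0) + 1e-9  # avoid divide-by-zero
    z = np.abs(h - med) / mad
    return z.mean(axis=1)  # one score per window

def rank_alerts(scores, top_k=3):
    """Return the top_k highest-risk windows as (index, score) pairs."""
    order = np.argsort(scores)[::-1][:top_k]
    return [(int(i), float(scores[i])) for i in order]
```

In deployment, the ranked output would feed an alerting threshold rather than a fixed `top_k`.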