Machine Learning Method for Multi-Scale Anomaly Detection in Cloud Environments Based on Transformer Architecture
Published 2024-07-30

This work is licensed under a Creative Commons Attribution 4.0 International License.
Abstract
This paper addresses the difficulty of anomaly detection in cloud service environments and proposes a detection method based on a multi-scale Transformer. The method models features across multiple temporal granularities and fuses contextual information to capture both short-term fluctuations and long-term trends, avoiding the feature loss and weak discriminative power that arise under a single time scale. The model combines multi-head attention with gating structures to achieve complementary modeling of global and local features, enhancing the recognition of diverse anomaly patterns in complex cloud environments. A systematic analysis of parameter and environmental sensitivity reveals performance differences under varying learning rates, numbers of attention heads, and load conditions, verifying the robustness and adaptability of the method across diverse scenarios. Experiments on publicly available datasets evaluate key metrics including Precision, Recall, F1-Score, and Detection Latency. The results show that the proposed method outperforms existing approaches in both accuracy and response speed, effectively improving the reliability and real-time performance of cloud service monitoring. Overall, the multi-scale Transformer anomaly detection method demonstrates strong detection capability and practical value in cloud computing scenarios, offering a feasible solution for large-scale time-series modeling and anomaly identification.
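To make the core ideas concrete, the following is a minimal NumPy sketch, not the paper's model: it pools a monitoring series at several window sizes (multi-scale features), applies a single attention head over time steps (a stand-in for the paper's multi-head attention), and fuses local and global views with a hand-crafted sigmoid gate in place of a learned one. The scales, gate, and scoring rule are all illustrative assumptions.

```python
import numpy as np

def multi_scale_features(x, scales=(1, 4, 16)):
    """Average-pool a 1-D series at several window sizes and repeat the
    pooled values so every scale aligns with the original length.
    Assumes len(x) is divisible by each scale."""
    feats = [np.repeat(x.reshape(-1, s).mean(axis=1), s) for s in scales]
    return np.stack(feats, axis=1)                      # shape (T, n_scales)

def self_attention(feats):
    """Single-head scaled dot-product attention over time steps, using
    the multi-scale features themselves as query, key, and value."""
    scores = feats @ feats.T / np.sqrt(feats.shape[1])
    scores -= scores.max(axis=1, keepdims=True)         # softmax stability
    w = np.exp(scores)
    w /= w.sum(axis=1, keepdims=True)
    return w @ feats                                    # global context per step

def anomaly_scores(x, scales=(1, 4, 16)):
    feats = multi_scale_features(np.asarray(x, dtype=float), scales)
    ctx = self_attention(feats)
    # Hand-crafted gate (stand-in for a learned one): the larger the
    # finest-scale deviation, the more the fusion trusts local features.
    gate = 1.0 / (1.0 + np.exp(-(np.abs(feats[:, :1]) - 1.0)))
    fused = gate * feats + (1.0 - gate) * ctx
    # Score each step by its distance from the mean fused representation.
    return np.linalg.norm(fused - fused.mean(axis=0), axis=1)

# Toy demo: a periodic load signal with one injected spike at t = 40.
rng = np.random.default_rng(0)
t = np.arange(64)
x = np.sin(2 * np.pi * t / 16) + rng.normal(0.0, 0.05, 64)
x[40] += 5.0
scores = anomaly_scores(x)
print("most anomalous step:", int(np.argmax(scores)))
```

The gating step mirrors the complementary-modeling idea in the abstract: sharp local deviations are scored from fine-scale features, while slow drifts are captured through the attention-derived global context.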