TpRam-Kelly.7z Review

The file refers to the research paper titled "Transformer-based performance prediction and proactive resource allocation for cloud-native microservices," published in Cluster Computing in August 2025.

The paper addresses the difficulty of optimizing resource allocation in cloud-native environments where microservices have complex dependencies. It uses a Transformer-based attention mechanism to build a performance prediction model for microservice nodes on a system's "critical path". Experimental results using the DeathStarBench benchmark showed that TPRAM can save at least 40.58% of CPU and 15.84% of memory resources while maintaining end-to-end Quality of Service (QoS).
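The paper's actual model is not reproduced here, but a minimal sketch may help illustrate the general idea of attention-based latency prediction over critical-path nodes. Everything below is an assumption for illustration: the class name `CriticalPathPredictor`, the choice of per-node features, and all hyperparameters are placeholders, not TPRAM's published architecture.

```python
# Hypothetical sketch of a Transformer-based latency predictor for
# critical-path microservice nodes. Names, features, and hyperparameters
# are illustrative assumptions, not taken from the paper.
import torch
import torch.nn as nn

class CriticalPathPredictor(nn.Module):
    def __init__(self, num_features: int = 4, d_model: int = 64,
                 nhead: int = 4, num_layers: int = 2):
        super().__init__()
        # Project per-node features (e.g. CPU quota, memory limit,
        # request rate, queue depth) into the model dimension.
        self.embed = nn.Linear(num_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        # Self-attention lets each node attend to every other node on
        # the critical path, capturing inter-service dependencies.
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers)
        # Regress a per-node latency estimate from the encoded states.
        self.head = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, path_length, num_features) -> (batch, path_length)
        h = self.encoder(self.embed(x))
        return self.head(h).squeeze(-1)

# Example: predict latency for a critical path of 6 microservice nodes.
model = CriticalPathPredictor()
batch = torch.randn(8, 6, 4)      # 8 traces, 6 nodes, 4 features each
predicted_latency = model(batch)  # shape: (8, 6)
```

A predictor of this shape could feed a proactive allocator: if the predicted latency for a node stays below its QoS target, its CPU or memory allocation can be trimmed, which is the kind of trade-off behind the paper's reported savings.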

Accessing the Paper: A preprint or abstract of the work is hosted on ResearchGate.

You can find the full text or official citation through these platforms: