Astral: A Datacenter Infrastructure for Large Language Model Training at Scale

August 1, 2025

Qingkai Meng, Hao Zheng, Zhenhui Zhang, ChonLam Lao, Chengyuan Huang, Baojia Li, Ziyuan Zhu, Hao Lu, Weizhen Dang, Zitong Lin, Weifeng Zhang, Lingfeng Liu, Yuanyuan Gong, Chunzhi He, Xiaoyuan Hu, Yinben Xia, Xiang Li, Zekun He, Yachen Wang, Xianneng Zou, Kun Yang, Gianni Antichi, Guihai Chen, Chen Tian

Type: Conference paper
Publication: Proceedings of ACM SIGCOMM (CCF A)