About Me
I am a final-year DPhil student (thesis recently submitted) in Engineering Science at the University of Oxford, advised by Prof. Noa Zilberman. My research interests include networking, in-network computing, and machine learning. I am particularly interested in applying in-network computing and machine learning techniques to solve networking problems.
From 2017 to 2020, I was a research assistant in the Data Mining Lab, supervised by Prof. Junming Shao. In 2020, I received a first-class honours BEng in EEE from the University of Glasgow and a BEng in CE from UESTC.
Accepted "Planter: Rapid Prototyping of In-Network Machine Learning Inference", 2024
Changgang Zheng, Mingyuan Zang, Xinpeng Hong, Liam Perreault, Riyad Bensoussane, Shay Vargaftik, Yaniv Ben-Itzhak, and Noa Zilberman
ACM SIGCOMM Computer Communication Review (CCR), 2024 [Acceptance rates comparable to ACM SIGCOMM conference, i.e., under 15%] [Best of CCR]
PDF | arXiv (2022) | BibTex | Code | Doc
Using programmable network devices for in-network machine learning has been the focus of significant research. This work presents Planter, an open-source, modular framework for mapping trained machine learning models to programmable devices. Planter supports a wide range of machine learning models and multiple targets, and can be easily extended.
Accepted "IIsy: Hybrid In-Network Classification Using Programmable Switches", 2024
Changgang Zheng, Zhaoqi Xiong, Thanh T Bui, Siim Kaupmees, Riyad Bensoussane, Antoine Bernabeu, Shay Vargaftik, Yaniv Ben-Itzhak, and Noa Zilberman
IEEE/ACM Transactions on Networking, 2024 [CCF A]
PDF | arXiv (2022) | BibTex | Code
This work presents IIsy, which implements machine learning classification models in a hybrid fashion using off-the-shelf network devices. Beyond a range of traditional and ensemble machine learning models, IIsy supports hybrid classification, achieving near-optimal classification results while significantly reducing latency and server load.
Accepted "In-Network Machine Learning Using Programmable Network Devices: A Survey" on the 7th of December, 2023.
Changgang Zheng, Xinpeng Hong, Damu Ding, Shay Vargaftik, Yaniv Ben-Itzhak, and Noa Zilberman
IEEE Communications Surveys and Tutorials [Impact Factor=35.6]
Paper | Link | BibTex
In-network ML is a promising technology that provides ML inference services with high throughput, low latency, and high power efficiency. This paper provides a holistic review of the fundamentals of in-network ML. It introduces its background and solutions, discusses existing challenges and open issues, and provides insights for further research.
Accepted "DINC: Toward Distributed In-Network Computing"
Changgang Zheng, Haoyue Tang, Mingyuan Zang, Xinpeng Hong, Aosong Feng, Leandros Tassiulas, and Noa Zilberman
ACM CoNEXT'23 & Proceedings of the ACM on Networking, 2023 [Acceptance Rate: 24/129 = 18.6%]
Paper | BibTex | Code
Research has focused on enabling on-device functionality, with limited consideration given to distributed in-network computing. This paper explores the applicability of distributed computing to in-network computing and presents DINC, a framework that enables distributed in-network computing by generating deployment strategies, overcoming resource constraints, and providing functionality guarantees across a network.
|
|
Accepted "QCMP: Load Balancing via In-network Reinforcement Learning", 2023
Changgang Zheng, Benjamin Rienecker, and Noa Zilberman
Proceedings of the ACM SIGCOMM Workshop on Future of Internet Routing & Addressing, 2023
Paper | BibTex | Code
Traffic load balancing is a long-standing networking challenge. This work presents QCMP, a reinforcement-learning-based load balancing solution implemented within the data plane, providing dynamic policy adjustment and quick response to changes in traffic. Our results show that QCMP requires negligible resources, runs at line rate, and adapts quickly to changes in traffic patterns.
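For intuition only, below is a minimal Python sketch of tabular Q-learning applied to choosing among equal-cost paths. It is not QCMP's data-plane implementation; the path count, the load-based reward, and the parameter values are illustrative assumptions.

# Minimal illustrative sketch (not QCMP itself): tabular Q-learning over
# candidate paths, with hypothetical names and values throughout.
# The reward is assumed to be the negative of the observed path load.
import random

NUM_PATHS = 4            # hypothetical number of candidate paths
ALPHA, GAMMA = 0.5, 0.9  # learning rate and discount factor (illustrative)
EPSILON = 0.1            # exploration probability

q_table = [0.0] * NUM_PATHS  # single state: one Q-value per candidate path

def select_path():
    """Epsilon-greedy path selection over the Q-table."""
    if random.random() < EPSILON:
        return random.randrange(NUM_PATHS)
    return max(range(NUM_PATHS), key=lambda p: q_table[p])

def update(path, observed_load):
    """Q-learning update after observing the load on the chosen path."""
    reward = -observed_load                  # lower load gives higher reward
    best_next = max(q_table)                 # greedy estimate of future value
    q_table[path] += ALPHA * (reward + GAMMA * best_next - q_table[path])

# Example control loop: pick a path, observe its load, update the Q-values.
for step in range(100):
    path = select_path()
    load = random.uniform(0.0, 1.0)          # placeholder for a real measurement
    update(path, load)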
arXiv "Automating In-Network Machine Learning", 2022
Changgang Zheng, Mingyuan Zang, Xinpeng Hong, Riyad Bensoussane, Shay Vargaftik, Yaniv Ben-Itzhak, and Noa Zilberman
arXiv, 2022
Paper | BibTex | Code
Using programmable network devices for in-network machine learning has been the focus of significant research. This work presents Planter, an open-source, modular framework for mapping trained machine learning models to programmable devices. Planter supports a wide range of machine learning models and multiple targets, and can be easily extended.
arXiv "IIsy: Practical In-Network Classification", 2022
Changgang Zheng, Zhaoqi Xiong, Thanh T Bui, Siim Kaupmees, Riyad Bensoussane, Antoine Bernabeu, Shay Vargaftik, Yaniv Ben-Itzhak, and Noa Zilberman
arXiv, 2022
Paper | BibTex
This work presents IIsy, which implements machine learning classification models in a hybrid fashion using off-the-shelf network devices. Beyond a range of traditional and ensemble machine learning models, IIsy supports hybrid classification, achieving near-optimal classification results while significantly reducing latency and server load.
Published "Planter: Seeding Trees Within Switches", 2021
Changgang Zheng and Noa Zilberman
Proceedings of the SIGCOMM'21 Poster and Demo Sessions
Paper | Short Video | Slides | BibTex | Poster | Code
Data classification within the network significantly benefits reaction time, server offload, and power efficiency. This work designs an algorithm for efficient mapping of ensemble models, such as XGBoost and Random Forest, to programmable switches. The proposed method overlaps trees within match-action tables, achieving high accuracy and low resource overhead.
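For illustration only, the following Python sketch shows the general idea of encoding tree splits as match-action table entries: per-feature tables map value ranges to codes, and a decision table maps code combinations to a class label. The features, thresholds, and labels are hypothetical, and this is not Planter's actual mapping algorithm.

# Hypothetical sketch of encoding tree splits as match-action table entries.
# Illustrative split thresholds learned by a tree for two features.
thresholds = {
    "pkt_len":  [64, 512],   # splits the feature into 3 ranges -> codes 0..2
    "proto_id": [6],         # splits the feature into 2 ranges -> codes 0..1
}

def feature_table(name, splits, max_value=65535):
    """Build range-match entries (lo, hi) -> code for one feature table."""
    bounds = [0] + [s + 1 for s in splits] + [max_value + 1]
    return [((bounds[i], bounds[i + 1] - 1), i) for i in range(len(bounds) - 1)]

# One match-action table per feature: value range -> encoded branch.
tables = {name: feature_table(name, s) for name, s in thresholds.items()}

# A decision table then maps a tuple of per-feature codes to a predicted class.
decision_table = {
    (0, 0): "benign", (0, 1): "benign",
    (1, 0): "benign", (1, 1): "attack",
    (2, 0): "attack", (2, 1): "attack",
}

def classify(pkt_len, proto_id):
    """Look up each feature's code, then the decision table, as a switch would."""
    codes = []
    for name, value in [("pkt_len", pkt_len), ("proto_id", proto_id)]:
        codes.append(next(c for (lo, hi), c in tables[name] if lo <= value <= hi))
    return decision_table[tuple(codes)]

print(classify(pkt_len=1500, proto_id=6))  # -> "attack" under these toy entries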