Songze Li

Professor, School of Cyber Science and Engineering, Southeast University
Adjunct Assistant Professor, IoT Thrust, Information Hub, The Hong Kong University of Science and Technology (Guangzhou)

Executive Director, Engineering Research Center of Blockchain Application, Supervision and Management (Southeast University), Ministry of Education

Email: songzeli [at] seu [dot] edu [dot] cn, songzeli8824 [at] outlook [dot] com


Dr. Songze Li is a professor at the School of Cyber Science and Engineering, Southeast University. He was previously an assistant professor at the Internet of Things Thrust of HKUST (GZ), and an affiliate assistant professor at the CSE Department of HKUST (CWB). Before that, he worked as a researcher at Stanford University. Dr. Li received his Ph.D. degree from the University of Southern California in 2018, and his B.Sc. degree from New York University in 2011. Dr. Li’s current research focuses on developing secure, scalable, and responsible distributed computing and learning solutions, mainly for machine learning and blockchain. His research has been published at top security and machine learning conferences and journals, including USENIX Security, NeurIPS, ICML, ICLR, TIFS, and TIT. Dr. Li received the Best Paper Award at the NeurIPS-20 Workshop on Scalability, Privacy, and Security in Federated Learning, and was a Qualcomm Innovation Fellowship finalist in 2017.


Research Interests

  • Federated learning security and privacy
  • Security, privacy, and safety of large language and multi-modal models
  • Secure multi-party computation
  • Blockchain security and scalability

Academic Services

  • Secretary - IEEE Information Theory Society Hong Kong Chapter
Guest editor
  • Journal of Surveillance, Security and Safety, special issue on AI Security and Privacy
  • Entropy, special issue on Information Theory for Distributed Systems
Area chair
  • MLSys 2023 Workshop on Resource-Constrained Learning in Wireless Networks (MLSys-RCLWN-23)
Track chair
  • The 8th International Conference on Computer and Communication Systems (ICCCS-23)
TPC member
  • NeurIPS 2024 International Workshop on Federated Foundation Models (FL@FM-NeurIPS 24)
  • IJCAI 2024 International Workshop on Federated Learning in the Age of Foundation Models (FL@FM-IJCAI 24)
  • IEEE International Conference on Distributed Computing Systems (24)
  • ICLR 2024 Workshop on Privacy Regulation and Protection in Machine Learning (PRIVATE ML@ICLR 2024)
  • NeurIPS 2023 Workshop on Federated Learning in the Age of Foundation Models (FL@FM-NeurIPS 2023)
  • KDD 2023 Workshop on Federated Learning for Distributed Data Mining (FL4Data-Mining-23)
  • ICML 2023 Workshop on Federated Learning and Analytics in Practice (FL-ICML-23)
  • SIGIR 2023 1st Workshop on Federated Learning for Information Retrieval (FLIRT@SIGIR 2023)
  • IJCAI International Workshop on Trustworthy Federated Learning (FL-IJCAI-23)
  • NeurIPS 2022 International Workshop on Federated Learning: Recent Advances and New Challenges (FL-NeurIPS-22)
  • International Conference on Wireless Communications and Signal Processing (WCSP-22)
  • IEEE Region 10 Conference (TENCON-22)
  • IEEE Journal on Selected Areas in Communications, Special Issue on Communication-Efficient Distributed Learning over Networks (JSAC-CEDL-22)
  • IJCAI International Workshop on Trustworthy Federated Learning (FL-IJCAI-22)
  • AAAI International Workshop on Trustable, Verifiable and Auditable Federated Learning (FL-AAAI-22)
  • AAAI Conference on Artificial Intelligence (22, 24, 25)
  • ICML International Workshop on Federated Learning for User Privacy and Data Confidentiality (FL-ICML-21)
  • MobiCom Technologies for the Wireless Edge Workshop (EdgeTech-MobiCom-18)
Journal reviewer (selected)
  • IEEE Transactions on Information Theory
  • IEEE Journal on Selected Areas in Information Theory
  • IEEE Transactions on Information Forensics and Security
  • IEEE Transactions on Dependable and Secure Computing
  • IEEE Transactions on Signal Processing
  • IEEE Journal on Selected Areas in Communications
  • IEEE Transactions on Communications
  • IEEE Transactions on Wireless Communications
  • IEEE Transactions on Knowledge and Data Engineering
  • IEEE Communications Letters
Conference reviewer (selected)

NeurIPS, ICML, ICLR, AAAI, ISIT, ICASSP.


Publications

See the complete list on my Google Scholar page.

Journal Papers and Preprints

[1].H. Hu, Y. Wu, Y. Shi, S. Li, C. Jiang, and W. Zhang, “Communication-Efficient Coded Computing for Distributed Multi-Task Learning,” IEEE Transactions on Communications, Apr. 2023.

[2].T. Jahani-Nezhad, M. A. Maddah-Ali, S. Li, and G. Caire, “SwiftAgg+: Achieving asymptotically optimal communication loads in secure aggregation for federated learning,” IEEE Journal on Selected Areas in Communications, vol. 41, no. 4, pp. 977–989, Mar. 2023.

[3].J. Zhu, S. Li, and J. Li, “Information-Theoretically Private Matrix Multiplication From MDS-Coded Storage,” IEEE Transactions on Information Forensics and Security, vol. 18, pp. 1680-1695, 2023.

[4].J. Zhu and S. Li, “A Systematic Approach towards Efficient Private Matrix Multiplication,” IEEE Journal on Selected Areas in Information Theory, vol. 3, no. 2, pp. 257-274, June 2022.

[5].J. Zhu, Q. Yan, X. Tang, and S. Li, “Symmetric Private Polynomial Computation From Lagrange Encoding,” IEEE Transactions on Information Theory, vol. 68, no. 4, pp. 2704-2718, Jan. 2022.

[6].A. R. Elkordy, S. Li, M. Maddah-Ali, and A. S. Avestimehr, “Compressed Coded Distributed Computing,” IEEE Transactions on Communications, vol. 69, no. 5, pp. 2773-2783, May 2021.

[7].S. Li and D. Tse, “TaiJi: Longest Chain Availability with BFT Fast Confirmation,” e-print arXiv:2011.11097, Nov. 2020.

[8].S. Li, M. Yu, C. Yang, A. S. Avestimehr, S. Kannan, and P. Viswanath, “PolyShard: Coded Sharding Achieves Linearly Scaling Efficiency and Security Simultaneously,” IEEE Transactions on Information Forensics and Security, vol. 16, pp. 249-261, July 2020.

[9].S. Li, M. Mousavi Kalan, Q. Yu, M. Soltanolkotabi, and A. S. Avestimehr, “Polynomially Coded Regression: Optimal Straggler Mitigation via Data Encoding,” e-print arXiv:1805.09934.

[10].S. Li, M. Maddah-Ali, and A. S. Avestimehr, “Coding for Distributed Fog Computing,” IEEE Communications Magazine, vol. 55, no. 4, pp. 34-40, Apr. 2017.

[11].S. Li, Q. Yu, M. Maddah-Ali, and A. S. Avestimehr, “A Scalable Framework for Wireless Distributed Computing,” IEEE/ACM Transactions on Networking, vol. 25, no. 5, pp. 2643-2654, Oct. 2017.

[12].S. Li, Q. Yu, M. Maddah-Ali, and A. S. Avestimehr, “A Fundamental Tradeoff between Computation and Communication in Distributed Computing,” IEEE Transactions on Information Theory, vol. 64, no. 1, pp. 109-128, Jan. 2018.

[13].S. Li, D. Kao, and A. S. Avestimehr, “Rover-to-Orbiter Communication in Mars: Taking Advantage of the Varying Topology,” IEEE Transactions on Communications, vol. 64, no. 2, Feb. 2016.

Conference Papers

[1].S. Li, D. Yao, and J. Liu, “FedVS: Straggler-Resilient and Privacy-Preserving Vertical Federated Learning for Split Models,” ICML, July 2023.

[2].Y. Dai and S. Li, “Chameleon: Adapting to Peer Images for Planting Durable Backdoors in Federated Learning,” ICML, July 2023.

[3].J. Tang, J. Zhu, S. Li, and L. Sun, “Secure Embedding Aggregation for Federated Representation Learning,” IEEE ISIT, June 2023.

[4].Z. Huang, S. Li, K. Liang, and Y. Wu, “Secure Gradient Aggregation for Wireless Multi-Server Federated Learning,” IEEE ISIT, June 2023.

[5].H. Hu, S. Li, M. Cheng, and Y. Wu, “Coded Distributed Computing for Hierarchical Multi-Task Learning,” IEEE ITW, Apr. 2023.

[6].J. Shao, Y. Sun, S. Li, and J. Zhang, “DReS-FL: Dropout-Resilient Secure Federated Learning for Non-IID Clients via Secret Data Sharing,” NeurIPS, Nov. 2022.

[7].J. Zhu and S. Li, “Generalized Lagrange Coded Computing: A Flexible Computation-Communication Tradeoff,” IEEE ISIT, June 2022.

[8].T. Jahani-Nezhad, M. A. Maddah-Ali, S. Li, and G. Caire, “SwiftAgg: Communication-efficient and dropout-resistant secure aggregation for federated learning with worst-case security guarantees,” IEEE ISIT, June 2022.

[9].Y. Sun, J. Shao, S. Li, Y. Mao, and J. Zhang, “Stochastic coded federated learning with convergence and privacy guarantees,” IEEE ISIT, June 2022.

[10].C. Yang, J. So, C. He, S. Li, Q. Yu, R. Ali, B. Guler, and S. Avestimehr, “LightSecAgg: a Lightweight and Versatile Design for Secure Aggregation in Federated Learning,” Conference on Machine Learning and Systems (MLSys), Aug. 2022.

[11].J. Liang, W. Jiang, and S. Li, “OmniLytics: A Blockchain-based Secure Data Market for Decentralized Machine Learning,” ICML International Workshop on Federated Learning for User Privacy and Data Confidentiality (FL-ICML), July 2021.

[12].C. He, S. Li, J. So, M. Zhang, X. Zeng, H. Wang, X. Wang, P. Vepakomma, A. Singh, H. Qiu, L. Shen, P. Zhao, Y. Kang, Y. Liu, R. Raskar, Q. Yang, M. Annavaram and A. S. Avestimehr, “FedML: A Research Library and Benchmark for Federated Machine Learning,” NeurIPS SpicyFL workshop, Dec. 2020.

[13].S. Li, M. Yu, C. Yang, A. S. Avestimehr, S. Kannan, and P. Viswanath, “PolyShard: Coded Sharding Achieves Linearly Scaling Efficiency and Security Simultaneously,” IEEE ISIT, June 2020.

[14].M. Yu, S. Sahraei, S. Li, A. S. Avestimehr, S. Kannan, and P. Viswanath, “Coded Merkle Tree: Solving Data Availability Attacks in Blockchains,” Financial Cryptography and Data Security, Feb. 2020.

[15].S. Li, S. Sahraei, M. Yu, A. S. Avestimehr, S. Kannan, and P. Viswanath, “Coded State Machine - Scaling State Machine Execution under Byzantine Faults,” PODC, July 2019.

[16].Q. Yu, S. Li, N. Raviv, M. Mousavi Kalan, M. Soltanolkotabi, and A. S. Avestimehr, “Lagrange Coded Computing: Optimal Design for Resiliency, Security, and Privacy,” International Conference on Artificial Intelligence and Statistics (AISTATS 2019), Apr. 2019.

[17].Q. Yu, N. Raviv, S. Li, M. Mousavi Kalan, M. Soltanolkotabi, and A. S. Avestimehr, “Lagrange Coded Computing: Optimal Design for Resiliency, Security, and Privacy,” NeurIPS MLSys workshop, Dec. 2018.

[18].Y. Li, M. Yu, S. Li, A. S. Avestimehr, N. S. Kim, and A. Schwing, “Pipe-SGD: A Decentralized Pipelined SGD Framework for Distributed Deep Net Training,” NIPS, Dec. 2018.

[19].M. Yu, Z. Lin, H. V. Narra, S. Li, Y. Li, N. S. Kim, A. Schwing, M. Annavaram, and A. S. Avestimehr, “GradiVeQ: Vector Quantization for Bandwidth-Efficient Gradient Aggregation in Distributed CNN Training,” NIPS, Dec. 2018.

[20].S. Li, M. Maddah-Ali, and A. S. Avestimehr, “Compressed Coded Distributed Computing,” IEEE ISIT, June 2018.

[21].S. Li, M. Mousavi Kalan, A. S. Avestimehr, and M. Soltanolkotabi, “Near-Optimal Straggler Mitigation for Distributed Gradient Methods,” 7th International Workshop on Parallel and Distributed Computing for Large Scale Machine Learning and Big Data Analytics, May 2018.

[22].S. Li, M. Maddah-Ali, and A. S. Avestimehr, “Architectures for Coded Mobile Edge Computing,” Fog World Congress, Oct. 2017.

[23].S. Li, M. Maddah-Ali, and A. S. Avestimehr, “Communication-Aware Computing for Edge Processing,” IEEE ISIT, June 2017.

[24].S. Li, S. Supittayapornpong, M. Maddah-Ali, and A. S. Avestimehr, “Coded TeraSort,” 6th International Workshop on Parallel and Distributed Computing for Large Scale Machine Learning and Big Data Analytics, May 2017.

[25].Q. Yu, S. Li, M. Maddah-Ali, and A. S. Avestimehr, “How to optimally allocate resources for coded distributed computing?,” IEEE ICC, May 2017.

[26].S. Li, M. Maddah-Ali, and A. S. Avestimehr, “A Unified Coding Framework for Distributed Computing with Straggling Servers,” IEEE NetCod, Dec. 2016.

[27].S. Li, Q. Yu, M. Maddah-Ali, and A. S. Avestimehr, “Edge-Facilitated Wireless Distributed Computing,” IEEE GLOBECOM, Dec. 2016.

[28].S. Li, Q. Yu, M. Maddah-Ali, and A. S. Avestimehr, “Coded Distributed Computing: Fundamental Limits and Practical Challenges,” IEEE Asilomar Conference on Signals, Systems, and Computers, Nov. 2016.

[29].S. Li, Q. Yu, M. Maddah-Ali, and A. S. Avestimehr, “A Scalable Coded Computing Framework for Edge-Facilitated Wireless Distributed Computing,” The First IEEE/ACM Symposium on Edge Computing, Oct. 2016.

[30].S. Li, M. Maddah-Ali, and A. S. Avestimehr, “Coded Distributed Computing: Straggling Servers and Multistage Dataflows,” 54th Annual Allerton Conference, Sept. 2016.

[31].S. Li, M. Maddah-Ali, and A. S. Avestimehr, “Fundamental Tradeoff between Computation and Communication in Distributed Computing,” IEEE ISIT, July 2016.

[32].S. Li, M. Maddah-Ali, and A. S. Avestimehr, “Coded MapReduce,” 53rd Annual Allerton Conference, Sept. 2015.

[33].S. Li, D. Kao, and A. S. Avestimehr, “Rover-to-Orbiter Communication in Mars: Taking Advantage of the Varying Topology,” IEEE ISIT, June 2015.

[34].S. Li, E. Akyol and U. Mitra, “Power allocation for Gaussian multiple access channel with noisy cooperative links,” IEEE ICASSP, May 2014.

[35].S. Li, U. Mitra and A. Pandharipande, “Cooperative spectrum sharing with joint receiver decoding,” IEEE ICASSP, May 2013.

[36].S. Li, U. Mitra, V. Ratnam and A. Pandharipande, “Jointly cooperative decode-and-forward relaying for secondary spectrum access,” CISS, Mar. 2012.