I am an Assistant Researcher at the College of Computer Science & Software Engineering, Hohai University. I obtained my Ph.D. from the Department of Computer Science & Technology at Nanjing University in Dec. 2022, where I was very fortunate to be advised by Prof. Zhi-Hua Zhou. Before that, I received my B.Sc. from the Department of Statistics at the University of Science and Technology of China in Jun. 2017.

My research interests include ensemble learning and learning theory.

[Resume] & [Resume (Chinese)]


🔥 News

  • Recruiting Students: I am looking for self-motivated M.Sc./Ph.D. students to work on Artificial Intelligence.
    Feel free to send me an email with your resume and a brief statement of your research motivation.
  • 2024.05:  🎉🎉 Our paper “Confidence-aware Contrastive Learning for Selective Classification” was accepted by the CCF-A international conference ICML 2024.
  • 2023.12:  🎉🎉 My doctoral dissertation “Research on Theoretical Analysis of Deep Forests and Extensions” was recognized as an Excellent Doctoral Dissertation by the Jiangsu Artificial Intelligence Society.
  • 2022.12:  🎉🎉 Our paper “Depth is More Powerful than Width with Prediction Concatenation in Deep Forests” was accepted by the CCF-A international conference NeurIPS 2022 as an Oral Presentation.
  • 2019.12:  🎉🎉 Our paper “A Refined Margin Distribution Analysis for Forest Representation Learning” was accepted by the CCF-A international conference NeurIPS 2019.

👨‍💻 Students

M.Eng. Students
2023: [Tian-Shuang Wu 吴填双], [Ning Chen 陈宁]

Collected Seminars
[ML&DM Seminar]: Seminar on Machine Learning and Data Mining for my students.
[FAI Seminar]: International Seminar on Foundational Artificial Intelligence.

📝 Publications

ICML 2024

  • [ICML 2024] Confidence-aware Contrastive Learning for Selective Classification. [paper] [code] [bib]
    Yu-Chang Wu, Shen-Huan Lyu, Haopu Shang, Xiangyu Wang, and Chao Qian.
    In: Proceedings of the 41st International Conference on Machine Learning, in press, 2024.
  • [IWQoS 2024] Identifying Key Tag Distribution in Large-Scale RFID Systems. [paper] [code] [bib]
    Yanyan Wang, Jia Liu, Shen-Huan Lyu, Zhihao Qu, Bin Tang, and Baoliu Ye.
    In: IEEE/ACM 32nd International Symposium on Quality of Service, in press, 2024.

  • [TKDD 2024] Interpreting Deep Forest through Feature Contribution and MDI Feature Importance. [paper] [code] [bib]
    Yi-Xiao He, Shen-Huan Lyu, and Yuan Jiang.
    ACM Transactions on Knowledge Discovery from Data, in press, 2024.

  • [JOS 2024] Interaction Representations Based Deep Forest Method in Multi-Label Learning. [paper] [code] [bib]
    Shen-Huan Lyu, Yi-He Chen, and Yuan Jiang.
    Journal of Software, 35(4):1934-1944, 2024.

  • [AISTATS 2023] On the Consistency Rate of Decision Tree Learning Algorithms. [paper] [code] [bib]
    Qin-Cheng Zheng, Shen-Huan Lyu, Shao-Qun Zhang, Yuan Jiang, and Zhi-Hua Zhou.
    In: Proceedings of the 26th International Conference on Artificial Intelligence and Statistics, pp. 7824-7848, Valencia, ES, 2023.

NeurIPS 2022

  • [NeurIPS 2022 Oral] Depth is More Powerful than Width with Prediction Concatenation in Deep Forests. [paper] [code] [bib]
    Shen-Huan Lyu, Yi-Xiao He, and Zhi-Hua Zhou.
    In: Advances in Neural Information Processing Systems 35, pp. 29719-29732, New Orleans, Louisiana, US, 2022.
  • [NN 2022] Improving Generalization of Neural Networks by Leveraging Margin Distribution. [paper] [code] [bib]
    Shen-Huan Lyu, Lu Wang, and Zhi-Hua Zhou.
    Neural Networks, 151:48-60, 2022.

  • [CJE 2022] A Region-Based Analysis for the Feature Concatenation in Deep Forests. [paper] [code] [bib]
    Shen-Huan Lyu, Yi-He Chen, and Zhi-Hua Zhou.
    Chinese Journal of Electronics, 31(6):1072-1080, 2022.

  • [ICDM 2021] Improving Deep Forest by Exploiting High-Order Interactions. [paper] [code] [bib]
    Yi-He Chen*, Shen-Huan Lyu*, and Yuan Jiang (* indicates equal contribution).
    In: Proceedings of the 21st IEEE International Conference on Data Mining, pp. 1030-1035, Auckland, NZ, 2021.

NeurIPS 2019

  • [NeurIPS 2019] A Refined Margin Distribution Analysis for Forest Representation Learning. [paper] [code] [bib]
    Shen-Huan Lyu, Liang Yang, and Zhi-Hua Zhou.
    In: Advances in Neural Information Processing Systems 32, pp. 5531-5541, Vancouver, British Columbia, CA, 2019.

🎖 Honors and Awards

  • 2023.12 Excellent Doctoral Dissertation of the Jiangsu Artificial Intelligence Society, Jiangsu.
  • 2023.06 The 5th Special Funding from the China Postdoctoral Science Foundation, China.
  • 2022.12 Jiangsu Excellent Postdoctoral Program, Jiangsu.
  • 2019.10 Artificial Intelligence Scholarship at Nanjing University, Nanjing.
  • 2019.09 The First-Class Academic Scholarship at Nanjing University, Nanjing.
  • 2018.09 The First-Class Academic Scholarship at Nanjing University, Nanjing.
  • 2017.09 Presidential Special Scholarship for First-Year Ph.D. Students at Nanjing University, Nanjing.
  • 2016.09 The Silver Prize Scholarship for Excellent Students at the University of Science and Technology of China, Hefei.

✨ Academic Service

Senior Program Committee Member of Conferences:

  • IJCAI: 2021

Program Committee Member of Conferences:

  • ICML: 2021-2024
  • NeurIPS: 2020-2023
  • AAAI: 2020, 2021, 2023
  • IJCAI: 2020, 2022-2024
  • ICLR: 2020-2023

Reviewer for Journals:

  • Artificial Intelligence (AIJ)
  • IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)
  • IEEE Transactions on Knowledge and Data Engineering (TKDE)
  • IEEE Transactions on Neural Networks and Learning Systems (TNNLS)
  • ACM Transactions on Knowledge Discovery from Data (TKDD)
  • Machine Learning (MLJ)
  • Research
  • Chinese Journal of Electronics (CJE)
  • Journal of Software (JOS)

📖 Education

  • 2017.09 - 2022.12, Ph.D. in Computer Science, Nanjing University (NJU)
  • 2013.09 - 2017.06, B.Sc. in Statistics, University of Science and Technology of China (USTC)

💬 Invited Talks

  • 2023.11, Deep Forest, Z-Park National Laboratory, Beijing.
  • 2022.12, Depth is More Powerful than Width, New Orleans Convention Center, Online.
  • 2022.01, Margin Distribution Neural Networks, Huawei Noah’s Ark Lab, Online.

💻 Experience