Date: Mar 26, 2021

Time: 11:00 - 12:00

Location: IB 2028

DSRC Seminar | Ensemble Reduction and Learning for Resource-constrained Computing

Generic tree ensembles (such as Random Forest, RF) rely on a substantial number of individual models to attain desirable performance. The cost of maintaining a large ensemble can become prohibitive in applications where computing resources are tightly constrained.
In this talk, I will introduce a hierarchical ensemble reduction and learning framework, which consistently outperforms RF in terms of both accuracy and retained ensemble size. In other words, ensemble reduction is achieved with an enhancement in accuracy rather than a degradation. Boolean logic encoding techniques are developed to directly tackle multiclass problems. Moreover, the framework contains a novel conversion paradigm that supports the automatic deployment of >500 trees on a chip. Compared with RF, the proposed method significantly reduces power consumption and overall area utilization. The hierarchical approach provides rich opportunities to balance computation (training and response time), hardware resources (memory and energy), and accuracy.
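To make the general idea of ensemble reduction concrete (this is a generic illustration, not the speaker's framework): one simple baseline is to train a large pool of weak classifiers and then greedily keep only those members that improve accuracy on a held-out validation set. The sketch below uses hypothetical 1-D threshold "stumps" on synthetic data; all names and the selection rule are illustrative assumptions.

```python
import random

# Toy illustration of ensemble reduction (NOT the speaker's method):
# train a pool of weak "stump" classifiers, then greedily select a
# small subset that maximizes validation accuracy.

random.seed(0)

def make_data(n, d=5):
    # Synthetic binary task: label depends on the first two features.
    X = [[random.random() for _ in range(d)] for _ in range(n)]
    y = [1 if x[0] + x[1] > 1.0 else 0 for x in X]
    return X, y

def train_stump(X, y):
    # A stump thresholds one random feature; flip its polarity if that
    # fits the training labels better.
    f = random.randrange(len(X[0]))
    t = random.random()
    agree = sum(1 for x, yi in zip(X, y) if (x[f] > t) == (yi == 1))
    return (f, t, agree < len(y) / 2)

def predict_stump(stump, x):
    f, t, flip = stump
    v = 1 if x[f] > t else 0
    return 1 - v if flip else v

def predict_ensemble(stumps, x):
    # Majority vote over the ensemble members.
    votes = sum(predict_stump(s, x) for s in stumps)
    return 1 if 2 * votes >= len(stumps) else 0

def accuracy(stumps, X, y):
    return sum(predict_ensemble(stumps, x) == yi
               for x, yi in zip(X, y)) / len(y)

Xtr, ytr = make_data(200)
Xva, yva = make_data(100)
full = [train_stump(Xtr, ytr) for _ in range(50)]

# Greedy forward selection: repeatedly add whichever remaining stump
# most improves validation accuracy; stop when nothing helps.
reduced, best = [], 0.0
improved = True
while improved:
    improved = False
    for s in full:
        if s in reduced:
            continue
        acc = accuracy(reduced + [s], Xva, yva)
        if acc > best:
            best, pick = acc, s
            improved = True
    if improved:
        reduced.append(pick)

print(len(full), len(reduced))  # the reduced ensemble is far smaller
print(accuracy(full, Xva, yva), best)
```

The point of the sketch is the trade-off the talk addresses: the reduced ensemble needs far fewer members (and hence less memory and energy on hardware), and selection against a validation set can hold or even improve accuracy. The hierarchical framework in the talk goes well beyond this greedy baseline.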

Speaker Bio:
Hongfei Wang received the M.S. degree in computer engineering and Ph.D. degree in computer science from Carnegie Mellon University (CMU), Pittsburgh, PA, USA, and Huazhong University of Science and Technology (HUST), Wuhan, China.
He is currently an Associate Professor with the School of Cyber Science and Engineering, HUST. Before joining HUST, he worked with the Intel Research Lab (Pittsburgh, PA) and the Advanced Chip Test Laboratory at CMU, USA. He has over a decade of experience in VLSI test and design in both academia and industry. His current research interests include statistical optimization of test and diagnosis solutions for digital systems, hardware design for machine learning systems, and EDA methods for variability and reliability.