Hello everyone,

Next Monday (2023/11/06) at noon, Jianxin Zhang will be presenting in EECS room 2311. Please fill out the food form before attending so we can buy enough pizza for everyone.

If you have research to share, please volunteer to present using this link. No one is currently scheduled for 2023/11/20 (the next seminar). As a token of gratitude, presenters get to choose a customized meal from a selection of local restaurants, as listed here.

All seminar info is available on the SPEECS website, and a Google calendar link with dates/times/presenters can be found here. If you have any questions, you can contact Zongyu Li or me directly, or email speecs.seminar-requests@umich.edu. Suggestions are always welcome :)

Speaker: Jianxin Zhang

Topic: Label Embedding via Low-Coherence Matrices

Abstract: Label embedding is a framework for multiclass classification problems where each label is represented by a distinct vector of some fixed dimension, and training involves matching model output to the vector representing the correct label. While label embedding has been successfully applied in extreme classification and zero-shot learning, and offers both computational and statistical advantages, its theoretical foundations remain poorly understood. This work presents an analysis of label embedding in the context of extreme multiclass classification, where the number of classes \(C\) is very large. We present an excess risk bound that reveals a trade-off between computational and statistical efficiency, quantified via the coherence of the embedding matrix. We further show that under the Massart noise condition, the statistical penalty for label embedding vanishes with sufficiently low coherence. Our analysis supports an algorithm that is simple, scalable, and easily parallelizable, and experimental results demonstrate its effectiveness in large-scale applications.
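For anyone curious, here is a minimal sketch (not from the talk materials) of the label-embedding setup described in the abstract: each class is assigned a fixed low-dimensional vector, the model is trained to match the vector of the correct label, and prediction picks the class whose vector is closest to the model output. A random Gaussian matrix with normalized rows is used here as a stand-in for a low-coherence embedding matrix; all sizes, the linear model, and the toy data are made up for illustration.

```python
import numpy as np

# Hypothetical sizes: C classes embedded into d << C dimensions.
C, d, n_features, n_samples = 1_000, 64, 300, 5_000

rng = np.random.default_rng(0)

# Row i of G is the embedding of class i. A random Gaussian matrix with
# unit-norm rows has low coherence with high probability (stand-in for
# the low-coherence matrices analyzed in the paper).
G = rng.standard_normal((C, d))
G /= np.linalg.norm(G, axis=1, keepdims=True)

# Toy data and a toy linear model: fit model output to the embedding
# of the true label by least squares (regression onto G[y]).
X = rng.standard_normal((n_samples, n_features))
y = rng.integers(0, C, size=n_samples)
W, *_ = np.linalg.lstsq(X, G[y], rcond=None)

# Predict by choosing the class whose embedding has the largest inner
# product with the model output (nearest embedding for unit-norm rows).
scores = (X @ W) @ G.T
y_hat = scores.argmax(axis=1)
```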

Supplementary link: https://arxiv.org/abs/2305.19470

Mirror: http://websites.umich.edu/~speecsseminar/presentations/20231106/

Thanks,

Matt Raymond