News
2023/09 I finished my internship at Babylon Chain. Thanks a lot to David and Sankha for hosting me. My main responsibilities included a user behavior analysis of Bitcoin, an extensive survey of Proof of Stake (PoS) liquid staking models and their related incentive programs, and a thorough tokenomics investigation of emerging blockchains, including Akash.
2023/01 Our paper Parallel Deep Neural Networks Have Zero Duality Gap is accepted as a poster. In short, we show that the convex duality gap for deep neural networks becomes zero when the parallel neural network structure is considered. For a parallel neural network with m branches, the output is a linear combination of the outputs of m standard fully connected neural networks.
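The m-branch parallel structure can be sketched in a few lines of numpy; the dimensions, hidden width, and random weights below are illustrative assumptions, not choices from the paper.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def branch_forward(x, W1, W2):
    # One standard fully connected ReLU network: x -> ReLU(x @ W1) @ W2
    return relu(x @ W1) @ W2

rng = np.random.default_rng(0)
d, h, m, n = 4, 8, 3, 5  # input dim, hidden width, branches, samples (illustrative)
x = rng.standard_normal((n, d))

# m independent branches; the parallel network's output is a linear
# combination (here, a plain sum) of the m branch outputs
branches = [(rng.standard_normal((d, h)), rng.standard_normal((h, 1)))
            for _ in range(m)]
y = sum(branch_forward(x, W1, W2) for W1, W2 in branches)
print(y.shape)  # (5, 1): same shape as a single branch's output
```

Each branch is an ordinary two-layer ReLU network, so the parallel model only widens the architecture rather than deepening it.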
2022/09 Our new paper Overparameterized ReLU Neural Networks Learn the Simplest Models: Neural Isometry and Exact Recovery is available on arXiv, and the code is available on GitHub. We resolve the apparent discrepancy between remarkable generalization and high model complexity from a convex optimization and sparse recovery perspective. Under certain regularity assumptions on the data, we show that ReLU networks with an arbitrary number of parameters learn only simple models that explain the data.
2022/09 Our paper Beyond the Best: Distribution Functional Estimation in Infinite-Armed Bandits is accepted at NeurIPS 2022! We provide the offline and online sample complexity of estimating the mean, quantile, trimmed mean, and maximum from noisy observations. We develop a unified meta-algorithm and prove general information-theoretic lower bounds for both offline and online sampling. Check out the paper, slides, and poster.
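As a toy illustration of the offline functional-estimation setting (not the paper's algorithm): arm means are drawn from a reservoir distribution, each pull is a noisy observation of an arm's mean, and we estimate a functional, here the trimmed mean, from plug-in arm estimates. All constants below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy infinite-armed bandit: arm means drawn from a uniform reservoir
# distribution; each pull returns the arm's mean plus Gaussian noise.
k, pulls = 200, 50                      # arms sampled, noisy pulls per arm
true_means = rng.uniform(0.0, 1.0, k)
obs = true_means[:, None] + 0.1 * rng.standard_normal((k, pulls))
est_means = obs.mean(axis=1)            # plug-in estimate for each arm

def trimmed_mean(v, alpha=0.1):
    # Drop the top and bottom alpha-fraction of values before averaging.
    v = np.sort(v)
    cut = int(alpha * len(v))
    return v[cut:len(v) - cut].mean()

# Functional of the reservoir distribution, estimated from noisy samples
print(trimmed_mean(est_means))
```

For a uniform reservoir on [0, 1] the trimmed mean concentrates near 0.5; the paper characterizes how many arms and pulls such estimates need in general.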