Department of Data Science
Ying Wu College of Computing
New Jersey Institute of Technology
akshay [dot] rangamani [at] njit [dot] edu
I am looking to hire one PhD student for the Fall 2025 cycle. Reach out to me if you are interested in the science of deep learning. Keep in mind that I cannot respond to every email, but I will read them all.
I am organizing a Deep Learning Reading Group in Spring 2025. Reach out to me if you would like to join!
NJIT UG/MS Students - fill out this form if you're interested in working with me.
Recent Topics: Identifying sparse and low rank structures in deep networks (such as Neural Collapse).
I am broadly interested in the science of deep learning, using theoretical and empirical approaches to understand and explain its phenomena. I have also worked on problems in sparse and low rank signal processing, with applications to computer vision, biomedical signal processing, and online social networks, and I dabble in computational neuroscience.
I am an Assistant Professor in the Department of Data Science at NJIT. From 2020 to 2023, I was a postdoc at the Center for Brains, Minds and Machines at MIT, working with Prof. Tomaso Poggio. I received my PhD in Electrical and Computer Engineering from Johns Hopkins University, where I was a member of the Digital Signal Processing Laboratory, advised by Prof. Trac D. Tran. Before coming to JHU, I graduated with a B.Tech in Electrical Engineering from IIT Madras.
DS677 - Deep Learning (Spring 2025; Spring and Fall 2024)
Year | Publication |
---|---|
2025 | Low Rank and Sparse Fourier Structure in Recurrent Networks Trained on Modular Addition, ICASSP 2025 |
2025 | On Generalization Bounds for Neural Networks with Low Rank Layers, ALT 2025, with Andrea Pinto and Tomaso Poggio |
2023 | Feature Learning in Deep Classifiers through Intermediate Neural Collapse, ICML 2023, with Marius Lindegaard, Tomer Galanti, and Tomaso Poggio |
2023 | Dynamics in Deep Classifiers Trained with the Square Loss: Normalization, Low Rank, Neural Collapse, and Generalization Bounds, Research, with Mengjia Xu, Tomer Galanti, Qianli Liao, and Tomaso Poggio |
2023 | For Interpolating Kernel Machines, Minimizing the Norm of the ERM Solution Maximizes Stability [pdf], Analysis and Applications 20th Anniversary Special Issue, with Lorenzo Rosasco and Tomaso Poggio |
2022 | Neural Collapse in Deep Homogeneous Classifiers and the Role of Weight Decay, IEEE ICASSP 2022, with Andrzej Banburski-Fahey |
2021 | A Scale Invariant Flatness Measure for Deep Network Minima, IEEE ICASSP 2021, with Nam H. Nguyen, Abhishek Kumar, Dzung Phan, Sang H. Chin, and Trac D. Tran |
2019 | Target Tracking and Classification Using Compressive Sensing Camera for SWIR Videos, Signal, Image and Video Processing, with Chiman Kwan, Bryan Chou, Jonathan Yang, Trac D. Tran, Jack Zhang, and Ralph Etienne-Cummings |
2018 | Reconstruction-free Deep Convolutional Neural Networks for Partially Observed Images, IEEE GlobalSIP 2018, with Arun Nair, Luoluo Liu, Sang H. Chin, Muyinatu A. Lediju Bell, and Trac D. Tran |
2018 | ChieF: A Change Pattern Based Interpretable Failure Analyzer, IEEE Big Data 2018, with Dhaval Patel, Lam Nguyen, Shrey Srivastava, and Jayant Kalagnanam |
2018 | Sparse Coding and Autoencoders (arXiv) [pdf], IEEE ISIT 2018, with Anirbit Mukherjee, Amitabh Basu, Trac D. Tran, and Sang H. Chin |
2018 | A Greedy Pursuit Algorithm for Separating Signals from Nonlinear Compressive Observations, IEEE ICASSP 2018, with Dung Tran, Trac D. Tran, and Sang H. Chin |
2016 | Predicting Local Field Potentials with Recurrent Neural Networks, IEEE EMBC 2016, with Louis Kim, Jacob Harer, Sang H. Chin, et al. |
2015 | Targeted Dot Product Representation for Friend Recommendation in Online Social Networks, IEEE/ACM ASONAM 2015, with Minh Dao, Nam P. Nguyen, Trac D. Tran, and Sang H. Chin |