I am an Assistant Professor in the Department of Electrical and Computer Engineering at the University of California, Riverside, and a cooperating faculty member in the Department of Computer Science and Engineering.
My research focuses on distributed and federated learning, and on secure, private, and trustworthy information processing in large-scale networks.
Prospective Students: I am always interested in working with highly motivated students who want to pursue research at the intersection of information theory, distributed/federated machine learning, and private/secure computing in wireless networks. Our research typically involves both theoretical and implementation components, which requires a strong mathematical background. If you are interested in working in our lab, please apply to the PhD programs of the ECE or CSE Departments and send me an e-mail with the title "Prospective Student" along with your CV and research interests. Students who have majored in math with no prior engineering experience are also encouraged to apply.
September 2022: Our group has received a UCR OASIS Funding Award to support our research.
August 2022: Serving as a track co-chair for the ML and Optimization for Wireless Systems track at WCNC 2023, submission deadline Sept 12, 2022!
August 2022: Our paper Communication-Efficient Secure Aggregation for Federated Learning has been accepted to Globecom 2022.
July 2022: Serving as a panelist at the North American School of Information Theory.
June 2022: Organizing an AI workshop for high school students at UC Riverside in July 2022, in collaboration with the Redlands Unified School District.
Apr 2022: Seminar at UCLA.
Feb 2022: Received an NSF CAREER Award!
Jan 2022: Our paper Over-the-Air Clustered Federated Learning has been accepted to the International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2022). Congratulations Hasin!
Dec 2021: New paper on privacy-preserving federated learning with Irem and Hasin! TL;DR: We introduce a sparse communication framework for speeding up secure aggregation. Preliminary version to appear at the 2022 ICLR Workshop on Socially Responsible Machine Learning.
Oct 2021: New paper on asynchronous federated learning! Preliminary version to appear at the NeurIPS Workshop on Federated Learning: New Challenges on Privacy, Fairness, Robustness, Personalization and Data Ownership.
Sep 2021: Serving as a TPC member for the IEEE International Symposium on Information Theory (ISIT 2022), submission deadline Jan 15, 2022!
Aug 2021: Serving as a TPC member for the IEEE Wireless Communications and Networking Conference (WCNC 2022), submission deadline Oct 15, 2021!
July 2021: Received the UCR Regents' Faculty Fellowship!
July 2021: New paper on sustainable/green federated learning: A Framework for Sustainable Federated Learning, at the International Symposium on Modeling and Optimization in Mobile, Ad hoc, and Wireless Networks (WiOpt)!
June 2021: New paper on privacy-preserving federated learning! Securing Secure Aggregation: Mitigating Multi-Round Privacy Leakage in Federated Learning, preliminary version accepted at FL-AAAI Workshop 2022.
May 2021: Research highlight from the ECE Department: Privacy-Aware Large-Scale Machine Learning
April 2021: Our paper on sustainable/green machine learning, Energy Harvesting Distributed Machine Learning, has been accepted to the IEEE International Symposium on Information Theory!
January 2021: Our paper Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning has been accepted to the IEEE Journal on Selected Areas in Information Theory: Privacy and Security of Information Systems!
January 2021: Our paper CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning has been accepted to the IEEE Journal on Selected Areas in Information Theory: Privacy and Security of Information Systems!
October 2020: Our paper BREA: Byzantine-Resilient Secure Federated Learning has been accepted to the IEEE Journal on Selected Areas in Communications: Machine Learning in Communications and Networks!
October 2020: New project on resilient communications and computing in large-scale networks!
September 2020: Our paper A Scalable Approach for Privacy-Preserving Collaborative Machine Learning has been accepted to the Conference on Neural Information Processing Systems (NeurIPS 2020)!
July 2020: New paper on Byzantine-robust privacy-preserving federated learning: BREA: Byzantine-Resilient Secure Federated Learning
May 2020: New paper on communication-efficient secure federated learning: Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning
May 2020: New paper on distributed graph processing: TACC: Topology-Aware Coded Computing for Distributed Graph Processing
April 2020: New paper on privacy-aware distributed machine learning: CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning
I can be reached at bguler at ece dot ucr dot edu