I am an Assistant Professor in the Department of Electrical and Computer Engineering at the University of California, Riverside, and a cooperating faculty member in the Department of Computer Science and Engineering.
My research focuses on developing scalable, privacy-preserving, and context-aware information processing frameworks for large-scale distributed networks.
I am always interested in working with motivated students who would like to pursue research in one or more of the following areas (or at their intersection): information theory, machine learning, privacy and security, wireless networks, distributed computing, and communications.
My research typically involves both theoretical and implementation components and requires a strong mathematical background. If you are interested in working with me, please send me an e-mail with the subject line "Prospective Student," along with your CV and a description of your research interests. Students who majored in math and have no prior experience in engineering are also encouraged to apply.
Oct 2021: New paper on asynchronous federated learning! A preliminary version will appear at the NeurIPS Workshop on Federated Learning: New Challenges on Privacy, Fairness, Robustness, Personalization and Data Ownership.
Sep 2021: Serving as a TPC member for the IEEE International Symposium on Information Theory (ISIT 2022), submission deadline Jan 15, 2022!
Aug 2021: Serving as a TPC member for the IEEE Wireless Communications and Networking Conference (WCNC 2022), submission deadline Oct 15, 2021!
July 2021: Received the UCR Regents' Faculty Fellowship!
July 2021: New paper on sustainable/green federated learning: A Framework for Sustainable Federated Learning, at the International Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks (WiOpt)!
June 2021: New paper on privacy-preserving federated learning: Securing Secure Aggregation: Mitigating Multi-Round Privacy Leakage in Federated Learning
May 2021: Research highlight from the ECE Department: Privacy-Aware Large-Scale Machine Learning!
April 2021: Our paper on sustainable/green machine learning, Energy Harvesting Distributed Machine Learning, has been accepted to the IEEE International Symposium on Information Theory!
January 2021: Our paper Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning has been accepted to the IEEE Journal on Selected Areas in Information Theory: Privacy and Security of Information Systems!
January 2021: Our paper CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning has been accepted to the IEEE Journal on Selected Areas in Information Theory: Privacy and Security of Information Systems!
October 2020: Our paper BREA: Byzantine-Resilient Secure Federated Learning has been accepted to the IEEE Journal on Selected Areas in Communications: Machine Learning in Communications and Networks!
October 2020: New project on resilient communications and computing in large-scale networks!
September 2020: Our paper A Scalable Approach for Privacy-Preserving Collaborative Machine Learning has been accepted to the Conference on Neural Information Processing Systems (NeurIPS 2020)!
July 2020: New paper on Byzantine-robust privacy-preserving federated learning: BREA: Byzantine-Resilient Secure Federated Learning
May 2020: New paper on communication-efficient secure federated learning: Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning
May 2020: New paper on distributed graph processing: TACC: Topology-Aware Coded Computing for Distributed Graph Processing
April 2020: New paper on privacy-aware distributed machine learning: CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning
I can be reached at bguler at ece dot ucr dot edu