Basak Guler



I am an Assistant Professor in the Department of Electrical and Computer Engineering at the University of California, Riverside, and a cooperating faculty member in the Department of Computer Science and Engineering.

My research focuses on developing scalable, privacy-preserving, and context-aware information processing frameworks for large-scale distributed networks.

Prospective students:

I am always interested in working with motivated students who would like to pursue research in one or more of the following areas, or their intersection: information theory, machine learning, privacy and security, wireless networks, distributed computing, and communications.

My research typically involves both theoretical and implementation components and requires a strong mathematical background. If you are interested in working with me, please send me an e-mail with the subject line "Prospective Student" along with your CV and research interests. Students who have majored in math with no prior engineering experience are also encouraged to apply.

News:

July 2021: Received the UCR Regents' Faculty Fellowship!

July 2021: New paper on sustainable/green federated learning, A Framework for Sustainable Federated Learning, at the International Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks (WiOpt)!

June 2021: New paper on privacy-preserving federated learning, Securing Secure Aggregation: Mitigating Multi-Round Privacy Leakage in Federated Learning!

May 2021: Research highlight from the ECE Department: Privacy-Aware Large-Scale Machine Learning!

April 2021: Our paper on sustainable/green machine learning, Energy Harvesting Distributed Machine Learning, has been accepted to the IEEE International Symposium on Information Theory!

January 2021: Our paper Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning has been accepted to the IEEE Journal on Selected Areas in Information Theory: Privacy and Security of Information Systems!

January 2021: Our paper CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning has been accepted to the IEEE Journal on Selected Areas in Information Theory: Privacy and Security of Information Systems!

October 2020: Our paper BREA: Byzantine-Resilient Secure Federated Learning has been accepted to the IEEE Journal on Selected Areas in Communications: Machine Learning in Communications and Networks!

October 2020: New project on resilient communications and computing in large-scale networks!

September 2020: Our paper A Scalable Approach for Privacy-Preserving Collaborative Machine Learning has been accepted to the Conference on Neural Information Processing Systems (NeurIPS 2020)!

July 2020: New paper on Byzantine-robust privacy-preserving federated learning: BREA: Byzantine-Resilient Secure Federated Learning

May 2020: New paper on communication-efficient secure federated learning: Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning

May 2020: New paper on distributed graph processing: TACC: Topology-Aware Coded Computing for Distributed Graph Processing

April 2020: New paper on privacy-aware distributed machine learning: CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning

Contact:

I can be reached at bguler at ece dot ucr dot edu