Thesis defences

PhD Oral Exam - Babak Molaei, Electrical and Computer Engineering

Rectifying the Implementation Challenges of a Novel Detection Architecture Aiming to Achieve 360° View for FMCW Automotive Radar


Date & time: Tuesday, April 29, 2025, 1 p.m. – 4 p.m.
Cost: This event is free
Organization: School of Graduate Studies
Contact: Dolly Grewal
Accessible location: Yes

When studying for a doctoral degree (PhD), candidates submit a thesis that provides a critical review of the current state of knowledge of the thesis subject as well as the student’s own contributions to the subject. The distinguishing criterion of doctoral graduate research is a significant and original contribution to knowledge.

Once accepted, the candidate presents the thesis orally. This oral exam is open to the public.

Abstract

Over the past several years, driver-assistance and autonomous self-driving technologies have been among the most intensely pursued topics in academia and industry. In simple terms, these technologies are all about highly accurate sensing and tracking of the surrounding area up to a certain distance, creating a near real-time 3D/4D view around a moving vehicle to be used for purposes such as self-navigation, safety assurance, collision avoidance, lane keep/change assist, and overspeed/underspeed control.

Although this crucial technology may appear to be in good standing, a closer look at the details suggests otherwise. Because car manufacturers employ many single-purpose detection radars and sensors, each designed for one specific task, these mostly optional modules remain so expensive that most people cannot afford them.

It is therefore pragmatic to combine these features and functionalities into a single custom-designed 360° radar at a much more affordable price. Through study and investigation, a solution is found for every system limitation and challenge involved in creating a uniform 360° virtual-view radar for decision-making purposes. More specifically, one omnidirectional illuminator transmits a particular signal over the whole area around the vehicle, and seven receivers (three in the front, two in the rear, and one on each lateral side) capture the reflections. These echoes are processed and compared to extract each object's class (pedestrian, car, tree, guardrail, etc.) using machine learning algorithms, together with its distance and angular location, which are the inputs required for the functionalities above.
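
As a rough, hypothetical illustration of how distance can be recovered from such echoes in an FMCW radar (the chirp parameters below are illustrative assumptions, not values from the thesis), the de-chirped echo of a single target produces a beat tone whose frequency is proportional to the target's range:

```python
import numpy as np

# A rough FMCW range-from-beat-frequency sketch. The chirp parameters below
# (1 GHz sweep over 50 us, 20 MHz beat-signal sampling) are illustrative
# assumptions, not values from the thesis.
c = 3e8              # speed of light (m/s)
B = 1e9              # chirp bandwidth (Hz)
T_chirp = 50e-6      # chirp duration (s)
fs = 20e6            # sample rate of the de-chirped (beat) signal (Hz)
slope = B / T_chirp  # chirp slope (Hz/s)

# Simulate the real-valued beat tone produced by one point target at 30 m.
R_true = 30.0
f_beat = 2 * R_true * slope / c            # beat frequency for that range
t = np.arange(int(T_chirp * fs)) / fs
beat = np.cos(2 * np.pi * f_beat * t)

# Range is read off the peak of the windowed beat-signal spectrum.
spectrum = np.abs(np.fft.rfft(beat * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
R_est = freqs[np.argmax(spectrum)] * c / (2 * slope)
print(f"estimated range: {R_est:.2f} m")   # close to 30 m (resolution c/2B = 0.15 m)
```

Speed and angular location would follow from the Doppler shift and from phase comparison across receivers, which the seven-receiver layout described above is arranged to support.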

Employing one transceiver system with a single integrated processing unit reduces complexity and significantly lowers the price. Furthermore, the interference level decreases noticeably, providing better performance and lower power consumption.

This study addresses the challenges of creating an omnidirectional virtual view for FMCW radar, substantially extending the conventional single-feature, limited-view detection for which FMCW is mostly used. First, a new transceiver architecture for the 360-degree view is introduced, built around a key requirement: any target within the detection range must fall within the beams of at least two receivers, providing the phase-comparison basis needed to extract the object's distance, angle of arrival, amplitude, and speed. Second, a new algorithm based on simple machine learning models is developed to resolve the detection-angle ambiguity with an accuracy of over 85% in a single snapshot, enabling the system to provide high precision and faster detection. Lastly, the required antenna elements for the transmitter and the receivers are designed. The transmitter antenna's beam is shaped to efficiently distribute signal power at different angles in proportion to the coverage distance specified by the system, and the receiver antenna provides a stable fan-shaped beamwidth over the frequency range with a compact footprint.
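
As a minimal sketch of the phase-comparison idea described above (the carrier frequency and receiver spacing are hypothetical assumptions, and the thesis's machine-learning ambiguity-resolution step is not reproduced), the wrapped phase difference measured between two receivers maps to several candidate angles of arrival whenever the spacing exceeds half a wavelength, which is the kind of detection-angle ambiguity the proposed algorithm resolves:

```python
import numpy as np

# Two-receiver phase-comparison angle-of-arrival (AoA) sketch. The 77 GHz
# carrier and 1.5-wavelength spacing are hypothetical, not from the thesis.
c = 3e8
f_carrier = 77e9
lam = c / f_carrier
d = 1.5 * lam          # spacing wider than lambda/2 -> ambiguous AoA

theta_true = np.deg2rad(20.0)
# Phase difference between the two receivers for a far-field target,
# wrapped to (-pi, pi] as a real receiver would measure it.
dphi = np.angle(np.exp(1j * 2 * np.pi * d * np.sin(theta_true) / lam))

# Every integer number of 2*pi wraps gives a geometrically valid angle;
# choosing among these candidates is the ambiguity-resolution task that the
# abstract assigns to a machine-learning step (not reproduced here).
candidates = []
for k in range(-3, 4):
    s = (dphi + 2 * np.pi * k) * lam / (2 * np.pi * d)
    if abs(s) <= 1:
        candidates.append(np.rad2deg(np.arcsin(s)))
print([round(a, 1) for a in candidates])   # 20.0 appears among several candidates
```

A single snapshot therefore yields multiple geometrically consistent angles; per the abstract, the machine-learning step selects among such candidates with over 85% accuracy.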

