Metawave CEO Dr Maha Achour talks to Electrans about her company’s new radar technology
US company Metawave has raised an additional US$10 million from strategic and financial investors to develop its smart automotive radar system.
Metawave has been developing a new kind of radar, designed for use with self-driving cars, dubbed WARLORD.
“Range, speed and brain, the car needs to have these three elements and these three elements can be achieved by adding a high-resolution radar that is able to detect, track and also classify objects at these long distances, 300 metres and above,” Metawave CEO Dr Maha Achour said in an interview with Electrans.
Autonomous vehicles currently rely on three types of sensors: camera, lidar and radar. Cameras have the highest resolution but also the shortest range, capping out at around 50 metres. Lidar, which uses a pulsed laser to measure distances, has a range of about 150 metres.
Radar, which uses lower-frequency electromagnetic waves, has a lower resolution but a longer range, operating at up to 300 metres.
Reliance on higher-resolution, shorter-range sensors caps the top speed of current self-driving cars. Even factoring in an autonomous vehicle's faster reaction time compared with a human driver, a 50-metre stopping distance limits most vehicles to a top speed of around 96 km/h (60 mph).
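As a rough sanity check on that figure, the highest speed a vehicle can safely travel is the one at which reaction distance plus braking distance still fits inside the sensor's range. The deceleration and reaction-time values below are illustrative assumptions, not figures from Metawave:

```python
import math

def max_safe_speed(sensor_range_m, decel=8.0, reaction_s=0.5):
    """Largest speed v (m/s) such that reaction distance plus braking
    distance fits within the sensor range:
        v * reaction_s + v**2 / (2 * decel) <= sensor_range_m
    Solved as the positive root of the quadratic in v.
    decel ~8 m/s^2 (hard braking) and reaction_s ~0.5 s are assumed values."""
    a, b, c = 1.0 / (2.0 * decel), reaction_s, -sensor_range_m
    return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

v = max_safe_speed(50)           # camera-limited 50 m range
print(round(v * 3.6))            # roughly 90 km/h under these assumptions
```

A 300-metre radar range pushes the same calculation well past motorway speeds, which is the point Dr Achour is making.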
While this is fine for city driving, it’s a major limiting factor on motorways. Testing self-driving lorries is considered a major proving ground for autonomous driving technology, as self-driven long-haul trucks could speed down motorways faster and safer than a human driver could. A more efficient long-range sensor will be needed before this becomes a reality.
Because they use light to detect their surroundings, cameras and lidar are vulnerable to rain, snow and bright sunlight. “Weather is a big issue, and also driving on dirty roads, where you have dirt and mud being accumulated on the apertures of the cameras and lidar. As soon as you block these apertures they’re not going to work. Radar, even if the aperture is blocked with dirt it is still going to operate,” said Dr Achour.
“At these wavelengths, we’re talking about 77 gigahertz, the wavelength is 3.8 mm, so it’s much longer than the 1.5 micron that the lidar uses or the visible light, and that’s why it’s able to operate even at night or when you’re blinded by the sun.”
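The quoted wavelength follows directly from the carrier frequency, since wavelength is the speed of light divided by frequency:

```python
c = 299_792_458.0             # speed of light, m/s
f_radar = 77e9                # automotive radar carrier, Hz
lam = c / f_radar             # wavelength in metres

print(f"{lam * 1e3:.1f} mm")  # ~3.9 mm, versus ~1.5 um quoted for lidar
```

That three-orders-of-magnitude gap between radar and lidar wavelengths is what lets radar pass through conditions that scatter or block optical sensors.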
Dr Achour admits that at short ranges, it is hard to beat a camera’s resolution. “When we get closer, maybe at 50 metres, of course, the camera is the best, because the camera can see everything.
“The question is at 300 metres, do you really want to see the fingers of a person, or just seeing that this is a human being, and not a motorcycle is enough. So, of course, we can never meet the high resolution of cameras or lidar at the short range because these operate on the optical frequencies, but over long range we can provide enough information that will enable all different applications to benefit from these data.”
The shortcomings of relying purely on short-range sensors are something vehicle original equipment manufacturers are starting to realise. “Most of the car OEMs realise that the camera cannot be the primary sensor anymore, that the radar, the high resolution radar with long-range capability will always be the primary sensor on these cars,” said Dr Achour.
She also noted that the current generation of radars is not up to the task. “What prompted us to develop our radar [was that] mainstream radar technologies haven’t been able to meet these kind of milestones in terms of range and high resolution,” she noted.
Current radars use digital beamforming (DBF), which illuminates the scene omnidirectionally. This produces a great deal of background noise, which a computer will find difficult to process quickly and efficiently, making it slow to pick out relevant information.
This makes it hard to increase the resolution of a long-range signal, as any boost in resolution also increases the amount of noise, making it difficult to pick out humans in congested environments.
Metawave’s WARLORD offers a higher-resolution radar by working in analogue mode, steering the beam to increase resolution and improve the signal-to-noise ratio. This helps it recognise objects, feeding the information to the integrated AI, helping the radar learn and recognise objects faster.
“So, with time, say 10 years from now, the radar, by itself, will be smart enough to tell the car with high probability that the objects the radar sees are humans or cars or bridges or so forth.
“Of course from day one it’s not going to happen, you have to build this kind of learning mechanism and then of course you want to make the radar work in harmony with the other sensors because the car will still have cameras [and] will still have lidar, so the sensor fusion itself can feed information back to the radar to make it even smarter,” Dr Achour said.
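The principle behind the analogue beam steering Dr Achour describes is that of a phased array: applying a phase gradient across the antenna elements points the beam in a chosen direction. The sketch below is a generic textbook illustration, not Metawave's design; the element count and spacing are assumptions:

```python
import numpy as np

N, d = 16, 0.5                       # assumed: 16 elements, half-wavelength spacing
steer = np.deg2rad(20)               # desired beam direction, 20 degrees off boresight

# Per-element phase offsets that steer the main lobe to `steer`
n = np.arange(N)
weights = np.exp(-2j * np.pi * d * n * np.sin(steer))

# Array factor: combined response of all elements across look angles
theta = np.deg2rad(np.linspace(-90, 90, 721))
af = np.abs(weights @ np.exp(2j * np.pi * d * np.outer(n, np.sin(theta))))

peak = np.rad2deg(theta[np.argmax(af)])
print(round(peak))                   # main lobe lands at the steered angle
```

Concentrating transmit energy into one narrow, steerable lobe, rather than illuminating the whole scene as digital beamforming does, is what lets the returned signal stand further above the noise floor.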
The new funding round adds five new strategic investors to Metawave’s list of backers.
DENSO, which led the round, alongside Toyota AI Ventures, Hyundai Motor Company and Asahi Glass, now joins original investors Motus Ventures, Khosla Ventures, Autotech Ventures, Bold Capital, SAIC Capital, Western Technology Investment (WTI), and Alrai Capital.
This new wave of funding brings the total investment in the new radar system to US$17 million.
The funding will be used to grow the company to meet demand. Dr Achour said: “We have to increase the size of the company to meet the size of demand of companies interested in our radar.
“We doubled our size just in the past few months and we will triple it by the end of the summer.”
Metawave aims to demonstrate the WARLORD by the end of the year.