What does Saudi Arabia’s autonomous vehicle agenda mean to the rest of the world?

Source is ComputerWeekly.com

Saudi Arabia has set aside $500bn to invest in the economic zone of Neom, a smart city that will be built from scratch and where inhabitants will get around by using fully autonomous vehicles (AVs). While it is certainly a significant project, the systems being developed for the Neom megacity are still in the design phase, with many details either not yet decided or not communicated to the public. 

Leading the effort to develop the AVs in Neom is Syrian-born Nahid Sidki, who spent more than 30 years honing his skills in artificial intelligence (AI) and robotics in the US. Sidki recently left his position as executive director for the robotics centre at Stanford Research Institute to apply his knowledge to projects in the Middle East. 

Now chief technology officer of the Research Products Development Company (RPDC), a Riyadh-based innovation centre, Sidki has assembled an international team for the Neom autonomous vehicle project. The transportation system will function without any human input at all – but only if the vehicles remain in certain places. This kind of vehicle autonomy is classified as level 4 – the second-highest level of autonomy, according to the Society of Automotive Engineers International (SAE). 

“If Saudi Arabia is able to achieve level 4 autonomy, that would indeed be unique in the world,” said Daniel Faggella, head of research at Emerj. “Other autonomous vehicle projects have been run in places like Singapore, Hong Kong and Austin, Texas, but none have reached level 4 autonomy.” 

The Saudi project does have several things going for it that could make it a big success. Neom is being built from the ground up, which makes it easier to implement the most up-to-date infrastructure to support full vehicle automation. The five-year undertaking, still in its initial stages, will benefit from the 5G network that will cover the megacity and provide a medium for vehicle-to-vehicle and vehicle-to-city communications. 

The team will also develop better sensor technology, which it sees as the main obstacle to achieving full automation. Lidar, the sensor technology used by Waymo and many other autonomous vehicle developers, has several shortcomings, including a limited range and poor performance in sandstorms and heavy rain. Because the project leaders consider these limitations even bigger showstoppers than the current state of AI, they have chosen to focus mostly on developing radar-based sensors. 

But some experts warn that building a city from scratch and developing better sensors is not enough. “Neom will also have to attract the interest of the main technology players in this space,” said Pedro Pacheco, senior director of automotive and smart mobility research at Gartner. 

Self-driving cars not as far along as the hype says 

Reaching level 4 autonomy – especially across an entire city – will be a giant leap. Anyone who cuts through all the hype quickly recognises that the systems currently operating on public roads are more accurately described as advanced driver assistance systems (Adas) than as autonomous vehicles. Features such as automatic emergency braking, lane centring and adaptive cruise control all require uninterrupted driver attention. This means they fit somewhere in levels 0, 1 or 2, as defined by SAE. 

In fact, until very recently, no vehicle legally operating on any road in the world had been above level 2. This changed in November 2020, when the Japanese government approved Honda’s Traffic Jam Pilot system, making Japan the only country in the world to approve a level 3 system.  

While level 3 still relies on human assistance, the jump from level 2 to level 3 is a big one. Cars in this category allow drivers to take their hands off the steering wheel in certain conditions, but the driver must be ready to take over whenever the system gets confused. The hand-off is not always graceful.  

Take, for instance, the level 3 Honda systems that have just begun to operate in Japan. When the vehicle alerts the driver to take control and the driver does not respond, the car decelerates and stops on the shoulder. If there is no shoulder, the car slows to a halt, flashes its hazard lights and blasts the horn – surely not something that will go over well during the rush hour. 
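The fallback behaviour described above can be sketched as a simple decision function. This is an illustrative model of the logic the article describes, not Honda's actual implementation; the state names are assumptions.

```python
from enum import Enum

class FallbackAction(Enum):
    CONTINUE = "driver has taken back control"
    STOP_ON_SHOULDER = "decelerate and stop on the shoulder"
    STOP_IN_LANE = "stop in lane, hazard lights flashing, horn sounding"

def fallback_maneuver(driver_responded: bool, shoulder_available: bool) -> FallbackAction:
    """Choose a minimal-risk manoeuvre when a level 3 system
    reaches the limit of its operating conditions."""
    if driver_responded:
        return FallbackAction.CONTINUE        # hand-off succeeded
    if shoulder_available:
        return FallbackAction.STOP_ON_SHOULDER
    return FallbackAction.STOP_IN_LANE        # last resort: alert surrounding traffic
```

The key point is the final branch: when neither the driver nor the road geometry offers a graceful exit, the only safe option left is to stop where the car is and make itself as conspicuous as possible.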

Regulators and insurance companies in most parts of the world have not yet worked out how to deal with safety and liability in level 3 AVs. Many believe the driver is not responsible when automatic driving features are engaged, but immediately becomes liable when requested by the car to take back control. Although several manufacturers, including Audi and Honda, have cars with the technology for level 3, law makers outside Japan are not ready for them to be on public roads. 

Level 3 AVs are said to have “conditional automation”, but those at level 4 will have achieved “high automation”. At this level, the driver is never asked to take over. However, the automatic features can only be used in specific geofenced areas. Some experiments have demonstrated level 4 in very controlled environments, covering a very small geography – but none has done so on the scale of a city.
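Geofencing at level 4 amounts to continuously checking whether the vehicle's position lies inside an approved operating area. A minimal sketch using a standard ray-casting point-in-polygon test, with an invented rectangular area (not Neom's real boundary):

```python
def inside_geofence(lat: float, lon: float, polygon: list[tuple[float, float]]) -> bool:
    """Ray-casting test: count how many polygon edges a horizontal
    ray from the point crosses; an odd count means 'inside'."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        if (lat1 > lat) != (lat2 > lat):
            # longitude at which this edge crosses the point's latitude
            cross = lon1 + (lat - lat1) * (lon2 - lon1) / (lat2 - lat1)
            if lon < cross:
                inside = not inside
    return inside

# Hypothetical operating area defined as (lat, lon) corners
area = [(28.0, 34.5), (28.0, 35.5), (29.0, 35.5), (29.0, 34.5)]
```

A real system would run such a check against a surveyed boundary many times per second, and hand control back (or execute a fallback stop) before the vehicle leaves the approved zone.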

Level 5 is even more challenging. A car classified at level 5 is said to have “full autonomy”, able to self-drive in any location and never requiring human intervention. Nobody who understands the technology thinks the industry is anywhere near level 5 – not even Elon Musk. Although Musk said recently that Tesla would achieve full autonomy within a year, he emphatically walked that claim back very quickly. 
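The taxonomy the article walks through can be summarised as a simple lookup, with descriptions paraphrased from SAE J3016. The dividing line below level 4 is whether a human must remain available as a fallback:

```python
SAE_LEVELS = {
    0: "No automation: warnings and momentary assistance only",
    1: "Driver assistance: steering OR speed support (e.g. adaptive cruise control)",
    2: "Partial automation: steering AND speed support, driver supervises constantly",
    3: "Conditional automation: system drives, driver must take over on request",
    4: "High automation: no driver needed, but only inside a geofenced area",
    5: "Full automation: drives anywhere, no human intervention ever required",
}

def requires_human_fallback(level: int) -> bool:
    """Levels 0-3 all depend on a human being ready to drive."""
    return level <= 3
```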

Getting to levels 4 or 5 will require AI that can react to a range of situations. A child may bounce a ball across the road, or pets may run in front of the car. And self-driving cars will have to adapt not only to other autonomous vehicles, but also to those driven by humans. After all, most cars will still be driven by people for at least another decade – and human drivers behave very differently from an automated system. 

To achieve high or full automation, a system must first be able to recognise objects belonging to more than just a few categories – something computer vision cannot yet do. It must then make sense of those objects to assess the situation and take appropriate action – something current AI algorithms can handle only if the specific situation has been learned. 

While sensors may be an obstacle to achieving level 4 – as the Saudi team maintains – many other experts say the current state of AI is an even bigger roadblock. “A machine doesn’t think like a person,” said Gartner’s Pacheco. “When a human driver is faced with a totally new situation, they will draw information from what they’ve learned and try to extrapolate that to the new situation. That’s what we commonly call improvisation. 

“For a machine, however, this is very hard to achieve. Every small change in ambient conditions is a new situation for an AV and if they haven’t been trained to handle it, they will probably disengage.” 

A study conducted by Emerj illustrates some of the problems with hype and how it very often affects development schedules. “In 2017, we did some coverage on the self-driving car timelines predicted by the leaders of Ford, Nissan, Tesla and Toyota,” said Faggella. “They were all overly optimistic. When we updated that article three and a half years later, almost all of those plans had been pushed back. The plans were adjusted not only for technological reasons, but also for legal reasons: nobody can agree on who is responsible in the event of a crash.” 

Autonomous vehicles need smart infrastructure 

Another huge technological challenge is that an AV needs to know its position at all times – and to a high degree of accuracy. A mistake of 30cm could cause a car to drift into another lane. GPS can provide that level of accuracy in optimal conditions, but this cannot always be counted on. Signal degradation and signal loss can be caused by a number of factors – including large obstacles, such as buildings and tunnels, and even sunspots.  

Because GPS cannot do the job alone, other approaches for position determination are being explored. Some rely on vehicle-to-vehicle (V2V) communication to have the cars provide each other with position information. But those schemes assume that the other cars have more accurate data – and they rely on good faith cooperation between competitors.

“Can we count on having different autonomous vehicle manufacturers sharing information when some of that information is proprietary?” said Faggella. “Do they want to give away secrets about how they steer a car, or how they estimate position?” 

A more reliable approach is to use vehicle-to-infrastructure (V2I) communication, where the car communicates with roadside units or other infrastructure to get useful information, including data that can be used by a vehicle to determine its own coordinates. This kind of smart infrastructure and reliable communications may be where the Neom project has its biggest advantage. 
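In a V2I scheme, a roadside unit at a precisely surveyed position can broadcast the error it currently observes in the satellite signal, and nearby vehicles subtract that error from their own raw fixes – the same idea that underlies differential GPS. The message format below is an invented illustration, not a real V2X standard:

```python
from dataclasses import dataclass

@dataclass
class RSUCorrection:
    """Broadcast by a roadside unit whose true position is known exactly."""
    rsu_id: str
    lat_error: float   # GPS error currently observed by the unit, in degrees
    lon_error: float

def apply_correction(raw_lat: float, raw_lon: float,
                     corr: RSUCorrection) -> tuple[float, float]:
    """The vehicle removes the locally observed error from its own raw fix."""
    return raw_lat - corr.lat_error, raw_lon - corr.lon_error

# Hypothetical unit and error values
corr = RSUCorrection("rsu-017", lat_error=0.00002, lon_error=-0.00001)
lat, lon = apply_correction(28.100020, 35.099990, corr)
```

The assumption doing the work here is that atmospheric and multipath errors are strongly correlated over short distances, so an error measured at the roadside unit is a good estimate of the error affecting a vehicle a few hundred metres away.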

“Neom is a unique case, where both vehicles and infrastructure are being designed from the ground up to complement one another,” said Calum MacRae, head of automotive research and analysis at GlobalData. “It’s the perfect closed virtuous loop and should really ease AV implementation. The rest of the world will see the potential of V2I by following what happens in Neom.”  

What the world can learn from Saudi Arabia 

Whether or not the Saudi project goes according to plan, other countries might learn something about AVs from it. “The rest of the world can learn ambition and focus,” said Pacheco. “Many countries and cities want to become hotspots for AV development, but they don’t invest enough, nor do they create the right conditions and regulation to enable AV testing and deployment.” 

Faggella added: “What Saudi Arabia has going for it is a lot of money. They also have a strong command over their economy, more than Western nations. In the Middle East, governments have the top-down ability to command and control. They’ve got a lot of money per capita and a strong and clear emphasis on technology investment, which is very interesting and worth following.” 

The megacity project is a clear example of a willingness to invest in technology – and it isn’t Saudi Arabia’s first foray into AV technology. In 2019, the King Abdullah University of Science and Technology (KAUST) launched a project to run self-driving shuttles on its campus. One of the reasons for running the project on the university campus was to allow students and researchers to work with the participating companies to help develop new technology in a controlled environment. As new features are tested, data is gathered to measure performance. 

Two shuttles now run on the KAUST campus. Unlike the Neom project, the KAUST shuttles use Lidar. They also cover much less distance, and nobody is really depending on them for transportation – so the limitations of Lidar are not an impediment. The shuttles also use mapping and cognitive response technologies and obstacle avoidance systems to control, navigate and drive the vehicles. 

The KAUST shuttle does not qualify as level 4 because a human operator is always on board, ready to take control at any time. And to make sure passers-by do not put the system into panic mode, the university put the following warning on its website: “If you are riding a bike or driving a car, please give the shuttles enough space in the front and the rear to ensure that the shuttle does not perceive your presence as a hazard.” 

Faggella added: “Even if they sometimes fall short of their ambitious goals, projects like the ones in Saudi Arabia are certainly moving things forward. But some of the problems are too big to tackle in just a few years. My guess is that level 4 autonomy in an entire city will not happen within the next five years.” 
