We are one step closer to never having to parallel park again — as regulators in the US have given the green light to the first commercial fleet of driverless taxis in California.
Robotic taxi service Cruise received approval to offer rides in San Francisco.
It will be the first time an autonomous ride-hailing service in the state has been allowed to charge for rides with no one in the vehicle other than the passengers.
Cruise's initial fleet of 30 electric cars is confined to transporting passengers in less congested parts of San Francisco between 10pm and 6am.
This is to minimise chances of the robotic taxis causing property damage, injuries or death if something goes awry.
It will also allow regulators from the California Public Utilities Commission to assess how the technology works before permitting the service to expand.
California regulators gave a robotic taxi service the green light to begin charging passengers for driverless rides in San Francisco. Pictured is Cruise's autonomous Bolt electric vehicle
Cruise, a company controlled by automaker General Motors, has been offering free, driverless rides in the city since February, along with another robotic car pioneer, Waymo.
But to charge passengers, a human backup driver had been required to be on board to take control if something went wrong with the technology.
Now, under the new 'Driverless Deployment Permit', the self-driving cars have been cleared to go solo while charging passengers for the journey.
This is an ambition that a wide variety of technology companies and traditional automakers have been pursuing for more than a decade.
Driverless vehicles have been hailed as a way to make taxi rides less expensive while reducing the traffic accidents and deaths caused by human drivers.
The regulators unanimously granted the permit yesterday, despite safety concerns that Cruise's autonomous taxis are unable to pick up and drop off passengers at the curb, requiring the vehicles to double-park in traffic lanes.
HOW DOES THE CRUISE AV WORK?
The Cruise AV is built on the foundation of the Chevy Bolt, with 40% of its hardware unique to self-driving
It has over 40 sensors, giving it a 360° view and the ability to map the location of surrounding objects within centimetres
It considers multiple paths per second and is constantly changing its route depending on road conditions
The car uses this information to tell its wheels, throttle, brakes and steering what to do and how to react to changes
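The planning loop the fact box describes amounts to repeatedly scoring a handful of candidate paths and turning the winner into driving commands. The minimal Python sketch below illustrates that idea; the path representation, scoring weights and names are assumptions for illustration, not Cruise's actual software.

# Illustrative sketch only: evaluate several candidate paths and pick the best,
# as a real planner would do many times per second while conditions change.
from dataclasses import dataclass

@dataclass
class CandidatePath:
    waypoints: list        # (x, y) points the car would follow
    clearance_m: float     # closest distance to any detected obstacle
    travel_time_s: float   # estimated time to traverse the path

def score(path):
    # Assumed trade-off: prefer more clearance, penalise longer travel time.
    return path.clearance_m - 0.1 * path.travel_time_s

def choose_path(candidates):
    # The chosen path would then be translated into steering, throttle and brake commands.
    return max(candidates, key=score)

if __name__ == "__main__":
    paths = [
        CandidatePath([(0, 0), (10, 0)], clearance_m=2.5, travel_time_s=4.0),
        CandidatePath([(0, 0), (10, 2)], clearance_m=0.8, travel_time_s=3.5),
    ]
    print(choose_path(paths))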
Gil West, Cruise's chief operating officer, hailed Thursday's vote in a blog post as 'a giant leap for our mission here at Cruise to save lives, help save the planet, and save people time and money.'
He said the company would begin rolling out its paid rides gradually.
Waymo, the autonomous driving unit of Google's parent company Alphabet, has been running a driverless ride-hailing service in Phoenix, Arizona, since October 2020.
Nuro, another driverless startup focused on transporting goods instead of passengers, also holds a deployment permit to operate driverless vehicles in San Francisco.
However, navigating the density and difficulty of more congested cities such as San Francisco has posed more daunting challenges for robotic taxis to overcome.
This is one of the reasons Cruise's newly approved driverless service in the city is being so tightly controlled.
Besides being restricted to places and times with less traffic and fewer pedestrians on the streets, Cruise's driverless service won't be allowed to operate in heavy rain or fog either.
While the company's application for a driverless taxi service in San Francisco won widespread backing, some transportation experts urged the Public Utilities Commission to move cautiously.
'Many of the claimed benefits of [autonomous vehicles] have not been demonstrated, and some claims have little or no foundation,' Ryan Russo, the director of the transportation department in Oakland, California, told the commission last month.
Just reaching this point has taken far longer than many companies envisioned when they began working on the autonomous technology.
Uber, the biggest ride-hailing service, had been hoping to have 75,000 self-driving cars on the road by 2019 and to be operating a driverless taxi fleet in at least 13 cities in 2022.
This is according to court documents filed in a case accusing the company of stealing trade secrets from Waymo in 2018.
Uber wound up selling its autonomous driving division to Aurora in 2020 and still relies almost exclusively on human drivers who have been more difficult to recruit since the pandemic.
Tesla CEO Elon Musk promised his electric car company would be running a robotic taxi fleet by the end of 2020, but he has yet to reach that target.
SELF-DRIVING CARS 'SEE' USING LIDAR, CAMERAS AND RADAR
Self-driving cars often use a combination of normal two-dimensional cameras and depth-sensing 'LiDAR' units to recognise the world around them.
However, others rely only on visible-light cameras that capture imagery of the roads and streets.
They are trained on vast databases of hundreds of thousands of clips, which are processed using artificial intelligence to accurately identify people, signs and hazards.
In LiDAR (light detection and ranging) scanning - which is used by Waymo - one or more lasers send out short pulses, which bounce back when they hit an obstacle.
These sensors constantly scan the surrounding areas looking for information, acting as the 'eyes' of the car.
While the units supply depth information, their low resolution makes it hard to detect small, faraway objects without help from a normal camera linked to them in real time.
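The pulse-and-echo measurement described above boils down to a time-of-flight calculation: the distance to an obstacle is half the pulse's round-trip time multiplied by the speed of light. A minimal Python sketch, with illustrative numbers only:

# Convert a LiDAR pulse's round-trip time into a distance estimate.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def lidar_distance_m(round_trip_time_s):
    # The pulse travels out and back, so halve the total path length.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# A pulse returning after 200 nanoseconds implies an obstacle roughly 30 m away.
print(round(lidar_distance_m(200e-9), 1))  # ~30.0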
In November last year Apple revealed details of its driverless car system that uses lasers to detect pedestrians and cyclists from a distance.
The Apple researchers said they were able to get 'highly encouraging results' in spotting pedestrians and cyclists with just LiDAR data.
They also wrote they were able to beat other approaches for detecting three-dimensional objects that use only LiDAR.
Other self-driving cars generally rely on a combination of cameras, sensors and lasers.
Volvo's self-driving cars, for example, use around 28 cameras, sensors and lasers.
A network of computers processes this information, which, together with GPS, generates a real-time map of moving and stationary objects in the environment.
Twelve ultrasonic sensors around the car are used to identify objects close to the vehicle and support autonomous drive at low speeds.
A wave radar and camera placed on the windscreen read traffic signs and the road's curvature, and can detect objects on the road such as other road users.
Four radars behind the front and rear bumpers also locate objects.
Two long-range radars on the bumper are used to detect fast-moving vehicles approaching from far behind, which is useful on motorways.
Four cameras - two on the wing mirrors, one on the grille and one on the rear bumper - monitor objects in close proximity to the vehicle and lane markings.
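To illustrate the fusion step described above, the sketch below combines detections from several such sensors with the car's own position into a single map of nearby objects. The data structures and coordinate handling are simplifying assumptions (the GPS fix is taken to be already converted into local metric coordinates), not Volvo's actual software.

# Illustrative sensor-fusion sketch: merge relative detections from several
# sensors with the car's own position into one real-time object map.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str       # e.g. "windscreen_camera", "rear_long_range_radar"
    offset_m: tuple   # object position relative to the car, in metres (x, y)
    moving: bool      # moving or stationary object

def build_object_map(car_position_m, detections):
    # Shift each relative detection by the car's position to place it on the map.
    return [
        {
            "position": (car_position_m[0] + d.offset_m[0],
                         car_position_m[1] + d.offset_m[1]),
            "moving": d.moving,
            "seen_by": d.sensor,
        }
        for d in detections
    ]

if __name__ == "__main__":
    detections = [
        Detection("front_radar", (35.0, 0.0), moving=True),      # vehicle ahead
        Detection("ultrasonic_left", (0.5, 1.2), moving=False),  # kerbside object
    ]
    print(build_object_map((100.0, 200.0), detections))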