WORLDREVIEWS

A global archive of independent reviews of everything happening from the beginning of the millennium




Cambridge Autonomous Mobility



Driverless Cars 2015 Part 1


Driverless Cars 2015 Part 2


Video briefly explaining sensor fusion and object recognition.




15 December 2016

For two years I've been gently annoyed by a question that is posed as if it were a philosophical dilemma or ethical choice, especially because you hear it so often: what will a driverless car do if it has the choice of ploughing into a group of schoolchildren or driving into a wall, killing its innocent occupants?

The answer almost certainly is that the onboard computers will never have to make ethical choices of this kind.

If the car recognizes an obstacle that it must avoid - children rather than a balloon, say (and this is why good object recognition is essential, and most projects do not yet have it) - it will apply the brakes, hard if necessary, and probably much earlier than a human could, to slow down. If there is a gap it can take to avoid the children, it will take it; if not, it will continue to deliver maximum or optimum deceleration, with the result that the children and, maybe, the occupants (they should have airbags) may suffer injuries. What the algorithms will not deliver is a 'car commits suicide' option where the vehicle deliberately drives into a wall or something else.
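To make the point concrete, here is a minimal sketch, in Python, of the kind of decision logic described above. It is purely illustrative: the names, thresholds and structure are assumptions of mine, not the code of any real project, but it shows why a 'deliberately crash the car' branch simply never appears.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Obstacle:
    distance_m: float   # distance to the object ahead
    must_avoid: bool    # object recognition says it cannot be ignored (children, not a balloon)


def plan_response(obstacle: Optional[Obstacle],
                  clear_gap_heading_deg: Optional[float],
                  max_decel_mps2: float = 9.0) -> dict:
    """One planning cycle: decide how hard to brake and where to steer.

    Note what is absent: there is no branch that steers the vehicle into a
    wall. The planner only brakes and, if a clear gap exists, steers through it.
    """
    if obstacle is None or not obstacle.must_avoid:
        # Nothing recognised as needing avoidance (a bird, say): carry on.
        return {"decel_mps2": 0.0, "steer_deg": 0.0}

    if clear_gap_heading_deg is not None:
        # A safe gap exists: brake and steer through it.
        return {"decel_mps2": max_decel_mps2 * 0.5,
                "steer_deg": clear_gap_heading_deg}

    # No gap: deliver maximum (or optimum) deceleration and hold the current line.
    return {"decel_mps2": max_decel_mps2, "steer_deg": 0.0}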

On this website I once raised the question of what a driverless car would do if a large bird swooped down apparently intent on hitting it - would it brake, possibly occasioning another car to run into the back of it? What appeared to be the answer from Google came back indirectly: the car will ignore birds.

So a rarer 'ethical' question might be more pertinent than the common one: if a large piece of masonry were coming down off a building site 20 metres ahead, perhaps too close to allow much deceleration, would the car drive at it or swerve and hit a wall to avoid it?

It must be said that Google is a company with the resources to answer this one, but I fear a lot of autonomous vehicle projects do not have them, so I am sceptical of most safety claims for autonomous vehicles in a way I was not three years ago, when I was a pioneer researching how they could be introduced into the U.K.

For many projects the likely answer would be that the object would not be recognised by the computers and, if picked up by the vision systems, would be ignored like the birds.

*****

There is also a fair amount of nonsense talked about who will be legally to blame for accidents, but a dilemma of this nature can already be presented by today's technology.

If you have a car with infra-red sensors that can see an animal crossing the road in the dark before the driver can, and the car applies the brakes hard without the intervention of the driver, with the result that another vehicle runs into the back of the car, who is to blame?



8 September 2017

An AI-focussed, venture-capital-funded driverless car company, Five AI, has launched in Cambridge. It is probably not going to end up owning many driverless cars of its own - it is not a vehicle manufacturer by any means - but it is a good bet to develop AI that can be used by future driverless projects in an urban context, and so this company looks like one that could eventually sell for a healthy multiple of its initial funding.




15 September 2017

The Kangaroo Konundrum





Purely algorithmic object recognition cannot reliably identify a kangaroo crossing the road. Deep learning artificial intelligence should be able to do so, and cloud-based parallel processing should then be able to verify that the fellow is a kangaroo in real time. When you have a non-sentient system that does this, come back to us and we will set you the next puzzle.
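As a thought experiment only, the two-stage identification imagined above might look something like the Python sketch below. Everything in it - the function names, the confidence threshold, the timeout - is a hypothetical stand-in, not a description of any real system or API.

from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout


def onboard_classifier(image):
    """Stand-in for a deep-learning model running on the vehicle.
    Returns a label and a confidence score."""
    # ... onboard inference would happen here ...
    return "kangaroo", 0.72


def cloud_verify(image, label):
    """Stand-in for cloud-based parallel processing that re-checks the label
    against much larger models within a real-time budget."""
    # ... remote call would happen here ...
    return True


def identify(image, confidence_threshold=0.6, timeout_s=0.1):
    label, confidence = onboard_classifier(image)
    if confidence < confidence_threshold:
        return "unknown"                 # treat as an unclassified obstacle
    # Verify in the cloud without blocking the vehicle beyond a hard deadline.
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(cloud_verify, image, label)
        try:
            confirmed = future.result(timeout=timeout_s)
        except FutureTimeout:
            confirmed = False            # fall back to the onboard answer alone
    return label if confirmed else "unverified " + label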




LANE MARKINGS


Reviewed by ANDRE BEAUMONT


6 February 2016

The news that Britain will be experimenting further, on a very limited scale, with removing white lane markings from the middle of roads leaves one in two minds. It may slow down motorised vehicles, so reducing accidents, but it does not help pedestrians.


In many of those places with no road markings, the arrival of motorbicycles, buses and cycles on the road space pedestrians are using or crossing is a nightmare, because pedestrians have little in the way of visual cues to tell them where these vehicles will be or how they are arriving, and it will be worse for those with impaired hearing or vision.

For example, if a pedestrian crossing a road is caught out by a rapidly turning vehicle, he can stand on the white line and the driver will understand what he is about, but it is harder to see, hear and anticipate everything on an unmarked road.

Like many cars now, human beings have their own 'sensors' - eyes and ears - and they use them intensively, but they need to be given cues that they can read. Which brings us, tangentially, to future autonomous vehicles.


Source: Daimler

A Mercedes truck that partially uses autonomous guidance (this one, at the entry for 12 May 2015) is heavily reliant on lane markings to be 'autonomous'. In many ways it is not truly autonomous, partly because it will not drive in autonomous mode off the type of highway that has good markings.

Mercedes' research is near the top of the tree for autonomous vehicles, especially in dollars committed.

A lot of other 'driverless car' demos, though, are little more than the combination of lane keeping technology, cruise control and some less than fail-safe processing of sensor data. The vehicles are also in no real sense driverless.

Here at Worldreviews we are advocates of driverless cars but ultimately we are not starry eyed.

Some driverless car projects will come onstream for restricted public use in 2017-2020 but they will be in geographically limited areas that will have been mapped in advance more intensively - by orders of magnitude - than urban roadscapes ever have been before.


Some projects will have good sensor fusion, some good object recognition - but most will probably have neither in the early years.

What they will rely on to operate, though, is lots of lane markings. These are the objects they need most of all to recognize.

A pedestrian having to second-guess a bus on a road with no lane markings is bad enough; a driverless white van on the same road is a harder prospect still. How would you ever guess its intention to stop in a particular place or, worse, mount the kerb? (Driverless delivery vans are probably the least feasible outcome in the near term.)


*****

March 2016


For pods and shuttles, vehicles currently designed to move through shared space at low speed, and which we should not confuse with driverless cars, there is the prospect of navigation that is not reliant on lane markings.

These include the recently announced NVIDIA-powered WEpod of Delft Technical University, which uses some of the GPU manufacturer's deep learning capabilities, and a promising project in Greenwich, UK, which looks to be run by a commercially competent consortium that includes the Transport Research Laboratory and Heathrow.


The latter project has 3D local mapping provided by Oxbotica and is partly funded by the EU.

The Dutch WEpod is an iteration of the EZ10 vehicle, supplied by EasyMile, and part of the EU-funded CityMobil2 project. Other EZ10 trials are being run or rolled out in a handful of European countries. The vehicle is electric, semi-autonomous and, at present, is low speed and concentrates on 'last mile' routes that are protected from other vehicles. Many of the routes have the potential to terminate at railway stations.



April 2016

It is reported that Sheik Mohammed bin Rashid Al Maktoum would like to see a quarter of all road journeys in Dubai undertaken in driverless vehicles by 2030.

An EZ10 vehicle, which, as its name implies, carries ten passengers, is already being tested in the state. It does not require lane markings, but vehicles with a lot more artificial intelligence, like Google's cars, will. For these, the lane markings issue need not be a problem in Dubai, at least initially. Some of the roads have twelve lanes, so dedicating a few to driverless vehicles should not be difficult. Snow obscuring the markings is unlikely to be a problem; sand, perhaps.

For those of us who have been studying driverless vehicles and their constituent technologies from the outset, one of the main problems is finding conceptual realism in those we talk to. In 2016, the conceptual frameworks for them require an understanding of many disciplines all at once, from politics to sensor fusion.


A single discipline, like computer coding, cannot deliver the majority of the answers and neither can a turnkey contractor. Conceptually, the rendering of what is being mapped may be one of the most advanced disciplines.

Consultancy reports from the financial world extrapolate safety projections and financial benefits from minimal practical experience, so these have to be taken with a pinch of salt.

A quarter of road journeys is about right, as is the target date of 2030. There is no way that driverless vehicles can supplant legacy vehicles, except progressively, and never absolutely. Are we to have unmanned JCBs and fire engines on public roads?

Vision and realism are required. From decades of observing Sheik Mohammed's equine operation in Newmarket, I would say he has both.

*****


July 2016

In May I had a brief conversation with the Director General of the EU responsible for autonomous and connected vehicles and watched a presentation by him which contained nothing inconsistent with the direction of travel the U.K. should go in. But where will Britain be in January 2019, possibly the date Brexit takes effect?


Coordinated with Europe on connected vehicles and road security standards and probably behind the curve on autonomous vehicles unless Volvo or another external player can bring a successful trial to Britain.

*****


March 2017

The U.K. autonomous vehicle effort took a conclusive wrong turning at the beginning of 2015, one that led it down a cul-de-sac.

Capabilities to read lane markings, for instance, were not in evidence.

Fortunately, not all conceptual and technical expertise went down this road and a shift in effort is now taking place which will re-centre efforts around the U.K.'s known and innovative capabilities in vehicle testing.

It is apparent that for autonomous and semi-autonomous vehicles to read and follow lane markings with a high degree of accuracy, they will have to rely on machine vision systems. These will be able to see in the dark (or the best will), but their effectiveness will be degraded when lane markings are wet. When the markings are obscured by snow, sand or proximate vehicles, distance-measuring sensors - lidar, radar and ultrasound - will have to fill the gap in locating the vehicles relative to their environment, requiring strong sensor fusion capabilities.
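A minimal sketch, in Python, of what such a fallback might look like: a confidence-weighted blend of a camera-based lane estimate with a lidar/radar-based one. The sensor names, confidence values and weighting scheme are assumptions for illustration only, not any project's actual fusion algorithm.

from dataclasses import dataclass


@dataclass
class Estimate:
    lateral_offset_m: float   # vehicle's offset from the lane centre
    confidence: float         # 0.0 (no trust) to 1.0 (full trust)


def fuse_lateral_offset(camera: Estimate, range_sensors: Estimate) -> float:
    """Blend the two estimates in proportion to their confidence.

    When markings are wet or obscured by snow, sand or nearby vehicles, the
    camera's confidence falls and the lidar/radar/ultrasound estimate
    dominates - the 'filling the gap' described above.
    """
    total = camera.confidence + range_sensors.confidence
    if total == 0.0:
        raise RuntimeError("no usable localisation source - hand back control")
    return (camera.lateral_offset_m * camera.confidence
            + range_sensors.lateral_offset_m * range_sensors.confidence) / total


# Example: heavy rain degrades the vision system but lidar still sees the kerbs.
fused = fuse_lateral_offset(Estimate(0.4, 0.2), Estimate(0.1, 0.8))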


Until this package is finely honed and road markings upgraded, limitations remain on autonomous commercial vehicle testing and on semi-autonomous commercial vehicle platoon formation testing.