06/04/2020

You’re probably familiar with terms such as “robot”, “smart device” or “self-driving car”. It’s much less common to come across the term “autonomous system”, but that’s exactly what these things are.

A self-driving car is an autonomous system because it gathers information from its sensors, and then analyses that information to plan and execute an action. A smart thermostat in your home will gather information about the environment and use that data to decide whether to heat or cool a room, while home service robots will use the information they gather to devise a goal – such as moving an item from one room to another – and execute a plan to achieve that goal.
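This sense–plan–act cycle can be sketched in a few lines of code. The function below is a minimal illustration of the thermostat example; the temperature thresholds and names are assumptions for illustration, not any real device’s behaviour.

```python
# Minimal sketch of an autonomous sense-plan-act decision step for a
# hypothetical smart thermostat. The tolerance value and action names
# are illustrative assumptions, not a real product's API.

def plan_action(current_temp, target_temp, tolerance=0.5):
    """Decide whether to heat, cool, or idle based on sensed temperature."""
    if current_temp < target_temp - tolerance:
        return "heat"
    if current_temp > target_temp + tolerance:
        return "cool"
    return "idle"
```

In a real system this decision step would sit inside a loop that continually reads the sensor, plans, and actuates.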

The use of Autonomous Systems (AS) to complement and supplement work carried out by humans or human-operated systems is increasing. As our use of AS becomes more common, our dependence on them to provide a wide range of functions which are critical to safety, service and costs also grows.

To meet this increasing demand to carry out tasks in complex environments with minimal human intervention, the development of AS needs to shift its focus to wider application of Artificial Intelligence (AI), creating self-aware autonomous systems capable of learning. The future of AS lies in integration – embedding AI so that the technology can operate and co-exist alongside humans, combining the ability to learn and adapt like a human with the efficiency and resilience to anomalies we would expect from a machine.

Gokhan Inalhan, Professor of Autonomy and Artificial Intelligence and Deputy Head of the Centre for Autonomous and Cyber-Physical Systems at Cranfield University, discusses below the research and development being carried out at Cranfield to bridge the gap between theory and application in order to achieve the integrated autonomy vision.


A paradigm shift towards implementations of AI

Current and near-future AS are wide-ranging, starting from building block one-unit cell AS (such as unmanned aerial vehicles (UAVs) or autonomous cars) and expanding to networks of autonomous/semi-autonomous technology driving large scale cyber-physical or socio-technical systems – such as robotic exploration vehicles, air traffic management networks, power grids and data communication networks.

However, with the increasing complexity of AS, the major paradigm shift we face is the transition from design-time automated or sand-boxed autonomous systems to the implementation of AI-based, self-aware learning AS. These intelligent AS are envisioned to provide the much-needed capability to operate in complex, unpredictable environments: delivering safe, reliable and efficient operations, with through-life resilience against anomalies, independently and with minimum human intervention, while also learning, adapting and evolving through experience.

Such application of human-like learning capabilities to autonomous systems was recently demonstrated in Cranfield’s involvement with the Nissan HumanDrive project. The autonomous car recently completed the UK’s longest and most complex self-navigated journey, travelling 230 miles from the Nissan European Technical Centre in Cranfield to the factory in Sunderland. The successful journey, which saw the car share the roads with regular road users, marks a milestone in the development of fully-autonomous vehicles and their ability to operate safely alongside humans. Cranfield’s role in the project involved developing ways of measuring human-like driving behaviour and using the Multi-User Environment for Autonomous Vehicle Innovation (MUEAVI) smart road as a test environment to analyse and refine the vehicle’s perception and control systems accordingly. The systems developed as a result of this research allowed the car to successfully navigate challenging scenarios such as negotiating roundabouts and encountering other road users such as cyclists, with human-like driving characteristics.

Research was also recently conducted at Cranfield to develop technology enabling multiple UAVs to carry out airborne monitoring of traffic to detect and assess hidden threats. Swarms of autonomous airborne sensors have long endurance and good spatial coverage, can provide accurate estimates of a large number of moving targets, and have the capability to respond to threats from the air. However, the detection of suspicious behaviour normally requires a human operator to analyse vast amounts of data and develop a clear picture of events. The research carried out at Cranfield focused on the development of a high-level analysis algorithm to process the data gathered by UAVs – providing them with awareness of what constitutes abnormal behaviour. This decision-making model enabled the sensors to isolate useful data and place it into a context with predictive value.
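To give a flavour of what such a high-level analysis layer does, one simple way to flag abnormal behaviour among tracked targets is statistical outlier detection on a feature such as speed. The z-score test below is an illustrative stand-in under that assumption, not the algorithm developed at Cranfield, which fused far richer behavioural data.

```python
from statistics import mean, stdev

def flag_abnormal(speeds, threshold=3.0):
    """Return indices of targets whose speed deviates strongly from the rest.

    Illustrative z-score test only: a real threat-detection system would
    fuse many features (trajectory shape, proximity, behaviour over time).
    """
    if len(speeds) < 2:
        return []
    mu, sigma = mean(speeds), stdev(speeds)
    if sigma == 0:
        return []
    return [i for i, s in enumerate(speeds)
            if abs(s - mu) / sigma > threshold]
```

A swarm using a filter like this would forward only the flagged tracks to a human operator, reducing the volume of data a person must review.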

Developing and embedding AI into autonomous systems, in both instances, removed some of the need for human intervention – enhancing the capabilities of the technology.

Bridging the gap between theory and application

At the Cranfield Centre for Autonomous and Cyber-Physical Systems, we are leading the way in autonomy and AI technologies that bridge the critical algorithmic and computational gaps between theory and application. With the facilities across our laboratories in the Aerospace Integration Research Centre (AIRC), testbeds such as the MUEAVI road and the Global Research Airport (including our digital air traffic control tower and flight simulators), and participation in initiatives such as the National Beyond Visual Line of Sight Experimentation Corridor (NBEC), we are able to explore ways to embed AI into autonomous systems.

Current work with partners such as BAE Systems, Ocado, and Boeing R&T Europe spans a wide spectrum of drone applications: urban air and cargo mobility, medical emergency and delivery, environmental monitoring and intelligent agriculture. Last year, researchers at Cranfield developed a smartphone app which can connect with off-the-shelf drones and send them to autonomously inspect multiple locations. By automating control, facilitating communication between devices and reducing the level of human intervention required, the CASCADE project aims to accelerate the use of UAVs across a range of scientific and industrial applications – for example, monitoring crop health data or facilitating missing person searches. Such use of AI has already proved valuable in remote sensing, for instance in monitoring illicit opium poppy cultivation in Afghanistan.

Using digital twins and simulation to capture complexities

Exploring ways to embed AI into autonomous systems in a way that is explainable, certifiable and secure, enables us to develop technology that can operate and co-exist with humans in urban, suburban and remote settings with minimum human intervention. Doing so requires an understanding of the complexities of dynamic systems in a range of scenarios. Through digital twins and surrogate modelling, we can capture the inherent complexities that feed the product-life cycle, informing design, operation and maintenance.
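The core idea of surrogate modelling mentioned above is to replace an expensive model (a detailed simulation or physical test) with a cheap approximation trained on a limited number of evaluations. The sketch below illustrates this with simple one-dimensional linear interpolation; the function names are assumptions for illustration, and real surrogates typically use Gaussian processes or neural networks over many input dimensions.

```python
from bisect import bisect_left

def build_surrogate(expensive_model, sample_points):
    """Pre-evaluate an expensive model at a few sample points, then answer
    later queries by cheap linear interpolation between those points.

    Illustrative sketch of the surrogate-modelling idea only.
    """
    xs = sorted(sample_points)
    ys = [expensive_model(x) for x in xs]  # the only costly evaluations

    def surrogate(x):
        # Clamp queries outside the sampled range to the nearest sample.
        if x <= xs[0]:
            return ys[0]
        if x >= xs[-1]:
            return ys[-1]
        i = bisect_left(xs, x)
        t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
        return ys[i - 1] + t * (ys[i] - ys[i - 1])

    return surrogate
```

Once built, the surrogate can be queried thousands of times during design exploration or operational planning at negligible cost, with the expensive model consulted only to refine the sample set.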

Facilities at Cranfield such as Air Traffic Management and Unmanned Air System Traffic Management simulators, flight simulators and the MUEAVI smart road, alongside actual flying elements in our National Flying Laboratory Centre, provide the ideal testbed for developing integrated autonomous systems. Combining virtual and actual elements through co-simulation and flight testing allows us to continually test new ideas and drive innovation. In that respect, facilities such as the Digital Aviation Research and Technology Centre, along with new laboratories at the Centre for Autonomous and Cyber-Physical Systems which sit at the heart of Cranfield’s research into digital aviation, provide a research and development environment with access to both a physical and a virtual ‘sandbox’. This allows for key verification, validation and hardware-in-the-loop testing for future flight technologies – applicable to urban air and cargo mobility, autonomy and electric air vehicles.

Find out more about the research currently happening at the Centre for Autonomous and Cyber-Physical Systems.