
Artificial Intelligence in space

Artificial intelligence (AI) is omnipresent. From healthcare to transportation, tasks that were usually performed by humans are now carried out faster and more efficiently by computers and robots. Now AI is making its way to space. This took a while because of power constraints: a satellite can run on just a few solar panels, while AI hardware usually requires far more power.

The Dutch company cosine has found a solution for this, working in collaboration with the European Space Agency (ESA) and other partners. This month, AI technology, under the name PhiSat, will go into space as part of the HyperScout-2 miniature space camera, on board the FSSCat mission, a constellation of two nanosatellites each the size of a shoebox.

The idea of AI in space came about following the launch of the first HyperScout into space in 2018. That was the first miniature space camera to carry out data processing on board a spacecraft. “HyperScout-1 and HyperScout-2 produce hyperspectral images,” explains cosine’s Director, physics professor Marco Beijersbergen (51). “That means that we do not acquire images in only three different colors – the colors that the human eye can see – but in 45 different colors or spectral bands. This provides much more information. HyperScout uses these measurements to map the condition of land, water, vegetation and buildings on Earth’s surface.”
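To illustrate what a hyperspectral image is in practice: it is usually handled as a three-dimensional "data cube" with two spatial axes and one spectral axis. The sketch below is a hypothetical illustration with placeholder values, not cosine's actual data format; the band indices chosen for the RGB-like view are arbitrary.

```python
import numpy as np

# A hyperspectral image as a 3-D data cube: two spatial axes plus one
# spectral axis. HyperScout records 45 spectral bands instead of the
# 3 (red, green, blue) of an ordinary camera.
height, width, bands = 100, 100, 45
cube = np.random.rand(height, width, bands)  # placeholder reflectance values

rgb_like = cube[:, :, [30, 20, 10]]  # pick 3 arbitrary bands to mimic RGB
spectrum = cube[50, 50, :]           # full 45-band spectrum of one pixel

print(cube.shape)      # (100, 100, 45)
print(spectrum.shape)  # (45,)
```

Each pixel thus carries a 45-value spectrum, which is what makes it possible to distinguish land, water, vegetation and buildings by their spectral signatures.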

The first application of the artificial intelligence on HyperScout-2 will be filtering out clouds. “If you want to use data from Earth observation, detecting clouds is important,” says cosine System Architect Nathan Vercruyssen (37), who is working on the instrument. “If part of an image is covered by clouds, it cannot be used. It is often cloudy above the Netherlands in winter. If HyperScout-2 filters out all the cloudy images in space and does not send them to Earth, it avoids transmitting a lot of unnecessary data.”

But this is not so easy. “It is more difficult than you think to program. The distinction between clouds and other bright objects can be difficult to make. Some cloud types are not easily recognizable – a thin veil of clouds for example. And this must be done efficiently and accurately.”
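The filtering logic can be sketched as follows. This is a deliberately naive stand-in using a fixed brightness threshold, with invented threshold values; it is exactly the kind of rule that fails on thin cloud veils and other bright objects, which is why, as described above, a trained neural network is used instead.

```python
import numpy as np

def naive_cloud_fraction(image, threshold=0.8):
    """Fraction of pixels brighter than `threshold` (reflectance 0-1).

    A crude stand-in for cloud detection: clouds are bright, but so are
    snow and sand, which is why a trained neural network is preferred
    over a fixed threshold like this one.
    """
    return float(np.mean(image > threshold))

def keep_image(image, max_cloud_fraction=0.3):
    """Only images below the cloud-fraction limit are worth downlinking."""
    return naive_cloud_fraction(image) <= max_cloud_fraction

clear = np.full((10, 10), 0.2)   # uniformly dark scene
cloudy = np.full((10, 10), 0.9)  # uniformly bright scene
print(keep_image(clear))   # True  -> send to Earth
print(keep_image(cloudy))  # False -> discard on board
```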

“For other purposes, the main task is to record changes. Every time the camera flies over an area, it checks whether anything has changed and, if so, what. The AI ensures that information is only sent to Earth if something has changed. This saves us a lot of time because we only receive information when it is absolutely necessary.”
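The change-detection idea described above can be sketched as a simple comparison between successive overpasses. The thresholds here are invented for illustration and do not reflect cosine's actual on-board algorithm.

```python
import numpy as np

def has_changed(previous, current, pixel_threshold=0.1, area_fraction=0.05):
    """Flag a revisit as 'changed' if enough pixels differ noticeably.

    Only flagged scenes would be downlinked; unchanged ones are dropped.
    """
    changed_pixels = np.abs(current - previous) > pixel_threshold
    return float(np.mean(changed_pixels)) > area_fraction

previous = np.zeros((10, 10))
same = previous + 0.01          # small sensor noise, below threshold
different = previous.copy()
different[:5, :] = 0.5          # half the scene has changed

print(has_changed(previous, same))       # False -> nothing downlinked
print(has_changed(previous, different))  # True  -> send to Earth
```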

The instrument does not only observe visible light; it also detects infrared, with which it can measure temperature. This is a unique combination that makes new applications possible. “In agriculture, we can observe how crops are doing, so that we can say something about their health and the extent to which they need water. We also look at fire hazards. Based on color profiles and temperature, HyperScout-2 can predict the location of a fire hazard and how a fire will spread. Flooding and water quality problems can also be identified.”

Vercruyssen explains how AI works. “The unique thing here is that, for the first time, we can apply AI in a space instrument by using a chip that is capable of running a neural network with an energy consumption of only a few watts. A neural network like this is a form of artificial intelligence in which a computing network recognizes patterns in data sets because it has been trained with sample data. It is not programmed with a traditional computer algorithm but learns from examples. That is why we also call this machine learning.”
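The "learning from examples" idea can be shown with a minimal sketch: a single artificial neuron trained by gradient descent to separate bright (cloud-like) samples from dark ones. All data here is synthetic, and a real on-board network has many layers and parameters rather than one neuron.

```python
import numpy as np

# Synthetic training data: brightness values labeled "cloud-like" (1)
# when above 0.5, "clear" (0) otherwise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 1))  # brightness of each sample
y = (X[:, 0] > 0.5).astype(float)     # labels provided as examples

w, b = 0.0, 0.0                       # the neuron's trainable parameters
for _ in range(2000):                 # gradient-descent training loop
    p = 1 / (1 + np.exp(-(X[:, 0] * w + b)))  # sigmoid activation
    grad = p - y                              # error signal
    w -= 0.5 * np.mean(grad * X[:, 0])
    b -= 0.5 * np.mean(grad)

# The trained neuron classifies unseen brightness values without any
# hand-written rule: the boundary was learned from the examples.
print(1 / (1 + np.exp(-(0.9 * w + b))) > 0.5)  # True: bright -> cloud-like
print(1 / (1 + np.exp(-(0.1 * w + b))) > 0.5)  # False: dark -> clear
```

The point of the low-power chip mentioned above is that running a pre-trained network like this (inference) is far cheaper than training it, so the training happens on the ground and only inference runs in orbit.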

“In combination with our electronics, the AI can analyze the acquired data directly in space, without having to download it to Earth first. Normally, downloading and processing can take days to weeks, because HyperScout-2 records 10 megabytes per second and generates three gigabytes of data every five minutes. This leads to enormous files.”
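The two figures quoted are consistent with each other, as a quick check shows (using 1 GB = 1000 MB):

```python
# 10 megabytes per second, sustained over five minutes
rate_mb_per_s = 10
seconds = 5 * 60
total_gb = rate_mb_per_s * seconds / 1000  # 3000 MB = 3 GB
print(total_gb)  # 3.0
```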

The miniature space camera’s neural network is pre-trained using a computer on the ground, after which it analyzes data autonomously in space. “We can continue to upload our algorithms to space and run them in the neural network after the satellite is launched. This makes it a very flexible system. In addition, we can update it if new applications are found in the future.” Business Unit Manager Marco Esposito (44) adds: “This mission is therefore not only important for cosine. In future, it will also give other parties the opportunity to develop new AI algorithms and to apply them operationally in space.”

The ultimate goal is to have a swarm of HyperScouts working together in space. Vercruyssen explains this: “That was the reason for developing a very small instrument, so that it can be used in large numbers at a relatively low cost. This allows us to get an update about a specific area every few hours. Because of the wide viewing angle – an image covers 300 kilometers on the ground – we can map the entire world quickly. This makes it a very powerful tool.”

Copyright: Elsevier Weekblad | Boris van Zonneveld | 23 March 2020
