A New Algorithm That May Revolutionize Satellite Imagery

Satellite imagery may get a huge boost as data from a high-frequency satellite is fused with data from a high-resolution one.

Satellite data serves many purposes in today's world, the most common being the monitoring of urban development and agriculture. It is also used to measure economic development and environmental quality. Images obtained through satellites are an essential part of Big Data. Technologies like aerial photography, drones, and satellite sensors are used to capture these images, as physical contact is not always possible. Satellite imagery has brought extensive changes to many industries. It was first used in 1973 to demonstrate seasonal vegetation change. Scientists built on that work, and this information is now used to track droughts in different parts of the world.

Satellite imagery is also used to measure and identify human activity; detecting changes in human density around the globe is one of its major applications these days. Satellite images provided evidence of the burning of Rohingya villages in Myanmar last year. Similarly, they recorded the destruction of cultural heritage in Iraq and Syria in 2014. These images are also very useful in gauging the severity of disasters, as shown by the images taken before and after the eruption of Hawaii's Kilauea volcano. They clearly revealed the direction of the lava flow and the property damage it inflicted.

Despite all these benefits, satellite images have their limitations. An important one is that they are only as good as their resolution. Moreover, even high-resolution images cannot be trusted without validation on the ground. One instance where remotely sensed data was misread had destructive results: in 2003, satellite images were cited as evidence of weapons-of-mass-destruction sites in Iraq.

According to those images, the terrain had been graded to hide signs of chemical production. Later ground inspection, however, proved that this was not true. Another dilemma associated with satellite imagery is the compromise it forces: either we reduce the number of images to obtain high resolution, or we sacrifice image quality to capture more pictures. Researchers at the University of Illinois claim to have found a solution to this long-standing problem.

They have developed an algorithm that eliminates this trade-off by integrating data from a high-frequency satellite with data from a high-resolution one. It can generate continuous 30-meter-resolution images on a daily basis. Another benefit of the tool is that it can pull out data going back to the year 2000. Kaiyu Guan, a co-author of the study and a Professor of Natural Resources and Environmental Sciences, described it by saying,

“This can be used to study changes in agricultural productivity, ecosystem and polar ice dynamics since 2000 in much higher detail than previously possible. Our approach may revolutionize the use of satellite data.”

Images with 10–30-meter resolution are critical for agricultural applications, as farmers need to observe rapid and subtle changes in crop conditions that can affect yields. According to the researchers, the data we already have either lacks resolution or is captured too infrequently. They are hopeful that this new algorithm will bring positive changes across many fields, including farming. Guan put it in the following words:

“We struggled to find satellite data that has both high spatial resolution and high frequency in our own research — it simply did not exist. So we took the initiative to produce it ourselves.”

Efforts were made in the past to fuse high-frequency temporal data with high-resolution spatial data, but all of them had limitations. Most did not support automation and could not handle missing pixels and temporal fusion at the same time, which confined them to localized applications that were not feasible for long-term use. To counter these problems, the new algorithm is designed to automatically integrate information from existing data sources. This eliminates the problem of data gaps, as the algorithm can produce images without any missing pixels. The team aims to build long-term, daily, continental-scale imagery for different applications. Jian Peng, a co-author of the study, said,

“Generating this sort of data requires significant computing resources, making accessibility difficult. We want to share the output with the broader scientific community and we are working to find a way to make that possible.”
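To make the general idea of such fusion methods concrete, here is a minimal sketch in Python. It is not the published algorithm: the linear-interpolation gap-filling, the change-propagation fusion step, and all function names are simplified assumptions for illustration only.

```python
import numpy as np

def fill_gaps(stack):
    """Fill missing (NaN) pixels in a (time, H, W) image stack by
    linear interpolation along the time axis — a toy stand-in for
    the gap-filling step described in the article."""
    t = np.arange(stack.shape[0])
    out = stack.copy()
    for r in range(stack.shape[1]):
        for c in range(stack.shape[2]):
            pix = out[:, r, c]            # view into `out`
            bad = np.isnan(pix)
            if bad.any() and (~bad).any():
                pix[bad] = np.interp(t[bad], t[~bad], pix[~bad])
    return out

def fuse(coarse_today, coarse_ref, fine_ref, scale):
    """Predict a fine-resolution image for a day with only coarse
    data: add the coarse temporal change since a reference date,
    naively upsampled, to a fine-resolution reference image."""
    delta = coarse_today - coarse_ref
    delta_up = np.kron(delta, np.ones((scale, scale)))
    return fine_ref + delta_up
```

In this toy setup, a daily low-resolution series is first gap-filled, then the day-to-day change it records is propagated onto a sparse high-resolution reference to approximate a daily high-resolution product.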
