Wed. Nov 20th, 2019

A new ‘app’ lets anyone collaborate with astronauts to complete the light pollution map



What does Madrid look like from space? And at night? Alejandro Sánchez, an astrophysicist at the University of Exeter (United Kingdom), says that very few people know how to answer this question. "They know the shape of the city, but they are not used to seeing it at night, and that is what reveals an impressive reality: the intensity of human activity," he explains. The Cities at Night group, with the participation of NASA and the European Space Agency, launches on Thursday a new online platform so that any citizen can help professionals classify images of the Earth taken from space. The application Lost at Night aims to locate up to 90,000 cities in order to pinpoint which areas of the planet suffer the most light pollution.

Astronauts perceive cities more intensely than the stars around them

Astronauts photograph the Earth from the International Space Station, which orbits at an altitude of 400 kilometers and circles the planet roughly every 90 minutes. An astronaut has a field of view of up to 2,000 kilometers in radius, so he does not always know exactly which city he is capturing. A single second means hundreds of kilometers traveled, says Sánchez, which is why high resolution images are very difficult to locate: the city could be Madrid, Ávila or even a city in France. Low resolution images are simpler, since the shape of the country is quickly recognized.

Light pollution eclipses the brightness of the stars. Even from space, at those 400 kilometers of altitude, astronauts perceive cities more intensely than the stars around them. The new project, Lost at Night, crowdfunded and adapted for mobile, aims to fight pollution from artificial light, raise public awareness and make the public a participant in that fight. "Communication is part of the tool; you have to reach people," says the astrophysicist.

People, better than algorithms

With an informative and educational aim, the website shows a photograph of a large city at night at the top and, below it, several small images of cities that have already been located. The user has to work out which of those images, within a range of 1,000 kilometers, corresponds to the main one. "While computer algorithms have trouble distinguishing between stars, the Moon and cities, people are better at recognizing patterns and analyzing complex images," says Sánchez.


The first phase of the project began five years ago and, according to the scientist, out of the million and a half night images of the Earth that NASA holds, participants managed to classify and locate 3,000 of them, a task that could have taken the space agency 10 years. The system will also need artificial intelligence to replace the human eye once the peak of activity falls off, in about six months, the expert estimates. "Now we have to train it, and that is why we ask people to participate and locate more photos," he says.

Humans also make mistakes, so the initiative requires contributions from five people per image to reduce the margin of error. From there, artificial intelligence takes over the project. Of the 300,000 high resolution images classified so far, 30,000 show cities and only 3,000 have been located. Users will therefore work side by side with scientists and artificial intelligence to complete this puzzle of the Earth's light pollution.

"More images of astronauts and more clues of the inhabitants of the Earth will help researchers to better evaluate these effects over time and will encourage actions to optimize street lighting," the scientists in charge of the project expect.



