Data collected by satellites, drones, radars, and microscopes provide a wealth of information for understanding our environment. And when these data are coupled with artificial intelligence (AI), they can unlock the secrets of phenomena occurring at every scale.
In a rapidly changing world facing many environmental threats, a better understanding of natural and human-induced processes can help corroborate hypotheses, guide conservation and renewal efforts, and direct new research. One key to achieving this kind of understanding is imaging technology. A wealth of data is captured by satellites, radars, lidars, and microscopes; the trick is to bring the different forms of data together and, sometimes with the help of AI, glean valuable insights. Researchers in many fields are exploiting the opportunities offered by new imaging technologies—determining the chemical composition of plants, detecting floating debris in the ocean, quantifying and characterizing rainfall, mapping coral reefs in the Red Sea, and assessing the health of large areas of cropland—to learn more about ecosystems of all sizes.
Freeze plants to see them better
At EPFL’s Laboratory of Biological Geochemistry (LGB), scientists study a variety of biological and other processes at the subcellular level: the breakdown of the relationship between microalgae and the coral they live in, the stress that salt causes in plants, the reconstruction of past climate conditions from tiny carbonate shells less than a millimeter long, and more. They use a variety of microscopes and other high-tech microanalysis instruments to observe chemical transfers in which even small molecular and ionic variations can disrupt an entire organism and have an impact on a much larger scale.
Consider coral and the thousands of microalgae it contains, which live in perfect symbiosis: the coral feeds on the nutrients released by the microalgae, while the latter absorb the CO₂ produced by the coral. But this age-old relationship, which gives corals their shimmering color, is now threatened by global warming. Rising water temperatures put the microalgae under pressure and cause them to release compounds that are toxic to the coral. The coral reacts by expelling them, leading to coral bleaching or even death. When this happens on a large scale, entire coral reef ecosystems can collapse, causing a massive loss of ocean biodiversity. For several years, a team of scientists at LGB has been using an ion microscope to investigate the hidden secrets of this symbiotic relationship. “We use a NanoSIMS microscope, which bombards the samples with ions,” explains Nils Rädecker, a postdoctoral fellow at LGB. “This allows us to observe the transfer processes at very high resolution. We can see individual cells and even subcellular structures.” Thanks to NanoSIMS, the scientists were able to discover new mechanisms in the breakdown of the symbiosis, such as the selfish way in which microalgae stop providing nutrients to the coral long before the coral expels them.
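NanoSIMS data are essentially stacks of ion-count images, one per detected species. A common way to turn such counts into a picture of transfer processes (a generic analysis step, not necessarily LGB’s exact pipeline) is to compute a per-pixel isotope-ratio map from two ion channels; the array names, sizes, and count threshold below are illustrative.

```python
import numpy as np

def isotope_ratio_map(counts_heavy, counts_light, min_counts=20):
    """Per-pixel isotope-ratio image from two ion-count maps.

    Pixels whose total signal falls below `min_counts` are masked (NaN),
    because ratios computed from very few ion counts are noise-dominated.
    """
    heavy = np.asarray(counts_heavy, dtype=float)
    light = np.asarray(counts_light, dtype=float)
    total = heavy + light
    ratio = np.full(heavy.shape, np.nan)
    ok = total >= min_counts
    ratio[ok] = heavy[ok] / light[ok]
    return ratio

# Hypothetical 256x256 count maps for a rare isotope (e.g. 13C) and its
# abundant counterpart (12C); real maps would come from the instrument.
rng = np.random.default_rng(0)
c13 = rng.poisson(5, size=(256, 256))
c12 = rng.poisson(400, size=(256, 256))
ratio = isotope_ratio_map(c13, c12)
```

Regions where the ratio rises above the natural background are where an isotopically labeled nutrient has accumulated, which is how transfer between symbiotic partners can be followed cell by cell.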

“The problem with NanoSIMS is that most of the soluble compounds are lost during sample preparation,” says Anders Meibom, a professor at LGB. To get around this problem, the scientists have painstakingly developed a CryoNanoSIMS microscope that can analyze biological samples in a frozen state, so that nothing is lost. “The CryoNanoSIMS thus allows us to precisely image where soluble compounds, such as drugs or micropollutants, accumulate in individual cells,” says Meibom. The microscope has opened up many new avenues of research. For example, Priya Ramakrishna, a postdoctoral fellow at LGB, is using it to produce high-resolution chemical maps of a model plant to study the cellular response to soil salinity. “Increased soil salinity affects plant growth and therefore has consequences for food crops. We need to understand how plants react to this,” she explains.
Images and AI give a voice to our planet
The Earth’s surface covers some 510 million km², which leaves plenty of room for ecosystems to thrive in remote areas off the beaten track, inaccessible to field scientists. Yet drones, satellites, and smartphones equipped with sensors form a dense network of data collection devices capable of providing anonymized and actionable information. “The satellite we use the most, for example, can take very detailed images of areas 290 kilometers wide with a resolution of 10 meters,” says Devis Tuia, a professor at EPFL’s Laboratory for Environmental Computing and Earth Observation (ECEO). “Because the images are geolocated, we always know the coordinates of the place we are analyzing.”
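To make “geolocated” concrete: a georeferenced satellite image carries an affine geotransform that ties every pixel to map coordinates. Below is a minimal sketch using the GDAL-style six-parameter convention; the origin values are invented, while the 10-meter pixel size and 290-kilometer swath echo the figures quoted above.

```python
# Minimal sketch: mapping a pixel (row, col) of a georeferenced satellite
# image to map coordinates via a GDAL-style affine geotransform:
# (x_origin, pixel_width, row_rotation, y_origin, col_rotation, pixel_height).

def pixel_to_map(gt, row, col):
    """Return the (x, y) map coordinates of a pixel's top-left corner."""
    x = gt[0] + col * gt[1] + row * gt[2]
    y = gt[3] + col * gt[4] + row * gt[5]
    return x, y

# North-up image: no rotation terms, 10 m pixels, y decreasing southwards.
# The origin (300000, 5200000) is an invented UTM-like coordinate.
geotransform = (300000.0, 10.0, 0.0, 5200000.0, 0.0, -10.0)

print(pixel_to_map(geotransform, row=0, col=0))      # image origin
print(pixel_to_map(geotransform, row=0, col=29000))  # 290 km to the east
```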
From studying animal populations and the distribution and maturity of crops to identifying floating debris on the ocean surface and tracking glacier melt, the potential of imaging to observe and monitor the environment is immense. “Each problem has its own sensor and preferred resolution. In addition, the available data is very heterogeneous. We use standard information extraction algorithms and AI to sort, catalogue, search, and process these heterogeneous and unstructured datasets and transform them into useful, structured information,” Tuia explains. His research group recently developed an AI program for rapid 3D mapping of corals—organisms known to play a critical role in marine ecosystems—from footage filmed by commercially available cameras. With this technology, even untrained divers can easily collect data on large coral reefs.
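The article doesn’t detail the coral-mapping program’s internals, but the core idea of recovering 3D structure from overlapping video frames can be sketched with standard computer-vision building blocks. Here is a minimal two-view reconstruction step in OpenCV; the frame paths and camera matrix K are placeholders, and a real pipeline would chain many frames and fuse the results.

```python
# Generic two-view 3D reconstruction, the building block behind mapping a
# reef in 3D from overlapping video frames. Not the ECEO lab's pipeline;
# frame paths and intrinsics K are placeholder assumptions.
import cv2
import numpy as np

img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)  # placeholder
img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)  # placeholder

# Assumed pinhole intrinsics (focal length and principal point, in pixels).
K = np.array([[1200.0, 0.0, 960.0],
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])

# 1. Detect and match local features between the two frames.
orb = cv2.ORB_create(4000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
pts1 = np.float64([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float64([kp2[m.trainIdx].pt for m in matches])

# 2. Estimate the relative camera motion from the essential matrix.
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)

# 3. Triangulate matched points into 3D (up to an unknown global scale).
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
pts3d = (pts4d[:3] / pts4d[3]).T  # N x 3 point cloud of the scene
```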
And then there’s satellite data. There’s still untapped potential in this type of imagery, and researchers often have to train basic image recognition programs from scratch with the limited data available for a specific domain. “Until now, there was no program that could quickly go from recognizing a piece of trash to recognizing a tree or a building, for example,” Tuia says. He and his team, working with colleagues at Wageningen University in the Netherlands, MIT, Yale, and the Jülich Research Center in Germany, have developed METEOR, a chameleon-like application built around a meta-learning algorithm that can train models to recognize new objects from just a few good-quality images. This is a huge time-saver when acquiring field data is difficult or very expensive.
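METEOR is described only at a high level here, so the sketch below shows the general flavor of meta-learning it belongs to rather than its actual architecture: compute one prototype embedding per class from a handful of labeled images, then label new images by their distance to those prototypes (the prototypical-network idea). The tiny CNN encoder is an arbitrary stand-in.

```python
# Few-shot classification sketch: one prototype per class from a handful of
# labeled examples, queries labeled by nearest prototype. Illustrates the
# general meta-learning idea, not METEOR's actual design.
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Arbitrary small CNN mapping an image to an embedding vector."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, dim, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def classify_few_shot(encoder, support, support_labels, queries):
    """support: (N, 3, H, W) labeled images; queries: (M, 3, H, W)."""
    z_support = encoder(support)
    z_query = encoder(queries)
    classes = support_labels.unique()
    # One prototype per class: the mean embedding of its support images.
    prototypes = torch.stack(
        [z_support[support_labels == c].mean(0) for c in classes])
    # Assign each query to the nearest prototype (Euclidean distance).
    return classes[torch.cdist(z_query, prototypes).argmin(dim=1)]

# Toy usage: 2 classes x 5 example images each, 4 unlabeled queries.
encoder = TinyEncoder()
support = torch.randn(10, 3, 64, 64)
labels = torch.tensor([0] * 5 + [1] * 5)
queries = torch.randn(4, 3, 64, 64)
print(classify_few_shot(encoder, support, labels, queries))
```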
Cloud profiling
Scientists at EPFL’s Environmental Remote Sensing Laboratory (LTE) are trying to understand why no two snowflakes, or even raindrops, are the same. They monitor precipitation and study clouds all over the world, including in the Alps, Antarctica, the Arctic, and Greece, using radar, lidar, and a special device that takes 3D photos of snowflakes. “Imaging is the only way to observe how weather phenomena evolve in time and space, at different scales,” says Alexis Berne, a professor at LTE. Even today, researchers struggle to obtain accurate and reliable quantitative data on precipitation, especially for solid precipitation and in mountainous and polar regions. Yet such data can greatly contribute to preserving water resources, predicting natural disasters, and assessing the effects of climate change in highly sensitive regions.
Extra crystals
There is still much to learn about how water droplets and ice crystals form in clouds. While the mechanism of condensation around certain aerosols—solid or liquid particles suspended in the atmosphere—that serve as so-called “ice-forming” nuclei is well understood, a second process, known as secondary ice production, remains something of a mystery. When the researchers pointed their radars at clouds to quantify precipitation formation, the droplets and crystals they observed far outnumbered the available aerosol particles. The numbers didn’t add up. “We don’t really know how this secondary ice process works yet,” Berne says. His lab, along with others at EPFL (the Laboratory for Extreme Environments Research and the Laboratory of Atmospheric Processes and their Impacts), will be participating in a major EU-funded project to profile clouds at different locations around the world. The goal is to observe the behavior of cumulonimbus and other cloud families. “Here, computer modeling will also help us better understand the ambient conditions in which we make our observations,” Berne explains.
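To see why the numbers didn’t add up: if every ice crystal required its own ice-nucleating particle, crystal concentrations could never exceed the supply of suitable aerosols, yet observed concentrations are often orders of magnitude higher. The values below are illustrative orders of magnitude, not LTE measurements.

```python
# Illustrating the mismatch behind "secondary ice": primary nucleation can
# produce at most one crystal per ice-nucleating particle (INP). Values are
# assumed, order-of-magnitude numbers, not measurements from LTE.
n_inp = 1.0    # ice-nucleating particles per litre of air (assumed)
n_ice = 100.0  # ice crystals per litre observed by remote sensing (assumed)

multiplication_factor = n_ice / n_inp
print(f"Ice multiplication factor: {multiplication_factor:.0f}x")
# Anything well above 1 implies secondary processes (e.g. fragmentation of
# existing crystals) must be generating most of the ice.
```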
Images derived from electromagnetic waves
“We don’t do image analysis like in biomedical imaging, for example,” Berne says. The radars used by scientists in his field produce tens of gigabytes of data every day, which are analyzed to conduct case studies of specific weather phenomena and to generate statistics. “The factors we study are generally those we observe indirectly,” Berne says. “Lidars and radars work with electromagnetic waves, and we measure the electromagnetic properties of objects in real time. Our work focuses on retrieval algorithms that allow us to extract information on the microphysical properties of cloud particles, to better understand the mechanisms involved and quantify precipitation more precisely.”
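The article doesn’t name a specific retrieval, but the textbook example of turning an electromagnetic measurement into a precipitation quantity is the empirical Z-R relation Z = aR^b. The Marshall-Palmer coefficients below (a = 200, b = 1.6) are classic literature values for rain; LTE’s actual algorithms are considerably more sophisticated.

```python
# Textbook retrieval step: convert radar reflectivity to rain rate with the
# empirical Z-R relation Z = a * R**b. The coefficients a=200, b=1.6 are the
# classic Marshall-Palmer values for rain, shown only as an illustration.

def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Rain rate R in mm/h from reflectivity in dBZ, inverting Z = a * R**b."""
    z = 10.0 ** (dbz / 10.0)     # dBZ -> linear reflectivity Z (mm^6 / m^3)
    return (z / a) ** (1.0 / b)  # solve Z = a * R**b for R

for dbz in (20, 35, 50):
    print(f"{dbz} dBZ -> {rain_rate_from_dbz(dbz):5.1f} mm/h")
```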