Written by: Hannah McGrath
Whether it’s recent SpaceX launches or the Perseverance Rover landing on Mars, there seems to be a lot going on above our heads. If anyone reading this follows me on Twitter, you’ll have noticed that I occasionally get far too excited about space. But sometimes, it can be hard to connect these achievements with our day-to-day lives, particularly if you’re standing in a blustery carrot field looking at insect damage.
In my opinion, one of the greatest challenges agriculture currently faces is how to improve the sustainability of farming across the 51 million km² of farmland around the globe¹.
For every big statistic I can give you about our changing landscapes, ultimately, it comes down to individual trees being cut down, singular seeds being planted, and each litre of water used. We need to find ways to monitor and manage vast areas whilst understanding what’s going on in each field to help feed the world and allow nature to flourish. Satellites are going to be a helpful bit of tech as we try to do this.
At the beginning of 2021, there were 3,372 active satellites² hurtling around the Earth. Currently at 786 km above our heads are two Sentinel-2 satellites, built by the European Space Agency, scanning the UK every 4 or 5 days. Sentinel-2 is a passive optical imaging satellite, which receives visible and near-infrared wavelengths of light at a 10 m resolution. This sounds quite technical, but what it really means is that the satellite detects the light that isn’t absorbed by anything on Earth and is reflected back into space, in the same kind of way that our eyes receive light rays. If you’ve ever googled why plants appear green, it’s because chloroplasts absorb and use red and blue wavelengths of light, reflecting back the green wavelengths which our eyes pick up.
So, Sentinel-2 satellites are a bit like our eyes, but obviously this satellite is highly sophisticated and uses things called beam splitters and focal plane assemblies³ to ‘see’, which as a biologist I don’t really understand! What I do know is that this setup results in a map of an area that can be broken down into a grid of 10 by 10 metre squares of visual imagery. These maps are then what I get excited about!
There are lots of uses for these images; one that those of you in agriculture might have heard about is using them to calculate NDVI, or the Normalized Difference Vegetation Index. Again, some technical jargon for you! What this index shows is how much more near-infrared light is reflected compared to visible red light. This is important because this difference in reflection, which is picked up by the satellite, can tell us about the health of a plant, like the tree in the figure below. But, again, this gets really useful when we can look at plant health across a whole farm, county or country!
This picture from Earth Observing System shows how NDVI can be used as an indicator of plant health⁴. The differences in the amounts of near-infrared and visible red light reflected, shown by the arrows in the picture, are what is used to calculate the NDVI index.
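The NDVI calculation itself is simple enough to sketch in a few lines of Python. The reflectance values below are made-up numbers for a tiny 3×3 patch of 10 m pixels, not real Sentinel-2 data (in real imagery, near infrared comes from band 8 and red from band 4):

```python
import numpy as np

# Hypothetical surface reflectance (scaled 0-1) for a 3x3 patch of pixels.
nir = np.array([[0.45, 0.50, 0.48],
                [0.30, 0.42, 0.47],
                [0.12, 0.15, 0.40]])  # near-infrared band
red = np.array([[0.08, 0.07, 0.08],
                [0.15, 0.09, 0.08],
                [0.10, 0.11, 0.09]])  # visible red band

# NDVI = (NIR - Red) / (NIR + Red).
# Values near +1 suggest dense, healthy vegetation; values near 0
# suggest bare soil or stressed plants.
ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 2))
```

Because healthy leaves reflect much more near-infrared than red light, the ratio lands close to 1 for thriving vegetation and drops towards 0 where plants are sparse or struggling.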
With the growth of sophisticated machine learning image classification techniques to help us process these satellite images, there will be increasingly up-to-date and useful maps of our farmland. Whether we are looking at crop rotations, droughts and floods, where trees are being planted, or how healthy our carrot crops are, all of this data comes from a satellite about the size of an SUV whizzing above our heads.
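As a toy illustration of the kind of per-pixel classification involved, here is a minimal rule-based sketch using made-up NDVI values; real pipelines use trained machine-learning models over many spectral bands, not hand-picked thresholds like these:

```python
# Hypothetical NDVI values for a strip of 10 m pixels.
ndvi_pixels = [0.72, 0.65, 0.05, -0.10, 0.35, 0.80]

# Illustrative thresholds only -- a stand-in for a trained classifier.
def classify(value):
    if value < 0.1:
        return "bare soil / water"
    if value < 0.5:
        return "sparse or stressed vegetation"
    return "dense healthy vegetation"

labels = [classify(v) for v in ndvi_pixels]
print(labels)
```

Run over a whole image, per-pixel rules like this become a land-cover map; a trained model simply learns far better decision boundaries from labelled examples.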
These technological advances have happened because scientists, engineers and mathematicians from across Europe have worked together on space exploration. When they go to team meetings at the European Space Agency to study the details of rocket propulsion or spacecraft design, I’m guessing they probably don’t focus on how their work allows me to look at a carrot field. But what I do know is that sometimes science that seems a long way from our daily lives can influence what ends up on our dinner plates.