Is This Weed-Spotting, Yield-Predicting Rover the Future of Farming?

Earth's population will reach almost ten billion by 2050. That growth brings with it a huge demand for food, especially for drought-, heat-, pest- and disease-resistant crop varieties that can produce high yields in the face of climate change.
X, Alphabet Inc.'s moonshot factory, is where innovators tackle some of the world's most difficult problems and create groundbreaking technology at a rapid pace. One of X's current projects, Project Mineral, focuses on addressing global food security through computational agriculture, a term X coined to describe new technologies that deepen our understanding of the plant world.

Elliot Grant, Project Mineral's lead, says agriculture has already become heavily digitized: today's farmers use GPS, spreadsheets and sensors to gather data about their crops, and satellite imagery of their fields. But all that data has not led to greater understanding. The next step is to combine multiple technologies, such as sensors, machine learning, modeling and robotics, to make sense of the complex plant world. Computational agriculture, Grant explained, is the analysis of all that data.

Since the project launched in 2016, the Mineral team's innovators have focused on one question: Can a machine learn to understand the world of plants?

Grant and his team have perfected their latest prototype: a plant-scanning rover powered by artificial intelligence. It will be made public at the Smithsonian's "Futures" exhibition, an extensive exploration of the future through art, design, history and technology. The sleek, four-wheeled robot, roughly the height of a shipping container and nearly the width of a car, can sync up with satellite imagery and weather data to provide soil information. As it moves through a farm, it can detect weeds and measure the ripeness of fruit. To accommodate different stages of crop development, the Mineral rover can adjust its length, width and height: it can be made taller to capture mature wheat plants or scan large areas of lettuce.

It didn't look so impressive at first: the prototype was built from two bicycles and some scaffolding. Mineral's diverse team of engineers, biologists and agronomists took their Franken-machine to a nearby strawberry farm and pulled it through rows of red fruit to test whether it could capture enough images for machine learning.

After a few hours of pushing and pulling the contraption through the mud, and a few squashed berries, the team returned to the lab and reviewed the imagery. Grant explained that while plenty still needed improvement, there was a glimmer that this would work.

The Mineral team began experimenting with farmers and plant breeders, building, scrapping and reimagining the rover along the way. This phase of momentum-building and burn-and-churn is part of X's rapid iteration process: if an experiment doesn't work out, X project managers learn from the mistakes and move on. Grant says the essence of rapid iteration is to move quickly and take risks.

Mineral tested a machine learning technique called CycleGAN (cycle-consistent generative adversarial network) to create simulated images of strawberry plants. CycleGAN produces realistic images that Mineral can use to diversify the rover's image library, so the rover can identify crops, traits and ailments under the varied conditions it encounters in the field.
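Mineral's actual models are not public, but the defining idea of CycleGAN, cycle consistency, is easy to illustrate. A toy sketch: translating an "image" into the other domain and back should reproduce the original. Here, two tiny linear maps stand in for CycleGAN's deep convolutional generators; everything below is an illustrative assumption, not Mineral's pipeline.

```python
import numpy as np

# Toy illustration of the cycle-consistency idea behind CycleGAN.
# Real CycleGANs pair this loss with adversarial losses and learn deep
# generators; here two linear maps stand in so the loss itself is visible.

rng = np.random.default_rng(0)

# G maps domain X (e.g. real strawberry photos) to domain Y (simulated
# style); F maps back. As invertible stand-ins: a matrix and its inverse.
A = rng.standard_normal((4, 4))
G = lambda x: x @ A                  # X -> Y
F = lambda y: y @ np.linalg.inv(A)   # Y -> X

x = rng.standard_normal((8, 4))      # a batch of 8 "images", 4 features each

# Cycle-consistency loss: F(G(x)) should reproduce x. During training this
# term keeps the generators from inventing content unrelated to the input.
cycle_loss = np.mean(np.abs(F(G(x)) - x))
print(cycle_loss)  # near zero, since F is exactly G's inverse here
```

In a real CycleGAN the generators are imperfect, so this loss is minimized jointly with the adversarial terms rather than being zero by construction.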

Simulated imagery also gives the team a way to train the A.I. on diseased plants without the unappealing alternative: intentionally inoculating fields with pathogens just to gather training examples.

"We are able to create realistic simulated images of plants," Grant explains.

The team eventually built a rover capable of detecting rust and other fungal diseases. Mineral has also partnered with a farmer in the Philippines to help catch banana diseases: by studying images of sick banana plants, the rover will learn to identify disease in banana crops.

The robot can also photograph flowers and use a machine learning model to calculate a plant's blooming rate, which is crucial for understanding how plants respond to their environment and for predicting how much fruit a plant will produce. It can count individual raspberry buds and estimate numbers across an entire field of soybeans. So far, Mineral has experimented with images of soybeans, melons, oilseeds, lettuce, oats and barley, from early sprouts to fully grown produce.

Through its algorithms, the robot can detect greenness and distinguish different leaf sizes. It photographs plants from many angles and converts each pixel into data. Mineral uses both RGB (red, green, blue) and HSV (hue, saturation, value) color coding to analyze the color of plants.

Olivia Evans, a marketing manager at X, explains that identifying a specific hue of green in a plant helps predict its yield. This is something people cannot do objectively, because we each perceive color differently. A machine, however, sees color objectively and can detect patterns using RGB or HSV color values.
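The pixel-level idea is simple to sketch. The snippet below converts RGB pixels to HSV with Python's standard library and scores a tiny patch by the fraction of pixels whose hue falls in a green band. The hue thresholds and the toy pixels are illustrative assumptions, not Mineral's actual values.

```python
import colorsys

# Assumed hue band (in degrees) for "leafy green"; purely illustrative.
GREEN_HUE_RANGE = (75.0, 165.0)

def pixel_hue_degrees(r, g, b):
    """Convert an 8-bit RGB pixel to an HSV hue angle in degrees."""
    h, _s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0

def greenness(pixels):
    """Fraction of pixels whose hue falls inside the assumed green band."""
    lo, hi = GREEN_HUE_RANGE
    green = sum(1 for (r, g, b) in pixels
                if lo <= pixel_hue_degrees(r, g, b) <= hi)
    return green / len(pixels)

# A four-pixel toy "image": two leafy greens, a soil brown, a sky blue.
patch = [(60, 140, 50), (30, 120, 40), (120, 90, 60), (140, 180, 220)]
print(greenness(patch))  # 0.5: two of the four pixels read as green
```

Working in HSV is what makes the band a single range: in raw RGB, "green" spans many unrelated combinations of the three channels, while in HSV it collapses onto one axis, the hue.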

Plant breeders spend hours documenting the physical characteristics and growth patterns of thousands of plants in a field, a process called phenotyping. But phenotype data collection depends on human perception, which is not always accurate.

"Can we create a set of tools to help these breeders see the world of plants in a new way? Higher fidelity, more often and more easily?" Grant asks. Walking a field to phenotype plants by hand is very time-consuming.

Scientists are racing to learn more about plants' genes, their genotype, and to match those genes with physical characteristics, their phenotype. The gap between the two, the missing knowledge of how genes link to desired traits, is known in agriculture as the phenotyping bottleneck. If scientists could pair existing genetic sequence logs with detailed plant trait data, they might breed more resilient plants that can withstand climate change.

That is why it takes time to bring new crop varieties to market: understanding how genes express themselves as plant traits across different environments requires analyzing vast amounts of genetic and phenotypic information.

Chinmay Soman, cofounder and CEO of the agritech company EarthSense, has been working on similar rover technology. For him, it all begins with high-throughput field phenotyping.

Computer vision is increasingly being used to attack the phenotyping problem, since a simple photo can be mined for information about a plant. EarthSense's TerraSentia is a small robot that can fit in a car trunk and zip under a plant canopy; Mineral's rover, by contrast, towers over crops and requires a truck to transport it. Both use A.I. to capture data about plant traits that could help crop breeders develop better varieties. The Mineral rover captures thousands of images per minute, which amounts to more than a hundred million images in a single season.
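The season-scale figure is plausible back-of-the-envelope arithmetic. Under rough operating assumptions (ours, not Mineral's) about scanning rate and duty cycle:

```python
# Sanity check: do "thousands of images per minute" really add up to
# more than a hundred million images in one season?
# All three operating assumptions below are illustrative, not Mineral's.

images_per_minute = 2_000   # "thousands per minute" (assumed)
hours_per_day = 8           # assumed daily scanning time
days_per_season = 110       # assumed length of a growing season

season_total = images_per_minute * 60 * hours_per_day * days_per_season
print(f"{season_total:,}")  # 105,600,000
```

Even at these conservative settings the total clears one hundred million, which is what makes automated phenotyping "high-throughput" compared with breeders walking the rows by hand.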

Although Project Mineral's rover is far more advanced than its original cobbled-together design, it is still a prototype. Mineral insists that, for all its technology, the team keeps improving the rover and works closely with agriculture experts to better understand plants.

Grant says the rover is the current manifestation of the team's vision for breeders, and that it is learning right along with them.

"Futures" will feature the prototype in its "Futures that Work" section in the building's west hall, an area created to reflect on renewability and sustainability and to showcase innovations that may soon become reality.

"We were really happy to be able to show something that was still in a semifinished, prototypical phase," says Ashley Molese, curator of special projects for the Smithsonian's Arts and Industries Building. The rover isn't rolling off factory floors yet, but it is past early prototyping, even if there are still kinks to be worked out.

A video behind the rover display will show a fleet of Mineral rovers traversing a field, then cut to footage of what the rover "sees" as it images strawberries and soybeans.

Molese says there is something slightly anthropomorphic about it: its forward-facing cameras look ahead, almost like eyes. She is curious to see how visitors react to it.

Visitors can see Mineral's plant-scanning robot, envision the future of food security and sustainability, and, like the Mineral team, ponder the possibilities.

What if a farmer could manage every plant individually? How would that contribute to sustainability? What if disease could be detected before it became visible? What if plants could be grown in symbiotic relationships so they need fewer inputs and grow healthier? Grant says questions like these are what keep the team going every day.