The Self-Driving Farm
In the opening pages of Robot Ethics 2.0, we’re presented with one of the most infamous ethical dilemmas: the Trolley Problem. In the trolley problem, you find yourself standing next to a lever that controls a switch in the trolley’s tracks. The trolley’s brakes are broken and it cannot stop. Your lever lets you change which track the trolley will advance down, but there’s a catch. On one track, several people are tied down, and on the other, just one person. There are variants to the trolley problem. In some, the one person is your spouse while the people tied down are your children, making the decision more difficult. In another, a fat man stands nearby whom you could push onto the tracks to stop the trolley, but in doing so you would almost certainly be declared a murderer. The point of the trolley problem is to present a person with a no-win scenario where they are forced to choose the lesser of two evils.
This scenario is infamous, but it is unlikely that we would ever face it in our day-to-day lives. The life of a robot, however, is very different. When a human makes the decision to drive onto the sidewalk and run over a pedestrian rather than crash into a school bus full of children, it is the driver who is held responsible. When a robot with some degree of self-awareness does the same, who is responsible? The programmer, the manufacturer, the owner of the vehicle, or is the robot itself capable of being punished?
A farmer may be wondering what this has to do with her, but she may one day find herself browsing self-driving tractors ranked by their ability to make ethical decisions on the fly. Self-driving farm equipment is probably closer to reality than self-driving trucks or taxis: Case IH has already revealed a self-driving tractor, and John Deere has been showing off an autonomous tractor of its own that is accurate down to 2.5 centimeters.
But a farmer will still need to be able to trust the equipment to get the job done well. If a hired hand takes the tractor for a joyride, he can be fired. But can a tractor be trusted not to use the livestock as manure, and to avoid knocking down a barn to make room for more rows?
Consider a scenario where you have just purchased a self-driving spreader. It rained a few days ago, but you need to get seed in the ground soon or you’ll be harvesting late. As the spreader gets to work, it sends an alert back to you in the comfort of your home office: the ground is muddy and it is struggling to slow down. You head outside to see just how bad it is, and the family dog runs out with you. Eager to help, she catches up to the spreader before you do. The spreader is then faced with an ethical dilemma: does it swerve to avoid the pet, does it apply its brakes and risk sliding into the dog, or does it continue with its task, ignoring this new variable?
For some, the answer is obvious: the spreader should do whatever it takes to avoid hitting the dog. For another farmer, the deep gouges the spreader would leave as it slams to a stop mean an unproductive spot in the field, so the spreader needs to swerve. Still other farmers may say they need perfect rows for their other robots to work, and that a gap between rows would let sunlight through where weeds can take root; that spreader may need to slam on its brakes even if it means the family dog gets hurt. All of this assumes the machine is intelligent enough to tell the difference between the family pet and a piece of mud on a sensor that looks like Fido.
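One way to picture how those competing priorities might become machine settings is a simple weighted policy. This is purely a sketch with hypothetical names; no real tractor exposes anything like this interface, and a real system would weigh far more than three values.

```python
# Hypothetical sketch: each response to an obstacle sacrifices one of the
# farmer's priorities, so the machine picks the response whose sacrifice
# the farmer values least. All names here are invented for illustration.
from dataclasses import dataclass

@dataclass
class FarmerPreferences:
    protect_animals: int  # relative weight; higher means more important
    protect_rows: int     # keeping rows straight and unbroken
    protect_soil: int     # avoiding gouges from hard braking

def choose_action(prefs: FarmerPreferences) -> str:
    """Return 'swerve', 'brake', or 'continue' for the dog-in-the-field case."""
    # swerving costs row quality, braking costs the soil,
    # and continuing costs the animal's safety
    costs = {
        "swerve": prefs.protect_rows,
        "brake": prefs.protect_soil,
        "continue": prefs.protect_animals,
    }
    return min(costs, key=costs.get)

# A farmer who values the dog above rows or soil: braking is the
# cheapest sacrifice, so the spreader slams to a stop.
prefs = FarmerPreferences(protect_animals=10, protect_rows=3, protect_soil=1)
print(choose_action(prefs))  # -> brake
```

Even this toy version shows the uncomfortable part: someone has to choose the weights, and the "right" answer changes from farm to farm.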
The artificial intelligence governing these self-driving tractors may one day be capable of making these decisions on the fly, but until then, companies like Case IH and John Deere may soon be knocking on farmers’ doors and asking them to weigh in on difficult ethical decisions. After all, when a tractor does something unethical, who is to blame? The programmer, the owner of the tractor, or will the tractor be aware enough that the artificial intelligence itself can be punished? Before these questions can be answered, the machine must be capable of making such decisions in the first place.
It is likely that in the short term these tractors will come with settings you can adjust. Your own tractor might tread carefully and leave sloppy rows to keep your pets and children safe. And Kenneth down the road? Some say he’s stopped keeping pets around after the last few incidents, but his rows are perfect and his yields are the best in the county. Of course, everyone knows Kenneth also uses robots that eliminate anything that isn’t corn or soy. He utilizes Perfect Pest Prevention methods.
For citations, please visit my bibliography in the link below. It will be updated as more citations are found, with commentary as more information is uncovered.
At the time of this writing, I am a student of computer science & crop science at Parkland College in Illinois. To learn more, check out my About Me page.