Lonely diesel-powered, laser-blasting, weed-killing robot

The 9,000-pound autonomous field tool uses a 72-horsepower Cummins diesel engine and lasers that operate outside the visible range at a 10.6-micron wavelength, rapidly generating intense heat that, when focused on weeds, kills them. | Carbon Robotics photo

A Seattle-based startup has launched a 9,000-pound autonomous robot that uses a 72-horsepower Cummins diesel engine to power weed-blasting lasers.

The machine was built to manage weeds in vegetable row crops, although it might also find a fit on the farm of Austin Powers’ nemesis, Dr. Evil, if he had one.

“We’re focusing on low-lying row crops, or what some people refer to as specialty crops. It’s the kind of thing that you eat directly. So examples would be onions, carrots, sweet potatoes, potatoes, asparagus, broccoli, cauliflower, leafy greens, you know, spinach and some herbs, like mint and things like that,” said Paul Mikesell, founder of Carbon Robotics in Seattle, Washington.

“It’s those kinds of plants that usually have a robust organic market, and because we are an organic-certified solution for weed control, we can help farmers become organic.”

The Cummins engine powers the hydraulic system, including the wheels and steering, as well as the generator that runs the onboard computers and the lasers.

“There are eight lasers across. They are arranged in a fashion that’s parallel to the furrows. So if you imagine a row with our vehicle in it that’s driving forward, those lasers are arranged linearly pointing back to front. Then through some optics the targeting system bounces that beam down at the bottom of the robot to target the weeds,” Mikesell said.

The lasers operate outside the visible range at a 10.6-micron wavelength, which generates intense heat quickly.

Mikesell said the lasers work best when they target the growth plate of young plants, where the meristematic growth cells are located.

“We like to shoot them (weeds) as soon as that (growth plate) is visible above the ground. And so this is usually within the first 10 days of the weeds popping up,” Mikesell said.

“If they (weeds) get too big and to the point where the laser would simply take too long to kill them, then it would be too late for us to get in there.”

The Carbon Robotics robot typically works at about six mph, but the company is working on making it faster.

Mikesell said large vegetable fields will need multiple laser-blasting robots to keep ahead of the weeds, but the system offers considerable savings compared with relying on labourers to manually kill weeds.

“For an organic farmer, we are anywhere from half to in some cases one-tenth the price to do weed control per acre,” he said.

“For conventional (farms) we are comparable or slightly more expensive.”

Mikesell started the company in 2018 with another engineer and they worked with a farmer in Seattle to research ways to bring autonomy to farming.

After the first year, the company raised just under $9 million from investors to bring the concept to a production platform.

“This is the third generation of that particular robot. We’ve had two earlier ones that we’ve been testing and developing. So we’re pretty happy with the robot. We know how it works. We know the pitfalls of being out in the field with weather, wind, dust, heat, temperature fluctuations, that kind of a thing,” he said.

Mikesell’s background is in computer vision and deep learning, which he applied to help the robot differentiate weeds from crops.

“It’s a learning algorithm, so it’s a neural net that has many different layers to it. It runs on GPUs (graphics processing units), which were originally developed for graphics processing and have now been used for other things, you know, like cryptocurrency mining. We use them a lot in deep learning because they do very fast vector operations, things that can run in parallel, much like a human brain does,” Mikesell said.

He said the learning procedure involves providing the algorithm with many sample images, each carrying labels that say what’s in the image.

“By label I mean pixel locations that have a label associated with them. So this would be an onion, for example, and there’s an outline of an onion. Or this is a weed that we want to shoot, and we’ll have the center of the weed’s meristematic growth plate that we’re targeting with the laser,” Mikesell said.
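As an illustration only, a labelled sample might be represented something like the Python sketch below. The class names, field names and coordinate layout are assumptions made for the example, not Carbon Robotics’ actual annotation format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical annotation types; not the company's real schema.
@dataclass
class CropLabel:
    species: str                      # e.g. "onion"
    outline: List[Tuple[int, int]]    # polygon of pixel (x, y) points around the plant

@dataclass
class WeedLabel:
    species: str                      # e.g. "purslane" or "lambsquarters"
    meristem_px: Tuple[int, int]      # pixel location of the growth plate to target

@dataclass
class LabelledImage:
    image_path: str
    crops: List[CropLabel] = field(default_factory=list)
    weeds: List[WeedLabel] = field(default_factory=list)

# One labelled training sample: an onion outline plus a weed's aim point.
sample = LabelledImage(
    image_path="field_0001.png",
    crops=[CropLabel("onion", [(120, 80), (160, 78), (165, 130), (118, 128)])],
    weeds=[WeedLabel("purslane", meristem_px=(240, 96))],
)
```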

Once the neural net is given enough samples, it will learn to differentiate weeds from crops.

“Now it can make inferences about things that it hasn’t seen before, and it can say, ‘oh, that’s this kind of plant, I’ve seen that before, it’s not an exact copy but I know that’s an onion. Or I’ve seen that before, it’s not an exact copy, but I know that’s a purslane, which is a type of weed, or lambsquarters, which is a type of weed.’ So it learns, and then as we feed it more and more information it gets better and better.”
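For readers curious what that training-then-inference loop looks like in code, here is a heavily simplified sketch using PyTorch and a toy per-pixel classifier. The network, class list and training settings are placeholders; the company has not published its architecture.

```python
import torch
import torch.nn as nn

NUM_CLASSES = 3  # an assumed set of classes: background, crop, weed
device = "cuda" if torch.cuda.is_available() else "cpu"

# Toy per-pixel classifier; the production network is not public.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, NUM_CLASSES, 1),
).to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-ins for a batch of labelled field images and their pixel labels.
images = torch.rand(4, 3, 128, 128, device=device)
labels = torch.randint(0, NUM_CLASSES, (4, 128, 128), device=device)

# Training: show the network labelled examples and adjust its weights.
for step in range(10):
    optimizer.zero_grad()
    logits = model(images)            # (batch, classes, height, width)
    loss = loss_fn(logits, labels)
    loss.backward()
    optimizer.step()

# Inference on an image the network has never seen: most likely class per pixel.
with torch.no_grad():
    new_image = torch.rand(1, 3, 128, 128, device=device)
    per_pixel_class = model(new_image).argmax(dim=1)
```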

With eight lasers arranged across its width, the unit uses optical sensors to find and kill weeds. | Carbon Robotics photo

The processing and predictions are made in real-time by the on-board computer, which does not need broadband connectivity.

However, during the training process, the neural nets require the team to gather example imagery and upload it to computers that conduct the deep learning processes.
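In other words, the heavy learning happens off the robot, and only the finished network rides along. A minimal sketch of that split, again assuming PyTorch, a toy model and a placeholder file name:

```python
import torch
import torch.nn as nn

# Offline: after training on the uploaded imagery, serialize the network so the
# robot's on-board computer can run it without any network connection.
# The model shape and file name are placeholders, not the company's artifacts.
model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.Conv2d(16, 3, 1))
torch.jit.script(model).save("weeder_model.pt")

# On the robot: load the model once at startup, then predict frame by frame.
onboard = torch.jit.load("weeder_model.pt")
with torch.no_grad():
    frame = torch.rand(1, 3, 128, 128)           # a camera frame, as a tensor
    per_pixel_classes = onboard(frame).argmax(dim=1)
```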

Before turning the laser-blasting robots loose on a field, Carbon Robotics scouts weeds to fine-tune the AI for that specific field.

“Sometimes we can deploy the exact same ones (neural nets) that we’ve had before. Sometimes there are some smaller tweaks, what’s generally referred to as fine-tuning,” Mikesell said.

“The procedure usually takes 24 to 48 hours from initial arrival (at the field) to getting a good neural net, good predictions for us. That’s assuming it’s a new field.”
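Fine-tuning, in standard deep-learning practice, means starting from an already-trained network and adjusting it on a small amount of new imagery. The sketch below shows the general idea with a toy model and made-up data standing in for the company’s actual pipeline.

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Toy stand-in for a previously trained weed/crop network.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 1),
).to(device)
# model.load_state_dict(torch.load("pretrained_weeder.pt"))  # hypothetical prior weights

# Keep the earlier, general-purpose layers fixed; only adjust the output layer.
for param in model[0].parameters():
    param.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
loss_fn = nn.CrossEntropyLoss()

# A small batch of imagery from the newly scouted field (random stand-ins here).
new_images = torch.rand(8, 3, 128, 128, device=device)
new_labels = torch.randint(0, 3, (8, 128, 128), device=device)

for step in range(5):                # a few short passes over the new-field samples
    optimizer.zero_grad()
    loss = loss_fn(model(new_images), new_labels)
    loss.backward()
    optimizer.step()
```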

He said the company will have several robots deployed in several fields this year, and by next year there will be between 30 and 50 robots out working.

The laser-blasting robot’s autonomous driving capability also relies on computer vision.

“So again this idea about neural nets, AI, deep learning, training. The images in this case, instead of looking down at the field, are looking forward out the front of the robot, and it can see the furrows,” Mikesell said.

“Our robots will drive down the same furrows, in the same tracks that the tractor tires go, and they do this by visually identifying those furrows through these cameras.”
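The steering side of that can be pictured as a simple feedback loop: compare where the furrow appears in the camera image with where it should be, and correct accordingly. The sketch below is a generic proportional controller with made-up numbers, not Carbon Robotics’ control code.

```python
def steering_correction(furrow_center_px: float,
                        image_width_px: int = 1280,
                        gain: float = 0.002) -> float:
    """Hypothetical proportional steering: return a correction (radians)
    that nudges the robot back toward the detected furrow centerline.
    The furrow position would come from the forward-facing camera's
    detection network; the gain and units here are illustrative only."""
    error_px = furrow_center_px - image_width_px / 2   # offset from image center
    return -gain * error_px                            # steer back toward the furrow

# Example: furrow detected 100 px right of center, so steer slightly left.
print(steering_correction(740.0))   # -0.2 rad with the toy gain above
```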

At the end of the rows the company sets up a geofence, a series of points that marks out the edge of the field.

When the laser-blasting robot crosses a geofence, it knows it has to turn around before it re-engages the lasers.
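Deciding whether the machine is still inside that fence is a classic point-in-polygon test. The sketch below uses the standard ray-casting method with made-up coordinates; how the company actually implements its geofence is not described.

```python
from typing import List, Tuple

def inside_geofence(position: Tuple[float, float],
                    fence: List[Tuple[float, float]]) -> bool:
    """Standard ray-casting point-in-polygon test. The fence is the series of
    corner points marking the field edge; coordinates could be local map
    coordinates in metres. Illustrative only, not the vendor's implementation."""
    x, y = position
    inside = False
    for (x1, y1), (x2, y2) in zip(fence, fence[1:] + fence[:1]):
        if (y1 > y) != (y2 > y):                        # edge straddles the test row
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A rectangular field; outside the fence the lasers would be disengaged
# and a turn-around initiated (hypothetical behaviour for the example).
field = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0), (0.0, 50.0)]
lasers_enabled = inside_geofence((50.0, 25.0), field)    # True
must_turn_around = not inside_geofence((120.0, 25.0), field)  # True
```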

The Carbon Robotics autonomous weeder also uses lidar to watch for anything in front of or behind the robot, which will automatically stop if an obstacle is detected.

The lidar and computer vision-based safety features ensure that the laser-blasting robot doesn’t laser-blast someone’s foot.
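Conceptually, the safety check can be as simple as stopping whenever any lidar return comes back closer than a set threshold. The sketch below illustrates that idea with an assumed three-metre threshold; the real system’s logic and limits are not public.

```python
from typing import Iterable

STOP_DISTANCE_M = 3.0   # assumed safety threshold, not a published spec

def should_emergency_stop(lidar_ranges_m: Iterable[float]) -> bool:
    """Return True if any lidar return ahead of or behind the robot is closer
    than the stop threshold. A sketch of the idea only."""
    return any(r < STOP_DISTANCE_M for r in lidar_ranges_m)

# Example scan: everything clear except one return at 1.8 m, so stop.
scan = [12.4, 9.7, 5.1, 1.8, 14.0]
if should_emergency_stop(scan):
    # halt the drive hydraulics and disable the lasers (hypothetical actions)
    pass
```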

However, Mikesell said if a foot does get hit with a laser it wouldn’t be a big deal.

“This is not like a laser blaster from Star Wars kind of thing. But it’s really not comfortable (to get blasted),” Mikesell said.

“If you felt it with your hand, which many people have done through various accidents and whatnot, it feels hot really fast.”

He said the Carbon Robotics autonomous weeding robot costs approximately US$200,000.
