Perception for LS3
NREC’s perception system for DARPA’s Legged Squad Support System (LS3) enables the robot to perceive its surroundings and to autonomously track and follow a human leader.
Boston Dynamics developed a rugged, highly mobile, four-legged robot, the Legged Squad Support System (LS3). LS3 walks (or even runs) with dismounted troops, lugging along their gear and supplies. It carries 400-pound loads and has enough fuel on board to travel up to 20 miles on missions that can last as long as 24 hours.
LS3 follows troops, navigates through its surroundings, and obeys commands in a way similar to a trained pack animal. To do these things, it must be able to perceive the world around it and act on that information. LS3’s perception system gives it the ability to sense and understand its environment.
LS3’s perception system includes stereo, color, and infrared cameras plus ladar. It uses these sensors to locate and classify obstacles, generate 3D maps of terrain, and follow personnel. LS3 can detect and avoid obstacles in its path. It is able to distinguish vegetation (which it can pass through) from rocks and other hard obstacles (which it cannot). This information underpins both foot-placement planning and motion planning.
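The vegetation-versus-hard-obstacle distinction can be pictured as a per-cell classification over a terrain grid. The sketch below is illustrative only: the feature names (`height_above_ground`, `porosity`, i.e. the fraction of ladar beams that pass through a cell, a common proxy for soft vegetation) and all thresholds are assumptions, not LS3's actual parameters or interface.

```python
from dataclasses import dataclass
from enum import Enum


class CellClass(Enum):
    GROUND = "ground"
    VEGETATION = "vegetation"   # soft: the robot may push through it
    OBSTACLE = "obstacle"       # hard: must be avoided


@dataclass
class CellStats:
    """Aggregated ladar returns for one terrain-grid cell (hypothetical features)."""
    height_above_ground: float  # max return height above local ground, meters
    porosity: float             # fraction of beams that passed through, 0..1


def classify_cell(cell: CellStats,
                  step_height: float = 0.3,
                  porosity_min: float = 0.5) -> CellClass:
    """Label a cell as walkable ground, soft vegetation, or hard obstacle.

    Illustrative rule: anything below the robot's step height is walkable;
    taller returns count as vegetation if many beams penetrated the cell,
    otherwise as a solid obstacle.
    """
    if cell.height_above_ground <= step_height:
        return CellClass.GROUND
    if cell.porosity >= porosity_min:
        return CellClass.VEGETATION
    return CellClass.OBSTACLE
```

A downstream planner would then treat `GROUND` and `VEGETATION` cells as traversable (perhaps at different costs) and route foot placements away from `OBSTACLE` cells.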
Following a Leader
LS3 can track and follow an individual soldier or Marine in many different ways.
In leader-follower tight mode, LS3 tracks and follows the leader’s path as closely as possible. It trusts the leader to choose a safe route that avoids dangerous obstacles and terrain.
In leader-follower corridor mode, LS3 tracks and follows the leader but is free to choose its own path that avoids obstacles and terrain hazards. The leader does not need to worry about picking a route that is safe for the robot.
In go-to-waypoint mode, LS3 navigates on its own to a specific location. It uses its perception system to find a route to its goal, avoiding obstacles and terrain hazards along the way.
Troops can also directly operate LS3 with a hand-held controller.
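The four operating behaviors above amount to a mode-dependent choice of navigation goal. The following sketch shows one way such a dispatch could look; the mode names, function, and data shapes are assumptions for illustration, not NREC's actual software interface.

```python
from enum import Enum, auto


class FollowMode(Enum):
    LEADER_TIGHT = auto()     # replay the leader's path as closely as possible
    LEADER_CORRIDOR = auto()  # plan own path toward the leader
    GO_TO_WAYPOINT = auto()   # navigate to a fixed map location
    MANUAL = auto()           # direct commands from the hand-held controller


def select_goal(mode, leader_path, waypoint, operator_cmd):
    """Pick the navigation goal for the current mode (simplified sketch).

    leader_path is the trail of leader positions, oldest first;
    waypoint and operator_cmd are positions supplied externally.
    """
    if mode is FollowMode.LEADER_TIGHT:
        # Track the breadcrumb trail the leader walked, point by point,
        # trusting the leader to have chosen a safe route.
        return leader_path[0] if leader_path else None
    if mode is FollowMode.LEADER_CORRIDOR:
        # Head toward the leader's latest position; the local planner
        # is free to deviate around obstacles and terrain hazards.
        return leader_path[-1] if leader_path else None
    if mode is FollowMode.GO_TO_WAYPOINT:
        return waypoint
    return operator_cmd  # MANUAL: the operator drives directly
```

In tight mode the goal walks forward along the recorded trail, while corridor mode collapses the trail to a single target and leaves route selection to the obstacle-aware planner.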