Inventing Marine Robots

Professors Hanumant Singh (ECE), Mark Patterson (CEE & MES), and Joseph Ayers (MES) are inventing new and improved robots to explore parts of the ocean that humans cannot.


Source: Northeastern Magazine

An unmanned kayak glides slowly through a fjord in Greenland, 100 miles from the nearest town, in temperatures hovering below freezing.

The kayak stops. Above it looms a glacier that is 3 miles wide and taller than a Manhattan skyscraper. Silently, the kayak begins its work. Using sensors, it slips along the edges of the glacier, collecting data about the massive ice form—data that could help scientists track the pace of climate change.

This 2014 research mission would have been impossible without the kayak, a marine robot named JetYak invented by Northeastern professor Hanumant Singh. Because humans who venture near a glacier risk getting struck by giant shards of falling ice, Singh operated the robot from a boat a safe distance away.

In marine science, robots are transforming ocean exploration. Often called autonomous underwater vehicles, or AUVs, these robots reach remote locations that are otherwise inaccessible to humans. They can survey shipwrecks, detect pollutants, track fish populations, and even search for underwater mines.

Northeastern is a leader in marine robotics, with three pioneers in this emerging field: Singh, Mark Patterson, and Joe Ayers. The three have spent decades making robots that conduct research across the globe. They continue to enhance their AUVs, recognizing that many fields—homeland security, climate change, and coral reef ecology, to name a few—could benefit from the untapped potential of sea-bound robots.

“Robots can work at night, in a storm—anytime,” says Patterson. “They can do the dirty, dull, or dangerous work that humans can’t or don’t want to do.”

Dive buddy

Professor Mark Patterson invented the prototype for Fetch at his kitchen table. Now he has five of them exploring oceans around the world at depths of up to 500 feet.


Patterson has been tinkering with electronics since high school, when a shop teacher kicked the rambunctious student out of class and sent him to learn programming instead.

Later, as a Harvard undergraduate, Patterson built primitive instruments to collect ocean data. He went on to earn three Harvard degrees and teach at the College of William and Mary in Virginia, where he led oceanic research trips that included stints living and working in an underwater lab.

In the early 1990s, Patterson realized that he wanted a “dive buddy”—a robot to accompany him under the sea and make it easier to gather data. Sitting at his kitchen table with a friend, he built that robot, Fetch, for less than $10,000, using parts salvaged from an Apple computer and others ordered through the mail. The friends patented their design and formed a business. Since then, Patterson has built marine technologies for federal agencies, academia, and defense contractors, but his primary mission has been to perfect AUVs for research.

There are now five Fetches in Patterson’s lab. Each is 6 feet long and weighs 200 pounds, with torpedo-like shapes and long snouts covered in sensors. This “taxicab for sensors,” as Patterson calls it, moves at walking speed and can explore the ocean by itself for up to two days.

Satellites can capture data from only the top few feet of an ocean, but Fetch can dive down 500 feet. As it hums along the ocean floor, it can measure water temperature, pH levels, salinity, and hydrocarbons—indicators of the health of the ocean. Sonar units mounted on Fetch use sound waves to capture acoustic images, allowing researchers to survey ocean habitats or count fish populations.

The RoboLobster is the brainchild of Professor Joe Ayers, who created an electronic nervous system that allows the robot to independently navigate the jumbled world of the intertidal zone.


In 2005, Patterson took Fetch to Antarctica to count the swarms of krill that are critical to that region’s ecosystem. Three years later, he used his robot to study hydrothermal vents (rare pockets of hot water that pour from the earth’s crust) in Iceland and the health of coral reefs near the Caribbean island of Bonaire. On the reefs, Fetch conducted 4-hour dives at 250 feet—a task that would have been impossible for human divers. In 2010, Fetch and another robot ambled through the salt marshes of Louisiana and discovered that naturally occurring microbes were chewing up some of the oil that had seeped into the marsh during the Deepwater Horizon disaster.

While Patterson explores the ocean’s depths, Ayers is intrigued by what’s happening near the surface. His robots, RoboLobsters, are uniquely flexible and can quickly change course if they encounter roadblocks—just like the animals for which they’re named. This ability makes them perfect for navigating the often-unpredictable terrain near shorelines.

Ayers never expected to make robots his life’s work. He trained as a neurophysiologist, learning to reverse-engineer the nerve networks that control the behavior of sea creatures. In the late 1990s, Ayers built a sonar-based technology that he attached to lobsters to learn about their nervous systems.

Mark Patterson


His research caught the attention of the U.S. Department of Defense, which asked Ayers whether lobster nervous systems could be applied to remote-sensing technologies. Ayers—a gregarious guy with a booming voice—tested the theory by building a lobster robot.

“The behavior of lobsters is pretty simple and well-understood, and they can move in any direction,” he says. “When you think about rocks and reefs and seaweed, lobsters are a proven solution because they’ve been navigating treacherous things for millions of years.”

Joseph Ayers

Early versions of the RoboLobster were controlled by computer algorithms that told the robots what to do. Over time, Ayers created a sophisticated electronic nervous system—a network of neurons and synapses—that replaced the algorithms. This network more closely mimicked the lobster nervous system; it made it simpler for the RoboLobster to move and react on its own—“to wiggle out of trouble,” Ayers says.
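The network Ayers describes can be illustrated, in a vastly simplified form, by a classic two-neuron central pattern generator: two mutually inhibiting model neurons with slow fatigue, whose activity alternates rhythmically without any explicit control algorithm. This is a toy sketch of the general technique, not Ayers’s actual circuit; every name and constant here is illustrative.

```python
# Toy sketch of a two-neuron central pattern generator (CPG):
# two leaky model "neurons" that inhibit each other and fatigue over
# time, so activity alternates between them rhythmically -- the kind
# of network (vastly simplified) that can drive repetitive motion
# without a scripted algorithm. Constants are illustrative only.

def cpg_steps(n_steps, dt=0.1):
    v = [1.0, 0.0]   # activity of neuron A and neuron B
    a = [0.0, 0.0]   # slow adaptation (fatigue) for each neuron
    leader = []      # which neuron is more active at each step
    for _ in range(n_steps):
        for i in (0, 1):
            j = 1 - i
            # constant drive, minus inhibition from the other neuron,
            # minus own fatigue, minus leak
            dv = 1.0 - 2.0 * v[j] - a[i] - v[i]
            da = 0.5 * v[i] - 0.1 * a[i]  # fatigue builds while active
            v[i] = max(0.0, v[i] + dt * dv)
            a[i] = a[i] + dt * da
        leader.append(0 if v[0] > v[1] else 1)
    return leader

seq = cpg_steps(2000)
```

Because fatigue eventually silences whichever neuron is winning, the lead passes back and forth between the two over the run, producing the alternation that in a real robot could drive opposing leg movements.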

He is currently working with a graduate student to invent a smell technology that would help the RoboLobster sniff out danger, such as explosives. He envisions these versatile creatures helping to detect pollutants or underwater mines in shallow water in the same way that a lobster searches for food. Researchers estimate that 60 million land mines are buried worldwide; some lie in streams or lakes, or close enough to waterways that rainstorms can wash them into the water. Minesweepers (boats with sensors) on the water’s surface can’t always detect these mines.

“[RoboLobsters] already have the mobility. The next stage is to give them the smarts to find something,” Ayers says.

Socially relevant robots

Hanumant Singh


Like Ayers and Patterson, Singh is fascinated by the technical wizardry of marine robots. He spent the last decade building them at the prestigious Woods Hole Oceanographic Institution, where he worked for 26 years before joining Northeastern’s faculty in January. But what really excites Singh is the social relevance of AUVs. He believes sea robots have the potential to answer science’s toughest questions, including how climate change is unfolding.

“We know more about the surface of the moon than our own oceans. Putting people down there is extremely risky and expensive,” he says. “We need to figure out what’s going on in the world’s oceans, for naval applications, fisheries management, oil and gas, climate change—so many things. Beyond the gee-whiz factor, robots are tools for doing real work. Let’s put them in the ocean to make a difference.”

Singh grew up in Chandigarh, a landlocked town in India. He saw the ocean just once before moving to the U.S. to attend George Mason University in Virginia. He was a desk-bound junior studying computer science when a brochure for a marine science internship caught his eye. He never looked back.

Seabed is “like an underwater helicopter” that can film the ocean bottom down to 15,000 feet.


Singh built a prototype of Seabed, a deep-water robot, more than a decade ago for a student who wanted a deep-sea device to study coral in Australia. There are now 12 Seabeds that have been used by academic researchers and government agencies around the world. The robots can hover over the ocean floor at depths of up to 15,000 feet, taking detailed sonar pictures and surveying sea habitats. “It’s like a helicopter for the underwater world,” says Singh, noting that Seabed can spend up to 24 hours at extreme depths.

Seabeds have captured history by hovering over ancient shipwrecks in Greece, mapping the remains of a fourth-century ship in hours instead of months. His robots have surveyed coral reefs near Puerto Rico and counted endangered fish populations on Georges Bank off the coast of New England.

These machines are built for risk. Singh’s motto is: If you don’t lose or almost lose an AUV every so often, then you aren’t using it to its full potential. He’s taken Seabed three times to Antarctica—one of the toughest research climates on the planet—to study the thickness of sea ice as a harbinger of global warming.

After it was lowered from the deck of a huge mother ship, Seabed spent the next six hours in the frigid ocean, roaming “back and forth, back and forth, like a lawn mower cutting grass,” Singh says. The robot used underwater acoustic beacons and imaging technologies to map the underside of ice floes, while researchers aboard the ship used other tools to map the ice from above.
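The “lawn mower” pattern Singh describes is a standard survey strategy: the vehicle sweeps an area in parallel tracks, reversing direction at the end of each line. A minimal waypoint generator for such a pattern might look like this; the function name and parameters are hypothetical, for illustration only, and are not drawn from Seabed’s actual software.

```python
# Hypothetical sketch of a "lawn mower" (boustrophedon) survey pattern:
# generate back-and-forth waypoints covering a rectangular area, with
# parallel track lines a fixed spacing apart.

def lawnmower_waypoints(width, height, spacing):
    """Return (x, y) waypoints sweeping a width-by-height box
    with parallel tracks `spacing` apart, alternating direction."""
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height:
        if left_to_right:
            waypoints.append((0.0, y))
            waypoints.append((width, y))
        else:
            waypoints.append((width, y))
            waypoints.append((0.0, y))
        left_to_right = not left_to_right
        y += spacing
    return waypoints

# Example: a 100 m x 30 m box with 10 m track spacing
# gives 4 track lines (y = 0, 10, 20, 30), two waypoints each.
wps = lawnmower_waypoints(100.0, 30.0, 10.0)
```

Track spacing in a real survey would be chosen so that adjacent sonar or imaging swaths overlap, guaranteeing full coverage of the seafloor or, in Singh’s Antarctic work, the underside of the ice.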

Singh is already looking ahead to the next advances in marine robotics: finding ways for AUVs to work in teams, remotely recharge their batteries, and explore the sea with little or no direction from researchers back on shore.

He asks, “What if we could go from an expeditionary mode to the mode where these robots are sitting out in the deep long term, and are ready to go to work if they see something interesting?”

Related Faculty: Hanumant Singh, Mark Patterson, Joseph Ayers

Related Departments: Civil & Environmental Engineering, Electrical & Computer Engineering