A Whale of a Deep Learning Story

Once so numerous that early settlers said you could cross Cape Cod Bay by walking atop their backs, the North Atlantic right whale is today one of the most endangered whale species. Decimated by whaling, the species now numbers only about 500 individuals.

Time may not be on the side of the right whale. But deep learning is.

This month marked the end of a contest to develop an algorithm that could identify each of the 500 right whales from aerial photographs. The National Oceanic and Atmospheric Administration will use the winning code to help turn the manual task of identifying the whales into a speedier, automated process.


The contest, organized and designed by the data-scientist community Kaggle, was the brainchild of Christin Khan, a marine biologist at NOAA’s Northeast Fisheries Science Center. Khan and other scientists identify each right whale by the large patches of raised tissue on its head, called callosities, and track individuals to monitor the health of both the animals and the species.

Today, they do this by manually matching photos taken from a plane or boat to images in the North Atlantic Right Whale Catalog curated by the New England Aquarium. It’s painstaking, costly work that keeps scientists from more productive research and fieldwork, Khan said.

After realizing that Facebook could automatically recognize her face in photos, Khan set about finding something similar for whales.


A right whale mother and her calf. Image courtesy of Christin Khan of the Northeast Fisheries Science Center, NOAA.

How to Tell One Whale from Another

Whales are not inclined to pose for pictures, which made this a particularly thorny image-recognition problem. Photos were shot from different angles and different distances. Many photos were more water than whale.

“Discriminating between individual whales is quite a bit harder than discriminating between, say, dogs, cats, wombats and planes,” said Robert Bogucki, chief science officer at first-place winner deepsense.io, a big data analytics company that built its solution on deep learning and NVIDIA GPUs. “The whale algorithm had to find fine-grained details that might occupy only a tiny portion of the picture. In this case, it was the whales’ heads and the callosity patterns.”

Deep learning works by training neural networks, algorithms loosely modeled on the human brain, that are built from many layers of simulated neurons. Deepsense.io used convolutional neural networks, which are well suited to image recognition tasks, and accelerated the training with Tesla K80 and GRID K520 GPUs. (Second-place finisher Felix Lau describes how he used NVIDIA GPUs and cuDNN in his solution in our Accelerated Computing News Center.)
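To make that concrete, here is a minimal sketch of a convolutional classifier for individual-whale identification, written in PyTorch. It assumes the photos have already been cropped down to the whale’s head, since the winning approaches localized the head before classifying. The layer sizes, class count and training step are illustrative assumptions, not deepsense.io’s actual network.

```python
# Minimal sketch: a CNN that maps a head crop to one of ~500 whale IDs.
# Illustrative only -- not the deepsense.io architecture.
import torch
import torch.nn as nn

NUM_WHALES = 500  # assumption: one class per catalogued whale (~500 per the article)

class WhaleNet(nn.Module):
    def __init__(self, num_classes=NUM_WHALES):
        super().__init__()
        # Stacked conv blocks learn progressively larger-scale features,
        # from edges up to whole callosity patterns.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool to one 128-dim vector per image
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = WhaleNet()
# Training runs on a GPU when one is available (e.g., a Tesla K80).
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

# One hypothetical training step on a dummy batch of head crops.
images = torch.randn(8, 3, 224, 224, device=device)
labels = torch.randint(0, NUM_WHALES, (8,), device=device)
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()
```

Each pass over the training set repeats that step thousands of times, which is why GPU acceleration matters so much for the short experimental cycles Bogucki describes below.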

“When you’re working on a problem like this, you need to have a short experimental cycle so you can test tons of ideas,” said Bogucki. “That’s why speeding up the training with GPUs is important.”

There’s a lot riding on deepsense.io’s solution, and possibly other technology, helping to preserve the right whale. Even with legal protections and the vigilant work of Khan and others, the whales face many threats: they are struck by ships, become entangled in fishing gear and suffer from low birth rates.

NOAA plans to use the deepsense.io code to develop whale-identification software, Khan said. Eventually, she said, deep learning and image recognition could come to the aid of humpback whales, bottlenose dolphins and other threatened or endangered species.

Read NVIDIA CEO Jen-Hsun Huang’s latest blog post to learn more about how deep learning, and GPUs, are changing the world.