Robots with Argus eyes
A finely tuned robotic gaze can detect imperfections in roe and fish – faster than any human eye.
Robots that can see as well as the human eye are being rapidly introduced into the fishing industry. A machine that can sort salmon and trout roe is already on the market. Next will be robots that can do everything from sorting and trimming fish fillets, to picking out damaged line hooks.
It’s called machine vision.
Machine vision combines digital cameras with a computer. The field draws on computer technology, optics, mechanics and industrial automation. The underlying technology is well established, but advances in cameras and computing power have allowed it to develop and new applications to emerge.
All of these factors enable a robot’s eyes to be so fine-tuned that it can find individual damaged fish eggs among several thousand healthy ones.
John Reidar Mathiassen has taken the technical approaches behind egg sorting several steps further. His doctoral dissertation shows how machine vision can be used for different tasks in the fishing industry.
Mathiassen compares machine vision technology to a box of Lego. The blocks themselves are familiar enough – the creative or innovative part comes from how they are used, or what they are used to build.
“It’s all about seeing and understanding contexts: what are the problems and what are the solutions,” the researcher explains.
How vision works
“In order to teach robots to see, I first have to understand how my own vision works. For example, how I see the cup there,” says Mathiassen, pointing to a green coffee cup on the table before us.
“First, I have to find a way to describe the cup, in a way that characterizes it and distinguishes it from all other objects in the environment. The distinction can be colour, size, shape, or distance to the object. When I have been able to describe how I detected the cup with my own vision, I give the computer this description in a language that it understands – a programming language.
In this way, digital cameras acquire images and send them to the computer, and the computer, running the program, can find the cup in the image.”
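The describe-then-program idea Mathiassen outlines can be sketched in a few lines of Python. This is a hypothetical illustration, not the researcher’s actual code: it assumes the camera image arrives as an RGB pixel array and that the cup’s colour can be described as a per-channel range.

```python
import numpy as np

def find_object(image, lower, upper):
    """Locate an object described by a colour range.

    image: H x W x 3 array of RGB values (0-255).
    lower/upper: per-channel bounds characterizing the object's colour.
    Returns the (x, y) centre of the matching region, or None.
    """
    lower = np.array(lower)
    upper = np.array(upper)
    # A pixel belongs to the object if every colour channel is in range.
    mask = np.all((image >= lower) & (image <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None  # object not in view
    # Report the centre of the matching pixels.
    return int(xs.mean()), int(ys.mean())

# Toy scene: a dark room with one light-green "cup" patch.
scene = np.zeros((100, 100, 3), dtype=np.uint8)
scene[40:60, 70:90] = (140, 220, 140)  # light green
print(find_object(scene, lower=(100, 180, 100), upper=(180, 255, 180)))
```

The description here is purely colour-based; as the article goes on to explain, a real system would also need size or shape cues when colour alone is ambiguous.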
Colour, shape, size
The room we are in is sparsely furnished. Only two objects are green – the cup and a plant. The plant is much darker than the cup, which makes it easy to identify a colour that will let a robot with machine vision recognize the cup.
“I just need to program a code for light green. Then the robot will go after everything that the cameras capture as light green. If there had been several light green objects, I would have had to add details that would distinguish the cup. It might be the size or shape,” explains Mathiassen.
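Mathiassen’s point about adding size to the description can be sketched as a simple filter. The function and the region tuples below are hypothetical: assume an earlier colour-matching step has already produced candidate regions as (area, centre) pairs.

```python
def pick_cup(regions, expected_area, tolerance=0.3):
    """From colour-matched candidate regions, keep the one whose size fits.

    regions: list of (area_in_pixels, (x, y) centre) tuples.
    expected_area: the cup's known size in pixels.
    tolerance: allowed relative deviation from the expected size.
    """
    for area, centre in regions:
        if abs(area - expected_area) <= tolerance * expected_area:
            return centre
    return None  # no region of the right colour has the right size

# Two light-green regions: a large plant-sized one and a cup-sized one.
print(pick_cup([(5000, (10, 10)), (800, (60, 40))], expected_area=750))
```

Colour narrows the search; size (or shape) breaks the remaining ties.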
Forgetting to take these kinds of things into account can be disastrous. Mathiassen has seen a competition where the robots were programmed to kick an orange ball. Unfortunately, someone in the audience had a T-shirt that was the exact same colour as the ball. So the robot stopped at the edge of the track in a vain attempt to kick the spectator.
Detecting the only green coffee cup sounds simple enough. But how can you get robots to find moving objects in turbulent environments where there’s a lot going on? Or to distinguish between tiny objects that are almost identical, like fish roe?
The principle is exactly the same and starts with a description. The defining characteristic of healthy fish eggs that are used in fish farming is that they have two eyes. The task of the robot, or machine, is to detect defects: Some eggs may have a fungal infection that can be confusingly similar to eyes – these must be separated out. Other eggs can be one-eyed or have three or four eyes – these must be removed or the fish will be deformed.
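The egg-sorting rule described above – keep an egg only if it shows exactly two eye-like spots – can be sketched as follows. This is a minimal illustration, assuming an earlier imaging step has reduced each egg to a binary mask in which dark, eye-like spots are marked 1; the spot counter is a plain 4-connected flood fill.

```python
def count_spots(mask):
    """Count connected regions of 1s in a binary mask (list of 0/1 rows)."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    spots = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                spots += 1  # found a new spot; flood-fill its pixels
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and mask[cy][cx] and not seen[cy][cx]:
                        seen[cy][cx] = True
                        stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return spots

def classify_egg(mask):
    # A healthy egg shows exactly two eyes; one, three or four spots
    # (or fungus mimicking an eye) means the egg is sorted out.
    return "keep" if count_spots(mask) == 2 else "remove"

healthy = [[0, 1, 0, 0, 1],
           [0, 1, 0, 0, 1],
           [0, 0, 0, 0, 0]]
print(classify_egg(healthy))  # two spots -> "keep"
```

A production sorter would of course need the imaging step itself – and, as the article notes, it must also tell real eyes apart from fungal infections that merely look like eyes.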
An experienced egg sorter can handle between 4,000 and 5,000 eggs per hour – the computer manages over 100,000.
Finds defective line hooks
In the laboratory, Mathiassen has helped design robots that can detect even the slightest damage to fish, and can trim fillets as accurately as a chef.
A robot that can sort longline hooks has been under development, but is not in production yet. Laboratory experiments have shown that machine vision can detect 97.5 per cent of defective hooks.
Mathiassen therefore sees great potential to streamline longline fishing. He says there are as many as 40,000 hooks on each line and a crew of three people needs about 24 hours to inspect and replace or repair the hooks manually.
The research is being conducted at NTNU’s Department of Engineering Cybernetics and SINTEF Fisheries and Aquaculture, in cooperation with Aqua Gen AS and Maskon AS.