Agriculture, aquaculture, and livestock farming are often practiced in remote areas far from urban centers. Coastal and mountainous areas like these often lack reliable connectivity, and this has hindered the adoption of digital and AI technologies in primary industries.
To address this challenge, SoftBank Corp. (TOKYO: 9434) and U.S.-based Aizip, Inc. collaborated on developing an on-device machine learning application capable of counting fish using smartphones and other edge devices. In January 2025, this groundbreaking innovation earned them a prestigious CES Innovation Award® in the Food & AgTech category. SoftBank and Aizip showcased the technology at a booth during the event.
SoftBank News spoke with two researchers behind the project to learn more about how AI can be utilized to transform aquaculture.
AI on your smartphone to monitor fish
Yuko Ishiwaka, Ph.D.
Principal Investigator, Advanced Technology Promotion Office
Information Technology & Architect Division, SoftBank Corp.
Yuko Ishiwaka holds a Ph.D. in Engineering from Hokkaido University, specializing in reinforcement learning for multi-agent systems. After working as an assistant professor at Hakodate National College of Technology and an associate professor at Hokkaido University's Graduate School of Information Science, she joined SoftBank, where she now focuses on foundational research in machine learning and computational neuroscience. To further her research into smart aquaculture, she obtained both a professional marine diver certification and a Class 1 small vessel operator license.
SoftBank has been working toward realizing smart aquaculture for some time now, hasn't it?
The initiative began with our 'Comprehensive Partnership Agreement on Promoting Digital Transformation' with Ehime Prefecture. As part of the aquaculture project, we've been conducting joint research with Akasaka Fisheries Inc. This marks the third year of our efforts.
How is the on-device machine learning application that you developed with Aizip different from your previous approaches?
Aizip not only specializes in miniaturizing machine learning models; it also has expertise in pipeline technologies optimized for devices. Using their pipeline, we implemented a fish-counting AI on an edge device, in this case a smartphone. They also collaborated with us on algorithm enhancements.
As a result, using underwater cameras, we achieved real-time fish counting with 95% accuracy. We're also working on estimating fish sizes and detecting damage to fish nets.
Real-time fish counting
Detecting damage to fish nets
What improvements were made to enhance accuracy?
Initially, we used bounding boxes (annotation markers drawn around objects in images) to detect individual fish. However, when fish overlapped each other, it became difficult to count them accurately: the visible area of a partially hidden fish shrank or disappeared entirely, so its bounding box was missed and the detection count dropped.
To address this issue, we switched to a heatmap-based algorithm. Developing it required retraining the AI, and for that we used the 3D computer graphics (CG) simulation technology we presented at the 2021 SIGGRAPH conference. Normally, training object recognition models requires manual annotation, but we were able to generate highly realistic CG simulations of fish movements automatically. By training the AI on this data, the algorithm counted fish accurately even after the change.
Fish counting at an actual fish farm
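As a rough illustration of the density-map idea described above (the model and function names here are hypothetical, not SoftBank's or Aizip's actual implementation): a network predicts a per-pixel "fish density" map for each frame, and integrating that map yields the count, so partially hidden fish still contribute instead of being dropped.

```python
import numpy as np

def count_fish(frame: np.ndarray, density_model) -> float:
    """Estimate the number of fish in one frame from a predicted density map.

    `density_model` is assumed to be any callable mapping an HxWx3 image
    to an HxW heatmap whose values sum to the estimated object count.
    """
    density_map = density_model(frame)   # per-pixel fish density
    return float(density_map.sum())      # integral of the map = estimated count

# Hypothetical usage on frames from an underwater camera; a median over
# recent frames smooths out per-frame noise for a stable readout.
# counts = [count_fish(f, density_model) for f in recent_frames]
# stable_count = float(np.median(counts))
```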
To create realistic CG simulations, various elements need to be incorporated, don't they?
That's true. First, we created 3D models of fish and fish farms. Then we incorporated eight behavioral traits that vary by species, such as preferences for water temperature, brightness, and private space. By calculating fish school behaviors and simulating 12 movement patterns, we developed CG simulations of fish behavior inside fish farms.
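A minimal sketch of the kind of agent-based schooling simulation described here; the traits modeled, the weights, and the parameter values below are illustrative assumptions rather than the actual simulation parameters.

```python
import numpy as np

N = 200                                   # simulated fish in one cage
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 10.0, (N, 3))      # positions in a 10 m cage (x, y, depth)
vel = rng.normal(0.0, 0.1, (N, 3))        # initial velocities

PREFERRED_DEPTH = 4.0                     # illustrative species preference (m)
DT = 0.1                                  # simulation time step (s)

def step(pos, vel):
    # Cohesion: steer gently toward the school's centre of mass.
    cohesion = (pos.mean(axis=0) - pos) * 0.02
    # Separation ("private space"): move away from the nearest neighbour.
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(diff, axis=-1) + np.eye(N) * 1e9
    nearest = dist.argmin(axis=1)
    separation = (pos - pos[nearest]) * 0.05
    # Environmental preference: drift toward the preferred depth.
    depth_pull = np.zeros_like(pos)
    depth_pull[:, 2] = (PREFERRED_DEPTH - pos[:, 2]) * 0.01
    vel = vel + (cohesion + separation + depth_pull) * DT
    return np.clip(pos + vel * DT, 0.0, 10.0), vel   # keep fish inside the cage

for _ in range(1000):
    pos, vel = step(pos, vel)
```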
The CG simulations are so realistic it's hard to tell them apart from actual footage.
Yes, we sometimes even have difficulty distinguishing between them when not looking closely.
What kind of impact do you expect this technology will have?
Most fish farms are small-scale operations, which makes it difficult to install expensive, large-scale equipment. Crucial tasks such as fish counting, size estimation, disease detection, and shipping preparation are mostly done manually. Complex tasks like inspecting underwater fish farms or repairing nets are either left to professional divers or overlooked altogether, as net damage often goes unnoticed.
Our technology addresses these challenges by reducing costs and improving operational efficiency. Since it's affordable and easy to deploy, our AI solution enables fish farmers to monitor fish growth, count fish, and estimate numbers and weights needed for shipments. It also optimizes feed amounts, which helps reduce waste.
Fish farms are often located offshore, where connectivity is limited. How can you address this issue?
One of the key outcomes of our research is the use of edge AI technology, which works even in environments with limited connectivity. Edge AI processes data and makes inferences directly on devices and sensors located at the 'edge' of a network, which means it can operate without a constant network connection. By integrating Tiny Machine Learning (TinyML), a class of low-power, compact, and highly accurate AI models, into edge devices such as smartphones, we've enabled fully offline functionality.
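As a minimal sketch of what fully offline, on-device inference can look like, here using the TensorFlow Lite runtime as one common TinyML option; the model file name and preprocessing are assumptions, and the actual SoftBank and Aizip pipeline may differ.

```python
import numpy as np
import tflite_runtime.interpreter as tflite  # lightweight runtime, no cloud round trip

# Hypothetical quantized fish-counting model bundled with the smartphone app.
interpreter = tflite.Interpreter(model_path="fish_counter_int8.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def infer(frame: np.ndarray) -> np.ndarray:
    """Run one camera frame through the on-device model; no network access required.

    `frame` is assumed to already match the model's expected input shape.
    """
    interpreter.set_tensor(inp["index"], frame.astype(inp["dtype"]))
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])
```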
Edge AI Advantages: Localization and Personalization
Kazuto Suda
Director, Advanced Technology Promotion Office
Information Technology & Architect Division, SoftBank Corp.
At SoftBank, Kazuto Suda leads research and development teams focused on cutting-edge technologies, particularly in parallel processing algorithms and computer graphics. Recently, he has been concentrating his efforts on the development of smart solutions for primary industries such as agriculture and fishing.
Edge AI doesn't necessarily require a constant network connection or the latest data, right?
That's right. Neither a constant connection nor the latest data is always necessary. With edge AI, processing happens locally, and queries are made to external sources like the cloud only when necessary. These queries might target a nearby edge device or a large language model (LLM) that aggregates data. Edge AI determines what information to request and where to request it, and even that decision-making takes place at the edge of the network.
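A rough sketch of that "query externally only when necessary" pattern; the function names and the confidence threshold are illustrative assumptions.

```python
CONFIDENCE_THRESHOLD = 0.8  # illustrative cut-off for trusting the local answer

def answer(query, local_model, cloud_llm):
    """Local-first inference: consult the cloud only when the edge model is unsure."""
    result, confidence = local_model(query)   # runs entirely on the device
    if confidence >= CONFIDENCE_THRESHOLD:
        return result                          # no network traffic at all
    # The edge decides both whether to ask and what to ask for.
    return cloud_llm.ask(query)                # fall back to an aggregating LLM
```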
What would you say is the biggest advantage of edge AI?
The greatest advantages are personalization and localization. If you think of an LLM as a large brain that knows almost everything, you can query it for general information. However, it's challenging to personalize. Not all the data and knowledge stored in an LLM are necessary for all cases, so distributing only the relevant information to the edge makes sense. With edge AI, you can have a small AI personal assistant tailored to your specific needs, making it incredibly useful for targeted applications.
Not every individual has to have all the knowledge of humanity. We only need information that's relevant or interesting to us, and that information can fit into a smartphone. Things like dictionaries or legal codes aren't essential for daily life. While LLMs can store all of that, we rarely use the entirety of their functionality in everyday situations.
You're also collaborating with your partner Aizip on a question-and-answer (QA) application, correct?
Yes, we're jointly developing a QA application that incorporates small language models (SLMs). SoftBank employees use chatbots for inquiries, but for security and privacy reasons, questions involving sensitive data can't be handled that way.
With this QA application, the AI processes everything locally on each user's work smartphone, enabling personalization while keeping privacy-sensitive questions on the device. We expect this will improve operational efficiency within SoftBank. We're aiming to implement it internally as soon as possible.
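One way such a split could work in practice; the routing rule and helper names below are hypothetical and not the actual application's design, but they illustrate how sensitive questions can stay on the phone while generic ones still go to a shared chatbot.

```python
SENSITIVE_KEYWORDS = {"salary", "customer", "contract", "personnel"}  # illustrative only

def route_question(question: str, local_slm, shared_chatbot) -> str:
    """Keep questions touching sensitive data on the device's small language model."""
    if any(word in question.lower() for word in SENSITIVE_KEYWORDS):
        return local_slm.generate(question)    # processed locally; data never leaves the phone
    return shared_chatbot.ask(question)        # non-sensitive queries can use the shared chatbot
```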
Edge AI supporting SoftBank's vision for next-generation social infrastructure
SoftBank is advancing the development of decentralized AI data centers as part of its next-generation social infrastructure. What role does edge AI play in this vision?
If we look at broader trends in computing, we see cycles of 'decentralization' and 'centralization.' When personal computers emerged, they enabled individuals to handle various tasks independently. However, limitations in processing power, security, and data sharing led to the current centralized approach, where data, including AI processing, is consolidated in the cloud.
But centralization creates challenges, such as bloated hardware environments. The same applies to AI: bringing everything to the cloud could become unsustainable in the future.
To address this, SoftBank is working on next-generation social infrastructure we refer to as "distributed AI data centers." In this system, core brains for large-scale computing are located in major centers, regional brains handle computations at the prefectural level, and edge AI operates at the distributed endpoints farthest out.
It's about finding the right balance between decentralization and centralization, right?
To power AI, high-performance servers with Graphics Processing Units (GPUs) are required, and these consume a tremendous amount of energy. While our AI data centers are designed to utilize renewable energy, distributing AI processing across edge devices can also help reduce power consumption.
Additionally, we're exploring the use of edge AI in implementing 'AI-RAN,' a next-generation mobile network foundation that supports AI-based solutions. This is another way to optimize energy use and efficiency in AI operations.
With the ability to utilize AI offline, edge AI could open up new opportunities in areas where it's difficult to adopt AI.
Exactly. In primary industries, the development of AI-equipped devices and IoT-based technologies has lagged behind, partly because of their reliance on network connectivity. With edge AI solutions like the one we developed, these devices and sensors can operate offline, enabling better 'smartification' across industry sectors. Looking ahead, we aim to expand this technology into livestock farming to drive AI transformation (AX) in the industry.
(Posted on July 4, 2025)
by SoftBank News Editors