The Data Bottleneck: A Growing Challenge for Autonomous and Uncrewed Operations
As the demand for subsea data grows, I’ve seen companies investing in autonomy and uncrewed operations struggle to scale their data processing.
While autonomous underwater vehicles (AUVs) and other advanced systems are transforming data collection, there’s an untapped opportunity to automate the workflows that happen once data is collected. Heading into 2025, we now have the capability to scale the data processing of survey operations and complete the transformation to highly efficient subsea data workflows that can keep pace with today’s increasing complexity, volume, and frequency of data.
This fundamental shift from manual, reactive workflows to automated, scalable systems presents an enormous opportunity to modernize data processing, to the advantage of organizations at the vanguard of adoption.
The Cost of Unscalable Data Processing
For every day of data collected, it can take three to five days to process that data.
With data processed manually, subject to a finite talent pool and growing competition for expertise, the status quo has been an unavoidable data bottleneck that limits operational efficiency and delays insights. Scaling operations to meet the demands of emerging markets like environmental monitoring and infrastructure surveys has been nearly impossible under this model, but that paradigm is now changing with the ability to automate data processing. The cost of manual processing is more than inefficiency: it is a barrier to growth, timely insights, and scalability. In fast-evolving industries, traditional workflows create bottlenecks that strain resources, delay decisions, and limit our ability to meet market demands. Embracing automation is no longer optional; it is essential to eliminating these constraints, enabling faster, more accurate insights, and positioning organizations to lead in a data-driven world.
The Problem Is Three-Fold
The challenges with subsea data collection and processing are multifaceted, creating significant barriers to scalability and efficiency.
1. Manual Data Processing
Most subsea data workflows rely heavily on manual techniques. Sonar and optical data, for example, still need experts to manually tag and classify objects of interest in dense datasets, and for every day of data collected, it can take three to five days to process it. This lag in delivering results is compounded by the risk of human error and subjective interpretation, which can lead to inconsistencies in data quality and reliability.
2. Lack of Suitably Qualified Resources
The subsea industry faces a critical shortage of qualified data processors and analysts. Interpreting complex datasets like multibeam sonar and optical imagery requires specialized skills—but the talent pool is small and in high demand. This shortage creates delays, increases costs, and limits growth.
3. Lack of Scalable Workflows
Current methods and tools for data processing were designed for smaller-scale tasks, such as pipeline inspections or localized seabed mapping. However, emerging applications—like large area environmental surveys or surveillance of critical infrastructure at higher frequency—require workflows that can scale to analyze higher volumes of data. Without scalable solutions, companies are unable to meet growing demands, stalling operational expansion and innovation.
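To put the scale of the bottleneck in rough numbers, here is a back-of-envelope sketch based on the three-to-five-day processing figure cited above; the survey length and analyst head count are hypothetical assumptions for illustration only:

```python
# Back-of-envelope backlog estimate. The 3-5 processing days per
# collected day comes from the text; the survey length and analyst
# head count below are hypothetical assumptions.
survey_days = 10
processing_days_per_collected_day = 4  # midpoint of the 3-5 day range
analysts = 2                           # hypothetical team size

total_effort = survey_days * processing_days_per_collected_day  # analyst-days
calendar_days_to_clear = total_effort / analysts
print(f"{total_effort} analyst-days of processing; "
      f"~{calendar_days_to_clear:.0f} calendar days for {analysts} analysts")
```

Even with these generous assumptions, a ten-day survey leaves weeks of processing work behind it, which is why higher-frequency survey programs quickly outrun a manual team.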
Reengineering Our Approach
At Cathx, we are reengineering data processing workflows with automation at their core. Once a workflow is automated, we run it in parallel with the current manual workflow to ensure seamless integration and validation. This parallel execution allows us to compare results, fine-tune processes, and address any discrepancies before fully transitioning to the automated system. By taking an iterative approach, we build confidence in the accuracy and reliability of the new workflows while maintaining uninterrupted operations.
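One way to picture this parallel-run validation is to compare automated classifications against the existing manual labels for the same dataset and flag every disagreement for review. The sketch below is a minimal illustration of that comparison step; all names and data in it (`auto_classify`, `manual_labels`) are hypothetical assumptions, not an actual pipeline:

```python
# Illustrative parallel-run comparison: score an automated classifier
# against existing manual labels before cutting over to automation.
# All names and data here are hypothetical.

def auto_classify(frame_id: int) -> str:
    # Stand-in for an automated classifier; a real system would run
    # a trained model over sonar/optical frames.
    return "pipeline" if frame_id % 7 else "debris"

# Hypothetical manual labels for 100 frames, with a few frames
# relabelled to simulate disagreements.
manual_labels = {i: ("pipeline" if i % 7 else "debris") for i in range(100)}
for i in (3, 45, 88):
    manual_labels[i] = "boulder"

disagreements = [
    i for i, label in manual_labels.items() if auto_classify(i) != label
]
agreement = 1 - len(disagreements) / len(manual_labels)
print(f"Agreement: {agreement:.1%}; frames to review: {disagreements}")
```

In practice, the disagreement list is what drives the fine-tuning loop: each flagged frame is inspected, the cause of the discrepancy is identified, and either the model or the labeling guidance is corrected before the automated workflow takes over.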
(Diagram: how data is currently acquired and handled…)
(Diagram: how Cathx enables data to be acquired and handled…)
This is the key to derisking and accelerating the development of autonomous systems. By creating robust, validated workflows that integrate seamlessly with advanced technologies, we ensure that autonomous systems operate with precision and reliability in complex underwater environments. This approach not only mitigates the risks associated with adopting new technologies but also shortens the development cycle, enabling faster deployment of innovative solutions. At Cathx, our focus on automation is paving the way for next-generation autonomy.
The Valley of Machine Learning Disillusionment
We’ve all seen the hype around machine learning’s potential to transform subsea operations. But there’s a reality check: many companies are stuck in what I call the “valley of machine learning disillusionment.” They expect ML to deliver rapid automation and smarter decision-making, but the path to this end is complex. Unlike generic AI models, subsea operations require bespoke workflows, high-quality labeled data, and specialized models tailored to multi-sensor inputs like sonar, optical, and electromagnetic data.
One of the biggest reasons machine learning hasn’t yet delivered in our industry is the lack of sufficient, high-quality data from the seabed. Without good training data in sufficient quantities, machine learning hasn’t had a chance to work.
To move beyond this valley, we need a rethink. Standardizing methodologies, adopting shared automation frameworks, and focusing on incremental improvements can deliver better outcomes. It all starts with collecting larger amounts of higher-quality data in the first place.
What’s the True Prize of Autonomy?
Addressing the data processing bottleneck will unlock the full potential of autonomous surveying in subsea industries, creating substantial opportunities for service providers, AUV manufacturers, and sensor manufacturers to accelerate growth and innovation.
For service providers, it’s now possible to move beyond traditional, project-based revenue models tied to the number of survey days, which are often unpredictable and resource-intensive. Instead, service providers can adopt a service-oriented approach, offering long-term, subscription-based contracts where clients receive continuous access to high-quality, real-time data insights.
This model will provide more stable recurring revenue streams, support scalable growth, and enhance operational predictability, allowing service providers to plan resources more effectively while reducing costs. Just as importantly, it will position companies as long-term strategic partners rather than one-off service providers, fostering deeper client relationships.