3 Best Practices
Real-Time Data Gathering on Fishing Vessels Focusing on Four Basic Sensors
The four basic sensors commonly referenced in these projects include GPS, fuel consumption meters, weighing scales, and towing force meters. Effective utilization of these sensors enhances the precision of operational data collection and processing.
Data Gathering on the Vessel
Sensor Calibration and Maintenance: Ensure all sensors are regularly calibrated and maintained to prevent data inaccuracies caused by wear, environmental factors, and human error. Automated procedures can help detect outliers.
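The automated outlier detection mentioned above can be sketched with a simple robust statistic. The function below is illustrative only: the median-absolute-deviation method and the 3.5 cutoff on the modified z-score are common rules of thumb, not a prescribed standard for these projects.

```python
import statistics

def flag_outliers(readings, threshold=3.5):
    """Flag readings whose modified z-score exceeds the threshold.

    Uses the median absolute deviation (MAD), which is robust to
    the very outliers being detected; the 3.5 cutoff is a common
    rule of thumb, used here as an illustrative assumption.
    """
    med = statistics.median(readings)
    mad = statistics.median(abs(r - med) for r in readings)
    if mad == 0:
        # Readings are (nearly) all identical: flag any deviation.
        return [r != med for r in readings]
    return [0.6745 * abs(r - med) / mad > threshold for r in readings]
```

A batch such as [10.1, 10.3, 9.9, 10.2, 55.0, 10.0] would flag only the 55.0 reading, whereas a plain mean/standard-deviation check can miss it because the outlier inflates the standard deviation itself.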
Optimal Placement: Position sensors (e.g., GPS antennas, weighing scales, towing meters) strategically to minimize interference and maximize data accuracy.
Automated Data Logging: Implement onboard systems for continuous, automated data capture to minimize manual input errors and ensure consistent data quality and frequency.
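A minimal sketch of such automated capture is an append-only log of timestamped records, one per line. The field names below are illustrative assumptions; any consistent schema works, provided every record carries a timestamp and a sensor identifier.

```python
import json
import time

def log_reading(stream, sensor_id, value, unit):
    # Append one timestamped reading as a JSON line; an append-only
    # text log is simple, crash-tolerant, and easy to ship ashore.
    # Field names here are illustrative, not a fixed schema.
    record = {"ts": time.time(), "sensor": sensor_id,
              "value": value, "unit": unit}
    stream.write(json.dumps(record) + "\n")
```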
Data Transfer from Vessel to Land
Reliable Connectivity Solutions: Utilize stable data concentrators or transmitters capable of handling high data volumes. Connectivity solutions should be resilient to weather-related disruptions.
Data Compression and Encryption: Employ data compression to minimize transfer bandwidth requirements and ensure encryption for data security during transmission.
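As a sketch of the compression step, repetitive sensor logs typically shrink substantially under general-purpose compression such as DEFLATE. Encryption would be layered on top (e.g. TLS on the transport link, or AES via a vetted library) and is deliberately not shown here; the function names are illustrative.

```python
import zlib

def pack_for_transfer(payload: bytes, level: int = 6) -> bytes:
    # DEFLATE-compress a batch of log lines before transmission.
    # Encryption belongs on top of this step and is not shown.
    return zlib.compress(payload, level)

def unpack_on_shore(blob: bytes) -> bytes:
    # Shore-side counterpart: restore the original byte stream.
    return zlib.decompress(blob)
```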
Buffering Systems: Incorporate buffering capabilities to prevent data loss during transmission interruptions, ensuring that data collected during downtime is automatically transferred once connectivity is restored.
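The store-and-forward behaviour described above can be sketched as a queue that retains records while the link is down and flushes them in order once connectivity returns. This is a minimal in-memory illustration; a production system would persist the queue to disk so records survive a power loss.

```python
from collections import deque

class TransmitBuffer:
    """Store-and-forward buffer: records accumulate while the link
    is down and are flushed in order once connectivity is restored.
    In-memory sketch only; a real system would persist the queue."""

    def __init__(self):
        self.pending = deque()

    def enqueue(self, record):
        self.pending.append(record)

    def flush(self, send):
        # send() returns True on success; stop at the first failure
        # so unsent records stay queued for the next attempt.
        while self.pending:
            if not send(self.pending[0]):
                break
            self.pending.popleft()
```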
Data Processing on Land
Standardized Data Formats: Use standardized data formats for incoming datasets from vessels to facilitate easier processing, integration, and compatibility across platforms. - Example: ILVO utilizes the PoseiDAT format - https://www.poseidat.org/
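To illustrate why a standardized format helps, the sketch below maps one vendor-specific fuel log row onto a single common event envelope, so downstream processing only ever sees one schema. All field names here are hypothetical, for illustration only; they do not reproduce the actual PoseiDAT schema, for which https://www.poseidat.org/ is the reference.

```python
def to_standard_event(vendor_row):
    # Map a vendor-specific row onto a common event envelope.
    # Every key below is a hypothetical placeholder, not PoseiDAT.
    return {
        "eventType": "fuel-consumption",
        "time": vendor_row["timestamp"],
        "vessel": vendor_row["vessel_id"],
        "value": {"consumed": vendor_row["litres"], "unit": "L"},
    }
```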
Data Aggregation and Integration: Integrate data from multiple sensors using centralized systems like Azure-based services to streamline data processing, validation, and analysis. - Example: EV ILVO utilizes the DBMatic platform as a first buffer server and works in an Azure environment to offer end-user interfaces through PowerBI
Scalable Cloud Solutions: Opt for scalable solutions such as cloud computing platforms that can accommodate fluctuating data volumes without compromising on processing efficiency.
Algorithms: Apply event-detection algorithms to the processed data streams, such as tow detection, trip detection, and classification of fishing versus steaming activity.
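A fishing-versus-steaming classification can start from GPS speed over ground alone, since trawlers typically tow at a few knots and steam faster. The thresholds below are illustrative assumptions and would need tuning per vessel and gear; towing-force data can then refine the result.

```python
def classify_activity(speed_knots, tow_min=2.0, tow_max=4.5):
    # Simple speed-based heuristic: thresholds are illustrative
    # assumptions and should be calibrated per vessel and gear.
    if speed_knots < 0.5:
        return "idle"
    if tow_min <= speed_knots <= tow_max:
        return "fishing"
    return "steaming"
```

Applied per GPS fix, runs of consecutive "fishing" fixes approximate individual tows, which is the basis for simple tow detection.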
Data Visualization
Be mindful of end-user requirements: The interface needs to be adapted to the specific end-user needs, not only regarding the information streams or visualisation, but also the context in which it will be used and the screen size. A vessel owner might use the tool on a smartphone, tablet, or laptop, each of which has its own specific limitations, while a skipper on the bridge must have quick and easy access to essential information.
Real-Time Dashboards: Develop real-time dashboards that offer intuitive visualization of core metrics, such as vessel position (GPS data), fuel consumption trends, catch weight (from weighing scales), and towing force details.
Customizable Reports: Enable the generation of customizable reports tailored to the needs of different stakeholders, such as vessel operators, managers, and research institutions.
Interactive Interfaces: Offer interactive features within dashboards for deeper data exploration, such as historical trend comparisons and geographic data overlays.
Best Practices for Environmental Sensors
Environmental sensors, such as those measuring conductivity, temperature, depth, turbidity, and other parameters, play a crucial role in data collection efforts within the fishing sector.
Sensor Selection and Quality: Choose high-quality, durable sensors (e.g., NKE WiSense sensors) designed for marine environments, capable of withstanding harsh conditions and providing precise measurements.
Calibration and Testing: Conduct regular calibration and field testing of environmental sensors to ensure data consistency and accuracy. This should include periodic comparison of readings with reference standards or backup sensors.
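The comparison against a reference standard can be quantified as the mean offset between a field sensor and a co-located reference; a drift beyond the sensor's stated accuracy signals the need for recalibration. This is a pure sketch with an illustrative function name; acceptance limits are instrument-specific.

```python
def calibration_drift(sensor_vals, reference_vals):
    # Mean offset between paired field-sensor and reference
    # readings; compare the result against the instrument's
    # stated accuracy to decide whether recalibration is due.
    diffs = [s - r for s, r in zip(sensor_vals, reference_vals)]
    return sum(diffs) / len(diffs)
```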
Data Collection Protocols: Establish clear protocols for when and how data is collected, including sampling intervals and environmental conditions that may impact readings.
Minimizing Data Gaps: Address potential gaps in environmental data through redundancy (e.g., using multiple sensors per parameter) and preventive maintenance plans to minimize downtime.
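With several sensors per parameter, a robust way to fuse the redundant readings is the median, which tolerates one faulty or offline sensor. A minimal sketch, assuming offline sensors report None:

```python
import statistics

def fuse_redundant(readings):
    # Median of the live readings tolerates one faulty sensor;
    # None entries (sensor offline) are skipped. Returns None
    # only when every sensor for the parameter is down.
    live = [r for r in readings if r is not None]
    return statistics.median(live) if live else None
```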
NOTE: this aspect will be updated after feedback from external evaluations
Best Practices for Data-Quality Checks
Ensuring high data quality is critical for accurate analysis and decision-making. Effective practices encompass checks during data collection, transmission, and processing.
Automated Data Validation: Implement automated validation scripts to detect anomalies, outliers, or missing data immediately upon collection or arrival at the processing hub.
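Such a validation script can start with simple range and completeness checks per field. The field names and limits below are illustrative assumptions; real limits would come from the sensor specifications.

```python
def validate(record, ranges):
    """Check one record against {field: (lo, hi)} limits; return a
    list of issues (missing fields, out-of-range values). Limits
    here are supplied by the caller and are sensor-specific."""
    issues = []
    for field, (lo, hi) in ranges.items():
        value = record.get(field)
        if value is None:
            issues.append(f"{field}: missing")
        elif not lo <= value <= hi:
            issues.append(f"{field}: {value} outside [{lo}, {hi}]")
    return issues
```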
Cross-Sensor Correlation: Cross-check data from different sensors (e.g., fuel consumption vs. towing force during towing) to detect inconsistencies that may indicate sensor faults or operational issues.
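A cross-sensor check of this kind can flag physically implausible combinations, for instance significant towing force recorded while fuel flow is near zero. The specific thresholds below are illustrative assumptions; real limits depend on the vessel, engine, and gear.

```python
def cross_check(fuel_lph, tow_kN):
    # Flag physically implausible fuel/towing-force combinations;
    # the thresholds are illustrative assumptions only.
    issues = []
    if tow_kN > 10 and fuel_lph < 1:
        issues.append("towing force recorded with near-zero fuel flow")
    if tow_kN < 0:
        issues.append("negative towing force reading")
    return issues
```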
Data Cleaning Processes: Establish rigorous data cleaning protocols, including filtering out erroneous readings and standardizing inconsistent data formats.
Periodic Audits: Conduct regular audits of data quality by comparing collected data against known baselines or reference data, such as environmental benchmarks or historical records (e.g., EMODnet, Copernicus).
Feedback Loops: Incorporate feedback loops with vessel operators and data analysts to continuously improve data accuracy through real-world testing and feedback.