Low data quality can lead to an annual loss of $300,000 per vessel.


We provide shipping companies with high-quality data for their requirements in the areas of SAFETY, COMPLIANCE, PLANNING and OPTIMIZATION. Only those who have trustworthy data on shore can make the right decisions on board.

70 Years of Marine Precision


In 1949, the German engineer Hans Hugo Karl Hoppe founded the company Hoppe Bordmesstechnik. His business idea was to deliver precise instruments and measuring equipment that could exactly determine the speed and power of seagoing vessels. Numerous technical inventions and patents for on-board measuring systems characterized Hans Hoppe's working life. Today, Hoppe is a family-owned group of companies with a global presence. Over many years of organic growth, the company entered various maritime markets to serve globalized shipbuilding and, combined with countless inventions and patents, has reached a leading position in several maritime business areas, including marine digitalization.

The company's continued passion for technology and its permanent motivation to deliver customer-oriented products and services have been the keys to its success. The combination of seven decades of engineering know-how and sustained on-board experience with the continuous development of new technologies has made Hoppe a major player in the digitalization of the marine industry. A key demand of today's maritime data acquisition is the exact validation of a large number of a ship's operational and nautical measurements after they are received onshore. Hoppe offers you a convenient one-stop solution with high data quality on board and superior analysis ashore.

With over 70 years of experience and more than 7,000 installations, Hoppe Marine can be the trusted technology partner in a vessel's digitalization process and helps relieve the owners' and operators' technical personnel. As we have always done and always will do.


What do validated measuring instruments for planning and optimization mean for fleet operation centres in fleet benchmarking?

On a container ship, an unnoticed speedlog offset of 0.3 kn, together with the use of static instead of dynamic draught measurement over a period of one year, led to an incorrect adjustment of the dynamic trim. The assumed operating point, with a consumption of 80 tons/day, corresponded to the optimum trim of 0.4 m by the head at 16 kn for the 330 m long container ship. However, the real dynamic trim was 0.5 m by the stern. At an average of 15.7 kn STW (speed through water), this resulted in an additional 4% in power requirements. The facts came to light during an initial evaluation of the Data Analysis Catalog, including measurement equipment validation. The missed savings potential, at 200 days at sea and $400 per ton of MGO, was calculated at $300,000.
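A rough plausibility check using only the figures above: 4% of 80 tons/day is about 3.2 tons/day; over 200 sea days that is roughly 640 tons of MGO, which at $400 per ton comes to about $256,000, the same order of magnitude as the stated $300,000.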

Data Butler

Provision of excellent ship data

The Data Butler is a cloud-based service that provides ship and fleet data to the customer via standard interfaces for further processing. Different service levels are available, from high-resolution data provision on demand up to full data quality information.

  • Provision of ship data with easy access
  • Full system responsibility
  • Unlimited cloud-based data storage – with 24/7 access
  • Focus on cyber security – encrypted ship-to-shore data transfer
  • Remote service, updates and customer-specific configuration

Data Inspector

Ensure Data Quality

The Data Inspector is an add-on service to the Data Butler and is also available as a stand-alone service. It comprises daily data checks, fleet data quality reports and troubleshooting measures, and thus provides the data basis for optimized fleet planning and vessel operation.

  • Ensured data quality as a precondition for compliance, planning and optimization
  • Continuous system health check and daily data evaluation by qualified service technicians, analysts and marine engineers
  • Fleet data transparency through automatic reports, including system health and quality KPIs
  • In-depth evaluation of primary signals to detect implausibilities, drifts and offsets
  • Fast response – (remote) service attendance reduces downtime to a minimum

Choose your perfect fit – our data packages

Transparency · Quality · Maintenance · Analyzer

ON BOARD
Ship data server application incl. configuration
Web-based data transmission software
  • Configurable bandwidth limit
  • Bandwidth detection prevents transmission when the risk of a weak connection is detected.
Completeness check for data transmission

SatCom hardware and airtime are not included.

CLOUD BASED DATA HOSTING
Backup protection
  • Data hosting in EU server centres with six backup copies.
Unlimited data storage volume
Full access to raw data

Further information is available on our API Developer Portal: https://docs.hoppe-sts.com.

Flexible configuration of vessel data
  • Free remote update of data configuration, logging and export configuration.
Data Quality API
  • Quality data via API, with quality information for individual variables.
DAILY CHECK

Routine work is performed every regular working day by a service technician.

Daily System Health Check
  • Check whether monitored Hoppe systems are online
  • Check whether the logging device is active
  • Completeness check for data transmission
Fleet Data quality timeline
Total Fleet Data quality timeline
Data check for validity and threshold exceedances
  • The availability and validity of all signals according to the agreed signal list are checked.
  • All signals are checked against physical thresholds for plausibility.
Fleet Data Summary
PROFOUND VALIDATION AND TROUBLESHOOTING

The validation chapter of the Data Analysis Catalog is carried out quarterly.

Measurement devices

Profound validation of the following devices will be carried out:

  • Shaft power meter
  • All flowmeters
  • Speedlog
  • Draught sensors
Downtime Analysis

Downtime analysis for systems and signals.

Primary signals correlation matrix

Correlation and time series evaluation of ISO19030 primary signals to detect specific behavior, sensor errors and offsets.

Troubleshooting Analysis

One individual troubleshooting analysis per vessel per year for vessels under contract. Typical use cases for an evaluation are data gaps, speed claims, groundings and bunker claims.

  • The user can choose from more than 100 plots and visualizations.
  • The user-specific selection of the Analysis Catalog will be evaluated.
SERVICE LEVEL AGREEMENTS
Basic (Tier I)
  • 48 h e-mail response / notification
  • 1 h Service Inhouse (SIH) per vessel and month for general troubleshooting included (e.g. 20 vessels = 20 h included)
Advanced (Tier II)
  • Technical Service Hotline – working days 9 am – 5 pm
  • 24 h e-mail response / notification
  • Remote maintenance
  • Billed via SIH in cases not caused by Hoppe or not covered by warranty
UPCOMING FEATURES
CSV / Excel download
  • Freely selectable time range for ad-hoc analysis or troubleshooting.
Full access to high-resolution data
  • High-resolution historic data available on demand.
HODROP - File Sync
  • Encrypted file transfer from the ship directly to the office by drag & drop.
  • Notification for new files.
  • Automatic bandwidth check.



Frequently Asked Questions

Can the chosen data package be changed after the initial setup of interfaces between Basic Data and Quality Data?

Yes, the subscription model can be changed and optimized afterwards according to the client's needs. Besides this, please get in touch to learn about our SDK.

Is the provision of historical data possible?

The APIs provide all data files that are available for the specific vessel. For example, if data recording with the embedded iPC HOMIP2 on board started two years before the contract was signed, this data can be made available on the shore-side API upon customer request. Furthermore, see "Is it possible to load external or historic data into the data pool via interface?" in the Ship-to-Shore FAQ.

Is the data also retrievable by Fleet Management systems / Fleet Optimization providers?

In principle, every IT system that communicates with REST APIs is capable of retrieving data from the Hoppe Data-Pool. Interfaces to business intelligence tools such as Tableau, as well as to Elasticsearch or classic ISO-SQL-compliant database systems via JDBC, can therefore be provided. Please contact our Hoppe Marine system administrators for technical details.

How can the information about data quality be retrieved?

Hoppe Marine offers a simple interface for retrieving information about data quality. The interface is described in the developer documentation. Information about data quality of every single data point as well as of entire signal groups can be retrieved via this interface.

Is it possible to export data for further individual work with Excel?

Offering raw data in various formats (e.g. .csv and .xls) is planned as a new feature later in 2020. Today, raw data is offered as optimized SQLite data or in JSON format.

Where can I get information about the APIs?

Hoppe Marine offers a developer portal for clients. The portal can be reached worldwide 24/7 via docs.hoppe-sts.com. After successful registration, the client's developers have access to all necessary API details and functionalities. Prior to first use, a user account must be set up by our Hoppe Marine administrators.

How often will new data be available via the API?

This behaviour is fully customer-configurable. Data update rates range from every two minutes to once per day. This is always a trade-off between the data volume used and real-time data requirements. The configuration can be made during the early project planning phase for the initial installation, and changes are possible later on. A typical configuration with sufficiently high resolution for data evaluation and fleet optimization is an export logging rate of 1 min with an export interval and transmission every 5 min.
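Read together, and as our interpretation of these settings: each 5-minute export block then contains five 1-minute samples per logged signal, so new data appears on the shore side roughly every five minutes plus transmission time.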

Is it possible to have access to a sample data set, in order to replicate the live API before the vessel is online?

A representative data sample will be provided upon request. The data set contains one month of representative simulated ship data. The available set of signals (database columns) as well as the data aggregation for a specific vessel might differ from the sample data set. All these configurations (data aggregation rate, data export rate and the like) are at the customer's disposal and can be adjusted to customer needs.

Can I get assistance from Hoppe Marine with setup and configuration if I have no experience in retrieving data via APIs?

Hoppe Marine is delighted to assist during the initial setup. For this purpose, an initial consultation meeting will be held to discuss all requirements and to provide possible short-term solutions for optimized utilization of the Data-Pool services. After that, we can decide how Hoppe Marine can further assist the client. In any case, full access to the system and interface documentation is granted after the initial consultation meeting.

Will the signals API be versioned? In other words, is the current API version "1" and will any changes to the signals structure and mappings be named and released separately?

Any backwards-compatibility-breaking changes will be announced and versioned via path versioning. For the moment, our development roadmap contains only schema additions, which we regard as non-breaking minor releases. These will not be reflected in the version scheme.

We have tried to gain access to the signals API, but we are still getting the response "403 Forbidden". Is this expected? It would be very helpful for us to have the full signal schema so that we can integrate generally with the API.

Could you please check that you have subscribed to the Signals API in the developer portal and that you provide your personal API key (obtainable via the dashboard at https://docs.hoppe-sts.com) in the x-api-key header field of your GET request? Please also check that you can retrieve data via the "Try it out" functionality in the developer portal. A sample request in cURL would look like:

    curl -X GET "https://api.hoppe-sts.com/signals/collections/hoppe/signals" -H "accept: application/json" -H "Authorization: " -H "x-api-key: "
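For developers working in Python rather than on the command line, the same request can be sketched as follows. This is a minimal illustration of the call above; the placeholder API key is an assumption to be replaced with your own key from the developer portal:

    import requests

    # Endpoint taken from the cURL example above
    URL = "https://api.hoppe-sts.com/signals/collections/hoppe/signals"

    API_KEY = "YOUR_API_KEY"  # placeholder: obtain your personal key from the portal

    response = requests.get(
        URL,
        headers={"accept": "application/json", "x-api-key": API_KEY},
        timeout=30,
    )
    response.raise_for_status()  # a 403 here points to a missing subscription or key
    print(response.json())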

We have been receiving a 429 Too Many Requests response code when sending multiple requests simultaneously across several vessels. In the short term, we can limit the requests to only registered vessels that are confirmed to have data. However, as more vessels are brought onto our platform, we may run into these limits in a more significant way, especially as we will continue to utilize a parallelized workflow. What is the API specification on the rate limiting threshold and retry periods?

Yes, the APIs are rate-limited by default. This is to encourage users to use local caching for slowly changing data. However, the limits are set per API key and can always be adjusted to the customer's needs. In order to adjust the limits, we ask you to provide us with numbers about your expected usage pattern. The information we would need is:

  • regular requests/second
  • max. burst requests/second
  • requests per month.

With these numbers at hand we can adjust your account settings accordingly.
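On the client side, a pattern that works well with such limits is exponential backoff on 429 responses. The sketch below assumes only what the answer above states (standard HTTP 429 codes per API key); whether the API sends a Retry-After header is not specified, so the code falls back to its own delay:

    import time

    import requests

    def get_with_backoff(url: str, headers: dict, max_retries: int = 5):
        """GET a rate-limited endpoint, backing off exponentially on HTTP 429."""
        delay = 1.0
        for _ in range(max_retries):
            response = requests.get(url, headers=headers, timeout=30)
            if response.status_code != 429:
                response.raise_for_status()
                return response
            # Honor Retry-After if the server sends it, otherwise back off exponentially
            retry_after = response.headers.get("Retry-After")
            time.sleep(float(retry_after) if retry_after else delay)
            delay *= 2
        raise RuntimeError("Rate limit still exceeded after retries")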

Which products are utilised and how do you manage your cloud security?

Amazon S3 and Amazon DynamoDB are used for storage. Data in S3 is encrypted at rest using the SSE-KMS method; the underlying block cipher is AES-256. Data in DynamoDB is likewise encrypted at rest.
Regarding cloud security management, there are four major components:

  1. From a user-management perspective, our APIs use a fine-grained role-based access control model with Amazon Cognito and Amazon API Gateway for authentication and authorization.
  2. Inside AWS, all our cloud resources use AWS Identity and Access Management (IAM) to implement least-privilege access.
  3. Hoppe uses security groups where possible to restrict access as well.
  4. Hoppe uses AWS Secrets Manager for handling secrets internally.
What data storage volume can we expect on the vessel? (Is the data stored on the vessel or in the cloud?)

All raw data is kept on the ship as the primary data source and is stored safely on the HOMIP2 device's flash memory. With a 64 GB flash drive, one can expect a log retention of at least two years in a standard performance monitoring solution (300 signals logged once per minute on average). Before we transmit the data to shore, we pre-aggregate it in order to optimize the data volume with regard to the transmission volume used. The exported data therefore usually has a lower time resolution than what is available on the ship, though this is customer-configurable. From this point of view, the cloud storage is in most cases a down-sampled mirror of the on-board database.
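A rough cross-check of the retention figure (our arithmetic, not from the source): 300 signals logged once per minute produce 300 × 60 × 24 = 432,000 values per day, i.e. about 315 million values over two years, so a 64 GB flash drive leaves on the order of 200 bytes per stored value for the data itself, timestamps and database overhead.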

How is the daily data inspection handled?

Daily evaluation of data quality is performed by a team consisting of marine engineers, naval architects and experienced service technicians. At least once per working day, the team visualizes and traces data losses, implausibilities and quality deterioration at the highest aggregation level. Based on sudden events as well as gradual changes, technical clarification and troubleshooting is carried out. Cases that can be solved internally by analyzing the data with the Hoppe Marine Signal Inspector, the Data Analysis Catalog or via remote access, as well as communication with clients or third-party providers for interface troubleshooting, are part of the daily business.


What happens when a faulty sensor, an invalid signal or a failed interface is detected?

If a signal from a normally intact sensor transmits invalid data, a check for implausible behaviour due to unfavourable environmental or operating conditions is performed to determine whether the sensor is actually malfunctioning. For example, an alleged offset of a flow meter during a ship's standstill can have various causes.

In case conspicuous sensors are detected, further arrangements with clients are possible, e.g. directly offering corresponding spare parts and service jobs.

How is the data quality calculated?

The data quality is based on the quality of every single signal. One can distinguish between primary signals (data essential for reporting and optimization of operation) and secondary signals (further ship operation data for monitoring, planned maintenance, etc.) to output an aggregated quality. The quality for each signal is calculated taking invalid and implausible data into account (limit value exceedances etc., detected by algorithms).

Furthermore, the Data Analysis Catalog includes a correlation matrix covering different engineering processes and their relationships on board. The correlation has no direct influence on the data quality figure, since in this case a technician performs an evaluation to determine the exact faulty input variable.
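To make the principle concrete, here is a minimal sketch of a per-signal quality index, assuming, as described above and in the glossary, that invalid means NaN or missing and implausible means outside the signal's physical thresholds. The signal name and threshold values are hypothetical:

    import math

    # Hypothetical physical thresholds; real limits are defined per signal list
    THRESHOLDS = {"speed_through_water_kn": (0.0, 35.0)}

    def signal_quality(name: str, samples: list) -> float:
        """Share of samples that are valid (not NaN/None) and physically plausible."""
        low, high = THRESHOLDS[name]
        good = sum(
            1 for s in samples
            if s is not None and not math.isnan(s) and low <= s <= high
        )
        return good / len(samples) if samples else 0.0

    # One NaN and one impossible value halve the quality index:
    print(signal_quality("speed_through_water_kn", [15.7, float("nan"), 99.0, 16.0]))  # 0.5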

Can the data quality be monitored by other sensors?

Yes, all system-included sensors and values can be monitored. The detection of failed interfaces and malfunctioning sensors, as well as testing for exceedance of operationally unreachable limit values, are part of the scope of performance. Normally, in-depth validation algorithms that assume knowledge of sensor behavior can only be applied to sensors made by Hoppe Marine.

Are individual evaluations and reports possible?

The package “Operational Analyst” includes complete or customized analysis catalogs that offer in-depth information about the vessel operation. The fact sheet for the Data Analysis Catalog can be found on the hoppe-marine.com website.

Which requirements must be met by the vessel’s IT infrastructure?

The Hoppe Ship-to-Shore connection does not include its own VSAT connection; the client must therefore ensure that a VSAT connection is available. For communication with the land-based servers, the following IP addresses and port must be enabled for outgoing TCP traffic from the HOMIP2 in the vessel's firewall.

Primary Address – IP: 75.2.111.192 – Port: 11550
Fallback Address – IP: 99.83.166.216 – Port: 11550

A detailed checklist will be provided upon request.
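A quick way to verify from the vessel's network that these firewall rules are in place is a plain outbound TCP connectivity test, sketched here in Python with the addresses and port listed above. It only checks that outgoing connections are not blocked and says nothing about the application-level handshake:

    import socket

    # Primary and fallback endpoints from the firewall requirements above
    ENDPOINTS = [("75.2.111.192", 11550), ("99.83.166.216", 11550)]

    for host, port in ENDPOINTS:
        try:
            with socket.create_connection((host, port), timeout=10):
                print(f"{host}:{port} - outbound TCP connection possible")
        except OSError as exc:
            print(f"{host}:{port} - blocked or unreachable ({exc})")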

Is it possible to load external or historic data into the data pool via interface?

In terms of data storage, Hoppe Marine uses a "data lake" structure: a concept in which structured data can be loaded into the data pool from variable sources. Structured data here means e.g. log data, telemetry data, etc. For data analysis, time series data in various formats are the most relevant. Currently, the following formats are accepted: CSV, JSON, SQLite and Parquet. Furthermore, it is important that the client defines the column-name assignments, because for evaluation the data must be mapped to a universal naming standard.
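To illustrate the column-name assignment, here is a minimal sketch using pandas. The source column names and the target names are hypothetical; the actual naming standard is agreed per data set:

    import pandas as pd

    # Hypothetical mapping from a client's CSV columns to the universal naming standard
    COLUMN_MAP = {
        "ts": "timestamp_utc",
        "me_fuel": "main_engine_fuel_mass_supply_t_per_h",
        "stw": "speed_through_water_kn",
    }

    df = pd.read_csv("external_noon_reports.csv")  # hypothetical external source file
    df = df.rename(columns=COLUMN_MAP)             # align with the agreed standard
    df.to_parquet("import_ready.parquet")          # Parquet is one of the accepted formats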

Can data and files of external partners be transmitted?

Data transmission is package-based, and the content of the packages is irrelevant. The only requirement is that all packages are cryptographically signed prior to transmission, to give proper evidence of their source. Hoppe Marine therefore offers a crypto software solution on HOMIP units that is capable of signing data from various sources. A REST API provides access to this feature.

Currently, this feature is available only for Hoppe-internal services. The release of the interface for data transmission of external partners is part of our release plan 2021.

However, the embedded iPC HOMIP supports several protocols for data collection. With the iDBS database solution, Hoppe Marine offers a feature to use the interfaces of other manufacturers, store the data and make it available on the shore side. This feature is available right after activation of the service.

Is the Ship-to-Shore transmission secure?

The Ship-to-Shore transmission uses a multi-level security concept, which distinguishes between Identity Protection, Access Protection and Integrity Protection.

Identity Protection

The first level of the security concept ensures the trustworthiness of the communication partners. Every communication endpoint is fitted ex works with a private cryptographic key. This key never leaves the device and therefore cannot be compromised. Only a device with a correct key is enabled to transmit or receive data.

Access Protection

Once it is ensured that the data comes from a trustworthy source, a secure SSL-encrypted connection between the communication partners is established in the next step. This encrypted connection prevents access by third parties: only the two communication partners are able to read the data in clear text.

Integrity Protection

After successful transmission, a further step ensures that the data is intact and corresponds to the data sent from the vessel. For this, cryptographic signatures according to the industry standard RFC 7519 are used.
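RFC 7519 is the JSON Web Token (JWT) standard, so the verification step can be pictured roughly as follows. This sketch uses the PyJWT library; the key file, algorithm choice and usage shown here are illustrative assumptions, not Hoppe's actual implementation:

    import jwt  # PyJWT

    # Illustrative only: the real key material and algorithm are Hoppe-internal
    VESSEL_PUBLIC_KEY = open("vessel_public_key.pem").read()

    def verify_payload(token: str) -> dict:
        """Reject any payload whose RFC 7519 signature does not match the sender."""
        return jwt.decode(token, VESSEL_PUBLIC_KEY, algorithms=["ES512"])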

Is the data protected inside the Hoppe data pool?

Our provider, responsible for data storage, is certified according to ISO/IEC 27001:2013. The certification guarantees the following core concepts of a data storage provider in terms of data protection and security:

  • We evaluate our IT security risks systematically, considering the effects of threats and vulnerabilities.
  • We design and follow a complete range of IT security controls and other forms of risk management to address all security risks of the company and its architecture.
  • We maintain a comprehensive management process to ensure that our IT security controls meet our IT security requirements at all times.

Access to the data is permitted based on the least-privilege model: access to every resource must be actively granted by Hoppe administrators before a new user can access data. Data access can be defined in a very detailed and tailored way:

  • In general, for every user, access to specific vessels can be defined in detail.
  • Furthermore, it can be determined per user whether raw data may be viewed or only aggregated data is accessible.

In general, data access is only possible via a few, highly monitored channels. This facilitates early detection of unauthorized access.

How is the ship data encrypted in transit from ship-to-shore? What transfer protocols are utilized?

During transport, the data is encrypted via TLS from the endpoint on the ship to the endpoint in the cloud solution, and during all transport inside the cloud's private network. Additionally, the payload data is signed with a cryptographic key (EC-512) to eliminate any chance of data manipulation in transit. Hardware on the ship needs to prove its identity before a connection to shore can be established. This identity management is based on EC-512 certificates. Trust is established by exchanging the public keys of the device with the shore-side server during device production.
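For readers unfamiliar with this scheme, here is a rough sign-and-verify sketch in Python with the cryptography library, under the assumption that "EC-512" refers to ECDSA on a 521-bit curve such as NIST P-521; the exact curve and protocol details are Hoppe-internal:

    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import ec

    # Ex works: each device receives a private key that never leaves it (P-521 assumed)
    device_key = ec.generate_private_key(ec.SECP521R1())

    # During production, only the PUBLIC key is shared with the shore-side server
    public_pem = device_key.public_key().public_bytes(
        serialization.Encoding.PEM, serialization.PublicFormat.SubjectPublicKeyInfo
    )

    # Ship side: sign the payload before transmission
    payload = b"exported data block"
    signature = device_key.sign(payload, ec.ECDSA(hashes.SHA512()))

    # Shore side: verify with the known public key; tampering raises InvalidSignature
    shore_copy = serialization.load_pem_public_key(public_pem)
    shore_copy.verify(signature, payload, ec.ECDSA(hashes.SHA512()))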

What about accessibility and backup security?

All data is protected against loss by six backup copies. Furthermore, data is stored in at least two different server centres within the European Union (EU) to ensure data security and data integrity even in natural disasters such as fire. In terms of constant, worldwide accessibility, the infrastructure provider guarantees an availability of 99.47%. In the worst case, this means that access to data can be unavailable for at most about two days per year. In terms of data storage, it is assured at all times that data can be stored and catalogued appropriately and that no data gaps occur.

Can data get lost?

For data storage, Hoppe Marine uses a WORM (write once, read many) model: all data that has been stored once in the data pool cannot be changed any more, so it can neither be overwritten nor deleted. Furthermore, all data is protected against loss by six backup copies, and all data is stored in at least two different server centres within the European Union (EU) to ensure data protection and data integrity even in natural disasters such as fire.

Is it possible to transmit 1000 data points per minute from a vessel?

The chosen IT infrastructure is designed for good horizontal scaling. This means that with higher data volumes the service resources are scaled up accordingly. A transmission of 1000 data points per minute is therefore no problem, as long as the satellite connection offers the required bandwidth.

Do you have the capability to remotely access your ship hardware? What protocol is used to do this?

Upload from the shore side to the ship is only possible in a specific package format. The corresponding API endpoints are secured by two measures: user authentication as well as IP-based whitelisting. No ingress connection can be initiated from the outside towards the ship; only the ship can establish a connection, to two fixed unicast IP addresses, and no DNS resolution is involved in the process. Only approved service engineers can upload and install updates for the device on the ship and change connection parameters, but they never have direct remote access to it (e.g. via SSH or any remote shell). Only cryptographically signed update packages are accepted by the devices on the ship. The cryptographic material for signing is managed in a HashiCorp Vault implementation.

How is the data transmission working?

The data transmission is file-based, and the interval for data exports is configurable. For transmission, a satellite connection is used and a direct, encrypted connection to fixed IP addresses, without DNS, is established. All files are collected as database files and transmitted together in blocks. Using a hash method, the entire file content is subjected to an integrity test. The transmission of the encrypted data is subject to bandwidth regulation.
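Such a hash-based integrity test can be pictured as follows; this is only a sketch, since the concrete hash algorithm is not named here, so SHA-256 is our assumption for illustration:

    import hashlib

    def file_digest(path: str) -> str:
        """Hash the entire file content so sender and receiver can compare digests."""
        h = hashlib.sha256()  # assumption: the actual algorithm is not specified here
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # A digest mismatch between ship and shore marks the file for retransmission.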

How big is the transmission overhead in addition to the transmitted amount of data?

Encryption, connection set-up, etc. have already been optimized. However, one should always consider the following: the smaller the single exported data blocks are (i.e. the shorter the export intervals are), the bigger the relative overhead is. This is similar to a letter that always costs 80 ct, no matter whether it contains one A4 page or three. Measurements of the relationship between payload size and transmission overhead show that the relative overhead becomes smaller when bigger file sizes are encrypted and transmitted.
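Expressed as a simple formula (our simplification of the effect described above): with a fixed per-transmission overhead of O bytes and a payload of P bytes, the relative overhead is O / (O + P), which shrinks towards zero as the payload P grows.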

What happens when the integrity test fails? Will the entire file whose integrity test failed be transmitted again?

Yes, but this only happens to files whose transmission cannot be continued. In the "normal" case of a connection failure, the transmission continues where it was interrupted; only in the case of a high number of packet losses is the transmission repeated completely. Bandwidth detection prevents transmission when the risk of a weak connection is detected, so the complete repetition of a transmission is only a last resort. The maximum number of transmission attempts can be set per file, which avoids expensive, endless attempts. When better connection quality is detected, the transmission can be restarted via a web interface directly on board.

What is the origin of the timestamp for time series data?

The time series data is logged by Hoppe Marine's iDB server, which uses the master time on the HOMIP2 as the time source. This time source is configured on the HOMIP2 display and is independent of the operating system's date/time. If the customer has a GPS receiver on board, the HOMIP2 can also use the GPS time signal as the time source. GPS time signals are preferred, since they provide a well-established time standard and minimize the chance of entering the date/time incorrectly.

Why is time correctness essential for meaningful, high-quality data?

Our goal is to log and provide ship operation data accurately and consistently. This makes it possible in retrospect to identify the exact date and time at which an event occurred, or what a signal's value was at that moment. This cannot be done if the time source is unreliable. As a worst-case example, if the clock has been set back, duplicate data can occur for the same time range; if this happens, meaningful reports cannot be generated.

Get in touch

We'll be happy to hear about your project. Just drop us a line and we'll get back to you in no time.


Partner

Our APIs integrate well with additional service providers. See who is already using the Hoppe Data Services successfully.

The path to High Quality Data

Sensors
On Board Data Acquisition
Data Integration and Database
Monitoring and Control
Ship to Shore Connection
Data Storage
Data Processing
Data Analysis
Data Maintenance and API with Qualified Data

Glossary


Hoppe provides ship data onshore via standard APIs (application programming interfaces). The full API documentation is available at https://docs.hoppe-sts.com/apis.

Data acquisition is the process of sampling signals that measure real world physical conditions provided by sensors, components or external systems and converting the resulting samples into digital numeric values.

The Data Butler is a cloud-based service that provides ship/fleet data to the customer via standard interfaces for further processing. Different service levels are available, from high-resolution data provision on demand up to full data quality information.

In general, data encryption is a security method that encodes information so that it can only be accessed with the correct encryption key. In terms of cyber security, all data is cryptographically signed and sent to shore via an encrypted channel.

The data path from sensor signal acquisition to data provision on shore, as used by the Data Butler / Data Inspector services.

The Data Inspector is an add-on service to the Data Butler and is also available as a stand-alone service.

It contains the daily data check, fleet data quality reports and troubleshooting actions according to our service level agreements.

Data Maintenance represents all service activities to keep the data quality of the vessels at a high level.

Data maintenance services are implemented in Data Inspector and Analysis Catalog.

The data quality is the arithmetical ratio of valid to invalid/implausible data. Invalid data includes "Not a Number" (NaN) values as well as empty log file entries. A data point is considered implausible when it exceeds physically plausible thresholds, which are meticulously defined for each signal.

Data validation is a process to ensure data quality. It uses a cascaded process (data validity flag, check for threshold exceedances, device/component health determination, correlation matrix between sensors and devices) to check for correctness and meaningfulness.

There is a differentiation between Data Validation and profound validation.

The documentation accessible in the Developer Portal provides all means to access this data in your custom-tailored application.

To give developers more context, https://docs.hoppe-sts.com is the starting point for all required information.

The term downtime is used in two different contexts:

  1. Periods when our data services are unavailable,
  2. Non-operational periods of onboard systems and equipment for monitoring and control.

The evaluation is manual work performed by a service technician or marine engineer. Depending on the agreed data services, a customer-specific selection or the entire Analysis Catalog is evaluated. In the event of abnormalities in the ship's operations, recommendations are written and the further procedure is coordinated with the customer.

The fleet data quality, in percent, is the weighted average of the data quality across a customer-defined fleet under service contract.

For each vessel, two timelines (primary signals and all signals) show the vessel-specific data quality over the reporting period. All data is loaded into a ship database, and algorithms determine a quality index for each aggregation. This index is supplemented by an overall quality for the observed period.

The Summary Report lists the most interesting key operation values for each vessel in the fleet. It gives information about the travelled distance, the consumed fuel as well as the specific fuel oil consumption, average speed, main engine load and energy consumption. In the last row all key values are summarized to give an aggregated view of the fleet performance.

The Fleet Summary Report provides an aggregated overview of the data quality and its development over time for each vessel in a whole fleet, using a traffic-light system. In addition, a trend graph shows the temporal evolution of the data quality for primary and all signals. The report consists of the Fleet Data Quality Timeline, the Total Fleet Data Quality Timeline and the Fleet Data Summary.

The Summary Report can be extended by the Validity Timeline for each individual vessel upon request.

HOMIP – HOPPE Embedded-iPC for data collection, processing, control and monitoring of applications onboard a vessel.

The cloud-based Hoppe Datapool on the shore side contains all necessary vessel data, particulars and details for the internal Data Inspector services and the developer platform for Data Butler customers.

In the maritime context, primary signals are described in the ISO 19030 standard. They include, but are not limited to:

  • Main Engine, Boiler, Auxiliary Engines Fuel Mass Supply Actual [t/h]
  • Main Engine and Auxiliary Engine Power [kW]
  • ME Torque Actual [kNm] and Shaft Speed Actual [rpm]
  • Course OG Actual [°] and GPS Position
  • Speed OG and Speed TW [kn]
  • Wind Speed and Direction

The profound validation is carried out in the validation chapter of the Data Analysis Catalog. It includes an evaluation with recommendations for the further procedure, in case abnormalities with e.g. the MAIHAK Shaft Power Meter or draft sensors are detected.

Hoppe provides Remote Service in different levels. For detailed information see our SLA (Service level agreements).

Sensor Fusion is the mathematical combination of sensor information to validate measured data and to determine system condition.

One example of sensor fusion is the draught measurement system: correlating the draughts, bending and torsion measured by individual sensors allows a profound validation after sensor fusion and minimizes uncertainty.

Service Inhouse represents remote troubleshooting activities by a service technician that do not take place on board.

As a technology, Ship to Shore enables data transfer from on board a vessel to the cloud-based infrastructure. It is the essential satellite-based communication link between the vessels and Hoppe's shore-side Datapool in the Data Highway.

The signal list is a vessel specific list which shows all logged signals and the aggregation and processing information.

The Total Fleet Data Quality Timeline represents the aggregated and weighted data quality for the whole fleet. It shows graphs of the behaviour of the primary signals and of all signals, including thresholds.