Glossary

API

Hoppe provides ship data onshore via standard APIs (application programming interfaces). The whole API documentation is reachable under https://docs.hoppe-sts.com/apis.
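
A minimal sketch of how such an API could be consumed. The placeholder host, the route and the JSON fields below are illustrative assumptions, not the documented Hoppe API; see https://docs.hoppe-sts.com/apis for the real interface.

```python
import json

BASE_URL = "https://api.example.com/v1"  # placeholder host, not a real Hoppe endpoint


def build_signal_request(vessel_imo: str, signal: str, start: str, end: str) -> str:
    """Compose a request URL for time-series data of one signal (hypothetical route)."""
    return (f"{BASE_URL}/vessels/{vessel_imo}/signals/{signal}"
            f"?from={start}&to={end}")


def parse_response(body: str) -> list[tuple[str, float]]:
    """Turn an assumed JSON payload of timestamped samples into (timestamp, value) pairs."""
    payload = json.loads(body)
    return [(p["ts"], p["value"]) for p in payload["data"]]


# Example with a canned response body instead of a live call:
sample = '{"data": [{"ts": "2024-01-01T00:00:00Z", "value": 12.3}]}'
url = build_signal_request("9348675", "speed_og", "2024-01-01", "2024-01-02")
points = parse_response(sample)
```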

Data Acquisition

Data acquisition is the process of sampling signals that measure real-world physical conditions, provided by sensors, components or external systems, and converting the resulting samples into digital numeric values.
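
As an illustration of the conversion step, a minimal quantization sketch, assuming a generic 12-bit analog-to-digital converter with a 0–10 V input range (the resolution and ranges are assumptions, not Hoppe specifics):

```python
def quantize(voltage: float, v_min: float = 0.0, v_max: float = 10.0,
             bits: int = 12) -> int:
    """Map an analog sensor voltage onto an integer ADC code."""
    levels = (1 << bits) - 1
    clamped = min(max(voltage, v_min), v_max)  # saturate out-of-range inputs
    return round((clamped - v_min) / (v_max - v_min) * levels)


def to_engineering_units(code: int, lo: float, hi: float, bits: int = 12) -> float:
    """Convert an ADC code back into a physical value, e.g. a draft in metres."""
    levels = (1 << bits) - 1
    return lo + code / levels * (hi - lo)
```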

Data Butler

The Data Butler is a cloud-based service that provides ship and fleet data to the customer via standard interfaces for further processing. Different service levels are available, from high-resolution data provision on demand up to full data quality information.

Data Encryption

In general, data encryption is a security method that encodes information so that it can only be accessed with the correct encryption key. In terms of cyber security, all data is cryptographically signed and sent to shore via an encrypted channel.
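
A minimal sketch of cryptographic signing, using a generic HMAC-SHA256 construction as a stand-in; the actual signing scheme, key handling and channel encryption used on board are not specified here.

```python
import hashlib
import hmac


def sign(payload: bytes, key: bytes) -> str:
    """Attach a signature so the shore side can verify integrity and origin."""
    return hmac.new(key, payload, hashlib.sha256).hexdigest()


def verify(payload: bytes, key: bytes, signature: str) -> bool:
    """Recompute the signature and compare in constant time."""
    return hmac.compare_digest(sign(payload, key), signature)
```

Any tampering with the payload in transit makes the recomputed signature differ, so verification fails on the receiving side.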

Data Highway

The data path from sensor signal acquisition to data provision on shore, used by the Data Butler and Data Inspector services.

Data Inspector

The Data Inspector is an add-on service to the Data Butler and is also available as a stand-alone service.

It comprises the daily data check, fleet data quality reports and troubleshooting actions according to our service level agreements.

Data Maintenance

Data Maintenance represents all service activities to keep the data quality of the vessels at a high level.

Data maintenance services are implemented in the Data Inspector and the Analysis Catalog.

Data Quality

The data quality is the arithmetical ratio of valid to invalid/implausible data. Invalid data includes “Not a Number” (NaN) values as well as empty log file entries. A data point is considered implausible when it exceeds physically plausible thresholds, which are meticulously defined for each signal.
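
The ratio can be sketched as follows, assuming illustrative plausibility thresholds (the real thresholds are signal-specific and defined by Hoppe):

```python
import math


def data_quality(samples: list, lo: float, hi: float) -> float:
    """Share of valid samples in percent.

    A sample is invalid if it is NaN or missing (None, e.g. an empty log
    entry) and implausible if it falls outside the [lo, hi] threshold band.
    """
    if not samples:
        return 0.0
    valid = sum(
        1 for s in samples
        if s is not None and not math.isnan(s) and lo <= s <= hi
    )
    return 100.0 * valid / len(samples)
```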

Data Validation

Data validation is a process to ensure data quality. It uses a cascaded process (data validity flag, check for threshold exceedances, device/component health determination, correlation matrix between sensors and devices) to check data for correctness and meaningfulness.
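
The cascade can be sketched as a sequence of checks in which a sample passes only if every stage passes; the stage names mirror the text, while the concrete checks are simplified assumptions:

```python
import math


def validate(value, validity_flag: bool, lo: float, hi: float,
             device_healthy: bool, correlation_ok: bool) -> bool:
    """Cascaded validation: every stage must pass for the sample to be valid."""
    if not validity_flag:                      # 1. data validity flag
        return False
    if value is None or math.isnan(value):     #    missing or NaN sample
        return False
    if not (lo <= value <= hi):                # 2. threshold exceedance check
        return False
    if not device_healthy:                     # 3. device/component health
        return False
    return correlation_ok                      # 4. cross-sensor correlation
```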

There is a differentiation between Data Validation and Profound Validation.

Developer Portal

The documentation accessible in the Developer Portal provides all means to access this data in your custom-tailored application.

Starting at https://docs.hoppe-sts.com, developers find all the required information and context.

Downtime

The term downtime is used in two different contexts:

  1. Periods when our data services are unavailable,
  2. Non-operational periods of onboard systems and equipment for monitoring and control.

Evaluation

The evaluation is manual work performed by a service technician or marine engineer. Depending on the agreed data services, a customer-specific selection or the entire analysis catalogue is evaluated. In the event of abnormalities in the ship’s operations, recommendations are written and the further procedure is coordinated with the customer.

Fleet Data Quality

The fleet data quality, in percent, is the weighted average of the data quality for a fleet under service contract, as defined by the customer.
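
A minimal sketch of the weighted average, assuming illustrative per-vessel weights (the contract-specific weighting is defined per customer):

```python
def fleet_data_quality(per_vessel: dict) -> float:
    """Weighted average of vessel data quality in percent.

    per_vessel maps vessel name -> (quality_percent, weight).
    """
    total_weight = sum(w for _, w in per_vessel.values())
    if total_weight == 0:
        return 0.0
    return sum(q * w for q, w in per_vessel.values()) / total_weight


# Example: two vessels, the second weighted twice as heavily.
fleet = {"Vessel A": (90.0, 1.0), "Vessel B": (99.0, 2.0)}
```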

Fleet Data Quality Timeline

For each vessel, two timelines (primary signals and all signals) show the vessel-specific data quality over the reporting period. All data is loaded into a ship database. Algorithms determine a quality index for each aggregation. This index is supplemented by an overall quality for the observed period.

Fleet Data Summary

The Summary Report lists the key operational values for each vessel in the fleet. It gives information about the travelled distance, the consumed fuel, the specific fuel oil consumption, average speed, main engine load and energy consumption. In the last row, all key values are summarized to give an aggregated view of the fleet performance.

Fleet Summary Report

The Fleet Summary Report provides an aggregated overview of the data quality and its development over time for each vessel in a whole fleet, using a traffic light system. In addition, trend graphs show the temporal evolution of the data quality for primary and all signals. The report consists of the Fleet Data Quality Timeline, the Total Fleet Data Quality Timeline and the Fleet Data Summary.

The Summary Report can be extended by the Validity Timeline for each individual vessel upon request.

HOMIP

HOMIP – HOPPE Embedded-iPC for data collection, processing, control and monitoring of applications onboard a vessel.

Hoppe Datapool

The cloud-based Hoppe Datapool on the shore side contains all necessary vessel data, particulars and details for the internal Data Inspector services and the developer platform for Data Butler customers.

Primary Signals

In the maritime context, primary signals are described in the ISO 19030 standard. They include, but are not limited to:

  • Main Engine, Boiler, Auxiliary Engines Fuel Mass Supply Actual [t/h]
  • Main Engine and Auxiliary Engine Power [kW]
  • ME Torque Actual [kNm] and Shaft Speed Actual [rpm]
  • Course OG Actual [°] and GPS Position
  • Speed OG and Speed TW [kn]
  • Wind Speed and Direction

Profound Validation

The profound validation is carried out in the validation chapter of the Data Analysis Catalog. If abnormalities are detected, e.g. with the MAIHAK Shaft Power Meter or the draft sensors, it includes an evaluation with recommendations for the further procedure.

Remote Service

Hoppe provides Remote Service at different levels. For detailed information, see our SLAs (service level agreements).

Sensor Fusion

Sensor Fusion is the mathematical combination of sensor information to validate measured data and to determine system condition.

One example of sensor fusion to minimize uncertainty is the draft measurement system: the correlation check of drafts, bending and torsion measured by individual sensors allows a profound validation after the sensor fusion.
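
A minimal sketch of such a correlation check, assuming a simplified hog/sag model in which the midship draft should match the mean of fore and aft drafts plus a bending contribution (both the model and the tolerance are illustrative, not the actual fusion algorithm):

```python
def drafts_consistent(draft_fore: float, draft_aft: float,
                      draft_mid: float, bending_mm: float,
                      tolerance_m: float = 0.05) -> bool:
    """Cross-check three draft sensors against the hull bending measurement.

    Under the simplified model, the midship draft equals the mean of the
    fore and aft drafts plus the hog/sag deflection (here in millimetres).
    """
    expected_mid = (draft_fore + draft_aft) / 2 + bending_mm / 1000.0
    return abs(draft_mid - expected_mid) <= tolerance_m
```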

Service Inhouse (SIH)

Service Inhouse represents remote troubleshooting activities by a service technician that do not take place on board.

Ship to Shore

As a technology, Ship to Shore enables data transfer from onboard a vessel to the cloud-based infrastructure. It is the essential satellite-based communication link between the vessels and Hoppe’s shoreside Datapool in the Data Highway.

Signal List

The signal list is a vessel-specific list which shows all logged signals together with their aggregation and processing information.

Total Fleet Data Quality Timeline

The Total Fleet Data Quality Timeline represents the aggregated and weighted data quality for the whole fleet. It shows graphs for the behaviour of the primary and all signals, including thresholds.
