The accident above was attributed to thick fog and poor visibility. The ship's captain was arrested on suspicion of gross negligence manslaughter, possibly because safety protocols were not followed. Beyond the loss of human life and the property destruction that will likely result in millions of dollars in damages, the environmental impact of aviation fuel leaking into the North Sea is a serious concern. With upward of 120K cargo-carrying ships traversing the oceans today, hauling some 11B tons of cargo each year, it is clear that the equivalent of the ADAS (Advanced Driver Assistance Systems) features that are standard on passenger cars today needs to be installed on ships.

Manually operated commercial ships and fishing vessels operate under strict safety protocols that prioritize collision avoidance. Cameras provide situational awareness in good lighting conditions. Dual-band (X- and S-band) 360° radars mounted on rotating gimbals provide range-adjustable angular resolution (range can be programmed from 1–48 nautical miles). In bad weather and low-visibility conditions, ship speed is reduced (to allow adequate braking distance in the event of obstacles or other ships), and coded sound and light signals are used to warn neighboring vessels of the ship's proximity. GPS-enabled AIS (Automatic Identification System) allows all ships to see the location and exact vessel type/registration of other ships in the vicinity. As maritime autonomy develops, the challenge is to translate these protocols and sensing systems to computers and AI.
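The core collision-avoidance calculation behind these protocols, whether performed by a watch officer or a computer, is the closest point of approach (CPA) and time to CPA (TCPA) between two vessels, derived from the positions and velocities that AIS broadcasts. A minimal sketch, assuming straight-line motion and flat-earth coordinates over short ranges (the function name and example numbers are illustrative, not from any specific product):

```python
import math

def cpa_tcpa(p1, v1, p2, v2):
    """Closest point of approach (nm) and time to CPA (hours) between
    two vessels, given 2D positions in nm and velocities in knots.
    Assumes both vessels hold course and speed."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]          # relative position
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]        # relative velocity
    dv2 = dvx * dvx + dvy * dvy
    if dv2 == 0:                                   # same velocity: range never changes
        return math.hypot(dx, dy), 0.0
    tcpa = max(0.0, -(dx * dvx + dy * dvy) / dv2)  # time of minimum separation
    cpa = math.hypot(dx + dvx * tcpa, dy + dvy * tcpa)
    return cpa, tcpa

# Own ship at the origin heading east at 12 kn; target 5 nm away, closing
cpa, tcpa = cpa_tcpa((0, 0), (12, 0), (4, 3), (-6, -6))
```

A watchkeeping system would compare `cpa` against a safety threshold and raise an alarm when both CPA and TCPA fall below configured limits.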

Orca-ai, headquartered in Israel with locations in the United Kingdom, Greece and Singapore, aims to do exactly this, and more, by providing perception sensors and AI (Artificial Intelligence) software to enhance situational awareness and operational efficiency. The hardware product is the “SeaPod”, an automated watchkeeper with a hardened camera pod housing 8 high-resolution daytime cameras and 3 thermal cameras (Figure 1). The company currently generates significant revenues ($Ms) from the world’s leading shipping companies.

The technical details of the SeaPod are as follows:

  1. Lenses packaged inside the enclosure, Aperture: f/1.8–f/2.8 (visible), f/1.0–f/1.4 (thermal); no window heating or self-cleaning capability
  2. Camera Resolution: 16 MP (visible), 640×512 (thermal)
  3. Thermal camera wavelength: LWIR, 8–12 µm
  4. SeaPod platform is not gimbal mounted; Field of View (FoV): 225° (visible), 100° (thermal)

The Orca platform is currently installed on ~800 maritime vessels across 40 customers (container ships, tankers, bulk carriers, car carriers and cruise ships), with 300 more to be outfitted in the near future. Camera data, along with onboard radar, GPS and other sensor data (IMU, speed, heading, etc.), is fused by proprietary on-board AI software to warn the crew of hazardous situations and obstacles. As satellite-based services like Starlink have become more available and reliable, real-time data from the platform can be transmitted to Orca for mapping routes, traffic and landmarks, critical information for developing autonomous navigation capability. The data is also shared in real time across its network of customers (FleetView Dashboard) to enable safer and more efficient navigation, providing performance insights, optimizing best practices and ensuring regulatory compliance (Figure 2):

In Figure 2, red dots represent safety or operational events. The Safety Rating quantifies multiple parameters such as severity, location, trend, historical information and visibility, and ranges from 1–10 (10 is the safest). The blue circles are ships equipped with Orca’s SeaPod platform, which uploads live locations.
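Orca-ai does not publish how these parameters are combined, but ratings of this kind are typically a weighted aggregation of normalized risk factors mapped onto a fixed scale. A purely hypothetical sketch (the weights, factor names and mapping are illustrative assumptions, not the proprietary formula):

```python
def safety_rating(severity, trend, visibility, history,
                  weights=(0.4, 0.2, 0.2, 0.2)):
    """Hypothetical illustration: fold normalized risk factors
    (0 = best, 1 = worst) into a single 1-10 rating (10 = safest).
    The actual Orca-ai formula is proprietary and not published."""
    factors = (severity, trend, visibility, history)
    risk = sum(w * f for w, f in zip(weights, factors))  # weighted risk in [0, 1]
    return round(1 + 9 * (1 - risk), 1)                  # map onto the 1-10 scale

# A vessel with mostly low risk factors scores near the safe end of the scale
rating = safety_rating(severity=0.1, trend=0.2, visibility=0.0, history=0.1)
```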

Coincidentally, a SeaPod platform mounted on the vessel of one of Orca-ai's customers recorded in real time (Figure 3) the tragic accident shown in the lead figure of this article.

In heavy fog, daytime cameras cannot see through. Thermal cameras excel in such environments: they operate at much longer wavelengths and measure temperature differences relative to the environment rather than reflected light. In Figure 3, the thermal camera was able to image at a distance of ~2 nautical miles (~3.7 km). Given the safety protocols described earlier, it is puzzling that the accident occurred at all. Could it have been avoided if one or both vessels had been outfitted with the SeaPod platform?

Orca recently announced that Seaspan Corporation, which owns ~200 vessels, will outfit its entire fleet with Orca-ai’s platform. The decision was driven by ~2 years of data gathering on a smaller portion of the fleet, which indicated savings of $100K in fuel costs and a reduction in CO2 emissions of 500 tons/ship/year. Marubeni in Japan, a large import-export company, is a global distributor of Orca-ai’s products and services; it also owns ships that are outfitted with these products. The NYK group is another customer. In 2022, Orca AI and NYK completed a successful autonomous voyage trial in congested waters off Japan’s east coast as part of the DFFAS (Designing the Future of Full Autonomous Ships) consortium. Other notable customers include Maran Tankers, MMSL and EPS.

Yarden Gross is the CEO of Orca-ai. With deep experience in shipping (previously a commando in the Israeli Navy), he started Orca-ai in 2018 with co-founder and CTO Dror Raviv (also a Navy veteran with experience operating unmanned ships) to “empower shipping with data-driven technologies and the automation needed for navigating the safest voyages with the most efficient operations”. He explained that Orca’s intellectual property enables depth extraction with monocular cameras by training the AI system on images of objects of known size at different ranges. A labeled data set of ~0.5M samples spanning visibility conditions and locations is used to train the range-estimation algorithms, with radar and AIS locations serving as ground truth. These algorithms estimate distance and bearing to all detectable objects above the surface (not just large vessels with known AIS locations). The visible cameras can detect objects as small as a jet ski (1 m wide) at a range of 3 nautical miles (~5.5 km). Orca AI’s algorithms are also trained on thermal datasets, providing the same capabilities in low-visibility conditions but at shorter ranges (due to lower resolution, temperature variations, etc.). The accuracy of the range estimate is ~10% of the total range. During operation, these algorithms estimate range at distances of 2–4 nautical miles.
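Orca's range estimation is learned from labeled data, but the geometric intuition it exploits is the classic pinhole relation: an object of known physical width subtends fewer pixels the farther away it is, so range ≈ true width × focal length (in pixels) / width in pixels. A minimal sketch of that first-order relation (the numbers are hypothetical, and the real system replaces this closed form with a trained model):

```python
def pinhole_range_m(true_width_m, width_px, focal_px):
    """First-order monocular range estimate for a small object of
    known physical width, using the pinhole camera relation:
    range = true_width * focal_length_px / width_in_pixels."""
    return true_width_m * focal_px / width_px

# Hypothetical: a vessel with a 10 m beam spans 30 px on a camera
# whose focal length is 12,000 px -> estimated range of 4,000 m
r = pinhole_range_m(10.0, 30.0, 12_000.0)
```

The learned approach generalizes this idea: instead of requiring the object's width to be known a priori, the network infers object class (and hence typical size) from appearance, with radar/AIS ranges as training labels.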

Figure 4 shows an example of a daytime camera image, in degraded visibility conditions, of a 1 m wide buoy at a range of 2 nautical miles (~3.7 km).

At this range, a 1 m object maps onto 3–4 camera pixels, adequate for recognition and range estimation. The thermal camera has lower resolution and hence a shorter range (2.7 miles).
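The pixel count can be sanity-checked from the published SeaPod specs. Assuming the 225° visible FoV is split roughly evenly across the 8 daytime cameras (~28° each) and a 16 MP sensor is about 4,900 pixels wide (both are assumptions, not published figures), the angular size of a 1 m object at 2 nautical miles divided by the per-pixel angular resolution lands in the low single digits of pixels:

```python
import math

def pixels_subtended(object_m, range_m, fov_deg, sensor_px):
    """Approximate number of pixels an object spans, assuming uniform
    angular resolution across the camera's field of view."""
    ang_deg = math.degrees(2 * math.atan(object_m / (2 * range_m)))  # angular size
    ifov_deg = fov_deg / sensor_px                                   # degrees per pixel
    return ang_deg / ifov_deg

# Assumed: one of 8 cameras covering 225/8 ~ 28 deg on a ~4,900 px wide sensor
px = pixels_subtended(1.0, 2 * 1852.0, 225 / 8, 4900)  # 2 nm = 3,704 m
```

Under these assumptions the result is roughly 3 pixels, consistent with the figure quoted above.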

According to Mr. Raviv, LiDAR (Light Detection and Ranging) is not considered a viable option due to range limitations, harsh operating environments (saline corrosion, vibration, temperature variation), cost and eye-safety issues. In fog (as in Figure 3), the thermal camera data is fused with radar to estimate range. In his opinion, existing solutions such as radar, infrared and high-resolution visible cameras can produce affordable and adequate range estimation for safety and autonomous navigation in maritime environments.

The AoT™ (Autonomy of Things) revolution is also in play for seafaring vessels, and Orca is active in moving it forward. The SeaPod platform and other shipboard sensors provide perception and situational awareness. Cloud-transmitted data from customer ships, spanning diverse weather conditions, locations and ports, is used to train AI algorithms on route planning, obstacle awareness and other maritime traffic details (similar to Mobileye’s and Tesla’s use of cloud data from their customers’ cars to develop vehicle autonomy). Figure 5 shows the levels of autonomy defined by the International Maritime Organization (IMO). They are similar to those defined by SAE (the Society of Automotive Engineers) for passenger cars and trucks on roadways.

At present, the Orca-ai platform has Level 1 capability (ADAS for ships), with human operators in full control of navigation and speed. The 2022 demonstration with NYK, in which the vessel traversed autonomously from Tokyo Bay to Tsumatsusaka port in Ise Bay on a journey spanning 40 hours, is an example of Level 3 capability.

Other Maritime Autonomy Efforts

Maritime autonomy frees the on-board human crew to engage in other critical functions on the vessel rather than watchkeeping and navigating in open seas (which is mostly uneventful). Boredom, distraction and complacency of crews are issues, as is the ability to identify dangers in bad weather and poor visibility. Another driver for autonomy is optimizing routes to reduce fuel consumption and carbon footprint.

The idea of maritime autonomy is not new, and many other companies are pursuing it. Maersk, one of the largest shipping operators in the world, collaborated with Rolls Royce to launch the world’s first remotely operated commercial vessel (a tugboat) in 2017. In an interview, then Maersk CEO Soren Skou said that he did not “expect we will be allowed to sail around with 400-meter long container ships, weighing 200,000 tons without any human beings on board”. Rolls Royce and Sea Machines announced a partnership in 2021 to develop on-board and remotely controlled autonomy for commercial shipping, pleasure yachts and military vessels. Kongsberg Maritime and Yara developed an autonomous ship which has been transporting fertilizer containers from the factory in Porsgrunn, Norway to the export port in Brevik since April 2022. Hyundai Heavy Industries has been delivering autonomy solutions for pleasure and commercial vessels since 2022.

Do LiDAR and Other Sensors Have A Role?

Maritime LiDARs are available and increasingly deployed for obstacle detection, situational awareness and autonomy. Given the vessel costs, cargo value, human lives and environmental risks, this sensor modality should be considered (even at higher cost) to address low-light conditions and the limitations of mono cameras for accurate range measurement. Challenges like harsh maritime environments are already addressed in terrestrial applications, and those learnings can be transferred. Doppler LiDAR, which in addition to 3D scene information provides the radial velocities of surrounding objects, is also worth considering. It is interesting that more sophisticated sensors like LiDAR are not yet widely deployed in maritime environments, the opposite of the autonomous car revolution a decade ago, which relied on and drove the LiDAR industry.
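The radial-velocity measurement that makes Doppler LiDAR attractive follows directly from the Doppler shift of the returned laser light: v = λ·f_d / 2, where the factor of 2 accounts for the round trip. A short sketch, with illustrative numbers for a 1550 nm coherent lidar (the example values are assumptions, not a specific product's figures):

```python
def radial_velocity_mps(wavelength_m, doppler_shift_hz):
    """Radial (line-of-sight) velocity from the measured Doppler shift
    of returned laser light: v = lambda * f_d / 2.
    The factor of 2 accounts for the round trip to the target and back."""
    return wavelength_m * doppler_shift_hz / 2

# Illustrative: a 1550 nm coherent lidar measuring a 6.45 MHz shift
# corresponds to a closing speed of about 5 m/s
v = radial_velocity_mps(1550e-9, 6.45e6)
```

For a vessel, this means a single sensor reports not only where a target is but how fast it is closing, which feeds directly into the CPA/TCPA calculations used for collision avoidance.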

Visible imagery holds significant amounts of information useful for sensing objects and their locations. However, wavelengths beyond the visible are also critical, as are optical properties like polarization. The challenge is fusing all these modalities into a compact imaging platform. Eoptic, based in Rochester, New York, aims to do this by providing multi-camera fusion using prismatic optics that enable FoV registration across different cameras (visible, NIR, SWIR, polarization, etc.).

Maritime autonomy is imminent, although fully autonomous ships with no humans on board (equivalent to L5 autonomy for automobiles and trucks) are unlikely. The risks of property, human and environmental damage are too large compared to the benefits of eliminating human labor. Ingress and egress at ports is also likely to remain in the hands of experienced human pilots who understand the geography, traffic patterns, weather and personality of the port. Partial and remotely controlled autonomy in open waters is more practical (similar to highway autonomy for trucks), and delivers significant advantages in safety, efficiency and environmental footprint. In many ways, this movement mirrors the autonomous car revolution, but ~5 years behind. As regulatory bodies get more involved, it is likely that LiDAR sensors developed for autonomous cars will be adapted and widely used for this application.
