Maritime Intelligence Platform
Two workstreams, one timeline. The data foundation and the product that depended on it — delivered together.
Global Maritime Services Provider
One of the world's largest maritime services companies, headquartered in Norway
The maritime ship-services business sits on a wealth of untapped operational data, and with growth coming from bringing together businesses and systems, the opportunity has never been greater. By establishing a unified data foundation, there is a clear path to transforming fragmented information into a strategic asset, driving smarter decisions, stronger customer outcomes, and sustainable competitive advantage.
Multiple internal platforms, none of which talked directly to the others. Data of every type was being ingested from various sources with tangled dependencies, creating a maze for data lineage. The brief was unusually direct: start from scratch and deliver something the organisation could own, extend, and use to create new business value.
Shape the client's data platform strategy by building a modern maritime services intelligence platform that replaces legacy applications, architecture, and fragmented data flows. Drive new business and new ways of working. Scale up resources as a business, data, and technology strategy partner.


We were handed a goal. The scope, the data strategy, and the architecture were ours to define.
Discovery surfaced the full picture quickly. The data warehouse needed significant structural work before it could support the queries a modern application would demand — which meant defining and executing a new data strategy in parallel with building a product that assumed that strategy was already in place. Both workstreams ran simultaneously from the start.
The lack of initial feature definition created scope pressure as requirements were shaped through real use and user feedback. But it was the design-led discovery process that surfaced the two most valuable features in the first release — neither of which appeared in the original brief.
The first was a configurable cross-fleet analysis table. Users could define the columns to match the question they were trying to answer at that moment — comparing scheduling, crew, positioning, and operational data across any selection of vessels simultaneously. The data architecture that made it possible also opened a direct path to a future where a user asks an AI a question and receives the answer as a text summary while the table configures itself in response.
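The idea can be sketched as a simple projection: the user picks a set of column definitions, and the table renders only those fields for the selected vessels. This is an illustrative model only — the type names, column keys, and fields below are hypothetical, not the client's schema.

```typescript
// Hypothetical sketch of a user-configurable cross-fleet table.
// Column keys and vessel fields are illustrative examples.
type ColumnKey = "eta" | "crewOnBoard" | "position" | "nextPort";

interface ColumnConfig {
  key: ColumnKey;   // which vessel field to show
  label: string;    // header the user sees
}

interface VesselRow {
  imo: string;
  name: string;
  // A vessel record carries many more fields; the table projects
  // only the columns the user has selected.
  [field: string]: string | number;
}

// Project a fleet selection down to the user's chosen columns.
function buildTable(
  vessels: VesselRow[],
  columns: ColumnConfig[],
): Record<string, string | number>[] {
  return vessels.map((v) => {
    const row: Record<string, string | number> = { name: v.name };
    for (const col of columns) {
      row[col.label] = v[col.key];
    }
    return row;
  });
}
```

Because the column set is data rather than code, the same mechanism could later be driven by an AI that translates a natural-language question into a column configuration.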
The second was role-based vessel favouriting — operations managers, fleet teams, and individuals each maintaining their own curated group of vessels for fast navigation across a fleet of 300. A feature that sounds simple until you consider that eight global offices and multiple management layers each have a different answer to the question: which vessels matter to me right now.
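One way to model that "which vessels matter to me right now" question is a favourites set keyed by both user and role, so the same person can keep different curated lists in different capacities. This is a minimal sketch under that assumption; the role names and store API are hypothetical.

```typescript
// Hypothetical sketch: per-user, per-role vessel favourites.
// Role names are illustrative, not the client's taxonomy.
type Role = "operationsManager" | "fleetTeam" | "individual";

class FavouriteStore {
  // Keyed by "userId:role" so each role keeps its own curated list.
  private sets = new Map<string, Set<string>>();

  private key(userId: string, role: Role): string {
    return `${userId}:${role}`;
  }

  // Add the vessel if absent, remove it if present.
  toggle(userId: string, role: Role, vesselId: string): void {
    const k = this.key(userId, role);
    const entry = this.sets.get(k) ?? new Set<string>();
    if (entry.has(vesselId)) {
      entry.delete(vesselId);
    } else {
      entry.add(vesselId);
    }
    this.sets.set(k, entry);
  }

  list(userId: string, role: Role): string[] {
    return [...(this.sets.get(this.key(userId, role)) ?? [])];
  }
}
```

Keeping the role in the key is what lets eight offices and multiple management layers each hold a different answer without stepping on one another.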
A unified full-stack web application, built as three capability layers across four concurrent workstreams:
- Vessel intelligence: everything that answers where a vessel is and what is known about it.
- Operational context: everything that answers who is accountable and how each role navigates their specific view across a fleet of 300.
- Cross-fleet analysis: structured on a data foundation designed to support AI-driven querying as the next evolution.
Product & delivery. Iterative design-led process. Backlog built from zero. Scrum ceremonies run across a multifunctional team of designers, backend developers, and data engineers. Delivery rhythm maintained across time zones for five months.
Data strategy. The client's Snowflake environment was assessed, source-to-downstream dependencies mapped, and a full data quality and governance strategy prescribed and executed. The result wasn't just a cleaner warehouse — it was a data foundation structured for future AI-driven analysis.
Technical architecture. React Router v7 with a BFF layer in front of a .NET backend as the single gateway to Snowflake. Offline capability built in from day one — vessel operations don't wait for connectivity.
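The gateway-plus-offline idea can be sketched as a single typed access point that caches the last good response and serves it when the network is unavailable. This is an assumption-laden illustration, not the client's implementation — the endpoint path and types are invented, and a production BFF would also handle auth, invalidation, and persistence.

```typescript
// Hypothetical sketch of the BFF gateway pattern: the web tier never
// talks to Snowflake directly; every request goes through one gateway
// in front of the backend. Endpoint and types are illustrative.
interface VesselSummary {
  imo: string;
  name: string;
}

type Fetcher = (url: string) => Promise<VesselSummary[]>;

// Wrap a fetcher with an offline fallback: the last successful
// response per URL is cached, so the UI stays usable when
// connectivity drops mid-operation.
function makeGateway(fetcher: Fetcher) {
  const cache = new Map<string, VesselSummary[]>();
  return async function getVessels(fleetId: string): Promise<VesselSummary[]> {
    const url = `/api/fleets/${fleetId}/vessels`;
    try {
      const data = await fetcher(url);
      cache.set(url, data); // remember the last good response
      return data;
    } catch {
      const cached = cache.get(url);
      if (cached) return cached; // serve stale data while offline
      throw new Error(`offline and no cached data for ${url}`);
    }
  };
}
```

In a React Router v7 app, a route loader would call this gateway so that every screen inherits the same offline behaviour rather than reimplementing it.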
Documentation. Extensive technical documentation produced across the engagement directly supported the client's successful path to ISO27001 certification.
100+ users across global offices moved off their workarounds within weeks of launch. Two legacy applications were consolidated and retired. The client took full ownership of the codebase with the documentation, architecture, and backlog in place to continue building — including a clear pathway for AI-powered experiences and future revenue opportunities.
The gap between what data the business had and what decisions it could accelerate narrowed by several years in five months.
Snowflake moved from passive warehouse to active business asset — with the data landscape mapped, quality improvements in place, and the foundation laid for three things the business is now building toward: downstream data products, API monetisation, and AI-powered experiences built on clean, structured, trustworthy data.
The application launched successfully to internal users, with a clear path to onboarding customers and vendors, and a data warehouse strategy built for long-term scale. Setting a new standard for how the organisation approaches digital products was an explicit expectation of the engagement — the design, engineering, and data principles delivered were built to be the foundation everything that follows is measured against.
Senior enterprise work comes with NDAs. The client and identifying details are anonymised. The problems solved, the decisions made, and the outcomes delivered are real.
