US Navy
Sector

Physical digital installations

Location

Norfolk, Virginia - USA

Project Info

In partnership with the Hampton Roads Naval Museum and Nauticus, the U.S. Fleet Forces Command has created Stewards of the Sea: Defending Freedom, Protecting the Environment. The 1,000-square-foot, highly interactive exhibit uses immersive scenarios to explore how the Navy protects marine life while fulfilling its mission.

Our studio joined as a technical partner in the project’s early stages, contributing to a range of areas including installation setup, prototyping, engineering and user testing.

 


Officials visit the exhibition

Mr. Donald R. Schregardus, Deputy Assistant Secretary of the Navy (Environment) interacts with the “protective measures” portion of an environmental exhibit at the Nauticus National Center in downtown Norfolk.

Installation Setup

We configured the hardware setup for the system that runs the installations. This includes several 42″ Elo touchscreen monitors driven by optimised PC modules.

The applications run on Windows, which supports the multi-touch drivers needed for the screen interface. To serve the maps without an internet connection, we created a custom version of a TileStream server that runs TileMill on a virtual Linux machine. Finally, to keep the web stack running smoothly on the screens, we used Google Chrome in kiosk mode.

All the data collected from the installation's usage is saved locally and is accessible to museum admins. We worked in parallel on the same system configuration to test performance and the deliverables.

Sonar Tech

The Sonar Tech interactive consists of four parts:

Learn Sound – The user learns what objects sound like underwater and can choose from various biologic and man-made sounds. You can explore some of the sounds used in the exhibit below by clicking on each photo.

Identify Sound – Once the user is familiar with the sounds, the software tests the user on the sounds they just learned. They have a time limit to recognise the sounds they hear.

Sound Composition – The third part of the interactive lets the user listen to multiple sounds at the same time. Sounds frequently overlap like this underwater, so this section lets users experience what that mixture sounds like.

Battle Station – The fourth and final part of the interactive puts the user into a Sonar Tech environment. Using passive sonar, the user listens for a diesel submarine and has a set time interval to determine whether or not a submarine is in the area. If a threat is identified, the user is prompted to use active sonar to determine its location. Once the location is determined, the user can fire and destroy the threat.
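The Battle Station flow can be sketched as a small state machine. The state names and time-limit handling here are assumptions for illustration, not the exhibit's actual code:

```javascript
// Sketch of the Battle Station flow: passive listening with a deadline,
// then active sonar, then firing. States and timing are illustrative.
function createBattleStation(timeLimitMs) {
  let state = 'LISTENING';
  const deadline = Date.now() + timeLimitMs;
  return {
    get state() { return state; },
    // Passive phase: the user decides a submarine is present.
    identifyThreat(now = Date.now()) {
      if (state !== 'LISTENING') return state;
      state = now <= deadline ? 'ACTIVE_SONAR' : 'MISSED'; // too slow = missed
      return state;
    },
    // Active phase: the user has located the threat and may fire.
    locateAndFire() {
      if (state === 'ACTIVE_SONAR') state = 'FIRING';
      return state;
    },
  };
}
```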

 

Testimonials

“This exhibit demonstrates, in an entertaining, imaginative manner, the highly technical environment in which naval personnel perform their duties. Visiting families, children and adults, will encounter scenarios where they will have to choose the correct action to take, just like actual Sailors do on a daily basis while at sea.”

Elizabeth Poulliot, Director of the Hampton Roads Naval Museum

What's Happening In The Ocean (WHITO)

The interactive is built on top of Mapbox. The map displays locations of the following kinds of ocean traffic and activity: Fishing, Commercial Shipping, Oil & Gas, Marine Mammals, and Navy.

The user is able to select and filter ocean traffic by distinct categories, and the zoom feature allows for more detailed views. Users can pan, zoom and drag both the map and the navigator. We modelled this functionality on Google Maps so that the interactions would feel familiar to users.
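The category filter can be sketched as a simple predicate over tagged features. The category labels mirror the exhibit's five traffic types, but the data shape is an assumption:

```javascript
// Sketch: filter map features by the traffic categories the user has
// switched on. Feature objects here are illustrative, not the real data.
const CATEGORIES = ['Fishing', 'Commercial Shipping', 'Oil & Gas', 'Marine Mammals', 'Navy'];

function filterTraffic(features, enabled) {
  const on = new Set(enabled);
  return features.filter((f) => on.has(f.category));
}
```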

Note: technical stuff (can skip)
The data displayed on the map is real data, provided as GIS and Illustrator files. We used TileMill to convert the GIS files to .mbtiles.

For the Illustrator files we used a different approach. We first converted them to .dwg (DraWinG) so that we could import them into ArcMap. There we added the spatial coordinate system, positioned the data within it, and exported the results as .shp – a format that TileMill can read.

Once this process was complete we had a single type of file (.mbtiles), which contains not only the map images but also a database with information for positioning those images correctly. Using a custom version of TileStream for the backend and Modest Maps for the frontend, we were able to position the images on the maps. You can see the results below.
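One detail worth noting: the MBTiles format stores tiles in a SQLite database using the TMS tiling scheme, whose row axis is flipped relative to the XYZ scheme most web maps request, so the server converts coordinates on lookup. A minimal sketch of that conversion (the TileStream-style URL template is an assumption, not the project's actual endpoint):

```javascript
// MBTiles stores rows in TMS order, flipped vertically relative to the
// XYZ scheme used in web map tile URLs; the server converts before lookup.
function xyzToTmsRow(z, y) {
  return Math.pow(2, z) - 1 - y;
}

// Illustrative tile URL for a TileStream-style endpoint; the host, port
// and tileset name are assumptions for the sketch.
function tileUrl(tileset, z, x, y) {
  return `http://localhost:8888/v2/${tileset}/${z}/${x}/${y}.png`;
}
```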