Above is a video of the final presentation we gave last week, showing a click-through of the interface. To see the presentation with notes, check out this link:
Above you see the final AR user interface concept for the trainer (icebreaker). This is all compressed down to one screen for demonstration purposes.
In the top left is the selection of the different ice-tags the trainer can choose between. These are based on the ice categorisations from “The Nautical Institute, Ice Navigator Accreditation Standard”.
The tag in the center includes the ice-type and the ice-symbol. The area of interest is shown in perspective, so the students (convoy) know which area they are looking at.
The circle in the bottom left corner is the mini-map, which shows the landscape. The mini-map has a radius of 1 nautical mile, the standard range for a normal radar, but the trainer can choose to zoom in and out. The data in the mini-map comes from the latest generation of ice-radar, which shows the ice more clearly than a normal radar.
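To make the mini-map behaviour a bit more concrete, here is a small sketch of how positions at sea could be projected onto the circular mini-map, including zoom. Only the 1-nautical-mile default range comes from the concept; the pixel radius and the example positions are my own assumptions for illustration.

```python
import math

def to_minimap(pos_nm, ship_pos_nm, map_radius_px=100, range_nm=1.0):
    """Project a world position (in nautical miles) into mini-map pixels.

    Returns None when the position falls outside the current radar range.
    """
    dx = pos_nm[0] - ship_pos_nm[0]
    dy = pos_nm[1] - ship_pos_nm[1]
    if math.hypot(dx, dy) > range_nm:
        return None  # outside the mini-map circle
    scale = map_radius_px / range_nm  # pixels per nautical mile
    return (dx * scale, dy * scale)

print(to_minimap((0.5, 0.0), (0.0, 0.0)))                 # (50.0, 0.0)
print(to_minimap((2.0, 0.0), (0.0, 0.0)))                 # None: beyond 1 nm
print(to_minimap((0.5, 0.0), (0.0, 0.0), range_nm=2.0))   # (25.0, 0.0)
```

Zooming out simply increases `range_nm`, so the same position moves closer to the center of the mini-map.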
Within the ice-radar, the user sees the ships in the convoy, and the user’s own ship is marked in yellow. The little square in the mini-map is a representation of the ice-tag in the environment, and it can be moved around. The component pointing out from the square is the ice-symbol, which the user can delete or scale up and down. It is placed on the outside so the user knows which tag is which, for instance if the user were to place many ice-tags.
This was quite a tricky part, because the little square in the mini-map doesn’t give the user much information beyond where the tag is placed. Therefore I needed to place the ice-symbol outside the mini-map. But what if there are multiple ice-tags? How should these components outside the mini-map be ordered?
I tried several alternatives, and here are a few of them:
As you can see, when all the tags are placed in front of each other there is no problem (left). But as they are moved around, the lines overlap and it becomes difficult to see which one is which (center). Therefore, if the user places multiple tags, they will be arranged so that the tag closest in the environment is also the one closest to you in the mini-map (right). This is hard to explain, but in the end I think I found a way. It could of course be solved in other ways, and inspiration can be taken from tools such as Photoshop or Illustrator, which sort components in layers:
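The ordering rule described above can be sketched in a few lines of code, assuming each tag’s distance from the own ship is known: the callout slots outside the mini-map are assigned in order of that distance. The tag names and positions below are made up for illustration.

```python
import math

# Hypothetical sketch of the ordering rule: the ice-tag closest to the
# ship in the environment gets the callout slot closest to the user.

def order_callouts(tags, ship_pos):
    """Sort ice-tags by distance from the ship.

    tags: list of (name, (x, y)) positions in nautical miles.
    ship_pos: (x, y) of the user's own ship.
    Returns tag names ordered from closest slot to farthest.
    """
    def distance(tag):
        _, (x, y) = tag
        sx, sy = ship_pos
        return math.hypot(x - sx, y - sy)

    return [name for name, _ in sorted(tags, key=distance)]

tags = [
    ("brash ice", (0.9, 0.2)),
    ("multi-year ice", (0.3, 0.1)),
    ("nilas", (0.5, 0.7)),
]
print(order_callouts(tags, (0.0, 0.0)))
# ['multi-year ice', 'nilas', 'brash ice']
```

Re-sorting on every tag move keeps the callout stack consistent with what the user sees in the environment.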
The students in the convoy have many of the same components in their UI.
But they’ll be presented with more information within each ice-tag. They can also choose to take a “screengrab”, so they can look at it later when there is downtime on the ship. This is a separate tool I have chosen not to design this round.
Additionally, as the students get better at recognising the different ice-types, they can choose to deselect the tagged ice (top left) if they feel they no longer need to see it.
Through my research and interviews, I found that in the Arctic it is important that all ships communicate well, and that they use the same language, since different languages can lead to communication failure. After reading The Nautical Institute’s “Ice Navigator Accreditation Standard”, I gained insight into what the standard “language” is and how important it is that all ships traveling in the Arctic adopt this taxonomy to understand each other. As icebreakers and ships in the convoy communicate in my concept, I saw the importance of building on these standards. Therefore, the “tags” shown to the students (in convoy) contain the standard ice-symbols, also called ice nomenclature. (read more here: http://www.aari.ru/gdsidb/docs/wmo/nomenclature/WMO_Nomenclature_draft_version1-0.pdf)
I believe it is important to use these symbols so that ships can get a symbolic representation of the ice-type, when language can be a challenge. I also think that it is important considering future communication between ships in the Arctic.
Ice class is also an important element in Arctic navigation. “Ice class denotes the additional level of strengthening as well as other arrangements that enable a ship to navigate through sea ice.” (https://en.wikipedia.org/wiki/Ice_class). It is therefore highly important that a navigator is aware of what type of ice the ship can go through, so that the ship avoids drastic accidents like the one that happened to MS Explorer in 2007. MS Explorer ran into thick ice in Antarctica, which tore a hole in the hull, and the ship sank. (https://www.aftenposten.no/verden/i/dmX8O/Her-synker-MS-Explorer)
A tag should therefore include the ice-symbol, the ice-class the ice demands, the hallmarks by which the crew can recognise the ice, and the recommended speed for breaking through it. As we learned when visiting an icebreaker, having the correct speed is crucial to avoid getting stuck.
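To make the contents of a tag concrete, here is a hypothetical sketch of the fields listed above as a small data structure. The example values (symbol code, ice-class, hallmarks, speed) are purely illustrative and not taken from any real standard document.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of what one AR ice-tag could carry.
# All example values below are illustrative assumptions.

@dataclass
class IceTag:
    ice_type: str                 # WMO ice nomenclature name
    ice_symbol: str               # code for the standard ice-symbol
    required_ice_class: str       # minimum ice-class a ship needs
    hallmarks: list = field(default_factory=list)  # visual cues for the crew
    recommended_speed_kn: float = 0.0              # speed when breaking through

tag = IceTag(
    ice_type="multi-year ice",
    ice_symbol="MY",
    required_ice_class="PC 3",
    hallmarks=["hummocked surface", "blueish colour"],
    recommended_speed_kn=5.0,
)
print(tag.ice_type, tag.required_ice_class)
```

Keeping the tag as one structured object would also make the students’ “screengrab” straightforward: it is just a saved copy of this data plus a timestamp and position.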
As my co-students Gustav and Chris found out, it is also important that the information sits above the horizon, where there is plenty of space, and does not interfere with what is happening on the ocean, i.e. the area where navigators need a clear view. Hence the vertical line between the area of interest and the rest of the information.
In this figure I am basically trying to picture what a scenario using these AR-tags could look like. Even though it’s just a super-quick sketch, I was able to spot a challenge: the tag absolutely needs to be placed at the right point in the environment. If it’s not, the convoy may end up looking at a wrongly placed tag. I want to emphasize what an important interaction component this “placing of tag” action is in the concept. Again I refer to the mini-maps I posted earlier, which I think could be a good way of solving this challenge. It is indeed a “tough nut to crack”.
Since we didn’t arrange a third field-trip on a ship soon enough, I had to find other ways of testing the AR user interface. I came across an online tool that allowed me to test where the different components of the UI should be placed, according to a person’s field of view. The tool didn’t allow me to test any gestures, but at least I got a sense of how the widgets work in combination.
I also saw that, for instance, yellow and orange on a white/blueish background don’t really work that well. Since I am designing for an environment full of ice and water, I know that the colours I choose for the UI should probably not be blue or white. Yellow didn’t work well in the test I did, and red may signal alarms or danger. Additionally, the colours should be lighter than the environment, so the AR glasses can project them. This excludes quite a big spectrum of colours, but colours like purple, pink and green are still available. This is of course not thoroughly tested, but rather based on my assumptions.
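One rough way to pre-screen candidate UI colours against a bright ice/water background is to compute WCAG-style contrast ratios. The background colour and the candidate colours below are my own assumptions, not measured values, but the sketch does reproduce the observation that yellow stands out poorly against bright ice.

```python
# WCAG relative-luminance and contrast-ratio formulas, used here as a
# rough legibility pre-screen. The "ice" background RGB is an assumption.

def relative_luminance(rgb):
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

ice_background = (225, 235, 240)  # assumed bright blue-white ice/sky
candidates = {
    "yellow": (255, 220, 0),
    "magenta": (220, 0, 160),
    "green": (0, 170, 90),
}
for name, rgb in candidates.items():
    print(name, round(contrast_ratio(rgb, ice_background), 2))
```

Yellow scores far lower than magenta or green against the assumed background, which matches what the quick test showed. Proper validation would still need real glasses in a real ice environment, since AR displays add light rather than paint pixels.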
Since the tests with the hologram landscape didn’t quite work in augmented reality, I researched other ways of showing a landscape. So-called “minimaps” are often used in games; they let users know where in the landscape they are, and can prepare them for their next action, whether it is taking down an enemy or just moving to a different place.
After testing the landscape in the HoloLens, I added “tags” to see if I could use the landscape to move these tags around. The tags are not moveable, but at least I could pretend that I grabbed them.
The video shows me trying to grab the tags, but I concluded that this did not work very well. First of all, my hands ended up behind the tags, which made them very difficult to grab. Secondly, it was difficult to see the whole landscape. This is because the landscape appears in perspective: the eye sees most of what is in front, but has difficulty seeing what is at the back of the hologram. The user therefore loses an overview of half the landscape.
In the convoy scenario, I defined the icebreaker crew as the trainer, and the rest of the ship convoy as students. Sketch 1 shows a quick sketch of how a trainer UI could look in AR. The sketch shows the tag, a menu for the various ice types, and a tool for placing the tags in the environment.
I made this tagging tool in Unity and put it onto the HoloLens. In Figure 1, the tagging tool is intended to visualise the landscape around the ship as a 3D hologram. It is in a way inspired by Rolls-Royce’s future ship bridge concept, where the ship can be seen as a hologram (picture below). A hologram can give you a see-through, detailed reflection of a physical object or a landscape, which can be good in a context where you need to see details clearly.
In Figure 2 I wanted to see if it was easier to make out the details of the landscape when the shape had no background. I also tested whether it is better to look at the landscape in a square.
AR opens the possibility of adding an additional layer of information. The previous blog post shows a sketch of a person who looks out on the ice and gets information about what kind of ice they are looking at. I believe that “tagging” the ice is an interesting way of using AR technology for something useful. Figure 1 shows an example of a scenario in which a ship travels in the Arctic, and on its way the crew can see what type of ice they are navigating through by wearing AR glasses. But who has laid out the “AR-tags” in the first place? One can imagine that any ship can “tag” the ice, so that other ships taking the same route can see the tags. But this creates problems, first and foremost because Arctic ice is constantly moving, and also because it is very doubtful that all ships have enough knowledge to tag the ice. Someone who is an expert in the field needs to tag the ice.
Icebreakers are the ships with the greatest knowledge of ice-filled waters. After visiting two different icebreakers, we were impressed with the vast knowledge icebreaker crews have about ice and navigation. Therefore, I have reason to believe that icebreakers are the most suitable ships to lay out the tags in the Arctic.
But in which scenario should this “tagging” take place? As mentioned earlier, some of the icebreakers’ main tasks are to keep passages in the ice open for other ships, and to assist ships that have difficulty getting through the ice. I think it’s a good idea to choose a frequent scenario, so that as many ships as possible can access the valuable information the icebreakers have.
Figure 2 shows a scenario where the icebreaker has placed ice-tags in the environment, which are also visible to the rest of the convoy. This becomes a kind of learning scenario, where the icebreaker acts as a teacher and the convoy acts as students. This scenario also ensures that the AR-tags contain information from experts, and that the ice will not have had time to move much.
We know that ships that are going to travel in the Arctic need to learn more about the ice conditions there. The Polar Code is a new measure addressing this challenge. It has rules on the design and construction of ships, equipment, operational conditions, environmental protection and training. (https://www.sjofartsdir.no/aktuelt/nyheter/fra-nyttar-gjelder-de-nye-reglene-for-skipsfart-i-polare-farvann/) From January this year, all ships wishing to travel through the Arctic have to pass a training course on ice and navigation.
This course requires participants to complete a theoretical part and a practical part in a simulator. But is this enough? When I interviewed a sailor who also teaches sailors in a simulator, he told me that sailors often get quite a surprise when they go out on a real ship in real environments. One reason is that the simulators do not represent the actual look of ice-filled waters. First of all, today’s simulators do not have enough processing capacity to create realistic ice environments, and secondly, it is extremely difficult to visualise all the different ice types in a realistic manner.
Therefore, I think there is a need for ships to learn about ice conditions in the Arctic while they are actually in the Arctic. I want to make a concept for how ships can use augmented reality as a tool to learn about Arctic ice conditions. Augmented reality opens up the possibility of adding a filter of information to the environment. Figure 1 shows a sketch of a sailor looking at the ice type multi-year ice. By using AR glasses, he gets information about what ice type he is looking at. I wish to build on this concept this semester.