
IMMERSIVE EXPERIENCES

 

At Samsung, I worked on a variety of projects, including improving the Tizen UX, contributing to My Galaxy, the Social Camera project, the Emotify project, and AR Doodle. While some of these shipped to market, others were innovation research projects; this case study covers one such project.

Since Samsung R&D Institute Bangalore (SRIB) is a research and development centre, there is always ongoing research into new and emerging technologies.

As part of the Product Innovation team, my day-to-day work involved exploring how these emerging technologies could create meaningful experiences for our target consumers. The following is a documentation of that work.

Goal

1. Decoding immersive experiences

2. Contributing to SRIB design strategy

My Role 

Senior Designer

(Research, Ideation, Visual Design, Motion Design, Interaction Design, UX Story Building, Pitch Creation, Strategy, Innovation)

Tools

After Effects, Adobe Premiere Pro, Adobe Photoshop, Adobe Dimension, Paper, Pens, Sticky Notes, Microsoft PowerPoint

Project Team 

3 designers

1 Design manager 

The Challenge 

The brief given to us was to explore the areas of AR and VR and how they could add value to Samsung mobile phone users.

Methodology

Secondary Research

Social and tech trend research was conducted to understand the market for immersion. This helped provide a landscape of the existing enablers and influencers.

These trends were documented and later used to create an ecosystem map for immersion.

Primary  Research

One-on-one semi-structured interviews were conducted with 6 Gen Z and 8 millennial participants in Bangalore. A focus group study was also conducted. The objective of the research was to gauge how people understood and interpreted immersion in daily life.

fig: Process overview: Secondary research (social/tech trend research, articles, case studies, research papers, existing products) → Primary research (one-on-one interviews, focus group) → Synthesis (transcribing, clustering, reframing, persona creation) → Ideation (sketching, storyboarding, brainstorming) → Final pitch (story building, hi-fidelity prototypes, video explainer, pitch deck preparation)

Due to a non-disclosure agreement with the company, confidential information has been altered or omitted. The information included here is my own and does not reflect the views of Samsung.

Ecosystem Mapping 

The map brought together insights, social trends, tech enablers and the associated companies.

fig: Ecosystem map for immersion

The six themes of immersion that emerged from the research:

- Stimulate your senses
- Experience the extreme
- Uninhibited exploration
- Catharsis of equilibrium
- Feel a deep connection
- Living your imagination

Personas 

fig: Overview of personas

Ideation 

Before ideation, I had to look at the constraints. One of the key constraints was that I had to create concepts specifically for the Samsung mobile experience.

The other constraint was technical feasibility. Since the work was meant to be as explorative as possible, we decided simply to stay mindful of the technical issues that could arise.

The third constraint was not having access to a VR/MR device to get more context while ideating.

For the ideation phase, I used sketching and bodystorming to come up with ideas. Each idea became a sticky note on the wall, and as a team we used the Rose, Thorn, Bud method to filter down to the final concepts.

The selected ideas were further detailed using storyboarding.

 User Insights 

Affinity mapping was used to group all the interviews into 6 key insights for the research question: what is immersion to people, and how do they interpret it in daily life?

fig: An overview of the key insights

Concept 1:

Ability to pick things from the environment and create an augmentation filter.

USER PERSONA: Experience Explorer

USER NEED: Uninhibited Exploration, Stimulate Your Senses

POSSIBLE TECHNICAL REQUIREMENTS: Image recognition, surface detection, surface mapping, depth sensing, data processing

ezgif.com-gif-maker.gif

Using the sensors in the phone camera, the system detects objects within the user's space and their textures. If an object is too close to or too far from the camera, the user is alerted with a small live notification.

ezgif.com-gif-maker.gif

To check whether a texture can be selected, the user simply taps the camera screen. The system gives relevant feedback to confirm the selection.
3.gif

A long-press interaction allows users to select a texture. The system gives feedback to indicate that the texture has been picked and stored. Users can pick multiple textures in a single session.

4.gif

Once the user has finished picking textures, they can switch the camera to selfie mode with a simple swipe up and, with a tap, apply a texture to themselves from the repository of textures picked earlier, or explore and apply the textures to anything else.
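To make this pick-and-apply flow concrete, here is a minimal Kotlin sketch of the session state behind it. The Texture and CameraMode types and the handler names are hypothetical stand-ins for illustration, not Samsung's actual implementation.

```kotlin
// Illustrative sketch of the Concept 1 interaction state; Texture, CameraMode and
// the handler names are hypothetical stand-ins, not a real Samsung API.
enum class CameraMode { REAR, SELFIE }

data class Texture(val id: String, val sourceObject: String)

class TexturePickerSession {
    private val picked = mutableListOf<Texture>()   // repository of textures picked this session
    var mode: CameraMode = CameraMode.REAR
        private set

    // Long-press on a detected object picks its texture and stores it.
    fun onLongPress(texture: Texture): String {
        picked += texture
        return "Picked '${texture.sourceObject}' (${picked.size} stored)"   // feedback shown to the user
    }

    // Swipe up switches the camera to selfie mode.
    fun onSwipeUp() { mode = CameraMode.SELFIE }

    // Tap applies one of the stored textures to the current subject (self or any object).
    fun onTap(index: Int): Texture? = picked.getOrNull(index)
}
```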

Concept 2:

Ability to turn day-to-day objects and surfaces into something delightful or magical and interact with them.

USER PERSONA: Experience Explorer, Eager Bieber

USER NEED: Uninhibited Exploration, Stimulate Your Senses, Living Your Imagination

POSSIBLE TECHNICAL REQUIREMENTS: Surface detection, real-time image processing, simultaneous localisation and mapping (SLAM), surface tracking, multimodal large language models

1(a).gif

Users tap on a surface to make a selection. The system uses surface detection to map the surface and then, using AI, augments it with something the system finds relevant. Users can keep tapping to get multiple augmentations.
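A tap-to-augment flow like this is typically built on plane hit-testing. The sketch below shows one way it could be wired up using ARCore-style calls (Frame.hitTest, Plane, Anchor); pickAugmentationFor is a hypothetical stand-in for the AI relevance step described above, not part of any real API.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane

// Hypothetical stand-in for the AI step that decides what to augment on a surface.
fun pickAugmentationFor(plane: Plane): String = "confetti"

// On each tap, hit-test the current camera frame against detected planes and
// anchor an augmentation where the tap lands. Repeated taps add more augmentations.
fun onSurfaceTapped(
    frame: Frame,
    xPx: Float,
    yPx: Float,
    placed: MutableList<Pair<Anchor, String>>
) {
    val hit = frame.hitTest(xPx, yPx).firstOrNull { result ->
        val plane = result.trackable as? Plane
        plane != null && plane.isPoseInPolygon(result.hitPose)
    } ?: return   // the tap did not land on a mapped surface

    placed += hit.createAnchor() to pickAugmentationFor(hit.trackable as Plane)
}
```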

2.gif

The system uses techniques like object tracking, occlusion mapping and depth sensing to give users an interactive experience with objects: when an object is tapped via the camera, the system intelligently adds a relevant augmentation to create a playful experience.
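As a small illustration of the occlusion idea mentioned above (the depth lookup is a hypothetical stand-in for a depth-map sample, not a specific camera API):

```kotlin
// A virtual pixel should be hidden when the real scene is closer to the camera
// than the augmentation at that pixel; this is the core of depth-based occlusion.
fun isVirtualPixelVisible(
    realDepthMetersAt: (x: Int, y: Int) -> Float,   // depth of the real scene at a screen pixel
    x: Int,
    y: Int,
    virtualDepthMeters: Float                       // depth of the rendered augmentation at that pixel
): Boolean = virtualDepthMeters <= realDepthMetersAt(x, y)
```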

Concept 3:

Ability to create multiple versions of self or others.

USER PERSONA: Experience Explorer, Eager Bieber, Connect Nurturer

USER NEED: Feel a Deep Connection, Living Your Imagination

POSSIBLE TECHNICAL REQUIREMENTS: Real-time image segmentation, simultaneous localisation and mapping (SLAM)

1.gif

When users tap the augmenter button, the system captures a 4-second boomerang video.

2.gif

While capturing the video, the system also separates the background from the foreground and stores only the foreground information (the object or person) for those 4 seconds. Using AI, the system then creates an augmentation that is an exact replica of the captured 4-second clip and places it in the same scene.

3.gif

This process can be repeated as many times as the user taps the augmenter button, creating multiple versions of objects and people played on loop.
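A rough Kotlin sketch of this capture-and-replay mechanic is below. The CapturedFrame type, the segmentForeground call and the frame rate are illustrative assumptions of mine, not the actual prototype pipeline.

```kotlin
// Illustrative types; a real pipeline would use camera frames and a segmentation model.
data class CapturedFrame(val pixels: IntArray)
data class ForegroundCutout(val pixels: IntArray, val alphaMask: ByteArray)

// Hypothetical segmentation step: keeps only the foreground (person/object) with an alpha mask.
fun segmentForeground(frame: CapturedFrame): ForegroundCutout =
    ForegroundCutout(frame.pixels, ByteArray(frame.pixels.size))

class AugmenterClone(private val fps: Int = 30, private val seconds: Int = 4) {
    private val clip = mutableListOf<ForegroundCutout>()

    // Capture roughly 4 seconds of frames, segmenting the foreground of each one.
    fun capture(frames: Sequence<CapturedFrame>) {
        clip.clear()
        frames.take(fps * seconds).forEach { clip += segmentForeground(it) }
    }

    // Boomerang playback: forward then reversed, looped forever; the renderer
    // composites each cutout back into the live scene as an anchored augmentation.
    fun loopedPlayback(): Sequence<ForegroundCutout> = sequence {
        while (true) {
            yieldAll(clip)
            yieldAll(clip.asReversed())
        }
    }
}
```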

USE CASE

usecase.gif

Use case: using it in messaging, where users could express multiple emotions or more ideas through this multiplicity.

usecase2.gif

Use case: using it to create memes and other post-worthy material.

Concept 4:

As this was explorative design-fiction work, I also took the brief a little further and thought of ideas that could be possible if there were no constraints at all and users could fully immerse themselves in a virtual space. This concept made it to the final pitch deck.

USER PERSONA: Experience Explorer, Solace Seeker

USER NEED: Stimulate Your Senses, Catharsis of Equilibrium, Uninhibited Exploration

Note: click the volume button on each of the videos to activate the audio.

1.gif

Normal view when the MR/VR headset is not worn by the user.

2.gif

View when the MR/VR headset is worn by the user. The system captures ambient sound and converts it into an augmented visualisation based on parameters that quantify the sound.
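As a minimal sketch of what "parameters that quantify the sound" could look like (the specific features and mappings here are my own illustrative assumptions, not the prototype's actual logic):

```kotlin
import kotlin.math.sqrt

// Visual parameters that drive one augmented "sound sculpture" in the scene.
data class SoundVisual(val scale: Float, val hueDegrees: Float, val jaggedness: Float)

// Quantify a short buffer of audio samples (-1..1) into visual parameters:
// loudness drives size, a crude pitch estimate drives colour, and the
// zero-crossing rate drives how "wiry" the visual looks.
fun quantifySound(samples: FloatArray, sampleRate: Int): SoundVisual {
    val rms = sqrt(samples.map { it * it }.average()).toFloat()
    val zeroCrossings = samples.toList().zipWithNext().count { (a, b) -> a * b < 0f }
    val roughPitchHz = zeroCrossings * sampleRate / (2f * samples.size)
    return SoundVisual(
        scale = 0.2f + 2f * rms,
        hueDegrees = roughPitchHz % 360f,
        jaggedness = zeroCrossings.toFloat() / samples.size
    )
}
```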

The augmented visual of the sound can be interacted with using certain interaction parameters, either to invoke the augmentation (a simple tap in the air) or to change the angle from which it is viewed (a finger swipe or a palm swipe).

Users can also add layers of sound augmentation on top of an existing augmentation by simply double-tapping on it. In the video prototype, the initial sound augmentation was a green visualisation of just the bird chirping; the double tap triggered a new augmentation of the second sound, the crickets, depicted as a "wiry" black visual.

Users can then manipulate the existing augmented sound to create new soundscapes and visualisations using various interaction paradigms. In the video, the user uses both hands to change the existing augmentation into something else, thereby also affecting the sound.

Users can also invite other users to work and play together on a soundscape augmentation.

Outcome 

The goal of the project was to decode immersive experiences and contribute to the SRIB design strategy, and I can safely say that these goals were achieved.

 

The research and concept pitch deck helped create an AR vision team at Samsung R&D Institute, Bangalore.

The team would later go on to collaborate with Samsung Korea on the ideation, conceptualisation and building of AR experiences for Samsung mobile, work that I got the opportunity to be part of.

I also received the Samsung Spot Award for contributions to AR/MR/VR for the team.

I was later chosen as one of the employees to travel to Korea to pitch and discuss AR concepts with design counterparts at Samsung Korea.

Key Contributions  

 ​

- Pioneering exploratory research, focus groups and user interviews

- Synthesizing results and conducting a persona-development workshop with the design team

- Generating possible ideas and directions based on the insights

- Defining possible interaction paradigms for the concepts, keeping in mind the restrictions of the device they were ideated for

- Creating video prototypes

- Building the pitch deck for the final story and the final concept proposals

- Stakeholder management

Learnings

1. One of the biggest challenges was not having actual AR/VR glasses to prototype with, or the technical expertise to realise the concepts and test them. A big learning was figuring out workarounds and making the best of the situation.

2. This project had a lot of ambiguity. At several stages, questions about where it was really heading or what the outcome would be came up, and the greatest learning for me was to keep a constant point of contact with the design managers about the progress of such a project so that they could provide direction and pointers for next steps.

3. Often it is hard to see the larger picture when we are so focused on figuring out the outcome. For a project like this, it was very important to keep that larger picture in mind to motivate others and stay motivated to continue working on it.

4. Another learning for me was figuring out which UX tools are relevant and which are not, based on the goal of the project.

5. Building out the concepts would also mean defining the parameters of an ideal AR/VR/MR experience and the ideal interaction paradigms; these could have been looked into as the way forward.


