This project is an interactive AR experience in which users can display chosen images, in this case a portfolio, on a marker.
The marker anchors the hologram's position: wherever the user moves the marker, the displayed images move along with it, retaining their position relative to the marker.
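The idea of keeping the images locked to the marker can be sketched as a small transform: the image stores a fixed offset in the marker's local space, and each frame that offset is rotated and translated by the marker's current pose. The function name and the flat 2D (yaw-only) simplification below are my own assumptions for illustration; a real AR framework would supply a full 3D pose.

```python
import math

def marker_to_world(marker_pos, marker_yaw, local_offset):
    """Transform an image's fixed offset in marker space into world space.

    marker_pos: (x, z) marker position on the ground plane
    marker_yaw: marker rotation around the vertical axis, in radians
    local_offset: (x, z) image offset relative to the marker
    """
    c, s = math.cos(marker_yaw), math.sin(marker_yaw)
    ox, oz = local_offset
    # Rotate the offset by the marker's yaw, then translate by its position.
    wx = marker_pos[0] + c * ox - s * oz
    wz = marker_pos[1] + s * ox + c * oz
    return (wx, wz)

# Re-evaluating this every frame keeps the image glued to the marker
# no matter where the user moves or turns it.
print(marker_to_world((1.0, 2.0), math.pi / 2, (0.5, 0.0)))
```

Because only the marker pose changes between frames, the image never drifts: its world position is always derived fresh from the marker rather than accumulated.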
Not only are the images displayed as a hologram, but users can also interact with them with their hands, through position and gestures.
Hand tracking is an area of AR that has gradually gained popularity, and it was a key component of this project. Using the tracking data retrieved each frame, a range of interactions can be implemented.
For this project, I decided to focus on position collisions and object scaling.
Position collision turned out to be an interesting piece of functionality. Positional data from specific parts of the hand, such as the thumb tip, is used to trigger an interaction once it comes within proximity of an object. This functionality combines proximity data with data produced through ray casting.
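The two checks described above can be sketched as plain geometry: a proximity test compares the distance between a tracked joint and an object's centre against a trigger radius, while a ray cast tests whether a ray (for example, along a pointing finger) passes close enough to the object. The function names, the trigger radius, and the unit-direction assumption below are illustrative, not taken from any particular AR SDK.

```python
import math

def within_proximity(tip_pos, obj_center, radius=0.05):
    """True when a fingertip is inside the object's trigger sphere (5 cm assumed)."""
    return math.dist(tip_pos, obj_center) <= radius

def ray_hits_sphere(origin, direction, center, radius):
    """Ray cast against a sphere; `direction` is assumed to be unit length."""
    to_center = [c - o for c, o in zip(center, origin)]
    # Distance along the ray to the point closest to the sphere's centre.
    t = sum(a * b for a, b in zip(to_center, direction))
    if t < 0:
        return False  # sphere is behind the ray origin
    closest = [o + t * d for o, d in zip(origin, direction)]
    return math.dist(closest, center) <= radius

# Hypothetical per-frame values; real ones would come from the hand tracker.
thumb_tip = (0.10, 1.20, 0.48)
image_center = (0.12, 1.21, 0.50)
if within_proximity(thumb_tip, image_center):
    print("hover: highlight the image")
```

Combining both signals, as the text suggests, reduces false triggers: the ray cast picks out which object the hand is aimed at, and the proximity check confirms the hand is actually close enough to touch it.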
With the data gathered from hand tracking, a pinch gesture can be detected, and that gesture can then be used to make the selected image smaller or larger.
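A minimal sketch of that pinch-to-scale logic, assuming the tracker reports thumb-tip and index-tip positions: a pinch is detected when the two tips are closer than a small threshold, and while pinching, the image's scale follows the ratio of the current tip separation to the separation when the pinch began. The 2 cm threshold and function names are assumed values for illustration.

```python
import math

PINCH_THRESHOLD = 0.02  # metres between thumb and index tips (assumed)

def is_pinching(thumb_tip, index_tip):
    """Detect a pinch from the distance between two tracked fingertips."""
    return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD

def scaled_size(base_size, start_separation, current_separation):
    """Scale the selected image by how far the fingers have spread or closed
    since the gesture started."""
    if start_separation <= 0:
        return base_size  # guard against degenerate tracking data
    return base_size * (current_separation / start_separation)

# Spreading the fingers to twice the starting separation doubles the image.
print(scaled_size(1.0, 0.10, 0.20))
```

Scaling from the ratio against the gesture's starting separation, rather than the raw distance, keeps the image from jumping the moment the pinch is recognised.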