Abstract: A product is traditionally experienced in XR mainly through sight, which accounts for roughly 80% of sensory input, and this reliance on a single sense limits how immersive the experience of the product can be. In the present disclosure, a physical object's form factor guides a user's gestures in carrying out actions on an augmentation: a suitably chosen physical object serves as a marker that mimics physical touch interaction with a virtual product. Leveraging the sense of touch, the user gains a deeper understanding of the virtual product, since the product responds to the user's natural gestures. In accordance with the present disclosure, transform data in the form of a position, an orientation, and a scale associated with the physical object, and with the user's hands handling the physical object, are recorded to identify an associated gesture. Using a gesture-to-action mapping, the user's action is translated to the virtual product, and the augmentation is modified in real time to provide an immersive experience.
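The pipeline described above (record transforms, identify a gesture, map it to an action on the augmentation) can be sketched as follows. This is a minimal illustrative sketch, not the disclosed implementation: the `Transform` type, the threshold values, and the `GESTURE_TO_ACTION` table are all hypothetical names and parameters chosen for the example.

```python
import math
from dataclasses import dataclass
from typing import List

@dataclass
class Transform:
    """Transform data for one tracked entity: position, orientation, scale."""
    position: tuple     # (x, y, z) in metres
    orientation: tuple  # unit quaternion (w, x, y, z)
    scale: float

def _rotation_angle(q1, q2):
    """Rotation angle in radians between two unit quaternions."""
    dot = abs(sum(a * b for a, b in zip(q1, q2)))
    return 2.0 * math.acos(min(1.0, dot))

def classify_gesture(frames: List[Transform]) -> str:
    """Name a gesture by comparing the first and last recorded transforms.

    Thresholds are illustrative: ~0.2 rad (about 11 degrees) of rotation
    counts as a rotate gesture; 2 cm of travel counts as a translation.
    """
    first, last = frames[0], frames[-1]
    moved = math.dist(first.position, last.position)
    turned = _rotation_angle(first.orientation, last.orientation)
    if turned > 0.2:
        return "rotate"
    if moved > 0.02:
        return "translate"
    return "idle"

# Hypothetical gesture-to-action mapping: each recognised gesture names
# the change applied in real time to the augmentation (virtual product).
GESTURE_TO_ACTION = {
    "rotate": "rotate virtual product",
    "translate": "move virtual product",
    "idle": "leave augmentation unchanged",
}

# Two recorded frames of the physical object: same position, but the
# orientation has turned roughly 45 degrees about the y axis.
frames = [
    Transform((0.0, 0.0, 0.5), (1.0, 0.0, 0.0, 0.0), 1.0),
    Transform((0.0, 0.0, 0.5), (0.924, 0.0, 0.383, 0.0), 1.0),
]
action = GESTURE_TO_ACTION[classify_gesture(frames)]
print(action)  # -> rotate virtual product
```

In a real system the classifier would consume continuous tracking streams for both the object and the hands, but the structure, recorded transforms in, a named action on the virtual product out, follows the flow described in the abstract.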