Patents by Inventor Vinod A. Bijlani

Vinod A. Bijlani has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20100114660
    Abstract: A method for making an inference based on cumulative data. The method utilizes video, audio, and biometric devices to observe a retail environment for the presence of a customer. Once a customer is present, the method identifies every cohort to which the customer corresponds. Next, the method observes the customer as they peruse aisles in the retail environment. When the customer selects a product, the method identifies the selected product and searches the cohorts for alternate products to offer the customer. The method offers one alternate product to the customer and records to the cohorts whether the customer thereafter accepts the method's offer and selects the alternate product or rejects the method's offer and continues perusing the retail environment aisles. The method continues observing the customer and offering alternate products until the customer leaves the retail environment.
    Type: Application
    Filed: November 5, 2008
    Publication date: May 6, 2010
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Robert Lee Angell, Vinod A. Bijlani, Jack Chen, Robert R. Friedlander, James R. Kraemer, Le Gang Wu
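As an illustrative sketch only (not the patented method), the observe-offer-record loop described in the abstract above could be modeled in Python. The cohort table, product names, and the acceptance callback are all hypothetical stand-ins for the video/audio/biometric observation the patent actually describes:

```python
from dataclasses import dataclass

# Hypothetical cohort table: cohort name -> {selected product: alternate product}.
COHORT_ALTERNATES = {
    "frequent_shopper": {"soda": "sparkling water"},
    "health_conscious": {"chips": "trail mix", "soda": "juice"},
}

@dataclass
class Cohort:
    name: str
    acceptances: int = 0
    rejections: int = 0

def offer_alternates(customer_cohorts, selections, accepts):
    """For each product the customer selects, search the customer's cohorts
    for an alternate product, offer exactly one, and record to the cohort
    whether the offer was accepted or rejected."""
    cohorts = {name: Cohort(name) for name in customer_cohorts}
    offers = []
    for product in selections:
        for name in customer_cohorts:
            alternate = COHORT_ALTERNATES.get(name, {}).get(product)
            if alternate is not None:
                accepted = accepts(product, alternate)
                offers.append((product, alternate, accepted))
                if accepted:
                    cohorts[name].acceptances += 1
                else:
                    cohorts[name].rejections += 1
                break  # offer only one alternate per selection
    return offers, cohorts
```

Here `accepts` stands in for observing the customer's real response; the loop runs until the selection stream ends, mirroring the "until the customer leaves" condition.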
  • Patent number: 7685023
    Abstract: An interactive virtual model of a physical storefront can be presented to a shopper within a user interactive interface of a computing device remotely located from the physical storefront. At least a portion of the organizational structure of the interactive virtual model can be identical to a portion of the organizational structure of the physical storefront. The organizational structure can vary from storefront to storefront of different physical storefronts, each being related to a different interactive virtual model. A change involving at least one physical object within the physical storefront can be sensed. Responsive to sensing the change, a virtual object presented in the interactive virtual model can be changed so that the change to the physical object occurring in the physical storefront is reflected in the interactive virtual model and is shown in the user interactive interface.
    Type: Grant
    Filed: December 24, 2008
    Date of Patent: March 23, 2010
    Assignee: International Business Machines Corporation
    Inventors: Subil M. Abraham, Vinod A. Bijlani, Mathews Thomas
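A minimal sketch of the sensed-change propagation described above, assuming a change event carries an object id and its new state (both names and the state representation are illustrative, not taken from the patent):

```python
class VirtualStorefront:
    """Interactive virtual model mirroring a physical storefront's layout."""

    def __init__(self, layout):
        # layout: object id -> state, mirroring a portion of the physical
        # storefront's organizational structure.
        self.objects = dict(layout)

    def on_physical_change(self, object_id, new_state):
        """Responsive to a sensed change to a physical object, update the
        matching virtual object so the model reflects the physical store."""
        self.objects[object_id] = new_state
```

In a real system the sensing side (cameras, shelf sensors, inventory feeds) would publish these events; the virtual model only needs to apply them.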
  • Patent number: 7562816
    Abstract: The present invention includes a method for providing a robust user experience with an automated system that includes sensory output, such as touch, taste, and/or smell as well as visual output. The automated system can be an e-commerce or automated shopping system. The method can visually present a user selected item for consumer purchase within a graphical user interface (GUI). Sensory output can be produced that simulates how the selected item smells, feels, and/or tastes. A user can then make a selection in the GUI to modify an aspect of the selected item and/or an environmental condition for an environment of the selected item. The visual presentation of the item and the sensory output for the item can be adjusted in accordance with the user specified modification.
    Type: Grant
    Filed: December 18, 2006
    Date of Patent: July 21, 2009
    Assignee: International Business Machines Corporation
    Inventors: Subil M. Abraham, Vinod A. Bijlani, Mathews Thomas
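As a toy illustration of adjusting sensory output when the user modifies an environmental condition in the GUI (the intensity model, item names, and parameters are invented for this sketch, not drawn from the patent):

```python
def sensory_output(item, environment):
    """Return a simulated smell intensity (0..1) for a selected item,
    scaled by a user-adjustable environmental temperature setting."""
    base_intensity = {"coffee": 0.6, "soap": 0.3}.get(item, 0.1)
    # Hypothetical model: aroma strengthens with temperature, capped at 40 C.
    scale = min(environment.get("temperature_c", 20) / 40, 1.0)
    return round(base_intensity * scale, 3)
```

When the shopper raises the temperature slider in the GUI, the system would recompute this value and drive the scent-output hardware accordingly.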
  • Publication number: 20090079833
    Abstract: The present solution can include a method for allowing the selective modification of audio characteristics of items appearing in a video. In this method, an RFID tag can be loaded with audio characteristics specific to a sound-producing element. The RFID tag can then be attached to an item that corresponds to the sound-producing element. The video and audio of the area including the item can be recorded. The audio characteristics can be automatically obtained by scanning the RFID tag. The audio characteristics can then be embedded within the video so that the audio characteristics are available when the item appears in the video.
    Type: Application
    Filed: September 24, 2007
    Publication date: March 26, 2009
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Subil M. Abraham, Vinod A. Bijlani, Mathews Thomas
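A rough sketch of the embedding step, assuming the video is modeled as per-second sets of visible item ids and the scanned tag data as plain dictionaries (both representations are assumptions, not the patent's):

```python
def embed_audio_characteristics(frames, tag_reads):
    """Attach tag-supplied audio characteristics to each video second in
    which the tagged item appears, so those characteristics are available
    for selective modification whenever the item is on screen.

    frames: list of sets of visible item ids, one set per second.
    tag_reads: item id -> audio characteristics scanned from its RFID tag.
    """
    track = []
    for visible in frames:
        entry = {item: chars for item, chars in tag_reads.items()
                 if item in visible}
        track.append(entry)
    return track
```

An editor could then look up second *t* in the returned track and alter, say, the guitar's frequency response without touching the rest of the mix.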
  • Publication number: 20080143481
    Abstract: The present solution can include a method for embedding information contained within an RFID tag into a video. An RFID tag can be loaded with item information specific to a product item. The RFID tag can then be attached to a physical item corresponding to the product item. While the physical item is being recorded, the RFID tag can be simultaneously scanned to obtain the item information automatically. The item information can then be embedded within the video at the time when the physical item is present. A user can view and interact with the generated video in a manner that permits the user to selectively view and/or otherwise utilize the embedded item information. For example, a user can opt to purchase an item appearing in the interactive video.
    Type: Application
    Filed: December 18, 2006
    Publication date: June 19, 2008
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Subil M. Abraham, Vinod A. Bijlani, Mathews Thomas
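A toy model of the timestamp-keyed embedding and viewer-side lookup, with invented data shapes (appearance intervals in whole seconds, tag info as dictionaries; the patent does not specify these):

```python
def embed_item_info(appearances, tag_info):
    """Build a lookup from video timestamp to embedded item information,
    available only while the tagged physical item is on screen.

    appearances: item id -> (start_second, end_second) inclusive interval.
    tag_info: item id -> information scanned from the item's RFID tag.
    """
    embedded = {}
    for item, (start, end) in appearances.items():
        info = tag_info.get(item)
        if info is None:
            continue  # item was recorded but carried no scanned tag
        for t in range(start, end + 1):
            embedded.setdefault(t, {})[item] = info
    return embedded

def item_info_at(embedded, t, item):
    """What a viewer pausing at second t can retrieve for an item,
    e.g. to opt to purchase it from within the interactive video."""
    return embedded.get(t, {}).get(item)
```

Outside the item's on-screen interval the lookup returns nothing, matching the abstract's "at the time when the physical item is present".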
  • Publication number: 20080147515
    Abstract: The present invention includes a method for providing a robust user experience with an automated system that includes sensory output, such as touch, taste, and/or smell as well as visual output. The automated system can be an e-commerce or automated shopping system. The method can visually present a user selected item for consumer purchase within a graphical user interface (GUI). Sensory output can be produced that simulates how the selected item smells, feels, and/or tastes. A user can then make a selection in the GUI to modify an aspect of the selected item and/or an environmental condition for an environment of the selected item. The visual presentation of the item and the sensory output for the item can be adjusted in accordance with the user specified modification.
    Type: Application
    Filed: December 18, 2006
    Publication date: June 19, 2008
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Subil M. Abraham, Vinod A. Bijlani, Mathews Thomas