Patents by Inventor Rupa Chaturvedi

Rupa Chaturvedi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11922489
    Abstract: A camera is used to capture image data of representations of a physical environment. Planes and surfaces are determined from a representation. The planes and the surfaces are analyzed using relationships therebetween to obtain shapes and depth information for available spaces within the physical environment. Locations of the camera with respect to the physical environment are determined. The shapes and the depth information are analyzed using a trained neural network to determine items fitting the available spaces. A live camera view is overlaid with a selection from the items to provide an augmented reality (AR) view of the physical environment from an individual location of the locations. The AR view is enabled so that a user can port to a different location than the individual location by an input received to the AR view while the selection from the items remains anchored to the individual location.
    Type: Grant
    Filed: February 11, 2019
    Date of Patent: March 5, 2024
    Assignee: A9.com, Inc.
    Inventors: Rupa Chaturvedi, Xing Zhang, Frank Partalis, Yu Lou, Colin Jon Taylor, Simon Fox
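The abstract above (patent 11922489) describes recovering available spaces from detected planes and surfaces, fitting items into those spaces, and keeping a placed item anchored while the viewer ports between camera locations. The Python below is a minimal illustrative sketch, not the patented implementation: the Space, CatalogItem, items_that_fit, and view_of_anchor names are hypothetical, free spaces are reduced to axis-aligned boxes, and a simple bounding-box check stands in for the trained neural network.

```python
# Minimal sketch (not the patented implementation): given an available space
# recovered from plane/surface analysis, filter catalog items that fit, and keep
# a placed item anchored in world coordinates so it stays put when the viewer
# "ports" to a different camera location.
from dataclasses import dataclass
import numpy as np

@dataclass
class Space:            # a free volume recovered from plane/surface analysis
    origin: np.ndarray  # world-space corner (x, y, z) in meters
    size: np.ndarray    # width, depth, height in meters

@dataclass
class CatalogItem:
    name: str
    size: np.ndarray    # item bounding box (w, d, h) in meters

def items_that_fit(space: Space, catalog: list[CatalogItem]) -> list[CatalogItem]:
    # A trained network would rank items here; this stand-in only checks that
    # the item's bounding box fits inside the recovered free space.
    return [item for item in catalog if np.all(item.size <= space.size)]

def view_of_anchor(anchor_world: np.ndarray, world_to_camera: np.ndarray) -> np.ndarray:
    # Transform a world-anchored item position into the current camera frame.
    # The anchor itself never moves, so switching camera locations only changes
    # this 4x4 world-to-camera transform.
    p = np.append(anchor_world, 1.0)
    return (world_to_camera @ p)[:3]

if __name__ == "__main__":
    space = Space(origin=np.zeros(3), size=np.array([1.2, 0.8, 1.0]))
    catalog = [CatalogItem("side table", np.array([0.5, 0.5, 0.6])),
               CatalogItem("wardrobe", np.array([1.5, 0.7, 2.0]))]
    print([item.name for item in items_that_fit(space, catalog)])  # -> ['side table']

    anchor = space.origin + space.size / 2           # anchor the item at the space's center
    cam_a, cam_b = np.eye(4), np.eye(4)
    cam_b[0, 3] = -2.0                               # a second viewer location, 2 m away
    print(view_of_anchor(anchor, cam_a), view_of_anchor(anchor, cam_b))
```

Because the anchor is stored in world coordinates, only the world-to-camera transform changes when the viewer switches locations, which is what keeps the selected item fixed in place.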
  • Patent number: 11403829
    Abstract: Users can view images or renderings of items placed (virtually) within a physical space. For example, a rendering of an item can be placed within a live camera view of the physical space. A snapshot of the physical space can be captured, and the snapshot can be customized, shared, etc. The renderings can be represented as two-dimensional images, e.g., virtual stickers, or as three-dimensional models of the items. Users can view different renderings, move those items around, and develop views of the physical space that may be desirable. The renderings can link to products offered through an electronic marketplace, and those products can be consumed. Further, collaborative design is enabled through modeling the physical space and enabling users to view and move around the renderings in a virtual view of the physical space.
    Type: Grant
    Filed: February 24, 2021
    Date of Patent: August 2, 2022
    Assignee: A9.COM, INC.
    Inventors: Jason Canada, Rupa Chaturvedi, Jared Corso, Michael Patrick Cutter, Sean Niu, Shaun Michael Post, Peiqi Tang, Stefan Vant, Mark Scott Waldo, Andrea Zehr
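As a rough illustration of the virtual-sticker idea in patent 11403829 above, the sketch below composites a 2-D rendering onto a snapshot of a room and records the product it links to. It is a hedged approximation under assumed data layouts (RGBA stickers, RGB snapshots); place_sticker and the placement record are hypothetical names, not A9's API.

```python
# Minimal sketch (assumed data layout, not A9's implementation): composite a
# 2-D "sticker" rendering of a catalog item onto a snapshot of a room and keep
# a record of the product the rendering links to.
import numpy as np

def place_sticker(snapshot: np.ndarray, sticker_rgba: np.ndarray, x: int, y: int) -> np.ndarray:
    """Alpha-blend an RGBA sticker onto an RGB snapshot with its top-left corner at (x, y)."""
    out = snapshot.astype(float).copy()
    h, w = sticker_rgba.shape[:2]
    alpha = sticker_rgba[..., 3:4] / 255.0                    # per-pixel opacity, shape (h, w, 1)
    region = out[y:y + h, x:x + w, :3]
    out[y:y + h, x:x + w, :3] = alpha * sticker_rgba[..., :3] + (1 - alpha) * region
    return out.astype(np.uint8)

if __name__ == "__main__":
    room = np.full((480, 640, 3), 200, dtype=np.uint8)        # stand-in for a captured snapshot
    lamp = np.zeros((64, 64, 4), dtype=np.uint8)              # stand-in for a rendered sticker
    lamp[..., 0] = 255                                        # red square ...
    lamp[..., 3] = 180                                        # ... at roughly 70% opacity
    composite = place_sticker(room, lamp, x=300, y=200)
    placement = {"product_id": "LAMP-123", "position": (300, 200)}  # hypothetical marketplace link
    print(composite.shape, placement)
```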
  • Patent number: 11238515
    Abstract: The present embodiments provide visual search techniques that produce results including both accurate similar items and diversified items obtained through attribute manipulation. In some embodiments, a feature vector describing the item of interest is obtained. A target feature vector is then generated at least partially from the original feature vector, in which the target feature vector shares only a subset of attribute values with the original feature vector and includes at least some values that are different from the original feature vector. An electronic catalog of items is then queried using the target feature vector, and a set of candidate items is determined from the electronic catalog based at least in part on similarity to the target feature vector. The original feature vector may be used to query for a set of similar items that are as similar as possible to the item of interest.
    Type: Grant
    Filed: February 1, 2019
    Date of Patent: February 1, 2022
    Assignee: A9.COM, INC.
    Inventors: Krystle Elaine de Mesa, Aishwarya Natesh, Andrea Joyce Diane Zehr, Rupa Chaturvedi, Mehmet Nejat Tek, Julie Chang
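The attribute-manipulation step in patent 11238515 above can be pictured with a small sketch: keep most of the query item's feature vector, swap the values of one attribute segment, and rank the catalog against both the original and the target vector. The attribute layout (ATTR_SLICES), the helper names, and the cosine-similarity ranking are assumptions for illustration only.

```python
# Minimal sketch (hypothetical attribute layout): build a "target" feature vector
# that keeps most attribute values from the query item but swaps one attribute,
# then rank catalog items by cosine similarity to each vector.
import numpy as np

# Assumed segmentation of a 12-dimensional feature vector into attributes.
ATTR_SLICES = {"color": slice(0, 4), "pattern": slice(4, 8), "shape": slice(8, 12)}

def make_target(original: np.ndarray, attr: str, new_value: np.ndarray) -> np.ndarray:
    # The target shares every attribute with the original except the one swapped here.
    target = original.copy()
    target[ATTR_SLICES[attr]] = new_value
    return target

def rank_by_similarity(query: np.ndarray, catalog: np.ndarray) -> np.ndarray:
    # Cosine similarity of every catalog vector against the query, most similar first.
    sims = catalog @ query / (np.linalg.norm(catalog, axis=1) * np.linalg.norm(query) + 1e-9)
    return np.argsort(-sims)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    original = rng.normal(size=12)                 # feature vector of the item of interest
    catalog = rng.normal(size=(100, 12))           # feature vectors of catalog items
    target = make_target(original, "color", rng.normal(size=4))
    print("similar items:    ", rank_by_similarity(original, catalog)[:3])
    print("diversified items:", rank_by_similarity(target, catalog)[:3])
```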
  • Patent number: 11126845
    Abstract: A computing device is used to capture image data of a physical environment. The image data is of a live camera view from the camera. The image data includes a representation of a physical environment. A selection of an aspect of the representation is determined. The image data is analyzed using a trained neural network and using the selection to determine one or more types of items for the representation. At least two items associated with the one or more types of items are generated. The at least two items and at least one visible marker associated with the aspect are overlaid in the live camera view to provide an augmented reality view of the physical environment.
    Type: Grant
    Filed: December 7, 2018
    Date of Patent: September 21, 2021
    Assignee: A9.com, Inc.
    Inventor: Rupa Chaturvedi
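A hedged sketch of the flow in patent 11126845 above: a selected region of the frame (the "aspect") is classified into an item type, and at least two suggested items plus a visible marker are returned for overlay. The classify_aspect stand-in (a brightness heuristic), the ITEM_CATALOG contents, and the marker format are hypothetical; a trained neural network would do the real classification.

```python
# Minimal sketch (stand-in classifier, not the patented model): classify a
# selected region ("aspect") of the camera frame into an item type, then return
# at least two suggested items plus a visible marker to overlay on the live view.
import numpy as np

# Hypothetical catalog of suggestions per item type.
ITEM_CATALOG = {"wall art": ["framed print", "canvas poster"],
                "lighting": ["floor lamp", "pendant light"]}

def classify_aspect(frame: np.ndarray, region: tuple[int, int, int, int]) -> str:
    # A trained neural network would run here; this stand-in guesses a type from
    # the mean brightness of the selected region.
    x, y, w, h = region
    patch = frame[y:y + h, x:x + w]
    return "lighting" if patch.mean() < 100 else "wall art"

def build_overlay(frame: np.ndarray, region: tuple[int, int, int, int]) -> dict:
    item_type = classify_aspect(frame, region)
    return {"marker": {"region": region, "style": "pulsing dot"},  # visible marker on the aspect
            "items": ITEM_CATALOG[item_type][:2]}                  # at least two suggested items

if __name__ == "__main__":
    frame = np.full((480, 640, 3), 60, dtype=np.uint8)   # a dim frame, so "lighting" is suggested
    print(build_overlay(frame, region=(100, 100, 50, 50)))
```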
  • Patent number: 11093748
    Abstract: Various embodiments of the present disclosure provide systems and methods for visual search and augmented reality, in which an onscreen body of visual markers overlaid on the interface signals the current state of an image recognition process. Specifically, the body of visual markers may take on a plurality of behaviors, in which a particular behavior is indicative of a particular state. Thus, the user can tell what the current state of the scanning process is by the behavior of the body of visual markers. The behavior of the body of visual markers may also indicate to the user recommended actions that can be taken to improve the scanning condition or otherwise facilitate the process. In various embodiments, as the scanning process goes from one state to another, the onscreen body of visual markers may move or seamlessly transition from one behavior to another behavior accordingly.
    Type: Grant
    Filed: January 27, 2020
    Date of Patent: August 17, 2021
    Assignee: A9.COM, INC.
    Inventors: Peiqi Tang, Andrea Zehr, Rupa Chaturvedi, Yu Lou, Colin Jon Taylor, Mark Scott Waldo, Shaun Michael Post
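The marker behaviors described in patent 11093748 above map naturally onto a small state machine. The sketch below is illustrative only: the ScanState names and behavior strings are assumptions, and in a real UI the transition function would drive animation rather than return text.

```python
# Minimal sketch (hypothetical state names): a small state machine mapping the
# image-recognition pipeline's state to a behavior for the onscreen body of
# visual markers, so each state change swaps the markers' behavior.
from enum import Enum, auto

class ScanState(Enum):
    SEARCHING = auto()      # no object found yet
    LOW_LIGHT = auto()      # scanning conditions need improvement
    OBJECT_LOCKED = auto()  # recognition is converging on an object
    RESULT_READY = auto()   # a match has been found

MARKER_BEHAVIOR = {
    ScanState.SEARCHING:     "drift loosely across the frame",
    ScanState.LOW_LIGHT:     "pulse dimly to hint: move to better light",
    ScanState.OBJECT_LOCKED: "converge onto the detected object's outline",
    ScanState.RESULT_READY:  "settle into a ring around the result card",
}

def transition(current: ScanState, new: ScanState) -> str:
    # In the real UI this would drive a seamless animated transition between
    # behaviors; here it just reports the change.
    return (f"markers transition from '{MARKER_BEHAVIOR[current]}' "
            f"to '{MARKER_BEHAVIOR[new]}'")

if __name__ == "__main__":
    print(transition(ScanState.SEARCHING, ScanState.OBJECT_LOCKED))
    print(transition(ScanState.OBJECT_LOCKED, ScanState.RESULT_READY))
```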
  • Publication number: 20210183154
    Abstract: Users can view images or renderings of items placed (virtually) within a physical space. For example, a rendering of an item can be placed within a live camera view of the physical space. A snapshot of the physical space can be captured, and the snapshot can be customized, shared, etc. The renderings can be represented as two-dimensional images, e.g., virtual stickers, or as three-dimensional models of the items. Users can view different renderings, move those items around, and develop views of the physical space that may be desirable. The renderings can link to products offered through an electronic marketplace, and those products can be consumed. Further, collaborative design is enabled through modeling the physical space and enabling users to view and move around the renderings in a virtual view of the physical space.
    Type: Application
    Filed: February 24, 2021
    Publication date: June 17, 2021
    Inventors: Jason Canada, Rupa Chaturvedi, Jared Corso, Michael Patrick Cutter, Sean Niu, Shaun Michael Post, Peiqi Tang, Stefan Vant, Mark Scott Waldo, Andrea Zehr
  • Patent number: 10943403
    Abstract: Users can view images or renderings of items placed (virtually) within a physical space. For example, a rendering of an item can be placed within a live camera view of the physical space. A snapshot of the physical space can be captured, and the snapshot can be customized, shared, etc. The renderings can be represented as two-dimensional images, e.g., virtual stickers, or as three-dimensional models of the items. Users can view different renderings, move those items around, and develop views of the physical space that may be desirable. The renderings can link to products offered through an electronic marketplace, and those products can be consumed. Further, collaborative design is enabled through modeling the physical space and enabling users to view and move around the renderings in a virtual view of the physical space.
    Type: Grant
    Filed: April 29, 2019
    Date of Patent: March 9, 2021
    Assignee: A9.com, Inc.
    Inventors: Jason Canada, Rupa Chaturvedi, Jared Corso, Michael Patrick Cutter, Sean Niu, Shaun Michael Post, Peiqi Tang, Stefan Vant, Mark Scott Waldo, Andrea Zehr
  • Patent number: 10789699
    Abstract: A computing device is used to capture image data of a physical environment. The image data is analyzed to determine color information for colors represented in the physical environment and to determine scene information that describes a room type associated with the physical environment. A palette of colors is assembled using the colors from the color information and provided for display. Upon selection of a color from the palette, a product associated with the selected color and with the room type is provided for display.
    Type: Grant
    Filed: December 26, 2019
    Date of Patent: September 29, 2020
    Assignee: A9.COM, INC.
    Inventor: Rupa Chaturvedi
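As a rough sketch of the pipeline in patent 10789699 above: extract a palette of dominant colors from the captured image, pair it with a detected room type, and suggest products matching a selected color. The quantization-based palette, the always-"living room" classifier stand-in, and the toy PRODUCTS list are assumptions for illustration, not the patented method.

```python
# Minimal sketch (stand-in scene classifier and toy product data): build a
# palette of dominant colors from an image, pair it with a detected room
# type, and suggest a product matching a selected palette color.
from collections import Counter
import numpy as np

def dominant_palette(image: np.ndarray, n_colors: int = 4) -> list[tuple]:
    # Quantize each channel to 32-level bins and count occurrences; a production
    # system might cluster pixels instead of using simple quantization.
    quantized = (image // 32 * 32).reshape(-1, 3)
    counts = Counter(map(tuple, quantized))
    return [tuple(int(v) for v in c) for c, _ in counts.most_common(n_colors)]

def classify_room(image: np.ndarray) -> str:
    # A trained scene classifier would run here; this stand-in always answers
    # "living room" purely for illustration.
    return "living room"

# Hypothetical product records: each links a product to a room type and a color.
PRODUCTS = [{"name": "throw pillow", "room": "living room", "color": (224, 96, 96)},
            {"name": "bath mat", "room": "bathroom", "color": (96, 96, 224)}]

def suggest(color: tuple, room: str) -> list[str]:
    # Rank products of the matching room type by distance to the selected color.
    def dist(c): return sum((a - b) ** 2 for a, b in zip(c, color))
    return [p["name"] for p in sorted(PRODUCTS, key=lambda p: dist(p["color"]))
            if p["room"] == room]

if __name__ == "__main__":
    img = np.random.default_rng(1).integers(0, 256, size=(120, 160, 3), dtype=np.uint8)
    palette = dominant_palette(img)
    room = classify_room(img)
    print(palette[:2], room, suggest(palette[0], room))
```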
  • Publication number: 20200258144
    Abstract: A camera is used to capture image data of representations of a physical environment. Planes and surfaces are determined from a representation. The planes and the surfaces are analyzed using relationships therebetween to obtain shapes and depth information for available spaces within the physical environment. Locations of the camera with respect to the physical environment are determined. The shapes and the depth information are analyzed using a trained neural network to determine items fitting the available spaces. A live camera view is overlaid with a selection from the items to provide an augmented reality (AR) view of the physical environment from an individual location of the locations. The AR view is enabled so that a user can port to a different location than the individual location by an input received to the AR view while the selection from the items remains anchored to the individual location.
    Type: Application
    Filed: February 11, 2019
    Publication date: August 13, 2020
    Inventors: Rupa Chaturvedi, Xing Zhang, Frank Partalis, Yu Lou, Colin Jon Taylor, Simon Fox
  • Publication number: 20200160058
    Abstract: Various embodiments of the present disclosure provide systems and methods for visual search and augmented reality, in which an onscreen body of visual markers overlaid on the interface signals the current state of an image recognition process. Specifically, the body of visual markers may take on a plurality of behaviors, in which a particular behavior is indicative of a particular state. Thus, the user can tell what the current state of the scanning process is by the behavior of the body of visual markers. The behavior of the body of visual markers may also indicate to the user recommended actions that can be taken to improve the scanning condition or otherwise facilitate the process. In various embodiments, as the scanning process goes from one state to another, the onscreen body of visual markers may move or seamlessly transition from one behavior to another behavior accordingly.
    Type: Application
    Filed: January 27, 2020
    Publication date: May 21, 2020
    Inventors: Peiqi Tang, Andrea Zehr, Rupa Chaturvedi, Yu Lou, Colin Jon Taylor, Mark Scott Waldo, Shaun Michael Post
  • Publication number: 20200134811
    Abstract: A computing device is used to capture image data of a physical environment. The image data is analyzed to determine color information for colors represented in the physical environment and to determine scene information that describes a room type associated with the physical environment. A palette of colors is assembled using the colors from the color information and provided for display. Upon selection of a color from the palette, a product associated with the selected color and with the room type is provided for display.
    Type: Application
    Filed: December 26, 2019
    Publication date: April 30, 2020
    Inventor: Rupa Chaturvedi
  • Patent number: 10572988
    Abstract: A computing device is used to capture image data of a physical environment. The image data is analyzed to determine color information for colors represented in the physical environment and to determine scene information that describes a room type associated with the physical environment. A palette of colors is assembled using the colors from the color information and provided for display. Upon selection of a color from the palette, a product associated with the selected color and with the room type is provided for display.
    Type: Grant
    Filed: June 19, 2017
    Date of Patent: February 25, 2020
    Assignee: A9.COM, INC.
    Inventor: Rupa Chaturvedi
  • Patent number: 10558857
    Abstract: Various embodiments of the present disclosure provide systems and methods for visual search and augmented reality, in which an onscreen body of visual markers overlaid on the interface signals the current state of an image recognition process. Specifically, the body of visual markers may take on a plurality of behaviors, in which a particular behavior is indicative of a particular state. Thus, the user can tell what the current state of the scanning process is by the behavior of the body of visual markers. The behavior of the body of visual markers may also indicate to the user recommended actions that can be taken to improve the scanning condition or otherwise facilitate the process. In various embodiments, as the scanning process goes from one state to another, the onscreen body of visual markers may move or seamlessly transition from one behavior to another behavior accordingly.
    Type: Grant
    Filed: March 5, 2018
    Date of Patent: February 11, 2020
    Assignee: A9.COM, INC.
    Inventors: Peiqi Tang, Andrea Zehr, Rupa Chaturvedi, Yu Lou, Colin Jon Taylor, Mark Scott Waldo, Shaun Michael Post
  • Publication number: 20190272425
    Abstract: Various embodiments of the present disclosure provide systems and methods for visual search and augmented reality, in which an onscreen body of visual markers overlaid on the interface signals the current state of an image recognition process. Specifically, the body of visual markers may take on a plurality of behaviors, in which a particular behavior is indicative of a particular state. Thus, the user can tell what the current state of the scanning process is by the behavior of the body of visual markers. The behavior of the body of visual markers may also indicate to the user recommended actions that can be taken to improve the scanning condition or otherwise facilitate the process. In various embodiments, as the scanning process goes from one state to another, the onscreen body of visual markers may move or seamlessly transition from one behavior to another behavior accordingly.
    Type: Application
    Filed: March 5, 2018
    Publication date: September 5, 2019
    Inventors: Peiqi Tang, Andrea Zehr, Rupa Chaturvedi, Yu Lou, Colin Jon Taylor, Mark Scott Waldo, Shaun Michael Post
  • Publication number: 20190251753
    Abstract: Users can view images or renderings of items placed (virtually) within a physical space. For example, a rendering of an item can be placed within a live camera view of the physical space. A snapshot of the physical space can be captured, and the snapshot can be customized, shared, etc. The renderings can be represented as two-dimensional images, e.g., virtual stickers, or as three-dimensional models of the items. Users can view different renderings, move those items around, and develop views of the physical space that may be desirable. The renderings can link to products offered through an electronic marketplace, and those products can be consumed. Further, collaborative design is enabled through modeling the physical space and enabling users to view and move around the renderings in a virtual view of the physical space.
    Type: Application
    Filed: April 29, 2019
    Publication date: August 15, 2019
    Inventors: Jason Canada, Rupa Chaturvedi, Jared Corso, Michael Patrick Cutter, Sean Niu, Shaun Michael Post, Peiqi Tang, Stefan Vant, Mark Scott Waldo, Andrea Zehr
  • Patent number: 10319150
    Abstract: Users can view images or renderings of items placed (virtually) within a physical space. For example, a rendering of an item can be placed within a live camera view of the physical space. A snapshot of the physical space can be captured, and the snapshot can be customized, shared, etc. The renderings can be represented as two-dimensional images, e.g., virtual stickers, or as three-dimensional models of the items. Users can view different renderings, move those items around, and develop views of the physical space that may be desirable. The renderings can link to products offered through an electronic marketplace, and those products can be consumed. Further, collaborative design is enabled through modeling the physical space and enabling users to view and move around the renderings in a virtual view of the physical space.
    Type: Grant
    Filed: May 15, 2017
    Date of Patent: June 11, 2019
    Assignee: A9.COM, INC.
    Inventors: Jason Canada, Rupa Chaturvedi, Jared Corso, Michael Patrick Cutter, Sean Niu, Shaun Michael Post, Peiqi Tang, Stefan Vant, Mark Scott Waldo, Andrea Zehr
  • Patent number: 10210664
    Abstract: Systems and methods herein enable adding or changing light in a live camera view, an image, or a video using a virtual light source that includes an associated lighting profile. The system includes receiving image data of the live camera view. The system determines a first lighting profile associated with the representation of the object. Position information associated with the object is also determined with respect to the camera. The system receives a second lighting profile associated with the light source. The second lighting profile provides at least intensity values and direction information for light projected from the light source. The system determines and applies changes to the first lighting profile to affect the light surrounding the representation of the object using the second lighting profile. The system displays the image data with the changes to the first lighting profile.
    Type: Grant
    Filed: May 3, 2017
    Date of Patent: February 19, 2019
    Assignee: A9.com, Inc.
    Inventor: Rupa Chaturvedi
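To make the lighting-profile idea in patent 10210664 above concrete, the sketch below adds a virtual light source's diffuse contribution (direction and intensity) on top of an object's existing brightness map using a simple Lambertian term. This is a hedged stand-in under assumed data (per-pixel surface normals, a scalar brightness map), not the patented method.

```python
# Minimal sketch (simple Lambertian stand-in, not the patented method): add a
# virtual light source's lighting profile (direction and intensity) on top of an
# object's existing lighting profile, modeled here as a per-pixel brightness map.
import numpy as np

def apply_virtual_light(base_brightness: np.ndarray, normals: np.ndarray,
                        light_dir: np.ndarray, intensity: float) -> np.ndarray:
    """Add the diffuse contribution of a virtual light to a brightness map in [0, 1]."""
    light_dir = light_dir / np.linalg.norm(light_dir)
    # N . L diffuse term, clamped to zero for surfaces facing away from the light.
    diffuse = np.clip(normals @ light_dir, 0.0, None)
    return np.clip(base_brightness + intensity * diffuse, 0.0, 1.0)

if __name__ == "__main__":
    h, w = 4, 4
    base = np.full((h, w), 0.3)                            # first lighting profile (ambient level)
    normals = np.zeros((h, w, 3))
    normals[..., 2] = 1.0                                  # a flat surface facing the camera
    lit = apply_virtual_light(base, normals,
                              light_dir=np.array([0.0, 0.0, 1.0]),  # second profile: direction
                              intensity=0.5)                        # second profile: intensity
    print(lit)
```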