Patents by Inventor Rupa Chaturvedi
Rupa Chaturvedi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11922489
Abstract: A camera is used to capture image data of representations of a physical environment. Planes and surfaces are determined from a representation. The planes and the surfaces are analyzed using relationships therebetween to obtain shapes and depth information for available spaces within the physical environment. Locations of the camera with respect to the physical environment are determined. The shapes and the depth information are analyzed using a trained neural network to determine items fitting the available spaces. A live camera view is overlaid with a selection from the items to provide an augmented reality (AR) view of the physical environment from an individual location of the locations. The AR view is enabled so that a user can port to a different location than the individual location by an input received to the AR view while the selection from the items remains anchored to the individual location.
Type: Grant
Filed: February 11, 2019
Date of Patent: March 5, 2024
Assignee: A9.com, Inc.
Inventors: Rupa Chaturvedi, Xing Zhang, Frank Partalis, Yu Lou, Colin Jon Taylor, Simon Fox
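The core check the abstract describes, deciding whether candidate items fit the shapes and depth information recovered for an available space, can be sketched as a simple bounding-box comparison. This is a minimal illustration; the data structures, names, and clearance value are assumptions, not taken from the patent, and the actual system uses a trained neural network rather than a hard-coded rule.

```python
# Hypothetical sketch: test whether a candidate item's bounding box
# (plus a small clearance) fits within a detected available space.
from dataclasses import dataclass

@dataclass
class BoundingBox:
    width: float   # metres
    depth: float
    height: float

def item_fits(space: BoundingBox, item: BoundingBox, clearance: float = 0.05) -> bool:
    """Return True if the item, padded by the clearance, fits the space."""
    return (item.width + clearance <= space.width
            and item.depth + clearance <= space.depth
            and item.height + clearance <= space.height)

# Filter an illustrative catalog down to items that fit a detected space.
space = BoundingBox(width=1.2, depth=0.6, height=2.0)
catalog = {
    "bookshelf": BoundingBox(0.9, 0.4, 1.8),
    "wardrobe": BoundingBox(1.5, 0.7, 2.1),
}
fitting = [name for name, box in catalog.items() if item_fits(space, box)]
```

Only items passing such a fit test would then be offered for overlay in the live camera view.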
-
Patent number: 11403829
Abstract: Users can view images or renderings of items placed (virtually) within a physical space. For example, a rendering of an item can be placed within a live camera view of the physical space. A snapshot of the physical space can be captured and the snapshot can be customized, shared, etc. The renderings can be represented as two-dimensional images, e.g., virtual stickers or three-dimensional models of the items. Users can have the ability to view different renderings, move those items around, and develop views of the physical space that may be desirable. The renderings can link to products offered through an electronic marketplace and those products can be consumed. Further, collaborative design is enabled through modeling the physical space and enabling users to view and move around the renderings in a virtual view of the physical space.
Type: Grant
Filed: February 24, 2021
Date of Patent: August 2, 2022
Assignee: A9.COM, INC.
Inventors: Jason Canada, Rupa Chaturvedi, Jared Corso, Michael Patrick Cutter, Sean Niu, Shaun Michael Post, Peiqi Tang, Stefan Vant, Mark Scott Waldo, Andrea Zehr
-
Patent number: 11238515
Abstract: The present embodiments provide visual search techniques that produce results including both accurately similar items and diversified items obtained through attribute manipulation. In some embodiments, an original feature vector describing the item of interest is obtained. A target feature vector is then generated at least partially from the original feature vector, in which the target feature vector shares only a subset of attribute values with the original feature vector and includes at least some values that are different from the original feature vector. An electronic catalog of items is then queried using the target feature vector, and a set of candidate items is determined from the electronic catalog based at least in part on similarity to the target feature vector. The original feature vector may be used to query for a set of similar items that are as similar as possible to the item of interest.
Type: Grant
Filed: February 1, 2019
Date of Patent: February 1, 2022
Assignee: A9.COM, INC.
Inventors: Krystle Elaine de Mesa, Aishwarya Natesh, Andrea Joyce Diane Zehr, Rupa Chaturvedi, Mehmet Nejat Tek, Julie Chang
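The target-vector construction the abstract describes, keeping a subset of the original attribute values while replacing the rest, then ranking a catalog by similarity, can be sketched as follows. The segment layout, slice-based attribute encoding, and cosine-similarity ranking are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch: build a target feature vector that shares only a
# subset of attribute segments with the original, then rank a catalog by
# cosine similarity to that target.
import numpy as np

def make_target(original, replacement, keep_slices):
    """Copy `replacement`, then restore the attribute segments named in
    keep_slices from `original`, so the target shares only that subset."""
    target = replacement.copy()
    for s in keep_slices:
        target[s] = original[s]
    return target

def rank_catalog(catalog, target):
    """Sort catalog items by descending cosine similarity to the target."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return sorted(catalog, key=lambda item: cosine(item["vec"], target), reverse=True)

# Keep the first two attribute dimensions, vary the rest.
original = np.array([1.0, 0.0, 1.0, 0.0])
replacement = np.array([0.0, 1.0, 0.0, 1.0])
target = make_target(original, replacement, [slice(0, 2)])

catalog = [
    {"name": "a", "vec": np.array([1.0, 0.0, 0.0, 1.0])},
    {"name": "b", "vec": np.array([0.0, 1.0, 1.0, 0.0])},
]
ranked = rank_catalog(catalog, target)
```

Querying with both the original and the target vectors, as the abstract suggests, would yield one result set of close matches and one of attribute-diversified alternatives.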
-
Patent number: 11126845
Abstract: A computing device is used to capture image data of a physical environment. The image data is of a live camera view from the camera. The image data includes a representation of a physical environment. A selection of an aspect of the representation is determined. The image data is analyzed using a trained neural network and using the selection to determine one or more types of items for the representation. At least two items associated with the one or more types of items are generated. The at least two items and at least one visible marker associated with the aspect are overlaid in the live camera view to provide an augmented reality view of the physical environment.
Type: Grant
Filed: December 7, 2018
Date of Patent: September 21, 2021
Assignee: A9.com, Inc.
Inventor: Rupa Chaturvedi
-
Patent number: 11093748
Abstract: Various embodiments of the present disclosure provide systems and methods for visual search and augmented reality, in which an onscreen body of visual markers overlaid on the interface signals the current state of an image recognition process. Specifically, the body of visual markers may take on a plurality of behaviors, in which a particular behavior is indicative of a particular state. Thus, the user can tell what the current state of the scanning process is by the behavior of the body of visual markers. The behavior of the body of visual markers may also indicate to the user recommended actions that can be taken to improve the scanning condition or otherwise facilitate the process. In various embodiments, as the scanning process goes from one state to another, the onscreen body of visual markers may move or seamlessly transition from one behavior to another accordingly.
Type: Grant
Filed: January 27, 2020
Date of Patent: August 17, 2021
Assignee: A9.COM, INC.
Inventors: Peiqi Tang, Andrea Zehr, Rupa Chaturvedi, Yu Lou, Colin Jon Taylor, Mark Scott Waldo, Shaun Michael Post
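The state-to-behavior mapping the abstract describes can be sketched as a small lookup: each recognition state drives a distinct marker behavior, including states that prompt the user to improve scanning conditions. The state names and behavior labels below are invented for illustration and do not come from the patent.

```python
# Hypothetical sketch: map image-recognition states to visual-marker
# behaviors, so the markers' motion communicates scanning progress.
SCAN_STATES = {
    "searching":   "drift",      # markers float freely while seeking a subject
    "low_light":   "dim_pulse",  # prompts the user to improve lighting
    "object_seen": "converge",   # markers gather on the detected object
    "recognized":  "burst",      # markers signal a successful match
}

def marker_behavior(state: str) -> str:
    """Return the marker behavior for a scan state, defaulting to drift."""
    return SCAN_STATES.get(state, "drift")
```

A renderer would then animate a smooth transition whenever `marker_behavior` returns a new value, matching the seamless behavior changes the abstract describes.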
-
Publication number: 20210183154
Abstract: Users can view images or renderings of items placed (virtually) within a physical space. For example, a rendering of an item can be placed within a live camera view of the physical space. A snapshot of the physical space can be captured and the snapshot can be customized, shared, etc. The renderings can be represented as two-dimensional images, e.g., virtual stickers or three-dimensional models of the items. Users can have the ability to view different renderings, move those items around, and develop views of the physical space that may be desirable. The renderings can link to products offered through an electronic marketplace and those products can be consumed. Further, collaborative design is enabled through modeling the physical space and enabling users to view and move around the renderings in a virtual view of the physical space.
Type: Application
Filed: February 24, 2021
Publication date: June 17, 2021
Inventors: Jason Canada, Rupa Chaturvedi, Jared Corso, Michael Patrick Cutter, Sean Niu, Shaun Michael Post, Peiqi Tang, Stefan Vant, Mark Scott Waldo, Andrea Zehr
-
Patent number: 10943403
Abstract: Users can view images or renderings of items placed (virtually) within a physical space. For example, a rendering of an item can be placed within a live camera view of the physical space. A snapshot of the physical space can be captured and the snapshot can be customized, shared, etc. The renderings can be represented as two-dimensional images, e.g., virtual stickers or three-dimensional models of the items. Users can have the ability to view different renderings, move those items around, and develop views of the physical space that may be desirable. The renderings can link to products offered through an electronic marketplace and those products can be consumed. Further, collaborative design is enabled through modeling the physical space and enabling users to view and move around the renderings in a virtual view of the physical space.
Type: Grant
Filed: April 29, 2019
Date of Patent: March 9, 2021
Assignee: A9.com, Inc.
Inventors: Jason Canada, Rupa Chaturvedi, Jared Corso, Michael Patrick Cutter, Sean Niu, Shaun Michael Post, Peiqi Tang, Stefan Vant, Mark Scott Waldo, Andrea Zehr
-
Patent number: 10789699
Abstract: A computing device is used to capture image data of a physical environment. The image data is analyzed to determine color information for colors represented in the physical environment and to determine scene information that describes a room type associated with the physical environment. A palette of colors is assembled using the colors from the color information and provided for display. Upon selection of a color from the palette of colors, a product associated with the selected color and with the room type is provided for display.
Type: Grant
Filed: December 26, 2019
Date of Patent: September 29, 2020
Assignee: A9.COM, INC.
Inventor: Rupa Chaturvedi
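The palette-assembly step the abstract describes, reducing an image's colors to a small displayable set, can be sketched by quantizing each RGB channel and keeping the most frequent bins. A production system would more likely use clustering (e.g. k-means); this bin-counting approximation, and its parameter choices, are illustrative assumptions only.

```python
# Hypothetical sketch: assemble a color palette from image pixels by
# quantizing each RGB channel and keeping the most common bins.
from collections import Counter

def palette(pixels, n_colors=5, step=32):
    """pixels: iterable of (r, g, b) tuples in 0-255.
    Returns the n_colors most frequent colors after rounding each
    channel down to a multiple of `step`."""
    quantized = [(r // step * step, g // step * step, b // step * step)
                 for r, g, b in pixels]
    return [color for color, _ in Counter(quantized).most_common(n_colors)]

# A tiny "image": mostly near-black pixels with a few reddish ones.
swatches = palette([(10, 10, 10)] * 3 + [(200, 50, 50)] * 2, n_colors=2)
```

Each swatch could then be matched against catalog products for the detected room type, per the abstract.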
-
Publication number: 20200258144
Abstract: A camera is used to capture image data of representations of a physical environment. Planes and surfaces are determined from a representation. The planes and the surfaces are analyzed using relationships therebetween to obtain shapes and depth information for available spaces within the physical environment. Locations of the camera with respect to the physical environment are determined. The shapes and the depth information are analyzed using a trained neural network to determine items fitting the available spaces. A live camera view is overlaid with a selection from the items to provide an augmented reality (AR) view of the physical environment from an individual location of the locations. The AR view is enabled so that a user can port to a different location than the individual location by an input received to the AR view while the selection from the items remains anchored to the individual location.
Type: Application
Filed: February 11, 2019
Publication date: August 13, 2020
Inventors: Rupa Chaturvedi, Xing Zhang, Frank Partalis, Yu Lou, Colin Jon Taylor, Simon Fox
-
Publication number: 20200160058
Abstract: Various embodiments of the present disclosure provide systems and methods for visual search and augmented reality, in which an onscreen body of visual markers overlaid on the interface signals the current state of an image recognition process. Specifically, the body of visual markers may take on a plurality of behaviors, in which a particular behavior is indicative of a particular state. Thus, the user can tell what the current state of the scanning process is by the behavior of the body of visual markers. The behavior of the body of visual markers may also indicate to the user recommended actions that can be taken to improve the scanning condition or otherwise facilitate the process. In various embodiments, as the scanning process goes from one state to another, the onscreen body of visual markers may move or seamlessly transition from one behavior to another accordingly.
Type: Application
Filed: January 27, 2020
Publication date: May 21, 2020
Inventors: Peiqi Tang, Andrea Zehr, Rupa Chaturvedi, Yu Lou, Colin Jon Taylor, Mark Scott Waldo, Shaun Michael Post
-
Publication number: 20200134811
Abstract: A computing device is used to capture image data of a physical environment. The image data is analyzed to determine color information for colors represented in the physical environment and to determine scene information that describes a room type associated with the physical environment. A palette of colors is assembled using the colors from the color information and provided for display. Upon selection of a color from the palette of colors, a product associated with the selected color and with the room type is provided for display.
Type: Application
Filed: December 26, 2019
Publication date: April 30, 2020
Inventor: Rupa Chaturvedi
-
Patent number: 10572988
Abstract: A computing device is used to capture image data of a physical environment. The image data is analyzed to determine color information for colors represented in the physical environment and to determine scene information that describes a room type associated with the physical environment. A palette of colors is assembled using the colors from the color information and provided for display. Upon selection of a color from the palette of colors, a product associated with the selected color and with the room type is provided for display.
Type: Grant
Filed: June 19, 2017
Date of Patent: February 25, 2020
Assignee: A9.COM, INC.
Inventor: Rupa Chaturvedi
-
Patent number: 10558857
Abstract: Various embodiments of the present disclosure provide systems and methods for visual search and augmented reality, in which an onscreen body of visual markers overlaid on the interface signals the current state of an image recognition process. Specifically, the body of visual markers may take on a plurality of behaviors, in which a particular behavior is indicative of a particular state. Thus, the user can tell what the current state of the scanning process is by the behavior of the body of visual markers. The behavior of the body of visual markers may also indicate to the user recommended actions that can be taken to improve the scanning condition or otherwise facilitate the process. In various embodiments, as the scanning process goes from one state to another, the onscreen body of visual markers may move or seamlessly transition from one behavior to another accordingly.
Type: Grant
Filed: March 5, 2018
Date of Patent: February 11, 2020
Assignee: A9.COM, INC.
Inventors: Peiqi Tang, Andrea Zehr, Rupa Chaturvedi, Yu Lou, Colin Jon Taylor, Mark Scott Waldo, Shaun Michael Post
-
Publication number: 20190272425
Abstract: Various embodiments of the present disclosure provide systems and methods for visual search and augmented reality, in which an onscreen body of visual markers overlaid on the interface signals the current state of an image recognition process. Specifically, the body of visual markers may take on a plurality of behaviors, in which a particular behavior is indicative of a particular state. Thus, the user can tell what the current state of the scanning process is by the behavior of the body of visual markers. The behavior of the body of visual markers may also indicate to the user recommended actions that can be taken to improve the scanning condition or otherwise facilitate the process. In various embodiments, as the scanning process goes from one state to another, the onscreen body of visual markers may move or seamlessly transition from one behavior to another accordingly.
Type: Application
Filed: March 5, 2018
Publication date: September 5, 2019
Inventors: Peiqi Tang, Andrea Zehr, Rupa Chaturvedi, Yu Lou, Colin Jon Taylor, Mark Scott Waldo, Shaun Michael Post
-
Publication number: 20190251753
Abstract: Users can view images or renderings of items placed (virtually) within a physical space. For example, a rendering of an item can be placed within a live camera view of the physical space. A snapshot of the physical space can be captured and the snapshot can be customized, shared, etc. The renderings can be represented as two-dimensional images, e.g., virtual stickers or three-dimensional models of the items. Users can have the ability to view different renderings, move those items around, and develop views of the physical space that may be desirable. The renderings can link to products offered through an electronic marketplace and those products can be consumed. Further, collaborative design is enabled through modeling the physical space and enabling users to view and move around the renderings in a virtual view of the physical space.
Type: Application
Filed: April 29, 2019
Publication date: August 15, 2019
Inventors: Jason Canada, Rupa Chaturvedi, Jared Corso, Michael Patrick Cutter, Sean Niu, Shaun Michael Post, Peiqi Tang, Stefan Vant, Mark Scott Waldo, Andrea Zehr
-
Patent number: 10319150
Abstract: Users can view images or renderings of items placed (virtually) within a physical space. For example, a rendering of an item can be placed within a live camera view of the physical space. A snapshot of the physical space can be captured and the snapshot can be customized, shared, etc. The renderings can be represented as two-dimensional images, e.g., virtual stickers or three-dimensional models of the items. Users can have the ability to view different renderings, move those items around, and develop views of the physical space that may be desirable. The renderings can link to products offered through an electronic marketplace and those products can be consumed. Further, collaborative design is enabled through modeling the physical space and enabling users to view and move around the renderings in a virtual view of the physical space.
Type: Grant
Filed: May 15, 2017
Date of Patent: June 11, 2019
Assignee: A9.COM, INC.
Inventors: Jason Canada, Rupa Chaturvedi, Jared Corso, Michael Patrick Cutter, Sean Niu, Shaun Michael Post, Peiqi Tang, Stefan Vant, Mark Scott Waldo, Andrea Zehr
-
Patent number: 10210664
Abstract: Systems and methods herein enable adding or changing light in a live camera view, an image, or a video using a virtual light source that includes an associated lighting profile. The system receives image data of the live camera view. The system determines a first lighting profile associated with a representation of an object. Position information associated with the object is also determined with respect to the camera. The system receives a second lighting profile associated with the light source. The second lighting profile provides at least intensity values and direction information for light projected from the light source. The system determines and applies changes to the first lighting profile to affect the light surrounding the representation of the object using the second lighting profile. The system displays the image data with the changes to the first lighting profile.
Type: Grant
Filed: May 3, 2017
Date of Patent: February 19, 2019
Assignee: A9.com, Inc.
Inventor: Rupa Chaturvedi
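The combination of the two lighting profiles the abstract describes, using the virtual light's intensity and direction to modify the scene's base lighting, can be sketched with simple Lambertian shading. The profile structure and function names here are assumptions for illustration; the patent does not specify this particular shading model.

```python
# Hypothetical sketch: add a virtual light's Lambertian contribution
# (from its intensity and direction) to a surface's base brightness.
import numpy as np

def apply_light(base_brightness, light_intensity, light_dir, surface_normal):
    """Return the brightness after adding the virtual light's contribution."""
    d = np.asarray(light_dir, dtype=float)
    n = np.asarray(surface_normal, dtype=float)
    d /= np.linalg.norm(d)
    n /= np.linalg.norm(n)
    # The light contributes only where it strikes the surface from the front.
    lambert = max(0.0, float(-d @ n))
    return base_brightness + light_intensity * lambert
```

For example, a light pointing straight down onto an upward-facing surface adds its full intensity, while a light behind the surface adds nothing.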