Patents by Inventor Avi Bar-Zeev

Avi Bar-Zeev has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9914539
    Abstract: An airlift package protection (APP) airbag may protect a package (e.g., an item or a number of items) that is dropped from within a predetermined height range by an unmanned aerial vehicle (UAV). The APP airbag may at least partially surround the package and create a container for the package. In some embodiments, the APP airbag may be inflated just prior to dropping of the package from the UAV. After inflation, the APP airbag may be at least partially sealed to reduce or inhibit deflation of the APP airbag, but possibly not to completely prevent airflow from the APP airbag upon contact with the ground. The APP airbag may exhaust some air upon impact with the ground, thereby reducing a deceleration of a package contained inside of the APP airbag.
    Type: Grant
    Filed: March 25, 2015
    Date of Patent: March 13, 2018
    Assignee: Amazon Technologies, Inc.
    Inventors: Avi Bar-Zeev, Gur Kimchi
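    Illustration: a minimal editor's sketch, not taken from the patent, of why exhausting air over a longer stopping distance reduces deceleration; the drop height, stopping distances, and function names below are hypothetical.
        import math

        G = 9.81  # m/s^2

        def impact_speed(drop_height_m):
            """Speed at ground contact for a free-fall drop (air drag ignored)."""
            return math.sqrt(2 * G * drop_height_m)

        def average_deceleration(drop_height_m, stopping_distance_m):
            """Average deceleration while the package is brought to rest over a
            given stopping distance (work-energy theorem)."""
            v = impact_speed(drop_height_m)
            return v ** 2 / (2 * stopping_distance_m)

        # Hypothetical numbers: a 3 m drop, with 2 cm of crush for a bare box
        # versus 15 cm of airbag compression as air is exhausted on impact.
        for label, d in [("bare package", 0.02), ("inflated APP airbag", 0.15)]:
            a = average_deceleration(3.0, d)
            print(f"{label}: ~{a / G:.0f} g average deceleration")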
  • Patent number: 9911236
    Abstract: An optical see-through head-mounted display device includes a see-through lens which combines an augmented reality image with light from a real-world scene, while an opacity filter is used to selectively block portions of the real-world scene so that the augmented reality image appears more distinctly. The opacity filter can be a see-through LCD panel, for instance, where each pixel of the LCD panel can be selectively controlled to be transmissive or opaque, based on a size, shape and position of the augmented reality image. Eye tracking can be used to adjust the position of the augmented reality image and the opaque pixels. Peripheral regions of the opacity filter, which are not behind the augmented reality image, can be activated to provide a peripheral cue or a representation of the augmented reality image. In another aspect, opaque pixels are provided at a time when an augmented reality image is not present.
    Type: Grant
    Filed: February 12, 2016
    Date of Patent: March 6, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Avi Bar-Zeev, Bob Crocco, Alex Kipman, John Lewis
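    Illustration: a minimal sketch, assuming the augmented reality image is available as an RGBA buffer, of deriving which opacity-filter pixels to drive opaque from the image's size, shape, and position; opacity_mask and its parameters are hypothetical, not the patent's method.
        import numpy as np

        def opacity_mask(ar_rgba, threshold=0.1, dilate_px=2):
            """Mark opacity-filter pixels opaque wherever the augmented reality
            image is visible, grown slightly so the blocked region fully covers
            the rendered content."""
            opaque = ar_rgba[..., 3] > threshold  # alpha marks drawn pixels
            for _ in range(dilate_px):            # naive 4-neighbour dilation
                grown = opaque.copy()
                grown[1:, :] |= opaque[:-1, :]
                grown[:-1, :] |= opaque[1:, :]
                grown[:, 1:] |= opaque[:, :-1]
                grown[:, :-1] |= opaque[:, 1:]
                opaque = grown
            return opaque

        # A small frame with a virtual object drawn near the centre.
        frame = np.zeros((8, 8, 4))
        frame[3:5, 3:5, 3] = 1.0
        print(opacity_mask(frame).astype(int))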
  • Patent number: 9863840
    Abstract: Systems and methods for providing a multi-direction wind tunnel, or “windball,” are disclosed. The system can have a series of fans configured to provide air flow in a plurality of directions to enable accurate testing of aircraft, unmanned aerial vehicles (UAVs), and other vehicles capable of multi-dimensional flight. The system can comprise a spherical or polyhedral test chamber with a plurality of fans. The fans can be arranged in pairs, such that a first fan comprises an intake fan and a second fan comprises an exhaust fan. The direction of the air flow can be controlled by activating one or more pairs of fans, each pair of fans creating a portion of the air flow in a particular direction. The direction of the air flow can also be controlled by rotating one or more pairs of fans with respect to the test chamber on a gimbal device or a similar mechanism.
    Type: Grant
    Filed: December 22, 2014
    Date of Patent: January 9, 2018
    Inventors: Brian C. Beckman, Avi Bar-Zeev, Steven Gregory Dunn, Amir Navot
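    Illustration: a minimal sketch of how activating fan pairs could compose an overall flow direction, assuming one intake/exhaust pair per axis; the layout, names, and throttle model are hypothetical rather than the patented design.
        import numpy as np

        # Hypothetical layout: three orthogonal intake/exhaust fan pairs.
        FAN_PAIR_AXES = {
            "x_pair": np.array([1.0, 0.0, 0.0]),
            "y_pair": np.array([0.0, 1.0, 0.0]),
            "z_pair": np.array([0.0, 0.0, 1.0]),
        }

        def net_flow_direction(pair_throttles):
            """Approximate the chamber's air-flow direction as the
            throttle-weighted sum of the active fan-pair axes."""
            flow = sum(FAN_PAIR_AXES[name] * t for name, t in pair_throttles.items())
            norm = np.linalg.norm(flow)
            return flow / norm if norm > 0 else flow

        # The x pair at full power plus the y pair at half power yields a
        # flow direction between the two axes.
        print(net_flow_direction({"x_pair": 1.0, "y_pair": 0.5}))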
  • Patent number: 9786187
    Abstract: A transportation network is provided that utilizes autonomous vehicles (e.g., unmanned aerial vehicles) for identifying, acquiring, and transporting items between network locations without requiring human interaction. A travel path for an item through the transportation network may include a passing of the item from one autonomous vehicle to another or otherwise utilizing different autonomous vehicles for transporting the item along different path segments (e.g., between different network locations). Different possible travel paths through the transportation network may be evaluated, and a travel path for an item may be selected based on transportation factors such as travel time, cost, safety, etc., which may include consideration of information regarding current conditions (e.g., related to network congestion, inclement weather, etc.). Autonomous vehicles of different sizes, carrying capacities, travel ranges, travel speeds, etc.
    Type: Grant
    Filed: June 9, 2015
    Date of Patent: October 10, 2017
    Assignee: Amazon Technologies, Inc.
    Inventors: Avi Bar-Zeev, Brian C. Beckman, Daniel Buchmueller, Steven Gregory Dunn, Gur Kimchi
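    Illustration: a minimal sketch of evaluating candidate travel paths segment by segment, blending transportation factors into a single cost; the network, factor names, and weights are hypothetical, and the search is ordinary Dijkstra rather than anything specific to the patent.
        import heapq

        def best_path(graph, start, goal, weights):
            """Return the lowest-cost path where each segment's factors
            (time, dollars, risk) are blended by the given weights."""
            frontier = [(0.0, start, [start])]
            seen = set()
            while frontier:
                cost, node, path = heapq.heappop(frontier)
                if node == goal:
                    return path, cost
                if node in seen:
                    continue
                seen.add(node)
                for nxt, factors in graph.get(node, {}).items():
                    step = sum(weights[k] * factors[k] for k in weights)
                    heapq.heappush(frontier, (cost + step, nxt, path + [nxt]))
            return None, float("inf")

        # Hypothetical network: hand the item off at a relay, or fly direct.
        network = {
            "hub":   {"relay": {"time": 10, "dollars": 1.0, "risk": 0.1},
                      "home":  {"time": 25, "dollars": 0.8, "risk": 0.4}},
            "relay": {"home":  {"time": 8,  "dollars": 0.9, "risk": 0.1}},
        }
        print(best_path(network, "hub", "home", {"time": 1.0, "dollars": 2.0, "risk": 50.0}))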
  • Patent number: 9741255
    Abstract: Described is an airborne monitoring station (“AMS”) for use in monitoring a coverage area and/or unmanned aerial vehicles (“UAVs”) positioned within a coverage area of the AMS. For example, the AMS may be an airship that remains at a high altitude (e.g., 45,000 feet) and monitors a coverage area within its line-of-sight. As UAVs enter, navigate within, and exit the coverage area, the AMS may wirelessly communicate with the UAVs, facilitate communication between the UAVs and one or more remote computing resources, and/or monitor a position of the UAVs.
    Type: Grant
    Filed: May 28, 2015
    Date of Patent: August 22, 2017
    Assignee: Amazon Technologies, Inc.
    Inventors: Amir Navot, Gur Kimchi, Brandon William Porter, Avi Bar-Zeev, Daniel Buchmueller
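    Illustration: the abstract's line-of-sight coverage can be bounded with the standard horizon-distance formula; this back-of-the-envelope calculation is an editor's sketch, not part of the patent.
        import math

        EARTH_RADIUS_M = 6_371_000

        def line_of_sight_radius_m(altitude_m):
            """Distance to the horizon from a given altitude over a smooth
            spherical Earth, ignoring refraction: d = sqrt(2*R*h + h^2)."""
            return math.sqrt(2 * EARTH_RADIUS_M * altitude_m + altitude_m ** 2)

        # An airship holding station near 45,000 ft (~13,716 m) can, in the
        # ideal case, see UAVs out to a few hundred kilometres.
        altitude_m = 45_000 * 0.3048
        print(f"~{line_of_sight_radius_m(altitude_m) / 1000:.0f} km line-of-sight radius")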
  • Patent number: 9710973
    Abstract: A system that includes a head mounted display device and a processing unit connected to the head mounted display device is used to fuse virtual content into real content. In one embodiment, the processing unit is in communication with a hub computing device. The processing unit and hub may collaboratively determine a map of the mixed reality environment. Further, state data may be extrapolated to predict a field of view for a user in the future at a time when the mixed reality is to be displayed to the user. This extrapolation can remove latency from the system.
    Type: Grant
    Filed: May 23, 2016
    Date of Patent: July 18, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Avi Bar-Zeev, J. Andrew Goossen, John Tardif, Mark S. Grossman, Harjit Singh
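    Illustration: a minimal sketch of the extrapolation idea, predicting head orientation at the moment the frame will be displayed from recent tracker samples; a single yaw angle and constant angular velocity are simplifying assumptions, not the patent's state model.
        def predict_yaw(samples, display_time):
            """Extrapolate head yaw to the time the frame will be shown,
            assuming constant angular velocity between the two newest samples."""
            (t0, yaw0), (t1, yaw1) = samples[-2], samples[-1]
            velocity = (yaw1 - yaw0) / (t1 - t0)
            return yaw1 + velocity * (display_time - t1)

        # Two tracker readings 10 ms apart; the frame lights up 20 ms after the
        # latest one, so we render for where the head is expected to be then.
        history = [(0.000, 30.0), (0.010, 31.5)]
        print(predict_yaw(history, 0.030))  # -> 34.5 degrees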
  • Patent number: 9690099
    Abstract: A method and system that enhance a user's experience when using a near eye display device, such as a see-through display device or a head mounted display device, are provided. An optimized image for display relative to a field of view of a user in a scene is created. The user's head and eye position and movement are tracked to determine a focal region for the user. A portion of the optimized image is coupled to the user's focal region at the current position of the eyes, a next position of the head and eyes is predicted, and a portion of the optimized image is coupled to the user's focal region at that next position.
    Type: Grant
    Filed: December 17, 2010
    Date of Patent: June 27, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Avi Bar-Zeev, John R. Lewis, Georg Klein
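    Illustration: a minimal sketch of coupling a portion of the optimized image to the tracked focal region, here reduced to clamping a fixed-size window around a gaze point; the window size and coordinates are hypothetical.
        def focal_window(gaze_px, image_size, window=256):
            """Return the (x, y, w, h) of a square window of the optimized
            image centred on the gaze point, clamped to the image bounds."""
            w, h = image_size
            x = min(max(gaze_px[0] - window // 2, 0), w - window)
            y = min(max(gaze_px[1] - window // 2, 0), h - window)
            return x, y, window, window

        # Gaze near the right edge of a 1920x1080 frame: the window is clamped
        # so it stays inside the image.
        print(focal_window((1900, 540), (1920, 1080)))  # -> (1664, 412, 256, 256)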
  • Publication number: 20170110017
    Abstract: This disclosure describes an unmanned aerial vehicle (“UAV”) configured to autonomously deliver items of inventory to various destinations. The UAV may receive inventory information and a destination location and autonomously retrieve the inventory from a location within a materials handling facility, compute a route from the materials handling facility to a destination and travel to the destination to deliver the inventory.
    Type: Application
    Filed: December 28, 2016
    Publication date: April 20, 2017
    Inventors: Gur Kimchi, Daniel Buchmueller, Scott A. Green, Brian C. Beckman, Scott Isaacs, Amir Navot, Fabian Hensel, Avi Bar-Zeev, Severan Sylvain Jean-Michel Rault
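    Illustration: a minimal sketch of one step in route computation, checking round-trip feasibility against a range budget using great-circle distance; the coordinates and range figure are hypothetical, and the patent's routing is not limited to this check.
        import math

        def great_circle_km(a, b):
            """Great-circle distance between two (lat, lon) points in degrees."""
            lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
            h = (math.sin((lat2 - lat1) / 2) ** 2
                 + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
            return 2 * 6371 * math.asin(math.sqrt(h))

        def can_deliver(facility, destination, range_km):
            """Round-trip feasibility check before committing a UAV to a route."""
            return 2 * great_circle_km(facility, destination) <= range_km

        # Hypothetical facility and customer coordinates with a 24 km range budget.
        print(can_deliver((47.62, -122.33), (47.68, -122.20), range_km=24))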
  • Patent number: 9588341
    Abstract: The technology provides an augmented reality display system for displaying a virtual object to be in focus when viewed by a user. In one embodiment, the focal region of the user is tracked, and a virtual object within the user focal region is displayed to appear in the focal region. As the user changes focus between virtual objects, they appear to naturally move in and out of focus as real objects in a physical environment would. The change of focus for the virtual object images is caused by changing a focal region of light processing elements in an optical path of a microdisplay assembly of the augmented reality display system. In some embodiments, a range of focal regions is swept through at a sweep rate by adjusting the elements in the optical path of the microdisplay assembly.
    Type: Grant
    Filed: February 12, 2016
    Date of Patent: March 7, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Avi Bar-Zeev, John Lewis
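    Illustration: a minimal sketch of sweeping the focal region through a range and noting which virtual objects fall into each slice; the depth range, step count, and in-focus tolerance are hypothetical.
        def sweep_focal_regions(object_depths_m, near=0.5, far=4.0, steps=6):
            """Step the focal region from near to far and list the virtual
            objects closest to each focal distance."""
            step = (far - near) / (steps - 1)
            schedule = []
            for i in range(steps):
                focus = near + i * step
                in_focus = [d for d in object_depths_m if abs(d - focus) <= step / 2]
                schedule.append((round(focus, 2), in_focus))
            return schedule

        # Three virtual objects at different apparent depths.
        for focus, objs in sweep_focal_regions([0.6, 1.8, 3.5]):
            print(f"focal region {focus} m -> draw {objs}")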
  • Patent number: 9573684
    Abstract: This disclosure describes an unmanned aerial vehicle (“UAV”) configured to autonomously deliver items of inventory to various destinations. The UAV may receive inventory information and a destination location and autonomously retrieve the inventory from a location within a materials handling facility, compute a route from the materials handling facility to a destination and travel to the destination to deliver the inventory.
    Type: Grant
    Filed: September 30, 2014
    Date of Patent: February 21, 2017
    Assignee: Amazon Technologies, Inc.
    Inventors: Gur Kimchi, Daniel Buchmueller, Scott A. Green, Brian C. Beckman, Scott Isaacs, Amir Navot, Fabian Hensel, Avi Bar-Zeev, Severan Sylvain Jean-Michel Rault
  • Patent number: 9563326
    Abstract: Information may be presented to a user in a way that reflects an awareness of the user's current situation. The relationship between the user's situation and various people and things may be analyzed to determine the user's proximity to those people and things. (Proximity may refer not only to geographic proximity, but also to temporal proximity, relevance proximity, etc.) A user interface may show people and things at different levels of proximity to the user's current situation, with the level of proximity being represented visually. The user may reposition the center of focus to one of the people or things depicted. When the center is repositioned, the level of proximity of people and things may be shown relative to the new center of focus, filtered based on existing relationships of those people and things to the user.
    Type: Grant
    Filed: October 18, 2012
    Date of Patent: February 7, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Avi Bar-Zeev, Gonzalo A. Ramos, Michael Chowning Byron
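    Illustration: a minimal sketch of blending geographic, temporal, and relevance proximity into one score used to order people and things around the center of focus; the weights and example entities are hypothetical.
        def proximity(entity, weights):
            """Blend per-dimension proximity values (each in [0, 1], where 1 is
            closest to the user's current situation) into a single score."""
            return sum(weights[k] * entity[k] for k in weights)

        WEIGHTS = {"geo": 0.4, "time": 0.3, "relevance": 0.3}
        people_and_things = {
            "teammate":            {"geo": 0.9, "time": 0.8,  "relevance": 0.9},
            "dentist appointment": {"geo": 0.2, "time": 0.95, "relevance": 0.6},
            "vacation photo":      {"geo": 0.1, "time": 0.05, "relevance": 0.3},
        }
        for name, entity in sorted(people_and_things.items(),
                                   key=lambda kv: proximity(kv[1], WEIGHTS),
                                   reverse=True):
            print(name, round(proximity(entity, WEIGHTS), 2))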
  • Patent number: 9522330
    Abstract: A method for providing three-dimensional audio is provided. The method includes receiving a depth map imaging a scene from a depth camera and recognizing a human subject present in the scene. The human subject is modeled with a virtual skeleton comprising a plurality of joints defined with a three-dimensional position. A world space ear position of the human subject is determined based on the virtual skeleton. Furthermore, a target world space ear position of the human subject is determined. The target world space ear position is the world space position where a desired audio effect can be produced via an acoustic transducer array. The method further includes outputting a notification representing a spatial relationship between the world space ear position and the target world space ear position.
    Type: Grant
    Filed: December 21, 2012
    Date of Patent: December 20, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Shawn Pile, Jon Vincent, Scott Henson, Jason Flaks, Avi Bar-Zeev, John Tardif
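    Illustration: a minimal sketch of estimating world-space ear positions from a skeleton's head joint and facing direction, then reporting the offset to a target listening position; the ear spacing, axis convention, and coordinates are hypothetical.
        import math

        def ear_positions(head_pos, head_yaw_deg, half_width=0.09):
            """Offset the head-joint position sideways to approximate the
            left and right ear positions in world space."""
            yaw = math.radians(head_yaw_deg)
            right = (math.cos(yaw), 0.0, -math.sin(yaw))  # subject's right, x/z plane
            left_ear = tuple(h - half_width * r for h, r in zip(head_pos, right))
            right_ear = tuple(h + half_width * r for h, r in zip(head_pos, right))
            return left_ear, right_ear

        def offset_to_target(ear_pos, target_pos):
            """Vector from the current ear position to the position where the
            transducer array can produce the desired effect."""
            return tuple(t - e for e, t in zip(ear_pos, target_pos))

        head = (0.0, 1.6, 2.0)  # metres, e.g. from the depth-camera skeleton
        left_ear, _ = ear_positions(head, head_yaw_deg=0.0)
        print(offset_to_target(left_ear, (-0.5, 1.6, 2.0)))  # "step ~0.4 m to the left"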
  • Publication number: 20160342432
    Abstract: The claimed subject matter relates to an architecture that can provide for a second-person avatar. The second-person avatar can rely upon a second-person-based perspective such that the avatar is displayed to appear to encompass all or portions of a target user. Accordingly, actions or a configuration of the avatar can serve as a model or demonstration for the user in order to aid the user in accomplishing a particular task. Updates to avatar activity or configuration can be provided by a dynamic virtual handbook. The virtual handbook can be constructed based upon a set of instructions associated with accomplishing the desired task and further based upon features or aspects of the user as well as those of the local environment.
    Type: Application
    Filed: August 4, 2016
    Publication date: November 24, 2016
    Inventors: Eyal Ofek, Blaise H. Aguera y Arcas, Avi Bar-Zeev, Gur Kimchi, Jason Szabo
  • Patent number: 9503505
    Abstract: Embodiments enable the evaluation of injected queries within a monad. One or more operators with closures are received from a first process. The operators with closures represent one or more functions to be applied by a second process. The second process evaluates the received operators with closures to apply the functions within the monad. During evaluation, the second process converts the closures to simply typed closures. Further, the second process binds the converted closures within the monad to restrict execution of the functions. In some embodiments, the queries (e.g., sequences of one or more operators with closures) are composed using a set of query operators from the language integrated query (LINQ) framework encoded in uniform resource locators (URLs) in the representational state transfer (REST) style.
    Type: Grant
    Filed: June 25, 2015
    Date of Patent: November 22, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian Beckman, Elad Gerson, Gur Kimchi, Avi Bar-Zeev, Selvi Chennai, Henricus Johannes Maria Meijer
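    Illustration: a loose Python analogy, not the patented monadic machinery, showing the core restriction idea: an injected sequence of operators with closures is only evaluated through a whitelist, so arbitrary functions cannot run in the second process; the operator table and URL example are hypothetical.
        # Whitelisted query operators the receiving process will evaluate.
        SAFE_OPERATORS = {
            "where":  lambda seq, pred: [x for x in seq if pred(x)],
            "select": lambda seq, proj: [proj(x) for x in seq],
            "take":   lambda seq, n: seq[:n],
        }

        def evaluate(query, source):
            """Apply an injected sequence of (operator, closure) pairs to the
            source data, binding each step so only whitelisted operators run."""
            result = source
            for op_name, closure in query:
                if op_name not in SAFE_OPERATORS:
                    raise ValueError(f"operator {op_name!r} is not permitted")
                result = SAFE_OPERATORS[op_name](result, closure)
            return result

        # e.g. decoded from a LINQ-style query carried in a REST URL such as
        # /items?$filter=price lt 10&$top=2
        injected = [("where", lambda item: item["price"] < 10), ("take", 2)]
        print(evaluate(injected, [{"price": 4}, {"price": 12}, {"price": 7}, {"price": 2}]))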
  • Publication number: 20160267717
    Abstract: A system that includes a head mounted display device and a processing unit connected to the head mounted display device is used to fuse virtual content into real content. In one embodiment, the processing unit is in communication with a hub computing device. The processing unit and hub may collaboratively determine a map of the mixed reality environment. Further, state data may be extrapolated to predict a field of view for a user in the future at a time when the mixed reality is to be displayed to the user. This extrapolation can remove latency from the system.
    Type: Application
    Filed: May 23, 2016
    Publication date: September 15, 2016
    Inventors: Avi Bar-Zeev, J. Andrew Goossen, John Tardif, Mark S. Grossman, Harjit Singh
  • Publication number: 20160261720
    Abstract: In server/client architectures, the server application and client applications are often developed in different languages and execute in different environments specialized for the different contexts of each application (e.g., low-level, performant, platform-specialized, and stateless instructions on the server, and high-level, flexible, platform-agnostic, and stateful languages on the client) and are often executed on different devices. Convergence of these environments (e.g., server-side JavaScript using Node.js) enables the provision of a server that services client applications executing on the same device. The local server may monitor local events occurring on the device, and may execute one or more server scripts associated with particular local events on behalf of local clients subscribing to the local event (e.g., via a subscription model).
    Type: Application
    Filed: May 17, 2016
    Publication date: September 8, 2016
    Inventors: Avi Bar-Zeev, Gur Kimchi, Brian C. Beckman, Scott Isaacs, Meir Ben-Itay, Eran Yariv, Blaise Aguera y Arcas
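    Illustration: a minimal sketch of the subscription model described above, written in Python rather than the server-side JavaScript the abstract mentions; the class and event names are hypothetical.
        from collections import defaultdict

        class LocalEventServer:
            """Minimal local pub/sub: clients on the same device subscribe a
            handler (a "server script") to a named local event, and the server
            runs every matching handler when that event occurs."""

            def __init__(self):
                self._handlers = defaultdict(list)

            def subscribe(self, event_name, handler):
                self._handlers[event_name].append(handler)

            def publish(self, event_name, payload):
                for handler in self._handlers[event_name]:
                    handler(payload)

        server = LocalEventServer()
        server.subscribe("battery.low", lambda p: print("dim screen, level:", p["level"]))
        server.publish("battery.low", {"level": 0.14})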
  • Patent number: 9436276
    Abstract: The claimed subject matter relates to an architecture that can provide for a second-person avatar. The second-person avatar can rely upon a second-person-based perspective such that the avatar is displayed to appear to encompass all or portions of a target user. Accordingly, actions or a configuration of the avatar can serve as a model or demonstration for the user in order to aid the user in accomplishing a particular task. Updates to avatar activity or configuration can be provided by a dynamic virtual handbook. The virtual handbook can be constructed based upon a set of instructions associated with accomplishing the desired task and further based upon features or aspects of the user as well as those of the local environment.
    Type: Grant
    Filed: February 25, 2009
    Date of Patent: September 6, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Eyal Ofek, Blaise H. Aguera y Arcas, Avi Bar-Zeev, Gur Kimchi, Jason Szabo
  • Publication number: 20160189432
    Abstract: An augmented reality system provides improved focus of real and virtual objects. A see-through display device includes a variable focus lens a user looks through. A focal region adjustment unit automatically focuses the variable focus lens in a current user focal region. A microdisplay assembly attached to the see-through display device generates a virtual object for display in the user's current focal region by adjusting its focal region. The variable focus lens may also be adjusted to provide one or more zoom features. Visual enhancement of an object may also be provided to improve a user's perception of an object.
    Type: Application
    Filed: March 3, 2016
    Publication date: June 30, 2016
    Inventors: Avi Bar-Zeev, John Lewis
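    Illustration: a minimal sketch relating the user's tracked focal distance to the optical power the variable focus lens would need, using the standard 1/distance relation; the zoom offset is a hypothetical simplification of the zoom feature mentioned above.
        def lens_power_diopters(focal_distance_m, zoom_offset_d=0.0):
            """Optical power needed so the plane the user is fixating appears
            sharp; an optional offset stands in for a simple zoom adjustment."""
            return 1.0 / focal_distance_m + zoom_offset_d

        # Eye tracking reports the user fixating a point 0.5 m away.
        print(f"{lens_power_diopters(0.5):.1f} D")                      # reading distance
        print(f"{lens_power_diopters(2.0, zoom_offset_d=0.5):.2f} D")   # with a zoom offset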
  • Publication number: 20160171779
    Abstract: An optical see-through head-mounted display device includes a see-through lens which combines an augmented reality image with light from a real-world scene, while an opacity filter is used to selectively block portions of the real-world scene so that the augmented reality image appears more distinctly. The opacity filter can be a see-through LCD panel, for instance, where each pixel of the LCD panel can be selectively controlled to be transmissive or opaque, based on a size, shape and position of the augmented reality image. Eye tracking can be used to adjust the position of the augmented reality image and the opaque pixels. Peripheral regions of the opacity filter, which are not behind the augmented reality image, can be activated to provide a peripheral cue or a representation of the augmented reality image. In another aspect, opaque pixels are provided at a time when an augmented reality image is not present.
    Type: Application
    Filed: February 12, 2016
    Publication date: June 16, 2016
    Inventors: Avi Bar-Zeev, Bob Crocco, Alex Kipman, John Lewis
  • Publication number: 20160161740
    Abstract: The technology provides an augmented reality display system for displaying a virtual object to be in focus when viewed by a user. In one embodiment, the focal region of the user is tracked, and a virtual object within the user focal region is displayed to appear in the focal region. As the user changes focus between virtual objects, they appear to naturally move in and out of focus as real objects in a physical environment would. The change of focus for the virtual object images is caused by changing a focal region of light processing elements in an optical path of a microdisplay assembly of the augmented reality display system. In some embodiments, a range of focal regions is swept through at a sweep rate by adjusting the elements in the optical path of the microdisplay assembly.
    Type: Application
    Filed: February 12, 2016
    Publication date: June 9, 2016
    Inventors: Avi Bar-Zeev, John Lewis