Patents Assigned to Robotical Ltd.
  • Patent number: 10946981
    Abstract: An unmanned aerial vehicle (UAV) ground station, comprising: a landing surface having a perimeter and a center; a plurality of pushers held above the landing surface by a plurality of linear actuators; at least one electro-mechanical connector attached to one of the plurality of pushers, mechanically adapted to be electrically connected to a compatible electro-mechanical connector of a UAV; and a landing detection controller adapted to instruct the plurality of linear actuators to move the plurality of pushers simultaneously from the perimeter toward the center when a landing event related to the UAV is detected.
    Type: Grant
    Filed: November 13, 2019
    Date of Patent: March 16, 2021
    Assignee: Percepto Robotics Ltd
    Inventors: Raviv Raz, Jonathan Jaffe, Sagi Blonder
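The centering behavior claimed above can be illustrated in code. This is a minimal sketch, not the patented implementation: it assumes pushers start evenly spaced on the perimeter and that each landing event drives all of them simultaneously a fractional step toward the center.

```python
import math

class GroundStationController:
    """Sketch of a landing detection controller: on a landing event,
    every pusher is driven simultaneously from the perimeter of the
    landing surface toward its center."""

    def __init__(self, num_pushers, perimeter_radius):
        # Pushers start evenly spaced around the perimeter.
        self.positions = [
            (perimeter_radius * math.cos(2 * math.pi * i / num_pushers),
             perimeter_radius * math.sin(2 * math.pi * i / num_pushers))
            for i in range(num_pushers)
        ]

    def on_landing_event(self, step=0.1):
        """Move every pusher one fractional step toward the center (0, 0)."""
        self.positions = [
            (x * (1 - step), y * (1 - step)) for x, y in self.positions
        ]
        return self.positions
```

Repeated landing-event steps converge all pushers on the center, which is what would nudge a landed UAV's connector into alignment with the station's connector.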
  • Patent number: 10939962
    Abstract: A system and method for verifying the accurate insertion positioning of a robotically guided surgical tool or probe, at its cranial target region, such as for Deep Brain Stimulation. A head mounted robot aligns a probe or tool guiding sleeve, together with an aiming rod attached at a predefined position and angle to the guiding sleeve. The aiming rod incorporates apertures through which an X-ray system can view the patient's skull. The aiming rod is attached to the tool guiding sleeve at an angle and position calculated such that the line of sight through the apertures falls exactly on the target region when the tool or probe is inserted to its predetermined depth. If the tip of the tool or probe is seen located at the center of the apertures in the X-ray image, verification is obtained that the insertion procedure has been performed accurately.
    Type: Grant
    Filed: April 4, 2016
    Date of Patent: March 9, 2021
    Assignee: Mazor Robotics Ltd.
    Inventors: Aviv Ellman, Eli Zehavi
  • Publication number: 20210040757
    Abstract: Disclosed is a method for controlling cleaning of at least a portion of a building facade (104a). A multi-dimensional map of a facade (104a) to be cleaned from an elevator platform (200) of an elevator system is received. According to the map, an ordered sequence of instructions is determined. The instructions comprise robotic arm instructions and elevator platform instructions that are temporally intertwined to execute a cleaning pattern covering at least part of the facade (104a). The robotic arm instructions are forwarded to control one or more robotic arms (206) and the elevator platform instructions are forwarded to control the elevation of the elevator platform (200). A corresponding device (221, 213) and non-transient memory (223), and a system having the device, are also disclosed.
    Type: Application
    Filed: February 7, 2019
    Publication date: February 11, 2021
    Applicant: Skyline Robotics Ltd.
    Inventors: Avi ABADI, Yaron SCHWARCZ
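The "temporally intertwined" instruction sequence can be sketched as follows. This is an illustrative assumption, not the patented planner: the facade map is reduced to a grid, the arm sweeps each row in a back-and-forth (boustrophedon) pattern, and an elevator instruction is interleaved between rows.

```python
def plan_cleaning(rows, cols):
    """Build an ordered sequence of robotic-arm and elevator-platform
    instructions covering a rows x cols facade grid: the arm cleans one
    row cell by cell, then the platform descends one row, alternating
    sweep direction to minimize arm travel."""
    sequence = []
    for r in range(rows):
        cells = range(cols) if r % 2 == 0 else reversed(range(cols))
        for c in cells:
            sequence.append(("arm", r, c))        # clean one facade cell
        if r < rows - 1:
            sequence.append(("elevator", r + 1))  # move platform to next row
    return sequence
```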
  • Publication number: 20210026366
    Abstract: Systems and methods for optimizing sensory signal capturing by reconfiguring robotic device configurations. In an implementation, upcoming sensor readings are predicted based on navigation path data. Based on the predicted sensor readings, optimized sensor parameters that will improve the quality of resulting sensory signals are determined. Sensors of a robotic device are reconfigured based on the optimized sensor parameters. In another implementation, deficiencies in sensory signals are determined. A motion configuration of a robotic device is optimized based on the deficiencies to allow for capturing better quality sensory signals.
    Type: Application
    Filed: October 8, 2020
    Publication date: January 28, 2021
    Applicant: R-Go Robotics LTD.
    Inventor: Nizan HORESH
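One concrete instance of "optimized sensor parameters from predicted readings" would be exposure control: if the navigation path predicts the scene brightness ahead, the camera exposure can be adjusted before arriving. The function below is a hypothetical sketch with an assumed proportional update and an assumed exposure range.

```python
def optimize_exposure(predicted_brightness, current_exposure,
                      target=0.5, gain=0.5):
    """Given a predicted scene brightness (0..1) along the upcoming
    navigation path, nudge the camera exposure toward a target level so
    signal quality improves before the robot gets there."""
    error = target - predicted_brightness
    new_exposure = current_exposure * (1 + gain * error)
    # Clamp to the sensor's supported range (assumed 0.001 .. 1.0 here).
    return max(0.001, min(1.0, new_exposure))
```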
  • Publication number: 20200406467
    Abstract: A method for adaptively adjusting a user experience of interacting with an electronic device. A method includes: collecting, using at least one sensor, data related to an interaction of a user with a feature of an electronic device; activating a timer to measure a response time of the user to the feature; analyzing the collected data to determine responsiveness of the user to the feature; and adjusting a user experience parameter of the electronic device based on the determined responsiveness.
    Type: Application
    Filed: June 26, 2020
    Publication date: December 31, 2020
    Applicant: Intuition Robotics, Ltd.
    Inventors: Shay ZWEIG, Roy AMIR, Itai MENDELSOHN, Dor SKULER
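The claimed flow (timer starts at presentation, stops at response, a UX parameter is adjusted) can be sketched as below. The exponential-moving-average update and the prompt-timeout parameter are illustrative assumptions; timestamps are injected rather than read from a clock for clarity.

```python
class ResponsivenessMonitor:
    """Start a timer when a feature is presented, measure the user's
    response time, and adapt a UX parameter (here, a prompt timeout):
    slow responders get a longer timeout, fast responders a shorter one."""

    def __init__(self, timeout_s=5.0):
        self.timeout_s = timeout_s
        self._start = None

    def feature_presented(self, now):
        self._start = now  # timer activation

    def user_responded(self, now):
        response_time = now - self._start
        # Blend the old timeout with twice the observed response time.
        self.timeout_s = 0.8 * self.timeout_s + 0.2 * (2 * response_time)
        return response_time
```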
  • Publication number: 20200406903
    Abstract: A controller and method for real-time customization of presentation features of a vehicle. A method includes collecting a first dataset about a knowledge level of an operator of the vehicle, wherein the first dataset is collected with respect to a feature of the vehicle; collecting, using at least one sensor, a second dataset regarding an external environment of the vehicle and a cabin of the vehicle; determining, based on the first dataset and the second dataset, a presentation feature from a plurality of presentation features associated with the feature of the vehicle; customizing the presentation feature based on at least the first dataset, wherein the customization is performed in real-time when the operator operates the vehicle; and presenting the presentation feature to the operator of the vehicle.
    Type: Application
    Filed: June 26, 2020
    Publication date: December 31, 2020
    Applicant: Intuition Robotics, Ltd.
    Inventors: Shay ZWEIG, Roy AMIR, Itai MENDELSOHN, Dor SKULER
  • Publication number: 20200410317
    Abstract: A system and method for real-time customization of presentation features of a social robot. A method includes collecting a first dataset regarding a knowledge level of a user of the social robot with respect to at least one feature of the social robot; collecting a second dataset, wherein the second dataset is collected from at least an environment of the social robot; determining, based on the first dataset and the second dataset, at least one presentation feature from a plurality of presentation features; selecting a first presentation feature of the at least one presentation feature; customizing the selected first presentation feature based on at least the first dataset; and presenting in real-time the customized presentation feature, wherein the presentation is performed using at least one electronic component of the social robot.
    Type: Application
    Filed: June 26, 2020
    Publication date: December 31, 2020
    Applicant: Intuition Robotics, Ltd.
    Inventors: Shay ZWEIG, Roy AMIR, Itai MENDELSOHN, Dor SKULER
  • Publication number: 20200393291
    Abstract: A sensor comprising a whisker shaft and a follicle is provided. The shaft has a root end and a tip end and the shaft tapers from the root end to the tip end so that the root end is wider and the tip end is narrower. The root end is pivotably mounted in the follicle.
    Type: Application
    Filed: June 10, 2020
    Publication date: December 17, 2020
    Applicant: Bristol Maritime Robotics Ltd
    Inventors: Richard Sewell, Thomas Rooney
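The taper described above (wider root, narrower tip) can be written down directly. The linear taper profile below is an assumption for illustration; the claim only requires that the shaft narrows from root to tip.

```python
def shaft_radius(s, length, root_radius, tip_radius):
    """Radius of the tapered whisker shaft at arc-length s from the root
    end, assuming a linear taper from the wider root to the narrower tip."""
    if not 0 <= s <= length:
        raise ValueError("s must lie on the shaft")
    t = s / length
    return root_radius + t * (tip_radius - root_radius)
```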
  • Publication number: 20200394807
    Abstract: Systems and methods for monitoring movements. A method includes detecting motion based on first localization data related to a localization device moving in a distinct motion pattern, wherein the first localization data is based on sensor readings captured by at least one sensor; correlating the detected motion to a known motion of the localization device based on respective times of the first localization data and of the localization device; localizing the localization device with respect to a map based on the correlation; tracking at least one first location of an object based on second localization data captured by the at least one sensor, wherein the at least one first location is on the map, wherein the tracking further comprises identifying at least one second location of the object based on the second localization data and determining the at least one first location based on the at least one second location.
    Type: Application
    Filed: August 27, 2020
    Publication date: December 17, 2020
    Applicant: R-Go Robotics LTD.
    Inventors: Nizan HORESH, Amir BOUSANI, Assaf AGMON
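The correlation step (matching the detected motion against the device's known motion by their respective times) can be sketched as a lag search. This is a simplified 1-D dot-product correlation, not the patented method:

```python
def best_lag(observed, known, max_lag):
    """Correlate a motion signal detected from sensor readings with the
    known motion of the localization device by testing time lags and
    keeping the one with the highest dot-product score."""
    def score(lag):
        s = 0.0
        for i, o in enumerate(observed):
            j = i + lag
            if 0 <= j < len(known):
                s += o * known[j]
        return s
    return max(range(-max_lag, max_lag + 1), key=score)
```

The winning lag tells the tracker where in the known motion pattern the observed samples belong, which is the alignment needed to localize the device on the map.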
  • Patent number: 10775786
    Abstract: A method for emulating control signals for controlling an unmanned aerial vehicle (UAV). The method comprises using an integrated circuit installed on the UAV and communicatively coupled to a flight control processor and a companion processor for receiving flight control parameters from the companion processor, where the flight control parameters are based on data from a plurality of sensors communicatively coupled to the companion processor and mounted on the UAV, wirelessly receiving remote control signals from a remote control unit, modulating the flight control parameters to create emulated signals emulating the signals generated by a remote control designated to wirelessly control the UAV, the emulated signals encoding instructions to maneuver the UAV, and switching between the remote control signals and the emulated signals during a flight time of the UAV.
    Type: Grant
    Filed: July 11, 2018
    Date of Patent: September 15, 2020
    Assignee: Percepto Robotics Ltd
    Inventors: Dor Abuhasira, Sagi Blonder, Raviv Raz
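The in-flight switching between real and emulated remote-control signals can be sketched as a multiplexer. The 1000-2000 µs PPM-style channel encoding below is a common RC convention used here as an assumption, not a detail from the patent:

```python
class SignalMultiplexer:
    """Modulate flight-control parameters from the companion processor
    into emulated RC frames, and switch mid-flight between those and
    real remote-control frames."""

    def __init__(self):
        self.use_remote = False

    def switch(self, use_remote):
        self.use_remote = use_remote

    def output_frame(self, remote_frame, flight_params):
        if self.use_remote and remote_frame is not None:
            return remote_frame
        # Encode each normalized parameter (-1..1) as an emulated RC
        # channel value, clamped to a typical 1000..2000 us range.
        return [max(1000, min(2000, int(1500 + 500 * v)))
                for v in flight_params]
```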
  • Publication number: 20200216082
    Abstract: A system and method for monitoring and managing a cognitive load of an occupant of a vehicle, including: determining, based on analysis of a first set of sensory inputs received from a first set of sensors from a cabin of the vehicle, a current cognitive load score of an occupant of the vehicle; determining, based on an analysis of a second set of sensory inputs, a current state of the vehicle; analyzing the current cognitive load score of the occupant with respect to the current state of the vehicle; and selecting at least one predetermined plan for execution based on a determination that a reduction of the current cognitive load score of the occupant is desirable, wherein the determination is based on a result of the analysis of the current cognitive load score of the occupant and the current state of the vehicle.
    Type: Application
    Filed: January 8, 2020
    Publication date: July 9, 2020
    Applicant: Intuition Robotics, Ltd.
    Inventors: Roy AMIR, Itai MENDELSOHN, Dor SKULER, Shay ZWEIG
  • Publication number: 20200218263
    Abstract: A system and method for explaining actions of a vehicle, including: receiving an indication associated with at least one action of the vehicle; receiving a data input from the vehicle, wherein the data input is associated with at least one of: an external environment of the vehicle and an internal environment of the vehicle; analyzing the indication with respect to the data input to determine when it is desirable to provide an explanation for the at least one action of the vehicle; selecting, when it is desirable to provide an explanation for the at least one action, a predetermined plan including an explanation of the at least one action, wherein the selection is based on the analysis of the indication with respect to the data input; and executing the predetermined plan.
    Type: Application
    Filed: January 8, 2020
    Publication date: July 9, 2020
    Applicant: Intuition Robotics, Ltd.
    Inventors: Roy AMIR, Itai MENDELSOHN, Dor SKULER, Shay ZWEIG
  • Publication number: 20200219608
    Abstract: A system and method for monitoring and managing a cognitive load of a person, including: determining, based on an analysis of at least one input data associated with a person, a current cognitive load score of the person; determining, based on the analysis of the at least one input data and the determined current cognitive load score, whether a reduction of the current cognitive load score of the person is desirable; and selecting at least one predetermined plan for execution when a reduction of the current cognitive load score of the person is desirable.
    Type: Application
    Filed: January 8, 2020
    Publication date: July 9, 2020
    Applicant: Intuition Robotics, Ltd.
    Inventors: Roy AMIR, Itai MENDELSOHN, Dor SKULER, Shay ZWEIG
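The plan-selection step shared by the three cognitive-load abstracts above can be sketched as a threshold test plus a choice among predetermined plans. The threshold value and the "expected load reduction" scores are illustrative assumptions:

```python
def select_plan(load_score, plans, threshold=0.7):
    """If the current cognitive load score exceeds a threshold, a
    reduction is deemed desirable and the predetermined plan with the
    largest expected load reduction is selected; otherwise no plan runs.
    `plans` maps plan name -> expected load reduction."""
    if load_score <= threshold:
        return None
    return max(plans, key=plans.get)
```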
  • Publication number: 20200219412
    Abstract: A system and method for determining a responsive action based on sensor fusion, including: performing a sensor fusion on data received from a plurality of sensors to produce output fusion data; analyzing the output fusion data to determine one or more potential actionable scenarios to be selected; determining if the one or more potential actionable scenarios are to be executed; and sending commands to one or more resources to perform the one or more potential actionable scenarios.
    Type: Application
    Filed: January 8, 2020
    Publication date: July 9, 2020
    Applicant: Intuition Robotics, Ltd.
    Inventors: Roy AMIR, Itai MENDELSOHN, Dor SKULER, Shay ZWEIG
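The fusion-to-action pipeline can be sketched end to end. Weighted averaging as the fusion step and predicate rules as the scenario test are both simplifying assumptions; the abstract does not specify either.

```python
def fuse_and_act(readings, rules):
    """Fuse per-sensor readings by weighted averaging, match the fused
    value against scenario rules, and emit commands for every scenario
    whose condition holds.
    `readings` is a list of (value, weight) pairs; `rules` maps a
    scenario name to a (predicate, command) pair."""
    total_w = sum(w for _, w in readings)
    fused = sum(v * w for v, w in readings) / total_w
    commands = [cmd for name, (pred, cmd) in rules.items() if pred(fused)]
    return fused, commands
```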
  • Publication number: 20200201360
    Abstract: There is provided a method of automatically landing a drone on a landing pad having thereon guiding-elements arranged in a pattern relative to a central region of the landing pad, comprising: receiving first image(s) captured by a camera of the drone, processing the first image(s) to compute a segmentation mask according to an estimate of a location of the landing pad, receiving second image(s) captured by the camera, processing the second image(s) according to the segmentation mask to compute a segmented region and extracting from the segmented region guiding-element(s), determining a vector for each of the extracted guiding-element(s), and aggregating the vectors to compute an estimated location of the central region of the landing pad, and navigating and landing the drone on the landing pad according to the estimated location of the central region of the landing pad.
    Type: Application
    Filed: January 13, 2020
    Publication date: June 25, 2020
    Applicant: Percepto Robotics Ltd
    Inventors: Sagi BLONDER, Ariel BENITAH, Eyal GIL-AD
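The vector-aggregation step at the heart of this abstract can be sketched directly. Representing each guiding element as a point plus a vector toward the pad center, and aggregating by averaging the vector endpoints, is an illustrative reading of the claim:

```python
def estimate_center(guiding_elements):
    """Each extracted guiding element yields a point (x, y) and a vector
    (dx, dy) pointing toward the pad center; averaging the endpoints of
    those vectors estimates the central region's location."""
    endpoints = [(x + dx, y + dy) for (x, y, dx, dy) in guiding_elements]
    n = len(endpoints)
    return (sum(p[0] for p in endpoints) / n,
            sum(p[1] for p in endpoints) / n)
```

Aggregating over many elements makes the estimate robust to a few badly segmented or occluded elements, which is the point of extracting several rather than one.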
  • Publication number: 20200160479
    Abstract: Systems and methods for providing geometric interactions via three-dimensional mapping. A method includes determining a plurality of first descriptors for a plurality of key points in a plurality of first images, wherein each first image shows a portion of a 3D environment in which a robotic device and a visual sensor are deployed; generating a 3D map of the 3D environment based on the plurality of key points and the plurality of first descriptors; determining a pose of the visual sensor based on at least one second descriptor of a second image captured by the visual sensor and the plurality of first descriptors; and determining a target action location based on at least one user input made with respect to a display of the second image and the pose of the visual sensor, wherein the target action location is a location within the 3D environment.
    Type: Application
    Filed: January 23, 2020
    Publication date: May 21, 2020
    Applicant: R-Go Robotics LTD.
    Inventors: Nizan HORESH, Amir BOUSANI
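The descriptor-matching step behind the pose determination can be sketched as nearest-neighbor search. This is a generic simplification: each second-image descriptor is paired with its closest first-image descriptor by Euclidean distance, and the resulting 2D-3D correspondences would then feed a PnP-style pose solver (not shown, and not asserted to be the patented solver).

```python
def match_descriptors(second_descs, first_descs):
    """Pair each descriptor from the second image with the index of the
    nearest first-image descriptor by squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [min(range(len(first_descs)),
                key=lambda i: dist2(d, first_descs[i]))
            for d in second_descs]
```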
  • Publication number: 20200159229
    Abstract: Techniques for controlling a robotic device in motion using image synthesis are presented. A method includes determining local motion features based on image patches included in first and second input images captured by a camera installed on the robotic device; determining a camera motion based on the local motion features, wherein the camera motion indicates a movement of the camera between capture of the first input image and capture of the second input image; determining a scene geometry based on the camera motion, wherein the scene geometry is a set of three-dimensional coordinates corresponding to two-dimensional image coordinates of the image patches; generating a single perspective synthesized image based on the camera motion and the scene geometry; detecting at least one change between the first and second input images based on the synthesized image; and modifying the motion of the robotic device based on the at least one detected change.
    Type: Application
    Filed: January 23, 2020
    Publication date: May 21, 2020
    Applicant: R-Go Robotics LTD.
    Inventors: Nizan HORESH, Amir BOUSANI
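The final detection step can be sketched simply: once a single-perspective image has been synthesized from the first frame using the camera motion and scene geometry, pixels where it differs strongly from the second frame are flagged as changes. The per-pixel threshold on nested lists is an illustrative simplification:

```python
def detect_changes(synthesized, observed, threshold=0.2):
    """Return (row, col) coordinates where the observed second image
    differs from the motion-compensated synthesized image by more than
    the threshold, indicating real scene change rather than ego-motion."""
    return [(r, c)
            for r, row in enumerate(observed)
            for c, v in enumerate(row)
            if abs(v - synthesized[r][c]) > threshold]
```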
  • Publication number: 20200079529
    Abstract: An unmanned aerial vehicle (UAV) ground station, comprising: a landing surface having a perimeter and a center; a plurality of pushers held above the landing surface by a plurality of linear actuators; at least one electro-mechanical connector attached to one of the plurality of pushers, mechanically adapted to be electrically connected to a compatible electro-mechanical connector of a UAV; and a landing detection controller adapted to instruct the plurality of linear actuators to move the plurality of pushers simultaneously from the perimeter toward the center when a landing event related to the UAV is detected.
    Type: Application
    Filed: November 13, 2019
    Publication date: March 12, 2020
    Applicant: Percepto Robotics Ltd
    Inventors: Raviv RAZ, Jonathan JAFFE, Sagi BLONDER
  • Patent number: 10571896
    Abstract: A method for defining a robotic machine task. The method comprises a) collecting a sequence of a plurality of images showing at least one demonstrator performing at least one manipulation of at least one object, b) performing an analysis of said sequence of a plurality of images to identify demonstrator body parts manipulating said at least one object and said at least one manipulation of said at least one object, c) determining at least one robotic machine movement to perform said task, and d) generating at least one motion command for instructing said robotic machine to perform said task.
    Type: Grant
    Filed: August 21, 2017
    Date of Patent: February 25, 2020
    Assignee: Deep Learning Robotics Ltd.
    Inventors: Carlos Benaim, Miriam Reiner
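Steps (c) and (d) of the method above can be sketched once the image analysis has reduced the demonstration to gripper waypoints. The (x, y, z, grasp) waypoint representation is an assumption introduced here for illustration:

```python
def generate_motion_commands(waypoints):
    """Given a demonstrated manipulation reduced to a sequence of
    gripper waypoints (x, y, z, grasp), emit move commands for each
    waypoint and grip/release commands whenever the grasp state flips."""
    commands = []
    prev_grasp = False  # assume the robot starts with an open gripper
    for x, y, z, grasp in waypoints:
        commands.append(("move_to", x, y, z))
        if grasp != prev_grasp:
            commands.append(("grip",) if grasp else ("release",))
            prev_grasp = grasp
    return commands
```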
  • Patent number: 10551852
    Abstract: There is provided a method of automatically landing a drone on a landing pad having thereon guiding-elements arranged in a pattern relative to a central region of the landing pad, comprising: receiving first image(s) captured by a camera of the drone, processing the first image(s) to compute a segmentation mask according to an estimate of a location of the landing pad, receiving second image(s) captured by the camera, processing the second image(s) according to the segmentation mask to compute a segmented region and extracting from the segmented region guiding-element(s), determining a vector for each of the extracted guiding-element(s), and aggregating the vectors to compute an estimated location of the central region of the landing pad, and navigating and landing the drone on the landing pad according to the estimated location of the central region of the landing pad.
    Type: Grant
    Filed: July 20, 2017
    Date of Patent: February 4, 2020
    Assignee: Percepto Robotics Ltd
    Inventors: Sagi Blonder, Ariel Benitah, Eyal Gil-Ad