Patents Assigned to Robotical Ltd.
-
Patent number: 10946981
Abstract: An unmanned aerial vehicle (UAV) ground station, comprising: a landing surface having a perimeter and a center; a plurality of pushers held above the landing surface by a plurality of linear actuators; at least one electro-mechanical connector attached to one of the plurality of pushers, mechanically adapted to be electrically connected to a compatible electro-mechanical connector of a UAV; and a landing detection controller adapted to instruct the plurality of linear actuators to move the plurality of pushers simultaneously from the perimeter toward the center when a landing event related to the UAV is detected.
Type: Grant
Filed: November 13, 2019
Date of Patent: March 16, 2021
Assignee: Percepto Robotics Ltd
Inventors: Raviv Raz, Jonathan Jaffe, Sagi Blonder
-
Patent number: 10939962
Abstract: A system and method for verifying the accurate insertion positioning of a robotically guided surgical tool or probe, at its cranial target region, such as for Deep Brain Stimulation. A head mounted robot aligns a probe or tool guiding sleeve, together with an aiming rod attached at a predefined position and angle to the guiding sleeve. The aiming rod incorporates apertures through which an X-ray system can view the patient's skull. The aiming rod is attached to the tool guiding sleeve at an angle and position calculated such that the line of sight through the apertures falls exactly on the target region when the tool or probe is inserted to its predetermined depth. If the tip of the tool or probe is seen located at the center of the apertures in the X-ray image, verification is obtained that the insertion procedure has been performed accurately.
Type: Grant
Filed: April 4, 2016
Date of Patent: March 9, 2021
Assignee: Mazor Robotics Ltd.
Inventors: Aviv Ellman, Eli Zehavi
-
Publication number: 20210040757
Abstract: Disclosed is a method for controlling the cleaning of at least a portion of a building facade (104a). A multi-dimensional map of a facade (104a) to be cleaned from an elevator platform (200) of an elevator system is received. According to the map, an ordered sequence of instructions is determined. The instructions comprise robotic arm instructions and elevator platform instructions that are temporally intertwined to execute a cleaning pattern covering at least part of the facade (104a). The robotic arm instructions are forwarded to control one or more robotic arms (206) and the elevator platform instructions are forwarded to control the elevation of the elevator platform (200). A corresponding device (221, 213) and non-transient memory (223), and a system having the device, are also disclosed.
Type: Application
Filed: February 7, 2019
Publication date: February 11, 2021
Applicant: Skyline Robotics Ltd.
Inventors: Avi ABADI, Yaron SCHWARCZ
-
Publication number: 20210026366
Abstract: Systems and methods for optimizing sensory signal capturing by reconfiguring robotic device configurations. In an implementation, upcoming sensor readings are predicted based on navigation path data. Based on the predicted sensor readings, optimized sensor parameters that will improve the quality of resulting sensory signals are determined. Sensors of a robotic device are reconfigured based on the optimized sensor parameters. In another implementation, deficiencies in sensory signals are determined. A motion configuration of a robotic device is optimized based on the deficiencies to allow for capturing better quality sensory signals.
Type: Application
Filed: October 8, 2020
Publication date: January 28, 2021
Applicant: R-Go Robotics LTD.
Inventor: Nizan HORESH
-
Publication number: 20200406467
Abstract: A method for adaptively adjusting a user experience when interacting with an electronic device. A method includes: collecting, using at least one sensor, data related to an interaction of a user with a feature of an electronic device; activating a timer to measure a response time of the user to the feature; analyzing the collected data to determine responsiveness of the user to the feature; and adjusting a user experience parameter of the electronic device based on the determined responsiveness.
Type: Application
Filed: June 26, 2020
Publication date: December 31, 2020
Applicant: Intuition Robotics, Ltd.
Inventors: Shay ZWEIG, Roy AMIR, Itai MENDELSOHN, Dor SKULER
-
Publication number: 20200406903
Abstract: A controller and method for real-time customization of presentation features of a vehicle. A method includes collecting a first dataset about a knowledge level of an operator of the vehicle, wherein the first dataset is collected with respect to a feature of the vehicle; collecting, using at least one sensor, a second dataset regarding an external environment of the vehicle and a cabin of the vehicle; determining, based on the first dataset and the second dataset, a presentation feature from a plurality of presentation features associated with the feature of the vehicle; customizing the presentation feature based on at least the first dataset, wherein the customization is performed in real-time when the operator operates the vehicle; and presenting the presentation feature to the operator of the vehicle.
Type: Application
Filed: June 26, 2020
Publication date: December 31, 2020
Applicant: Intuition Robotics, Ltd.
Inventors: Shay ZWEIG, Roy AMIR, Itai MENDELSOHN, Dor SKULER
-
Publication number: 20200410317
Abstract: A system and method for real-time customization of presentation features of a social robot. A method includes collecting a first dataset regarding a knowledge level of a user of the social robot with respect to at least one feature of the social robot; collecting a second dataset, wherein the second dataset is collected from at least an environment of the social robot; determining, based on the first dataset and the second dataset, at least one presentation feature from a plurality of presentation features; selecting a first presentation feature of the at least one presentation feature; customizing the selected first presentation feature based on at least the first dataset; and presenting in real-time the customized presentation feature, wherein the presentation is performed using at least one electronic component of the social robot.
Type: Application
Filed: June 26, 2020
Publication date: December 31, 2020
Applicant: Intuition Robotics, Ltd.
Inventors: Shay ZWEIG, Roy AMIR, Itai MENDELSOHN, Dor SKULER
-
Publication number: 20200393291
Abstract: A sensor comprising a whisker shaft and a follicle is provided. The shaft has a root end and a tip end, and the shaft tapers from the root end to the tip end so that the root end is wider and the tip end is narrower. The root end is pivotably mounted in the follicle.
Type: Application
Filed: June 10, 2020
Publication date: December 17, 2020
Applicant: Bristol Maritime Robotics Ltd
Inventors: Richard Sewell, Thomas Rooney
-
Publication number: 20200394807
Abstract: Systems and methods for monitoring movements. A method includes detecting motion based on first localization data related to a localization device moving in a distinct motion pattern, wherein the first localization data is based on sensor readings captured by at least one sensor; correlating the detected motion to a known motion of the localization device based on respective times of the first localization data and of the localization device; localizing the localization device with respect to a map based on the correlation; and tracking at least one first location of an object based on second localization data captured by the at least one sensor, wherein the at least one first location is on the map, wherein the tracking further comprises identifying at least one second location of the object based on the second localization data and determining the at least one first location based on the at least one second location.
Type: Application
Filed: August 27, 2020
Publication date: December 17, 2020
Applicant: R-Go Robotics LTD.
Inventors: Nizan HORESH, Amir BOUSANI, Assaf AGMON
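The correlation step described in this abstract — aligning sensor-derived motion with the device's known, distinct motion pattern by their respective times — can be sketched as a search for the best time offset between two motion signals. This is a hypothetical illustration only; the function name, the 1-D signal form, and the scoring rule are assumptions, not the patented method:

```python
def best_lag(observed, known, max_lag=5):
    """Find the time offset (in samples) that best aligns a sensor-derived
    motion signal with the device's known motion pattern, using a plain
    cross-correlation score over candidate lags."""
    def score(lag):
        # Sum of products over the samples where the shifted signals overlap.
        pairs = [(observed[i], known[i - lag])
                 for i in range(len(observed))
                 if 0 <= i - lag < len(known)]
        return sum(a * b for a, b in pairs)
    return max(range(-max_lag, max_lag + 1), key=score)

# The observed signal is the known pattern delayed by two samples.
lag = best_lag([0, 0, 0, 1, 0], [0, 1, 0, 0, 0])  # -> 2
```

Once the lag is known, the device's position at each observed timestamp can be matched to the corresponding point of the known pattern, which is what makes localization against the map possible.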
-
Patent number: 10775786
Abstract: A method for emulating control signals for controlling an unmanned aerial vehicle (UAV). The method comprises using an integrated circuit installed on the UAV and communicatively coupled to a flight control processor and a companion processor for: receiving flight control parameters from the companion processor, the flight control parameters being based on data from a plurality of sensors communicatively coupled to the companion processor and mounted on the UAV; wirelessly receiving remote control signals from a remote control unit; modulating the flight control parameters to create emulated signals emulating signals generated by a remote control designated to wirelessly control the UAV, the emulated signals encoding instructions to maneuver the UAV; and switching between the remote control signals and the emulated signals during a flight time of the UAV.
Type: Grant
Filed: July 11, 2018
Date of Patent: September 15, 2020
Assignee: Percepto Robotics Ltd
Inventors: Dor Abuhasira, Sagi Blonder, Raviv Raz
-
Publication number: 20200216082
Abstract: A system and method for monitoring and managing a cognitive load of an occupant of a vehicle, including: determining, based on analysis of a first set of sensory inputs received from a first set of sensors from a cabin of the vehicle, a current cognitive load score of an occupant of the vehicle; determining, based on an analysis of a second set of sensory inputs, a current state of the vehicle; analyzing the current cognitive load score of the occupant with respect to the current state of the vehicle; and selecting at least one predetermined plan for execution based on a determination that a reduction of the current cognitive load score of the occupant is desirable, wherein the determination is based on a result of the analysis of the current cognitive load score of the occupant and the current state of the vehicle.
Type: Application
Filed: January 8, 2020
Publication date: July 9, 2020
Applicant: Intuition Robotics, Ltd.
Inventors: Roy AMIR, Itai MENDELSOHN, Dor SKULER, Shay ZWEIG
-
Publication number: 20200218263
Abstract: A system and method for explaining actions of a vehicle, including: receiving an indication associated with at least one action of the vehicle; receiving a data input from the vehicle, wherein the data input is associated with at least one of: an external environment of the vehicle and an internal environment of the vehicle; analyzing the indication with respect to the data input to determine when it is desirable to provide an explanation for the at least one action of the vehicle; selecting, when it is desirable to provide an explanation for the at least one action, a predetermined plan including an explanation of the at least one action, wherein the selection is based on the analysis of the indication with respect to the data input; and executing the predetermined plan.
Type: Application
Filed: January 8, 2020
Publication date: July 9, 2020
Applicant: Intuition Robotics, Ltd.
Inventors: Roy AMIR, Itai MENDELSOHN, Dor SKULER, Shay ZWEIG
-
Publication number: 20200219608
Abstract: A system and method for monitoring and managing a cognitive load of a person, including: determining, based on an analysis of at least one input data associated with a person, a current cognitive load score of the person; determining, based on the analysis of the at least one input data and the determined current cognitive load score, whether a reduction of the current cognitive load score of the person is desirable; and selecting at least one predetermined plan for execution when a reduction of the current cognitive load score of the person is determined to be desirable.
Type: Application
Filed: January 8, 2020
Publication date: July 9, 2020
Applicant: Intuition Robotics, Ltd.
Inventors: Roy AMIR, Itai MENDELSOHN, Dor SKULER, Shay ZWEIG
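The score-then-select flow this abstract describes can be sketched in a few lines. Everything concrete here is hypothetical — the signal names, the weights, the 0.7/0.9 thresholds, and the plan names are illustrative assumptions, since the publication does not specify how the score or the plans are defined:

```python
def current_load(signals, weights):
    """Combine normalized input signals (each in [0, 1]) into a single
    cognitive load score via a confidence-free weighted sum."""
    return sum(signals[name] * w for name, w in weights.items())

def choose_plan(score, threshold=0.7):
    """Select a predetermined plan only when a reduction is desirable,
    i.e. when the score exceeds the threshold."""
    if score <= threshold:
        return None  # no reduction needed, no plan executed
    return "pause_notifications" if score < 0.9 else "suggest_break"

score = current_load({"heart_rate": 0.8, "task_count": 0.6},
                     {"heart_rate": 0.5, "task_count": 0.5})  # -> 0.7
plan = choose_plan(0.8)  # -> "pause_notifications"
```

The same shape applies to the vehicle-occupant variant (publication 20200216082), with the second determination — the vehicle state — feeding into the threshold or plan choice.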
-
Publication number: 20200219412
Abstract: A system and method for determining a responsive action based on sensor fusion, including: performing a sensor fusion on data received from a plurality of sensors to produce output fusion data; analyzing the output fusion data to determine one or more potential actionable scenarios to be selected; determining if the one or more potential actionable scenarios are to be executed; and sending commands to one or more resources to perform the one or more potential actionable scenarios.
Type: Application
Filed: January 8, 2020
Publication date: July 9, 2020
Applicant: Intuition Robotics, Ltd.
Inventors: Roy AMIR, Itai MENDELSOHN, Dor SKULER, Shay ZWEIG
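A minimal sketch of the fuse-analyze-select pipeline in this abstract, under stated assumptions: the confidence-weighted average, the sensor names, and the per-scenario thresholds are all hypothetical choices for illustration, not the fusion scheme the publication claims:

```python
def fuse_readings(readings):
    """Fuse per-sensor (value, confidence) pairs into one score via a
    confidence-weighted average."""
    total_weight = sum(conf for _, conf in readings.values())
    return sum(value * conf for value, conf in readings.values()) / total_weight

def actionable_scenarios(fused_score, scenarios):
    """Return the scenarios whose trigger threshold the fused score meets;
    scenarios is a list of (name, threshold) pairs."""
    return [name for name, threshold in scenarios if fused_score >= threshold]

fused = fuse_readings({"camera": (0.9, 0.5), "microphone": (0.5, 0.5)})  # -> 0.7
to_run = actionable_scenarios(fused, [("greet_user", 0.6), ("raise_alert", 0.8)])
```

In the abstract's terms, `fused` is the output fusion data and `to_run` is the set of potential actionable scenarios handed on to the execution decision.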
-
Publication number: 20200201360
Abstract: There is provided a method of automatically landing a drone on a landing pad having thereon guiding-elements arranged in a pattern relative to a central region of the landing pad, comprising: receiving first image(s) captured by a camera of the drone, processing the first image(s) to compute a segmentation mask according to an estimate of a location of the landing pad, receiving second image(s) captured by the camera, processing the second image(s) according to the segmentation mask to compute a segmented region and extracting from the segmented region guiding-element(s), determining a vector for each of the extracted guiding-element(s), and aggregating the vectors to compute an estimated location of the central region of the landing pad, and navigating and landing the drone on the landing pad according to the estimated location of the central region of the landing pad.
Type: Application
Filed: January 13, 2020
Publication date: June 25, 2020
Applicant: Percepto Robotics Ltd
Inventors: Sagi BLONDER, Ariel BENITAH, Eyal GIL-AD
-
Publication number: 20200160479
Abstract: Systems and methods for providing geometric interactions via three-dimensional mapping. A method includes determining a plurality of first descriptors for a plurality of key points in a plurality of first images, wherein each first image shows a portion of a 3D environment in which a robotic device and a visual sensor are deployed; generating a 3D map of the 3D environment based on the plurality of key points and the plurality of first descriptors; determining a pose of the visual sensor based on at least one second descriptor of a second image and the plurality of first descriptors, wherein the second image is captured by the visual sensor; and determining a target action location based on at least one user input made with respect to a display of the second image and the pose of the visual sensor, wherein the target action location is a location within the 3D environment.
Type: Application
Filed: January 23, 2020
Publication date: May 21, 2020
Applicant: R-Go Robotics LTD.
Inventors: Nizan HORESH, Amir BOUSANI
-
Publication number: 20200159229
Abstract: Techniques for controlling a robotic device in motion using image synthesis are presented. A method includes determining local motion features based on image patches included in first and second input images captured by a camera installed on the robotic device; determining a camera motion based on the local motion features, wherein the camera motion indicates a movement of the camera between capture of the first input image and capture of the second input image; determining a scene geometry based on the camera motion, wherein the scene geometry is a set of three-dimensional coordinates corresponding to two-dimensional image coordinates of the image patches; generating a single perspective synthesized image based on the camera motion and the scene geometry; detecting at least one change between the first and second input images based on the synthesized image; and modifying motion of the robotic device based on the detected at least one change.
Type: Application
Filed: January 23, 2020
Publication date: May 21, 2020
Applicant: R-Go Robotics LTD.
Inventors: Nizan HORESH, Amir BOUSANI
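The final detection step in this abstract — comparing the actual second image against the motion-compensated synthesized image — reduces to a thresholded per-pixel difference once the synthesized image exists. A toy sketch under stated assumptions (plain grayscale lists stand in for real images, and the tolerance value is hypothetical; the synthesis itself, which is the hard part, is taken as given):

```python
def detect_changes(synthesized, actual, tol=10):
    """Return (row, col) coordinates where the observed second image departs
    from the synthesized prediction by more than tol gray levels — i.e. pixels
    whose change is not explained by the camera's own motion."""
    return [(r, c)
            for r, row in enumerate(synthesized)
            for c, predicted in enumerate(row)
            if abs(actual[r][c] - predicted) > tol]

# A bright blob appears at (0, 1) that the camera-motion model did not predict.
changes = detect_changes([[0, 0], [0, 0]],
                         [[0, 50], [0, 0]])  # -> [(0, 1)]
```

A nonempty result is what would trigger the last step of the claim — modifying the robotic device's motion, e.g. stopping for an unexpected obstacle.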
-
Publication number: 20200079529
Abstract: An unmanned aerial vehicle (UAV) ground station, comprising: a landing surface having a perimeter and a center; a plurality of pushers held above the landing surface by a plurality of linear actuators; at least one electro-mechanical connector attached to one of the plurality of pushers, mechanically adapted to be electrically connected to a compatible electro-mechanical connector of a UAV; and a landing detection controller adapted to instruct the plurality of linear actuators to move the plurality of pushers simultaneously from the perimeter toward the center when a landing event related to the UAV is detected.
Type: Application
Filed: November 13, 2019
Publication date: March 12, 2020
Applicant: Percepto Robotics Ltd
Inventors: Raviv RAZ, Jonathan JAFFE, Sagi BLONDER
-
Patent number: 10571896
Abstract: A method for defining a robotic machine task. The method comprises a) collecting a sequence of a plurality of images showing at least one demonstrator performing at least one manipulation of at least one object, b) performing an analysis of said sequence of a plurality of images to identify demonstrator body parts manipulating said at least one object and said at least one manipulation of said at least one object, c) determining at least one robotic machine movement to perform said task, and d) generating at least one motion command for instructing said robotic machine to perform said task.
Type: Grant
Filed: August 21, 2017
Date of Patent: February 25, 2020
Assignee: Deep Learning Robotics Ltd.
Inventors: Carlos Benaim, Miriam Reiner
-
Patent number: 10551852
Abstract: There is provided a method of automatically landing a drone on a landing pad having thereon guiding-elements arranged in a pattern relative to a central region of the landing pad, comprising: receiving first image(s) captured by a camera of the drone, processing the first image(s) to compute a segmentation mask according to an estimate of a location of the landing pad, receiving second image(s) captured by the camera, processing the second image(s) according to the segmentation mask to compute a segmented region and extracting from the segmented region guiding-element(s), determining a vector for each of the extracted guiding-element(s), and aggregating the vectors to compute an estimated location of the central region of the landing pad, and navigating and landing the drone on the landing pad according to the estimated location of the central region of the landing pad.
Type: Grant
Filed: July 20, 2017
Date of Patent: February 4, 2020
Assignee: Percepto Robotics Ltd
Inventors: Sagi Blonder, Ariel Benitah, Eyal Gil-Ad
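The vector-aggregation step shared by this grant and publication 20200201360 — each extracted guiding element yields a vector pointing toward the pad's central region, and the vectors are aggregated into a center estimate — can be sketched as follows. This is a hypothetical illustration: the element layout, the (x, y, dx, dy) encoding, and averaging as the aggregation rule are assumptions, not the patented computation:

```python
def estimate_pad_center(elements):
    """elements: list of (x, y, dx, dy) tuples — each guiding element's image
    position plus its vector toward the pad center (direction scaled by the
    element's known distance from the center in the printed pattern).
    Averaging the vector endpoints gives a center estimate that tolerates a
    few badly-extracted elements."""
    endpoints = [(x + dx, y + dy) for x, y, dx, dy in elements]
    n = len(endpoints)
    return (sum(p[0] for p in endpoints) / n,
            sum(p[1] for p in endpoints) / n)

# Four elements at the corners of a square pad, each pointing inward.
center = estimate_pad_center([(0, 10, 5, -5), (10, 0, -5, 5),
                              (10, 10, -5, -5), (0, 0, 5, 5)])  # -> (5.0, 5.0)
```

The estimated center then feeds the navigate-and-land step, with the segmentation mask from the first image(s) limiting where elements are searched for in subsequent frames.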