Patents by Inventor Rasmus Dahl
Rasmus Dahl has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20220326783
Abstract: Method including providing a sensor including light emitters, photodiode detectors, and lenses arranged so as to direct light beams from light emitters exiting lenses along a detection plane, and so as to direct light beams entering lenses at a specific angle of incidence onto photodiode detectors, mounting the sensor on a display presenting virtual input controls for an electronic device, such that the detection plane resides in an airspace in front of the display, activating light emitters to project light beams through lenses along the detection plane, wherein at least one of the light beams is interrupted by a finger, detecting light reflected by the finger, identifying emitters that projected the light beam that was reflected and photodiode detectors that detected the reflected light, as emitter-detector pairs, calculating display coordinates based on target positions associated with the identified emitter-detector pairs, and transmitting the calculated display coordinates to the electronic device.
Type: Application
Filed: June 9, 2022
Publication date: October 13, 2022
Inventors: Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
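A rough sketch of the coordinate calculation this abstract describes: emitter-detector pairs that register a reflection are mapped to pre-calibrated target positions, which are averaged into display coordinates. All names, data structures, and threshold values below are assumptions made for the sketch, not details taken from the patent.

```python
# Illustrative sketch only; the lookup table, threshold, and coordinates
# are assumptions, not values from the patent text.

# Pre-calibrated lookup: each emitter-detector pair (E, D) maps to the
# target position, in display coordinates, at which an object maximally
# reflects light from E onto D.
TARGET_POSITIONS = {
    (0, 1): (12.0, 3.5),
    (1, 2): (24.0, 3.5),
    (2, 3): (36.0, 3.5),
}

REFLECTION_THRESHOLD = 0.2  # assumed minimum signal that counts as a detection


def display_coordinates(detections):
    """Average the target positions of all emitter-detector pairs whose
    measured reflection exceeds the threshold.

    detections: dict mapping (emitter, detector) -> measured reflection signal.
    Returns (x, y) display coordinates, or None if nothing was detected.
    """
    hits = [
        TARGET_POSITIONS[pair]
        for pair, signal in detections.items()
        if signal >= REFLECTION_THRESHOLD and pair in TARGET_POSITIONS
    ]
    if not hits:
        return None
    x = sum(p[0] for p in hits) / len(hits)
    y = sum(p[1] for p in hits) / len(hits)
    return (x, y)


# Example: a finger near the target positions of pairs (1, 2) and (2, 3).
print(display_coordinates({(0, 1): 0.05, (1, 2): 0.7, (2, 3): 0.4}))
```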
-
Patent number: 11379048
Abstract: An input panel for an electronic device, including an arrangement of buttons, wherein each button is actuated when pressed, providing input to an electronic device, and a sensor, detecting location of a user's finger above the buttons, the sensor including a housing, a printed circuit board, light emitters and photodiode detectors, lenses mounted in the housing in such a manner that, when the housing is mounted along an edge of the arrangement, the lenses direct light from the emitters along a plane above the buttons, and direct light from the plane, reflected toward the lenses by an object inserted into the plane, onto the detectors, a processor configured to identify a location in the plane at which the object is inserted based on the detections of light reflected by the object, and a communications port configured to output the identified location to the electronic device.
Type: Grant
Filed: October 6, 2020
Date of Patent: July 5, 2022
Assignee: Neonode Inc.
Inventors: Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
-
Patent number: 10984756
Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by a fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by a parafovea of the eye, and apply a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters are selected to yield relatively high image quality, while the second set of parameters yield intermediate quality, and the third set of parameters yield lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
Type: Grant
Filed: June 26, 2019
Date of Patent: April 20, 2021
Assignee: Facebook Technologies, LLC
Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
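A minimal sketch of the region-based parameter selection described above, assuming three fixed angular regions around the tracked gaze point. The region radii and parameter values are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch; region sizes and parameter values are assumptions,
# not taken from the patent text.
from dataclasses import dataclass


@dataclass
class RenderParams:
    resolution_scale: float  # fraction of full resolution
    quantization: int        # encoder quantization parameter (higher = lower quality)


# Assumed parameter sets for the foveal, parafoveal, and peripheral regions.
FOVEA = RenderParams(resolution_scale=1.0, quantization=18)
PARAFOVEA = RenderParams(resolution_scale=0.6, quantization=28)
PERIPHERY = RenderParams(resolution_scale=0.3, quantization=38)

# Assumed angular radii of the regions around the gaze point, in degrees.
FOVEA_RADIUS_DEG = 2.5
PARAFOVEA_RADIUS_DEG = 8.0


def params_for_pixel(angle_from_gaze_deg: float) -> RenderParams:
    """Pick rendering/encoding parameters for a pixel based on its angular
    distance from the tracked gaze point."""
    if angle_from_gaze_deg <= FOVEA_RADIUS_DEG:
        return FOVEA
    if angle_from_gaze_deg <= PARAFOVEA_RADIUS_DEG:
        return PARAFOVEA
    return PERIPHERY


print(params_for_pixel(1.0))   # foveal region: full quality
print(params_for_pixel(15.0))  # periphery: cheapest to render and encode
```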
-
Publication number: 20210081053
Abstract: A contactless input method for an electronic device or other equipment, including projecting focused light beams from a series of locations along an edge of a control panel including a matrix of controls for an electronic device or other equipment, across a plane in an airspace in front of the controls, whereby the projected light beams traverse an area equal in size to the area of the matrix, detecting reflections of the projected light beams reflected by an object inserted into the plane, identifying which light beams are reflected, further identifying an angle at which the detected light beams are reflected, calculating a location in the plane at which the object is inserted based on the detecting, identifying and further identifying, and outputting the calculated location from the sensor to the electronic device or other equipment as an actuated corresponding location on the control panel.
Type: Application
Filed: October 6, 2020
Publication date: March 18, 2021
Inventors: Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
-
Patent number: 10921896
Abstract: An augmented reality (AR) device can access a library of applications or user interfaces (UIs) designed to control a set of devices. The AR device can determine which UI to present based on detection of a device to be controlled near the AR device. For example, a user wearing an AR device may look at a thermostat placed on a wall and a UI to control the thermostat may be presented to the user. The determination that the user is looking at the thermostat may be made by correlating the gaze tracking information of the user-facing camera with the location of the thermostat in an image captured by a world-facing camera. Determination of the location of the thermostat in the image can be performed using image recognition technology. The UI can be selected based on a database record pairing the UI with the thermostat.
Type: Grant
Filed: March 16, 2016
Date of Patent: February 16, 2021
Assignee: Facebook Technologies, LLC
Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Rasmus Dahl, Jonas Priesum
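A minimal sketch of how such a UI lookup might work, assuming an image-recognition step has already labeled objects in the world-facing camera image and the gaze point has been mapped into the same image coordinates. The database layout and distance threshold are illustrative assumptions, not details from the patent.

```python
# Illustrative sketch; the record layout and threshold are assumptions,
# not taken from the patent text.

# Database records pairing a recognizable device with the UI that controls it.
UI_LIBRARY = {
    "thermostat": "thermostat_control_ui",
    "light_switch": "lighting_ui",
    "speaker": "media_ui",
}


def select_ui(recognized_objects, gaze_point, max_distance_px=50):
    """Return the UI paired with the recognized device closest to the user's
    gaze point in the world-facing camera image.

    recognized_objects: list of (label, (x, y)) from an image-recognition step.
    gaze_point: (x, y) gaze position mapped into the same image coordinates.
    """
    best_ui, best_dist = None, max_distance_px
    gx, gy = gaze_point
    for label, (x, y) in recognized_objects:
        dist = ((x - gx) ** 2 + (y - gy) ** 2) ** 0.5
        if label in UI_LIBRARY and dist <= best_dist:
            best_ui, best_dist = UI_LIBRARY[label], dist
    return best_ui


# Example: the user is looking near the thermostat detected at (310, 215).
print(select_ui([("thermostat", (310, 215)), ("speaker", (90, 400))], (300, 220)))
```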
-
Patent number: 10802601
Abstract: A sensor, including light emitters projecting directed light beams, light detectors interleaved with the light emitters, lenses, each lens oriented relative to a respective one of the light detectors such that the light detector receives maximum intensity when light enters the lens at an angle b, whereby, for each emitter E, there exist corresponding target positions p(E, D) along the path of the light from emitter E, at which an object located at any of the target positions reflects the light projected by emitter E towards a respective one of detectors D at angle b, and a processor storing a reflection value R(E, D) for each co-activated emitter-detector pair (E, D), based on an amount of light reflected by an object located at p(E, D) and detected by detector D, and calculating a location of an object based on the reflection values and target positions.
Type: Grant
Filed: November 25, 2019
Date of Patent: October 13, 2020
Assignee: Neonode Inc.
Inventors: Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
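A minimal sketch of the location calculation this abstract describes, estimating the object position as a reflection-weighted average of the target positions p(E, D). The weighting scheme and data layout are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch; the weighting scheme and data layout are assumptions,
# not taken from the patent text.

def estimate_location(reflections):
    """Estimate an object's location as the reflection-weighted average of the
    target positions of the co-activated emitter-detector pairs.

    reflections: list of (R, (x, y)) where R is the reflection value R(E, D)
    measured for a pair and (x, y) is its target position p(E, D).
    """
    total = sum(r for r, _ in reflections)
    if total == 0:
        return None
    x = sum(r * p[0] for r, p in reflections) / total
    y = sum(r * p[1] for r, p in reflections) / total
    return (x, y)


# Example: the strongest reflection comes from the pair whose target position
# is (20, 40); weaker reflections at neighboring target positions pull the
# estimate slightly toward them.
print(estimate_location([(0.8, (20.0, 40.0)), (0.3, (30.0, 40.0)), (0.1, (10.0, 40.0))]))
```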
-
Publication number: 20200089326
Abstract: A sensor, including light emitters projecting directed light beams, light detectors interleaved with the light emitters, lenses, each lens oriented relative to a respective one of the light detectors such that the light detector receives maximum intensity when light enters the lens at an angle b, whereby, for each emitter E, there exist corresponding target positions p(E, D) along the path of the light from emitter E, at which an object located at any of the target positions reflects the light projected by emitter E towards a respective one of detectors D at angle b, and a processor storing a reflection value R(E, D) for each co-activated emitter-detector pair (E, D), based on an amount of light reflected by an object located at p(E, D) and detected by detector D, and calculating a location of an object based on the reflection values and target positions.
Type: Application
Filed: November 25, 2019
Publication date: March 19, 2020
Inventors: Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
-
Patent number: 10496180
Abstract: A proximity sensor including a housing, light emitters mounted in the housing for projecting light out of the housing along a detection plane, light detectors mounted in the housing for detecting amounts of light entering the housing along the detection plane, whereby for each emitter-detector pair (E, D), when an object is located at a target position p(E, D) in the detection plane, corresponding to the pair (E, D), then the light emitted by emitter E is scattered by the object and is expected to be maximally detected by detector D, and a processor to synchronously activate emitter-detector pairs, to read the detected amounts of light from the detectors, and to calculate a location of the object in the detection plane from the detected amounts of light, in accordance with a detection-location relationship that relates detections from emitter-detector pairs to object locations between neighboring target positions in the detection plane.
Type: Grant
Filed: February 18, 2018
Date of Patent: December 3, 2019
Assignee: Neonode, Inc.
Inventors: Björn Thomas Eriksson, Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Simon Greger Fellin, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Richard Tom Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain, Oskar Bertil Hagberg, Joel Verner Rozada
-
Publication number: 20190318708
Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by a fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by a parafovea of the eye, and apply a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters are selected to yield relatively high image quality, while the second set of parameters yield intermediate quality, and the third set of parameters yield lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
Type: Application
Filed: June 26, 2019
Publication date: October 17, 2019
Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
-
Patent number: 10373592
Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by a fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by a parafovea of the eye, and apply a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters are selected to yield relatively high image quality, while the second set of parameters yield intermediate quality, and the third set of parameters yield lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
Type: Grant
Filed: July 28, 2017
Date of Patent: August 6, 2019
Assignee: Facebook Technologies, LLC
Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
-
Publication number: 20180181209
Abstract: A proximity sensor including a housing, light emitters mounted in the housing for projecting light out of the housing along a detection plane, light detectors mounted in the housing for detecting amounts of light entering the housing along the detection plane, whereby for each emitter-detector pair (E, D), when an object is located at a target position p(E, D) in the detection plane, corresponding to the pair (E, D), then the light emitted by emitter E is scattered by the object and is expected to be maximally detected by detector D, and a processor to synchronously activate emitter-detector pairs, to read the detected amounts of light from the detectors, and to calculate a location of the object in the detection plane from the detected amounts of light, in accordance with a detection-location relationship that relates detections from emitter-detector pairs to object locations between neighboring target positions in the detection plane.
Type: Application
Filed: February 18, 2018
Publication date: June 28, 2018
Inventors: Björn Thomas Eriksson, Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Simon Greger Fellin, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Richard Tom Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain, Oskar Bertil Hagberg, Joel Verner Rozada
-
Patent number: 9961258
Abstract: Eye tracking technology may be used in a wide range of lighting conditions and with many different and varying light levels. In some embodiments, an eye tracking device may employ active illumination (e.g., in the form of infrared light-emitting diodes (LEDs)). However, employing active illumination may reduce the battery life of the device. Under some circumstances (e.g., in a dark environment), the light intensity may be excessive and could be reduced, thereby reducing energy consumption and extending the battery life of the device. An algorithm may be used to adjust the duration of light in eye tracking systems that employ active illumination.
Type: Grant
Filed: February 23, 2016
Date of Patent: May 1, 2018
Assignee: Facebook, Inc.
Inventors: Martin Henrik Tall, Sebastian Sztuk, Javier San Agustin Lopez, Rasmus Dahl
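A minimal sketch of one way such an illumination-duration adjustment could work, using a simple proportional controller that shortens the LED pulse when the eye image is brighter than a target level. The control law, target level, and limits are illustrative assumptions, not the patent's algorithm.

```python
# Illustrative sketch; the control loop, target level, and step size are
# assumptions, not taken from the patent text.

TARGET_INTENSITY = 0.7   # assumed desired normalized image brightness
MIN_PULSE_US = 50        # assumed shortest allowed LED pulse, microseconds
MAX_PULSE_US = 2000      # assumed longest allowed LED pulse, microseconds
GAIN = 400               # assumed proportional gain, microseconds per unit error


def adjust_pulse(current_pulse_us: float, measured_intensity: float) -> float:
    """Shorten the LED pulse when the eye image is brighter than needed
    (e.g., in a dark room) and lengthen it when the image is too dim,
    trading illumination time for battery life."""
    error = TARGET_INTENSITY - measured_intensity
    new_pulse = current_pulse_us + GAIN * error
    return max(MIN_PULSE_US, min(MAX_PULSE_US, new_pulse))


# Example: a dark environment makes the eye image over-exposed, so the
# controller backs the pulse duration off over successive frames.
pulse = 1000.0
for intensity in (0.95, 0.88, 0.80, 0.74):
    pulse = adjust_pulse(pulse, intensity)
    print(round(pulse))
```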
-
Patent number: 9921661
Abstract: A proximity sensor including a housing, light emitters mounted in the housing for projecting light out of the housing along a detection plane, light detectors mounted in the housing for detecting amounts of light entering the housing along the detection plane, whereby for each emitter-detector pair (E, D), when an object is located at a target position p(E, D) in the detection plane, corresponding to the pair (E, D), then the light emitted by emitter E is scattered by the object and is expected to be maximally detected by detector D, and a processor to synchronously activate emitter-detector pairs, to read the detected amounts of light from the detectors, and to calculate a location of the object in the detection plane from the detected amounts of light, in accordance with a detection-location relationship that relates detections from emitter-detector pairs to object locations between neighboring target positions in the detection plane.
Type: Grant
Filed: January 19, 2016
Date of Patent: March 20, 2018
Assignee: Neonode Inc.
Inventors: Thomas Eriksson, Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Rosengren, Xiatao Wang, Stefan Holmgren, Gunnar Martin Fröjdh, Simon Fellin, Jan Tomas Hartman, Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Richard Berglind, Karl Erik Patrik Nordström, Lars Sparf, Erik Rosengren, John Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain, Oskar Hagberg, Joel Rozada
-
Publication number: 20180033405
Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by a fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by a parafovea of the eye, and apply a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters are selected to yield relatively high image quality, while the second set of parameters yield intermediate quality, and the third set of parameters yield lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
Type: Application
Filed: July 28, 2017
Publication date: February 1, 2018
Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
-
Publication number: 20160274762
Abstract: An augmented reality (AR) device can access a library of applications or user interfaces (UIs) designed to control a set of devices. The AR device can determine which UI to present based on detection of a device to be controlled near the AR device. For example, a user wearing an AR device may look at a thermostat placed on a wall and a UI to control the thermostat may be presented to the user. The determination that the user is looking at the thermostat may be made by correlating the gaze tracking information of the user-facing camera with the location of the thermostat in an image captured by a world-facing camera. Determination of the location of the thermostat in the image can be performed using image recognition technology. The UI can be selected based on a database record pairing the UI with the thermostat.
Type: Application
Filed: March 16, 2016
Publication date: September 22, 2016
Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Rasmus Dahl, Jonas Priesum
-
Publication number: 20160248971
Abstract: Eye tracking technology may be used in a wide range of lighting conditions and with many different and varying light levels. In some embodiments, an eye tracking device may employ active illumination (e.g., in the form of infrared light-emitting diodes (LEDs)). However, employing active illumination may reduce the battery life of the device. Under some circumstances (e.g., in a dark environment), the light intensity may be excessive and could be reduced, thereby reducing energy consumption and extending the battery life of the device. An algorithm may be used to adjust the duration of light in eye tracking systems that employ active illumination.
Type: Application
Filed: February 23, 2016
Publication date: August 25, 2016
Inventors: Martin Henrik Tall, Sebastian Sztuk, Javier San Agustin Lopez, Rasmus Dahl
-
Publication number: 20160154475
Abstract: A proximity sensor including a housing, light emitters mounted in the housing for projecting light out of the housing along a detection plane, light detectors mounted in the housing for detecting amounts of light entering the housing along the detection plane, whereby for each emitter-detector pair (E, D), when an object is located at a target position p(E, D) in the detection plane, corresponding to the pair (E, D), then the light emitted by emitter E is scattered by the object and is expected to be maximally detected by detector D, and a processor to synchronously activate emitter-detector pairs, to read the detected amounts of light from the detectors, and to calculate a location of the object in the detection plane from the detected amounts of light, in accordance with a detection-location relationship that relates detections from emitter-detector pairs to object locations between neighboring target positions in the detection plane.
Type: Application
Filed: January 19, 2016
Publication date: June 2, 2016
Inventors: Thomas Eriksson, Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Rosengren, Xiatao Wang, Stefan Holmgren, Gunnar Martin Fröjdh, Simon Fellin, Jan Tomas Hartman, Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Richard Berglind, Karl Erik Patrik Nordström, Lars Sparf, Erik Rosengren, John Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain, Oskar Hagberg, Joel Rozada