Patents by Inventor Rasmus Dahl

Rasmus Dahl is named as an inventor on the following patent filings. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220326783
    Abstract: Method including providing a sensor including light emitters, photodiode detectors, and lenses arranged so as to direct light beams from light emitters exiting lenses along a detection plane, and so as to direct light beams entering lenses at a specific angle of incidence onto photodiode detectors, mounting the sensor on a display presenting virtual input controls for an electronic device, such that the detection plane resides in an airspace in front of the display, activating light emitters to project light beams through lenses along the detection plane, wherein at least one of the light beams is interrupted by a finger, detecting light reflected by the finger, identifying emitters that projected the light beam that was reflected and photodiode detectors that detected the reflected light, as emitter-detector pairs, calculating display coordinates based on target positions associated with the identified emitter-detector pairs, and transmitting the calculated display coordinates to the electronic device.
    Type: Application
    Filed: June 9, 2022
    Publication date: October 13, 2022
    Inventors: Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
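The abstract above describes translating detections from identified emitter-detector pairs into display coordinates via stored target positions. A minimal sketch of that idea in Python, assuming a hypothetical target-position table and normalised detection signals (names and thresholds are illustrative, not from the patent):

```python
# Sketch: convert emitter-detector detections into a display coordinate.
# target_positions maps each (emitter, detector) pair to the (x, y) point in the
# detection plane where a reflecting finger is expected to produce maximum signal.

from typing import Dict, Optional, Tuple

Pair = Tuple[int, int]          # (emitter index, detector index)
Point = Tuple[float, float]     # (x, y) in display coordinates

def locate_finger(detections: Dict[Pair, float],
                  target_positions: Dict[Pair, Point],
                  threshold: float = 0.05) -> Optional[Point]:
    """Return display coordinates as the signal-weighted average of the
    target positions of all emitter-detector pairs that saw a reflection."""
    total = 0.0
    x_acc = y_acc = 0.0
    for pair, signal in detections.items():
        if signal < threshold or pair not in target_positions:
            continue
        x, y = target_positions[pair]
        x_acc += signal * x
        y_acc += signal * y
        total += signal
    if total == 0.0:
        return None                      # no finger detected in the airspace
    return (x_acc / total, y_acc / total)

# Example: two neighbouring pairs detect the same finger.
targets = {(0, 1): (120.0, 40.0), (1, 1): (130.0, 40.0)}
readings = {(0, 1): 0.8, (1, 1): 0.4}
print(locate_finger(readings, targets))   # -> roughly (123.3, 40.0)
```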
  • Patent number: 11379048
    Abstract: An input panel for an electronic device, including an arrangement of buttons, wherein each button is actuated when pressed, providing input to an electronic device, and a sensor, detecting location of a user's finger above the buttons, the sensor including a housing, a printed circuit board, light emitters and photodiode detectors, lenses mounted in the housing in such a manner that, when the housing is mounted along an edge of the arrangement, the lenses direct light from the emitters along a plane above the buttons, and direct light from the plane, reflected toward the lenses by an object inserted into the plane, onto the detectors, a processor configured to identify a location in the plane at which the object is inserted based on the detections of light reflected by the object, and a communications port configured to output the identified location to the electronic device.
    Type: Grant
    Filed: October 6, 2020
    Date of Patent: July 5, 2022
    Assignee: Neonode Inc.
    Inventors: Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
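Once the sensor reports a hover location over the button arrangement, the host still has to decide which button the finger is above. A small illustrative sketch, assuming a regular button grid and millimetre coordinates (the geometry and function names are assumptions, not the patent's interface):

```python
# Sketch: map a hover location reported by the proximity sensor (in millimetres,
# relative to the panel's top-left corner) onto a row/column in the button matrix.

from typing import Optional, Tuple

def hover_to_button(x_mm: float, y_mm: float,
                    cols: int = 4, rows: int = 3,
                    button_w: float = 20.0, button_h: float = 20.0
                    ) -> Optional[Tuple[int, int]]:
    """Return (row, col) of the button under the finger, or None if the
    finger is outside the button arrangement."""
    col = int(x_mm // button_w)
    row = int(y_mm // button_h)
    if 0 <= col < cols and 0 <= row < rows:
        return (row, col)
    return None

# Example: a finger hovering 35 mm across and 10 mm down lands on row 0, col 1.
print(hover_to_button(35.0, 10.0))   # -> (0, 1)
```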
  • Patent number: 10984756
    Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by a fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by a parafovea of the eye, and apply a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters is selected to yield relatively high image quality, while the second set of parameters yields intermediate quality, and the third set of parameters yields lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
    Type: Grant
    Filed: June 26, 2019
    Date of Patent: April 20, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
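The foveated-rendering abstract above amounts to classifying each screen tile by its distance from the gaze point and attaching one of three parameter sets to it. A minimal sketch under assumed radii and parameter values (none of the numbers come from the patent):

```python
# Sketch: assign rendering/encoding parameters to screen regions based on the
# gaze point. Tiles near the fovea get high quality, parafoveal tiles medium,
# and peripheral tiles low quality.

import math
from typing import Dict, Tuple

# Illustrative parameter sets (resolution scale, encoder quality).
PARAMS = {
    "foveal":     {"resolution_scale": 1.0,  "quality": 0.95},
    "parafoveal": {"resolution_scale": 0.5,  "quality": 0.7},
    "peripheral": {"resolution_scale": 0.25, "quality": 0.4},
}

def classify_tile(tile_center: Tuple[float, float],
                  gaze: Tuple[float, float],
                  foveal_radius: float = 100.0,
                  parafoveal_radius: float = 300.0) -> str:
    """Classify a tile (pixel coordinates) by its distance from the gaze point."""
    d = math.dist(tile_center, gaze)
    if d <= foveal_radius:
        return "foveal"
    if d <= parafoveal_radius:
        return "parafoveal"
    return "peripheral"

def params_for_tile(tile_center: Tuple[float, float],
                    gaze: Tuple[float, float]) -> Dict[str, float]:
    return PARAMS[classify_tile(tile_center, gaze)]

# Example: gaze at the centre of a 1920x1080 frame.
print(classify_tile((980, 550), (960, 540)))    # -> "foveal"
print(classify_tile((1600, 900), (960, 540)))   # -> "peripheral"
```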
  • Publication number: 20210081053
    Abstract: A contactless input method for an electronic device or other equipment, including projecting focused light beams from a series of locations along an edge of a control panel including a matrix of controls for an electronic device or other equipment, across a plane in an airspace in front of the controls, whereby the projected light beams traverse an area equal in size to the area of the matrix, detecting reflections of the projected light beams reflected by an object inserted into the plane, identifying which light beams are reflected, further identifying an angle at which the detected light beams are reflected, calculating a location in the plane at which the object is inserted based on the detecting, identifying and further identifying, and outputting the calculated location from the sensor to the electronic device or other equipment as an actuated corresponding location on the control panel.
    Type: Application
    Filed: October 6, 2020
    Publication date: March 18, 2021
    Inventors: Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
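The contactless-input abstract identifies which projected beam was reflected and at what angle, then computes a location in the plane. One simple way to see the geometry, assuming the beams leave the sensor edge perpendicular to it (an assumption of this sketch, not a claim of the patent):

```python
# Sketch: triangulate the touch location from (a) which projected beam was
# reflected and (b) the angle at which the reflection re-entered the sensor.
# Beams are assumed to leave the sensor edge perpendicular to it, so the beam
# index gives the x coordinate directly; the reflection angle gives the depth.

import math
from typing import Tuple

def beam_reflection_to_xy(beam_x_mm: float,
                          detector_x_mm: float,
                          reflection_angle_deg: float) -> Tuple[float, float]:
    """Return (x, y) of the reflecting object in the detection plane.

    beam_x_mm: position along the sensor edge from which the beam was projected.
    detector_x_mm: position along the edge of the lens that saw the reflection.
    reflection_angle_deg: angle between the returning ray and the edge normal.
    """
    offset = abs(detector_x_mm - beam_x_mm)
    depth = offset / math.tan(math.radians(reflection_angle_deg))
    return (beam_x_mm, depth)

# Example: beam projected at x = 50 mm, reflection seen 10 mm away at 30 degrees,
# so the object sits roughly 17.3 mm in front of the panel.
print(beam_reflection_to_xy(50.0, 60.0, 30.0))
```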
  • Patent number: 10921896
    Abstract: An augmented reality (AR) device can access a library of applications or user interfaces (UIs) designed to control a set of devices. The AR device can determine which UI to present based on detection of a device to be controlled near the AR device. For example, a user wearing an AR device may look at a thermostat placed on a wall and a UI to control the thermostat may be presented to the user. The determination that the user is looking at the thermostat may be made by correlating the gaze tracking information of the user-facing camera with the location of the thermostat in an image captured by a world-facing camera. Determination of the location of the thermostat in the image can be performed using image recognition technology. The UI can be selected based on a database record pairing the UI with the thermostat.
    Type: Grant
    Filed: March 16, 2016
    Date of Patent: February 16, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Rasmus Dahl, Jonas Priesum
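A sketch of the selection flow described above: intersect the gaze point with objects recognised in the world-facing camera frame, then look the device up in a device-to-UI pairing table. The object labels, bounding boxes, and registry below are stand-ins invented for illustration:

```python
# Sketch: pick which UI to present by intersecting the gaze point (from the
# user-facing camera's eye tracker) with objects recognised in the world-facing
# camera frame, then looking the object up in a device -> UI pairing table.

from typing import Dict, List, Optional, Tuple

BBox = Tuple[float, float, float, float]   # (x_min, y_min, x_max, y_max)

# Illustrative pairing table: recognised device label -> UI identifier.
UI_REGISTRY: Dict[str, str] = {"thermostat": "thermostat_control_ui",
                               "speaker": "media_playback_ui"}

def select_ui(gaze_xy: Tuple[float, float],
              detections: List[Tuple[str, BBox]]) -> Optional[str]:
    """Return the UI paired with the recognised device the user is looking at."""
    gx, gy = gaze_xy
    for label, (x0, y0, x1, y1) in detections:
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return UI_REGISTRY.get(label)
    return None                       # gaze does not fall on a known device

# Example: gaze lands inside the thermostat's bounding box.
frame_detections = [("thermostat", (400, 200, 520, 330)),
                    ("speaker", (50, 60, 150, 180))]
print(select_ui((450, 260), frame_detections))   # -> "thermostat_control_ui"
```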
  • Patent number: 10802601
    Abstract: A sensor, including light emitters projecting directed light beams, light detectors interleaved with the light emitters, lenses, each lens oriented relative to a respective one of the light detectors such that the light detector receives maximum intensity when light enters the lens at an angle b, whereby, for each emitter E, there exist corresponding target positions p(E, D) along the path of the light from emitter E, at which an object located at any of the target positions reflects the light projected by emitter E towards a respective one of detectors D at angle b, and a processor storing a reflection value R(E, D) for each co-activated emitter-detector pair (E, D), based on an amount of light reflected by an object located at p(E, D) and detected by detector D, and calculating a location of an object based on the reflection values and target positions.
    Type: Grant
    Filed: November 25, 2019
    Date of Patent: October 13, 2020
    Assignee: Neonode Inc.
    Inventors: Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
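The abstract above ties each emitter-detector pair (E, D) to a target position p(E, D) where a reflecting object would return emitter E's beam into detector D at the fixed acceptance angle b. A sketch of pre-computing such a table, under the simplifying assumption (mine, not the patent's) that beams leave the sensor edge perpendicular to it:

```python
# Sketch: pre-compute a target-position table p(E, D). Each detector lens is
# assumed to pass maximum light at a fixed angle b from the edge normal, so the
# target position for a pair is where the emitter's beam crosses the detector's
# acceptance ray.

import math
from typing import Dict, List, Tuple

def target_positions(emitter_x: List[float],
                     detector_x: List[float],
                     angle_b_deg: float) -> Dict[Tuple[int, int], Tuple[float, float]]:
    """Map each (emitter, detector) pair to its expected reflection point."""
    tan_b = math.tan(math.radians(angle_b_deg))
    table = {}
    for e, xe in enumerate(emitter_x):
        for d, xd in enumerate(detector_x):
            if xd == xe:
                continue                     # degenerate: detector on the beam axis
            depth = abs(xd - xe) / tan_b     # distance from the sensor edge
            table[(e, d)] = (xe, depth)
    return table

# Example: emitters at 0/10/20 mm, detectors interleaved at 5/15 mm, b = 45 degrees.
table = target_positions([0.0, 10.0, 20.0], [5.0, 15.0], 45.0)
print(table[(0, 0)])   # -> (0.0, 5.0): 5 mm along the edge gives 5 mm of depth at 45 deg
```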
  • Publication number: 20200089326
    Abstract: A sensor, including light emitters projecting directed light beams, light detectors interleaved with the light emitters, lenses, each lens oriented relative to a respective one of the light detectors such that the light detector receives maximum intensity when light enters the lens at an angle b, whereby, for each emitter E, there exist corresponding target positions p(E, D) along the path of the light from emitter E, at which an object located at any of the target positions reflects the light projected by emitter E towards a respective one of detectors D at angle b, and a processor storing a reflection value R(E, D) for each co-activated emitter-detector pair (E, D), based on an amount of light reflected by an object located at p(E, D) and detected by detector D, and calculating a location of an object based on the reflection values and target positions.
    Type: Application
    Filed: November 25, 2019
    Publication date: March 19, 2020
    Inventors: Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
  • Patent number: 10496180
    Abstract: A proximity sensor including a housing, light emitters mounted in the housing for projecting light out of the housing along a detection plane, light detectors mounted in the housing for detecting amounts of light entering the housing along the detection plane, whereby for each emitter-detector pair (E, D), when an object is located at a target position p(E, D) in the detection plane, corresponding to the pair (E, D), then the light emitted by emitter E is scattered by the object and is expected to be maximally detected by detector D, and a processor to synchronously activate emitter-detector pairs, to read the detected amounts of light from the detectors, and to calculate a location of the object in the detection plane from the detected amounts of light, in accordance with a detection-location relationship that relates detections from emitter-detector pairs to object locations between neighboring target positions in the detection plane.
    Type: Grant
    Filed: February 18, 2018
    Date of Patent: December 3, 2019
    Assignee: Neonode Inc.
    Inventors: Björn Thomas Eriksson, Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Simon Greger Fellin, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Richard Tom Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain, Oskar Bertil Hagberg, Joel Verner Rozada
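The proximity-sensor abstract describes synchronously activating emitter-detector pairs and then relating the readings to a location between neighbouring target positions. A rough scan-and-interpolate sketch; the blending rule here is a simple stand-in for the patent's detection-location relationship:

```python
# Sketch: scan the emitter-detector pairs one at a time, then place the object
# between the two strongest detections' target positions, weighted by their
# relative signal strengths.

from typing import Callable, Dict, List, Tuple

Pair = Tuple[int, int]
Point = Tuple[float, float]

def scan(pairs: List[Pair], read_pair: Callable[[Pair], float]) -> Dict[Pair, float]:
    """Activate each pair in turn and record the detected light amount."""
    return {pair: read_pair(pair) for pair in pairs}

def interpolate_location(signals: Dict[Pair, float],
                         targets: Dict[Pair, Point]) -> Point:
    """Blend the target positions of the two strongest detections."""
    (p1, s1), (p2, s2) = sorted(signals.items(), key=lambda kv: kv[1], reverse=True)[:2]
    (x1, y1), (x2, y2) = targets[p1], targets[p2]
    w = s1 / (s1 + s2)
    return (w * x1 + (1 - w) * x2, w * y1 + (1 - w) * y2)

# Example with a fake read function: the object sits between two target positions.
targets = {(0, 0): (10.0, 20.0), (0, 1): (14.0, 20.0)}
fake_read = {(0, 0): 0.6, (0, 1): 0.2}.get
print(interpolate_location(scan(list(targets), fake_read), targets))  # ~ (11.0, 20.0)
```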
  • Publication number: 20190318708
    Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by a fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by a parafovea of the eye, and apply a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters is selected to yield relatively high image quality, while the second set of parameters yields intermediate quality, and the third set of parameters yields lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
    Type: Application
    Filed: June 26, 2019
    Publication date: October 17, 2019
    Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
  • Patent number: 10373592
    Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by a fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by a parafovea of the eye, and apply a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters is selected to yield relatively high image quality, while the second set of parameters yields intermediate quality, and the third set of parameters yields lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
    Type: Grant
    Filed: July 28, 2017
    Date of Patent: August 6, 2019
    Assignee: Facebook Technologies, LLC
    Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
  • Publication number: 20180181209
    Abstract: A proximity sensor including a housing, light emitters mounted in the housing for projecting light out of the housing along a detection plane, light detectors mounted in the housing for detecting amounts of light entering the housing along the detection plane, whereby for each emitter-detector pair (E, D), when an object is located at a target position p(E, D) in the detection plane, corresponding to the pair (E, D), then the light emitted by emitter E is scattered by the object and is expected to be maximally detected by detector D, and a processor to synchronously activate emitter-detector pairs, to read the detected amounts of light from the detectors, and to calculate a location of the object in the detection plane from the detected amounts of light, in accordance with a detection-location relationship that relates detections from emitter-detector pairs to object locations between neighboring target positions in the detection plane.
    Type: Application
    Filed: February 18, 2018
    Publication date: June 28, 2018
    Inventors: Björn Thomas Eriksson, Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Simon Greger Fellin, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Richard Tom Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain, Oskar Bertil Hagberg, Joel Verner Rozada
  • Patent number: 9961258
    Abstract: Eye tracking technology may be used in a wide range of lighting conditions and with many different and varying light levels. In some embodiments, an eye tracking device may employ active illumination (e.g., in the form of infrared light-emitting diodes (LEDs)). However, employing active illumination may reduce the battery life of the device. Under some circumstances (e.g., in a dark environment), the light intensity may be excessive and could be reduced, thereby reducing energy consumption and extending the battery life of the device. An algorithm may be used to adjust the duration of light in eye tracking systems that employ active illumination.
    Type: Grant
    Filed: February 23, 2016
    Date of Patent: May 1, 2018
    Assignee: Facebook, Inc.
    Inventors: Martin Henrik Tall, Sebastian Sztuk, Javier San Agustin Lopez, Rasmus Dahl
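The eye-tracking abstract above suggests adjusting the illumination duration so the light intensity is no higher than needed. A minimal control-loop sketch, where the target brightness, gain, and pulse bounds are illustrative assumptions rather than values from the patent:

```python
# Sketch: proportional adjustment of the IR LED pulse duration from the measured
# brightness of the eye region, clamped to the hardware's supported range.

def adjust_pulse_us(current_us: float,
                    measured_brightness: float,
                    target_brightness: float = 0.5,
                    gain: float = 200.0,
                    min_us: float = 50.0,
                    max_us: float = 2000.0) -> float:
    """Return the next LED pulse duration (microseconds).

    Brightness values are normalised to [0, 1]; a reading above the target
    means the pulse can be shortened to save power.
    """
    error = target_brightness - measured_brightness
    proposed = current_us + gain * error
    return max(min_us, min(max_us, proposed))

# Example: the eye image is over-exposed, so the next pulse is shortened.
print(adjust_pulse_us(1000.0, measured_brightness=0.8))   # -> 940.0
```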
  • Patent number: 9921661
    Abstract: A proximity sensor including a housing, light emitters mounted in the housing for projecting light out of the housing along a detection plane, light detectors mounted in the housing for detecting amounts of light entering the housing along the detection plane, whereby for each emitter-detector pair (E, D), when an object is located at a target position p(E, D) in the detection plane, corresponding to the pair (E, D), then the light emitted by emitter E is scattered by the object and is expected to be maximally detected by detector D, and a processor to synchronously activate emitter-detector pairs, to read the detected amounts of light from the detectors, and to calculate a location of the object in the detection plane from the detected amounts of light, in accordance with a detection-location relationship that relates detections from emitter-detector pairs to object locations between neighboring target positions in the detection plane.
    Type: Grant
    Filed: January 19, 2016
    Date of Patent: March 20, 2018
    Assignee: Neonode Inc.
    Inventors: Thomas Eriksson, Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Rosengren, Xiatao Wang, Stefan Holmgren, Gunnar Martin Fröjdh, Simon Fellin, Jan Tomas Hartman, Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Richard Berglind, Karl Erik Patrik Nordström, Lars Sparf, Erik Rosengren, John Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain, Oskar Hagberg, Joel Rozada
  • Publication number: 20180033405
    Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by a fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by a parafovea of the eye, and apply a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters is selected to yield relatively high image quality, while the second set of parameters yields intermediate quality, and the third set of parameters yields lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
    Type: Application
    Filed: July 28, 2017
    Publication date: February 1, 2018
    Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
  • Publication number: 20160274762
    Abstract: An augmented reality (AR) device can access a library of applications or user interfaces (UIs) designed to control a set of devices. The AR device can determine which UI to present based on detection of a device to be controlled near the AR device. For example, a user wearing an AR device may look at a thermostat placed on a wall and a UI to control the thermostat may be presented to the user. The determination that the user is looking at the thermostat may be made by correlating the gaze tracking information of the user-facing camera with the location of the thermostat in an image captured by a world-facing camera. Determination of the location of the thermostat in the image can be performed using image recognition technology. The UI can be selected based on a database record pairing the UI with the thermostat.
    Type: Application
    Filed: March 16, 2016
    Publication date: September 22, 2016
    Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Rasmus Dahl, Jonas Priesum
  • Publication number: 20160248971
    Abstract: Eye tracking technology may be used in a wide range of lighting conditions and with many different and varying light levels. In some embodiments, an eye tracking device may employ active illumination (e.g., in the form of infrared light-emitting diodes (LEDs)). However, employing active illumination may reduce the battery life of the device. Under some circumstances (e.g., in a dark environment), the light intensity may be excessive and could be reduced, thereby reducing energy consumption and extending the battery life of the device. An algorithm may be used to adjust the duration of light in eye tracking systems that employ active illumination.
    Type: Application
    Filed: February 23, 2016
    Publication date: August 25, 2016
    Inventors: Martin Henrik Tall, Sebastian Sztuk, Javier San Agustin Lopez, Rasmus Dahl
  • Publication number: 20160154475
    Abstract: A proximity sensor including a housing, light emitters mounted in the housing for projecting light out of the housing along a detection plane, light detectors mounted in the housing for detecting amounts of light entering the housing along the detection plane, whereby for each emitter-detector pair (E, D), when an object is located at a target position p(E, D) in the detection plane, corresponding to the pair (E, D), then the light emitted by emitter E is scattered by the object and is expected to be maximally detected by detector D, and a processor to synchronously activate emitter-detector pairs, to read the detected amounts of light from the detectors, and to calculate a location of the object in the detection plane from the detected amounts of light, in accordance with a detection-location relationship that relates detections from emitter-detector pairs to object locations between neighboring target positions in the detection plane.
    Type: Application
    Filed: January 19, 2016
    Publication date: June 2, 2016
    Inventors: Thomas Eriksson, Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Rosengren, Xiatao Wang, Stefan Holmgren, Gunnar Martin Fröjdh, Simon Fellin, Jan Tomas Hartman, Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Richard Berglind, Karl Erik Patrik Nordström, Lars Sparf, Erik Rosengren, John Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain, Oskar Hagberg, Joel Rozada