Patents by Inventor Linus MARTENSSON

Linus MARTENSSON has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20140104392
    Abstract: The present invention relates to a method for generating image information. According to the method, light field information of an environment (13) is captured (31), and gaze information (20) indicating a position on a display unit (17) at which a user (19) is gazing is detected (32). Based on the light field information and the gaze information, the image information is generated (33).
    Type: Application
    Filed: October 11, 2013
    Publication date: April 17, 2014
    Applicant: SONY MOBILE COMMUNICATIONS AB
    Inventors: Ola Thörn, David De Léon, Linus Martensson, Andreas Kristensson, Pär-Anders Aronsson
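    A minimal sketch of the idea in this abstract, assuming the light field is reduced to a per-pixel image plus a depth map: the gaze point selects a focus depth, and pixels away from that depth are attenuated as a stand-in for synthetic-aperture refocusing. All names and the blur model are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch (not the patented implementation): refocus an image
# at the depth found under the user's gaze point.

def generate_image(light_field, depth_map, gaze_xy):
    """Render `light_field` (a 2-D pixel grid) focused at the depth
    under the gaze point `gaze_xy` = (x, y) on the display."""
    gx, gy = gaze_xy
    focus_depth = depth_map[gy][gx]  # depth the user is looking at
    out = []
    for y, row in enumerate(light_field):
        out_row = []
        for x, value in enumerate(row):
            # Pixels at the focus depth stay sharp; others are attenuated
            # in proportion to their depth difference (a toy blur model).
            blur = abs(depth_map[y][x] - focus_depth)
            out_row.append(value / (1 + blur))
        out.append(out_row)
    return out
```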
  • Publication number: 20140022372
    Abstract: A method and system are provided for monitoring an object. The method comprises: utilizing a first sensor system, comprising at least one sensor disposed in a monitored space, to store first data on the monitored space; utilizing a second sensor system to store second data on an object of interest to be monitored; analyzing the first data produced by the first sensor system with signal processing equipment by searching the stored first data for at least one object to be monitored and by comparing parameters describing characteristics of the at least one detected object with stored reference parameters corresponding to reference characteristics and with a list of objects of interest based on the information received from the second sensor system; and generating, on the basis of an event detected in the analysis, information relating to the state of the object.
    Type: Application
    Filed: July 9, 2013
    Publication date: January 23, 2014
    Inventors: Linus MÅRTENSSON, Pär STENBERG, Markus AGEVIK, Karl Ola THÖRN
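    A rough sketch of the comparison step in this abstract, assuming detections and references are plain parameter dictionaries; the tolerance, field names, and match rule are invented for illustration and are not from the patent.

```python
# Illustrative sketch (not the patented implementation): compare detected
# object parameters against stored references for the objects of interest
# reported by the second sensor system, and emit state information on a match.

def monitor(detections, references, objects_of_interest, tol=0.1):
    events = []
    for det in detections:
        for name in objects_of_interest:
            ref = references.get(name)
            if ref is None:
                continue
            # Match when every reference characteristic is within tolerance.
            if all(abs(det["params"][k] - ref[k]) <= tol for k in ref):
                events.append({"object": name,
                               "state": det.get("state", "detected")})
    return events
```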
  • Publication number: 20130268240
    Abstract: The present invention relates to a method and device for classifying an activity of an object, the method comprising: receiving a sound signal from a sensor, determining a type of sound based on said sound signal, and determining said activity based on said type of sound.
    Type: Application
    Filed: March 8, 2013
    Publication date: October 10, 2013
    Inventors: Ola Karl Thörn, Linus Mårtensson, Henrik Bengtsson, Pär-Anders Aronsson, Håkan Jonsson
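    The pipeline in this abstract (sound signal → sound type → activity) can be sketched as below; the features, sound-type labels, and activity mapping are all illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch (not the patented implementation): classify an
# activity from a sound signal via an intermediate sound-type label.

SOUND_TO_ACTIVITY = {"silence": "idle", "footsteps": "walking", "keyboard": "typing"}

def sound_type(signal):
    """Toy classifier using signal energy and zero-crossing rate."""
    energy = sum(s * s for s in signal) / len(signal)
    if energy < 0.01:
        return "silence"
    zero_crossings = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)
    # Rapid oscillation -> clicky "keyboard"; slower -> "footsteps".
    return "keyboard" if zero_crossings > len(signal) // 4 else "footsteps"

def classify_activity(signal):
    return SOUND_TO_ACTIVITY[sound_type(signal)]
```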
  • Publication number: 20130265437
    Abstract: The invention is directed to systems, methods and computer program products for transferring content between electronic devices via skin input. An exemplary method includes detecting, by an interface device, an input received on a user's skin, wherein the interface device is in electronic communication with at least one of the first device or the second device; in response to detecting the input, determining the type of input; and at least one of: in response to determining the type of input is a first type of input, initiating transmission of content from the first device to the second device; or in response to determining the type of input is a second type of input, initiating reception of content at the second device.
    Type: Application
    Filed: April 9, 2012
    Publication date: October 10, 2013
    Applicant: SONY MOBILE COMMUNICATIONS AB
    Inventors: Ola Thörn, Henrik Bengtsson, Håkan Jonsson, Linus Mårtensson, Pär-Anders Aronsson
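    The input-type dispatch in this abstract can be sketched as follows; the gesture names ("swipe_out"/"swipe_in") and the Device class are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch (not the patented implementation): dispatch on the
# type of skin input to either send or receive content between devices.

class Device:
    def __init__(self, name, content=None):
        self.name, self.content = name, content

    def send_to(self, other):
        other.content = self.content
        return ("sent", self.content)

    def receive_from(self, other):
        self.content = other.content
        return ("received", self.content)

def handle_skin_input(input_type, first_device, second_device):
    if input_type == "swipe_out":   # first type of input: push content
        return first_device.send_to(second_device)
    if input_type == "swipe_in":    # second type of input: pull content
        return second_device.receive_from(first_device)
    return None                     # unrecognized input: do nothing
```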
  • Publication number: 20130156264
    Abstract: A device may obtain, from a camera associated with a reference object, depth image data including objects in a first frame and a second frame; identify features of the objects in the first frame and the second frame; and track movements of the features between the first frame and the second frame. The device may also identify independently moving features in the second frame, based on the tracked movements; remove the independently moving features from the depth image data to obtain a static feature set; and process the depth image data corresponding to the static feature set to detect changes in the relative position of objects in the first frame and the second frame. The device may further translate the changes in relative position into corresponding movement data of the camera and provide the corresponding movement data to an inertial navigation system.
    Type: Application
    Filed: November 29, 2012
    Publication date: June 20, 2013
    Inventor: Linus Mårtensson
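    The frame-to-frame processing in this abstract can be sketched in 2-D: track feature displacements, treat the median displacement as the dominant (camera-induced) motion, drop independently moving features, and negate the static-scene motion to estimate camera movement. The threshold and data layout are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch (not the patented implementation): estimate 2-D
# camera motion from features tracked between two frames.

def camera_motion(frame1, frame2, threshold=0.5):
    """frame1/frame2 map feature id -> (x, y) position in that frame."""
    # Displacement of every feature tracked in both frames.
    moves = {fid: (frame2[fid][0] - p[0], frame2[fid][1] - p[1])
             for fid, p in frame1.items() if fid in frame2}
    # The median displacement approximates the dominant, camera-induced motion.
    xs = sorted(dx for dx, _ in moves.values())
    ys = sorted(dy for _, dy in moves.values())
    med = (xs[len(xs) // 2], ys[len(ys) // 2])
    # Static feature set: features moving with the dominant motion;
    # independently moving features are discarded.
    static = {fid: d for fid, d in moves.items()
              if abs(d[0] - med[0]) + abs(d[1] - med[1]) <= threshold}
    # The camera moves opposite to the apparent motion of the static scene.
    n = len(static)
    return (-sum(d[0] for d in static.values()) / n,
            -sum(d[1] for d in static.values()) / n)
```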