Patents by Inventor Linus MARTENSSON
Linus MARTENSSON has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20160286129
Abstract: The invention is directed to controlling shake blur and motion blur associated with an image. An exemplary method comprises: receiving a first image frame and a second image frame associated with the image; determining first movement of the camera in the first image frame or the second image frame; determining second movement of the camera between the first image frame and the second image frame; controlling, based on the first movement, at least a portion of shake blur associated with the image; and controlling, based on the second movement, at least a portion of motion blur associated with the image.
Type: Application
Filed: March 19, 2014
Publication date: September 29, 2016
Inventors: Linus Martensson, Ola Thorn, Jonas Gustavsson
-
Publication number: 20160266643
Abstract: The present disclosure provides a system and method of setting a display brightness of a display of an electronic device. One or more images of a user of the electronic device may be captured with a camera of the electronic device. A characteristic of an eye of the user may be detected in the one or more images. A light adaptation state of the user's eye may be determined based on the detected characteristic. The display brightness of the display of the electronic device may be set in view of the determined light adaptation state.
Type: Application
Filed: February 5, 2014
Publication date: September 15, 2016
Inventors: Linus MARTENSSON, Ola THORN, David DE LEON
-
Publication number: 20160255269
Abstract: Obtaining a digital image tagged with corrected image direction information includes capturing and storing a digital image of a scene defined by a field of view of a lifelog camera (10) used to capture the image. A forward-facing direction (28) of a user of the lifelog camera is identified, the forward-facing direction being within the field of view of the lifelog camera. A component of the image that contains image data corresponding to a portion of the scene containing the forward-facing direction of the user is also identified. The digital image is then tagged with the identified component.
Type: Application
Filed: June 3, 2014
Publication date: September 1, 2016
Inventors: Linus MARTENSSON, Ola THORN, Henrik BENGTSSON
-
Publication number: 20160239126
Abstract: An electronic device includes a proximity-sensitive touch sensor array which extends along an input surface of the electronic device and a processing device coupled to the touch sensor array. The processing device is configured to process data captured by the touch sensor array to determine a finger angle at which a finger is directed towards the input surface and an actuation position on the input surface. The processing device is configured to establish an offset-corrected actuation position as a function of the actuation position and the finger angle.
Type: Application
Filed: April 25, 2016
Publication date: August 18, 2016
Inventors: Magnus MIDHOLT, Ola THÖRN, David DE LÉON, Linus MÅRTENSSON, Jim RASMUSSON
-
Publication number: 20160227446
Abstract: The present disclosure relates generally to a method and a wireless communication device for optimizing body-contact-initiated communication between devices. The method, performed in the wireless communication device, comprises: pairing with another wireless communication device using a first communication technology, wherein the first communication technology is Body Coupled Communication (BCC) technology; exchanging, between the wireless communication device and the other wireless communication device, using BCC technology, device-specific information associated with pairing in a second communication technology; and connecting with the other wireless communication device using the second communication technology, thereby performing a handover from BCC technology to the second communication technology.
Type: Application
Filed: February 3, 2015
Publication date: August 4, 2016
Inventors: Aleksandar RODZEVSKI, Linus MÅRTENSSON, Erik WESTENIUS
-
Publication number: 20160182866
Abstract: Video data is captured by an imaging sensor. Motion is detected in the captured video data, e.g., by applying image analysis to different video frames of the captured video data. At least one subarea of an overall imaging area of the imaging sensor is determined. The subarea is determined to correspond to a position of the detected motion. In the determined subarea, a video capturing frame rate is applied which is higher than the video capturing frame rate applied in other parts of the overall imaging area.
Type: Application
Filed: December 19, 2014
Publication date: June 23, 2016
Inventors: Magnus Landqvist, Alexander Hunt, Peter Isberg, Ola Thörn, Linus Mårtensson
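The image-analysis step this abstract leans on can be as simple as differencing two frames and bounding the changed region. A minimal sketch of that step, assuming grayscale frames as row-major 2-D lists and a hypothetical change threshold (the abstract does not specify the analysis method):

```python
def motion_subarea(prev, curr, threshold=10):
    """Find the bounding box of pixels that changed between two frames.

    A minimal stand-in for the motion-detection step: the capture
    pipeline would then drive this subarea at the higher frame rate
    and leave the rest of the sensor at the base rate.
    """
    # Rows and columns where any pixel changed by more than the threshold.
    rows = [r for r, (pr, cr) in enumerate(zip(prev, curr))
            if any(abs(a - b) > threshold for a, b in zip(pr, cr))]
    cols = [c for c in range(len(prev[0]))
            if any(abs(prev[r][c] - curr[r][c]) > threshold
                   for r in range(len(prev)))]
    if not rows:
        return None  # no motion: the whole sensor stays at the base rate
    return (min(rows), min(cols), max(rows), max(cols))
```

A real implementation would work on sensor tiles rather than single pixels, but the subarea-selection idea is the same.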
-
Patent number: 9323407
Abstract: An electronic device includes a proximity-sensitive touch sensor array which extends along an input surface of the electronic device and a processing device coupled to the touch sensor array. The processing device is configured to process data captured by the touch sensor array to determine a finger angle at which a finger is directed towards the input surface and an actuation position on the input surface. The processing device is configured to establish an offset-corrected actuation position as a function of the actuation position and the finger angle.
Type: Grant
Filed: December 6, 2013
Date of Patent: April 26, 2016
Assignees: Sony Corporation, Sony Mobile Communications Inc.
Inventors: Magnus Midholt, Ola Thorn, David De Leon, Linus Martensson, Jim Rasmusson
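The abstract only says the corrected position is "a function of the actuation position and the finger angle" without publishing that function. A sketch of the idea under a hypothetical model (the offset grows with the finger's tilt from vertical, scaled by an assumed fingertip height):

```python
import math

def offset_corrected_position(x, y, finger_angle_deg, fingertip_height_mm=2.0):
    """Correct a sensed touch position for finger-tilt offset.

    Hypothetical model -- not the patented function: a slanted finger's
    sensed centroid lands short of the intended target, so we shift the
    point back along the tilt axis by height * tan(tilt).
    """
    tilt = math.radians(finger_angle_deg)
    offset = fingertip_height_mm * math.tan(tilt)
    # One-axis sketch: shift along x only; a full model would use the
    # finger's azimuth to split the offset between x and y.
    return (x - offset, y)
```

A vertical finger (0°) yields no correction; larger tilt angles shift the reported point further.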
-
Patent number: 9323342
Abstract: A system includes processing logic and output logic. The processing logic is configured to receive information identifying points of interest, identify at least one of location or distance information from each of the points of interest to the system, and sort the points of interest based on the location or distance information. The output logic is configured to project information identifying the points of interest onto a surface, wherein the projected information is displayed in accordance with the sorting.
Type: Grant
Filed: October 22, 2013
Date of Patent: April 26, 2016
Assignee: SONY CORPORATION
Inventors: Pär-Anders Aronsson, Henrik Sven Bengtsson, Håkan Jonsson, Linus Mårtensson, Ola Thörn
-
Publication number: 20160014332
Abstract: Operating a camera system that includes a forward-facing camera directed at a forward scene and a rearward-facing camera directed at a rearward scene opposite the forward scene includes analyzing a video stream containing a reflection of the forward scene in a cornea of an eye of a user of the camera system, the video stream captured with the rearward-facing camera. The analyzing includes identifying an item in the reflection moving relative to a field of view of the forward-facing camera and predicted to enter the field of view of the forward-facing camera, and identifying a characteristic of the item. A camera setting of the forward-facing camera is adjusted according to the characteristic of the identified item, the adjusted camera setting selected to improve capturing a photograph or video containing the item with the forward-facing camera.
Type: Application
Filed: February 13, 2014
Publication date: January 14, 2016
Applicant: Sony Corporation
Inventors: David de Leon, Linus Martensson, Ola Thorn
-
Publication number: 20150373236
Abstract: A lifelog camera is configured to capture a digital image without user input upon detection of a sound-based trigger in an audio signal output by a microphone present at the lifelog camera. The sound-based trigger is indicative of activity of the user or another person near the user.
Type: Application
Filed: June 24, 2014
Publication date: December 24, 2015
Inventors: Ola Thorn, Henrik Bengtsson, Jonas Gustavsson, Linus Martensson
-
Patent number: 9202262
Abstract: The solution disclosed herein reduces the amount of time and computational resources necessary to determine a dominant gradient direction of an image area comprising a plurality of pixels of an image. To that end, the dominant gradient direction of an image area is determined based on two gradient magnitudes determined from four sample points in the image area, where a direction of one of the gradient magnitudes is perpendicular to a direction of the other of the gradient magnitudes. The dominant gradient direction is then determined by taking the arctangent of the computed gradient magnitudes.
Type: Grant
Filed: January 31, 2014
Date of Patent: December 1, 2015
Assignee: SONY CORPORATION
Inventors: Linus Mårtensson, Jonas Gustavsson, Jim Rasmusson
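The abstract describes a four-sample estimate: one gradient magnitude along one axis, a second along the perpendicular axis, and the direction from their arctangent. A minimal sketch of that idea (the sample spacing and the use of `atan2` for quadrant handling are assumptions of this sketch, not details from the abstract):

```python
import math

def dominant_gradient_direction(img, x, y, d=1):
    """Estimate the dominant gradient direction at (x, y) from only
    four sample points: a horizontal difference, a perpendicular
    vertical difference, and the arctangent of the pair.

    `img` is a row-major 2-D list of grayscale values; `d` is the
    assumed sample spacing.
    """
    gx = img[y][x + d] - img[y][x - d]  # gradient magnitude along x
    gy = img[y + d][x] - img[y - d][x]  # perpendicular magnitude along y
    return math.atan2(gy, gx)           # dominant direction in radians
```

Only four pixel reads per estimate, versus the many reads a full gradient-histogram approach would need, which is the resource saving the abstract claims.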
-
Patent number: 9111351
Abstract: A device may obtain, from a camera associated with a reference object, depth image data including objects in a first frame and a second frame; identify features of the objects in the first frame and the second frame; and track movements of the features between the first frame and the second frame. The device may also identify independently moving features in the second frame, based on the tracked movements; remove the independently moving features from the depth image data to obtain a static feature set; and process the depth image data corresponding to the static feature set to detect changes in the relative position of objects in the first frame and the second frame. The device may further translate the changes in relative position into corresponding movement data of the camera and provide the corresponding movement data to an inertial navigation system.
Type: Grant
Filed: November 29, 2012
Date of Patent: August 18, 2015
Assignees: Sony Corporation, Sony Mobile Communications AB
Inventor: Linus Mårtensson
-
Publication number: 20150221068
Abstract: The solution disclosed herein reduces the amount of time and computational resources necessary to determine a dominant gradient direction of an image area comprising a plurality of pixels of an image. To that end, the dominant gradient direction of an image area is determined based on two gradient magnitudes determined from four sample points in the image area, where a direction of one of the gradient magnitudes is perpendicular to a direction of the other of the gradient magnitudes. The dominant gradient direction is then determined by taking the arctangent of the computed gradient magnitudes.
Type: Application
Filed: January 31, 2014
Publication date: August 6, 2015
Inventors: Linus Mårtensson, Jonas Gustavsson, Jim Rasmusson
-
Publication number: 20150222850
Abstract: The present disclosure relates to electronic communication devices, and more specifically to streaming media between the electronic communication devices. The present disclosure relates to a method for setting up a media stream connection between a sending device 100, 110 and at least one receiving device 200, 210. The method comprises sending a session initiation message S10 to the at least one receiving device, sending a request S20 for a list of IP-addresses of the at least one receiving device and receiving S30 the list of IP-addresses from the at least one receiving device. The method further comprises setting up a uni-directional or bi-directional connection between the sending device 100, 110 and the at least one receiving device 200, 210 by using the received list of IP-addresses and sending the media stream S50 via the uni-directional or bi-directional connection to the at least one receiving device.
Type: Application
Filed: December 19, 2014
Publication date: August 6, 2015
Applicant: SONY CORPORATION
Inventors: Johan Apelqvist, Henrik Bengtsson, Linus Mårtensson, Christopher Ohlsson, Per Ögren
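The abstract enumerates a concrete message sequence: session initiation (S10), a request for the receiver's IP-address list (S20), receipt of that list (S30), connection setup, and streaming (S50). A sketch of that flow with hypothetical message names and an in-process stand-in for the receiver (none of these identifiers come from the patent):

```python
class Receiver:
    """Stand-in for a receiving device with several network interfaces."""

    def __init__(self, addresses):
        self.addresses = addresses  # IP addresses of its interfaces
        self.session = None
        self.received = []

    def handle(self, message, payload=None):
        if message == "SESSION_INIT":       # S10: accept the session
            self.session = payload
            return "OK"
        if message == "REQUEST_IP_LIST":    # S20 -> S30: report addresses
            return list(self.addresses)
        if message == "MEDIA":              # S50: consume a stream chunk
            self.received.append(payload)
            return "OK"

def stream_to(receiver, session_id, chunks):
    """Run the S10/S20/S30/S50 sequence against one receiver."""
    receiver.handle("SESSION_INIT", session_id)
    ip_list = receiver.handle("REQUEST_IP_LIST")  # sender picks a reachable address
    for chunk in chunks:                          # uni-directional stream
        receiver.handle("MEDIA", chunk)
    return ip_list
```

In a real deployment the address list lets the sender choose the best route (e.g., a LAN address over a WAN one) before opening the stream connection.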
-
Patent number: 9076269
Abstract: An object of the present invention is to provide a way of improving the stitching procedure. The object is achieved by a method in an electronic device for creating a combined image. The method comprises obtaining (701) a first set of at least two sub-images recorded by an array camera. The method further comprises obtaining (702) a second set of at least two sub-images recorded by the array camera. A first depth map is calculated for the first set of sub-images. The first depth map comprises information about a first plane of the sub-images and a second plane of the sub-images. The first plane is at a first distance from the point where the sub-images were recorded, and the second plane is at a second distance from that point. A second depth map is calculated for the second set of sub-images. The second depth map comprises information about the first plane and the second plane.
Type: Grant
Filed: January 31, 2012
Date of Patent: July 7, 2015
Assignees: Sony Corporation, Sony Mobile Communications AB
Inventors: Per Åstrand, Magnus Landqvist, Karl Ola Thörn, Mats Wernersson, Linus Mårtensson, Henrik Bengtsson
-
Patent number: 9024973
Abstract: An object of the present invention is to provide a way of displaying content of an obscured area of a view for a user. The object is achieved by a method in an electronic device. The electronic device obtains (402) information identifying a position of the obscured area in relation to the view. The electronic device further calculates (403) information identifying the position of the obscured area on a display. The electronic device obtains (404), from a memory, information regarding the content of the view or part of the view when not obscured. The electronic device creates (405) an image relating to the content of the obscured area of the view. The electronic device further displays (406) the image at a place on the display that corresponds to the calculated (403) position of the obscured area in relation to the display.
Type: Grant
Filed: March 12, 2012
Date of Patent: May 5, 2015
Assignees: Sony Corporation, Sony Mobile Communications AB
Inventors: Karl Ola Thörn, Per Åstrand, Henrik Bengtsson, Magnus Landqvist, Linus Mårtensson, Mats Wernersson
-
Patent number: 8994672
Abstract: The invention is directed to systems, methods and computer program products for transferring content between electronic devices via skin input. An exemplary method includes detecting, by an interface device, an input received on a user's skin, wherein the interface device is in electronic communication with at least one of the first device or the second device; in response to detecting the input, determining the type of input; and at least one of: in response to determining the type of input is a first type of input, initiating transmission of content from the first device to the second device; or in response to determining the type of input is a second type of input, initiating reception of content at the second device.
Type: Grant
Filed: April 9, 2012
Date of Patent: March 31, 2015
Assignees: Sony Corporation, Sony Mobile Communications AB
Inventors: Ola Thörn, Henrik Bengtsson, Håkan Jonsson, Linus Mårtensson, Pär-Anders Aronsson
-
Publication number: 20150029094
Abstract: A system includes processing logic and output logic. The processing logic is configured to receive information identifying points of interest, identify at least one of location or distance information from each of the points of interest to the system, and sort the points of interest based on the location or distance information. The output logic is configured to project information identifying the points of interest onto a surface, wherein the projected information is displayed in accordance with the sorting.
Type: Application
Filed: October 22, 2013
Publication date: January 29, 2015
Applicant: SONY CORPORATION
Inventors: Pär-Anders Aronsson, Henrik Sven Bengtsson, Håkan Jonsson, Linus Mårtensson, Ola Thörn
-
Publication number: 20140219581
Abstract: An object of the present invention is to provide a way of improving the stitching procedure. The object is achieved by a method in an electronic device for creating a combined image. The method comprises obtaining (701) a first set of at least two sub-images recorded by an array camera. The method further comprises obtaining (702) a second set of at least two sub-images recorded by the array camera. A first depth map is calculated for the first set of sub-images. The first depth map comprises information about a first plane of the sub-images and a second plane of the sub-images. The first plane is at a first distance from the point where the sub-images were recorded, and the second plane is at a second distance from that point. A second depth map is calculated for the second set of sub-images. The second depth map comprises information about the first plane and the second plane.
Type: Application
Filed: January 31, 2012
Publication date: August 7, 2014
Inventors: Per Astrand, Magnus Landqvist, Karl Ola Thörn, Mats Wernersson, Linus Mårtensson, Henrik Bengtsson
-
Publication number: 20140184558
Abstract: An electronic device includes a proximity-sensitive touch sensor array which extends along an input surface of the electronic device and a processing device coupled to the touch sensor array. The processing device is configured to process data captured by the touch sensor array to determine a finger angle at which a finger is directed towards the input surface and an actuation position on the input surface. The processing device is configured to establish an offset-corrected actuation position as a function of the actuation position and the finger angle.
Type: Application
Filed: December 6, 2013
Publication date: July 3, 2014
Applicant: Sony Mobile Communications AB
Inventors: Magnus Midholt, Ola Thorn, David De Leon, Linus Martensson, Jim Rasmusson