Abstract: A method of assisting tilt angle adjustment of a thermal camera comprises: arranging the thermal camera at an initial tilt angle; acquiring at least one thermal image by the thermal camera; determining, from the at least one thermal image, a series of sharpness indicators of image parts corresponding to vertically spaced parts of the camera view; identifying a maximum in the series of sharpness indicators; determining, based on the identified maximum, a target sharpness indicator as a predetermined fraction of the identified maximum; and, during tilt angle adjustment, assisting by providing a target signal indicating a target tilt angle of the thermal camera at which a sharpness indicator of a lower part of the camera's field of view equals the determined target sharpness indicator.
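The sharpness-series step can be illustrated with a minimal sketch. The band count, the gradient-based sharpness measure, and the 50% fraction below are assumptions for illustration, not details taken from the abstract:

```python
import numpy as np

def band_sharpness(image, n_bands=8):
    # Mean absolute vertical gradient per horizontal band: a simple
    # stand-in for a per-band sharpness indicator.
    bands = np.array_split(image.astype(float), n_bands, axis=0)
    return [float(np.abs(np.diff(b, axis=0)).mean()) for b in bands]

def target_sharpness(series, fraction=0.5):
    # Target indicator as a predetermined fraction of the series maximum.
    return fraction * max(series)
```

During adjustment, the camera would then be tilted until the sharpness of a lower band reaches the returned target value.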
Type:
Application
Filed:
December 10, 2019
Publication date:
June 18, 2020
Applicant:
Axis AB
Inventors:
Thomas Winzell, Hongping Zhao, Anthony Hawkins
Abstract: A method for remotely controlling a networked video camera includes sending an instruction message for the video camera from a client to a communication controller, and determining whether the message is a candidate for being sent via a peer-to-peer connection. In response to the instruction message being determined not to be a candidate for being sent via a peer-to-peer connection, the method includes sending the instruction message from the communication controller to a camera control service, logging, at the camera control service, at least a portion of the instruction message, and sending the instruction message from the camera control service to the video camera via a communication network. In response to the instruction message being determined to be a candidate for being sent via a peer-to-peer connection, the method includes sending the instruction message via the peer-to-peer connection.
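The routing decision can be sketched as follows. The message fields, the eligibility flag, and the return labels are illustrative assumptions, not the patented protocol:

```python
def route_instruction(msg, p2p_available, audit_log):
    # Eligible messages go directly over a peer-to-peer connection;
    # everything else passes through the camera control service, which
    # logs at least part of the message before forwarding it on.
    if p2p_available and msg.get("p2p_eligible", False):
        return "p2p"
    audit_log.append(msg["command"])
    return "control-service"
```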
Abstract: In a method for tracking an object in video-monitoring scenes, multiple feature vectors are extracted (722) and assembled (724) into point clouds, wherein a point cloud may be assembled for each tracklet, i.e. for each separate part of a track. To determine whether different tracklets relate to the same or different objects, the point clouds of the tracklets are compared (734). Based on the outcome of the comparison, it is deduced whether the first object and the second object may be considered to be the same object and, if so, the first object is associated (738) with the second object.
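The comparison step can be sketched with a toy measure. Centroid distance and the threshold value are illustrative stand-ins; the abstract does not specify how the clouds are compared:

```python
import numpy as np

def cloud_distance(cloud_a, cloud_b):
    # Distance between two point clouds of feature vectors; the L2
    # distance between centroids is used purely as a simple stand-in.
    return float(np.linalg.norm(cloud_a.mean(axis=0) - cloud_b.mean(axis=0)))

def same_object(cloud_a, cloud_b, threshold=1.0):
    # Associate two tracklets when their clouds are close enough.
    return cloud_distance(cloud_a, cloud_b) < threshold
```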
Abstract: Systems, apparatuses, and techniques for video delivery can include one or more of the following: a wireless camera arranged in a wearable form factor comprising a battery to provide energy, and configured to generate a video feed, and a base station in wireless communication with the wireless camera and configured to receive the video feed from the wireless camera and process the video feed, and a video portal device communicatively coupled with the base station and configured to receive the processed video feed from the base station and deliver at least a portion of the processed video feed to one or more remote clients. A base station can reserve a wireless channel for the wireless camera for a video transmission.
Abstract: The disclosure relates to a method, apparatus and system for detecting and reducing the effects of color fringing in digital video acquired by a camera comprising an iris. The method comprises: acquiring, by the camera, a first digital image frame using a first camera setting, including a first iris aperture size; acquiring, by the camera, a second digital image frame using a second camera setting, including a second iris aperture size, wherein the second aperture size is smaller than the first aperture size; comparing at least a specific color component of the first and the second digital image frames; localizing regions having a disproportional intensity ratio in the specific color component between the first digital image frame and the second digital image frame; and reducing the specific color component in the localized regions for subsequently acquired digital image frames.
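The localize-and-reduce steps can be sketched as below. The channel index, ratio threshold, and attenuation gain are illustrative assumptions:

```python
import numpy as np

def fringing_mask(frame_open, frame_closed, channel=2, ratio_thresh=1.5):
    # Flag pixels where the chosen color channel is disproportionally
    # stronger at the large aperture than at the small one.
    a = frame_open[..., channel].astype(float) + 1e-6
    b = frame_closed[..., channel].astype(float) + 1e-6
    return (a / b) > ratio_thresh

def reduce_channel(frame, mask, channel=2, gain=0.5):
    # Attenuate the color component in the localized regions.
    out = frame.astype(float).copy()
    out[..., channel][mask] *= gain
    return out
```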
Abstract: The present invention relates to an emergency notification system where indicators are mounted on or in a building in such a way that optical or thermal signals emitted from the indicators form a time-variant indication detectable outside the building of an emergency event taking place inside the building. Sensors detecting a predetermined sound are mounted inside the building and are each connected to a nearby indicator. When a sensor makes a detection, it sends event information to its associated indicator which will prompt the indicator to emit a first optical or thermal signal. Based on a signal from a timer connected to the indicator, a property of the first signal will change after a certain time has passed, thereby providing the time-variant indication.
Type:
Grant
Filed:
September 24, 2019
Date of Patent:
June 16, 2020
Assignee:
AXIS AB
Inventors:
Ingemar Larsson, Anders Hansson, Daniel Andersson
Abstract: A method and device for encoding a plurality of image frames uses two separate encoders, where each image frame is divided into two portions to each be encoded by one of the two encoders, where the image frame is divided to minimize motion across the boundary between the two portions, such that the two encoders may operate independently of each other without a substantial bit rate penalty or reduced encoding quality.
Abstract: A method and device for encoding a plurality of image frames uses two separate encoders, where each image frame is divided into two portions to each be encoded by one of the two encoders, where a boundary between the two portions is offset between the image frames according to a size of a search window of one of the encoders. Consequently, copying of pixel data for the purpose of motion search is only required in one direction between a first and a second encoder.
Abstract: Methods and apparatus, including computer program products, for creating a quality annotated training data set of images for training a quality estimating neural network. A set of images depicting a same object is received. The images in the set of images have varying image quality. A probe image whose quality is to be estimated is selected from the set of images. A gallery of images is selected from the set of images. The gallery of images does not include the probe image. The probe image is compared to each image in the gallery and a match score is generated for each image comparison. Based on the match scores, a quality value is determined for the probe image. The probe image and its associated quality value are added to a quality annotated training data set for the neural network.
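The probe/gallery annotation loop can be sketched with cosine similarity standing in for a real matcher (the matcher, the embedding representation, and the mean-score aggregation are assumptions):

```python
import numpy as np

def match_score(a, b):
    # Cosine similarity, standing in for a real matcher's score.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def annotate_quality(embeddings):
    # Each image in turn is the probe; all others form the gallery.
    # The probe's quality value is its mean match score over the gallery.
    annotated = []
    for i, probe in enumerate(embeddings):
        gallery = [e for j, e in enumerate(embeddings) if j != i]
        quality = float(np.mean([match_score(probe, g) for g in gallery]))
        annotated.append((probe, quality))
    return annotated
```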
Abstract: The present invention relates to allowing control of a monitoring camera, typically beyond what is supported by the video management system to which the camera is connected. The camera overlays a pattern on the video stream representing a link to an action in a control interface for controlling the camera, and an operator uses an operator-controlled device, such as a mobile phone, to scan the pattern and perform the action to control the camera.
Abstract: A method and a digital video camera for reducing intensity variations in a video stream depicting a scene, comprising: capturing, using a first sensor setting, a first frame; detecting a change in intensity in a portion of the first frame, the portion representing a first area of the scene; determining a second sensor setting based on the first frame; capturing, using the second sensor setting, a second frame; creating a local tone mapping mask, wherein a local tone mapping in the first area of the scene is different from a local tone mapping in an area outside the first area, and wherein the local tone mapping in the area outside the first area is based on a relation between the first sensor setting and the second sensor setting; and applying the local tone mapping mask to the second frame.
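The mask construction can be sketched as a per-pixel gain map. Modeling the sensor-setting relation as a simple exposure ratio, and the rectangular region, are illustrative assumptions:

```python
import numpy as np

def tone_mapping_mask(shape, region, inside_gain, setting_ratio):
    # Outside the changed area the gain compensates the sensor-setting
    # change (modeled here as an exposure ratio); inside it differs.
    mask = np.full(shape, setting_ratio, dtype=float)
    r0, r1, c0, c1 = region
    mask[r0:r1, c0:c1] = inside_gain
    return mask

def apply_mask(frame, mask):
    # Per-pixel application of the local tone mapping mask.
    return frame.astype(float) * mask
```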
Type:
Application
Filed:
November 21, 2019
Publication date:
May 28, 2020
Applicant:
Axis AB
Inventors:
Sebastian Fors, Johan Jeppsson, Anton Ohrn, Jimmie Jonsson, Bjorn Benderius, Andreas Muhrbeck, Karin Dammer
Abstract: A housing for a device has an outside and an inside, as well as an access passage for access from the outside of the housing to a component of the device inside the housing. The access passage comprises an outer opening section and an inner drainage channel. The drainage channel is arranged below the outer opening section, which has vertical sidewalls extending inwardly from the outside of the housing. Each sidewall has a top width (WT) and a bottom width (WB), the top width (WT) being narrower than the bottom width (WB). The sidewalls provide a guiding surface for a liquid entering the access passage, such that the liquid is guided towards the drainage channel.
Abstract: A method, device, and computer-readable medium for synchronizing video are described. A wearable camera captures first video data. Also, the wearable camera timestamps the first video data and organizes the first video data using a hash table. The wearable camera transmits the first video data to a wireless hub via a wireless connection and captures second video data. When the wireless connection between the wearable camera and the wireless hub is unable to support full resolution video playback, the wearable camera down-converts the second video data, timestamps the second video data and organizes the second video data using the hash table. The wearable camera transmits the second video data to the wireless hub. Moreover, the first video data and the second video data are synchronizable according to gap synchronization based on one or more timestamps of the first video data and one or more timestamps of the second video data.
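The merge side of gap synchronization can be sketched as follows. Representing each stream as a plain dict keyed by timestamp is an illustrative simplification of the hash-table organization described above:

```python
def merge_streams(full_res, down_converted):
    # Both inputs are hash tables keyed by timestamp. Down-converted
    # segments fill the gaps; full-resolution segments win where both
    # timestamps exist, yielding one time-ordered playback sequence.
    merged = dict(down_converted)
    merged.update(full_res)
    return [merged[t] for t in sorted(merged)]
```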
Abstract: Video encoding uses periodic intra refresh, in which padding of an intra encoding region can be adapted between image frames based on motion in the image frames.
Inventors:
Joakim Veberg, Johan Persson, Johan Widerdal, Jonas Sjogren, Mathias Walter, Mariano Vozzi, Ola Andersson, Christian Jacobsson, Daniel Ahman, Henrik Svedberg