Patents by Inventor Ilia Ovsiannikov

Ilia Ovsiannikov has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200150924
    Abstract: An N×N multiplier may include an N/2×N first multiplier, an N/2×N/2 second multiplier, and an N/2×N/2 third multiplier. The N×N multiplier receives two operands to multiply. The first, second and/or third multipliers are selectively disabled if an operand equals zero or has a small value. If the operands are both less than 2^(N/2), the second or the third multiplier is used to multiply the operands. If one operand is less than 2^(N/2) and the other operand is equal to or greater than 2^(N/2), the first multiplier is used, or the second and third multipliers are used, to multiply the operands. If both operands are equal to or greater than 2^(N/2), the first, second and third multipliers are used to multiply the operands.
    Type: Application
    Filed: February 14, 2019
    Publication date: May 14, 2020
    Inventors: Ilia OVSIANNIKOV, Ali SHAFIEE ARDESTANI, Joseph HASSOUN, Lei WANG
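    Illustrative sketch: the operand-dependent selection of sub-multipliers described in the abstract above can be modeled in software. The 16-bit default width and the particular way the three partial products are combined are assumptions made for illustration; this models the selection logic only, not the hardware in the filing.

    ```python
    def nxn_multiply(a: int, b: int, n: int = 16) -> int:
        """Multiply two n-bit operands while engaging only the sub-multipliers
        that the operand magnitudes require (software model of the selection)."""
        half = n // 2
        threshold = 1 << half            # 2^(N/2)
        mask = threshold - 1

        if a == 0 or b == 0:
            return 0                     # all three sub-multipliers disabled

        if a < threshold and b < threshold:
            return a * b                 # a single N/2 x N/2 multiplier suffices

        if a < threshold or b < threshold:
            small, large = (a, b) if a < threshold else (b, a)
            return small * large         # modeled as the N/2 x N first multiplier

        # Both operands >= 2^(N/2): combine partial products from all three multipliers.
        a_hi, a_lo = a >> half, a & mask
        b_hi, b_lo = b >> half, b & mask
        p1 = a_hi * b                    # N/2 x N   "first" multiplier
        p2 = a_lo * b_hi                 # N/2 x N/2 "second" multiplier
        p3 = a_lo * b_lo                 # N/2 x N/2 "third" multiplier
        return (p1 << half) + (p2 << half) + p3
    ```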
  • Publication number: 20200097823
    Abstract: A system and a method of quantizing a pre-trained neural network include determining, by a layer/channel bit-width determiner, for each layer or channel of the pre-trained neural network, a minimum quantization noise for the layer or the channel for each master bit-width value in a predetermined set of master bit-width values; and selecting, by a bit-width selector, for the layer or the channel the master bit-width value having the minimum quantization noise for the layer or the channel. In one embodiment, the minimum quantization noise for the layer or the channel is based on a square of a range of weights for the layer or the channel, multiplied by a constant raised to a negative power of a current master bit-width value.
    Type: Application
    Filed: November 5, 2018
    Publication date: March 26, 2020
    Inventors: Hui CHEN, Ilia OVSIANNIKOV
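    The noise model in the last sentence of the abstract can be written out directly. A minimal sketch follows, assuming a constant of 2.0 and a small candidate set of bit-widths, neither of which is given in the abstract.

    ```python
    import numpy as np

    def quantization_noise(weight_range: float, bit_width: int, c: float = 2.0) -> float:
        # Noise model from the abstract: (weight range)^2 * c^(-bit width).
        # The constant c is not specified in the abstract; 2.0 is an assumption.
        return weight_range ** 2 * c ** (-bit_width)

    def noise_per_bit_width(weights: np.ndarray, candidates=(4, 6, 8)) -> dict:
        """Modeled quantization noise of one layer or channel for each candidate
        master bit-width, as the layer/channel bit-width determiner would compute."""
        w_range = float(weights.max() - weights.min())
        return {b: quantization_noise(w_range, b) for b in candidates}

    # Selection for one layer: keep the candidate with the smallest modeled noise.
    layer_weights = np.random.randn(64, 128) * 0.05
    noise = noise_per_bit_width(layer_weights)
    selected_bit_width = min(noise, key=noise.get)
    ```

    Note that under this model alone the widest candidate always has the lowest noise; a practical selector would weigh that noise against a size or latency budget, which the sketch does not attempt.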
  • Patent number: 10557925
    Abstract: The Time-of-Flight (TOF) technique is combined with analog amplitude modulation within each pixel in an image sensor. The pixel may be a two-tap pixel or a one-tap pixel. Two photoelectron receiver circuits in the pixel receive respective analog modulating signals. The distribution of the received photoelectron charge between these two circuits is controlled by the difference (or ratio) of the two analog modulating voltages. The differential signals generated in this manner within the pixel are modulated in time domain for TOF measurement. Thus, the TOF information is added to the received light signal by the analog domain-based single-ended to differential converter inside the pixel itself. The TOF-based measurement of range and its resolution are controllable by changing the duration of modulation. An autonomous navigation system with these features may provide improved vision for drivers under difficult driving conditions like low light, fog, bad weather, or strong ambient light.
    Type: Grant
    Filed: November 1, 2016
    Date of Patent: February 11, 2020
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Yibing Michelle Wang, Tae-Yon Lee, Ilia Ovsiannikov
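    The range recovery can be illustrated with a common ratio-based two-tap readout model: the share of photoelectron charge collected by the second tap over the modulation window encodes the time of flight. This is a sketch of the principle only; the filing's analog amplitude modulation may map charge to time differently.

    ```python
    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def two_tap_range_m(q1: float, q2: float, modulation_duration_s: float) -> float:
        """Estimate range from the charge split between the two in-pixel taps,
        assuming time of flight proportional to q2 / (q1 + q2) over the
        modulation window (an illustrative model, not the exact scheme filed)."""
        total = q1 + q2
        if total <= 0.0:
            raise ValueError("no detected charge")
        tof_s = modulation_duration_s * q2 / total
        return SPEED_OF_LIGHT_M_S * tof_s / 2.0   # halve for the round trip

    # Example: a 100 ns modulation window with 30% of the charge in tap 2.
    print(two_tap_range_m(q1=0.7, q2=0.3, modulation_duration_s=100e-9))  # ~4.5 m
    ```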
  • Publication number: 20200043196
    Abstract: A Dynamic Vision Sensor (DVS) pose-estimation system includes a DVS, a transformation estimator, an inertial measurement unit (IMU) and a camera-pose estimator based on sensor fusion. The DVS detects DVS events and shapes frames based on a number of accumulated DVS events. The transformation estimator estimates a 3D transformation of the DVS camera based on an estimated depth and matches confidence-level values within a camera-projection model such that at least one of a plurality of DVS events detected during a first frame corresponds to a DVS event detected during a second subsequent frame. The IMU detects inertial movements of the DVS with respect to world coordinates between the first and second frames. The camera-pose estimator combines information from a change in a pose of the camera-projection model between the first frame and the second frame based on the estimated transformation and the detected inertial movements of the DVS.
    Type: Application
    Filed: October 9, 2019
    Publication date: February 6, 2020
    Inventors: Zhengping JI, Lilong SHI, Yibing Michelle WANG, Hyun Surk RYU, Ilia OVSIANNIKOV
  • Publication number: 20200033456
    Abstract: An image sensor includes a time-resolving sensor and a processor. The time-resolving sensor outputs a first signal and a second signal pair in response to detecting one or more photons that have been reflected from an object. A first ratio of a magnitude of the first signal to a sum of the magnitude of the first signal and a magnitude of the second signal is proportional to a time of flight of the one or more detected photons. A second ratio of the magnitude of the second signal to the sum of the magnitude of the first signal and the magnitude of the second signal is proportional to the time of flight of the one or more detected photons. The processor determines a surface reflectance of the object where the light pulse has been reflected, based on the first signal and the second signal pair, and may generate a grayscale image.
    Type: Application
    Filed: September 24, 2018
    Publication date: January 30, 2020
    Inventors: Yibing Michelle WANG, Lilong SHI, Ilia OVSIANNIKOV
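    A sketch of how the processor could turn one first/second signal pair into a time of flight and a grayscale (reflectance) value, following the ratios stated in the abstract. The inverse-square range compensation and the unit emitted energy are assumptions added for illustration.

    ```python
    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def tof_and_reflectance(s1: float, s2: float, max_tof_s: float,
                            emitted_energy: float = 1.0):
        """Time of flight taken as proportional to s1 / (s1 + s2), per the
        abstract; relative reflectance recovered from the total signal."""
        total = s1 + s2
        if total <= 0.0:
            raise ValueError("no detected signal")
        tof_s = max_tof_s * s1 / total
        distance_m = SPEED_OF_LIGHT_M_S * tof_s / 2.0
        reflectance = total * distance_m ** 2 / emitted_energy  # assumed r^2 falloff
        return tof_s, reflectance

    # Example: a pixel whose signal pair splits 40/60 over a 200 ns full-scale TOF.
    tof_s, reflectance = tof_and_reflectance(0.4, 0.6, max_tof_s=200e-9)
    ```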
  • Patent number: 10547830
    Abstract: An apparatus and a method are provided. The apparatus includes a light source configured to project light in a changing pattern that reduces the light's noticeability; collection optics through which light passes and forms an epipolar plane with the light source; and an image sensor configured to receive light passed through the collection optics to acquire image information and depth information simultaneously. The method includes projecting light by a light source in a changing pattern that reduces the light's noticeability; passing light through collection optics and forming an epipolar plane between the collection optics and the light source; and receiving in an image sensor light passed through the collection optics to acquire image information and depth information simultaneously.
    Type: Grant
    Filed: January 6, 2016
    Date of Patent: January 28, 2020
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Ilia Ovsiannikov, Yibing Michelle Wang, Peter Deane
  • Publication number: 20200026980
    Abstract: A neural processor. In some embodiments, the processor includes a first tile, a second tile, a memory, and a bus. The bus may be connected to the memory, the first tile, and the second tile. The first tile may include: a first weight register, a second weight register, an activations buffer, a first multiplier, and a second multiplier. The activations buffer may be configured to include: a first queue connected to the first multiplier and a second queue connected to the second multiplier. The first queue may include a first register and a second register adjacent to the first register, the first register being an output register of the first queue. The first tile may be configured: in a first state: to multiply, in the first multiplier, a first weight by an activation from the output register of the first queue, and in a second state: to multiply, in the first multiplier, the first weight by an activation from the second register of the first queue.
    Type: Application
    Filed: August 27, 2019
    Publication date: January 23, 2020
    Inventors: Ilia Ovsiannikov, Ali Shafiee Ardestani, Joseph H. Hassoun, Lei Wang, Sehwan Lee, JoonHo Song, Jun-Woo Jang, Yibing Michelle Wang, Yuecheng Li
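    The two states of the tile can be modeled with a toy activation queue. The abstract defines the states but not why the tile switches between them; the zero-skipping reading below (use the second register when the output register holds a zero) is an assumption made for illustration.

    ```python
    def tile_multiplies(weight: float, activations):
        """Return (products, cycles) for one multiplier fed by one queue.
        First state: consume the queue's output register. Second state: when that
        register holds zero, consume the second register in the same cycle."""
        queue = list(activations)
        products, cycles, i = [], 0, 0
        while i < len(queue):
            if queue[i] != 0 or i + 1 >= len(queue):
                products.append(weight * queue[i])       # first state
                i += 1
            else:
                products.append(weight * queue[i + 1])   # second state: skip the zero
                i += 2
            cycles += 1
        return products, cycles

    # Example: six buffered activations, three of them zero, handled in four cycles.
    products, cycles = tile_multiplies(0.5, [1.0, 0.0, 3.0, 0.0, 0.0, 2.0])
    ```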
  • Publication number: 20200026979
    Abstract: A neural processor. In some embodiments, the processor includes a first tile, a second tile, a memory, and a bus. The bus may be connected to the memory, the first tile, and the second tile. The first tile may include: a first weight register, a second weight register, an activations buffer, a first multiplier, and a second multiplier. The activations buffer may be configured to include: a first queue connected to the first multiplier and a second queue connected to the second multiplier. The first queue may include a first register and a second register adjacent to the first register, the first register being an output register of the first queue. The first tile may be configured: in a first state: to multiply, in the first multiplier, a first weight by an activation from the output register of the first queue, and in a second state: to multiply, in the first multiplier, the first weight by an activation from the second register of the first queue.
    Type: Application
    Filed: August 27, 2019
    Publication date: January 23, 2020
    Inventors: Ilia Ovsiannikov, Ali Shafiee Ardestani, Joseph H. Hassoun, Lei Wang, Sehwan Lee, JoonHo Song, Jun-Woo Jang, Yibing Michelle Wang, Yuecheng Li
  • Publication number: 20200026978
    Abstract: A neural processor. In some embodiments, the processor includes a first tile, a second tile, a memory, and a bus. The bus may be connected to the memory, the first tile, and the second tile. The first tile may include: a first weight register, a second weight register, an activations buffer, a first multiplier, and a second multiplier. The activations buffer may be configured to include: a first queue connected to the first multiplier and a second queue connected to the second multiplier. The first queue may include a first register and a second register adjacent to the first register, the first register being an output register of the first queue. The first tile may be configured: in a first state: to multiply, in the first multiplier, a first weight by an activation from the output register of the first queue, and in a second state: to multiply, in the first multiplier, the first weight by an activation from the second register of the first queue.
    Type: Application
    Filed: August 27, 2019
    Publication date: January 23, 2020
    Inventors: Ilia Ovsiannikov, Ali Shafiee Ardestani, Joseph H. Hassoun, Lei Wang, Sehwan Lee, JoonHo Song, Jun-Woo Jang, Yibing Michelle Wang, Yuecheng Li
  • Patent number: 10531073
    Abstract: Using one or more patterned markers inside the projector module of a three-dimensional (3D) camera to facilitate automatic calibration of the camera's depth sensing operation. The 3D camera utilizes epipolar geometry-based imaging in conjunction with laser beam point-scans in a triangulation-based approach to depth measurements. A light-sensing element and one or more reflective markers inside the projector module facilitate periodic self-calibration of camera's depth sensing operation. To calibrate the camera, the markers are point-scanned using the laser beam and the reflected light is sensed using the light-sensing element. Based on the output of the light-sensing element, the laser's turn-on delay is adjusted to perfectly align a laser light spot with the corresponding reflective marker. Using reflective markers, the exact direction and speed of the scanning beam over time can be determined as well.
    Type: Grant
    Filed: May 17, 2016
    Date of Patent: January 7, 2020
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventor: Ilia Ovsiannikov
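    The self-calibration step amounts to searching for the turn-on delay that best centers the scanned spot on a reflective marker. A minimal sketch, assuming a hypothetical measurement callback and a simple sweep-and-maximize strategy:

    ```python
    def calibrate_turn_on_delay(sense_marker_return, candidate_delays_ns):
        """Return the laser turn-on delay at which the in-projector light-sensing
        element reports the strongest return from the reflective marker, i.e. the
        delay that best aligns the scanned light spot with the marker.
        sense_marker_return(delay_ns) is a hypothetical measurement callback."""
        return max(candidate_delays_ns, key=sense_marker_return)

    # Example with a synthetic response that peaks when the delay is 12 ns:
    synthetic_return = lambda delay_ns: -abs(delay_ns - 12.0)
    best_delay_ns = calibrate_turn_on_delay(synthetic_return,
                                            [0.5 * k for k in range(60)])
    ```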
  • Publication number: 20190392253
    Abstract: A client device configured with a neural network includes a processor, a memory, a user interface, a communications interface, a power supply and an input device, wherein the memory includes a trained neural network received from a server system that has trained and configured the neural network for the client device. A server system and a method of training a neural network are disclosed.
    Type: Application
    Filed: August 27, 2019
    Publication date: December 26, 2019
    Inventors: Zhengping JI, Ilia OVSIANNIKOV, Yibing Michelle WANG, Lilong SHI
  • Publication number: 20190392287
    Abstract: A neural processor. In some embodiments, the processor includes a first tile, a second tile, a memory, and a bus. The bus may be connected to the memory, the first tile, and the second tile. The first tile may include: a first weight register, a second weight register, an activations buffer, a first multiplier, and a second multiplier. The activations buffer may be configured to include: a first queue connected to the first multiplier and a second queue connected to the second multiplier. The first queue may include a first register and a second register adjacent to the first register, the first register being an output register of the first queue. The first tile may be configured: in a first state: to multiply, in the first multiplier, a first weight by an activation from the output register of the first queue, and in a second state: to multiply, in the first multiplier, the first weight by an activation from the second register of the first queue.
    Type: Application
    Filed: June 19, 2019
    Publication date: December 26, 2019
    Inventors: Ilia Ovsiannikov, Ali Shafiee Ardestani, Joseph H. Hassoun, Lei Wang, Sehwan Lee, JoonHo Song, Jun-Woo Jang, Yibing Michelle Wang, Yuecheng Li
  • Patent number: 10510160
    Abstract: A Dynamic Vision Sensor (DVS) pose-estimation system includes a DVS, a transformation estimator, an inertial measurement unit (IMU) and a camera-pose estimator based on sensor fusion. The DVS detects DVS events and shapes frames based on a number of accumulated DVS events. The transformation estimator estimates a 3D transformation of the DVS camera based on an estimated depth and matches confidence-level values within a camera-projection model such that at least one of a plurality of DVS events detected during a first frame corresponds to a DVS event detected during a second subsequent frame. The IMU detects inertial movements of the DVS with respect to world coordinates between the first and second frames. The camera-pose estimator combines information from a change in a pose of the camera-projection model between the first frame and the second frame based on the estimated transformation and the detected inertial movements of the DVS.
    Type: Grant
    Filed: March 13, 2017
    Date of Patent: December 17, 2019
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Zhengping Ji, Lilong Shi, Yibing Michelle Wang, Hyun Surk Ryu, Ilia Ovsiannikov
  • Publication number: 20190379851
    Abstract: Using the same image sensor to capture a two-dimensional (2D) image and three-dimensional (3D) depth measurements for a 3D object. A laser point-scans the surface of the object with light spots, which are detected by a pixel array in the image sensor to generate the 3D depth profile of the object using triangulation. Each row of pixels in the pixel array forms an epipolar line of the corresponding laser scan line. Timestamping provides a correspondence between the pixel location of a captured light spot and the respective scan angle of the laser to remove any ambiguity in triangulation. An Analog-to-Digital Converter (ADC) in the image sensor operates as a Time-to-Digital Converter (TDC) to generate timestamps. A timestamp calibration circuit is provided on-board to record the propagation delay of each column of pixels in the pixel array and to provide necessary corrections to the timestamp values generated during 3D depth measurements.
    Type: Application
    Filed: August 23, 2019
    Publication date: December 12, 2019
    Inventors: Yibing Michelle WANG, Ilia OVSIANNIKOV
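    The depth recovery itself follows textbook epipolar triangulation: the timestamp ties a lit pixel column to the laser scan angle, and depth falls out of the column/angle pair. The sketch below illustrates that principle rather than the exact expression in the filing; angle_of_time is a hypothetical mapping from corrected timestamp to scan angle, and subtracting the per-column delay stands in for the on-board timestamp correction.

    ```python
    import math

    def depth_from_point_scan(pixel_column_px: float, timestamp_s: float,
                              column_delay_s: float, angle_of_time,
                              focal_length_px: float, baseline_m: float) -> float:
        """Triangulate depth for one detected light spot (illustrative only)."""
        theta = angle_of_time(timestamp_s - column_delay_s)  # per-column correction
        denominator = pixel_column_px - focal_length_px * math.tan(theta)
        if abs(denominator) < 1e-9:
            raise ValueError("degenerate column/angle pair")
        return focal_length_px * baseline_m / denominator
    ```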
  • Publication number: 20190349569
    Abstract: A structured-light imaging system includes a projector, an image sensor and a controller. The projector projects a structured-light pattern onto a selected slice of a scene in which the selected slice of the scene includes a first predetermined size in a first direction and a second predetermined size in a second direction that is substantially orthogonal to the first direction. The image sensor scans the selected slice of the scene and generates an output corresponding to each region of at least one region of the selected slice. The image sensor and the projector are synchronized in an epipolar manner. The controller is coupled to the image sensor and detects whether an object is located within each scanned region and controls the projector to project the structured-light pattern a first plurality of times towards regions of the selected slice of the scene in which no object has been detected.
    Type: Application
    Filed: July 17, 2018
    Publication date: November 14, 2019
    Inventors: Yibing Michelle WANG, Seunghoon HAN, Lilong SHI, Byunghoon NA, Ilia OVSIANNIKOV
  • Patent number: 10460231
    Abstract: An image signal processing (ISP) system is provided. The system includes a neural network trained by inputting a set of raw data images and a correlating set of desired quality output images; the neural network including an input for receiving input image data and providing processed output; wherein the processed output includes input image data that has been adjusted for at least one image quality attribute. A method and an imaging device are disclosed.
    Type: Grant
    Filed: March 18, 2016
    Date of Patent: October 29, 2019
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Qiang Zhang, Zhengping Ji, Yibing Michelle Wang, Ilia Ovsiannikov
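    The training setup in the abstract (raw images in, desired-quality images out) can be sketched with a small convolutional network. The architecture, optimizer, and mean-squared-error loss below are assumptions for illustration, not details from the filing.

    ```python
    import torch
    from torch import nn

    # A small raw-to-processed network standing in for the ISP neural network.
    ispnet = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, kernel_size=3, padding=1),
    )
    optimizer = torch.optim.Adam(ispnet.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    def training_step(raw_batch: torch.Tensor, target_batch: torch.Tensor) -> float:
        """One step of fitting raw images to their desired-quality counterparts."""
        optimizer.zero_grad()
        loss = loss_fn(ispnet(raw_batch), target_batch)
        loss.backward()
        optimizer.step()
        return loss.item()
    ```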
  • Patent number: 10447958
    Abstract: Using the same image sensor to capture a two-dimensional (2D) image and three-dimensional (3D) depth measurements for a 3D object. A laser point-scans the surface of the object with light spots, which are detected by a pixel array in the image sensor to generate the 3D depth profile of the object using triangulation. Each row of pixels in the pixel array forms an epipolar line of the corresponding laser scan line. Timestamping provides a correspondence between the pixel location of a captured light spot and the respective scan angle of the laser to remove any ambiguity in triangulation. An Analog-to-Digital Converter (ADC) in the image sensor operates as a Time-to-Digital Converter (TDC) to generate timestamps. A timestamp calibration circuit is provided on-board to record the propagation delay of each column of pixels in the pixel array and to provide necessary corrections to the timestamp values generated during 3D depth measurements.
    Type: Grant
    Filed: October 1, 2018
    Date of Patent: October 15, 2019
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Yibing Michelle Wang, Ilia Ovsiannikov
  • Patent number: 10438112
    Abstract: A method for configuring a neural network is provided. The method includes: selecting a neural network including a plurality of layers, each of the layers including a plurality of neurons for processing an input and providing an output; and incorporating at least one switch configured to randomly select and disable at least a portion of the neurons in each layer. Another method and a computer program product are disclosed.
    Type: Grant
    Filed: August 26, 2015
    Date of Patent: October 8, 2019
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Qiang Zhang, Zhengping Ji, Lilong Shi, Ilia Ovsiannikov
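    The randomly selecting and disabling "switch" can be sketched for one fully connected layer; the effect is much like the familiar dropout technique. The ReLU activation and the 0.5 disable fraction below are illustrative assumptions.

    ```python
    import numpy as np

    def layer_with_switch(x, weights, disable_fraction=0.5, rng=None):
        """One layer whose switch randomly selects a subset of neurons and
        disables them (forces their outputs to zero) for this forward pass."""
        rng = rng or np.random.default_rng()
        pre_activation = x @ weights
        enabled = rng.random(pre_activation.shape[-1]) >= disable_fraction
        return np.maximum(pre_activation, 0.0) * enabled

    # Example: a 4-feature input through a layer of 8 neurons, roughly half disabled.
    out = layer_with_switch(np.random.randn(4), np.random.randn(4, 8))
    ```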
  • Patent number: 10417525
    Abstract: A client device configured with a neural network includes a processor, a memory, a user interface, a communications interface, a power supply and an input device, wherein the memory includes a trained neural network received from a server system that has trained and configured the neural network for the client device. A server system and a method of training a neural network are disclosed.
    Type: Grant
    Filed: March 19, 2015
    Date of Patent: September 17, 2019
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Zhengping Ji, Ilia Ovsiannikov, Yibing Michelle Wang, Lilong Shi
  • Publication number: 20190281276
    Abstract: A Time-of-Flight (TOF) technique is combined with analog amplitude modulation within each pixel in a pixel array using multiple Single Photon Avalanche Diodes (SPADs) in conjunction with a single Pinned Photo Diode (PPD) in each pixel. A SPAD may be shared among multiple neighboring pixels. The TOF information is added to the received light signal by the analog domain-based single-ended to differential converter inside the pixel itself. The spatial-temporal correlation among outputs of multiple, adjacent SPADs in a pixel is used to control the operation of the PPD to facilitate recording of TOF values and range of an object. Erroneous range measurements due to ambient light are prevented by stopping the charge transfer from the PPD—and, hence, recording a TOF value—only when two or more SPADs in the pixel are triggered within a pre-defined time interval. An autonomous navigation system with multi-SPAD pixels provides improved vision for drivers under difficult driving conditions.
    Type: Application
    Filed: May 24, 2019
    Publication date: September 12, 2019
    Inventors: Yibing Michelle WANG, Lilong SHI, Ilia OVSIANNIKOV
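    The ambient-light rejection rule in the abstract (recording a TOF value only when two or more SPADs in the pixel fire within a pre-defined interval) is a coincidence test that can be sketched directly. The 5 ns window below is an illustrative assumption.

    ```python
    def coincidence_trigger(spad_event_times_s, window_s=5e-9, min_events=2):
        """Return the time at which PPD charge transfer (TOF recording) would be
        triggered: the earliest instant at which at least min_events SPAD
        detections fall within the pre-defined window. Returns None otherwise."""
        times = sorted(spad_event_times_s)
        for i in range(len(times) - min_events + 1):
            if times[i + min_events - 1] - times[i] <= window_s:
                return times[i]        # correlated detections: accept as signal
        return None                    # likely uncorrelated ambient photons: reject

    # Example: two SPADs firing 2 ns apart trigger; widely separated events do not.
    assert coincidence_trigger([10e-9, 12e-9]) == 10e-9
    assert coincidence_trigger([10e-9, 40e-9]) is None
    ```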