VISION SENSOR, IMAGE PROCESSING DEVICE INCLUDING THE SAME, AND OPERATING METHOD OF THE VISION SENSOR
Provided is a vision sensor including a pixel array including a plurality of pixels disposed in a matrix form, an event detection circuit configured to detect whether an event has occurred in the plurality of pixels and generate event signals corresponding to pixels from among the plurality of pixels in which an event has occurred, a map data processor configured to generate a timestamp map based on the event signals, and an interface circuit configured to transmit vision sensor data including at least one of the event signals and the timestamp map to an external processor, wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal corresponding to the pixel.
This application is based on and claims priority to Korean Patent Application No. 10-2020-0165943, filed on Dec. 1, 2020, in the Korean Intellectual Property Office, the disclosure of which is incorporated by reference herein in its entirety.
BACKGROUND

1. Field

Example embodiments of the present disclosure relate to a vision sensor, and more particularly, to a vision sensor capable of generating and transmitting a timestamp map, an image processing device including the vision sensor, and an operating method of the vision sensor.
2. Background of Related Art

A vision sensor, for example, a dynamic vision sensor, generates, upon occurrence of an event, for example, a variation in an intensity of light, information about the event, that is, an event signal, and transmits the event signal to a processor.
A dynamic vision sensor according to related art transmits only event data and a timestamp, and data processing such as writing a timestamp map by using the data is mainly performed using a processor located outside the dynamic vision sensor. An operation of writing a timestamp map by using raw data consumes resources of a host, and thus, a method of reducing the burden of the host needs to be researched.
SUMMARY

One or more example embodiments provide a vision sensor generating a timestamp map and an optical flow map based on event data and a timestamp, an image processing device including the vision sensor, and an operating method of the vision sensor.
According to an aspect of an example embodiment, there is provided a vision sensor including a pixel array including a plurality of pixels disposed in a matrix form, an event detection circuit configured to detect whether an event has occurred in the plurality of pixels and generate event signals corresponding to pixels from among the plurality of pixels in which an event has occurred, a map data processor configured to generate a timestamp map based on the event signals, and an interface circuit configured to transmit vision sensor data including at least one of the event signals and the timestamp map to an external processor, wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal corresponding to the pixel.
According to another aspect of an example embodiment, there is provided an image processing device including a vision sensor configured to generate a plurality of event signals corresponding to a plurality of pixels in which an event has occurred based on a movement of an object, from among the plurality of pixels included in a pixel array, generate a timestamp map based on polarity information, address information, and event occurrence time information of a pixel included in the plurality of event signals, and output vision sensor data including the event signals and the timestamp map, and a processor configured to detect the movement of the object by processing the vision sensor data output from the vision sensor.
According to an aspect of an example embodiment, there is provided an operating method of a vision sensor, the operating method including detecting whether an event has occurred in a plurality of pixels and generating event signals corresponding to pixels from among the plurality of pixels in which an event has occurred, generating vision sensor data including a timestamp map based on the event signals, and transmitting the vision sensor data to an external processor, wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal.
The above and/or other aspects, features, and advantages of example embodiments will be more clearly understood from the following detailed description, taken in conjunction with the accompanying drawings, in which:
Hereinafter, various example embodiments will be described with reference to the attached drawings.
An image processing device 10 according to an example embodiment may be mounted in an electronic device having an image or light sensing function. For example, the image processing device 10 may be mounted in an electronic device such as a camera, a smartphone, a wearable device, Internet of Things (IoT), a tablet personal computer (PC), a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a drone, an advanced driver-assistance system (ADAS), or the like. Also, the image processing device 10 may be included as a component of a vehicle, furniture, manufacturing equipment, a door, various measuring instruments, or the like.
Referring to
The vision sensor 100 may detect a variation in an intensity of incident light and output an event signal. The vision sensor 100 may be a dynamic vision sensor that outputs event signals EVS with respect to pixels in which a variation in light intensity is detected, that is, pixels in which an event has occurred. The variation in the intensity of light may be due to a movement of an object of which an image is captured using the vision sensor 100, or due to a movement of the vision sensor 100 or the image processing device 10 itself. The vision sensor 100 may transmit event signals EVS to the processor 200 periodically or non-periodically. The vision sensor 100 may transmit, to the processor 200, not only the event signals EVS including an address, a polarity, and a timestamp, but also a timestamp map TSM or an optical flow map OFM generated based on the event signals EVS.
The vision sensor 100 may selectively transmit the event signals EVS to the processor 200. For example, from among the event signals generated over the entire pixel array 110, the vision sensor 100 may transmit, to the processor 200, only those event signals EVS generated from pixels PX corresponding to a region of interest (ROI) set in the pixel array 110.
According to an example embodiment, the vision sensor 100 may apply crop or event binning to the event signals EVS. Also, the vision sensor 100 may generate a timestamp map TSM or an optical flow map OFM based on an event signal EVS. According to an example embodiment, the vision sensor 100 may selectively transmit event signals EVS, a timestamp map TSM, or an optical flow map OFM to the outside according to a transmission mode.
The processor 200 may process the event signals EVS received from the vision sensor 100 to detect a movement of an object, or a movement of an object in an image recognized by the image processing device 10, and may use an algorithm such as simultaneous localization and mapping (SLAM). The processor 200 may include an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a dedicated microprocessor, a microprocessor, a general-purpose processor, or the like. According to an example embodiment, the processor 200 may include an application processor or an image processor.
In addition, the vision sensor 100 and the processor 200 may each be implemented as an integrated circuit (IC). For example, the vision sensor 100 and the processor 200 may be implemented as separate semiconductor chips, or may be implemented as a single chip. For example, the vision sensor 100 and the processor 200 may be implemented as a system on chip (SoC).
Referring to
The pixel array 110 may include a plurality of pixels PX arranged in a matrix form. Each of the pixels PX may detect events such as an increase or a decrease in an intensity of received light. For example, each of the pixels PX may be connected to the event detection circuit 120 via column lines extending in a column direction and row lines extending in a row direction. A signal indicating that an event has occurred and polarity information of an event, for example, whether the event is an on-event where a light intensity increases or an off-event where a light intensity decreases, may be output from a pixel PX in which the event has occurred, to the event detection circuit 120.
The event detection circuit 120 may read events from the pixel array 110 and process the events. The event detection circuit 120 may generate an event signal EVS including polarity information of the occurred event, an address of a pixel in which the event has occurred, and a timestamp. The event signal EVS may be generated in various formats, for example, in an address event representation (AER) format including address information, timestamp information, and polarity information of a pixel in which an event has occurred or a raw format including event occurrence information of all pixels.
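As an illustration of the AER format described above, the sketch below packs one event into a fixed-size record. The field widths (32-bit timestamp, 8-bit column and row addresses, one byte holding the 2-bit polarity) follow the examples given later in this disclosure; the function names and byte layout are hypothetical assumptions, not the sensor's actual wire format.

```python
import struct

def pack_event(timestamp, col, row, on_event, off_event):
    """Pack one AER-style event: 32-bit timestamp, 8-bit column
    address, 8-bit row address, and one byte whose two low bits
    carry the on/off polarity."""
    polarity = (int(on_event) << 1) | int(off_event)
    return struct.pack("<IBBB", timestamp, col, row, polarity)

def unpack_event(data):
    """Recover (timestamp, col, row, on_event, off_event)."""
    timestamp, col, row, polarity = struct.unpack("<IBBB", data)
    return timestamp, col, row, bool(polarity & 0b10), bool(polarity & 0b01)
```

A record in this hypothetical layout occupies seven bytes per event.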
The event detection circuit 120 may process events that occurred in the pixel array 110 in units of pixels, in units of pixel groups each including a plurality of pixels, in units of columns, or in units of frames.
The map data module 130 may adjust the amount of the event signals EVS through post-processing such as cropping, which sets an ROI in the pixel array with respect to the event signals EVS, or event binning, which sets a data amount, and may thereby adjust the size of the timestamp map TSM and the optical flow map OFM that are generated. According to an example embodiment, the map data module 130 may generate map data MDT including a timestamp map or an optical flow map and output the map data MDT to the interface circuit 140, or may output the event signals EVS received from the event detection circuit 120 to the interface circuit 140.
The interface circuit 140 may receive the event signals EVS and pieces of map data MDT, and transmit vision sensor data VSD to the processor 200.
Hereinafter, outputting the event signals EVS or the pieces of map data MDT refers to the event signals EVS or the pieces of map data MDT being converted into vision sensor data VSD by the interface circuit 140, and the vision sensor data VSD being transmitted to the processor 200.
Referring to
The voltage generator 121 may generate a voltage provided to the pixel array 110. For example, the voltage generator 121 may generate threshold voltages used to detect an on-event or an off-event from the pixel PX.
The DTAG 122 may receive, from the pixel PX in which an event has occurred, a signal indicating that the event has occurred, and generate a timestamp TS including information about a time when the event of the pixel PX has occurred, and an address ADDR including a column address, a row address, or a group address.
For example, the column AER generator may receive, from the pixel PX in which an event has occurred, a signal indicating that the event has occurred, for example, a column request, and generate a column address C_ADDR of the pixel PX in which the event has occurred.
The row AER generator may receive, from the pixel PX in which an event has occurred, a signal indicating that the event has occurred, for example, a row request, and generate a row address R_ADDR of the pixel PX in which the event has occurred. Instead of generating a row address, the row AER generator may also generate a group address G_ADDR in units of preset groups.
According to an example embodiment, the pixel array 110 may be scanned in units of columns, and when a request is received from a certain column, for example, a first column, the column AER generator may transmit a response signal to the first column. The pixel PX in which the event has occurred and which has received the response signal may transmit polarity information Pol, for example, a signal indicating the occurrence of an on-event or an off-event, to the row AER generator. Upon receiving the polarity information Pol, the row AER generator may transmit a reset signal to the pixel PX in which the event has occurred. In response to the reset signal, the pixel PX in which the event has occurred may be reset. The row AER generator may control a period at which a reset signal is generated. The row AER generator may generate information about a time when an event has occurred, that is, a timestamp TS.
Operations of the row AER generator and the column AER generator are described above by assuming that the pixel array 110 is scanned in units of columns. However, the operations of the row AER generator and the column AER generator are not limited thereto, and the row AER generator and the column AER generator may read, from the pixel PX in which an event has occurred, whether the event has occurred and the polarity information Pol, in various manners. For example, the pixel array 110 may be scanned in units of rows, and the operations of the row AER generator and the column AER generator may be exchanged with each other, that is, the column AER generator may receive the polarity information Pol and transmit a reset signal to the pixel array 110. Also, the row AER generator and the column AER generator may individually access the pixel PX in which the event has occurred.
The ESP unit 123 may generate an event signal EVS based on a column address C_ADDR, a row address R_ADDR, a group address G_ADDR, polarity information Pol, and a timestamp TS received from the DTAG 122. According to an example embodiment, the ESP unit 123 may remove noise events and generate event signals EVS only with respect to valid events. For example, when the number of events that occurred during a certain period of time is less than a threshold, the ESP unit 123 may determine those events to be noise and may not generate an event signal EVS with respect to them.
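The noise-removal rule described above can be sketched as follows, grouping events into fixed time windows and dropping sparsely populated windows. The tuple layout, window length, and function name are illustrative assumptions.

```python
def filter_noise(events, window, min_count):
    """Drop events from time windows holding fewer than min_count events.

    events: list of (timestamp, col, row, polarity) tuples sorted by
    timestamp. Events are grouped into fixed windows of length `window`;
    a window with too few events is treated as noise and discarded.
    """
    buckets = {}
    for ev in events:
        buckets.setdefault(ev[0] // window, []).append(ev)
    valid = []
    for key in sorted(buckets):
        if len(buckets[key]) >= min_count:
            valid.extend(buckets[key])
    return valid
```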
Referring to
The crop unit 131 may set at least one ROI on a pixel array including a plurality of pixels, according to a user's settings or a preset algorithm. For example, the crop unit 131 may set, as an ROI, a region in which many events occur from among a plurality of regions set with respect to the pixel array 110, or set, as an ROI, an arbitrary region corresponding to pixels in which many events occur during a certain period of time. According to another example embodiment, a region set arbitrarily by a user, or a region corresponding to pixels PX from which a certain object is sensed, may be set as an ROI. However, the ROI is not limited thereto and may be set in various manners.
The binning unit 132 may adjust a data amount used in generating a map according to a user's settings or a preset algorithm. For example, the binning unit 132 may perform a binning operation on an ROI by grouping pixels included in a set ROI, in certain units.
The timestamp map generator 133 may generate a timestamp map TSM based on an event signal EVS adjusted using the crop unit 131 and the binning unit 132. For example, the timestamp map generator 133 may generate a timestamp map TSM only with respect to an ROI that is designated by cropping. Also, the timestamp map generator 133 may generate a timestamp map TSM having a size adjusted by binning.
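A timestamp map of this kind can be sketched as a per-pixel store of the most recent event time and polarity. The tuple layout and the use of 0 for "no event" are assumptions made for illustration.

```python
def build_timestamp_map(events, width, height):
    """Record, for each pixel, the timestamp and polarity of its most
    recent event; 0 means 'no event seen'. Later events at the same
    pixel overwrite earlier ones."""
    ts_map = [[0] * width for _ in range(height)]
    pol_map = [[0] * width for _ in range(height)]
    for timestamp, col, row, polarity in events:
        ts_map[row][col] = timestamp
        pol_map[row][col] = polarity
    return ts_map, pol_map
```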
The map data buffer 135 may store the timestamp map TSM generated using the timestamp map generator 133, and transmit the stored timestamp map TSM to the interface circuit 140 in the vision sensor 100 as map data MDT.
Referring to
An optical flow may be a pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer and the scene, and may be a distribution of apparent velocities of movement of a brightness pattern in an image. According to example embodiments, an optical flow may be information about the direction and the velocity with which a pixel in which an event signal is generated varies.
Referring to
The map data module 130 may receive the event signals EVS from the event detection circuit 120 (S110).
The map data module 130 may perform crop of selecting event signals generated in an ROI of a pixel array, from among event signals, according to a user's settings or a preset algorithm (S120).
The map data module 130 may perform event binning (S130). When the number of events that occurred in a preset N×N pixel region is greater than a preset threshold value, the map data module 130 may determine that an event has occurred and store an event signal EVS in the entry of the timestamp map array corresponding to the N×N pixel region.
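The binning rule in step S130 can be sketched as follows, assuming a per-pixel event-count grid whose dimensions are multiples of N; the function name and grid representation are illustrative.

```python
def bin_events(event_counts, n, threshold):
    """Collapse each N x N block of a per-pixel event-count grid to a
    single binary cell that fires only when the block's total event
    count exceeds the threshold."""
    rows, cols = len(event_counts), len(event_counts[0])
    binned = []
    for r in range(0, rows, n):
        out_row = []
        for c in range(0, cols, n):
            total = sum(event_counts[r + i][c + j]
                        for i in range(n) for j in range(n))
            out_row.append(1 if total > threshold else 0)
        binned.append(out_row)
    return binned
```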
The map data module 130 may generate a timestamp map TSM (S140). The map data module 130 may generate a plurality of timestamp maps TSM having different reference frames from one another, based on event signal frames that are periodically generated.
For example, the map data module 130 may periodically generate a first timestamp reference and a first timestamp map based on a preset number of event signal frames counted from the first timestamp reference, and periodically generate a second timestamp reference and a second timestamp map based on a preset number of event signal frames counted from the second timestamp reference. For example, the first timestamp map may be generated by setting a first timestamp reference frame as a first event signal frame and accumulating up to an eleventh event signal frame, and the second timestamp map may be generated by setting a second timestamp reference frame as a fifth event signal frame and accumulating from the fifth event signal frame up to a fifteenth event signal frame. Three or more timestamp maps TSM may also be generated according to the occurrence of an event signal EVS or user's settings.
The map data module 130 may store an offset value of a timestamp based on a preset timestamp reference of a timestamp map TSM, and set the timestamp reference of the timestamp map TSM based on event occurrence information. When transmitting the timestamp map TSM to another device later, the map data module 130 may transmit raw data by adding a timestamp reference value corresponding to the timestamp map TSM. For example, when an event occurrence time is r+k, the timestamp reference may be set as r, and the offset value of the timestamp may be stored as k. When transmitting the timestamp map to another device after the timestamp map is generated, the value r+k, obtained by adding the timestamp reference r to the offset value k, may be transmitted.
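The reference-plus-offset scheme above keeps per-pixel values small by storing only k against a map-wide reference r. A minimal sketch, with hypothetical function names:

```python
def store_offset(event_time, reference):
    """Split an absolute event time r+k into the per-pixel offset
    k = event_time - r, given the map-wide timestamp reference r."""
    return event_time - reference

def restore_time(offset, reference):
    """Recover the absolute time r + k when transmitting the map."""
    return reference + offset
```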
The map data module 130 may generate an optical flow map OFM by estimating, from the timestamp map TSM according to a preset algorithm, an optical flow including information about the direction and the velocity with which a pixel in which an event signal is generated varies. The map data module 130 may store the optical flow map OFM in the location of the map data buffer 135 in which the timestamp map TSM is stored, in place of the timestamp map TSM. Accordingly, the size of the map data buffer may be controlled more effectively.
The interface circuit 140 may transmit vision sensor data VSD including the event signal EVS, the timestamp map TSM, or the optical flow map OFM, to an external processor (S150). The interface circuit 140 may selectively transmit each piece of data according to an output mode.
For example, in a first output mode, the interface circuit 140 may output only an event signal included in the vision sensor data VSD to an external processor. In a second output mode, the interface circuit 140 may generate at least one virtual channel and transmit an event signal to an external processor via a first virtual channel, and transmit timestamp map data to an external processor via a second virtual channel. In a third output mode, the interface circuit 140 may generate at least one virtual channel, and transmit an event signal to an external processor via the first virtual channel, and transmit optical flow map data to the external processor via the second virtual channel.
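The three output modes can be sketched as a mapping from virtual channels to payloads. The mode numbering and channel assignment follow the description above; the function name and dictionary representation are assumptions.

```python
def select_payloads(mode, event_signal, ts_map, of_map):
    """Return {virtual_channel: payload} per output mode: mode 1 sends
    only event signals; mode 2 adds the timestamp map on a second
    virtual channel; mode 3 sends the optical flow map instead."""
    if mode == 1:
        return {0: event_signal}
    if mode == 2:
        return {0: event_signal, 1: ts_map}
    if mode == 3:
        return {0: event_signal, 1: of_map}
    raise ValueError("unknown output mode")
```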
Referring to
The photoelectric conversion device PD may convert incident light, that is, an optical signal, into an electrical signal, for example, an electric current. The photoelectric conversion device PD may include, for example, a photodiode, a phototransistor, a photogate, a pinned photodiode, or the like. The photoelectric conversion device PD may generate an electrical signal having a higher level as an intensity of incident light increases.
The amplifier 111 may convert a received current into a voltage and amplify a voltage level. An output voltage of the amplifier 111 may be provided to the first comparator 112 and the second comparator 113.
The first comparator 112 may compare an output voltage Vout of the amplifier 111 with an on-threshold voltage TH1, and generate an on signal E_ON based on a comparison result. The second comparator 113 may compare an output voltage Vout of the amplifier 111 with an off-threshold voltage TH2, and generate an off signal E_OFF based on a comparison result. The first comparator 112 and the second comparator 113 may generate an on signal E_ON or an off signal E_OFF when a variation amount of light received by the photoelectric conversion device PD is equal to or greater than a certain level of variation.
For example, the on signal E_ON may be at a high level when the amount of light received by the photoelectric conversion device PD increases to a certain level or more, and the off signal E_OFF may be at a high level when the amount of light received by the photoelectric conversion device PD decreases to a certain level or less. The on-event holder 114 and the off-event holder 115 may respectively hold the on signal E_ON and the off signal E_OFF and then output them. When the pixel PX is scanned, the on signal E_ON and the off signal E_OFF may be output. As described above, when the light sensitivity is adjusted, the levels of the on-threshold voltage TH1 and the off-threshold voltage TH2 may be modified. For example, to reduce the light sensitivity, the level of the on-threshold voltage TH1 may be increased and the level of the off-threshold voltage TH2 may be reduced. Accordingly, the first comparator 112 and the second comparator 113 may generate the on signal E_ON or the off signal E_OFF only when the variation in light received by the photoelectric conversion device PD is greater than it was before the levels of the on-threshold voltage TH1 and the off-threshold voltage TH2 were modified.
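The comparator behavior above can be sketched as follows. Comparing a single voltage against absolute thresholds is a simplification for illustration; in the pixel described above, the comparators act on the amplified voltage that tracks the variation in light intensity, and the function name is hypothetical.

```python
def detect_event(v_out, th_on, th_off):
    """Classify the amplified pixel voltage against the on/off
    thresholds; returns 'on', 'off', or None. Raising th_on and
    lowering th_off widens the dead band, i.e. reduces sensitivity."""
    if v_out >= th_on:
        return "on"
    if v_out <= th_off:
        return "off"
    return None
```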
Referring to
A timestamp may include information about a time when an event has occurred. For example, a timestamp may include 32 bits, but is not limited thereto.
A column address and a row address may each include a plurality of bits, for example, 8 bits. In this case, a vision sensor including a plurality of pixels arranged in up to 256 rows and up to 256 columns may be supported. However, this is an example, and the number of bits of a column address and a row address may vary according to the number of pixels.
Polarity information may include information about an on-event and an off-event. For example, the polarity information may include 1 bit indicating whether an on-event has occurred and 1 bit indicating whether an off-event has occurred. The bit indicating an on-event and the bit indicating an off-event may not both be '1', but may both be '0', which indicates that no event has occurred.
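The 2-bit polarity field can be sketched as follows; the bit ordering (on-event in the upper bit) and the function name are assumptions for illustration.

```python
def encode_polarity(on_event, off_event):
    """Encode the 2-bit polarity field: bit 1 = on-event, bit 0 =
    off-event. Both bits set is invalid; both clear means no event."""
    if on_event and off_event:
        raise ValueError("on-event and off-event are mutually exclusive")
    return (int(on_event) << 1) | int(off_event)
```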
The map data MDT includes more information in each frame than the event signal EVS, and thus, a size and generation period of the map data MDT may be adjusted according to a user's settings. For example, the map data MDT may be generated based on 16 bits/pixel and 50 fps. A size (c×d) of the map data MDT may be adjusted by cropping and event binning setting.
Referring to
Binning refers to a data preprocessing technique of dividing raw data into smaller sections (bins) and replacing the values in each section with a representative value such as a median value. In the present disclosure, according to event binning, an event is determined to have occurred when the number of event signals generated in a pixel-array region of a preset N×N size is greater than a threshold. Referring to
Referring to
Referring to
When the vision sensor 100 generates a timestamp map TSM, the number of event signal frames or a time period may be set. Referring to
The vision sensor 100 may generate a plurality of timestamp maps TSM, for example, first through third timestamp maps. In this case, periods of frames for generating the first through third timestamp maps may overlap each other. The plurality of timestamp maps TSM may be generated based on identical event signal frames, but may differ in a timestamp reference, which is a reference time point of a timestamp map generation period.
For example, when generating three timestamp maps TSM, a first timestamp map may be generated by generating timestamp map data 1 TSM1 by accumulating from an nth event signal frame to a (n+11)th event signal frame included in a first map accumulate time window (Map 1 accumulate time window), a total of twelve event signal frames, with respect to a first timestamp reference frame TSR1. A second timestamp map may be generated by generating timestamp map data 2 TSM2 by accumulating a total of twelve event signal frames, from an (n+4)th event signal frame included in a second map accumulate time window (Map 2 accumulate time window), with respect to a second timestamp reference frame TSR2. A third timestamp map may be generated by generating timestamp map data 3 TSM3 by accumulating a total of twelve event signal frames, from an (n+8)th event signal frame included in a third map accumulate time window (Map 3 accumulate time window), with respect to a third timestamp reference frame TSR3. In the timestamp maps, a timestamp reference used as a reference differs, but event signal frames used in generating the timestamp maps may overlap each other.
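The overlapping accumulate windows in this example (twelve frames per map, with each timestamp reference offset by four frames from the previous one) can be computed as follows; the function name and parameterization are illustrative.

```python
def accumulate_windows(n, num_maps, stride=4, length=12):
    """Return (first_frame, last_frame) index pairs for overlapping
    timestamp-map accumulate windows: map i spans `length` frames
    starting `stride` frames after map i-1's reference frame."""
    return [(n + i * stride, n + i * stride + length - 1)
            for i in range(num_maps)]
```

With the defaults, consecutive windows share eight of their twelve frames, matching the overlap described above.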
Referring to
Referring to
When a map generation period is set, a timestamp map or an optical flow map may be generated for each preset number of event signal frames. For example, referring to
For example, to generate an optical flow map, a 5×5 digital filter (z) having appropriate filter coefficients may be applied to a timestamp map (x). Referring to
When the timestamp map TSM is completed, an optical flow with respect to the timestamp map TSM may be calculated. Referring to
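One way to illustrate computing flow from a completed timestamp map: the spatial gradient of event time (seconds per pixel) inverts to a velocity (pixels per second) along each axis. The sketch below uses simple central differences rather than the 5×5 filter mentioned above, whose coefficients are not specified in this disclosure; names and border handling are assumptions.

```python
def estimate_flow(ts_map):
    """Per-pixel optical flow from a timestamp map via central
    differences: dt/dx and dt/dy are inverted to velocities.
    Border pixels and flat regions yield zero flow. A practical
    filter would span a larger window with tuned coefficients."""
    h, w = len(ts_map), len(ts_map[0])
    flow = [[(0.0, 0.0)] * w for _ in range(h)]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            dt_dx = (ts_map[r][c + 1] - ts_map[r][c - 1]) / 2.0
            dt_dy = (ts_map[r + 1][c] - ts_map[r - 1][c]) / 2.0
            vx = 1.0 / dt_dx if dt_dx else 0.0
            vy = 1.0 / dt_dy if dt_dy else 0.0
            flow[r][c] = (vx, vy)
    return flow
```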
Referring to
The vision sensor 100 described with reference to
The image sensor 1200 may generate image data, such as raw image data, based on a received optical signal and provide the image data to the main processor 1300.
The main processor 1300 may control the overall operation of the electronic device 1000, and may detect a movement of an object by processing event data, that is, event signals received from the vision sensor 1100. Also, the main processor 1300 may receive an image frame from the image sensor 1200 and perform image processing based on preset information. Similar to the processor 200 shown in
The working memory 1400 may store data used for an operation of the electronic device 1000. For example, the working memory 1400 may temporarily store packets or frames processed by the main processor 1300. For example, the working memory 1400 may include a volatile memory such as dynamic random access memory (DRAM) or static random access memory (SRAM), and/or a non-volatile memory such as phase-change RAM (PRAM), magnetoresistive RAM (MRAM), resistive RAM (ReRAM), or ferroelectric RAM (FRAM).
The storage 1500 may store data whose storage is requested by the main processor 1300 or other components. The storage 1500 may include a non-volatile memory such as flash memory, PRAM, MRAM, ReRAM, or FRAM.
The display device 1600 may include a display panel, a display driving circuit, and a display serial interface (DSI). For example, the display panel may be implemented as various devices such as a liquid crystal display (LCD) device, a light-emitting diode (LED) display, an organic LED (OLED) display, and an active matrix OLED (AMOLED) display. The display driving circuit may include a timing controller, a source driver, and the like, needed to drive a display panel. A DSI host embedded in the main processor 1300 may perform serial communication with the display panel through the DSI.
The user interface 1700 may include at least one of input interfaces such as a keyboard, a mouse, a keypad, a button, a touch panel, a touch screen, a touch pad, a touch ball, a gyroscope sensor, a vibration sensor, and an acceleration sensor.
The communicator 1800 may exchange signals with an external device/system through an antenna 1830. A transceiver 1810 and a modulator/demodulator (MODEM) 1820 of the communicator 1800 may process signals exchanged with external devices/systems according to wireless communication protocols such as Long Term Evolution (LTE), Worldwide Interoperability for Microwave Access (WIMAX), Global System for Mobile communication (GSM), Code Division Multiple Access (CDMA), Bluetooth, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), or Radio Frequency Identification (RFID).
Components of the electronic device 1000, for example, the vision sensor 1100, the image sensor 1200, the main processor 1300, the working memory 1400, the storage 1500, the display device 1600, the user interface 1700, and the communicator 1800, may exchange data according to at least one of various interface protocols such as Universal Serial Bus (USB), Small Computer System Interface (SCSI), MIPI, Inter-Integrated Circuit (I2C), Peripheral Component Interconnect Express (PCIe), Mobile PCIe (M-PCIe), Advanced Technology Attachment (ATA), Parallel ATA (PATA), Serial ATA (SATA), Serial Attached SCSI (SAS), Integrated Drive Electronics (IDE), Enhanced IDE (EIDE), Nonvolatile Memory Express (NVMe), or Universal Flash Storage (UFS).
At least one of the components, elements, modules or units (collectively “units” in this paragraph) represented by a block in the drawings including
While example embodiments have been illustrated and described above, it will be apparent to those skilled in the art that modifications and variations could be made without departing from the scope as defined by the appended claims and their equivalents.
Claims
1. A vision sensor comprising:
- a pixel array comprising a plurality of pixels disposed in a matrix form;
- an event detection circuit configured to detect whether an event has occurred in the plurality of pixels and generate event signals corresponding to pixels from among the plurality of pixels in which an event has occurred;
- a map data processor configured to generate a timestamp map based on the event signals; and
- an interface circuit configured to transmit vision sensor data including at least one of the event signals and the timestamp map to an external processor,
- wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal corresponding to the pixel.
2. The vision sensor of claim 1, wherein the event signals are generated in an address-event representation (AER) format including address information, timestamp information, and polarity information of the pixels in which an event has occurred or a raw format including event occurrence information of all of the plurality of pixels.
3. The vision sensor of claim 1, wherein the map data processor is further configured to select event signals generated in a region of interest (ROI) of the pixel array, from among the event signals.
4. The vision sensor of claim 1, wherein, based on a number of events that occurred in a preset N*N-size pixel group being greater than a preset threshold value, the map data processor is further configured to determine that an event has occurred in the N*N-size pixel group, and store an event signal corresponding to the N*N-size pixel group.
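As an illustrative sketch of the grouping in claim 4 (not part of the claims): per-pixel event counts are summed over N*N blocks, and a block is treated as having an event only when its count exceeds the threshold. N and the threshold are configuration assumptions here.

```python
import numpy as np

def binned_events(event_count, n, threshold):
    """event_count: 2-D per-pixel event counts.

    Returns a boolean map with one entry per N*N block, True where the
    block's event count exceeds `threshold`.
    """
    h, w = event_count.shape
    # Trim to a multiple of n, then fold each axis into (blocks, n).
    blocks = event_count[:h - h % n, :w - w % n].reshape(h // n, n, w // n, n)
    return blocks.sum(axis=(1, 3)) > threshold
```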
5. The vision sensor of claim 1, wherein the interface circuit is further configured to:
- in a first output mode, output an event signal included in the vision sensor data to the external processor;
- in a second output mode, generate at least one virtual channel, transmit the event signal to the external processor via a first virtual channel, and transmit the timestamp map data to the external processor via a second virtual channel; and
- in a third output mode, generate at least one virtual channel, transmit the event signal to the external processor via the first virtual channel, and transmit optical flow map data to the external processor via the second virtual channel.
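The three output modes of claim 5 can be sketched, for illustration only, as a dispatch over virtual channels. Virtual channels are modeled here as dictionary keys; the underlying interface-level mechanics (e.g. channel identifiers in packet headers) are omitted, and the function name is hypothetical.

```python
def build_output(mode, events, ts_map=None, flow_map=None):
    """Assemble vision sensor data per output mode.

    mode 1: event signal only; mode 2: events + timestamp map over a
    second virtual channel; mode 3: events + optical flow map.
    """
    if mode == 1:
        return {"vc0": events}
    if mode == 2:
        return {"vc0": events, "vc1": ts_map}
    if mode == 3:
        return {"vc0": events, "vc1": flow_map}
    raise ValueError("unknown output mode")
```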
6. The vision sensor of claim 1, wherein the map data processor is further configured to receive event signals periodically generated in each of frames, and generate a plurality of timestamp maps based on different frame periods.
7. The vision sensor of claim 6, wherein the map data processor is further configured to periodically generate a first timestamp reference frame and a first timestamp map based on a preset number of event signal frames from the first timestamp reference frame, and periodically generate a second timestamp reference frame and a second timestamp map based on a preset number of event signal frames from the second timestamp reference frame, and
- wherein the first timestamp map and the second timestamp map have overlapping event signal frame periods for generating the first timestamp map and the second timestamp map.
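The overlapping-period arrangement of claims 6 and 7 resembles sliding windows over a stream of event signal frames: each timestamp map is built from a preset number of frames starting at its reference frame, and references are offset by less than the window length so that consecutive maps share frames. The sketch below is illustrative only; window length and stride are assumptions.

```python
def map_windows(num_frames, window, stride):
    """Return (start, end) frame indices of each timestamp-map window.

    window : preset number of event signal frames per timestamp map
    stride : spacing between successive timestamp reference frames;
             stride < window yields overlapping frame periods.
    """
    return [(s, s + window) for s in range(0, num_frames - window + 1, stride)]
```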
8. The vision sensor of claim 1, wherein the timestamp map is configured to store an offset value of a timestamp based on a preset timestamp reference.
9. The vision sensor of claim 8, wherein the timestamp map is configured to set a timestamp reference based on event occurrence information.
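For illustration only, the offset storage of claims 8 and 9 can be sketched by setting the reference from event occurrence information (here, the earliest event time in the batch, an assumption) and storing each timestamp as an offset from it, which narrows the range of stored values.

```python
def to_offsets(timestamps):
    """Store timestamps as offsets from a reference.

    The reference is taken from event occurrence information (here the
    minimum timestamp); returns (reference, offsets).
    """
    reference = min(timestamps)
    return reference, [t - reference for t in timestamps]
```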
10. The vision sensor of claim 1, wherein the map data processor is further configured to generate an optical flow map by estimating an optical flow including information about a direction in which, and a velocity at which, a pixel in which an event signal is generated varies, from the timestamp map, based on a preset algorithm.
11. The vision sensor of claim 10, wherein the map data processor is further configured to store the optical flow map in a location of a map data buffer in which the timestamp map is stored, in place of the timestamp map.
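The claims leave the "preset algorithm" unspecified; purely as a stand-in illustration, one common approach estimates flow from the spatial gradient of the timestamp map (a plane-fitting-style approximation), and the result can then overwrite the map buffer as in claim 11. Function names and the gradient-based method are assumptions, not the claimed algorithm.

```python
import numpy as np

def flow_from_timestamps(ts_map):
    """Estimate per-pixel optical flow from a timestamp map.

    The spatial gradient of event times gives time-per-pixel-step;
    inverting it (normalized by squared magnitude) yields a velocity
    direction and speed estimate per pixel.
    """
    gy, gx = np.gradient(ts_map.astype(float))  # time change per pixel step
    mag2 = gx ** 2 + gy ** 2 + 1e-9             # avoid division by zero
    return np.stack([gx / mag2, gy / mag2])     # (vx, vy) per pixel

def replace_with_flow(map_buffer):
    """Overwrite the stored timestamp map with the optical flow map."""
    map_buffer["map"] = flow_from_timestamps(map_buffer["map"])
    return map_buffer
```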
12. An image processing device comprising:
- a vision sensor configured to: generate a plurality of event signals corresponding to a plurality of pixels in which an event has occurred based on a movement of an object, from among the plurality of pixels included in a pixel array; generate a timestamp map based on polarity information, address information, and event occurrence time information of a pixel included in the plurality of event signals; and output vision sensor data including the event signals and the timestamp map; and
- a processor configured to detect the movement of the object by processing the vision sensor data output from the vision sensor.
13. The image processing device of claim 12, wherein the vision sensor is further configured to:
- generate an optical flow map by estimating an optical flow including information about a direction in which, and a velocity at which, a pixel in which an event signal is generated varies, from the timestamp map based on a preset algorithm; and
- store the generated optical flow map in a location of a map data buffer in which the timestamp map is stored, in place of the timestamp map.
14. The image processing device of claim 12, wherein the vision sensor is further configured to generate at least one virtual channel, transmit the event signals to the processor via a first virtual channel, and transmit the timestamp map data or optical flow map data to the processor via a second virtual channel.
15. The image processing device of claim 12, wherein, based on a number of events that occurred in a preset N*N-size pixel group being greater than a preset threshold value, the vision sensor determines that an event has occurred.
16. An operating method of a vision sensor, the operating method comprising:
- detecting whether an event has occurred in a plurality of pixels and generating event signals corresponding to pixels from among the plurality of pixels in which an event has occurred;
- generating vision sensor data including a timestamp map based on the event signals; and
- transmitting the vision sensor data to an external processor,
- wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal.
17. The operating method of claim 16, wherein the event signals are generated in an address-event representation (AER) format including address information, timestamp information, and polarity information of the pixels in which an event has occurred or a raw format including event occurrence information of all of the plurality of pixels.
18. The operating method of claim 16, further comprising selecting event signals generated in a region of interest (ROI) of a pixel array including the plurality of pixels, from among the event signals.
19. The operating method of claim 16, wherein, based on a number of events that occurred in a preset N*N-size pixel group being greater than a preset threshold value, the vision sensor determines that an event has occurred in the preset N*N-size pixel group.
20. The operating method of claim 16, further comprising generating an optical flow map by estimating an optical flow including information about a direction in which, and a velocity at which, a pixel in which an event signal is generated varies, from the timestamp map based on a preset algorithm.
Type: Application
Filed: Nov 1, 2021
Publication Date: Jun 2, 2022
Applicant: SAMSUNG ELECTRONICS CO., LTD. (Suwon-si)
Inventors: Jongseok SEO (Seoul), Junhyuk Park (Seoul), Hyunku Lee (Suwon-si)
Application Number: 17/515,755