Patents Assigned to Cognixion
-
Patent number: 12008162
Abstract: A method and system are disclosed using steady-state motion visual evoked potential stimuli in an augmented reality environment. Requested stimuli data are received from a user application on a smart device. Sensor data and other context data are also received, where other context data includes data that is un-sensed. The requested stimuli data are transformed into modified stimuli based on the sensor data and the other context data. Modified stimuli and environmental stimuli are presented to the user with a rendering device configured to mix the modified stimuli and the environmental stimuli, resulting in rendered stimuli. Biosignals generated in response to the rendered stimuli are received from the user by a wearable biosignal sensing device. Received biosignals are classified based on the modified stimuli, resulting in a classified selection, which is returned to the user application.
Type: Grant
Filed: April 5, 2022
Date of Patent: June 11, 2024
Assignee: COGNIXION CORPORATION
Inventors: Sarah Pearce, Aravind Ravi, Jing Lu, Ning Jiang, Andreas Forsland, Chris Ullrich
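The classification step in this abstract — matching a recorded biosignal to the visual stimulus that evoked it — can be sketched as a simple frequency-matching routine. This is an illustrative approximation only, not the patented method: the sinusoidal templates, the sampling rate, and the projection-based score are all assumptions.

```python
import numpy as np

def classify_selection(biosignal, stimulus_freqs, fs=256):
    """Pick which stimulus frequency best explains a recorded biosignal.

    Projects the signal onto sine/cosine templates at each candidate
    frequency and returns the index of the best-scoring stimulus.
    """
    t = np.arange(len(biosignal)) / fs
    scores = []
    for freq in stimulus_freqs:
        # Quadrature pair (sin + cos) so the score is phase-invariant
        template = np.stack([np.sin(2 * np.pi * freq * t),
                             np.cos(2 * np.pi * freq * t)])
        scores.append(float(np.sum((template @ biosignal) ** 2)))
    return int(np.argmax(scores))

# A signal oscillating at 12 Hz should select index 1 of [10, 12, 15]
fs = 256
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 12.0 * t)
print(classify_selection(signal, [10.0, 12.0, 15.0], fs))  # prints: 1
```

Production SSVEP/SSMVEP decoders typically use canonical correlation analysis or trained classifiers rather than raw template projection; this sketch only shows the shape of the "classify biosignal against modified stimuli" step.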
-
Patent number: 11977682
Abstract: Disclosed are devices, systems, and methods for nonverbal multi-input and feedback devices for user-intended computer control and communication of text, graphics, and audio. The system comprises sensory devices with sensors that detect a user inputting gestures on sensor interfaces, and a cloud system comprising a processor for retrieving the inputted gestures detected by the sensors, comparing them to gestures stored in databases on the cloud system, identifying a text, graphics, and/or speech command comprising a word that corresponds to the inputted gesture, showing the command to the user, and transmitting the command to another device.
Type: Grant
Filed: August 1, 2023
Date of Patent: May 7, 2024
Assignee: COGNIXION CORPORATION
Inventor: Andreas Forsland
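The gesture-to-command lookup this abstract describes can be sketched as a small database-backed resolver. The gesture names, the `Command` type, and the dictionary standing in for the cloud-side databases are all hypothetical, chosen only to illustrate the compare-identify-return flow.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Command:
    word: str  # the word the command communicates
    kind: str  # "text", "graphics", or "speech"

# Hypothetical stand-in for the cloud-side gesture databases
GESTURE_DB = {
    "swipe_up": Command("yes", "speech"),
    "swipe_down": Command("no", "speech"),
    "double_tap": Command("hello", "text"),
}

def resolve_gesture(gesture: str) -> Optional[Command]:
    """Compare an inputted gesture against stored gestures and return
    the matching command, or None when no stored gesture matches."""
    return GESTURE_DB.get(gesture)

cmd = resolve_gesture("double_tap")
print(cmd.word, cmd.kind)  # prints: hello text
```

A real system would match noisy sensor readings against stored gesture templates (e.g. by distance in feature space) rather than exact string keys; the dictionary keeps the sketch self-contained.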
-
Patent number: 11762467
Abstract: Disclosed are devices, systems, and methods for nonverbal multi-input and feedback devices for user-intended computer control and communication of text, graphics, and audio. The system comprises sensory devices with sensors that detect a user inputting gestures on sensor interfaces, and a cloud system comprising a processor for retrieving the inputted gestures detected by the sensors, comparing them to gestures stored in databases on the cloud system, identifying a text, graphics, and/or speech command comprising a word that corresponds to the inputted gesture, showing the command to the user, and transmitting the command to another device.
Type: Grant
Filed: December 21, 2022
Date of Patent: September 19, 2023
Assignee: COGNIXION CORPORATION
Inventor: Andreas Forsland
-
Publication number: 20230274516
Abstract: A method and system of a user interface device with a dual-sided display, which may include a brain-computer interface with an Augmented Reality (AR) headset. The user's intent is sent to the system, which processes, analyzes, and maps it. An output corresponding to the user's intent is projected using the user interface device and displayed on the user's side of the display. An image corresponding to the output is displayed on the observer's side of the display.
Type: Application
Filed: May 9, 2023
Publication date: August 31, 2023
Applicant: Cognixion Corporation
Inventors: Joseph Andreas Forsland, Leonard Zerman
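The dual-sided rendering can be sketched as producing two frames from one output: the user-side frame as-is, and an observer-side copy mirrored horizontally so it reads correctly from the other side of the panel. The array representation of a frame and the mirroring choice are assumptions for illustration, not details from the publication.

```python
import numpy as np

def dual_sided_frames(user_frame: np.ndarray):
    """Return (user_side, observer_side) frames for a dual-sided display.

    The observer-side image is a horizontally mirrored copy of the
    user-side output, so both viewers see it the right way around.
    """
    return user_frame, np.fliplr(user_frame)

frame = np.array([[1, 2, 3],
                  [4, 5, 6]])
user, observer = dual_sided_frames(frame)
print(observer.tolist())  # prints: [[3, 2, 1], [6, 5, 4]]
```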
-
Patent number: 11561616
Abstract: Disclosed are devices, systems, and methods for nonverbal multi-input and feedback devices for user-intended computer control and communication of text, graphics, and audio. The system comprises sensory devices with sensors that detect a user inputting gestures on sensor interfaces, and a cloud system comprising a processor for retrieving the inputted gestures detected by the sensors, comparing them to gestures stored in databases on the cloud system, identifying a text, graphics, and/or speech command comprising a word that corresponds to the inputted gesture, showing the command to the user, and transmitting the command to another device.
Type: Grant
Filed: December 23, 2021
Date of Patent: January 24, 2023
Assignee: Cognixion Corporation
Inventor: Andreas Forsland
-
Publication number: 20220326772
Abstract: An apparatus, system, and method of a brain-computer interface in a headset including an augmented reality display, one or more sensors, a processing module, at least one biofeedback device, and a battery. The interface may include a printed circuit board whose sensors read bio-signals, provide biofeedback, and perform the processing, analyzing, and mapping of bio-signals into output. The output provides feedback via stimulation of multiple sensory brain systems of the user: audio and visuals on the augmented reality display, or audio and haptics in the form of vibration patterns the user can feel. Together this forms a closed-loop system: the bio-signal is detected, sensory feedback is provided, and that feedback in turn enhances the bio-signal.
Type: Application
Filed: June 23, 2022
Publication date: October 13, 2022
Applicant: Cognixion Corporation
Inventors: Andreas Forsland, Leonard Zerman
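The closed loop described above — detect bio-signal, deliver sensory feedback, feedback enhances the bio-signal — can be sketched as an iterative update. The linear gain model below is an assumption made for illustration, not the patented mechanism.

```python
def closed_loop_step(signal_strength: float, gain: float = 0.2) -> float:
    """One loop iteration: detect the bio-signal, deliver feedback
    proportional to it, and return the reinforced signal."""
    feedback = gain * signal_strength  # audio / visual / haptic stimulation
    return signal_strength + feedback

# Three iterations of the loop compound the reinforcement (factor 1.2 each)
strength = 1.0
for _ in range(3):
    strength = closed_loop_step(strength)
print(round(strength, 3))  # prints: 1.728
```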
-
Publication number: 20220326771
Abstract: A method and system are disclosed using steady-state motion visual evoked potential stimuli in an augmented reality environment. Requested stimuli data are received from a user application on a smart device. Sensor data and other context data are also received, where other context data includes data that is un-sensed. The requested stimuli data are transformed into modified stimuli based on the sensor data and the other context data. Modified stimuli and environmental stimuli are presented to the user with a rendering device configured to mix the modified stimuli and the environmental stimuli, resulting in rendered stimuli. Biosignals generated in response to the rendered stimuli are received from the user by a wearable biosignal sensing device. Received biosignals are classified based on the modified stimuli, resulting in a classified selection, which is returned to the user application.
Type: Application
Filed: April 5, 2022
Publication date: October 13, 2022
Applicant: Cognixion Corporation
Inventors: Sarah Pearce, Aravind Ravi, Jing Lu, Ning Jiang, Andreas Forsland, Chris Ullrich
-
Patent number: 11402909
Abstract: An apparatus, system, and method of a brain-computer interface in a headset including an augmented reality display, one or more sensors, a processing module, at least one biofeedback device, and a battery. The interface may include a printed circuit board whose sensors read bio-signals, provide biofeedback, and perform the processing, analyzing, and mapping of bio-signals into output. The output provides feedback via stimulation of multiple sensory brain systems of the user: audio and visuals on the augmented reality display, or audio and haptics in the form of vibration patterns the user can feel. Together this forms a closed-loop system: the bio-signal is detected, sensory feedback is provided, and that feedback in turn enhances the bio-signal.
Type: Grant
Filed: April 5, 2021
Date of Patent: August 2, 2022
Assignee: Cognixion
Inventors: Andreas Forsland, Leonard Zerman
-
Patent number: 11237635
Abstract: Disclosed are devices, systems, and methods for nonverbal multi-input and feedback devices for user-intended computer control and communication of text, graphics, and audio. The system comprises sensory devices with sensors that detect a user inputting gestures on sensor interfaces, and a cloud system comprising a processor for retrieving the inputted gestures detected by the sensors, comparing them to gestures stored in databases on the cloud system, identifying a text, graphics, and/or speech command comprising a word that corresponds to the inputted gesture, showing the command to the user, and transmitting the command to another device.
Type: Grant
Filed: January 4, 2021
Date of Patent: February 1, 2022
Assignee: COGNIXION
Inventor: Andreas Forsland
-
Publication number: 20210223864
Abstract: An apparatus, system, and method of a brain-computer interface in a headset including an augmented reality display, one or more sensors, a processing module, at least one biofeedback device, and a battery. The interface may include a printed circuit board whose sensors read bio-signals, provide biofeedback, and perform the processing, analyzing, and mapping of bio-signals into output. The output provides feedback via stimulation of multiple sensory brain systems of the user: audio and visuals on the augmented reality display, or audio and haptics in the form of vibration patterns the user can feel. Together this forms a closed-loop system: the bio-signal is detected, sensory feedback is provided, and that feedback in turn enhances the bio-signal.
Type: Application
Filed: April 5, 2021
Publication date: July 22, 2021
Applicant: Cognixion
Inventors: Andreas Forsland, Leonard Zerman
-
Patent number: 10990175
Abstract: A method and system of a brain-computer interface in a headset including an augmented reality display, one or more sensors, a processing module, at least one biofeedback device, and a battery. The interface may include a printed circuit board whose sensors read bio-signals, provide biofeedback, and perform the processing, analyzing, and mapping of bio-signals into output. The output provides feedback via stimulation of multiple sensory brain systems of the user: audio and visuals on the augmented reality display, or audio and haptics in the form of vibration patterns the user can feel. Together this forms a closed-loop system: the bio-signal is detected, sensory feedback is provided, and that feedback in turn enhances the bio-signal.
Type: Grant
Filed: January 9, 2019
Date of Patent: April 27, 2021
Assignee: Cognixion
Inventors: Andreas Forsland, Leonard Zerman
-
Patent number: 10860095
Abstract: A method of real-time eye tracking feedback with an eye-movement tracking camera includes receiving a left eye movement transform, a right eye movement transform, and gaze direction information from a user's face and eyes. An eye tracking map is constructed including the left and right eye movement transforms. The eye tracking map is displayed with the left eye movement information, the right eye movement information, and the gaze direction information on a device screen. Feedback related to the left eye movement transform, the right eye movement transform, and the gaze direction information is provided to the user.
Type: Grant
Filed: May 2, 2019
Date of Patent: December 8, 2020
Assignee: Cognixion
Inventors: Leonard Zerman, Andreas Forsland
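The eye tracking map can be sketched as a small structure combining the two per-eye transforms with the gaze direction. Representing each transform as a 2-D offset and combining them by midpoint are illustrative assumptions, not details taken from the patent.

```python
def build_eye_tracking_map(left, right, gaze):
    """Assemble the map shown on the device screen.

    `left` and `right` are per-eye movement transforms (here, 2-D
    offsets); `gaze` is a direction label. A combined estimate is
    taken as the midpoint of the two eye transforms.
    """
    combined = tuple((a + b) / 2 for a, b in zip(left, right))
    return {"left": left, "right": right, "gaze": gaze, "combined": combined}

m = build_eye_tracking_map((1.0, 2.0), (3.0, 2.0), "center")
print(m["combined"])  # prints: (2.0, 2.0)
```

A display layer would then render this map and the per-eye information on screen as the feedback the abstract describes.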
-
Publication number: 20200234503
Abstract: A method and system of a user interface device with a dual-sided display, which may include a brain-computer interface with an Augmented Reality (AR) headset. The user's intent is sent to the system, which processes, analyzes, and maps it. An output corresponding to the user's intent is projected using the user interface device and displayed on the user's side of the display. An image corresponding to the output is displayed on the observer's side of the display.
Type: Application
Filed: January 22, 2020
Publication date: July 23, 2020
Applicant: Cognixion
Inventors: Andreas Forsland, Leonard Zerman
-
Patent number: D953328
Type: Grant
Filed: January 16, 2021
Date of Patent: May 31, 2022
Assignee: Cognixion
Inventor: Andreas Forsland
-
Patent number: D952630
Type: Grant
Filed: January 16, 2021
Date of Patent: May 24, 2022
Assignee: Cognixion
Inventor: Andreas Forsland