Patents by Inventor David Saul Hermina Martinez
David Saul Hermina Martinez is a named inventor on the patent filings listed below. The listing includes pending patent applications as well as patents granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11127217
Abstract: A method, system, computer-readable media, and apparatuses for providing a shared vehicle experience between a user located in a remote location and one or more occupants of a vehicle. A device associated with the user receives an exterior video stream corresponding to an exterior environment of the vehicle, along with occupant information comprising location of the one or more occupants in the vehicle. Based on the occupant information, a first augmented video stream is generated. The first augmented video stream comprises one or more dynamic avatars representing each of the occupants of the vehicle, and shows the exterior environment of the vehicle from a perspective of a virtual passenger of the vehicle. The first augmented video stream is displayed to the user to provide the user with an experience of being in the vehicle.
Type: Grant
Filed: October 29, 2019
Date of Patent: September 21, 2021
Inventors: David Saul Hermina Martinez, Eugenia Yi Jen Leu, Delbert Bramlett Boone, II, Mohamed Amr Mohamed Nader Abuelfoutouh, Christopher Steven Nowakowski, Jean-Patrick Christian Marie Favier, Félix Louis Jean Marquette, Julie Marie Hardouin Arouna, Siav-Kuong Kuoch
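The avatar-placement step this abstract describes could be sketched as follows. The seat-to-screen mapping, the `Occupant` type, and all names are illustrative assumptions for a flat 2-D overlay, not the patented implementation:

```python
# Hypothetical sketch: given the exterior frame size and the reported seat of
# each occupant, compute a pixel anchor where that occupant's dynamic avatar
# would be drawn, as seen from the virtual passenger's perspective.
from dataclasses import dataclass

@dataclass
class Occupant:
    name: str
    seat: str  # e.g. "driver", "front_passenger", "rear_left", "rear_right"

# Assumed mapping from seat to a normalized (x, y) screen anchor.
SEAT_TO_ANCHOR = {
    "driver": (0.25, 0.5),
    "front_passenger": (0.75, 0.5),
    "rear_left": (0.375, 0.75),
    "rear_right": (0.625, 0.75),
}

def augment_frame(frame_size, occupants):
    """Return (name, pixel_x, pixel_y) avatar placements for one video frame."""
    width, height = frame_size
    placements = []
    for occ in occupants:
        nx, ny = SEAT_TO_ANCHOR[occ.seat]
        placements.append((occ.name, int(nx * width), int(ny * height)))
    return placements

placements = augment_frame(
    (1920, 1080),
    [Occupant("driver", "driver"), Occupant("child", "rear_left")],
)
```

In a real system these anchors would be re-derived per frame from head-tracking or seat-occupancy data so the avatars stay dynamic as the abstract requires.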
-
Patent number: 11107281
Abstract: A method, system, computer-readable media, and apparatuses for providing a shared vehicle experience. The method includes capturing an exterior video stream that depicts an exterior environment of a vehicle, and receiving remote user information indicating a bodily movement of a remote user. The method further includes generating and displaying an augmented video stream comprising a dynamic avatar representing the remote user as a virtual passenger, the dynamic avatar being updated based on the remote user information while the augmented video stream is being displayed in the vehicle. At least a portion of the exterior video stream and occupant information are sent to a computer device that causes, based on the at least a portion of the exterior video stream and the occupant information, a display device of the remote user to output an additional augmented video stream that provides the remote user with an experience of being in the vehicle.
Type: Grant
Filed: May 18, 2018
Date of Patent: August 31, 2021
Inventors: David Saul Hermina Martinez, Eugenia Yi Jen Leu, Delbert Bramlett Boone, II, Mohamed Amr Mohamed Nader Abuelfoutouh, Christopher Steven Nowakowski, Jean-Patrick Christian Marie Favier, Félix Louis Jean Marquette, Julie Marie Hardouin Arouna, Siav-Kuong Kuoch
-
Patent number: 10836313
Abstract: Methods, apparatuses, and computer-readable media are disclosed for providing a mixed-reality scene. In response to a pedestrian detection event at a first vehicle, a sequence of mixed-reality images is presented to a driver of a second vehicle. At least one image in the sequence of mixed-reality images results from merging (a) an image captured by a camera aboard the first vehicle and (b) an image captured by a camera aboard the second vehicle. The merging may comprise de-emphasizing an occluded portion of the image captured by the camera aboard the second vehicle and emphasizing an unoccluded portion of the image captured by the camera aboard the first vehicle. The unoccluded portion of the image captured by the camera aboard the first vehicle may provide, in the merged image, visibility of a pedestrian at least partially blocked from a view of the driver of the second vehicle.
Type: Grant
Filed: November 28, 2018
Date of Patent: November 17, 2020
Assignee: Valeo Comfort and Driving Assistance
Inventors: Christopher Steven Nowakowski, David Saul Hermina Martinez, Delbert Bramlett Boone, II, Eugenia Yi Jen Leu, Mohamed Amr Mohamed Nader Abuelfoutouh, Sonam Negi, Tung Ngoc Truong
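The merging step this abstract describes (de-emphasize the occluded region of one image, emphasize the unoccluded region of the other) could be realized as a per-pixel alpha blend. The function name, the `emphasis` parameter, and the blend formula below are assumptions for illustration; the patent does not prescribe this exact formula, and real use would require the first vehicle's frame to be warped into the second vehicle's viewpoint:

```python
# Minimal NumPy sketch of occlusion-aware image merging.
import numpy as np

def merge_views(img_second, img_first_warped, occlusion_mask, emphasis=0.8):
    """Blend two aligned frames into one mixed-reality frame.

    img_second: frame from the second (following) vehicle, HxWx3 floats in [0, 1].
    img_first_warped: frame from the first vehicle, already warped into the
        second vehicle's viewpoint.
    occlusion_mask: HxW floats in [0, 1]; 1 where the second vehicle's view is
        occluded and the first vehicle's content should show through.
    emphasis: strength with which the unoccluded content replaces the
        occluded content.
    """
    alpha = emphasis * occlusion_mask[..., None]  # broadcast over RGB channels
    return (1.0 - alpha) * img_second + alpha * img_first_warped

# Toy example: the second vehicle's view is black, the first vehicle's is white,
# and the top-left / bottom-right pixels are marked occluded.
img2 = np.zeros((2, 2, 3))
img1 = np.ones((2, 2, 3))
mask = np.array([[1.0, 0.0], [0.0, 1.0]])
merged = merge_views(img2, img1, mask)
```

Keeping `emphasis` below 1.0 leaves a faint trace of the occluding object (here, the lead vehicle), which helps the driver stay aware that the revealed pedestrian is actually behind it.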
-
Publication number: 20200164799
Abstract: Methods, apparatuses, and computer-readable media are disclosed for providing a mixed-reality scene. In response to a pedestrian detection event at a first vehicle, a sequence of mixed-reality images is presented to a driver of a second vehicle. At least one image in the sequence of mixed-reality images results from merging (a) an image captured by a camera aboard the first vehicle and (b) an image captured by a camera aboard the second vehicle. The merging may comprise de-emphasizing an occluded portion of the image captured by the camera aboard the second vehicle and emphasizing an unoccluded portion of the image captured by the camera aboard the first vehicle. The unoccluded portion of the image captured by the camera aboard the first vehicle may provide, in the merged image, visibility of a pedestrian at least partially blocked from a view of the driver of the second vehicle.
Type: Application
Filed: November 28, 2018
Publication date: May 28, 2020
Inventors: Christopher Steven Nowakowski, David Saul Hermina Martinez, Delbert Bramlett Boone, II, Eugenia Yi Jen Leu, Mohamed Amr Mohamed Nader Abuelfoutouh, Sonam Negi, Tung Ngoc Truong
-
Publication number: 20200086789
Abstract: Methods, apparatuses, and computer-readable media are disclosed for providing a mixed-reality scene. According to one embodiment, a sequence of mixed-reality images is presented to a driver of a first vehicle, the first vehicle oriented in a substantially opposing direction relative to a second vehicle. At least one image in the sequence of mixed-reality images may result from merging (a) an image captured by a forward-facing camera aboard the first vehicle and (b) an image captured by a rear-facing camera aboard the second vehicle. The merging may comprise de-emphasizing an occluded portion of the image captured by the forward-facing camera aboard the first vehicle, the occluded portion corresponding to occlusion by the second vehicle, and emphasizing an unoccluded portion of the image captured by the rear-facing camera aboard the second vehicle.
Type: Application
Filed: September 13, 2018
Publication date: March 19, 2020
Inventors: Christopher Steven Nowakowski, David Saul Hermina Martinez, Delbert Bramlett Boone, II, Eugenia Yi Jen Leu, Mohamed Amr Mohamed Nader Abuelfoutouh, Sonam Negi, Tung Ngoc Truong
-
Publication number: 20200066055
Abstract: A method, system, computer-readable media, and apparatuses for providing a shared vehicle experience between a user located in a remote location and one or more occupants of a vehicle. A device associated with the user receives an exterior video stream corresponding to an exterior environment of the vehicle, along with occupant information comprising location of the one or more occupants in the vehicle. Based on the occupant information, a first augmented video stream is generated. The first augmented video stream comprises one or more dynamic avatars representing each of the occupants of the vehicle, and shows the exterior environment of the vehicle from a perspective of a virtual passenger of the vehicle. The first augmented video stream is displayed to the user to provide the user with an experience of being in the vehicle.
Type: Application
Filed: October 29, 2019
Publication date: February 27, 2020
Inventors: David Saul Hermina Martinez, Eugenia Yi Jen Leu, Delbert Bramlett Boone, II, Mohamed Amr Mohamed Nader Abuelfoutouh, Christopher Steven Nowakowski, Jean-Patrick Christian Marie Favier, Félix Louis Jean Marquette, Julie Marie Hardouin Arouna, Siav-Kuong Kuoch
-
Publication number: 20190355178
Abstract: A method, system, computer-readable media, and apparatuses for providing a shared vehicle experience. The method includes capturing an exterior video stream that depicts an exterior environment of a vehicle, and receiving remote user information indicating a bodily movement of a remote user. The method further includes generating and displaying an augmented video stream comprising a dynamic avatar representing the remote user as a virtual passenger, the dynamic avatar being updated based on the remote user information while the augmented video stream is being displayed in the vehicle. At least a portion of the exterior video stream and occupant information are sent to a computer device that causes, based on the at least a portion of the exterior video stream and the occupant information, a display device of the remote user to output an additional augmented video stream that provides the remote user with an experience of being in the vehicle.
Type: Application
Filed: May 18, 2018
Publication date: November 21, 2019
Inventors: David Saul Hermina Martinez, Eugenia Yi Jen Leu, Delbert Bramlett Boone, II, Mohamed Amr Mohamed Nader Abuelfoutouh, Christopher Steven Nowakowski, Jean-Patrick Christian Marie Favier, Félix Louis Jean Marquette, Julie Marie Hardouin Arouna, Siav-Kuong Kuoch
-
Publication number: 20190114911
Abstract: A method for determining a location of a vehicle. The method includes obtaining, by a marker reader of the vehicle, a marker location from a signal emitted by a marker located in a vicinity of the vehicle. The marker location is provided relative to a known reference point. The method further includes obtaining vehicle location information relative to the marker using at least one sensor disposed in the vehicle, and obtaining an estimate of the vehicle location relative to the known reference point, using the marker location and the vehicle location information.
Type: Application
Filed: October 12, 2017
Publication date: April 18, 2019
Applicant: Valeo North America, Inc.
Inventors: Shahram Rezaei, David Saul Hermina Martinez
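The localization arithmetic in this abstract reduces, in the simplest case, to composing two offsets expressed in the same reference frame. The sketch below assumes a flat 2-D frame and invented names; a real implementation would work in 3-D, handle heading, and fuse multiple markers and sensor readings:

```python
# Hypothetical sketch: the marker broadcasts its position relative to a known
# reference point; an on-board sensor measures the vehicle's offset from the
# marker; summing the two estimates the vehicle's position in the same frame.
def estimate_vehicle_location(marker_location, offset_from_marker):
    """Both arguments are (x, y) tuples in metres, in the reference frame."""
    mx, my = marker_location
    dx, dy = offset_from_marker
    return (mx + dx, my + dy)

# Marker is 120 m east of the reference point; the sensor reports the vehicle
# 3 m east and 1.5 m north of the marker.
estimate = estimate_vehicle_location((120.0, 0.0), (3.0, 1.5))  # (123.0, 1.5)
```

This marker-relative scheme is useful precisely where GPS degrades (tunnels, parking garages), since the marker's surveyed position anchors the estimate to the reference point.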
-
Publication number: 20190102082
Abstract: A touch-sensitive alphanumeric user interface for a vehicle. The touch-sensitive alphanumeric user interface includes a touchpad, a touchpad interface unit, a display interface unit and a display. The touchpad is integrated in a steering wheel of the vehicle and obtains coordinates of a trajectory performed by a finger of a user of the vehicle. The touchpad interface unit determines that a finger movement, identified from the first coordinates, is an arcuate movement, and based on the determination, issues arcuate movement instructions to the display interface unit. The display interface unit renders a visualization of alphanumeric choices, arranged in a circular pattern, determines a selected alphanumeric choice from the alphanumeric choices, based on the arcuate movement instructions, and renders the selected alphanumeric choice to be highlighted.
Type: Application
Filed: October 3, 2017
Publication date: April 4, 2019
Applicant: Valeo North America, Inc.
Inventors: Siav-Kuong Kuoch, David Saul Hermina Martinez
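One speculative way to map an arcuate finger trajectory to a highlighted character in a circular layout is to convert the trajectory's current point to an angle around the pad centre and bucket that angle into equal arc slots, one per choice. The function, the A–Z choice set, and the angle convention below are assumptions, not the patented mechanism:

```python
# Hypothetical sketch: angle-to-slot selection for a circular character wheel.
import math

CHOICES = [chr(c) for c in range(ord("A"), ord("Z") + 1)]  # 26 slots in a circle

def highlighted_choice(x, y, choices=CHOICES):
    """Map a touchpad coordinate (relative to the pad centre) to a choice.

    Angle 0 points right and increases counter-clockwise; each choice
    occupies an equal arc of the circle, starting at angle 0.
    """
    angle = math.atan2(y, x) % (2 * math.pi)          # normalize to [0, 2*pi)
    slot = int(angle / (2 * math.pi / len(choices)))  # which arc the angle falls in
    return choices[slot % len(choices)]
```

As the finger traces the arc, calling this per sample makes the highlight sweep through adjacent characters, matching the "render the selected choice highlighted" behavior the abstract describes.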