Patents by Inventor Josh Watson
Josh Watson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11787690
Abstract: A method of forming a micro electro mechanical system (MEMS) assembly comprises providing a substrate having an electrically conductive layer disposed thereon. The method also comprises depositing, on the substrate over the electrically conductive layer, a bonding material having an elastic modulus of less than 500 MPa so as to form a bond layer. The bond layer is completely cured, and a MEMS die is attached to the completely cured bond layer.
Type: Grant
Filed: April 2, 2021
Date of Patent: October 17, 2023
Assignee: KNOWLES ELECTRONICS, LLC.
Inventors: Sung Bok Lee, John Szczech, Josh Watson
-
Publication number: 20200245053
Abstract: A MEMS vibration sensor die can include a substrate having a top portion, a mounting surface, and an aperture extending at least partially through the substrate. The die can include a first electrode coupled to the top portion of the substrate and positioned over the aperture. The die can include a second electrode disposed between the substrate and the first electrode. The second electrode can be spaced apart from the first electrode. The die can include a proof mass that can have a first portion coupled to the first electrode or the second electrode. The proof mass can have a second end opposite the first portion. The second end can be recessed within the aperture relative to the mounting surface of the substrate. The proof mass can be suspended freely within the aperture. The proof mass can move the first electrode or the second electrode from which it is suspended in response to vibration.
Type: Application
Filed: April 9, 2020
Publication date: July 30, 2020
Inventors: Michael Pedersen, Venkataraman Chandrasekaran, Josh Watson, Jeremy Johnson
-
Publication number: 20190194013
Abstract: Systems and apparatuses for a microelectromechanical system (MEMS) device. The MEMS device includes a housing, a transducer, and a sensor. The housing includes a substrate defining a port and a cover. The substrate and the cover cooperatively form an internal cavity. The port fluidly couples the internal cavity to an external environment. The transducer is disposed within the internal cavity and positioned to receive acoustic energy through the port. The transducer is configured to convert the acoustic energy into an electrical signal. The sensor is disposed within the internal cavity and positioned to receive a gas through the port. The sensor is configured to facilitate detecting at least one of an offensive odor, smoke, a volatile organic compound, carbon monoxide, carbon dioxide, a nitrogen oxide, methane, and ozone.
Type: Application
Filed: June 9, 2017
Publication date: June 27, 2019
Applicant: Knowles Electronics, LLC
Inventors: Venkataraman Chandrasekaran, John Szczech, Josh Watson, Claus Furst
-
Patent number: 9137734
Abstract: A computing device's settings, specifically audio or video settings, are adjusted based on a status and/or location of an event participant carrying the device. An event coordination service or an application on the computing device may match the participant's status and location to event location information. Upon matching either status or location, the computing device's settings may be modified to prevent audio feedback from any audio equipment at the event location, or similar video interference.
Type: Grant
Filed: March 30, 2011
Date of Patent: September 15, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Josh Watson, Nghiep Duong
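The matching step this abstract describes — compare a participant's status and location against the event record, then mute the device if either matches — can be illustrated with a minimal sketch. All names here (`Participant`, `Event`, `adjust_device_settings`, the `"checked_in"` status value) are hypothetical, invented for illustration; they are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Participant:
    status: str    # e.g. "checked_in" (hypothetical status value)
    location: str  # e.g. "Hall A"

@dataclass
class Event:
    location: str
    has_live_audio: bool

def adjust_device_settings(p: Participant, event: Event) -> dict:
    """Mute the device's speaker and mic when the participant's status
    or location matches the event, to prevent audio feedback from the
    venue's audio equipment."""
    matched = p.status == "checked_in" or p.location == event.location
    if matched and event.has_live_audio:
        return {"speaker_muted": True, "mic_muted": True}
    return {"speaker_muted": False, "mic_muted": False}
```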
-
Publication number: 20140380156
Abstract: An interactive wall provides tactile feedback based on detected touch and displayed content on the wall surface. Upon detecting a touch on the wall surface, the interactive wall provides real-time tactile feedback corresponding to the touch, employing one or more actuators. The interactive wall itself may serve as a projection surface for the display, or a wall-size display may be affixed to the wall, trans-conducting tactile feedback to the user.
Type: Application
Filed: September 2, 2014
Publication date: December 25, 2014
Inventors: William Verthein, Josh Watson, Stephen Lee
-
Patent number: 8830188
Abstract: An interactive wall provides tactile feedback based on detected touch and displayed content on the wall surface. Upon detecting a touch on the wall surface, the interactive wall provides real-time tactile feedback corresponding to the touch, employing one or more actuators. The interactive wall itself may serve as a projection surface for the display, or a wall-size display may be affixed to the wall, trans-conducting tactile feedback to the user.
Type: Grant
Filed: June 21, 2011
Date of Patent: September 9, 2014
Assignee: Microsoft Corporation
Inventors: William Verthein, Josh Watson, Stephen Lee
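One piece of the behavior this abstract describes — driving feedback from the actuator nearest the detected touch point — can be sketched as follows. This is an illustrative assumption about how touch-to-actuator mapping might work, not the patented mechanism; the function name and coordinate convention are invented.

```python
import math

def nearest_actuator(touch_xy, actuator_positions):
    """Return the index of the actuator closest to a detected touch
    position on the wall surface, so tactile feedback can be driven
    at the touch point. Positions are (x, y) pairs in wall units."""
    return min(
        range(len(actuator_positions)),
        key=lambda i: math.dist(touch_xy, actuator_positions[i]),
    )
```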
-
Patent number: 8773499
Abstract: A dynamically adjustable framed view of occupants in a room is captured through an automatic framing system. The system employs a camera system, including a pan/tilt/zoom (PTZ) camera and one or more depth cameras, to automatically locate occupants in a room and adjust the PTZ camera's pan, tilt, and zoom settings to focus in on the occupants and center them in the main video frame. The depth cameras may distinguish between occupants and inanimate objects and adaptively determine the location of the occupants in the room. The PTZ camera may be calibrated with the depth cameras in order to use the location information determined by the depth cameras to automatically center the occupants in the main video frame for a framed view. Additionally, the system may track position changes in the room and may dynamically adjust and update the framed view when changes occur.
Type: Grant
Filed: June 24, 2011
Date of Patent: July 8, 2014
Assignee: Microsoft Corporation
Inventors: Josh Watson, Simone Leorin, Ross G. Cutler
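The geometric core of what this abstract describes — converting occupant positions reported by depth cameras into pan, tilt, and zoom settings that center and frame the group — can be sketched in a few lines. This is a simplified illustration under assumed conventions (positions as (x, y, depth) in a camera-aligned frame, angles in degrees, zoom filling roughly 80% of the field of view); none of these specifics come from the patent.

```python
import math

def compute_ptz(occupants, fov_deg=60.0):
    """Center a PTZ camera on the bounding box of occupant positions.
    occupants: list of (x, y, depth) tuples from the depth cameras.
    Returns (pan_deg, tilt_deg, zoom_factor)."""
    xs = [o[0] for o in occupants]
    ys = [o[1] for o in occupants]
    depth = sum(o[2] for o in occupants) / len(occupants)
    # Aim at the center of the occupants' bounding box.
    cx = (min(xs) + max(xs)) / 2
    cy = (min(ys) + max(ys)) / 2
    pan = math.degrees(math.atan2(cx, depth))
    tilt = math.degrees(math.atan2(cy, depth))
    # Zoom so the occupant spread fills ~80% of the field of view.
    width = max(max(xs) - min(xs), 1e-6)
    required_fov = 2 * math.degrees(math.atan2(width / 2 / 0.8, depth))
    zoom = max(1.0, fov_deg / required_fov)
    return pan, tilt, zoom
```

Re-running this as occupants move would give the dynamic re-framing behavior the abstract mentions.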
-
Patent number: 8698874
Abstract: Techniques to stitch together multiple video streams are described. In an embodiment, a technique may include receiving a plurality of video streams from a plurality of video sources in a room. The video streams may be analyzed for feature points, such as furniture, light fixtures, window frames and so forth. The video streams may be processed to make the video qualities of the video streams, such as scale, color, brightness and so forth, more consistent with each other. Using the feature points, the processed video streams may be stitched together to generate a unified stream. The unified stream may be output to a display in the room and/or to remote viewers. Other embodiments are described and claimed.
Type: Grant
Filed: June 10, 2011
Date of Patent: April 15, 2014
Assignee: Microsoft Corporation
Inventors: Josh Watson, Simone Leorin
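One step in the pipeline this abstract outlines — making the streams' brightness consistent before stitching — can be illustrated with a standard mean/variance matching sketch. The function name and the flat-list frame representation are assumptions for the example; this is a generic normalization technique, not the patent's specific processing.

```python
from statistics import fmean, pstdev

def match_brightness(frame, reference):
    """Shift and scale a frame's pixel intensities so their mean and
    standard deviation match a reference frame, making adjacent video
    streams more consistent before stitching on shared feature points.
    Frames are flat lists of 0-255 intensity values."""
    f_mean, f_std = fmean(frame), pstdev(frame)
    r_mean, r_std = fmean(reference), pstdev(reference)
    scale = r_std / f_std if f_std else 0.0
    return [min(255, max(0, (v - f_mean) * scale + r_mean)) for v in frame]
```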
-
Patent number: 8624955
Abstract: Techniques are described to provide a fixed video feed display from a remote participant to a conference room, where the display further includes remote participant information. In one embodiment, for example, a method may include receiving a connection from a remote participant, retrieving metadata related to the remote participant, and displaying a video feed from the remote participant along with the metadata in a dedicated position in a conference room. The metadata may provide information about the remote participant, as well as points of interest that may aid in conversation with the remote participant. The remote feed remains in the dedicated position throughout the conference, creating the effect of the remote participant being in the room. Other embodiments are described and claimed.
Type: Grant
Filed: June 2, 2011
Date of Patent: January 7, 2014
Assignee: Microsoft Corporation
Inventors: Josh Watson, Simone Leorin, Paul Kwoka
-
Publication number: 20120327179
Abstract: A dynamically adjustable framed view of occupants in a room is captured through an automatic framing system. The system employs a camera system, including a pan/tilt/zoom (PTZ) camera and one or more depth cameras, to automatically locate occupants in a room and adjust the PTZ camera's pan, tilt, and zoom settings to focus in on the occupants and center them in the main video frame. The depth cameras may distinguish between occupants and inanimate objects and adaptively determine the location of the occupants in the room. The PTZ camera may be calibrated with the depth cameras in order to use the location information determined by the depth cameras to automatically center the occupants in the main video frame for a framed view. Additionally, the system may track position changes in the room and may dynamically adjust and update the framed view when changes occur.
Type: Application
Filed: June 24, 2011
Publication date: December 27, 2012
Applicant: MICROSOFT CORPORATION
Inventors: Josh Watson, Simone Leorin, Ross G. Cutler
-
Publication number: 20120326989
Abstract: An interactive wall provides tactile feedback based on detected touch and displayed content on the wall surface. Upon detecting a touch on the wall surface, the interactive wall provides real-time tactile feedback corresponding to the touch, employing one or more actuators. The interactive wall itself may serve as a projection surface for the display, or a wall-size display may be affixed to the wall, trans-conducting tactile feedback to the user.
Type: Application
Filed: June 21, 2011
Publication date: December 27, 2012
Applicant: Microsoft Corporation
Inventors: William Verthein, Josh Watson, Stephen Lee
-
Publication number: 20120314015
Abstract: Techniques to stitch together multiple video streams are described. In an embodiment, a technique may include receiving a plurality of video streams from a plurality of video sources in a room. The video streams may be analyzed for feature points, such as furniture, light fixtures, window frames and so forth. The video streams may be processed to make the video qualities of the video streams, such as scale, color, brightness and so forth, more consistent with each other. Using the feature points, the processed video streams may be stitched together to generate a unified stream. The unified stream may be output to a display in the room and/or to remote viewers. Other embodiments are described and claimed.
Type: Application
Filed: June 10, 2011
Publication date: December 13, 2012
Applicant: MICROSOFT CORPORATION
Inventors: Josh Watson, Simone Leorin
-
Publication number: 20120306992
Abstract: Techniques are described to provide a fixed video feed display from a remote participant to a conference room, where the display further includes remote participant information. In one embodiment, for example, a method may include receiving a connection from a remote participant, retrieving metadata related to the remote participant, and displaying a video feed from the remote participant along with the metadata in a dedicated position in a conference room. The metadata may provide information about the remote participant, as well as points of interest that may aid in conversation with the remote participant. The remote feed remains in the dedicated position throughout the conference, creating the effect of the remote participant being in the room. Other embodiments are described and claimed.
Type: Application
Filed: June 2, 2011
Publication date: December 6, 2012
Applicant: MICROSOFT CORPORATION
Inventors: Josh Watson, Simone Leorin, Paul Kwoka
-
Publication number: 20120293606
Abstract: Techniques for automatically selecting a video camera feed based on room events in a video teleconference are described. An embodiment may receive video information from multiple cameras in a conference room. An event of interest may be detected from the video information. Events of interest may be detected, for example, by detecting faces, detecting an eye gaze or head direction, and detecting motion. When an event of interest is detected, the video camera having the optimal view of the event may be selected, and the feed from the selected video camera may be transmitted to remote participants. Other embodiments are described and claimed.
Type: Application
Filed: May 20, 2011
Publication date: November 22, 2012
Applicant: MICROSOFT CORPORATION
Inventors: Josh Watson, Simone Leorin
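The selection step this abstract describes — score each camera's view of a detected event and transmit the feed from the best-scoring camera — reduces to a simple argmax, sketched below. The scoring weights and function names are invented for illustration; the publication does not specify how views are scored.

```python
def score_view(faces: int, gaze_hits: int, motion: float) -> float:
    """Combine per-camera event cues (face detections, gaze/head-direction
    hits, motion level) into a single view score. The weights here are
    arbitrary illustrative values, not from the publication."""
    return 0.5 * faces + 0.3 * gaze_hits + 0.2 * motion

def select_camera_feed(camera_scores: dict) -> str:
    """Return the id of the camera with the best view of the event;
    its feed would be the one transmitted to remote participants."""
    return max(camera_scores, key=camera_scores.get)
```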
-
Publication number: 20120254382
Abstract: A computing device's settings, specifically audio or video settings, are adjusted based on a status and/or location of an event participant carrying the device. An event coordination service or an application on the computing device may match the participant's status and location to event location information. Upon matching either status or location, the computing device's settings may be modified to prevent audio feedback from any audio equipment at the event location, or similar video interference.
Type: Application
Filed: March 30, 2011
Publication date: October 4, 2012
Applicant: MICROSOFT CORPORATION
Inventors: Josh Watson, Nghiep Duong