Patents by Inventor Joseph Paradiso

Joseph Paradiso has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240051686
    Abstract: Described herein is a sensing textile for a spacecraft, comprising an aerospace-grade fabric substrate having a surface and one or more sensing fibers coupled to the aerospace-grade fabric substrate, wherein at least a subset of the sensing fibers extends above the surface of the substrate. In some embodiments, the sensing fibers comprise one or more of an impact sensor, a charge sensor, a thermal sensor, or a radiative surface. In some embodiments, the sensing fibers are configured to form one or more patterned topologies about the surface of the aerospace-grade fabric substrate. In some embodiments, the patterned topologies comprise one or more of a pile, looped pile, waffle, spacer, seersucker, plissé, or an embroidery.
    Type: Application
    Filed: August 10, 2023
    Publication date: February 15, 2024
    Applicant: Massachusetts Institute of Technology
    Inventors: Juliana Mae CHERSTON, Joseph A. PARADISO
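    The entry above describes impact, charge, and thermal sensing fibers woven into an aerospace-grade fabric. As a rough illustration of how per-fiber readings from such a textile might be aggregated in software, here is a minimal Python sketch; the fiber categories, sample values, and detection threshold are hypothetical and are not taken from the publication.
    ```python
    from dataclasses import dataclass
    from typing import List

    # Hypothetical fiber categories drawn from the abstract's sensor types.
    IMPACT, CHARGE, THERMAL = "impact", "charge", "thermal"

    @dataclass
    class SensingFiber:
        fiber_id: int
        kind: str              # one of IMPACT, CHARGE, THERMAL
        readings: List[float]  # raw samples, arbitrary units

    def detect_impacts(fibers, threshold=3.0):
        """Return (fiber_id, sample_index) pairs whose impact-fiber
        readings exceed an illustrative threshold."""
        events = []
        for fiber in fibers:
            if fiber.kind != IMPACT:
                continue
            for i, value in enumerate(fiber.readings):
                if value > threshold:
                    events.append((fiber.fiber_id, i))
        return events

    if __name__ == "__main__":
        textile = [
            SensingFiber(0, IMPACT, [0.1, 0.2, 4.7, 0.3]),   # simulated strike
            SensingFiber(1, THERMAL, [21.5, 21.6, 21.8]),
        ]
        print(detect_impacts(textile))  # -> [(0, 2)]
    ```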
  • Publication number: 20230357967
    Abstract: Disclosed herein are systems and techniques for seamless and scalable piezoresistive matrix-based intelligent textile development using digital flat-bed and circular knitting machines. Disclosed embodiments allow for combining and customizing functional conductive, polyester, and spandex yarns, thus allowing for designing the aesthetics and engineering both the electrical and mechanical properties of the pressure-sensing textile. In addition, by incorporating a melting fiber, disclosed embodiments allow for shaping and personalizing a three-dimensional piezoresistive fabric structure that can conform to the human body through thermoforming principles.
    Type: Application
    Filed: May 3, 2023
    Publication date: November 9, 2023
    Applicant: Massachusetts Institute of Technology
    Inventors: Joseph A. Paradiso, Irmandy Wicaksono
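    The entry above describes a knitted piezoresistive pressure-sensing matrix. One common way to read such a matrix is row/column scanning; the sketch below shows that generic readout pattern only, not the publication's actual electronics. The `read_adc` function is a hypothetical stand-in for whatever ADC interface the hardware exposes.
    ```python
    # Generic row/column scan of a piezoresistive pressure matrix.
    import random

    ROWS, COLS = 8, 8

    def read_adc(row: int, col: int) -> float:
        """Placeholder: return the voltage at one row/column crossing.
        Real hardware would drive the row line and sample the column."""
        return random.uniform(0.0, 3.3)

    def scan_matrix() -> list:
        """Scan every crossing once and return a ROWS x COLS frame."""
        return [[read_adc(r, c) for c in range(COLS)] for r in range(ROWS)]

    def pressure_map(frame, v_ref=3.3):
        """Convert raw voltages to a normalized 0..1 pressure estimate.
        The linear mapping is an illustrative assumption."""
        return [[v / v_ref for v in row] for row in frame]

    if __name__ == "__main__":
        frame = scan_matrix()
        print(pressure_map(frame)[0])  # first row of the normalized frame
    ```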
  • Publication number: 20220242594
    Abstract: A method for autonomously assembling a plurality of tiles is performed in a microgravity environment. Each tile includes a shell having a first geometrical shape and an arrangement of first magnets and a controller that are supported by the shell. The controller controls operation of the arrangement of first magnets to self-assemble the shell with another tile. The first magnets are controlled to mate with a complementary arrangement of second magnets on the other tile when the complementary arrangement of second magnets floats to within a range of magnetic attractive force of the arrangement of first magnets, with or without the aid of propulsion. The controllers in the tiles detect the status of the magnetic bonds to determine whether each pair of tiles is properly bonded or has a magnetic bond error. When an error is detected, the tiles are controlled to disassemble and reassemble to correct the error.
    Type: Application
    Filed: February 2, 2022
    Publication date: August 4, 2022
    Applicant: Massachusetts Institute of Technology
    Inventors: Ariel EKBLAW, Joseph A. PARADISO
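    The abstract above sketches a detect-error, disassemble, and reassemble loop for magnetically bonded tiles. The state-machine sketch below is an illustrative reconstruction of that loop; the bond sensing and magnet actuation are stubbed out and do not represent the patent application's actual control logic.
    ```python
    import random

    class Tile:
        """Illustrative tile controller; hardware calls are stubs."""

        def engage_magnets(self):
            print("engaging magnets to mate with neighbor")

        def release_magnets(self):
            print("releasing magnets to disassemble")

        def bond_ok(self) -> bool:
            # Stub for the bond-status sensing the abstract mentions;
            # here, a simulated 80% chance the mating succeeded.
            return random.random() < 0.8

    def assemble_pair(tile: Tile, max_attempts: int = 5) -> bool:
        """Try to bond with a neighbor, disassembling and retrying on error."""
        for attempt in range(1, max_attempts + 1):
            tile.engage_magnets()
            if tile.bond_ok():
                print(f"bonded on attempt {attempt}")
                return True
            print(f"bond error on attempt {attempt}; correcting")
            tile.release_magnets()
        return False

    if __name__ == "__main__":
        assemble_pair(Tile())
    ```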
  • Patent number: 10706820
    Abstract: A system may produce a multimedia presentation that includes visual stimuli, auditory stimuli, olfactory stimuli, thermal stimuli and air currents that are perceptible to a human user. All or part of the system may be housed in or affixed to a table or desk. Sensors may monitor physiology or activities of the user and provide feedback regarding the user's response to the presentation. A user may input instructions for the system. Based on these instructions, the system may present a multimedia presentation which tends to produce a target physiological state of the user that is specified in the instructions or which tends to maintain a current physiological state of the user. The system may employ a control space to control the presentation. This control space may have axes that correspond to how a user perceives multimedia presentations.
    Type: Grant
    Filed: August 19, 2019
    Date of Patent: July 7, 2020
    Assignee: Massachusetts Institute of Technology
    Inventors: Nan Zhao, Joseph Paradiso, Yoav Reches
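    The abstract above describes adjusting a multimedia presentation so that sensed physiology approaches a user-specified target state. A generic proportional feedback loop conveys the idea; the "arousal" metric, the sensor stub, and the gain value are assumptions for illustration, not the patented method.
    ```python
    import random

    def read_arousal() -> float:
        """Stub for a physiological sensor reading in 0..1 (e.g. derived
        from heart rate). Purely illustrative."""
        return random.uniform(0.0, 1.0)

    def update_stimulus(level: float, target: float, measured: float,
                        gain: float = 0.3) -> float:
        """Nudge the stimulus intensity toward the target state with a
        simple proportional step, clamped to the 0..1 range."""
        level += gain * (target - measured)
        return max(0.0, min(1.0, level))

    if __name__ == "__main__":
        level, target = 0.5, 0.2   # e.g. the user asks for a calmer state
        for step in range(5):
            measured = read_arousal()
            level = update_stimulus(level, target, measured)
            print(f"step {step}: measured={measured:.2f} stimulus={level:.2f}")
    ```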
  • Patent number: 10708992
    Abstract: Camera images may be analyzed to identify perceptual control axes for a lighting system. These control axes may correspond to how human users perceive lighting scenes. A camera may capture high-dynamic-range images of a new room under various lighting conditions. Dimensionality-reduction may be applied to the images to create an image-based map. In this map, each datapoint may correspond to one of the images. The map may be transformed to align it with how human users perceive lighting scenes in the new room. The transformed map may be employed as a control space for the lighting system. In some cases, the map is created without gathering user ratings of lighting scenes for the new room. In other cases, creating the map involves gathering user ratings for as few as three lighting scenes for the new room.
    Type: Grant
    Filed: September 26, 2019
    Date of Patent: July 7, 2020
    Assignee: Massachusetts Institute of Technology
    Inventors: Nan Zhao, Joseph Paradiso
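    The abstract above describes applying dimensionality reduction to high-dynamic-range images of a room under different lighting conditions to build an image-based map. The sketch below uses PCA from scikit-learn on synthetic flattened images as one plausible instantiation; the patent does not specify this library or algorithm choice here, so treat it as an assumption.
    ```python
    # One plausible instantiation of the image-based map: PCA over
    # flattened images of the room under different lighting scenes.
    import numpy as np
    from sklearn.decomposition import PCA

    def build_lighting_map(images: np.ndarray, n_axes: int = 2) -> np.ndarray:
        """images: (n_scenes, n_pixels) array of flattened HDR captures.
        Returns an (n_scenes, n_axes) map; each row gives one scene's
        coordinates on the candidate perceptual control axes."""
        pca = PCA(n_components=n_axes)
        return pca.fit_transform(images)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        fake_images = rng.random((10, 64 * 64))   # 10 synthetic lighting scenes
        lighting_map = build_lighting_map(fake_images)
        print(lighting_map.shape)                 # (10, 2)
    ```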
  • Patent number: 10705706
    Abstract: Closed loop control of a multimedia system may be achieved with a control space that includes at least two control axes. In each control axis, a coordinate on the axis may indicate a degree to which a multimedia scene facilitates or is perceived to facilitate a user state or a degree to which the user state is achieved. For example, the two control axes may be focus and restoration. A user may provide input that specifies a target state. The target state may be different than, or the same as, the user's current state. The system may select a scene that has coordinates, in the control space, that are closest, by at least a threshold, to the target state, and present the scene to the user. Sensors in the system may measure a user state that results from presenting the scene. The system may revise the scene's coordinates accordingly.
    Type: Grant
    Filed: March 6, 2018
    Date of Patent: July 7, 2020
    Assignee: Massachusetts Institute of Technology
    Inventors: Nan Zhao, Asaph Azaria, Joseph Paradiso
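    The abstract above describes selecting the stored scene whose control-space coordinates are closest to the target state, subject to a threshold, and later revising those coordinates from sensed outcomes. Below is a minimal sketch of the selection step, assuming a 2D (focus, restoration) space, an invented scene library, and one reading of the threshold as a maximum allowed distance.
    ```python
    import math

    # Hypothetical scene library: name -> (focus, restoration) coordinates.
    SCENES = {
        "forest":  (0.2, 0.9),
        "library": (0.9, 0.3),
        "cafe":    (0.5, 0.5),
    }

    def select_scene(target, max_distance=0.6):
        """Return the scene closest to the target point in the control
        space, or None if nothing lies within the distance threshold."""
        best_name, best_dist = None, float("inf")
        for name, coords in SCENES.items():
            dist = math.dist(coords, target)
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name if best_dist <= max_distance else None

    if __name__ == "__main__":
        print(select_scene((0.8, 0.2)))   # -> "library"
    ```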
  • Publication number: 20200058269
    Abstract: A system may produce a multimedia presentation that includes visual stimuli, auditory stimuli, olfactory stimuli, thermal stimuli and air currents that are perceptible to a human user. All or part of the system may be housed in or affixed to a table or desk. Sensors may monitor physiology or activities of the user and provide feedback regarding the user's response to the presentation. A user may input instructions for the system. Based on these instructions, the system may present a multimedia presentation which tends to produce a target physiological state of the user that is specified in the instructions or which tends to maintain a current physiological state of the user. The system may employ a control space to control the presentation. This control space may have axes that correspond to how a user perceives multimedia presentations.
    Type: Application
    Filed: August 19, 2019
    Publication date: February 20, 2020
    Inventors: Nan Zhao, Joseph Paradiso, Yoav Reches
  • Publication number: 20200029405
    Abstract: Camera images may be analyzed to identify perceptual control axes for a lighting system. These control axes may correspond to how human users perceive lighting scenes. A camera may capture high-dynamic-range images of a new room under various lighting conditions. Dimensionality-reduction may be applied to the images to create an image-based map. In this map, each datapoint may correspond to one of the images. The map may be transformed to align it with how human users perceive lighting scenes in the new room. The transformed map may be employed as a control space for the lighting system. In some cases, the map is created without gathering user ratings of lighting scenes for the new room. In other cases, creating the map involves gathering user ratings for as few as three lighting scenes for the new room.
    Type: Application
    Filed: September 26, 2019
    Publication date: January 23, 2020
    Inventors: Nan Zhao, Joseph Paradiso
  • Patent number: 10477641
    Abstract: Camera images may be analyzed to identify perceptual control axes for a lighting system. These control axes may correspond to how human users perceive lighting scenes. A camera may capture high-dynamic-range images of a new room under various lighting conditions. Dimensionality-reduction may be applied to the images to create an image-based map. In this map, each datapoint may correspond to one of the images. The map may be transformed to align it with how human users perceive lighting scenes in the new room. The transformed map may be employed as a control space for the lighting system. In some cases, the map is created without gathering user ratings of lighting scenes for the new room. In other cases, creating the map involves gathering user ratings for as few as three lighting scenes for the new room.
    Type: Grant
    Filed: September 24, 2018
    Date of Patent: November 12, 2019
    Assignee: Massachusetts Institute of Technology
    Inventors: Nan Zhao, Joseph Paradiso
  • Patent number: 10380311
    Abstract: A system for introspection and annotation of electronic design data includes a tracked probe that interacts with an electronic circuit, a tracking system, schematics, design files, or models for the circuit, electronic design system software, and a user interface. The probe has a sensor that tracks the position of the probe within the circuit. The tracking system receives data from the sensor and translates it into coordinates reflecting the location of the probe within the circuit. The software uses the probe coordinates to locate the position of the probe on the circuit schematic, identify the circuit component at the probe location, and display information about the identified feature. The displayed information preferably includes an annotated version of the circuit schematic. The system may include a measurement or instrumentation device, the probe may include at least one parameter measurement device, and the display may include information derived from the parameter measurements.
    Type: Grant
    Filed: August 9, 2016
    Date of Patent: August 13, 2019
    Assignee: Massachusetts Institute of Technology
    Inventors: Pragun Goyal, Joseph A. Paradiso
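    The abstract above describes translating tracked probe coordinates into the circuit component at that location and then displaying its schematic information. A nearest-component lookup conveys the core mapping step; the board layout and annotation records below are invented for illustration.
    ```python
    import math

    # Hypothetical board layout: component reference -> (x, y) in mm.
    LAYOUT = {
        "R1": (10.0, 5.0),
        "C3": (22.5, 5.0),
        "U2": (40.0, 12.0),
    }

    NOTES = {
        "R1": "10k pull-up on the reset line",
        "C3": "100nF decoupling cap",
        "U2": "microcontroller",
    }

    def component_at(probe_xy, tolerance_mm=3.0):
        """Return the reference of the component nearest the probe tip,
        or None if the probe is not within tolerance of anything."""
        ref, dist = min(
            ((name, math.dist(pos, probe_xy)) for name, pos in LAYOUT.items()),
            key=lambda item: item[1],
        )
        return ref if dist <= tolerance_mm else None

    if __name__ == "__main__":
        ref = component_at((23.0, 5.5))
        print(ref, "->", NOTES.get(ref, "no annotation"))   # C3 -> 100nF ...
    ```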
  • Patent number: 10335061
    Abstract: A sensor system detects hand-to-mouth behavior. The system includes an electrical bio-impedance spectrometer and an inertial measurement unit. The sensor system may be worn on the forearm. The sensor system recognizes hand-to-mouth behavior in real time, facilitating monitoring and immediate interventions. An electrode positioning strategy optimizes the device's sensitivity and accuracy. Machine learning algorithms are leveraged to perform the hand-to-mouth detection. A prototype of the sensor system achieves 92% detection accuracy for recurrent usage by a single user and 90% accuracy for users that have not been previously encountered.
    Type: Grant
    Filed: October 22, 2016
    Date of Patent: July 2, 2019
    Assignee: Massachusetts Institute of Technology
    Inventors: Asaph Azaria, Brian Mayton, Joseph Paradiso
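    The abstract above describes fusing bio-impedance and inertial data and applying machine learning to detect hand-to-mouth gestures. The pipeline below is a generic supervised-classification sketch on synthetic features; the feature set, labels, and random-forest choice are assumptions, not the device's actual model.
    ```python
    # Generic feature-plus-classifier sketch; the features, labels, and
    # RandomForest choice are assumptions, not the patented pipeline.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(1)

    # Synthetic training data: each row is [impedance_mag, impedance_phase,
    # accel_pitch, gyro_peak]; label 1 = hand-to-mouth, 0 = other motion.
    X = rng.random((200, 4))
    y = (X[:, 2] > 0.6).astype(int)   # toy rule standing in for real labels

    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(X, y)

    sample = np.array([[0.4, 0.3, 0.8, 0.2]])   # pitch suggests a raised hand
    print("hand-to-mouth" if clf.predict(sample)[0] else "other")
    ```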
  • Publication number: 20190098724
    Abstract: Camera images may be analyzed to identify perceptual control axes for a lighting system. These control axes may correspond to how human users perceive lighting scenes. A camera may capture high-dynamic-range images of a new room under various lighting conditions. Dimensionality-reduction may be applied to the images to create an image-based map. In this map, each datapoint may correspond to one of the images. The map may be transformed to align it with how human users perceive lighting scenes in the new room. The transformed map may be employed as a control space for the lighting system. In some cases, the map is created without gathering user ratings of lighting scenes for the new room. In other cases, creating the map involves gathering user ratings for as few as three lighting scenes for the new room.
    Type: Application
    Filed: September 24, 2018
    Publication date: March 28, 2019
    Inventors: Nan Zhao, Joseph Paradiso
  • Publication number: 20180253222
    Abstract: Closed loop control of a multimedia system may be achieved with a control space that includes at least two control axes. In each control axis, a coordinate on the axis may indicate a degree to which a multimedia scene facilitates or is perceived to facilitate a user state or a degree to which the user state is achieved. For example, the two control axes may be focus and restoration. A user may provide input that specifies a target state. The target state may be different than, or the same as, the user's current state. The system may select a scene that has coordinates, in the control space, that are closest, by at least a threshold, to the target state, and present the scene to the user. Sensors in the system may measure a user state that results from presenting the scene. The system may revise the scene's coordinates accordingly.
    Type: Application
    Filed: March 6, 2018
    Publication date: September 6, 2018
    Inventors: Nan Zhao, Asaph Azaria, Joseph Paradiso
  • Patent number: 10005312
    Abstract: In exemplary implementations of this invention, a computer-assisted, handheld machining tool allows even an inexperienced user to carve a complex 3D object, while maintaining artistic freedom to modify the sculpture from an initial CAD design. The tool prevents the user from unintentionally removing material from a volume defined by the CAD design. It does so by slowing or halting spindle rotation as the bit approaches or penetrates the protected volume. The user can override this protective feature. The tool may operate in at least three interaction modes: (i) a static mode in which a static CAD model is used, where the computer assists by preventing the user from damaging the static model; (ii) a dynamic mode where the computer dynamically modifies the CAD model during the sculpting process; and (iii) an autonomous mode where the computer can operate independently of the user, for tasks such as semi-automatic texture rendering.
    Type: Grant
    Filed: April 16, 2016
    Date of Patent: June 26, 2018
    Assignee: Massachusetts Institute of Technology
    Inventors: Amit Zoran, Joseph Paradiso, Roy Shilkrot
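    The abstract above explains that the tool slows or halts spindle rotation as the bit approaches or penetrates the protected CAD volume. The sketch below maps the bit's distance to that boundary onto a spindle-speed scale factor; the zone width and linear ramp are illustrative assumptions rather than the patented control law.
    ```python
    def spindle_scale(distance_mm: float,
                      slow_zone_mm: float = 2.0) -> float:
        """Map the bit's signed distance to the protected volume onto a
        0..1 spindle-speed factor: full speed outside the slow zone,
        a linear ramp inside it, and a halt at or past the boundary."""
        if distance_mm <= 0.0:           # bit at or inside the protected volume
            return 0.0
        if distance_mm >= slow_zone_mm:  # comfortably outside: full speed
            return 1.0
        return distance_mm / slow_zone_mm

    if __name__ == "__main__":
        for d in (5.0, 1.0, 0.0, -0.5):
            print(f"distance {d:+.1f} mm -> spindle x{spindle_scale(d):.2f}")
    ```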
  • Patent number: 9674929
    Abstract: In exemplary embodiments of this invention, one or more I/O devices accept input from a human user. The input is indicative of a value for each control variable in a set of separate control variables. A computer analyzes the input and outputs control signals to specify a set of separate setpoints. Dimmers adjust the intensity or color of a set of luminaires according to the setpoints. The number of separate control variables is much less than the number of separate setpoints. Having a human control a small number of control variables, in order to control a much larger number of separate luminaire setpoints of luminaires, has at least two advantages: control is faster and control is more intuitive. In illustrative implementations, the luminaire setpoints that are being controlled are not functions of each other.
    Type: Grant
    Filed: March 10, 2015
    Date of Patent: June 6, 2017
    Assignee: Massachusetts Institute of Technology
    Inventors: Joseph Paradiso, Matthew Aldrich, Nan Zhao
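    The abstract above describes mapping a small number of user-facing control variables onto a much larger number of independent luminaire setpoints. A fixed linear mapping (a matrix multiply) is one simple way to express that many-from-few relationship; the weights and room layout below are invented for illustration.
    ```python
    import numpy as np

    # Hypothetical mapping from 2 control variables ("brightness", "warmth")
    # to 6 independent luminaire dimmer setpoints. The weights are invented.
    MAPPING = np.array([
        [1.0, 0.1],   # ceiling 1
        [1.0, 0.0],   # ceiling 2
        [0.6, 0.8],   # desk lamp
        [0.4, 0.9],   # floor lamp
        [0.8, 0.3],   # wall wash, left
        [0.8, 0.3],   # wall wash, right
    ])

    def setpoints(controls: np.ndarray) -> np.ndarray:
        """Expand the low-dimensional control vector into per-luminaire
        setpoints, clipped to the 0..1 dimmer range."""
        return np.clip(MAPPING @ controls, 0.0, 1.0)

    if __name__ == "__main__":
        print(setpoints(np.array([0.7, 0.5])))   # six dimmer levels
    ```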
  • Publication number: 20170127979
    Abstract: A sensor system detects hand-to-mouth behavior. The system includes an electrical bio-impedance spectrometer and an inertial measurement unit. The sensor system may be worn on the forearm. The sensor system recognizes hand-to-mouth behavior in real time, facilitating monitoring and immediate interventions. An electrode positioning strategy optimizes the device's sensitivity and accuracy. Machine learning algorithms are leveraged to perform the hand-to-mouth detection. A prototype of the sensor system achieves 92% detection accuracy for recurrent usage by a single user and 90% accuracy for users that have not been previously encountered.
    Type: Application
    Filed: October 22, 2016
    Publication date: May 11, 2017
    Inventors: Asaph Azaria, Brian Mayton, Joseph Paradiso
  • Publication number: 20160350472
    Abstract: A system for introspection and annotation of electronic design data includes a tracked probe that interacts with an electronic circuit, a tracking system, schematics, design files, or models for the circuit, electronic design system software, and a user interface. The probe has a sensor that tracks the position of the probe within the circuit. The tracking system receives data from the sensor and translates it into coordinates reflecting the location of the probe within the circuit. The software uses the probe coordinates to locate the position of the probe on the circuit schematic, identify the circuit component at the probe location, and display information about the identified feature. The displayed information preferably includes an annotated version of the circuit schematic. The system may include a measurement or instrumentation device, the probe may include at least one parameter measurement device, and the display may include information derived from the parameter measurements.
    Type: Application
    Filed: August 9, 2016
    Publication date: December 1, 2016
    Applicant: Massachusetts Institute of Technology
    Inventors: Pragun Goyal, Joseph A. Paradiso
  • Patent number: 9446585
    Abstract: A handheld inkjet printer includes an inkjet print head and a tip. One or more sensors measure the position of points on a curved surface that are physically touched by the tip while the tip is moved relative to the surface. Based on these measurements, a computer generates or modifies a computer model that specifies at least (i) the position of the curved surface, and (ii) a target region of the curved surface on which a pattern is to be printed. In addition, the one or more sensors measure the position and orientation of nozzles in the print head while the handset is moved relative to the surface. The computer also calculates, based on the computer model and these additional measurements, which of the nozzles to fire at different times, such that the pattern is printed on the target region as the handset is moved relative to the surface.
    Type: Grant
    Filed: August 23, 2015
    Date of Patent: September 20, 2016
    Assignee: Massachusetts Institute of Technology
    Inventors: Pragun Goyal, Amit Zoran, Joseph Paradiso
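    The abstract above describes computing, from the tracked pose of the print head and a model of the target region, which nozzles to fire at each moment. The sketch below reduces this to a 2D membership test per nozzle; the circular target region, nozzle count, and nozzle spacing are invented for illustration.
    ```python
    import math

    NOZZLE_COUNT = 12
    NOZZLE_PITCH_MM = 0.5            # spacing along the print head

    def nozzle_positions(head_xy, heading_rad):
        """World-space (x, y) of each nozzle for the current head pose,
        laid out in a line perpendicular to the direction of motion."""
        dx, dy = -math.sin(heading_rad), math.cos(heading_rad)
        x0, y0 = head_xy
        offset0 = -(NOZZLE_COUNT - 1) / 2 * NOZZLE_PITCH_MM
        return [(x0 + (offset0 + i * NOZZLE_PITCH_MM) * dx,
                 y0 + (offset0 + i * NOZZLE_PITCH_MM) * dy)
                for i in range(NOZZLE_COUNT)]

    def in_target(xy, center=(0.0, 0.0), radius_mm=3.0):
        """Membership test for a toy circular target region."""
        return math.dist(xy, center) <= radius_mm

    def nozzles_to_fire(head_xy, heading_rad):
        """Indices of nozzles currently over the target region."""
        return [i for i, pos in enumerate(nozzle_positions(head_xy, heading_rad))
                if in_target(pos)]

    if __name__ == "__main__":
        print(nozzles_to_fire((0.0, 1.0), heading_rad=0.0))
    ```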
  • Publication number: 20160231734
    Abstract: In exemplary implementations of this invention, a computer-assisted, handheld machining tool allows even an inexperienced user to carve a complex 3D object, while maintaining artistic freedom to modify the sculpture from an initial CAD design. The tool prevents the user from unintentionally removing material from a volume defined by the CAD design. It does so by slowing or halting spindle rotation as the bit approaches or penetrates the protected volume. The user can override this protective feature. The tool may operate in at least three interaction modes: (i) a static mode in which a static CAD model is used, where the computer assists by preventing the user from damaging the static model; (ii) a dynamic mode where the computer dynamically modifies the CAD model during the sculpting process; and (iii) an autonomous mode where the computer can operate independently of the user, for tasks such as semi-automatic texture rendering.
    Type: Application
    Filed: April 16, 2016
    Publication date: August 11, 2016
    Inventors: Amit Zoran, Joseph Paradiso, Roy Shilkrot
  • Patent number: 9411010
    Abstract: A system for introspection and annotation of electronic design data includes a tracked probe that interacts with an electronic circuit, a tracking system, schematics, design files, or models for the circuit, electronic design system software, and a user interface. The probe has a sensor that tracks the position of the probe within the circuit. The tracking system receives data from the sensor and translates it into coordinates reflecting the location of the probe within the circuit. The software uses the probe coordinates to locate the position of the probe on the circuit schematic, identify the circuit component at the probe location, and display information about the identified component. The displayed information preferably includes an annotated version of the circuit schematic. The system may include a measurement or instrumentation device, the probe may include at least one parameter measurement device, and the display may include information derived from the parameter measurements.
    Type: Grant
    Filed: October 3, 2014
    Date of Patent: August 9, 2016
    Assignee: Massachusetts Institute of Technology
    Inventors: Pragun Goyal, Joseph A. Paradiso