Patents by Inventor Karan Ahuja
Karan Ahuja has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240369537
Abstract: The invention relates to an in vitro cell culture comprising neuronal cells derived from iPSC, optionally with an NGN2 transgene, astrocytes derived from iPSC with a SOX9 transgene, and oligodendrocytes derived from iPSC with a SOX10 transgene.
Type: Application
Filed: September 20, 2022
Publication date: November 7, 2024
Inventors: Karan Ahuja, Rodrigo Furtado Madeiro da Costa, Cathérine Verfaillie
-
Publication number: 20230385449
Abstract: Systems and processes for managing access to personal data based on a purpose for storing the personal data are provided. In a method for managing personal data access, personal data for a data subject corresponding to a first data category is received, and an operation is executed in a purpose agent to associate one or more purposes to the personal data, where the one or more purposes are assigned to the first data category and include at least a first purpose. The personal data may be stored in a data storage system, and the stored personal data may be designated as being associated with the one or more purposes. Access to the personal data may be controlled based on the one or more purposes.
Type: Application
Filed: July 18, 2022
Publication date: November 30, 2023
Applicant: SAP SE
Inventors: Carsten Pluder, Diane Schmidt, Volker Lehnert, Martina Knoedler, Thorsten Bruckmeier, Philipp Alexander Zikesch, Bernhard Drittler, Matthias Vogel, Katrin Ludwig, Naved Ahmed, Saritha Palli, Shweta Sureshchandra Gupta, Arun Kumar Gowd, Dev Karan Ahuja, Shwetha H S
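For illustration only, a minimal Python sketch of the kind of purpose-based access control this abstract describes: a purpose agent assigns purposes to a data category, stored records carry those purposes, and access is gated on a matching purpose. All class names, categories, and purposes below are invented, not taken from the filing.

```python
# Hypothetical sketch of purpose-based personal-data access control (not SAP's implementation).

class PurposeAgent:
    def __init__(self, category_purposes):
        # category_purposes: mapping of data category -> set of purposes assigned to it
        self.category_purposes = category_purposes

    def assign_purposes(self, category):
        return set(self.category_purposes.get(category, set()))


class DataStore:
    def __init__(self):
        self.records = {}  # record_id -> (personal_data, associated purposes)

    def store(self, record_id, personal_data, purposes):
        self.records[record_id] = (personal_data, purposes)

    def access(self, record_id, requested_purpose):
        data, purposes = self.records[record_id]
        if requested_purpose not in purposes:
            raise PermissionError(f"purpose '{requested_purpose}' is not associated with this record")
        return data


agent = PurposeAgent({"contact_details": {"billing", "support"}})
store = DataStore()
store.store("rec-1", {"email": "subject@example.com"}, agent.assign_purposes("contact_details"))
print(store.access("rec-1", "billing"))   # permitted: purpose matches
# store.access("rec-1", "marketing")      # would raise PermissionError
```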
-
Patent number: 11763798
Abstract: Embodiments are provided to recognize features and activities from an audio signal. In one embodiment, a model is generated from sound effect data, which is augmented and projected into an audio domain to efficiently form a training dataset. Sound effect data is data that has been artificially created or derived from enhanced sounds or sound processes to provide a more accurate baseline of sound data than traditional training data. The sound effect data is augmented to create multiple variants that broaden the sound effect data. The augmented sound effects are projected into various audio domains, such as indoor, outdoor, and urban, by mixing in background sounds consistent with those domains. The model can be installed on any computing device, such as a laptop, smartphone, or other device, which then recognizes features and activities from an audio signal based on the model without the need for in-situ training.
Type: Grant
Filed: July 15, 2021
Date of Patent: September 19, 2023
Assignee: Carnegie Mellon University
Inventors: Gierad Laput, Karan Ahuja, Mayank Goel, Christopher Harrison
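A rough Python sketch of the augment-then-project idea in this abstract: a clean sound-effect clip is expanded into gain and time-stretch variants, and each variant is mixed with domain-consistent background noise. The waveforms, constants, and SNR target are invented stand-ins, not the patented pipeline.

```python
# Illustrative sketch of sound-effect augmentation and audio-domain projection (assumptions only).
import numpy as np

def augment(clip):
    """Create simple variants of a 1-D waveform: amplitude scaling and time stretching."""
    variants = []
    for gain in (0.5, 1.0, 1.5):
        for stretch in (0.9, 1.0, 1.1):
            idx = np.linspace(0, len(clip) - 1, int(len(clip) * stretch))
            variants.append(gain * np.interp(idx, np.arange(len(clip)), clip))
    return variants

def project(variant, backgrounds, snr_db=10.0):
    """Mix one variant with each domain's background noise at a target signal-to-noise ratio."""
    mixes = {}
    for domain, bg in backgrounds.items():
        bg = bg[: len(variant)]
        sig_p, bg_p = np.mean(variant ** 2), np.mean(bg ** 2) + 1e-12
        scale = np.sqrt(sig_p / (bg_p * 10 ** (snr_db / 10)))
        mixes[domain] = variant + scale * bg
    return mixes

rng = np.random.default_rng(0)
clip = np.sin(2 * np.pi * 440 * np.linspace(0, 1, 16000))          # stand-in "sound effect"
backgrounds = {d: rng.normal(0, 0.1, 20000) for d in ("indoor", "outdoor", "urban")}
training_examples = [project(v, backgrounds) for v in augment(clip)]
print(len(training_examples), "variants x", len(backgrounds), "audio domains")
```

A real system would train a recognition model on these mixed examples; here only the dataset-construction step is sketched.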
-
Patent number: 11604133
Abstract: This disclosure provides methods and systems for classifying biological particles, e.g., blood cells, microbes, circulating tumor cells (CTCs). Using impedance flow cytometry, such as multi-frequency impedance cytometry, in conjunction with supervised machine learning, the disclosed methods and systems demonstrated improved accuracy in classifying biological particles.
Type: Grant
Filed: April 17, 2020
Date of Patent: March 14, 2023
Assignee: Rutgers, The State University of New Jersey
Inventors: Mehdi Javanmard, Karan Ahuja, Jianye Sui, Joseph R. Bertino
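A hedged sketch of the general approach named in the abstract: treat per-particle impedance features at several excitation frequencies as a feature vector and train a supervised classifier. The data below is synthetic and the feature layout is an assumption, not the disclosed method.

```python
# Illustrative only: supervised classification of multi-frequency impedance features (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_per_class = 200
# Assumed features: impedance magnitude and phase at three frequencies -> 6 values per particle.
class_a = rng.normal(loc=[1.0, 0.2, 0.8, 0.1, 0.6, 0.05], scale=0.05, size=(n_per_class, 6))
class_b = rng.normal(loc=[1.2, 0.3, 0.7, 0.2, 0.5, 0.10], scale=0.05, size=(n_per_class, 6))
X = np.vstack([class_a, class_b])
y = np.array([0] * n_per_class + [1] * n_per_class)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```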
-
Publication number: 20220005454
Abstract: Embodiments are provided to recognize features and activities from an audio signal. In one embodiment, a model is generated from sound effect data, which is augmented and projected into an audio domain to efficiently form a training dataset. Sound effect data is data that has been artificially created or derived from enhanced sounds or sound processes to provide a more accurate baseline of sound data than traditional training data. The sound effect data is augmented to create multiple variants that broaden the sound effect data. The augmented sound effects are projected into various audio domains, such as indoor, outdoor, and urban, by mixing in background sounds consistent with those domains. The model can be installed on any computing device, such as a laptop, smartphone, or other device, which then recognizes features and activities from an audio signal based on the model without the need for in-situ training.
Type: Application
Filed: July 15, 2021
Publication date: January 6, 2022
Inventors: Gierad Laput, Karan Ahuja, Mayank Goel, Christopher Harrison
-
Patent number: 11120459
Abstract: A system and method utilize mass user gaze fixation behavior with respect to products and product racks (using camera-based inputs and gaze-tracking algorithms) together with the financial profitability of each product/product type; generate gaze-based heat maps indicating blind spots and eye gaze points at different positions of the racks/shelves and indicating product placement positions on the heat map; and advise product placement in the retail store to optimize financial profitability by performing an iterative refinement of product placements over a period for the product/similar products. The system translates the mass eye gaze of retail store customers into financial profitability via product placement optimization: it iteratively runs a method to optimize product placements, generates gaze-based heat maps, and performs blind spot analysis to avoid product placements at certain locations and place products at others, e.g., by running a decision module on consumer gaze pattern data.
Type: Grant
Filed: March 1, 2019
Date of Patent: September 14, 2021
Assignee: International Business Machines Corporation
Inventors: Roman Vaculin, Utkarsh Dwivedi, Karan Ahuja, Seema Nagar, Kuntal Dey
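A minimal sketch of the heat-map-plus-profitability idea: accumulate fixations into a shelf grid, treat low-attention cells as blind spots, and greedily place the most profitable products on the most-viewed cells. The grid size, threshold, and profit figures are invented for illustration.

```python
# Illustrative only: gaze heat map and profitability-driven shelf placement (assumed data).
import numpy as np

def gaze_heatmap(fixations, shape):
    """fixations: list of (row, col) shelf cells customers fixated on."""
    heat = np.zeros(shape)
    for r, c in fixations:
        heat[r, c] += 1
    return heat

def place_products(heat, product_profits, blind_spot_quantile=0.2):
    """Assign high-profit products to high-attention cells; skip blind-spot cells."""
    threshold = np.quantile(heat, blind_spot_quantile)
    cells = [(r, c) for r in range(heat.shape[0]) for c in range(heat.shape[1])
             if heat[r, c] > threshold]
    cells.sort(key=lambda rc: heat[rc], reverse=True)
    ranked_products = sorted(product_profits, key=product_profits.get, reverse=True)
    return dict(zip(ranked_products, cells))

fixations = [(0, 1), (0, 1), (1, 2), (2, 0), (0, 1), (1, 2)]
heat = gaze_heatmap(fixations, shape=(3, 3))
placement = place_products(heat, {"coffee": 5.0, "tea": 3.0, "biscuits": 1.5})
print(placement)   # the most profitable product lands on the most-fixated cell
```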
-
Patent number: 11119572
Abstract: Selective display of available objects based on eye-gaze characteristics with respect to currently displayed objects and certain attributes of the currently displayed objects. Similarity-driven selective display compares attributes of a displayed object receiving a fixed gaze to attributes of available objects and displays similar objects in addition to the displayed object receiving the fixed gaze.
Type: Grant
Filed: November 6, 2019
Date of Patent: September 14, 2021
Assignee: International Business Machines Corporation
Inventors: Karan Ahuja, Kuntal Dey, Sougata Mukherjea, Seema Nagar
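A small sketch of the similarity-driven step: compare the attribute vector of the fixated object to not-yet-displayed objects and surface the closest matches. The attribute encoding and cosine-similarity choice are assumptions, not the patented algorithm.

```python
# Illustrative only: pick objects most similar to the one receiving a fixed gaze (assumed attributes).
import numpy as np

def similar_objects(focused_attrs, candidates, top_k=2):
    """candidates: dict of object name -> attribute vector; returns the top_k most similar names."""
    focused = np.asarray(focused_attrs, dtype=float)

    def cosine(v):
        v = np.asarray(v, dtype=float)
        return float(v @ focused / (np.linalg.norm(v) * np.linalg.norm(focused) + 1e-12))

    return sorted(candidates, key=lambda name: cosine(candidates[name]), reverse=True)[:top_k]

# Attribute vectors, e.g. (price band, color code, category id) -- purely illustrative.
focused = [2.0, 1.0, 3.0]
available = {"obj_a": [2.1, 1.0, 3.0], "obj_b": [0.5, 4.0, 1.0], "obj_c": [2.0, 0.9, 2.8]}
print(similar_objects(focused, available))   # objects most like the one being fixated on
```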
-
Patent number: 11069334
Abstract: Embodiments are provided to recognize features and activities from an audio signal. In one embodiment, a model is generated from sound effect data, which is augmented and projected into an audio domain to efficiently form a training dataset. Sound effect data is data that has been artificially created or derived from enhanced sounds or sound processes to provide a more accurate baseline of sound data than traditional training data. The sound effect data is augmented to create multiple variants that broaden the sound effect data. The augmented sound effects are projected into various audio domains, such as indoor, outdoor, and urban, by mixing in background sounds consistent with those domains. The model can be installed on any computing device, such as a laptop, smartphone, or other device, which then recognizes features and activities from an audio signal based on the model without the need for in-situ training.
Type: Grant
Filed: August 13, 2019
Date of Patent: July 20, 2021
Assignee: Carnegie Mellon University
Inventors: Gierad Laput, Karan Ahuja, Mayank Goel, Christopher Harrison
-
Patent number: 11055891
Abstract: Examples of the present disclosure describe systems and methods for providing real-time motion styling in virtual reality (VR), augmented reality (AR), and/or mixed reality (MR) environments. In aspects, input data corresponding to user interaction with a VR, AR, or MR environment may be received. The input data may be featurized to generate a feature set. The feature set may be compared to a set of stored motion data comprising motion capture data representing one or more motion styles for executing an action or activity. Based on the comparison, the feature set may be matched to feature data for one or more motion styles in the stored motion data. The one or more motion styles may then be executed by a virtual avatar or a virtual object in the VR/AR/MR environment.
Type: Grant
Filed: March 10, 2020
Date of Patent: July 6, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Eyal Ofek, Mar Gonzalez Franco, Andrew D. Wilson, Karan Ahuja, Christian Holz
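A hedged sketch of the featurize-and-match step: summarize a short window of user motion with simple statistics and pick the nearest style from a library of pre-featurized motion-capture clips. The feature set and nearest-neighbor matching are assumptions for illustration, not Microsoft's implementation.

```python
# Illustrative only: match live motion features to stored motion-style features (assumed data).
import numpy as np

def featurize(window):
    """Very simple feature set for a (frames x joints) velocity window: per-joint mean and std."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def match_style(user_window, style_library):
    """style_library: dict of style name -> representative (frames x joints) window."""
    user_feat = featurize(user_window)
    distances = {style: np.linalg.norm(featurize(clip) - user_feat)
                 for style, clip in style_library.items()}
    return min(distances, key=distances.get)

rng = np.random.default_rng(1)
library = {"energetic": rng.normal(0.0, 1.0, (30, 8)),
           "relaxed": rng.normal(0.0, 0.2, (30, 8))}
live_input = rng.normal(0.0, 0.9, (30, 8))   # stand-in for featurized controller/tracker data
print("matched style:", match_style(live_input, library))
```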
-
Publication number: 20210132691
Abstract: Selective display of available objects based on eye-gaze characteristics with respect to currently displayed objects and certain attributes of the currently displayed objects. Similarity-driven selective display compares attributes of a displayed object receiving a fixed gaze to attributes of available objects and displays similar objects in addition to the displayed object receiving the fixed gaze.
Type: Application
Filed: November 6, 2019
Publication date: May 6, 2021
Inventors: Karan Ahuja, Kuntal Dey, Sougata Mukherjea, Seema Nagar
-
Patent number: 10963914
Abstract: An advertisement remarketing method, system, and non-transitory computer readable medium include a gaze tracking circuit configured to track a gaze of a user relative to a plurality of advertisements displayed on content-serving devices to determine a focus level of the user with respect to each of the plurality of advertisements, a list creating circuit configured to create a plurality of ranked lists ranking the focus level of the user for each of the plurality of advertisements, each of the plurality of ranked lists corresponding to a different content-serving device of the content-serving devices, and a remarketing circuit configured to remarket advertisements to the user on a user device based on the plurality of ranked lists.
Type: Grant
Filed: June 13, 2016
Date of Patent: March 30, 2021
Assignee: International Business Machines Corporation
Inventors: Karan Ahuja, Kuntal Dey, Seema Nagar, Roman Vaculin
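A small sketch of the per-device ranked lists this abstract mentions: rank ads by measured focus level on each content-serving device, then combine the per-device ranks to order remarketing candidates. The focus scores and rank-combination rule are invented for illustration.

```python
# Illustrative only: per-device focus rankings combined into a remarketing order (assumed scores).
from collections import defaultdict

focus_by_device = {
    "lobby_screen":   {"ad_shoes": 0.9, "ad_phone": 0.4, "ad_travel": 0.1},
    "elevator_panel": {"ad_shoes": 0.7, "ad_travel": 0.6, "ad_phone": 0.2},
}

# One ranked list per content-serving device (best-focused ad first).
ranked_lists = {device: sorted(scores, key=scores.get, reverse=True)
                for device, scores in focus_by_device.items()}

# Combine rank positions across devices to decide what to remarket on the user's own device.
rank_positions = defaultdict(list)
for ranking in ranked_lists.values():
    for position, ad in enumerate(ranking):
        rank_positions[ad].append(position)

remarket_order = sorted(rank_positions, key=lambda ad: sum(rank_positions[ad]) / len(rank_positions[ad]))
print(ranked_lists)
print("remarketing order:", remarket_order)
```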
-
Patent number: 10955158
Abstract: A venue occupant comfort system comprises a processor that executes computer-executable components stored in memory. A plurality of sensors sense ambient conditions associated with exterior and interior conditions of a venue. A context component infers or determines the context of an occupant of the venue. A crowd estimation component infers, based at least in part on mining social networks, the size of the crowd expected at the venue. A comfort model component, implicitly and explicitly trained on occupant comfort related data, analyzes information from the plurality of sensors, the crowd estimation component, and the context component. A comfort controller adjusts environmental conditions of the venue based at least in part on the output of the comfort model component. The adjustments to the venue environment can optionally be differentiated by zone.
Type: Grant
Filed: January 3, 2020
Date of Patent: March 23, 2021
Assignee: International Business Machines Corporation
Inventors: Karan Ahuja, Heena Bansal, Kuntal Dey, Seema Nagar, Roman Vaculin
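A minimal sketch of how such components might fit together: a toy crowd estimate from social posts, a comfort model combining sensors, crowd size, and occupant context into a target setpoint, and a controller nudging a zone toward it. Every coefficient and function below is an assumption, not the claimed system.

```python
# Illustrative only: crowd estimate + comfort model + zone controller (invented coefficients).

def estimate_crowd(social_posts, keyword="concert"):
    """Toy stand-in for social-network mining: count posts mentioning the event."""
    return 50 + 10 * sum(keyword in post.lower() for post in social_posts)

def comfort_model(outdoor_temp_c, indoor_temp_c, crowd_size, occupant_activity):
    """Predict a comfortable setpoint in degrees C; all offsets here are illustrative."""
    activity_offset = {"seated": 0.0, "walking": -0.5, "dancing": -1.5}[occupant_activity]
    crowd_offset = -0.01 * crowd_size             # more people -> more body heat -> cool further
    drift = 0.1 * (outdoor_temp_c - indoor_temp_c)
    return 22.0 + activity_offset + crowd_offset + drift

def adjust_zone(current_setpoint, target, max_step=0.5):
    """Comfort controller: move the zone setpoint toward the target in bounded steps."""
    delta = max(-max_step, min(max_step, target - current_setpoint))
    return current_setpoint + delta

crowd = estimate_crowd(["Who's going to the concert tonight?", "concert!!", "nice weather"])
target = comfort_model(outdoor_temp_c=30.0, indoor_temp_c=24.0,
                       crowd_size=crowd, occupant_activity="dancing")
print("new zone setpoint:", adjust_zone(current_setpoint=23.0, target=target))
```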
-
Publication number: 20200357182
Abstract: A system and method using light sources as spatial anchors is provided. Augmented reality (AR) requires precise and instant overlay of digital information onto everyday objects. Embodiments disclosed herein provide a new method for displaying spatially-anchored data, also referred to as LightAnchors. LightAnchors takes advantage of pervasive point lights, such as light emitting diodes (LEDs) and light bulbs, for both in-view anchoring and data transmission. These lights are blinked at high speed to encode data. An example embodiment includes an application that runs on a mobile operating system without any hardware or software modifications, which has been demonstrated to perform well under various use cases.
Type: Application
Filed: May 6, 2020
Publication date: November 12, 2020
Inventors: Karan Ahuja, Sujeath Pareddy, Bo Robert Xiao, Christopher Harrison, Mayank Goel
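A toy sketch of the blink-to-data idea: a point light transmits a preamble followed by data bits, one bit per camera frame, and a receiver thresholds the per-frame brightness of that light, locates the preamble, and reads the payload. The preamble, one-bit-per-frame framing, and threshold are assumptions; camera synchronization and error handling are omitted.

```python
# Illustrative only: decoding data blinked by a point light from per-frame brightness values.

PREAMBLE = [1, 0, 1, 0, 1, 1, 0, 0]

def encode(payload_bits):
    """Bit sequence the light would blink: preamble then payload."""
    return PREAMBLE + list(payload_bits)

def decode(brightness_per_frame, payload_len, threshold=0.5):
    """Threshold brightness to bits, find the preamble, and return the payload bits."""
    bits = [1 if b > threshold else 0 for b in brightness_per_frame]
    for i in range(len(bits) - len(PREAMBLE) - payload_len + 1):
        if bits[i:i + len(PREAMBLE)] == PREAMBLE:
            return bits[i + len(PREAMBLE): i + len(PREAMBLE) + payload_len]
    return None

payload = [1, 1, 0, 1]
frames = [0.1, 0.2] + [0.9 if bit else 0.1 for bit in encode(payload)] + [0.1]
print(decode(frames, payload_len=4))   # -> [1, 1, 0, 1]
```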
-
Patent number: 10831270
Abstract: A gaze tracking system, method, and computer product for tracking an eye gaze on a screen of a device including a single monocular camera, the system including measuring a rotation of a hinged plane of a display screen with respect to the eye gaze, combining the rotation with a three-dimensional movement of the camera, a position of the camera being constant with respect to the display screen, and estimating a point of gaze localization on the display screen using the single monocular camera as the input, in absence of a sensor, and without performing a display screen calibration.
Type: Grant
Filed: October 15, 2019
Date of Patent: November 10, 2020
Assignee: International Business Machines Corporation
Inventors: Karan Ahuja, Kuntal Dey, Seema Nagar, Roman Vaculin
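A hedged geometric sketch of one way the hinge rotation can enter such an estimate: rotate a camera-frame gaze ray by the hinge angle into a screen-plane frame and intersect it with the screen plane. The axis conventions and numbers are assumptions for illustration, not the patented estimator.

```python
# Illustrative only: intersect a hinge-rotated gaze ray with the screen plane z = 0 (assumed geometry).
import numpy as np

def hinge_rotation(hinge_angle_deg):
    """Rotation about the x-axis by the screen's hinge angle."""
    a = np.radians(hinge_angle_deg)
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a),  np.cos(a)]])

def point_of_gaze(gaze_origin_cam, gaze_dir_cam, hinge_angle_deg):
    """Rotate the gaze ray from the camera frame and intersect it with the screen plane."""
    R = hinge_rotation(hinge_angle_deg)
    origin = R @ np.asarray(gaze_origin_cam, dtype=float)
    direction = R @ np.asarray(gaze_dir_cam, dtype=float)
    t = -origin[2] / direction[2]            # ray parameter where it hits z = 0
    return origin + t * direction

# Eye roughly 40 cm in front of the camera, looking slightly down and toward the screen.
print(point_of_gaze(gaze_origin_cam=[0.0, 0.05, 0.40],
                    gaze_dir_cam=[0.02, -0.10, -1.0],
                    hinge_angle_deg=15.0))
```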
-
Publication number: 20200333235
Abstract: This disclosure provides methods and systems for classifying biological particles, e.g., blood cells, microbes, circulating tumor cells (CTCs). Using impedance flow cytometry, such as multi-frequency impedance cytometry, in conjunction with supervised machine learning, the disclosed methods and systems demonstrated improved accuracy in classifying biological particles.
Type: Application
Filed: April 17, 2020
Publication date: October 22, 2020
Applicant: Rutgers, The State University of New Jersey
Inventors: Mehdi Javanmard, Karan Ahuja, Jianye Sui, Joseph R. Bertino
-
Patent number: 10776827
Abstract: A location-based advertisement method, system, and non-transitory computer readable medium include a location-based advertisement system, including a gaze tracking circuit configured to track a gaze of a user relative to a plurality of advertisements displayed on content-serving devices to determine a focus level of the user with respect to each of the plurality of advertisements, a location detecting circuit configured to detect when a user is within a predetermined proximity of a location of a business having products related to the plurality of advertisements displayed on the content-serving devices, and an advertisement sending circuit configured to send a notification to the user based on the user being within the predetermined proximity of the location of the business and the plurality of ranked lists ranking the focus level of the user.
Type: Grant
Filed: June 13, 2016
Date of Patent: September 15, 2020
Assignee: International Business Machines Corporation
Inventors: Karan Ahuja, Kuntal Dey, Seema Nagar, Roman Vaculin
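A brief sketch of the proximity trigger described here: check whether the user is within a radius of a business whose products match the ads they focused on, and send a notification if so. The distance approximation, radius, coordinates, and product matching are all invented for illustration.

```python
# Illustrative only: proximity-gated notification for focused-on ads (invented data).
import math

def within_radius(user_latlon, store_latlon, radius_m=150.0):
    """Flat-earth distance check, adequate for short ranges."""
    lat1, lon1 = map(math.radians, user_latlon)
    lat2, lon2 = map(math.radians, store_latlon)
    mean_lat = (lat1 + lat2) / 2
    dx = (lon2 - lon1) * math.cos(mean_lat) * 6_371_000
    dy = (lat2 - lat1) * 6_371_000
    return math.hypot(dx, dy) <= radius_m

ranked_ads = ["ad_shoes", "ad_phone", "ad_travel"]          # highest focus level first
stores = {"ShoeMart": {"location": (40.4433, -79.9436), "products": {"ad_shoes"}}}

user_location = (40.4440, -79.9440)
for name, info in stores.items():
    if within_radius(user_location, info["location"]) and info["products"] & set(ranked_ads[:2]):
        print(f"Notify user: {name} nearby carries products from your top-focused ads")
```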
-
Publication number: 20200279283
Abstract: A system and method utilize mass user gaze fixation behavior with respect to products and product racks (using camera-based inputs and gaze-tracking algorithms) together with the financial profitability of each product/product type; generate gaze-based heat maps indicating blind spots and eye gaze points at different positions of the racks/shelves and indicating product placement positions on the heat map; and advise product placement in the retail store to optimize financial profitability by performing an iterative refinement of product placements over a period for the product/similar products. The system translates the mass eye gaze of retail store customers into financial profitability via product placement optimization: it iteratively runs a method to optimize product placements, generates gaze-based heat maps, and performs blind spot analysis to avoid product placements at certain locations and place products at others, e.g., by running a decision module on consumer gaze pattern data.
Type: Application
Filed: March 1, 2019
Publication date: September 3, 2020
Inventors: Roman Vaculin, Utkarsh Dwivedi, Karan Ahuja, Seema Nagar, Kuntal Dey
-
Patent number: 10747859
Abstract: A humanness validation method, system, and computer program product include generating a sequence of actions including gaze actions and emotional state actions, instructing an access requestor to perform the actions in the sequence one-by-one, and granting access to the access requestor if the total error rate across the actions performed one-by-one in the sequence is less than a predefined threshold value.
Type: Grant
Filed: January 6, 2017
Date of Patent: August 18, 2020
Assignee: International Business Machines Corporation
Inventors: Karan Ahuja, Kuntal Dey, Utkarsh Dwivedi, Seema Nagar, Roman Vaculin
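A minimal sketch of the challenge-and-check flow: issue a random sequence of gaze and emotional-state actions, compare the recognized responses against the instructions, and grant access if the error rate stays below a threshold. The action vocabulary, threshold, and stubbed recognition are assumptions, not the patented method.

```python
# Illustrative only: humanness check from an instructed gaze/emotion action sequence (assumed values).
import random

GAZE_ACTIONS = ["look_left", "look_right", "look_up", "look_down"]
EMOTION_ACTIONS = ["smile", "neutral", "surprised"]

def generate_sequence(length=6, rng=random):
    return [rng.choice(GAZE_ACTIONS + EMOTION_ACTIONS) for _ in range(length)]

def validate(instructed, performed, max_error_rate=0.2):
    """Grant access if the fraction of mismatched or missing actions is below the threshold."""
    errors = sum(1 for want, got in zip(instructed, performed) if want != got)
    errors += abs(len(instructed) - len(performed))
    return (errors / len(instructed)) < max_error_rate

sequence = generate_sequence()
performed = list(sequence)                 # stand-in for actions recognized from camera frames
performed[-1] = "look_left" if performed[-1] != "look_left" else "smile"   # one deliberate mistake
print("instructed:", sequence)
print("access granted:", validate(sequence, performed))   # 1/6 errors is below the 0.2 threshold
```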
-
Patent number: 10675939
Abstract: Methods, systems, and computer program products for pre-cooling and pre-heating vehicles using crowd estimation techniques are provided herein. A computer-implemented method includes estimating the number of individuals having travelled in a vehicle over sub-routes of a fixed route, learning temporal patterns pertaining to the number of individuals in the vehicle across each of the sub-routes, predicting the number of individuals that will be in the vehicle during the sub-routes based on the patterns, computing an expected amount of time for the vehicle to complete a sub-route and arrive at a location that commences a subsequent sub-route, determining an amount by which the energy required to maintain a temperature range in the vehicle is to be modified prior to the vehicle reaching the location based on said predicting and said computing, and utilizing energy to maintain the temperature range, based on said determining, prior to the vehicle reaching the location.
Type: Grant
Filed: January 17, 2017
Date of Patent: June 9, 2020
Assignee: International Business Machines Corporation
Inventors: Karan Ahuja, Heena Bansal, Kuntal Dey, Seema Nagar
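A toy sketch of the steps listed in the abstract: predict occupancy for the next sub-route from historical counts, estimate the lead time until the vehicle reaches its start, and derive the extra energy to apply beforehand. The history, speed, and energy constants are assumptions, not the claimed method.

```python
# Illustrative only: occupancy prediction, arrival-time estimate, and pre-cooling energy budget.
from statistics import mean

history = {                     # sub-route -> passenger counts observed on past runs
    "A-B": [12, 15, 14, 13],
    "B-C": [30, 28, 33, 31],
}

def predict_occupancy(sub_route):
    """Simple stand-in for learned temporal patterns: average of past counts."""
    return mean(history[sub_route])

def time_to_location(distance_km, avg_speed_kmh):
    """Seconds until the vehicle reaches the location that commences the next sub-route."""
    return distance_km / avg_speed_kmh * 3600.0

def precooling_energy(predicted_occupancy, baseline_occupancy=10, joules_per_person=1500.0):
    """Extra energy budgeted to hold the temperature range once more passengers board."""
    extra_people = max(0.0, predicted_occupancy - baseline_occupancy)
    return extra_people * joules_per_person

occupancy = predict_occupancy("B-C")
lead_time_s = time_to_location(distance_km=2.5, avg_speed_kmh=30.0)
energy_j = precooling_energy(occupancy)
print(f"start pre-cooling {lead_time_s:.0f} s ahead, budgeting {energy_j:.0f} J extra")
```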
-
Patent number: 10665119
Abstract: Techniques are provided for creating presentation notes based upon gaze tracking information associated with observers of a presentation. In one example, a computer-implemented method comprises: obtaining, by a system operatively coupled to a processor, gaze information associated with observers of a presentation; determining, by the system, respective content clarity scores for content elements of the presentation based on a content clarity function; selecting, by the system, respective content from one or more content sources for the content elements based on the respective content clarity scores; and generating, by the system, presentation notes based on the presentation and the selected respective content for the content elements of the presentation.
Type: Grant
Filed: December 14, 2017
Date of Patent: May 26, 2020
Assignee: International Business Machines Corporation
Inventors: Karan Ahuja, Kuntal Dey, Utkarsh Dwivedi, Seema Nagar, Roman Vaculin
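A hedged sketch of the pipeline named in the abstract: score each content element for clarity from observer gaze statistics, pull supplementary content for low-clarity elements, and assemble notes. The clarity function, thresholds, and content source below are invented stand-ins, not the patented method.

```python
# Illustrative only: gaze-derived clarity scores driving presentation-note generation (assumed values).

def clarity_score(dwell_seconds, revisits):
    """Lower score = less clear; an element stared at and revisited a lot scores low."""
    return 1.0 / (1.0 + 0.1 * dwell_seconds + 0.3 * revisits)

def generate_notes(elements, gaze_stats, content_source, clarity_threshold=0.5):
    notes = []
    for element in elements:
        dwell, revisits = gaze_stats[element]
        if clarity_score(dwell, revisits) < clarity_threshold:
            extra = content_source.get(element, "")
            notes.append(f"{element}: needs clarification. {extra}")
        else:
            notes.append(f"{element}: clear as presented.")
    return "\n".join(notes)

elements = ["definition of entropy", "summary chart"]
gaze_stats = {"definition of entropy": (25.0, 4), "summary chart": (3.0, 0)}
content_source = {"definition of entropy": "Add an intuitive example comparing coin flips."}
print(generate_notes(elements, gaze_stats, content_source))
```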