Patents by Inventor Vaishnavi A. Sashikanth

Vaishnavi A. Sashikanth has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11775833
    Abstract: Techniques herein train a multilayer perceptron, sparsify edges of a graph such as the perceptron, and store edges and vertices of the graph. Each edge has a weight. A computer sparsifies perceptron edges. The computer performs a forward-backward pass on the perceptron to calculate a sparse Hessian matrix. Based on that Hessian, the computer performs quasi-Newton perceptron optimization. The computer repeats this until convergence. The computer stores edges in an array and vertices in another array. Each edge has a weight and input and output indices. Each vertex has input and output indices. The computer inserts each edge into an input linked list based on its weight. Each link of the input linked list has the next input index of an edge. The computer inserts each edge into an output linked list based on its weight. Each link of the output linked list comprises the next output index of an edge.
    Type: Grant
    Filed: October 3, 2019
    Date of Patent: October 3, 2023
    Assignee: Oracle International Corporation
    Inventors: Dmitry Golovashkin, Uladzislau Sharanhovich, Vaishnavi Sashikanth
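
The edge and vertex storage described in the abstract of US 11,775,833 lends itself to a short illustration. The sketch below is one plausible reading of that layout, not the patent's own code: edges sit in one array, vertices in another, and each vertex's incoming and outgoing edge lists are threaded through the edge array by index, kept ordered by weight so low-weight edges are easy to prune during sparsification. Class and field names such as SparseGraph, first_out, and next_in are my own assumptions.

```python
# One plausible reading of the edge/vertex storage in US 11,775,833.
# All class and field names here are my own assumptions, not the patent's.
from dataclasses import dataclass

NIL = -1  # sentinel index meaning "end of list"

@dataclass
class Edge:
    weight: float
    src: int             # index of the source vertex
    dst: int             # index of the destination vertex
    next_out: int = NIL  # next edge leaving the same source (the "output" link)
    next_in: int = NIL   # next edge entering the same destination (the "input" link)

@dataclass
class Vertex:
    first_out: int = NIL  # head of the weight-ordered list of outgoing edges
    first_in: int = NIL   # head of the weight-ordered list of incoming edges

class SparseGraph:
    """Edges in one array, vertices in another; per-vertex lists are
    threaded through the edge array by index and ordered by |weight|."""

    def __init__(self, num_vertices: int):
        self.vertices = [Vertex() for _ in range(num_vertices)]
        self.edges = []

    def add_edge(self, src: int, dst: int, weight: float) -> None:
        e = len(self.edges)
        self.edges.append(Edge(weight, src, dst))
        self._insert(e, self.vertices[src], "first_out", "next_out")
        self._insert(e, self.vertices[dst], "first_in", "next_in")

    def _insert(self, e: int, vertex: Vertex, head: str, link: str) -> None:
        """Splice edge e into one of the vertex's lists, keeping the list
        sorted by descending |weight| so low-weight edges are easy to prune."""
        key = abs(self.edges[e].weight)
        prev, cur = None, getattr(vertex, head)
        while cur != NIL and abs(self.edges[cur].weight) >= key:
            prev, cur = cur, getattr(self.edges[cur], link)
        setattr(self.edges[e], link, cur)
        if prev is None:
            setattr(vertex, head, e)
        else:
            setattr(self.edges[prev], link, e)
```

Keeping both lists as indices into the edge array means the whole graph is held in two flat arrays, which is what makes it cheap to store and to sparsify by truncating each list.
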
  • Publication number: 20200034713
    Abstract: Techniques herein train a multilayer perceptron, sparsify edges of a graph such as the perceptron, and store edges and vertices of the graph. Each edge has a weight. A computer sparsifies perceptron edges. The computer performs a forward-backward pass on the perceptron to calculate a sparse Hessian matrix. Based on that Hessian, the computer performs quasi-Newton perceptron optimization. The computer repeats this until convergence. The computer stores edges in an array and vertices in another array. Each edge has a weight and input and output indices. Each vertex has input and output indices. The computer inserts each edge into an input linked list based on its weight. Each link of the input linked list has the next input index of an edge. The computer inserts each edge into an output linked list based on its weight. Each link of the output linked list comprises the next output index of an edge.
    Type: Application
    Filed: October 3, 2019
    Publication date: January 30, 2020
    Inventors: Dmitry Golovashkin, Uladzislau Sharanhovich, Vaishnavi Sashikanth
  • Patent number: 10467528
    Abstract: Techniques herein train a multilayer perceptron, sparsify edges of a graph such as the perceptron, and store edges and vertices of the graph. Each edge has a weight. A computer sparsifies perceptron edges. The computer performs a forward-backward pass on the perceptron to calculate a sparse Hessian matrix. Based on that Hessian, the computer performs quasi-Newton perceptron optimization. The computer repeats this until convergence. The computer stores edges in an array and vertices in another array. Each edge has a weight and input and output indices. Each vertex has input and output indices. The computer inserts each edge into an input linked list based on its weight. Each link of the input linked list has the next input index of an edge. The computer inserts each edge into an output linked list based on its weight. Each link of the output linked list comprises the next output index of an edge.
    Type: Grant
    Filed: August 11, 2015
    Date of Patent: November 5, 2019
    Assignee: Oracle International Corporation
    Inventors: Dmitry Golovashkin, Uladzislau Sharanhovich, Vaishnavi Sashikanth
  • Patent number: 9990303
    Abstract: Techniques herein are for sharing data structures. In embodiments, a computer obtains a directed object graph (DOG) containing objects and pointers interconnecting the objects. Each object pointer (OP) resides in a source object and comprises a memory address (MA) of a target object (TO). An original address space (OAS) contains the MA of the TO. The objects are not contiguous within the OAS. The DOG resides in original memory segment(s). The computer obtains an additional memory segment (AMS) beginning at a base address. The computer records the base address within the AMS. For each object in the DOG, the computer copies the object into the AMS at a respective address. For each OP in the DOG having the object as the TO of the MA of the OP, the computer replaces the MA of the OP with the respective address. AMS contents are provided in another address space.
    Type: Grant
    Filed: August 21, 2017
    Date of Patent: June 5, 2018
    Assignee: Oracle International Corporation
    Inventors: Uladzislau Sharanhovich, Anand Srinivasan, Dmitry Golovashkin, Vaishnavi Sashikanth
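
The copy-and-rewrite step in the abstract of US 9,990,303 can be shown with a small simulation. In the sketch below, addresses are plain integers and the additional memory segment is a dict; the names pack_graph, payload, and pointers are illustrative assumptions, not the patent's API. It demonstrates the two-pass structure: assign every object a new contiguous address after the recorded base, then copy each object while rewriting its pointers through the old-to-new address map.

```python
# A toy simulation of the copy-and-rewrite step in US 9,990,303.  Addresses
# are plain integers and the additional memory segment is a dict; the names
# pack_graph, payload and pointers are illustrative assumptions.

def pack_graph(objects, base_address):
    """objects: {old_address: {"payload": ..., "pointers": [old_address, ...]}}
    Returns the new contiguous segment plus the old-to-new address map."""
    # Record the base address inside the segment itself, as the abstract describes.
    segment = {base_address: {"payload": base_address, "pointers": []}}

    # Pass 1: give every object a respective contiguous address after the base.
    new_address_of = {}
    next_addr = base_address + 1
    for old_addr in objects:
        new_address_of[old_addr] = next_addr
        next_addr += 1

    # Pass 2: copy each object, rewriting every pointer through the address map.
    for old_addr, obj in objects.items():
        segment[new_address_of[old_addr]] = {
            "payload": obj["payload"],
            "pointers": [new_address_of[p] for p in obj["pointers"]],
        }
    return segment, new_address_of

# Usage sketch: two scattered objects become a contiguous segment at 1000.
graph = {0x7F10: {"payload": "root", "pointers": [0x9A20]},
         0x9A20: {"payload": "leaf", "pointers": []}}
segment, remap = pack_graph(graph, base_address=1000)
```
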
  • Patent number: 9870342
    Abstract: According to one technique, a modeling computer computes a Hessian matrix by determining whether an input matrix contains more than a threshold number of dense columns. If so, the modeling computer computes a sparsified version of the input matrix and uses the sparsified matrix to compute the Hessian. Otherwise, the modeling computer identifies which columns are dense and which columns are sparse. The modeling computer then partitions the input matrix by column density and uses sparse matrix format to store the sparse columns and dense matrix format to store the dense columns. The modeling computer then computes component parts which combine to form the Hessian, wherein component parts that rely on dense columns are computed using dense matrix multiplication and component parts that rely on sparse columns are computed using sparse matrix multiplication.
    Type: Grant
    Filed: June 8, 2017
    Date of Patent: January 16, 2018
    Assignee: Oracle International Corporation
    Inventors: Dmitry Golovashkin, Uladzislau Sharanhovich, Vaishnavi Sashikanth
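
The column-partitioned Hessian computation of US 9,870,342 can be sketched with NumPy and SciPy. The code below covers only the partitioning branch (it does not implement the sparsification fallback used when too many columns are dense), computes H = XᵀX rather than a weighted Hessian, and picks an arbitrary density threshold; all of those are my simplifications for illustration.

```python
# A sketch of the column-partitioned Hessian from US 9,870,342, covering only
# the partitioning branch.  It computes H = X^T X (not a weighted Hessian) and
# the 10% density threshold is an arbitrary illustrative choice.
import numpy as np
import scipy.sparse as sp

def partitioned_hessian(X: np.ndarray, density_threshold: float = 0.1) -> np.ndarray:
    n_rows, n_cols = X.shape
    density = (X != 0).sum(axis=0) / n_rows
    dense_cols = np.where(density > density_threshold)[0]
    sparse_cols = np.where(density <= density_threshold)[0]

    Xd = X[:, dense_cols]                  # dense columns kept in dense format
    Xs = sp.csc_matrix(X[:, sparse_cols])  # sparse columns kept in sparse format

    # Component parts, each computed with the multiplication suited to its format.
    H_dd = Xd.T @ Xd              # dense x dense
    H_ss = (Xs.T @ Xs).toarray()  # sparse x sparse
    H_sd = np.asarray(Xs.T @ Xd)  # sparse x dense

    # Combine the component parts into the full Hessian in original column order.
    H = np.empty((n_cols, n_cols))
    H[np.ix_(dense_cols, dense_cols)] = H_dd
    H[np.ix_(sparse_cols, sparse_cols)] = H_ss
    H[np.ix_(sparse_cols, dense_cols)] = H_sd
    H[np.ix_(dense_cols, sparse_cols)] = H_sd.T
    return H
```
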
  • Publication number: 20170344488
    Abstract: Techniques herein are for sharing data structures. In embodiments, a computer obtains a directed object graph (DOG) containing objects and pointers interconnecting the objects. Each object pointer (OP) resides in a source object and comprises a memory address (MA) of a target object (TO). An original address space (OAS) contains the MA of the TO. The objects are not contiguous within the OAS. The DOG resides in original memory segment(s). The computer obtains an additional memory segment (AMS) beginning at a base address. The computer records the base address within the AMS. For each object in the DOG, the computer copies the object into the AMS at a respective address. For each OP in the DOG having the object as the TO of the MA of the OP, the computer replaces the MA of the OP with the respective address. AMS contents are provided in another address space.
    Type: Application
    Filed: August 21, 2017
    Publication date: November 30, 2017
    Inventors: Uladzislau Sharanhovich, Anand Srinivasan, Dmitry Golovashkin, Vaishnavi Sashikanth
  • Publication number: 20170286365
    Abstract: According to one technique, a modeling computer computes a Hessian matrix by determining whether an input matrix contains more than a threshold number of dense columns. If so, the modeling computer computes a sparsified version of the input matrix and uses the sparsified matrix to compute the Hessian. Otherwise, the modeling computer identifies which columns are dense and which columns are sparse. The modeling computer then partitions the input matrix by column density and uses sparse matrix format to store the sparse columns and dense matrix format to store the dense columns. The modeling computer then computes component parts which combine to form the Hessian, wherein component parts that rely on dense columns are computed using dense matrix multiplication and component parts that rely on sparse columns are computed using sparse matrix multiplication.
    Type: Application
    Filed: June 8, 2017
    Publication date: October 5, 2017
    Inventors: Dmitry Golovashkin, Uladzislau Sharanhovich, Vaishnavi Sashikanth
  • Patent number: 9740626
    Abstract: Techniques herein are for sharing data structures between processes. A method involves obtaining a current memory segment that begins at a current base address within a current address space. The current memory segment comprises a directed object graph and a base pointer. The graph comprises object pointers and objects. For each particular object, determine whether a different memory segment contains an equivalent object that is equivalent to the particular object. If the equivalent object exists, for each object pointer having the particular object as its target object, replace the memory address of the object pointer with a memory address of the equivalent object that does not reside in the current memory segment. Otherwise, for each object pointer having the particular object as its target object, increment the memory address of the object pointer by an amount that is a difference between the current base address and the original base address.
    Type: Grant
    Filed: August 11, 2015
    Date of Patent: August 22, 2017
    Assignee: Oracle International Corporation
    Inventors: Uladzislau Sharanhovich, Anand Srinivasan, Dmitry Golovashkin, Vaishnavi Sashikanth
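
The relocation-with-reuse logic in the abstract of US 9,740,626 is easiest to see in a toy model. In the sketch below, segments are dicts keyed by original addresses, pointers are integers, and object equivalence is a simple payload lookup in a shared index; those representations, and the names relocate and shared_index, are my own stand-ins for the patent's memory-segment mechanics.

```python
# A toy model of the relocation step in US 9,740,626.  Segments are dicts keyed
# by original addresses, pointers are integers, and "equivalence" is a payload
# lookup in a shared index; every pointer is assumed to target an object inside
# this segment.  All of that is my own simplification.

def relocate(segment, current_base, shared_index):
    """segment: {"original_base": int,
                 "objects": {addr: {"payload": ..., "pointers": [addr, ...]}}}
    shared_index: {payload: address of an equivalent object outside this segment}"""
    delta = current_base - segment["original_base"]

    # Decide, per target object, where pointers to it should now point.
    new_target = {}
    for addr, obj in segment["objects"].items():
        if obj["payload"] in shared_index:
            # An equivalent object already exists elsewhere: reuse its address.
            new_target[addr] = shared_index[obj["payload"]]
        else:
            # No equivalent: the object moved with the segment, so shift by the
            # difference between the current and original base addresses.
            new_target[addr] = addr + delta

    # Rewrite every object pointer according to its target's new location.
    for obj in segment["objects"].values():
        obj["pointers"] = [new_target[p] for p in obj["pointers"]]
```
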
  • Patent number: 9715481
    Abstract: According to one technique, a modeling computer computes a Hessian matrix by determining whether an input matrix contains more than a threshold number of dense columns. If so, the modeling computer computes a sparsified version of the input matrix and uses the sparsified matrix to compute the Hessian. Otherwise, the modeling computer identifies which columns are dense and which columns are sparse. The modeling computer then partitions the input matrix by column density and uses sparse matrix format to store the sparse columns and dense matrix format to store the dense columns. The modeling computer then computes component parts which combine to form the Hessian, wherein component parts that rely on dense columns are computed using dense matrix multiplication and component parts that rely on sparse columns are computed using sparse matrix multiplication.
    Type: Grant
    Filed: March 9, 2015
    Date of Patent: July 25, 2017
    Assignee: Oracle International Corporation
    Inventors: Dmitry Golovashkin, Uladzislau Sharanhovich, Vaishnavi Sashikanth
  • Patent number: 9594692
    Abstract: Techniques herein are for sharing data structures between processes. A method involves obtaining a current memory segment that begins at a current base address within a current address space. The current memory segment comprises a directed object graph and a base pointer. The graph comprises object pointers and objects. For each particular object, determine whether a different memory segment contains an equivalent object that is equivalent to the particular object. If the equivalent object exists, for each object pointer having the particular object as its target object, replace the memory address of the object pointer with a memory address of the equivalent object that does not reside in the current memory segment. Otherwise, for each object pointer having the particular object as its target object, increment the memory address of the object pointer by an amount that is a difference between the current base address and the original base address.
    Type: Grant
    Filed: August 11, 2015
    Date of Patent: March 14, 2017
    Assignee: Oracle International Corporation
    Inventors: Uladzislau Sharanhovich, Anand Srinivasan, Dmitry Golovashkin, Vaishnavi Sashikanth
  • Publication number: 20170046270
    Abstract: Techniques herein are for sharing data structures between processes. A method involves obtaining a current memory segment that begins at a current base address within a current address space. The current memory segment comprises a directed object graph and a base pointer. The graph comprises object pointers and objects. For each particular object, determine whether a different memory segment contains an equivalent object that is equivalent to the particular object. If the equivalent object exists, for each object pointer having the particular object as its target object, replace the memory address of the object pointer with a memory address of the equivalent object that does not reside in the current memory segment. Otherwise, for each object pointer having the particular object as its target object, increment the memory address of the object pointer by an amount that is a difference between the current base address and the original base address.
    Type: Application
    Filed: August 11, 2015
    Publication date: February 16, 2017
    Inventors: Uladzislau Sharanhovich, Anand Srinivasan, Dmitry Golovashkin, Vaishnavi Sashikanth
  • Publication number: 20170046614
    Abstract: Techniques herein train a multilayer perceptron, sparsify edges of a graph such as the perceptron, and store edges and vertices of the graph. Each edge has a weight. A computer sparsifies perceptron edges. The computer performs a forward-backward pass on the perceptron to calculate a sparse Hessian matrix. Based on that Hessian, the computer performs quasi-Newton perceptron optimization. The computer repeats this until convergence. The computer stores edges in an array and vertices in another array. Each edge has a weight and input and output indices. Each vertex has input and output indices. The computer inserts each edge into an input linked list based on its weight. Each link of the input linked list has the next input index of an edge. The computer inserts each edge into an output linked list based on its weight. Each link of the output linked list comprises the next output index of an edge.
    Type: Application
    Filed: August 11, 2015
    Publication date: February 16, 2017
    Inventors: Dmitry Golovashkin, Uladzislau Sharanhovich, Vaishnavi Sashikanth
  • Patent number: 9418082
    Abstract: A method, system, and computer program product for interfacing an R language client with a separate database engine environment. The method commences by interpreting an R language code fragment to identify and select R language constructs and transforming the R language constructs into queries or other database language constructs to execute within the database engine environment. The method further implements techniques for transmitting marshalled results (resulting from the execution of the database language constructs) back to the R client environment. In some situations, the marshalled results include an XML schema or DTD or another metadata description of the structure of the results.
    Type: Grant
    Filed: March 29, 2012
    Date of Patent: August 16, 2016
    Assignee: Oracle International Corporation
    Inventors: Denis B. Mukhin, Patrick Aboyoun, Vaishnavi Sashikanth
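
The transform-and-marshal flow described in US 9,418,082 can be illustrated end to end with a toy translator. The sketch below is written in Python against SQLite purely so it is self-contained; it is not Oracle R Enterprise's API. It maps a tiny R-style aggregate expression to a SQL statement, executes it in the database engine, and returns the result together with a small JSON description of its structure, standing in for the XML schema or DTD metadata mentioned in the abstract.

```python
# A toy illustration of the transform-and-marshal flow in US 9,418,082.  The
# mapping table, the use of sqlite3, and the JSON metadata stand in for the
# patent's R client, Oracle database, and XML schema/DTD, and are entirely my
# own simplifications.  No input validation is attempted; this is a sketch.
import json
import sqlite3

# R-style aggregate -> SQL aggregate (tiny, illustrative subset).
R_TO_SQL = {"mean": "AVG", "sum": "SUM", "min": "MIN", "max": "MAX"}

def run_r_fragment(conn, fragment: str):
    """Handle fragments of the form 'mean(table$column)'."""
    func, rest = fragment.split("(", 1)
    table, column = rest.rstrip(")").split("$")
    sql = f"SELECT {R_TO_SQL[func]}({column}) FROM {table}"  # push work to the engine
    value = conn.execute(sql).fetchone()[0]
    # Marshal the result together with a description of its structure.
    return {"value": value,
            "metadata": json.dumps({"type": "numeric", "length": 1, "source": sql})}

# Usage sketch: an in-memory table and one translated aggregate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?)", [(100.0,), (250.0,), (75.0,)])
print(run_r_fragment(conn, "mean(sales$amount)"))
```
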
  • Publication number: 20150378962
    Abstract: According to one technique, a modeling computer computes a Hessian matrix by determining whether an input matrix contains more than a threshold number of dense columns. If so, the modeling computer computes a sparsified version of the input matrix and uses the sparsified matrix to compute the Hessian. Otherwise, the modeling computer identifies which columns are dense and which columns are sparse. The modeling computer then partitions the input matrix by column density and uses sparse matrix format to store the sparse columns and dense matrix format to store the dense columns. The modeling computer then computes component parts which combine to form the Hessian, wherein component parts that rely on dense columns are computed using dense matrix multiplication and component parts that rely on sparse columns are computed using sparse matrix multiplication.
    Type: Application
    Filed: March 9, 2015
    Publication date: December 31, 2015
    Inventors: Dmitry Golovashkin, Uladzislau Sharanhovich, Vaishnavi Sashikanth
  • Patent number: 9047566
    Abstract: According to one aspect of the invention, target data comprising observations is received. A neural network comprising input neurons, output neurons, hidden neurons, skip-layer connections, and non-skip-layer connections is used to analyze the target data based on an overall objective function that comprises a linear regression part, the neural network's unregularized objective function, and a regularization term. An overall optimized first vector value of a first vector and an overall optimized second vector value of a second vector are determined based on the target data and the overall objective function. The first vector comprises skip-layer weights for the skip-layer connections and output neuron biases, whereas the second vector comprises non-skip-layer weights for the non-skip-layer connections.
    Type: Grant
    Filed: March 12, 2013
    Date of Patent: June 2, 2015
    Assignee: Oracle International Corporation
    Inventors: Dmitry Golovashkin, Patrick Aboyoun, Vaishnavi Sashikanth
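
The structure of the overall objective in US 9,047,566, with its split between skip-layer and non-skip-layer weights, can be written out numerically. The sketch below uses a single hidden layer, tanh activations, squared error, and an L2 penalty on the non-skip weights only; the architecture and loss choices are my own illustrative assumptions rather than the patent's specification.

```python
# A numeric sketch of the objective structure in US 9,047,566: a network with
# direct input-to-output "skip" connections, whose overall objective combines
# the fit of the skip-layer (linear-regression-like) part, the nonlinear
# network part, and a regularization term on the non-skip weights.  The
# single-hidden-layer architecture, tanh, and squared error are my choices.
import numpy as np

def overall_objective(X, y, skip_w, out_bias, hidden_W, out_w, lam=1e-2):
    """skip_w, out_bias -> the 'first vector' (skip-layer weights + output bias)
       hidden_W, out_w  -> the 'second vector' (non-skip-layer weights)"""
    hidden = np.tanh(X @ hidden_W)                 # hidden activations
    pred = X @ skip_w + hidden @ out_w + out_bias  # skip part + network part
    fit = 0.5 * np.mean((pred - y) ** 2)           # data-fit term
    reg = 0.5 * lam * (np.sum(hidden_W ** 2) + np.sum(out_w ** 2))
    return fit + reg                               # regularize only non-skip weights

# Usage sketch with random data and weights.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(64, 3)), rng.normal(size=64)
print(overall_objective(X, y,
                        skip_w=rng.normal(size=3), out_bias=0.0,
                        hidden_W=rng.normal(size=(3, 5)), out_w=rng.normal(size=5)))
```
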
  • Patent number: 9043368
    Abstract: A method, system, and computer program product for interfacing an R language client with a separate database engine environment. The method commences by interpreting an R language code fragment to identify and select R language constructs and transforming the R language constructs into queries or other database language constructs to execute within the database engine environment. The method further implements techniques for transmitting marshalled results (resulting from the execution of the database language constructs) back to the R client environment. In some situations, the marshalled results include an XML schema or DTD or another metadata description of the structure of the results.
    Type: Grant
    Filed: March 29, 2012
    Date of Patent: May 26, 2015
    Assignee: Oracle International Corporation
    Inventors: Denis B. Mukhin, Vaishnavi Sashikanth, Mark F. Hornick
  • Publication number: 20140279771
    Abstract: According to one aspect of the invention, target data comprising observations is received. A neural network comprising input neurons, output neurons, hidden neurons, skip-layer connections, and non-skip-layer connections is used to analyze the target data based on an overall objective function that comprises a linear regression part, the neural network's unregularized objective function, and a regularization term. An overall optimized first vector value of a first vector and an overall optimized second vector value of a second vector are determined based on the target data and the overall objective function. The first vector comprises skip-layer weights for the skip-layer connections and output neuron biases, whereas the second vector comprises non-skip-layer weights for the non-skip-layer connections.
    Type: Application
    Filed: March 12, 2013
    Publication date: September 18, 2014
    Applicant: Oracle International Corporation
    Inventors: Dmitry Golovashkin, Patrick Aboyoun, Vaishnavi Sashikanth
  • Patent number: 8626572
    Abstract: A method of quota planning. In one embodiment, the method includes determining a top-down goal. The top-down goal indicates an expected amount of sales for a sales territory of a sales territory hierarchy. The method also includes generating a bottom-up recommendation for the sales territory and specifying a quota for the sales territory. The bottom-up recommendation is reconciled with the top-down goal, resulting in a quota that indicates an assigned amount of sales for the sales territory.
    Type: Grant
    Filed: April 20, 2010
    Date of Patent: January 7, 2014
    Assignee: Oracle International Corporation
    Inventors: George H. Colliat, Ajay A. Awatramani, John Kuzmicki, Vaishnavi A. Sashikanth
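
The reconcile step is the main computational piece of the quota-planning method in US 8,626,572, and one simple way to realize it is proportional scaling of the bottom-up recommendations to meet the top-down goal. The patent does not prescribe a particular reconciliation rule, so the sketch below is purely illustrative.

```python
# A minimal sketch of one way the reconciliation in US 8,626,572 could work:
# child territories' bottom-up recommendations are scaled so that they sum to
# the parent's top-down goal.  Proportional scaling is my own illustrative
# choice, not the patent's prescribed rule.

def reconcile_quotas(top_down_goal: float, bottom_up: dict) -> dict:
    """bottom_up: {territory_name: recommended_sales_amount}"""
    total = sum(bottom_up.values())
    if total == 0:
        return {t: 0.0 for t in bottom_up}
    scale = top_down_goal / total
    return {t: round(rec * scale, 2) for t, rec in bottom_up.items()}

# Usage sketch: a $1.2M regional goal spread over three sub-territories.
print(reconcile_quotas(1_200_000, {"West": 500_000, "Central": 300_000, "East": 350_000}))
```
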
  • Publication number: 20130262524
    Abstract: A method, system, and computer program product for interfacing an R language client with a separate database engine environment. The method commences by interpreting an R language code fragment to identify and select R language constructs and transforming the R language constructs into queries or other database language constructs to execute within the database engine environment. The method further implements techniques for transmitting marshalled results (resulting from the execution of the database language constructs) back to the R client environment. In some situations, the marshalled results include an XML schema or DTD or another metadata description of the structure of the results.
    Type: Application
    Filed: March 29, 2012
    Publication date: October 3, 2013
    Applicant: Oracle International Corporation
    Inventors: Denis B. Mukhin, Vaishnavi Sashikanth, Mark F. Hornick
  • Publication number: 20130262520
    Abstract: A method, system, and computer program product for interfacing an R language client with a separate database engine environment. The method commences by interpreting an R language code fragment to identify and select R language constructs and transforming the R language constructs into queries or other database language constructs to execute within the database engine environment. The method further implements techniques for transmitting marshaled results (resulting from the execution of the database language constructs) back to the R client environment. In some situations, the marshaled results include an XML schema or DTD or another metadata description of the structure of the results.
    Type: Application
    Filed: March 29, 2012
    Publication date: October 3, 2013
    Applicant: Oracle International Corporation
    Inventors: Denis B. Mukhin, Patrick Aboyoun, Vaishnavi Sashikanth