Patents by Inventor Michael J. Beckerle

Michael J. Beckerle has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 7752299
    Abstract: With a continuous source of data relating to transactions, the data may be segmented and processed in a data flow arrangement, optionally in parallel, and the data may be processed without storing the data in an intermediate database. Data from multiple sources may be processed in parallel. The segmentation also may define points at which aggregate outputs may be provided, and where checkpoints may be established.
    Type: Grant
    Filed: October 12, 2007
    Date of Patent: July 6, 2010
    Assignee: International Business Machines Corporation
    Inventors: Lawrence A. Bookman, David Albert Blair, Steven M. Rosenthal, Robert Louis Krawitz, Michael J. Beckerle, Jerry Lee Callen, Allen Razdow, Shyam R. Mudambi
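
The abstract above outlines a dataflow approach: an unbounded stream of transaction records is cut into segments, each segment is aggregated in memory (optionally across parallel workers) rather than staged in an intermediate database, and segment boundaries double as aggregate-output points and checkpoints. Below is a minimal Python sketch of that idea under stated assumptions; the function names (segment_stream, process_segment, run), the record fields, and the fixed segment size are illustrative, not taken from the patent.

import itertools
from typing import Dict, Iterable, Iterator, List

def segment_stream(records: Iterable[dict], segment_size: int) -> Iterator[List[dict]]:
    """Cut a continuous record stream into consecutive segments."""
    it = iter(records)
    while True:
        segment = list(itertools.islice(it, segment_size))
        if not segment:
            return
        yield segment

def process_segment(segment: List[dict]) -> Dict[str, float]:
    """Aggregate one segment entirely in memory -- no intermediate database."""
    totals: Dict[str, float] = {}
    for rec in segment:
        totals[rec["account"]] = totals.get(rec["account"], 0.0) + rec["amount"]
    return totals

def run(records: Iterable[dict], segment_size: int = 1000):
    for seg_no, segment in enumerate(segment_stream(records, segment_size), start=1):
        aggregates = process_segment(segment)  # per-segment work; could be fanned out across workers
        yield seg_no, aggregates               # segment boundary = aggregate output point
        # a real engine would also persist seg_no here so the boundary can serve as a checkpoint

# Example: three tiny "transactions" flowing through two-record segments.
txns = [{"account": "a", "amount": 10.0},
        {"account": "b", "amount": 5.0},
        {"account": "a", "amount": 2.5}]
for seg_no, agg in run(txns, segment_size=2):
    print(seg_no, agg)
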
  • Publication number: 20080270403
    Abstract: With a continuous source of data relating to transactions, the data may be segmented and processed in a data flow arrangement, optionally in parallel, and the data may be processed without storing the data in an intermediate database. Data from multiple sources may be processed in parallel. The segmentation also may define points at which aggregate outputs may be provided, and where checkpoints may be established.
    Type: Application
    Filed: October 12, 2007
    Publication date: October 30, 2008
    Inventors: LAWRENCE A. BOOKMAN, David Albert Blair, Steven M. Rosenthal, Robert Louis Krawitz, Michael J. Beckerle, Jerry Lee Callen, Allen Razdow, Shyam R. Mudambi
  • Patent number: 7392320
    Abstract: With a continuous source of data relating to transactions, the data may be segmented and processed in a data flow arrangement, optionally in parallel, and the data may be processed without storing the data in an intermediate database. Data from multiple sources may be processed in parallel. The segmentation also may define points at which aggregate outputs may be provided, and where checkpoints may be established.
    Type: Grant
    Filed: May 14, 2004
    Date of Patent: June 24, 2008
    Assignee: International Business Machines Corporation
    Inventors: Lawrence A. Bookman, David Albert Blair, Steven M. Rosenthal, Robert Louis Krawitz, Michael J. Beckerle, Jerry Lee Callen, Allen M. Razdow, Shyam R. Mudambi
  • Patent number: 6801938
    Abstract: With a continuous source of data relating to transactions, the data may be segmented and processed in a data flow arrangement, optionally in parallel, and the data may be processed without storing the data in an intermediate database. Data from multiple sources may be processed in parallel. The segmentation also may define points at which aggregate outputs may be provided, and where checkpoints may be established.
    Type: Grant
    Filed: June 19, 2000
    Date of Patent: October 5, 2004
    Assignee: Torrent Systems, Inc.
    Inventors: Lawrence A. Bookman, David Albert Blair, Steven M. Rosenthal, Robert Louis Krawitz, Michael J. Beckerle, Jerry Lee Callen, Allen Razdow, Shyam R. Mudambi
  • Patent number: 6415286
    Abstract: A computer system splits a data space to partition data between processors or processes. The data space may be split into sub-regions which need not be orthogonal to the axes defined by the data space's parameters, using a decision tree. The decision tree can have neural networks in each of its non-terminal nodes that are trained on, and are used to partition, training data. Each terminal, or leaf, node can have a hidden layer neural network trained on the training data that reaches the terminal node. The training of the non-terminal nodes' neural networks can be performed on one processor and the training of the leaf nodes' neural networks can be run on separate processors. Different target values can be used for the training of the networks of different non-terminal nodes. The non-terminal node networks may be hidden layer neural networks.
    Type: Grant
    Filed: March 29, 1999
    Date of Patent: July 2, 2002
    Assignee: Torrent Systems, Inc.
    Inventors: Anthony Passera, John R. Thorp, Michael J. Beckerle, Edward S. Zyszkowski
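
The abstract describes a decision tree whose internal nodes each hold a small neural network that routes samples, so splits need not be orthogonal to any parameter axis, and whose leaves each hold a hidden-layer network trained only on the data that reaches them. The following is a minimal, untrained Python/NumPy sketch of that structure; the class names and network sizes are assumptions for illustration, and the training procedure (which the patent distributes across processors) is omitted.

import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer network of the kind placed at each node."""
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)[0]))   # scalar score in (0, 1)

class Node:
    """Non-terminal node: a small network routes each sample to a child,
    so the split need not be orthogonal to any parameter axis."""
    def __init__(self, dim, hidden=4):
        self.W1, self.b1 = rng.normal(size=(dim, hidden)), np.zeros(hidden)
        self.W2, self.b2 = rng.normal(size=(hidden, 1)), np.zeros(1)
        self.left = self.right = None            # children: Node or Leaf

    def route(self, x):
        score = mlp_forward(x, self.W1, self.b1, self.W2, self.b2)
        return self.left if score < 0.5 else self.right

class Leaf:
    """Terminal node: its own hidden-layer network, trained only on the data
    that reaches it (so each leaf could be trained on a separate processor)."""
    def __init__(self, dim, hidden=8):
        self.W1, self.b1 = rng.normal(size=(dim, hidden)), np.zeros(hidden)
        self.W2, self.b2 = rng.normal(size=(hidden, 1)), np.zeros(1)

    def predict(self, x):
        return mlp_forward(x, self.W1, self.b1, self.W2, self.b2)

def predict(tree, x):
    node = tree
    while isinstance(node, Node):                # descend until a leaf is reached
        node = node.route(x)
    return node.predict(x)

# Tiny untrained example: one neural split over 2-D data, two leaf networks.
root = Node(dim=2)
root.left, root.right = Leaf(dim=2), Leaf(dim=2)
print(predict(root, np.array([0.3, -1.2])))
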
  • Publication number: 20020080181
    Abstract: A performance monitor represents execution of a data flow graph by changing performance information along different parts of a representation of that graph. If the graph is executed in parallel, the monitor can show parallel operator instances, associated datalinks, and performance information relevant to each. The individual parallel processes executing the graph send performance messages to the performance monitor, and the performance monitor can instruct such processes to vary the information they send. The monitor can provide 2D or 3D views in which the user can change focus, zoom and viewpoint. In 3D views, parallel instances of the same operator are grouped in a 2D array. The data rate of a datalink can be represented by both the density and velocity of line segments along the line which represents it. The line can be colored as a function of the datalink's source or destination, its data rate, or the integral thereof.
    Type: Application
    Filed: November 20, 2001
    Publication date: June 27, 2002
    Inventors: Allen M. Razdow, Daniel W. Kohn, Michael J. Beckerle, Jeffrey D. Ives
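
The abstract centers on parallel worker processes sending performance messages to a monitor, which aggregates them per operator instance and datalink for display. The Python sketch below covers only the message-collection and rate-aggregation side, with threads standing in for separate processes and the 2D/3D rendering omitted; all names and the message format are illustrative assumptions.

import queue
import threading
import time
from collections import defaultdict

# Worker: each parallel operator instance reports how many records it has
# pushed over one of its datalinks since its last message.
def worker(name: str, link: str, msgs: queue.Queue, batches: int):
    for _ in range(batches):
        time.sleep(0.01)                      # stand-in for real processing
        msgs.put({"instance": name, "link": link, "records": 100, "t": time.time()})
    msgs.put(None)                            # this instance is done

# Monitor: consumes performance messages and keeps running totals per datalink;
# a real monitor would feed these numbers into a 2D/3D view of the flow graph.
def monitor(msgs: queue.Queue, n_workers: int):
    totals = defaultdict(int)
    start = time.time()
    done = 0
    while done < n_workers:
        msg = msgs.get()
        if msg is None:
            done += 1
            continue
        totals[msg["link"]] += msg["records"]
    elapsed = time.time() - start
    for link, count in totals.items():
        print(f"{link}: {count} records, {count / elapsed:.0f} records/s")

msgs = queue.Queue()
workers = [threading.Thread(target=worker, args=(f"sort[{i}]", f"sort[{i}]->merge", msgs, 5))
           for i in range(3)]                 # three parallel instances of one operator
for w in workers:
    w.start()
monitor(msgs, n_workers=len(workers))
for w in workers:
    w.join()
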
  • Publication number: 20020083424
    Abstract: A computer system splits a data space to partition data between processors or processes. The data space may be split into sub-regions which need not be orthogonal to the axes defined by the data space's parameters, using a decision tree. The decision tree can have neural networks in each of its non-terminal nodes that are trained on, and are used to partition, training data. Each terminal, or leaf, node can have a hidden layer neural network trained on the training data that reaches the terminal node. The training of the non-terminal nodes' neural networks can be performed on one processor and the training of the leaf nodes' neural networks can be run on separate processors. Different target values can be used for the training of the networks of different non-terminal nodes. The non-terminal node networks may be hidden layer neural networks.
    Type: Application
    Filed: November 20, 2001
    Publication date: June 27, 2002
    Inventors: Anthony Passera, John R. Thorp, Michael J. Beckerle, Edward S. A. Zyszkowski
  • Patent number: 6330008
    Abstract: A performance monitor represents execution of a data flow graph by changing performance information along different parts of a representation of that graph. If the graph is executed in parallel, the monitor can show parallel operator instances, associated datalinks, and performance information relevant to each. The individual parallel processes executing the graph send performance messages to the performance monitor, and the performance monitor can instruct such processes to vary the information they send. The monitor can provide 2D or 3D views in which the user can change focus, zoom and viewpoint. In 3D views, parallel instances of the same operator are grouped in a 2D array. The data rate of a datalink can be represented by both the density and velocity of line segments along the line which represents it. The line can be colored as a function of the datalink's source or destination, its data rate, or the integral thereof.
    Type: Grant
    Filed: February 24, 1997
    Date of Patent: December 11, 2001
    Assignee: Torrent Systems, Inc.
    Inventors: Allen M. Razdow, Daniel W. Kohn, Michael J. Beckerle, Jeffrey D. Ives
  • Patent number: 6311265
    Abstract: A system provides an environment for parallel programming by providing a plurality of modular parallelizable operators stored in a computer readable memory. Each operator defines operation programming for performing an operation; one or more communication ports, each of which is either an input port for supplying a data stream of records to the operation programming or an output port for receiving a data stream of records from the operation programming; and an indication, for each of the operator's input ports, if any, of a partitioning method to be applied to the data stream supplied to that input port.
    Type: Grant
    Filed: March 25, 1996
    Date of Patent: October 30, 2001
    Assignee: Torrent Systems, Inc.
    Inventors: Michael J. Beckerle, James Richard Burns, Jerry L. Callen, Jeffrey D. Ives, Robert L. Krawitz, Daniel L. Leary, Steven Rosenthal, Edward S. A. Zyszkowski
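
The abstract describes operators as reusable units that bundle operation programming with declared input/output ports and a partitioning method per input port. The Python sketch below shows one way such a declaration could drive partitioned execution; the Operator class, the "hash:<key>" notation, and the sequential stand-in for parallel instances are illustrative assumptions, not the patent's mechanism.

from collections import defaultdict
from typing import Callable, Dict, Iterable, List

class Operator:
    """A modular parallelizable operator: operation programming plus declared
    input/output ports and a partitioning method for each input port."""
    def __init__(self, name: str,
                 run: Callable[[List[dict]], List[dict]],
                 input_partitioning: Dict[str, str]):
        self.name = name
        self.run = run                                 # the operation programming
        self.input_partitioning = input_partitioning   # e.g. {"in": "hash:account"}

def partition(records: Iterable[dict], method: str, n: int) -> List[List[dict]]:
    """Split one input data stream across n parallel instances of an operator."""
    parts: List[List[dict]] = [[] for _ in range(n)]
    if method.startswith("hash:"):
        key = method.split(":", 1)[1]
        for rec in records:
            parts[hash(rec[key]) % n].append(rec)
    else:                                              # round-robin fallback
        for i, rec in enumerate(records):
            parts[i % n].append(rec)
    return parts

def run_parallel(op: Operator, records: List[dict], n: int) -> List[dict]:
    """Run n instances of the operator, one per partition (sequentially here;
    a real engine would place each instance in its own process)."""
    out: List[dict] = []
    for part in partition(records, op.input_partitioning["in"], n):
        out.extend(op.run(part))
    return out

# Illustrative operator: sum amounts per account within each partition.
def sum_by_account(recs: List[dict]) -> List[dict]:
    totals = defaultdict(float)
    for r in recs:
        totals[r["account"]] += r["amount"]
    return [{"account": a, "total": t} for a, t in totals.items()]

op = Operator("sum", sum_by_account, input_partitioning={"in": "hash:account"})
data = [{"account": "a", "amount": 1.0}, {"account": "b", "amount": 2.0},
        {"account": "a", "amount": 3.0}]
print(run_parallel(op, data, n=2))
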
  • Patent number: 6289474
    Abstract: Checkpointing of operations on data may be provided by partitioning the data into temporal segments. Operations may be performed on the temporal segments and checkpoints may be established by storing a persistent indication of the segment being processed. The entire processing state need not be saved. If a failure occurs, processing can be restarted using the saved indication of the segment to be processed. Such data partitioning and checkpointing may be applied to relational databases, databases with dataflow operation and/or parallelism and other database types with or without parallel operation.
    Type: Grant
    Filed: June 24, 1998
    Date of Patent: September 11, 2001
    Assignee: Torrent Systems, Inc.
    Inventor: Michael J. Beckerle
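
The abstract's key point is that a checkpoint can be just a persistent indication of which temporal segment is being processed, rather than the full processing state, so a restart simply resumes at the saved segment. A minimal Python sketch follows; the JSON checkpoint file and the per-segment sum are illustrative assumptions only.

import json
import os

CHECKPOINT = "checkpoint.json"    # illustrative; the patent only requires some
                                  # persistent indication of the current segment

def load_checkpoint() -> int:
    """Return the index of the next temporal segment to process."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["next_segment"]
    return 0

def save_checkpoint(next_segment: int) -> None:
    """Persist only the segment index -- not the entire processing state."""
    with open(CHECKPOINT, "w") as f:
        json.dump({"next_segment": next_segment}, f)

def process(segments):
    start = load_checkpoint()          # after a failure, restart picks up here
    for i in range(start, len(segments)):
        result = sum(segments[i])      # stand-in for the real per-segment work
        print(f"segment {i}: {result}")
        save_checkpoint(i + 1)         # segment boundary doubles as a checkpoint

# Data partitioned into temporal segments (e.g., one per hour of transactions).
process([[1, 2, 3], [4, 5], [6, 7, 8, 9]])
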
  • Patent number: 5909681
    Abstract: A computer system splits a data space to partition data between processors or processes. The data space may be split into sub-regions which need not be orthogonal to the axes defined by the data space's parameters, using a decision tree. The decision tree can have neural networks in each of its non-terminal nodes that are trained on, and are used to partition, training data. Each terminal, or leaf, node can have a hidden layer neural network trained on the training data that reaches the terminal node. The training of the non-terminal nodes' neural networks can be performed on one processor and the training of the leaf nodes' neural networks can be run on separate processors. Different target values can be used for the training of the networks of different non-terminal nodes. The non-terminal node networks may be hidden layer neural networks.
    Type: Grant
    Filed: March 25, 1996
    Date of Patent: June 1, 1999
    Assignee: Torrent Systems, Inc.
    Inventors: Anthony Passera, John R. Thorp, Michael J. Beckerle, Edward S. A. Zyszkowski