Patents by Inventor Jeong Ae Han

Jeong Ae Han has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10275508
    Abstract: A method may include receiving a query for data to be provided by a database server, wherein the query includes an indication of a maximum lag. The method may further include determining whether a hint is available to apply to the query, wherein the hint affects an execution of the query. When no hint is available, a baseline database server may be selected to be the database server. When the hint is available, a replication server or a cache server may be selected to be the database server based on the hint and the maximum lag. The query may be processed at the selected database server.
    Type: Grant
    Filed: November 19, 2015
    Date of Patent: April 30, 2019
    Assignee: SAP SE
    Inventors: Christian Bensberg, Norman May, Alexander Boehm, Juchang Lee, Sung Heun Wi, Jeong Ae Han, Ki Hong Kim, Kyu Hwan Kim, Chul Won Lee, Andreas Kemmler, Christoph Glania, Armin Risch, Kai Morich
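    A minimal Python sketch of the routing decision described in the abstract above: a query carrying an optional hint and a maximum acceptable lag is sent to a baseline, replica, or cache server. The names (Server, route_query, the hint strings) are hypothetical illustrations, not SAP's implementation or API.
      # Hypothetical sketch: route a query based on a hint and a maximum lag.
      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class Server:
          name: str
          kind: str           # "baseline", "replica", or "cache"
          lag_seconds: float  # how far this server trails the primary

      def route_query(hint: Optional[str], max_lag: float,
                      baseline: Server, replica: Server, cache: Server) -> Server:
          """Pick the database server that should execute the query."""
          if hint is None:
              # No hint available: fall back to the baseline database server.
              return baseline
          # A hint is present: prefer the hinted server, but only if its
          # replication lag does not exceed the query's maximum lag.
          candidate = cache if hint == "USE_CACHE" else replica
          return candidate if candidate.lag_seconds <= max_lag else baseline

      # Example: a query that tolerates up to 5 seconds of staleness.
      baseline = Server("primary", "baseline", 0.0)
      replica = Server("replica-1", "replica", 2.5)
      cache = Server("cache-1", "cache", 30.0)
      print(route_query("USE_REPLICA", 5.0, baseline, replica, cache).name)  # replica-1
      print(route_query("USE_CACHE", 5.0, baseline, replica, cache).name)    # primary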
  • Patent number: 10061808
    Abstract: Embodiments relate to view caching techniques that cache, for a limited time, some of the (intermediate) results of a previous query execution, in order to avoid expensive re-computation of query results. Particular embodiments may utilize a cache manager to determine whether information relevant to a subsequent user request can be satisfied by an existing cache instance or view, or whether creation of an additional cache instance is appropriate. At design time, cache-defining columns of a view are defined, with user input parameters automatically being cache-defining. Cache instances are created for each tuple of literals for the cache-defining columns, and for each explicit or implicit group-by clause. Certain embodiments may feature enhanced reuse between cache instances, in order to limit memory footprint. Over time, cache instances may be evicted from memory based upon implementation of a policy such as a Least Recently Used (LRU) strategy.
    Type: Grant
    Filed: June 3, 2014
    Date of Patent: August 28, 2018
    Assignee: SAP SE
    Inventors: Ki Hong Kim, Norman May, Alexander Boehm, Sung Heun Wi, Jeong Ae Han, Sang Il Song, Yongsik Yoon
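    A small Python sketch of the caching idea described in the abstract above: cache instances are keyed by the tuple of literals bound to the cache-defining columns plus the group-by clause, and evicted with an LRU policy. The ViewCache class and its parameters are hypothetical, not the patented implementation.
      # Hypothetical sketch: view cache keyed by (literals, group-by) with LRU eviction.
      from collections import OrderedDict

      class ViewCache:
          def __init__(self, capacity: int):
              self.capacity = capacity
              self._instances = OrderedDict()  # (literals, group_by) -> cached result

          def get(self, literals: tuple, group_by: tuple, compute_view):
              key = (literals, group_by)
              if key in self._instances:
                  # Cache hit: mark this instance as most recently used.
                  self._instances.move_to_end(key)
                  return self._instances[key]
              # Cache miss: materialize a new cache instance for this key.
              result = compute_view(literals, group_by)
              self._instances[key] = result
              if len(self._instances) > self.capacity:
                  # Evict the least recently used instance to limit memory footprint.
                  self._instances.popitem(last=False)
              return result

      # Example: one cache instance per (region literal, group-by column).
      cache = ViewCache(capacity=2)
      sales = cache.get(("EMEA",), ("product",), lambda lits, gb: {"EMEA": 42})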
  • Patent number: 9846724
    Abstract: According to an aspect, a database system for integrating calculation models into execution plans includes a first engine configured to parse a query to be applied on a database. The first engine is configured to invoke a second engine during query compilation. The second engine is configured to instantiate a calculation model based on the query, and the second engine is configured to derive a converted calculation model by converting the calculation model into a format compatible with the first engine. The first engine is configured to incorporate the converted calculation model into an execution plan during the query compilation and execute the query on the database according to the execution plan.
    Type: Grant
    Filed: November 13, 2014
    Date of Patent: December 19, 2017
    Assignee: SAP SE
    Inventors: Christoph Weyerhaeuser, Tobias Mindnich, Johannes Merx, Yongsik Yoon, Sung Heun Wi, Jeong Ae Han
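    A minimal Python sketch of the two-engine interaction described in the abstract above: during query compilation, a first (relational) engine asks a second (calculation) engine to instantiate a calculation model and convert it into a form the first engine can splice into its execution plan. All class and method names are hypothetical, not SAP's engines.
      # Hypothetical sketch: integrating a converted calculation model into an execution plan.
      class CalculationEngine:
          def instantiate(self, query: str) -> dict:
              # Instantiate a calculation model for the query (stubbed out here).
              return {"type": "calc_model", "source": query}

          def convert(self, calc_model: dict) -> list:
              # Convert the calculation model into operators the first engine understands.
              return [("scan", calc_model["source"]), ("aggregate", "sum")]

      class RelationalEngine:
          def __init__(self, calc_engine: CalculationEngine):
              self.calc_engine = calc_engine

          def compile(self, query: str) -> list:
              # During compilation, invoke the second engine ...
              model = self.calc_engine.instantiate(query)
              converted = self.calc_engine.convert(model)
              # ... and incorporate the converted model into the execution plan.
              return converted + [("project", "*")]

          def execute(self, query: str):
              plan = self.compile(query)
              print("executing plan:", plan)

      RelationalEngine(CalculationEngine()).execute("SELECT SUM(x) FROM calc_view")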
  • Publication number: 20170147671
    Abstract: A method may include receiving a query for data to be provided by a database server, wherein the query includes an indication of a maximum lag. The method may further include determining whether a hint is available to apply to the query, wherein the hint affects an execution of the query. When no hint is available, a baseline database server may be selected to be the database server. When the hint is available, a replication server or a cache server may be selected to be the database server based on the hint and the maximum lag. The query may be processed at the selected database server.
    Type: Application
    Filed: November 19, 2015
    Publication date: May 25, 2017
    Inventors: Christian Bensberg, Norman May, Alexander Boehm, Juchang Lee, Sung Heun Wi, Jeong Ae Han, Ki Hong Kim, Kyu Hwan Kim, Chul Won Lee, Andreas Kemmler, Christoph Glania, Armin Risch, Kai Morich
  • Patent number: 9619514
    Abstract: A query is received by a database server from a remote application server. The query is associated with a calculation scenario that defines a data flow model that includes one or more nodes that each define one or more operations for execution by a calculation engine on the database server. Thereafter, the database server instantiates a runtime model of the calculation scenario based on its nodes. Subsequently, one or more of the nodes are identified as being convertible into a relational database format. These nodes are then used to form a container node. An execution plan of the runtime model of the calculation scenario, including the container node, is built and executed by the database server to produce a data set, which the database server provides to the application server.
    Type: Grant
    Filed: June 17, 2014
    Date of Patent: April 11, 2017
    Assignee: SAP SE
    Inventors: Tobias Mindnich, Jeong Ae Han, Johannes Merx, Christoph Weyerhaeuser, Yongsik Yoon, Sung Heun Wi
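    A short Python sketch of the node-folding step described in the abstract above: nodes of a calculation scenario that can be expressed relationally are collapsed into a single container node before the execution plan is built. The Node structure and build_plan function are hypothetical illustrations, not the patented engine.
      # Hypothetical sketch: fold relationally convertible nodes into one container node.
      from dataclasses import dataclass

      @dataclass
      class Node:
          name: str
          operation: str
          relational: bool  # True if the node can be converted to a relational format

      def build_plan(nodes: list) -> list:
          convertible = [n for n in nodes if n.relational]
          rest = [n for n in nodes if not n.relational]
          plan = []
          if convertible:
              # Combine all convertible nodes into a single container node so the
              # relational engine can execute them as one unit.
              plan.append(Node("container", "+".join(n.operation for n in convertible), True))
          plan.extend(rest)
          return plan

      scenario = [
          Node("n1", "join", True),
          Node("n2", "filter", True),
          Node("n3", "currency_conversion", False),  # stays with the calculation engine
      ]
      print([n.name for n in build_plan(scenario)])  # ['container', 'n3']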
  • Publication number: 20160140175
    Abstract: According to an aspect, a database system for integrating calculation models into execution plans includes a first engine configured to parse a query to be applied on a database. The first engine is configured to invoke a second engine during query compilation. The second engine is configured to instantiate a calculation model based on the query, and the second engine is configured to derive a converted calculation model by converting the calculation model into a format compatible with the first engine. The first engine is configured to incorporate the converted calculation model into an execution plan during the query compilation and execute the query on the database according to the execution plan.
    Type: Application
    Filed: November 13, 2014
    Publication date: May 19, 2016
    Inventors: Christoph Weyerhaeuser, Tobias Mindnich, Johannes Merx, Yongsik Yoon, Sung Heun Wi, Jeong Ae Han
  • Publication number: 20150363463
    Abstract: A query is received by a database server from a remote application server. The query is associated with a calculation scenario that defines a data flow model that includes one or more nodes that each define one or more operations for execution by a calculation engine on the database server. Thereafter, the database server instantiates a runtime model of the calculation scenario based on its nodes. Subsequently, one or more of the nodes are identified as being convertible into a relational database format. These nodes are then used to form a container node. An execution plan of the runtime model of the calculation scenario, including the container node, is built and executed by the database server to produce a data set, which the database server provides to the application server.
    Type: Application
    Filed: June 17, 2014
    Publication date: December 17, 2015
    Inventors: Tobias Mindnich, Jeong Ae Han, Johannes Merx, Christoph Weyerhaeuser, Yongsik Yoon, Sung Heun Wi
  • Publication number: 20150347410
    Abstract: Embodiments relate to view caching techniques that cache, for a limited time, some of the (intermediate) results of a previous query execution, in order to avoid expensive re-computation of query results. Particular embodiments may utilize a cache manager to determine whether information relevant to a subsequent user request can be satisfied by an existing cache instance or view, or whether creation of an additional cache instance is appropriate. At design time, cache-defining columns of a view are defined, with user input parameters automatically being cache-defining. Cache instances are created for each tuple of literals for the cache-defining columns, and for each explicit or implicit group-by clause. Certain embodiments may feature enhanced reuse between cache instances, in order to limit memory footprint. Over time, cache instances may be evicted from memory based upon implementation of a policy such as a Least Recently Used (LRU) strategy.
    Type: Application
    Filed: June 3, 2014
    Publication date: December 3, 2015
    Applicant: SAP AG
    Inventors: Ki Hong Kim, Norman May, Alexander Boehm, Sung Heun Wi, Jeong Ae Han, Sang Il Song, Yongsik Yoon