Patents by Inventor Chenfan Sun

Chenfan Sun has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240095541
    Abstract: Embodiments relate to compiling neural network operations into tasks that may be performed in a streaming manner by a neural processor. In a streaming operation, a tensor is spatially partitioned, and tasks associated with two or more layers of the neural network are performed simultaneously in an overlapping manner. To enable efficient memory usage during streaming operation, a subset of the tasks having completion times close in time are assigned to a same portion of memory in the neural processor during a compilation process. After the tasks assigned to the same portion of the memory are finished, the portion of the memory may be flushed to make space for subsequent tasks. Multiple tasks may also be coalesced into a single task to reduce the number of tasks and more efficiently perform the operations at the neural processor.
    Type: Application
    Filed: September 16, 2022
    Publication date: March 21, 2024
    Inventors: Sayyed Karen Khatamifard, Thomas G. Anderl, Alexander J. Kirchhoff, Keith Wyss, Dylan H. Rush, Chenfan Sun, Jeffrey D. Marker
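    The memory-assignment idea in this abstract can be illustrated with a minimal sketch: tasks whose completion times fall within a small window share one memory region, and each region can be flushed once its last task finishes. The function name, the window parameter, and the grouping rule below are illustrative assumptions, not the patented scheme.

    ```python
    # Hypothetical sketch: group tasks whose completion times are close
    # so they can share one portion of memory, flushed as a unit.
    def assign_memory_regions(tasks, window):
        """tasks: list of (name, completion_time) pairs.
        window: max spread of completion times allowed to share a region.
        Returns a list of regions, each a list of task names."""
        regions, current, base = [], [], None
        for name, t in sorted(tasks, key=lambda x: x[1]):
            if not current:
                current, base = [name], t          # start a new region
            elif t - base <= window:
                current.append(name)               # close in time: share memory
            else:
                regions.append(current)            # region done; can be flushed
                current, base = [name], t
        if current:
            regions.append(current)
        return regions

    tasks = [("conv1a", 0.9), ("conv1b", 1.0), ("relu1", 1.1),
             ("conv2", 3.0), ("pool2", 3.2)]
    print(assign_memory_regions(tasks, window=0.5))
    # -> [['conv1a', 'conv1b', 'relu1'], ['conv2', 'pool2']]
    ```

    Coalescing, in this toy view, would correspond to merging the tasks inside one region into a single task before execution.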
  • Publication number: 20230394276
    Abstract: Embodiments relate to streaming convolution operations in a neural processor circuit that includes a neural engine circuit and a neural task manager. The neural task manager obtains multiple task descriptors and multiple subtask descriptors. Each task descriptor identifies a respective set of the convolution operations of a respective layer of a set of layers. Each subtask descriptor identifies a corresponding task descriptor and a subset of the convolution operations on a portion of a layer of the set of layers identified by the corresponding task descriptor. The neural processor circuit configures the neural engine circuit for execution of the subset of the convolution operations using the corresponding task descriptor. The neural engine circuit performs the subset of the convolution operations to generate output data that correspond to input data of another subset of the convolution operations identified by another subtask descriptor from the list of subtask descriptors.
    Type: Application
    Filed: June 6, 2022
    Publication date: December 7, 2023
    Inventors: Sayyed Karen Khatamifard, Chenfan Sun, Alon Yaakov, Husam Khashiboun, Jeffrey D. Marker, Saman Naderiparizi, Ramana V. Rachakonda, Rohit K. Gupta
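    The task/subtask relationship described above can be sketched with two small records: a task descriptor per layer, and subtask descriptors that each point at one task descriptor plus the portion (tile) of that layer's convolutions to run. The dataclass fields and interleaved execution order below are illustrative assumptions, not the disclosed hardware format.

    ```python
    # Hypothetical sketch of task vs. subtask descriptors for streaming
    # convolutions; fields and names are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class TaskDescriptor:
        layer: str        # layer whose convolution operations this identifies
        num_tiles: int    # spatial portions the layer is split into

    @dataclass
    class SubtaskDescriptor:
        task: TaskDescriptor  # corresponding task descriptor
        tile: int             # which portion of the layer to execute

    def run_streaming(subtasks):
        """Execute subtasks in order; interleaving tiles across layers lets
        one tile's output feed the next layer before the layer is complete."""
        return [f"{st.task.layer}[tile {st.tile}]" for st in subtasks]

    l0 = TaskDescriptor("conv0", num_tiles=2)
    l1 = TaskDescriptor("conv1", num_tiles=2)
    order = [SubtaskDescriptor(l0, 0), SubtaskDescriptor(l1, 0),
             SubtaskDescriptor(l0, 1), SubtaskDescriptor(l1, 1)]
    print(run_streaming(order))
    # -> ['conv0[tile 0]', 'conv1[tile 0]', 'conv0[tile 1]', 'conv1[tile 1]']
    ```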
  • Publication number: 20230368008
    Abstract: Embodiments relate to streaming operations in a neural processor circuit that includes a neural engine circuit and a data processor circuit. The neural engine circuit performs first operations on a first input tensor of a first layer to generate a first output tensor, and second operations on a second input tensor of a second layer at a higher hierarchy than the first layer, the second input tensor corresponding to the first output tensor. The data processor circuit stores a portion of the first input tensor for access by the neural engine circuit to perform a subset of the first operations and generate a portion of the first output tensor. The data processor circuit stores the portion of the first output tensor for access by the neural engine circuit as a portion of the second input tensor to perform a subset of the second operations.
    Type: Application
    Filed: May 16, 2022
    Publication date: November 16, 2023
    Inventors: Sayyed Karen Khatamifard, Alexander J. Kirchhoff, Rohit K. Gupta, Jeffrey D. Marker, Thomas G. Anderl, Saman Naderiparizi, Chenfan Sun, Alon Yaakov, Husam Khashiboun, Ramana V. Rachakonda
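    The streaming pattern in this abstract can be reduced to a toy loop: the data processor buffers one slice of the first layer's input, the engine produces the matching slice of the first layer's output, and that slice is immediately consumed as second-layer input. The two stand-in layer functions and the slicing scheme below are assumptions for illustration only.

    ```python
    # Hypothetical sketch: per-slice streaming of two stacked layers,
    # so layer 2 starts before layer 1 has processed the whole tensor.
    def layer1(slice_):                  # stand-in for first-layer operations
        return [x * 2 for x in slice_]

    def layer2(slice_):                  # stand-in for second-layer operations
        return [x + 1 for x in slice_]

    def stream(tensor, slice_size):
        out = []
        for i in range(0, len(tensor), slice_size):
            buf = tensor[i:i + slice_size]  # data processor stores an input portion
            mid = layer1(buf)               # engine runs a subset of layer-1 ops
            out.extend(layer2(mid))         # same portion feeds layer 2 right away
        return out

    print(stream([1, 2, 3, 4], slice_size=2))
    # -> [3, 5, 7, 9]
    ```

    Only one slice of intermediate data needs to live in the data processor at a time, which is the memory benefit the abstract describes.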
  • Patent number: 11657124
    Abstract: In one embodiment, a method includes receiving a user request from a client device associated with a user, accessing an instructional file comprising one or more binary inference engines and one or more encrypted model data corresponding to the one or more binary inference engines, respectively, selecting a binary inference engine from the one or more binary inference engines in the accessed instructional file based on the user request, sending a validation request for a permission to execute the binary inference engine to a licensing server, receiving the permission from the licensing server, decrypting the encrypted model data corresponding to the binary inference engine by a decryption key, executing the binary inference engine based on the user request and the decrypted model data, and sending one or more execution results responsive to the execution of the binary inference engine to the client device.
    Type: Grant
    Filed: December 10, 2018
    Date of Patent: May 23, 2023
    Assignee: Apple Inc.
    Inventors: Peter Zatloukal, Matthew Weaver, Alexander Kirchhoff, Dmitry Belenko, Ali Farhadi, Mohammad Rastegari, Andrew Luke Chronister, Keith Patrick Wyss, Chenfan Sun
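    The claimed request flow can be sketched end to end: select a binary inference engine from an instructional file, ask a licensing server for permission, decrypt the engine's model data, execute, and return the result. Everything below is a stand-in, assuming a dict-backed "server", an XOR placeholder cipher, and a callable engine; none of it is the patented implementation.

    ```python
    # Hypothetical sketch of the licensed-inference flow described above.
    def validate_license(engine_name, licenses):
        # stand-in for the round trip to the licensing server
        return licenses.get(engine_name, False)

    def decrypt(data, key):
        # placeholder cipher only; the patent does not specify one
        return bytes(b ^ key for b in data)

    def serve_request(request, instructional_file, licenses, key):
        engine = instructional_file["engines"][request["engine"]]
        if not validate_license(request["engine"], licenses):
            raise PermissionError("license server denied execution")
        model = decrypt(instructional_file["models"][request["engine"]], key)
        return engine(request["input"], model)   # execution result for the client

    file_ = {
        "engines": {"echo": lambda x, m: f"{m.decode()}:{x}"},
        "models": {"echo": bytes(b ^ 7 for b in b"model-v1")},  # "encrypted"
    }
    print(serve_request({"engine": "echo", "input": "hi"},
                        file_, {"echo": True}, key=7))
    # -> model-v1:hi
    ```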
  • Publication number: 20200184037
    Abstract: In one embodiment, a method includes receiving a user request from a client device associated with a user, accessing an instructional file comprising one or more binary inference engines and one or more encrypted model data corresponding to the one or more binary inference engines, respectively, selecting a binary inference engine from the one or more binary inference engines in the accessed instructional file based on the user request, sending a validation request for a permission to execute the binary inference engine to a licensing server, receiving the permission from the licensing server, decrypting the encrypted model data corresponding to the binary inference engine by a decryption key, executing the binary inference engine based on the user request and the decrypted model data, and sending one or more execution results responsive to the execution of the binary inference engine to the client device.
    Type: Application
    Filed: December 10, 2018
    Publication date: June 11, 2020
    Inventors: Peter Zatloukal, Matthew Weaver, Alexander Kirchhoff, Dmitry Belenko, Ali Farhadi, Mohammad Rastegari, Andrew Luke Chronister, Keith Patrick Wyss, Chenfan Sun