State Street has filed a patent for an apparatus that integrates neural networks into optimization workflows. The apparatus converts input values into function data structures and evaluates each node of the neural network, with the aim of improving the efficiency and effectiveness of neural networks. GlobalData’s report on State Street gives a 360-degree view of the company, including its patenting strategy.
According to GlobalData’s company profile on State Street, grid computing was a key innovation area identified from its patents. State Street's grant share as of June 2023 was 1%. Grant share is the ratio of granted patents to the total number of patent filings.
An apparatus for converting input values into function data structures
A recently filed patent (Publication Number: US20230206037A1) describes an apparatus and system for implementing neural networks using function data structures. The apparatus includes memory and logic circuitry that converts input values for an input layer of the neural network into function data structures representing constants and formulas with variables. These function data structures can be represented using tree-based or graph-based data structures, such as directed acyclic graphs (DAGs). The DAGs consist of various vertex types, including sum operations, arithmetic operations, variables, constant values, activation functions, comparison operations, division operations, and subtraction operations.
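The idea of representing a formula as a DAG of typed vertices can be sketched in code. The following is a minimal, hypothetical illustration of such function data structures, not the patent's actual implementation: the class names (`Const`, `Var`, `Mul`, `Sum`, `Relu`) are illustrative stand-ins for the constant, variable, arithmetic, sum, and activation vertex types the patent describes, and a single neuron is built as a small graph of them.

```python
# Hypothetical sketch of function data structures as a DAG of typed vertices.
# Class names are illustrative, not taken from the patent.
from dataclasses import dataclass

class Vertex:
    def eval(self, env):
        raise NotImplementedError

@dataclass
class Const(Vertex):
    value: float
    def eval(self, env):
        return self.value  # constant-value vertex

@dataclass
class Var(Vertex):
    name: str
    def eval(self, env):
        return env[self.name]  # variable vertex; value supplied at eval time

@dataclass
class Mul(Vertex):
    left: Vertex
    right: Vertex
    def eval(self, env):
        return self.left.eval(env) * self.right.eval(env)  # arithmetic vertex

@dataclass
class Sum(Vertex):
    terms: list
    def eval(self, env):
        return sum(t.eval(env) for t in self.terms)  # sum-operation vertex

@dataclass
class Relu(Vertex):
    child: Vertex
    def eval(self, env):
        return max(0.0, self.child.eval(env))  # activation-function vertex

# One neuron, relu(w1*x1 + w2*x2 + b): weights and bias are constants,
# inputs stay symbolic so the same graph can be re-evaluated with new values.
neuron = Relu(Sum([
    Mul(Const(0.5), Var("x1")),
    Mul(Const(-0.25), Var("x2")),
    Const(0.1),
]))

print(neuron.eval({"x1": 2.0, "x2": 4.0}))  # relu(1.0 - 1.0 + 0.1) = 0.1
```

Because the formula is held as a data structure rather than compiled code, the same graph can be inspected, transformed, or re-evaluated under different variable bindings, which is what makes the representation useful downstream.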
The patent also describes a system that includes processing circuitry, memory, a communications interface, and data storage. The processing circuitry, which can be one or more processors residing in servers, executes code stored in memory and data storage. The code, which is distributed between the memory and data storage, enables the functionality described in the patent claims.
Additionally, the patent discusses the use of non-transitory storage mediums containing instructions that, when executed by a processor, perform the operations described in the patent claims. These operations involve evaluating nodes in the neural network, performing mathematical operations, and applying activation functions. The result of these operations can be used as an objective function or constraint in an optimizer or limit analyzer.
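The last step above, feeding the evaluated result to an optimizer as an objective, can also be sketched. This is a hedged illustration under assumed names, not the patent's method: the vertex classes mirror the subtraction and arithmetic vertex types mentioned earlier, and a plain grid scan stands in for whatever optimizer or limit analyzer would consume the objective in practice.

```python
# Hypothetical sketch: a function DAG used as the objective of an optimizer.
# Class names and the grid-scan "optimizer" are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Const:
    value: float
    def eval(self, env):
        return self.value

@dataclass
class Var:
    name: str
    def eval(self, env):
        return env[self.name]

@dataclass
class Sub:
    left: object
    right: object
    def eval(self, env):
        return self.left.eval(env) - self.right.eval(env)  # subtraction vertex

@dataclass
class Mul:
    left: object
    right: object
    def eval(self, env):
        return self.left.eval(env) * self.right.eval(env)  # arithmetic vertex

# Objective (x - 3)^2 as a DAG. The Sub vertex is shared by both factors of
# Mul -- exactly the kind of sharing a DAG permits that a plain tree does not.
diff = Sub(Var("x"), Const(3.0))
objective = Mul(diff, diff)

# Minimal stand-in optimizer: scan candidate x values, keep the minimizer.
best_x = min((i * 0.1 for i in range(-100, 101)),
             key=lambda x: objective.eval({"x": x}))
print(best_x)  # close to 3.0, the true minimizer
```

The same evaluated graph could equally serve as a constraint: a limit analyzer would compare `objective.eval(env)` against a bound instead of minimizing it.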
Overall, the patent presents an apparatus and system for implementing neural networks as function data structures, with directed acyclic graphs and a varied set of vertex types providing a flexible way to represent and evaluate network nodes. The supporting architecture of processing circuitry, memory, and data storage distributes the code and its functionality across servers, while non-transitory storage mediums carry the instructions that perform the described operations.