Showing items 1-10 of 201
On the determination of epsilon during discriminative GMM training
(2010-12-01)
Discriminative training of Gaussian Mixture Models (GMMs) for speech or speaker recognition purposes is usually based on the gradient descent method, in which the iteration step-size, ε, is typically defined experimentally. ...
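To make the role of the step-size ε concrete, here is a minimal gradient-descent sketch in Python (NumPy); the toy objective, parameter names, and data are illustrative assumptions and do not reproduce the paper's actual discriminative criterion (e.g. MMI or MCE) for GMMs.

    import numpy as np

    def gradient_descent_step(params, grad_fn, epsilon):
        # One plain gradient-descent update: theta <- theta - epsilon * grad(theta).
        # epsilon is the iteration step-size that, as the abstract notes, is
        # usually chosen experimentally; here it is simply a fixed constant.
        return params - epsilon * grad_fn(params)

    # Toy objective: squared distance of a single Gaussian mean to data points.
    data = np.array([[0.5, 1.0], [1.5, 0.0], [1.0, 2.0]])
    grad_fn = lambda m: -2.0 * np.sum(data - m, axis=0)

    mean = np.zeros(2)
    for _ in range(200):
        mean = gradient_descent_step(mean, grad_fn, epsilon=0.01)
    print(mean)  # approaches the data centroid [1.0, 1.0]

Choosing ε too large makes the iteration diverge and choosing it too small makes it crawl, which is why a principled way of setting it, the subject of the entry above, matters.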
Application of Word2vec and stochastic gradient descent to machine translation
(2016-05-30)
Word2vec is a neural-network-based system that processes text and represents words as vectors, using a distributed representation. A notable property is the semantic relations found in the models ...
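For reference, one simplified SGD update for a word2vec-style skip-gram model is sketched below; the sigmoid scoring of a single (center, context) pair is a common formulation, but the vocabulary size, vector dimension, and all names here are illustrative assumptions rather than the thesis's actual setup, and negative sampling is omitted.

    import numpy as np

    def sgd_skipgram_step(W_in, W_out, center, context, lr=0.025):
        # One SGD step that increases log sigmoid(v_center . u_context),
        # pulling the two word vectors together; real word2vec additionally
        # uses negative sampling or hierarchical softmax, omitted here.
        v = W_in[center].copy()
        u = W_out[context].copy()
        score = 1.0 / (1.0 + np.exp(-np.dot(v, u)))   # sigmoid(v . u)
        coef = lr * (1.0 - score)                     # step * d/dz log sigmoid(z)
        W_in[center]   += coef * u
        W_out[context] += coef * v

    rng = np.random.default_rng(0)
    W_in  = rng.normal(scale=0.1, size=(1000, 50))    # "input" word vectors
    W_out = rng.normal(scale=0.1, size=(1000, 50))    # "output" word vectors
    sgd_skipgram_step(W_in, W_out, center=3, context=17)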
A numerical descent method for an inverse problem of a scalar conservation law modelling sedimentation
(SPRINGER, 2008)
This contribution presents a numerical descent method for the identification of parameters in the flux function of a scalar nonlinear conservation law when the solution at a fixed time is known. This problem occurs in a ...
A numerical descent method for an inverse problem of a scalar conservation law modelling sedimentation
(SPRINGER, 2006)
This contribution presents a numerical descent method for the identification of parameters in the flux function of a scalar nonlinear conservation law when the solution at a fixed time is known. This problem occurs in a ...
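Read abstractly, the identification problem in the two entries above amounts to fitting the flux parameters to the solution observed at the fixed time, which a descent method attacks iteratively; the LaTeX sketch below uses generic notation (u for the solution, f_\theta for the parametrized flux, T for the observation time, \varepsilon_k for the step length) assumed here for illustration rather than taken from the paper.

    % Parameter identification for a scalar conservation law, generic notation.
    \[
      u_t + f_\theta(u)_x = 0, \qquad u(x,0) = u_0(x),
    \]
    \[
      J(\theta) = \tfrac12 \int \bigl( u(x,T;\theta) - u^{\mathrm{obs}}(x) \bigr)^2 \, dx,
      \qquad
      \theta^{k+1} = \theta^{k} - \varepsilon_k \, \nabla_\theta J(\theta^{k}).
    \]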
A Robust Predictive Speed Control for SPMSM Systems Using a Sliding Mode Gradient Descent Disturbance Observer
(2023-03-01)
This paper proposes a sliding mode gradient descent disturbance observer-based adaptive reaching law sliding mode predictive speed control (GD-SMPC+ARL) for surface-mounted permanent magnet synchronous motor (SPMSM) systems ...
A continuous-time model of stochastic gradient descent: convergence rates and complexities under Lojasiewicz inequality
(Universidad de Chile, 2021)
In this thesis we study the convergence rates and complexities of a continuous model of the Stochastic Gradient Descent (SGD) under convexity, strong convexity and Łojasiewicz assumptions, the latter being a way to generalize ...
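For orientation, the discrete SGD recursion, a standard continuous-time (gradient-flow) analogue, and the gradient form of the Łojasiewicz inequality are recalled below in generic notation; the thesis's precise continuous model, noise structure, and exponent are not reproduced here.

    % Discrete SGD, its noisy gradient-flow analogue, and the Lojasiewicz inequality.
    \[
      x_{k+1} = x_k - \gamma_k \bigl( \nabla f(x_k) + \xi_k \bigr),
      \qquad \mathbb{E}[\xi_k \mid x_k] = 0,
    \]
    \[
      \dot{X}_t = -\nabla f(X_t) + \text{noise},
    \]
    \[
      \|\nabla f(x)\| \;\ge\; c \,\bigl( f(x) - f(x^\ast) \bigr)^{\theta}
      \quad \text{for some } c > 0,\ \theta \in [\tfrac12, 1).
    \]
    % The case theta = 1/2 is the Polyak-Lojasiewicz inequality, which relaxes
    % strong convexity while still permitting convergence-rate analysis.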
Training of neural networks with a multilayer perceptron architecture on FPGA
(Florianópolis, SC, 2019-07-22)
This undergraduate final project presents a Field Programmable Gate Array (FPGA) implementation of a system responsible for the on-chip training of neural networks with a Multilayer Perceptron (MLP) architecture. The ...
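As a software point of reference for what the on-chip training has to compute (the thesis itself targets FPGA hardware), a minimal one-hidden-layer MLP trained by gradient descent is sketched below; the architecture, activations, data, and learning rate are illustrative choices, not the ones used in the work.

    import numpy as np

    # Minimal one-hidden-layer MLP trained with full-batch gradient descent
    # (software reference only; the cited work performs training on-chip in an FPGA).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))                        # toy inputs
    y = (X[:, 0] * X[:, 1] > 0).astype(float)[:, None]   # toy XOR-like labels

    W1, b1 = rng.normal(scale=0.5, size=(2, 8)), np.zeros(8)
    W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)
    lr = 0.5

    for _ in range(2000):
        h = np.tanh(X @ W1 + b1)                         # hidden layer
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))         # output probability
        d_out = (p - y) / len(X)                         # grad of mean cross-entropy
        dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
        d_h = (d_out @ W2.T) * (1.0 - h ** 2)            # backprop through tanh
        dW1, db1 = X.T @ d_h, d_h.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print("training accuracy:", float(((p > 0.5) == y).mean()))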