import numpy as np

class RNN:
    # ...
    def step(self, x):
        # update the hidden state
        self.h = np.tanh(np.dot(self.W_hh, self.h) + np.dot(self.W_xh, x))
        # compute the output vector
        y = np.dot(self.W_hy, self.h)
        return y
The np.tanh (hyperbolic tangent) function implements a non-linearity
that squashes the activations to the range [-1, 1]. The input vector x is
multiplied by the W_xh matrix using the NumPy dot product. That result is
added to the dot product of the current hidden state with the W_hh matrix,
and the sum is squashed through tanh to produce the new hidden state.
Finally, the output y is computed by multiplying the hidden state by the
W_hy matrix and returned.
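The class above omits initialization. The following is a minimal runnable sketch: the constructor, the weight initialization (small random values), and the dimension names (input_size, hidden_size, output_size) are assumptions added for illustration, not part of the original code.

```python
import numpy as np

class RNN:
    def __init__(self, input_size, hidden_size, output_size):
        # hidden state starts at zero and persists across calls to step()
        self.h = np.zeros(hidden_size)
        # small random weights (initialization scheme is an assumption)
        self.W_hh = np.random.randn(hidden_size, hidden_size) * 0.01
        self.W_xh = np.random.randn(hidden_size, input_size) * 0.01
        self.W_hy = np.random.randn(output_size, hidden_size) * 0.01

    def step(self, x):
        # update the hidden state
        self.h = np.tanh(np.dot(self.W_hh, self.h) + np.dot(self.W_xh, x))
        # compute the output vector
        y = np.dot(self.W_hy, self.h)
        return y

rnn = RNN(input_size=10, hidden_size=32, output_size=10)
x = np.zeros(10)
x[3] = 1.0  # a one-hot input vector
y = rnn.step(x)  # y has shape (10,); calling step again uses the updated state
```

Because self.h is updated in place, feeding a sequence of vectors through repeated step() calls lets each output depend on the entire history of inputs, which is the defining property of a recurrent network.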
file: /Techref/method/ai/natural_language.htm, 7KB, , updated: 2023/6/29 19:52, local time: 2024/2/27 19:10, owner: JMN-EFP-786, |