Boltzmann machines for timeseries

We review Boltzmann machines extended for time series. These models often have recurrent structure, and backpropagation through time (BPTT) is used to learn their parameters. The per-step computational complexity of BPTT in online learning, however, grows linearly with the length of the preceding time series (i.e., the learning rule is not local in time), which limits the applicability of BPTT to online learning. We then review dynamic Boltzmann machines (DyBMs), whose learning rule is local in time. The DyBM's learning rule relates to spike-timing dependent plasticity (STDP), which has been postulated and experimentally confirmed for biological neural networks.
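A minimal sketch of the complexity contrast described above, assuming the DyBM-style "local in time" update is implemented with a decaying eligibility trace (the `decay`, `lr`, and `trace` names here are illustrative, not from the paper): a BPTT-style gradient at step t must revisit all t preceding inputs, while a trace-based update folds the past into one running summary and does constant work per step.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000          # length of the toy time series
decay = 0.9       # hypothetical eligibility-trace decay rate
lr = 0.01         # hypothetical learning rate

x = rng.integers(0, 2, size=T).astype(float)  # toy binary spike train

def bptt_step_cost(t):
    # BPTT-style: the gradient at step t touches all preceding inputs
    # x[0..t], so per-step cost grows linearly with t.
    return t + 1

# Eligibility-trace style: keep a single running summary of the past,
# updated in O(1) per step regardless of how long the history is.
trace = 0.0
w = 0.0
for t in range(T):
    w += lr * x[t] * trace        # local update: uses only x[t] and the trace
    trace = decay * trace + x[t]  # fold the new observation into the trace
```

Here `bptt_step_cost(999)` is 1000, whereas the trace-based loop did one bounded update at that step; this is the sense in which the per-step cost of BPTT grows with the preceding series while the local rule's does not.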
by Takayuki Osogami
https://arxiv.org/pdf/1708.06004v1.pdf