Does this notation mean vector-concatenation?


When reading papers on neural networks, I occasionally stumble upon the following notation with a semicolon:

$$ \text{tanh}(\mathbf{W_c}[\mathbf{c}_t;\mathbf{h}_t]) $$

Unless otherwise noted, does this by default mean the following:

  • vector $\mathbf{c}_t$ is concatenated with vector $\mathbf{h}_t$ (one appended to the other)
  • the resulting longer vector is multiplied by the weight matrix $\mathbf{W}_c$
  • finally, the resulting vector is passed component-wise through the hyperbolic tangent function
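
To make my interpretation concrete, here is a minimal NumPy sketch of the three steps above. The dimensions and the weight values are placeholders I chose for illustration, not anything from a specific paper:

```python
import numpy as np

# Hypothetical dimensions: c_t and h_t are length-3 vectors,
# so the concatenation [c_t; h_t] has length 6, and W_c maps
# that length-6 vector back down to length 3.
c_t = np.array([1.0, 2.0, 3.0])
h_t = np.array([4.0, 5.0, 6.0])

concat = np.concatenate([c_t, h_t])   # [c_t; h_t], shape (6,)
W_c = np.full((3, 6), 0.1)            # placeholder weight matrix
out = np.tanh(W_c @ concat)           # component-wise tanh of W_c [c_t; h_t]

print(concat.shape)  # (6,)
print(out.shape)     # (3,)
```

If this reading is right, the semicolon is just stacking the two vectors, and the shapes must agree: $\mathbf{W}_c$ has as many columns as $\mathbf{c}_t$ and $\mathbf{h}_t$ have entries combined.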

The first bullet point is my main question. Googling for "vector concatenation notation" doesn't return results that resemble the notation shown above. However, many papers seem to use it.

[Image: an example of the notation I found via Google]


Posted 2018-04-13T16:01:27.820


Yes. Concatenation is the richest aggregation of mathematical objects. As opposed to, say, averaging. – Emre – 2018-04-13T16:17:38.543

No answers