Given some representation of a number (e.g. as a vector of its binary digits, least-significant bit first, so that the number 5 is represented as
[1, 0, 1, 0, 0, 0, 0, ...]), the neural network should learn to return its square — 25 in this case.
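To make the encoding concrete, here is a minimal sketch of what I have in mind (the helper name `to_binary_vector` and the fixed width of 8 bits are just my own choices for illustration):

```python
def to_binary_vector(n, width=8):
    """Encode a non-negative integer as a fixed-width bit vector, LSB first."""
    return [(n >> i) & 1 for i in range(width)]

print(to_binary_vector(5))   # -> [1, 0, 1, 0, 0, 0, 0, 0]
print(to_binary_vector(25))  # -> [1, 0, 0, 1, 1, 0, 0, 0]
```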
If I could implement $y = x^2$, I could probably implement $y = x^3$ and, more generally, any polynomial of $x$; with a Taylor series I could then approximate $y = \sin(x)$, which would solve the Fizz Buzz problem — a neural network that can compute the remainder of a division.
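The last step of that chain can be sketched numerically: $\sin(\pi x / n)$ vanishes exactly when $n$ divides $x$, so a network that could approximate $\sin$ could in principle test divisibility. A quick check (the tolerance `eps` is an arbitrary choice of mine):

```python
import math

def divisible(x, n, eps=1e-9):
    """True when n divides x, via the zero set of sin(pi * x / n)."""
    return abs(math.sin(math.pi * x / n)) < eps

print([x for x in range(1, 16) if divisible(x, 3)])  # -> [3, 6, 9, 12, 15]
```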
Clearly, the linear part of a neural network alone cannot perform this task, so if multiplication is possible at all, it must come from the activation function.
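As a sanity check that the nonlinearity can indeed carry the curvature, here is a minimal sketch (all hyperparameters — 16 tanh units, learning rate 0.1, 2000 full-batch gradient steps — are arbitrary choices of mine, not a claim about the best setup) fitting $y = x^2$ on $[-1, 1]$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(256, 1))
y = x ** 2

# One hidden layer of 16 tanh units; without tanh the whole net is affine.
W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)
lr = 0.1

for step in range(2000):
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    loss = np.mean(err ** 2)
    # Manual backpropagation of the mean-squared error.
    g_pred = 2 * err / len(x)
    g_W2 = h.T @ g_pred; g_b2 = g_pred.sum(0)
    g_h = g_pred @ W2.T
    g_z = g_h * (1 - h ** 2)          # derivative of tanh
    g_W1 = x.T @ g_z; g_b1 = g_z.sum(0)
    W2 -= lr * g_W2; b2 -= lr * g_b2
    W1 -= lr * g_W1; b1 -= lr * g_b1

print(loss)  # should end up well below the ~0.089 MSE of the best constant predictor
```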
Can you suggest any ideas or reading on the subject?