In the spirit of the famous TensorFlow Fizz Buzz joke and the XOR problem, I started to wonder whether it's possible to design a neural network that implements the function $y = x^2$.

Given some representation of a number (e.g. as a vector in binary form, so that the number `5` is represented as `[1,0,1,0,0,0,0,...]`), the neural network should learn to return its square, 25 in this case.

If I could implement $y=x^2$, I could probably implement $y=x^3$ and, more generally, any polynomial of $x$; then, via Taylor series, I could approximate $y=\sin(x)$, which would solve the Fizz Buzz problem: a neural network that can find the remainder of a division.
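To sanity-check the Taylor idea numerically (this is just my own illustration, not code from any particular library): the partial sum tracks $\sin(x)$ very well near 0 but diverges badly far from 0, so the remainder-finding scheme would also need some form of range reduction.

```python
import math

def taylor_sin(x, terms=4):
    """Partial Taylor series of sin around 0: x - x^3/3! + x^5/5! - ..."""
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(terms))

# Near 0 the polynomial is an excellent approximation...
err_near = abs(taylor_sin(1.0) - math.sin(1.0))   # tiny

# ...but far from 0 the truncated series blows up.
err_far = abs(taylor_sin(10.0) - math.sin(10.0))  # huge
```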

Clearly, the linear part of a NN alone can't perform this task, so if multiplication is achievable at all, it would have to happen thanks to the activation function.
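To illustrate the point about activation functions, here is a minimal sketch (my own hand-constructed weights, not a trained model): a one-hidden-layer ReLU network whose weights are chosen so that it computes the piecewise-linear interpolant of $x^2$ on $[0,1]$. It shows that nonlinearity alone already lets a shallow network approximate the square function, with error shrinking as more hidden units (knots) are added.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Knots of a piecewise-linear interpolant of g(x) = x^2 on [0, 1].
knots = np.linspace(0.0, 1.0, 5)           # 4 segments = 4 hidden ReLU units
vals = knots ** 2
slopes = np.diff(vals) / np.diff(knots)    # slope of g's interpolant on each segment

def net(x):
    """One-hidden-layer ReLU network computing the interpolant of x^2."""
    y = vals[0] + slopes[0] * relu(x - knots[0])
    for j in range(1, len(slopes)):
        # Each hidden unit adds the change in slope at its knot.
        y = y + (slopes[j] - slopes[j - 1]) * relu(x - knots[j])
    return y

xs = np.linspace(0.0, 1.0, 101)
err = np.max(np.abs(net(xs) - xs ** 2))    # worst-case gap to x^2 on the grid
```

With segment width $h = 0.25$ the interpolation error is bounded by $h^2 \max|g''|/8 = 0.015625$, and halving $h$ quarters the error, so this says nothing about *learning* or generalizing beyond the interval, only about representational capacity.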

Can you suggest any ideas or reading on the subject?

Thank you very much, this is exactly what I was asking for! – Boris Burkov – 2019-03-22T13:23:28.587

Although true, it's a very bad idea to learn that. I fail to see where any generalization power would arise from. NNs shine when there's something to generalize, like CNNs for vision that capture patterns, or RNNs that can capture trends. – Jeffrey – 2019-03-22T15:21:12.363