NeuroEvolution of Augmenting Topologies, or NEAT, may be what you are referring to. The original paper by Kenneth O. Stanley is here.
NEAT combines a neural network with a genetic algorithm. Instead of using backpropagation or gradient descent to "train" your network, NEAT creates a population of very simple neural networks (no hidden nodes) and evolves them through fitness evaluation, crossover, and mutation. The genome encoding: every connection gene has five fields — in-node, out-node, connection weight, an enabled/disabled flag, and an innovation number. The in-node, out-node, and weight values mean the same thing as in a regular neural network. Enabled and disabled genes are, well, enabled and disabled: a disabled gene stays in the genome but is not expressed in the network. The innovation number is possibly the most defining feature of NEAT, since it allows crossover between different topologies and gives historical tracking of each connection. NEAT can mutate both its weights and its connections.
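A minimal sketch of that gene layout in Python (the field names are my own; the paper describes the five values but not a concrete data structure):

```python
from dataclasses import dataclass

@dataclass
class ConnectionGene:
    in_node: int      # source node id
    out_node: int     # destination node id
    weight: float     # connection weight
    enabled: bool     # disabled genes are kept in the genome but not expressed
    innovation: int   # global historical marker for this connection

# A genome is essentially a list of connection genes
# (node genes omitted here for brevity).
genome = [
    ConnectionGene(in_node=1, out_node=4, weight=0.5, enabled=True, innovation=1),
    ConnectionGene(in_node=2, out_node=4, weight=-1.2, enabled=False, innovation=2),
]
```

Because every new connection anywhere in the population gets a fresh innovation number, two genomes can later be lined up gene-by-gene even if their topologies have diverged.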
For example, suppose Parent1 and Parent2 share 5 of the same connections, represented by innovation numbers 1 through 5. Since these matching genes describe the same connections, the genetic algorithm will randomly pick either Parent1's weight or Parent2's weight for each of them. The excess and disjoint genes — those whose innovation numbers appear in only one genome — are inherited from the more fit parent. NEAT will then mutate each genome, shown in the image below.
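That crossover rule can be sketched like this (a simplification of my own: each genome is reduced to a dict mapping innovation number to weight; real implementations carry full gene records and handle re-enabling of disabled genes):

```python
import random

def crossover(parent1, parent2, parent1_is_fitter):
    # Matching genes (same innovation number in both parents) are picked
    # randomly from either parent; disjoint and excess genes are inherited
    # from the fitter parent only.
    fitter, other = (parent1, parent2) if parent1_is_fitter else (parent2, parent1)
    child = {}
    for innov, weight in fitter.items():
        if innov in other:
            child[innov] = random.choice([weight, other[innov]])  # matching gene
        else:
            child[innov] = weight  # disjoint or excess gene from fitter parent
    return child

def mutate_weights(genome, rate=0.8, scale=0.1):
    # Perturb each connection weight with some probability.
    return {innov: w + random.gauss(0, scale) if random.random() < rate else w
            for innov, w in genome.items()}

p1 = {1: 0.5, 2: -0.3, 3: 0.8, 4: 0.1, 5: 0.9, 8: 0.2}  # innovation 8 is excess
p2 = {1: 0.4, 2: -0.1, 3: 0.7, 4: 0.6, 5: 0.2, 6: 0.3}  # innovation 6 is disjoint
child = crossover(p1, p2, parent1_is_fitter=True)
# Innovations 1-5 come from either parent, 8 from p1; p2's gene 6 is dropped.
child = mutate_weights(child)
```

Topology mutations (adding a new node or a new connection, each stamped with a fresh innovation number) would be applied alongside the weight mutation above.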