
I'm building NEAT (NeuroEvolution of Augmenting Topologies) and I'm looking for a way to optimize my algorithm. The network represents an irregular set of connections between neurons.

I'm not very familiar with TensorFlow, but I suppose there is a way to use it here.

I need to iterate through the network many times over quite a long span of time, so it gets very slow when the net is very big.

The network can have any structure, because a genetic algorithm evolves it, and every neuron can have a different activation function.

Any suggestions?
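
To make this concrete, here is a stripped-down sketch of the kind of evaluation I mean (plain NumPy; the toy size, topology, and activation functions are only illustrative):

```python
import numpy as np

# Illustrative encoding: n neurons, W[i, j] = weight of connection j -> i
# (zero where no connection exists), one activation function per neuron.
n = 5
rng = np.random.default_rng(0)
mask = rng.random((n, n)) < 0.3               # irregular, evolved topology
W = np.where(mask, rng.normal(size=(n, n)), 0.0)
acts = [np.tanh, np.abs, np.tanh, np.sin, np.tanh]

state = rng.normal(size=n)
for t in range(1000):                         # many update steps over time
    pre = W @ state                           # cycles are fine with synchronous updates
    state = np.array([f(x) for f, x in zip(acts, pre)])
```

The per-neuron Python loop is the part that gets slow as the net grows, which is why I'm wondering whether TensorFlow can help.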

Emil Terman
  • Detail copied in from a deleted answer: by irregular I mean a network that can have any kind of connections: there can be, for example, 3 neurons connected to 1; then the same neurons can be connected with 2 others and, at the same time, connected to the first of the 3 neurons. Each neuron has a different activation function. I don't think there is a trivial way to connect it with TensorFlow. The network is evolved by a genetic algorithm. – David Parks Apr 19 '18 at 17:29
  • 1
    It seems this question is not so much related to performance as it is a question about how to structure a model that is best described as a complex graph in tensorflow where common matrix multiplication doesn't seem to easily apply. – David Parks Apr 19 '18 at 17:31
  • I don't understand your question. Do you want to run the NEAT algorithm in TensorFlow? TensorFlow is used for deep learning, while NEAT is an evolutionary algorithm. Or do you mean you want to somehow combine both approaches (such as using NEAT to evolve architectures that are then used for deep learning)? – Pablo Aug 16 '18 at 13:17
  • Did you ever find an answer to this question? It would be surprising if you created a novel area of neural network research while trying to implement a decades-old algorithm. I fear the authors of the original neuroevolution paper fudged their data somehow, because the solution to this problem would be clearly explained if they hadn't. My guess is they manually pruned their randomly generated neural networks so they always worked with matrix math. This kind of bullshit happens all the time in esoteric fields because it's too complicated for most people to check their work. – Austin Capobianco Apr 23 '20 at 13:37
  • 1
    @AustinCapobianco unfortunately I haven't. My algorithm ended up being extremely slow, as I was doing everything manually. But thinking back, I imagine you could translate some of these irregularities to separate neural layers, which are compatible with Tensorflow. – Emil Terman Apr 23 '20 at 15:04
  • 1
    I was trying to treat every neuron as "connected" to every other neuron, with most of the connections "having a weight of 0" generating a large sparse matrix, but it doesn't match up with what I get when I manually calculate the answer. If you're interested: https://datascience.stackexchange.com/questions/72068/how-do-i-use-matrix-math-in-irregular-neural-networks-such-as-those-generated-fr – Austin Capobianco Apr 23 '20 at 18:35
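
Picking up the layering idea from Emil Terman's comment: a minimal sketch, assuming an acyclic genome (recurrent connections would need separate handling), of grouping neurons by depth so that each depth becomes one regular matrix multiply, which is the shape TensorFlow handles well. The toy topology and all names are illustrative:

```python
import numpy as np

# Hypothetical acyclic net with 4 neurons; edges[dst] = [(src, weight), ...].
# Neurons 0 and 1 are inputs.
edges = {2: [(0, 0.5), (1, -1.2)], 3: [(1, 0.7), (2, 0.3)]}
n = 4
acts = {2: np.tanh, 3: np.tanh}               # per-neuron activation functions

def depth(i, memo={}):
    """Layer index of neuron i = longest path from an input neuron to i."""
    if i not in memo:
        preds = edges.get(i, [])
        memo[i] = 0 if not preds else 1 + max(depth(s, memo) for s, _ in preds)
    return memo[i]

layers = {}
for i in range(n):
    layers.setdefault(depth(i), []).append(i)

# One (neurons_in_layer x n) weight matrix per depth: every depth is now a
# regular matmul over the full state vector.
mats = {}
for d, ids in layers.items():
    if d == 0:
        continue                              # depth 0 holds the input neurons
    W = np.zeros((len(ids), n))
    for row, i in enumerate(ids):
        for src, w in edges[i]:
            W[row, src] = w
    mats[d] = W

def forward(inputs):
    values = np.zeros(n)
    values[layers[0]] = inputs
    for d in sorted(mats):
        pre = mats[d] @ values                # whole layer in one matrix multiply
        for row, i in enumerate(layers[d]):
            values[i] = acts[i](pre[row])     # per-neuron activation
    return values

print(forward(np.array([1.0, -0.5])))         # -> values for all 4 neurons
```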
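And a minimal sketch of the sparse-matrix idea in TensorFlow 2, since that is what the question asked about. The toy topology, weights, and the two-activation split are assumptions; per-neuron activations are applied through a boolean mask so the whole update stays vectorized:

```python
import tensorflow as tf

# Hypothetical irregular net: 4 neurons, 5 connections stored as [dst, src].
indices = [[0, 2], [1, 0], [1, 3], [2, 1], [3, 0]]
values = [0.5, -1.2, 0.7, 0.3, 0.9]
W = tf.sparse.SparseTensor(indices, values, dense_shape=[4, 4])
W = tf.sparse.reorder(W)                      # indices must be row-major sorted

is_tanh = tf.constant([[True], [False], [True], [False]])  # activation per neuron

def step(state):
    pre = tf.sparse.sparse_dense_matmul(W, state)          # (4, 1) weighted sums
    return tf.where(is_tanh, tf.tanh(pre), tf.nn.relu(pre))

state = tf.ones([4, 1])
for _ in range(10):                           # iterate the net over time
    state = step(state)
print(state.numpy())
```

Whether this beats a plain NumPy loop depends on network size; for the small genomes NEAT typically evolves, the graph-building overhead may dominate.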

0 Answers