
Is there a way to stop a neural network from getting worse?


I was trying to train my number-doubler neural network in less time, so I came up with this scheme: there is a generation counter, and each time it counts down, the net trains for 50 epochs. When the counter reaches zero, the net is ready for testing. I set the generation counter to 10 and, to my surprise, it trained instantly. So I immediately set the counter to 1000. It also trained instantly, but I was shocked to see that its outputs were actually worse than when the counter was only 10.

I have tried everything I could think of. The only explanation I can come up with is that every time it trains another generation, it resets (there's a sketch after the code below of one way to check that).

const trainingGenerations = 10
const epochsPerGeneration = 50

// Training data: the inputs 1..5 and their doubles, as [5, 1] tensors
const input = tf.tensor2d([1, 2, 3, 4, 5], [5, 1])
const output = tf.tensor2d([2, 4, 6, 8, 10], [5, 1])

// A small 1 -> 64 -> 1 dense network (only the first layer needs inputShape)
let model = tf.sequential()
model.add(tf.layers.dense({units: 1, inputShape: [1]}))
model.add(tf.layers.dense({units: 64}))
model.add(tf.layers.dense({units: 1}))
model.compile({loss: "meanSquaredError", optimizer: "sgd"})

// One "generation" = 50 epochs of training
for (let i = 0; i < trainingGenerations; i++) {
    model.fit(input, output, {epochs: epochsPerGeneration, shuffle: true})
}

// Test on 6 and hope for something close to 12
// (predict expects a rank-2 [batch, 1] tensor to match inputShape: [1])
const aInput = 6
const aOutput = model.predict(tf.tensor2d([[aInput]])).dataSync()[0]
alert(aOutput)
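If something were resetting between generations, the loss should jump back up at the start of each generation. Here is a sketch of how that could be checked by passing an onEpochEnd callback to fit (the callbacks option is part of fit's documented arguments; the loop is un-awaited like my original, so whatever is wrong there would presumably show up in these logs too):

// Sketch: log every epoch's loss; a reset would show up as the loss
// jumping back up at the start of each generation
for (let i = 0; i < trainingGenerations; i++) {
    model.fit(input, output, {
        epochs: epochsPerGeneration,
        shuffle: true,
        callbacks: {
            onEpochEnd: (epoch, logs) =>
                console.log(`generation ${i}, epoch ${epoch}: loss = ${logs.loss}`)
        }
    })
}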

My question is: why is it getting worse, and how can I fix it? The expected result was that it would alert the number 12, or something close to it. The actual result is that with 1000 generations it alerts something like 9 or 10, while with ten generations it alerts 11.5 or 11.9.
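One thing I noticed while writing this up: model.fit in TensorFlow.js returns a Promise, so my loop may be firing off every fit call without waiting for any of them to finish, which would also explain why it "trains instantly". Here is a sketch of what I think sequential training should look like, assuming the code can be wrapped in an async function so await is allowed (same names as above):

// Sketch: await each fit so the generations actually run one after another
async function trainSequentially() {
    for (let i = 0; i < trainingGenerations; i++) {
        // fit resolves with a History object once its epochs have finished
        const history = await model.fit(input, output, {epochs: epochsPerGeneration, shuffle: true})
        const losses = history.history.loss
        console.log(`generation ${i}: final loss =`, losses[losses.length - 1])
    }
    const result = model.predict(tf.tensor2d([[6]])).dataSync()[0]
    alert(result)
}

trainSequentially()

I'm not sure this is the whole story, but it would at least rule out overlapping fit calls.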

