🐍 STEP 1: Neural Network in Python
The simplest neural network possible. 3 inputs, 1 output. No hidden layers. Just the basics.
import numpy as np

# Each row: three inputs followed by the expected output.
training_data = np.array([
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [1, 0, 1, 1],
    [0, 1, 1, 0]
])

np.random.seed(1)
weights = 2 * np.random.random((3, 1)) - 1  # random weights in [-1, 1)
print("Initial weights:\n", weights)

for iteration in range(10000):
    inputs = training_data[:, :3]
    expected = training_data[:, 3:]
    outputs = 1 / (1 + np.exp(-np.dot(inputs, weights)))  # sigmoid activation
    error = expected - outputs
    # Scale the error by the sigmoid derivative, then step the weights
    adjustments = np.dot(inputs.T, error * outputs * (1 - outputs))
    weights += adjustments

print("\nTrained weights:\n", weights)
print("\nOutputs after training:")
print(1 / (1 + np.exp(-np.dot(training_data[:, :3], weights))))
⚡ CHALLENGE 1
Change the training data. What happens if you add more examples? Can you make it learn XOR? (Hint: You'll need a hidden layer for XOR)
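To see why the hint matters, here's a quick sketch in plain Python (no NumPy; the starting weights and epoch count are arbitrary illustrative choices): XOR is not linearly separable, so a single sigmoid neuron can never get all four cases right, no matter how long it trains.

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# XOR truth table: (inputs, target)
xor_data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
w = [0.5, -0.5]  # arbitrary starting weights
b = 0.0

for _ in range(10000):
    for inputs, target in xor_data:
        out = sigmoid(w[0] * inputs[0] + w[1] * inputs[1] + b)
        grad = (target - out) * out * (1 - out)  # error scaled by sigmoid derivative
        w[0] += grad * inputs[0]
        w[1] += grad * inputs[1]
        b += grad

for inputs, target in xor_data:
    out = sigmoid(w[0] * inputs[0] + w[1] * inputs[1] + b)
    print(inputs, "->", round(out, 3), "(target:", target, ")")
```

However you tweak the weights or the epoch count, at least one of the four predictions lands on the wrong side of 0.5. That's the whole reason the hidden layer exists.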
🔧 STEP 2: Build The Network From Scratch
What the Neural Garden is actually doing under the hood. Build the same 3-layer network in JavaScript.
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

// Expects x to already be a sigmoid output s, since d/dx sigmoid = s * (1 - s)
function sigmoidDerivative(x) {
  return x * (1 - x);
}

class NeuralNetwork {
  constructor(inputNodes, hiddenNodes, outputNodes) {
    this.inputNodes = inputNodes;
    this.hiddenNodes = hiddenNodes;
    this.outputNodes = outputNodes;
    this.weightsIH = this.randomMatrix(hiddenNodes, inputNodes);
    this.weightsHO = this.randomMatrix(outputNodes, hiddenNodes);
    this.learningRate = 0.1;
  }

  randomMatrix(rows, cols) {
    return Array(rows).fill(0).map(() =>
      Array(cols).fill(0).map(() => Math.random() * 2 - 1)
    );
  }

  predict(inputs) {
    this.hidden = this.weightsIH.map(row =>
      sigmoid(row.reduce((sum, w, i) => sum + w * inputs[i], 0))
    );
    this.outputs = this.weightsHO.map(row =>
      sigmoid(row.reduce((sum, w, i) => sum + w * this.hidden[i], 0))
    );
    return this.outputs;
  }

  train(inputs, targets) {
    this.predict(inputs);
    const outputErrors = targets.map((t, i) => t - this.outputs[i]);
    // Only the hidden->output weights get updated here; weightsIH never
    // changes. That's the bug Challenge 2 asks you to fix.
    for (let i = 0; i < this.outputNodes; i++) {
      for (let j = 0; j < this.hiddenNodes; j++) {
        const gradient = outputErrors[i] * sigmoidDerivative(this.outputs[i]);
        this.weightsHO[i][j] += gradient * this.hidden[j] * this.learningRate;
      }
    }
  }
}

const nn = new NeuralNetwork(2, 4, 1);

for (let i = 0; i < 10000; i++) {
  nn.train([0, 0], [0]);
  nn.train([0, 1], [1]);
  nn.train([1, 0], [1]);
  nn.train([1, 1], [0]);
}

console.log("0 XOR 0 =", nn.predict([0, 0]));
console.log("0 XOR 1 =", nn.predict([0, 1]));
⚡ CHALLENGE 2
The code above almost learns XOR but has a bug. The hidden layer gradients aren't being calculated. Can you fix it? (Hint: You need to backpropagate the error from output to hidden)
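If you want to check your fix, here's the shape of the missing step, sketched in plain Python rather than JS (the names mirror the JS class, and the learning rate and layer sizes are just illustrative choices, not a definitive implementation). The key idea: each hidden node's error is the sum of the output errors flowing back through the weights that connect it forward.

```python
import math, random

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def dsigmoid(s):
    # expects s to already be a sigmoid output: d/dx sigmoid(x) = s * (1 - s)
    return s * (1 - s)

random.seed(1)
n_in, n_hidden, n_out = 2, 4, 1
weightsIH = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
weightsHO = [[random.uniform(-1, 1) for _ in range(n_hidden)] for _ in range(n_out)]
lr = 0.5

def predict(inputs):
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in weightsIH]
    outputs = [sigmoid(sum(w * h for w, h in zip(row, hidden))) for row in weightsHO]
    return hidden, outputs

def train(inputs, targets):
    hidden, outputs = predict(inputs)
    output_errors = [t - o for t, o in zip(targets, outputs)]
    # The missing piece: push each output's error back through weightsHO
    # to get a per-hidden-node error, BEFORE weightsHO is updated.
    hidden_errors = [
        sum(output_errors[i] * dsigmoid(outputs[i]) * weightsHO[i][j]
            for i in range(n_out))
        for j in range(n_hidden)
    ]
    # hidden -> output update (this part already exists in the JS code)
    for i in range(n_out):
        grad = output_errors[i] * dsigmoid(outputs[i])
        for j in range(n_hidden):
            weightsHO[i][j] += lr * grad * hidden[j]
    # input -> hidden update (this is what the JS code never does)
    for j in range(n_hidden):
        grad = hidden_errors[j] * dsigmoid(hidden[j])
        for k in range(n_in):
            weightsIH[j][k] += lr * grad * inputs[k]

for _ in range(10000):
    for x, y in [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]:
        train(x, y)

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, "->", round(predict(x)[1][0], 3))
```

Translating those two loops back into the `train` method is the whole fix.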
🎮 INTERACTIVE PLAYGROUND
Test your understanding. Experiment with the network in real-time.
INPUTS
TARGET OUTPUT
0
1
Target: 1
NETWORK OUTPUT
Prediction: 0.512
Confidence: 51.2%
Click "TRAIN ONCE" to learn
TRAIN ONCE
PREDICT
Epochs: 0 | Accuracy: 0%
⚡ FREE PLAY
Try to train this tiny network to be a binary AND gate. Target = 1 only when BOTH inputs are 1. Can you do it in under 50 epochs?
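Unlike XOR, AND is linearly separable, so a single neuron really can learn it. A sketch in plain Python (the learning rate and epoch count here are illustrative choices, not what the playground uses, so don't treat 1000 epochs as the benchmark to beat):

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# AND truth table: target is 1 only when BOTH inputs are 1
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = [0.0, 0.0], 0.0
lr = 1.0

for epoch in range(1000):
    for inputs, target in and_data:
        out = sigmoid(w[0] * inputs[0] + w[1] * inputs[1] + b)
        grad = (target - out) * out * (1 - out)  # error scaled by sigmoid derivative
        w[0] += lr * grad * inputs[0]
        w[1] += lr * grad * inputs[1]
        b += lr * grad

for inputs, target in and_data:
    out = sigmoid(w[0] * inputs[0] + w[1] * inputs[1] + b)
    print(inputs, "->", round(out, 2), "(target:", target, ")")
```

The bias drifts negative (three of the four targets are 0) while the weights grow positive, which is exactly the "fire only when both inputs are on" behavior you're after.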
🔥 STEP 3: TensorFlow.js (The Easy Way)
Real frameworks handle the math for you. Here's the same network using TensorFlow.js.
import * as tf from 'https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@latest';

const model = tf.sequential();
model.add(tf.layers.dense({
  units: 4,
  inputShape: [2],
  activation: 'sigmoid'
}));
model.add(tf.layers.dense({
  units: 1,
  activation: 'sigmoid'
}));

model.compile({
  optimizer: 'adam',
  loss: 'meanSquaredError'
});

const xs = tf.tensor2d([[0, 0], [0, 1], [1, 0], [1, 1]]);
const ys = tf.tensor2d([[0], [1], [1], [0]]);

await model.fit(xs, ys, { epochs: 1000 });
model.predict(tf.tensor2d([[0, 1]])).print();
Ready for the Advanced tier?
SYMBIOSIS TIER →