BLOOM TIER

CODE • BUILD • UNDERSTAND


🐍 STEP 1: Neural Network in Python

The simplest possible neural network: 3 inputs, 1 output, no hidden layers. Just the basics.

import numpy as np

# Training data: 3 inputs, 1 output
# Each row = [input1, input2, input3, expected_output]
training_data = np.array([
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [1, 0, 1, 1],
    [0, 1, 1, 0]
])

# Initialize random weights (3 inputs → 1 output)
np.random.seed(1)
weights = 2 * np.random.random((3, 1)) - 1
print("Initial weights:\n", weights)

# Training loop
for iteration in range(10000):
    # Forward pass
    inputs = training_data[:, :3]
    expected = training_data[:, 3:]
    outputs = 1 / (1 + np.exp(-np.dot(inputs, weights)))

    # Backward pass (gradient descent)
    error = expected - outputs
    adjustments = np.dot(inputs.T, error * outputs * (1 - outputs))
    weights += adjustments

print("\nTrained weights:\n", weights)
print("\nOutputs after training:")
print(1 / (1 + np.exp(-np.dot(training_data[:, :3], weights))))

⚡ CHALLENGE 1

Change the training data. What happens if you add more examples? Can you make it learn XOR? (Hint: You'll need a hidden layer for XOR)
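For reference, here is one way the hint could play out: the same numpy style as Step 1, extended with a 4-neuron hidden layer so XOR becomes learnable. The layer size, learning rate (implicitly 1.0, as in Step 1), and added bias terms are illustrative choices, not the only ones that work — a sketch, not the official solution.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# XOR: 2 inputs, 1 output
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])

np.random.seed(1)
w1 = 2 * np.random.random((2, 4)) - 1   # input  -> hidden (4 neurons)
b1 = np.zeros((1, 4))                   # hidden biases
w2 = 2 * np.random.random((4, 1)) - 1   # hidden -> output
b2 = np.zeros((1, 1))                   # output bias

initial_error = None
for i in range(10000):
    # Forward pass through both layers
    hidden = sigmoid(X @ w1 + b1)
    output = sigmoid(hidden @ w2 + b2)

    # Backward pass: output delta first, then push the error
    # back through w2 to get the hidden layer's delta
    out_delta = (y - output) * output * (1 - output)
    hid_delta = (out_delta @ w2.T) * hidden * (1 - hidden)

    if initial_error is None:
        initial_error = np.mean((y - output) ** 2)

    # Same update convention as Step 1 (error = expected - output)
    w2 += hidden.T @ out_delta
    b2 += out_delta.sum(axis=0)
    w1 += X.T @ hid_delta
    b1 += hid_delta.sum(axis=0)

print("XOR predictions:\n", output.round(3))
```

The biases matter here: without them, every hidden neuron outputs exactly 0.5 for the input [0, 0], which makes XOR much harder to fit.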

🔧 STEP 2: Build The Network From Scratch

This is what the Neural Garden is actually doing under the hood. Build a 3-layer network (input, hidden, output) from scratch in JavaScript.

function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

// Note: expects x to already be a sigmoid output, i.e. x = sigmoid(z)
function sigmoidDerivative(x) {
  return x * (1 - x);
}

class NeuralNetwork {
  constructor(inputNodes, hiddenNodes, outputNodes) {
    this.inputNodes = inputNodes;
    this.hiddenNodes = hiddenNodes;
    this.outputNodes = outputNodes;
    // Initialize weights randomly in [-1, 1]
    this.weightsIH = this.randomMatrix(hiddenNodes, inputNodes);
    this.weightsHO = this.randomMatrix(outputNodes, hiddenNodes);
    this.learningRate = 0.1;
  }

  randomMatrix(rows, cols) {
    return Array(rows).fill(0).map(() =>
      Array(cols).fill(0).map(() => Math.random() * 2 - 1)
    );
  }

  // Forward pass
  predict(inputs) {
    // Input → Hidden
    this.hidden = this.weightsIH.map(row =>
      sigmoid(row.reduce((sum, w, i) => sum + w * inputs[i], 0))
    );
    // Hidden → Output
    this.outputs = this.weightsHO.map(row =>
      sigmoid(row.reduce((sum, w, i) => sum + w * this.hidden[i], 0))
    );
    return this.outputs;
  }

  // Train with backpropagation
  train(inputs, targets) {
    this.predict(inputs); // Forward pass

    // Calculate errors
    const outputErrors = targets.map((t, i) => t - this.outputs[i]);

    // Update weights (simplified — only hidden → output; see Challenge 2)
    for (let i = 0; i < this.outputNodes; i++) {
      for (let j = 0; j < this.hiddenNodes; j++) {
        const gradient = outputErrors[i] * sigmoidDerivative(this.outputs[i]);
        this.weightsHO[i][j] += gradient * this.hidden[j] * this.learningRate;
      }
    }
  }
}

// Use it!
const nn = new NeuralNetwork(2, 4, 1);
for (let i = 0; i < 10000; i++) {
  nn.train([0, 0], [0]);
  nn.train([0, 1], [1]);
  nn.train([1, 0], [1]);
  nn.train([1, 1], [0]);
}
console.log("0 XOR 0 =", nn.predict([0, 0]));
console.log("0 XOR 1 =", nn.predict([0, 1]));

⚡ CHALLENGE 2

The code above almost learns XOR but has a bug. The hidden layer gradients aren't being calculated. Can you fix it? (Hint: You need to backpropagate the error from output to hidden)
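If you want to check your fix, the missing step looks like this in matrix form (numpy notation; the variable names and example values here are illustrative, not taken from the JS class). The key idea: each hidden neuron's error is the weighted sum of the output errors it contributed to, scaled by the hidden layer's own sigmoid derivative.

```python
import numpy as np

np.random.seed(2)
# Illustrative shapes matching the JS class: 2 inputs, 4 hidden, 1 output
inputs = np.array([[0.0, 1.0]])              # one training example
hidden = np.array([[0.3, 0.7, 0.5, 0.9]])    # hidden activations, already computed
w_ho = 2 * np.random.random((4, 1)) - 1      # hidden -> output weights
out_delta = np.array([[0.12]])               # outputError * sigmoidDerivative(output)

# The missing step: propagate the output delta back through w_ho...
hidden_error = out_delta @ w_ho.T                    # shape (1, 4)
# ...then scale by the hidden layer's sigmoid derivative
hidden_delta = hidden_error * hidden * (1 - hidden)  # shape (1, 4)

# Input -> hidden weights are then updated just like the hidden -> output ones
grad_w_ih = inputs.T @ hidden_delta                  # shape (2, 4)
print(grad_w_ih.shape)
```

In the JS class, that translates to a second pair of nested loops over `weightsIH`, using `outputErrors` pushed back through `weightsHO`.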

🎮 INTERACTIVE PLAYGROUND

Test your understanding. Experiment with the network in real-time.

INPUTS

Input 1: 0.0
Input 2: 1.0

TARGET OUTPUT

Target: 1

NETWORK OUTPUT

Prediction: 0.512
Confidence: 51.2%
Click "TRAIN ONCE" to learn

Epochs: 0 | Accuracy: 0%

⚡ FREE PLAY

Try to train this tiny network to be a binary AND gate. Target = 1 only when BOTH inputs are 1. Can you do it in under 50 epochs?
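One way to sanity-check the Free Play goal offline: a single sigmoid neuron with a bias can learn AND. The learning rate and zero initialization below are guesses at reasonable settings, not what the playground actually uses — tune them and watch how quickly the predictions separate.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [0], [0], [1]])   # AND: 1 only when BOTH inputs are 1

w = np.zeros((2, 1))   # one neuron, two input weights
b = 0.0                # bias
lr = 5.0               # aggressive learning rate to converge fast

loss_start = None
for epoch in range(50):              # the challenge's epoch budget
    out = sigmoid(X @ w + b)
    if loss_start is None:
        loss_start = np.mean((y - out) ** 2)
    delta = (y - out) * out * (1 - out)   # gradient of MSE through the sigmoid
    w += lr * (X.T @ delta)
    b += lr * delta.sum()

out = sigmoid(X @ w + b)
print("AND predictions:", out.ravel().round(2))
```

After training, the [1, 1] input should score clearly higher than the others; if it doesn't within 50 epochs, try a different learning rate.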

🔥 STEP 3: TensorFlow.js (The Easy Way)

Real frameworks handle the math for you. Here's the same XOR network using TensorFlow.js.

// Load TensorFlow.js from a CDN first, e.g. with a script tag:
// <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@latest/dist/tf.min.js"></script>
// That exposes a global `tf` object.

// Create the model
const model = tf.sequential();
model.add(tf.layers.dense({
  units: 4,        // 4 hidden neurons
  inputShape: [2], // 2 inputs
  activation: 'sigmoid'
}));
model.add(tf.layers.dense({
  units: 1,        // 1 output
  activation: 'sigmoid'
}));

// Compile
model.compile({
  optimizer: 'adam',
  loss: 'meanSquaredError'
});

// Training data
const xs = tf.tensor2d([[0, 0], [0, 1], [1, 0], [1, 1]]);
const ys = tf.tensor2d([[0], [1], [1], [0]]);

// Train! (await needs an async context, e.g. inside an async function)
await model.fit(xs, ys, { epochs: 1000 });

// Predict
model.predict(tf.tensor2d([[0, 1]])).print();

Ready for the Advanced tier?