Step 1: Forward Propagation - Input data is passed through the network. Each neuron computes a weighted sum of its inputs plus a bias, applies an activation function, and passes the result to the next layer.
Step 2: Loss Calculation - The network's prediction is compared to the actual target value, and a loss (for example, mean squared error) is computed to quantify the error.
Step 3: Backpropagation - The gradient of the loss with respect to each weight is computed using the chain rule, starting from the output layer and moving backward through the network.
Step 4: Weight Update - Each weight is nudged in the direction opposite to its gradient, scaled by the learning rate, so that the loss decreases.
Step 5: Iteration - Steps 1-4 are repeated for many iterations until the network predicts the target values accurately (see the sketch after this list, which walks through all five steps on a toy problem).
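The loop below is a minimal NumPy sketch of these five steps, not a production implementation. It assumes one hidden layer of 8 neurons, a tanh hidden activation, a sigmoid output with mean squared error loss, plain full-batch gradient descent, and a tiny XOR dataset; the layer size, activations, learning rate, and iteration count are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a target no single linear model can fit.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# One hidden layer of 8 neurons; weights start as small random values.
W1, b1 = rng.normal(scale=0.5, size=(2, 8)), np.zeros((1, 8))
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 0.5

for step in range(10_000):                     # Step 5: iterate
    # Step 1: forward propagation
    z1 = X @ W1 + b1
    a1 = np.tanh(z1)                           # hidden activations
    z2 = a1 @ W2 + b2
    y_hat = sigmoid(z2)                        # prediction in (0, 1)

    # Step 2: loss calculation (mean squared error)
    loss = np.mean((y_hat - y) ** 2)

    # Step 3: backpropagation via the chain rule, output layer first
    d_y_hat = 2 * (y_hat - y) / len(X)         # dL/d y_hat
    d_z2 = d_y_hat * y_hat * (1 - y_hat)       # through sigmoid'
    d_W2 = a1.T @ d_z2
    d_b2 = d_z2.sum(axis=0, keepdims=True)
    d_z1 = (d_z2 @ W2.T) * (1 - a1 ** 2)       # through tanh'
    d_W1 = X.T @ d_z1
    d_b1 = d_z1.sum(axis=0, keepdims=True)

    # Step 4: weight update (plain gradient descent)
    W1 -= learning_rate * d_W1
    b1 -= learning_rate * d_b1
    W2 -= learning_rate * d_W2
    b2 -= learning_rate * d_b2

print(f"final loss: {loss:.4f}")
print(y_hat.round(2))  # should be close to [[0], [1], [1], [0]]
```

In practice the gradients of Step 3 are rarely written out by hand; frameworks such as PyTorch or TensorFlow compute them automatically, but the mechanics are the same as in this sketch.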
Unlike logistic regression, neural networks can model complex, non-linear relationships between inputs and outputs through multiple layers of neurons with non-linear activation functions, as the short check below illustrates.
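The non-linear activation is what makes the extra layers count: without it, any stack of linear layers collapses algebraically into a single linear map, which is no more expressive than logistic regression's linear decision function. A quick numerical check of this, with sizes and random weights chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(1, 3))

# Two stacked linear layers with no activation in between...
W1, b1 = rng.normal(size=(3, 5)), rng.normal(size=(1, 5))
W2, b2 = rng.normal(size=(5, 2)), rng.normal(size=(1, 2))
two_layers = (x @ W1 + b1) @ W2 + b2

# ...are equivalent to one linear layer with weights W1 @ W2:
one_layer = x @ (W1 @ W2) + (b1 @ W2 + b2)

print(np.allclose(two_layers, one_layer))  # True: no added expressive power
```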