Practice · Medium · ML Coding

Implement Linear Regression with Gradient Descent

ML Coding: ml-algorithms | 30-40 minutes

Tags: regression, gradient-descent, optimization, supervised, numpy
Updated Dec 20, 2025

Problem

Implement linear regression from scratch using gradient descent. Given training data (X, y), learn weights that minimize mean squared error.

Your implementation should:

  1. Initialize weights (and bias) randomly
  2. Compute predictions: y_pred = X @ w + b
  3. Compute loss: MSE = mean((y_pred - y)^2)
  4. Compute gradients: dw = (2/n) * X.T @ (y_pred - y), db = (2/n) * sum(y_pred - y)
  5. Update parameters: w = w - learning_rate * dw, b = b - learning_rate * db
  6. Repeat until convergence or max iterations

Include both the training function and prediction function.
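One iteration of the loop described above can be sketched in NumPy as follows (a sketch only; the data and learning rate are taken from Example 1 below, and the small random initialization scale is an assumption):

```python
import numpy as np

# One gradient-descent step for the loop described above.
# Shapes follow the problem statement: X is (n, d), y is (n,),
# w is (d,), b is a scalar.
rng = np.random.default_rng(0)
n, d, lr = 5, 1, 0.01
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])

w = rng.normal(scale=0.01, size=d)   # step 1: small random init (scale assumed)
b = 0.0

y_pred = X @ w + b                   # step 2: predictions
loss = np.mean((y_pred - y) ** 2)    # step 3: MSE
error = y_pred - y
dw = (2.0 / n) * X.T @ error         # step 4: gradient w.r.t. weights
db = (2.0 / n) * error.sum()         #         gradient w.r.t. bias
w -= lr * dw                         # step 5: parameter update
b -= lr * db
```

Repeating this step drives the loss down; the full training loop adds the convergence check from the constraints.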

Constraints

  • Number of samples n: 10 ≤ n ≤ 10000
  • Number of features d: 1 ≤ d ≤ 100
  • Learning rate: 0.001 ≤ lr ≤ 0.1
  • Max iterations: 1000
  • Convergence threshold: 1e-6 for loss change

Examples

Example 1

Input:

# Simple 1D case: y = 2x + 1
X = [[1], [2], [3], [4], [5]]
y = [3, 5, 7, 9, 11]
learning_rate = 0.01

Output:

# After training:
weights ≈ [2.0]
bias ≈ 1.0
# Predictions: [3, 5, 7, 9, 11]

Explanation: Perfect linear relationship: y = 2x + 1
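The expected parameters in Example 1 can be sanity-checked against the closed-form least-squares solution (a verification sketch, not part of the required gradient-descent implementation):

```python
import numpy as np

# Closed-form check of Example 1: append a column of ones so the
# solved vector is [slope, intercept].
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
A = np.hstack([X, np.ones((len(X), 1))])
params, *_ = np.linalg.lstsq(A, y, rcond=None)
print(params)  # ≈ [2. 1.] -> weight 2.0, bias 1.0
```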

Example 2

Input:

# 2D case: y = x1 + x2 + 2 (exact fit; note x2 = x1 + 1, so the features are collinear)
X = [[1, 2], [2, 3], [3, 4], [4, 5]]
y = [5, 7, 9, 11]

Output:

# After training (one valid solution):
weights ≈ [1.0, 1.0]
bias ≈ 2.0

Explanation: y = x1 + x2 + 2 fits the data exactly. Because x2 = x1 + 1, the least-squares solution is not unique; gradient descent converges to one of the equally valid parameter sets.
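A quick check of Example 2 (a sketch; with weights [1.0, 1.0], the bias that reproduces y exactly is 2.0, and since x2 = x1 + 1 other parameter sets fit equally well):

```python
import numpy as np

# Verify one exact solution for Example 2.
X = np.array([[1, 2], [2, 3], [3, 4], [4, 5]], dtype=float)
y = np.array([5.0, 7.0, 9.0, 11.0])
w = np.array([1.0, 1.0])
b = 2.0
print(X @ w + b)  # reproduces y: [5, 7, 9, 11]

# Non-uniqueness: weights [2, 0] with bias 3 also fit exactly,
# because y = 2*x1 + 3 = x1 + x2 + 2 when x2 = x1 + 1.
print(X @ np.array([2.0, 0.0]) + 3.0)
```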

Function Signature

import numpy as np

class LinearRegression:
    def __init__(self, learning_rate: float = 0.01, max_iters: int = 1000):
        """
        Initialize linear regression model.

        Args:
            learning_rate: Step size for gradient descent
            max_iters: Maximum training iterations
        """
        self.lr = learning_rate
        self.max_iters = max_iters
        self.weights = None
        self.bias = None

    def fit(self, X: np.ndarray, y: np.ndarray) -> None:
        """
        Train the model using gradient descent.

        Args:
            X: Training features, shape (n_samples, n_features)
            y: Training labels, shape (n_samples,)
        """
        pass

    def predict(self, X: np.ndarray) -> np.ndarray:
        """
        Make predictions on new data.

        Args:
            X: Features, shape (n_samples, n_features)

        Returns:
            predictions: shape (n_samples,)
        """
        pass


