Hi,
Here are a few points that should help you troubleshoot and fix your 'gradientDescent' function:
- Correct the syntax: Make sure the function definition starts with the keyword 'function', not 'unction'.
- Remove data loading from inside the function: 'gradientDescent' should not load data from a file. Instead, pass 'X', 'y', and 'theta' in as arguments, and load the data and initialize the variables outside the function.
- Define the cost function: Make sure a 'computeCost' function is defined to calculate the cost for linear regression, since 'gradientDescent' records the cost at every iteration (the formulas are recapped just below).
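For reference, the quantities the code below computes are the standard (vectorized) linear regression cost and update rule, where X already includes the column of ones for the intercept term:

J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \big( h_\theta(x^{(i)}) - y^{(i)} \big)^2, \quad h_\theta(x) = \theta^T x

\theta := \theta - \frac{\alpha}{m} X^T (X\theta - y)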
Below is a modified version of the code implementing these changes, followed by an example of how to call it. Save 'gradientDescent' and 'computeCost' in their own files (gradientDescent.m and computeCost.m) so MATLAB/Octave can find them, and run the example usage as a separate script:
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
    m = length(y);                   % number of training examples
    J_history = zeros(num_iters, 1);

    for iter = 1:num_iters
        % Compute the predictions
        predictions = X * theta;

        % Compute the error
        errors = predictions - y;

        % Perform the gradient descent update
        theta = theta - (alpha / m) * (X' * errors);

        % Save the cost J in every iteration
        J_history(iter) = computeCost(X, y, theta);
    end
end

% Function to compute the cost for linear regression
function J = computeCost(X, y, theta)
    m = length(y);                   % number of training examples
    predictions = X * theta;
    sqErrors = (predictions - y).^2;
    J = 1 / (2 * m) * sum(sqErrors);
end
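If you want to verify 'computeCost' on its own first, you can run it on a tiny hand-checkable example; the values below are made up purely for illustration and are not from ex1data1.txt:
% Sanity check for computeCost with made-up data:
% theta = [0; 1] reproduces y exactly, so the cost should be exactly 0
X_test = [1 1; 1 2; 1 3];
y_test = [1; 2; 3];
J_test = computeCost(X_test, y_test, [0; 1]);
fprintf('Sanity-check cost (expected 0): %f\n', J_test);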
% Example Usage
% Load data
data = load('ex1data1.txt');
X = data(:, 1);
y = data(:, 2);
m = length(y);
% Add intercept term to X
X = [ones(m, 1), X];
% Initialize fitting parameters
theta = zeros(2, 1);
% Set gradient descent parameters
alpha = 0.01;
num_iters = 1500;
% Run gradient descent
[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);
% Display the result
fprintf('Theta found by gradient descent: ');
fprintf('%f %f \n', theta(1), theta(2));
[Image: sample output from running the script above]
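As an optional extra check, you could plot the recorded cost history to confirm that gradient descent is converging, and use the learned theta to predict the output for a new input (the value 7.0 below is an arbitrary example, not taken from your data):
% Optional: plot the cost over the iterations; it should decrease steadily
figure;
plot(1:num_iters, J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Optional: predict the output for a new input value (7.0 is arbitrary)
prediction = [1, 7.0] * theta;
fprintf('Prediction for x = 7.0: %f\n', prediction);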
I hope this helps!