Hello,
I have run into an issue with the Ubuntu server I am running the code on: each time the script starts, CPU usage climbs, but no memory appears to be allocated, even though the variables are not negligible in size. The server stays responsive for roughly 10 minutes before crashing. It does not generate a crash log or anything similar. Once it has crashed, it does not respond to anything, and the power plug has to be pulled to shut it down.
Server Specs:
Ubuntu 24.04.1 LTS
2x 64-core AMD EPYC CPUs
~1 TB of memory
NVIDIA RTX A400
Code:
% assumes N, f_center, f_CEO, f_rep, chunk_size, t, L and Fs are defined earlier in the script
F_max = f_center + N * f_rep + f_CEO;
num_chunks = ceil(N / chunk_size);
if isempty(gcp('nocreate'))
    parpool;                      % start a parallel pool if none is running
end
X = zeros(1, length(t));
parfor c = 1:num_chunks
    start_idx = (c - 1) * chunk_size + 1;
    end_idx = min(c * chunk_size, N);
    % comb-line frequencies handled by this chunk
    f_k_chunk = f_center + f_CEO + (start_idx - N/2) * f_rep : f_rep : f_center + f_CEO + (end_idx - N/2) * f_rep;
    chunk_X = zeros(1, length(t));
    for j = 1:length(f_k_chunk)
        chunk_X = chunk_X + cos(2 * pi * f_k_chunk(j) * t);
    end
    X = X + chunk_X;              % reduction across chunks
end
% window the summed signal and compensate for the window's coherent gain
window = blackmanharris(L)';
X_windowed = X .* window;
X_windowed = X_windowed / mean(window);
figure; plot(t, X_windowed, 'LineWidth', 1);
title("Time Domain Signal");
xlabel("Time (seconds)");
% single-sided amplitude spectrum
Y = fft(X_windowed);
P2 = abs(Y / L);
P1 = P2(1:L/2 + 1);
P1(2:end-1) = 2 * P1(2:end-1);
f = Fs * (0:(L/2)) / L;
figure; plot(f, P1, 'LineWidth', 1);
title("Single-Sided Amplitude Spectrum");
xlabel("Frequency (Hz)");
Any ideas about what the problem might be are welcome; I am at the end of my rope.
Thanks
Update: I have done some testing; apparently, at 10 workers, it uses 250 GB of memory... While it is working now, the next question is why the script requires so much memory, and what all of the overhead is for.
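My current rough guess is just a back-of-the-envelope check, not a measured breakdown. Nt below is only a placeholder for numel(t) (my real sample count is different), and the three-copies-per-worker figure is an assumption about how parfor handles the broadcast t, the local chunk_X, and each worker's share of the reduction into X:

Nt      = 1e9;                    % placeholder for numel(t), not my real value
workers = 10;
copies_per_worker = 3;            % assumed: broadcast t + chunk_X + local part of X
bytes = 8 * Nt * (1 + workers * copies_per_worker);   % doubles are 8 bytes; +1 for the client's copy of t
fprintf('~%.0f GB across %d workers\n', bytes / 1e9, workers);

If that picture is roughly right, the overhead is simply that every worker carries full-length copies of the length(t) vectors, so the footprint scales with the number of workers rather than with chunk_size.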
Thanks