How to get exact "created_at" timestamps in ThingSpeak?

Hi everyone, I want to know how much time my data takes from the moment I upload it to a ThingSpeak channel until the channel receives it. To measure this, I send a Unix timestamp as data to my ThingSpeak channel. When ThingSpeak receives the data point, it creates its own timestamp called "created_at". However, "created_at" only has second resolution, while the Unix timestamp I send as data has millisecond resolution. Because of this, the Unix timestamp in the data can end up being larger than the "created_at" timestamp from ThingSpeak, which makes no sense. Does anybody know whether there is a way to make the "created_at" timestamp measure milliseconds as well? Or is there a better way to measure the latency between my system and ThingSpeak?

You can get the system time in a MATLAB Analysis script (for example using datetime('now')) and also use thingSpeakWrite to write data to the channel. You can use a field to store higher precision than the "created_at" timestamp.

Thanks for your reply! But do I understand correctly that when I use MATLAB Analysis I have to call, for example, thingSpeakRead, which also takes some time? The problem with this is that I want to know the exact timestamp at which my data "arrives" at the ThingSpeak channel.

Sure, there will be other delays involved, but you could probably back out some estimates by doing a bunch of measurements over time and taking averages. You can also use webread() and the GET REST command to get data from ThingSpeak. MQTT has the fastest response time.

I tried to do it with MATLAB Analysis, but sometimes my results are very strange.

% readChannelID and readAPIKey must contain the channel ID and read API key.
c = 0;
lastMatlabValue = 0;
limit = 5;
while c < limit
    c = c + 1;
    % Read the latest data point (the Unix timestamp written by the uploader).
    data = thingSpeakRead(readChannelID, 'ReadKey', readAPIKey);
    fprintf('Data: %8f\n', data);
    % Take the current time right after the read and convert it to Unix seconds.
    time = datetime('now', 'TimeZone', 'Europe/Berlin');
    timestamp = posixtime(time);
    fprintf('time: %8f\n', timestamp);
    % Skip empty reads and values that were already processed.
    if isnan(data) || data == lastMatlabValue
        continue
    end
    timeDiff = timestamp - data;
    fprintf('Diff: %8f\n', timeDiff);
    lastMatlabValue = data;
    pause(0.5);
end

This is my MATLAB Analysis script. From my local MATLAB environment I upload the current timestamp to my ThingSpeak channel every second. The MATLAB Analysis script above reads the latest data point that was uploaded and then creates a timestamp. This timestamp (called 'timestamp' in the code) should be later than the timestamp that was read from the ThingSpeak channel, but sometimes it is smaller (i.e., an older timestamp) than the data that was read. This makes no sense to me, so do you see anything that I overlooked?
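For completeness, here is a minimal sketch of the local uploader side described above; writeChannelID and writeAPIKey are hypothetical placeholders for your own channel ID and write API key. It writes the current Unix time in seconds into field 1, so the value is directly comparable to posixtime() in the Analysis script:

% Minimal sketch of the local uploader (writeChannelID and writeAPIKey
% are placeholders for your own channel ID and write API key).
writeChannelID = 1234567;          % hypothetical channel ID
writeAPIKey = 'YOUR_WRITE_KEY';    % hypothetical write API key

for k = 1:5
    % Current local time as Unix seconds; posixtime() returns seconds
    % relative to UTC when the datetime has a time zone set.
    sentTimestamp = posixtime(datetime('now', 'TimeZone', 'Europe/Berlin'));

    % Write the timestamp into field 1 of the channel.
    thingSpeakWrite(writeChannelID, sentTimestamp, 'WriteKey', writeAPIKey);

    % Wait before the next update (increase this if the channel rejects
    % updates because of its minimum update interval).
    pause(1);
end

Note that any offset between the uploader's system clock and the clock used by the machine running the Analysis script goes directly into the measured difference, so the difference can come out negative even though the data was written before it was read.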
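The webread()/REST route mentioned above can also return ThingSpeak's own "created_at" for the latest entry via the "read last entry" endpoint. A minimal sketch, assuming the channel stores the sent Unix timestamp (in seconds) in field 1 and using the same placeholder credentials:

% Minimal sketch of reading the latest entry over the REST API
% (readChannelID and readAPIKey are placeholders).
readChannelID = 1234567;          % hypothetical channel ID
readAPIKey = 'YOUR_READ_KEY';     % hypothetical read API key

% GET the most recent feed entry, including its "created_at" timestamp.
url = sprintf('https://api.thingspeak.com/channels/%d/feeds/last.json', readChannelID);
last = webread(url, 'api_key', readAPIKey);

% "created_at" is an ISO 8601 string in UTC with whole-second resolution.
createdAt = datetime(last.created_at, ...
    'InputFormat', 'yyyy-MM-dd''T''HH:mm:ss''Z''', 'TimeZone', 'UTC');

% The Unix timestamp (seconds) that was written into field 1.
sentTimestamp = str2double(last.field1);

% Approximate upload latency, limited to whole seconds by "created_at"
% and affected by any clock offset between sender and server.
latencySeconds = posixtime(createdAt) - sentTimestamp;
fprintf('Approximate latency: %.0f s\n', latencySeconds);

If you prefer to stay within the toolbox, thingSpeakRead with 'OutputFormat' set to 'timetable' should also return the entry timestamps as the timetable's row times, which is another way to get at "created_at" (still at second resolution).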