I’ve been running a small daily job against the new history/csv endpoint, requesting the previous day’s 10-minute averages for 6 outdoor sensors. Today I noticed the CSV files had all doubled in size; it turns out each file contains two rows for every distinct timestamp.
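Until that’s fixed on the server side, the duplicates can be dropped after download. A minimal sketch in Python, assuming the history CSV has a `time_stamp` column (the column name and sample values here are illustrative, not confirmed against the actual endpoint output):

```python
import csv
import io

def dedupe_rows(csv_text, key="time_stamp"):
    """Keep only the first row seen for each distinct timestamp."""
    reader = csv.DictReader(io.StringIO(csv_text))
    seen, rows = set(), []
    for row in reader:
        if row[key] not in seen:
            seen.add(row[key])
            rows.append(row)
    return rows

# Tiny doubled-up sample, shaped like the problem described above.
sample = (
    "time_stamp,pm2.5_a\n"
    "1656500000,7.1\n"
    "1656500000,7.1\n"
    "1656500600,7.4\n"
    "1656500600,7.4\n"
)
print(len(dedupe_rows(sample)))  # 2 rows remain after deduplication
```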
Would it be possible to include an example of how an ISO 8601 datetime string should look in the request? I’ve had no issue using UNIX timestamps, but I’ve had no success with the ISO format (e.g. 2022-06-27T15:46:23, 2022-06-29T16:04:59Z, or 20220629T154620Z). The API only returns “invalid timestamp” errors.
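For anyone else experimenting: one way to generate a UNIX timestamp and the common ISO 8601 variants from the same instant, so each can be tried against the endpoint. This only shows how the strings are built; it doesn’t confirm which formats the API actually accepts:

```python
from datetime import datetime, timezone

t = datetime(2022, 6, 29, 16, 4, 59, tzinfo=timezone.utc)

unix_ts = int(t.timestamp())                      # 1656518699 (works per this thread)
iso_extended = t.strftime("%Y-%m-%dT%H:%M:%SZ")   # 2022-06-29T16:04:59Z
iso_basic = t.strftime("%Y%m%dT%H%M%SZ")          # 20220629T160459Z

print(unix_ts, iso_extended, iso_basic)
```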
Currently ThingSpeak stores sensor data in four channels (Primary / Secondary A&B), and the sensor data download tool from the PurpleAir map likewise provides the data in four CSV files. Since the new API will provide all sensor data in one row, will the download tool eventually provide all data in a single file? My downstream processing software currently assumes four files, and I’d prefer it to work the same regardless of whether the data comes from the download tool or from my application. Your reply will help me decide how to modify my software.
The documentation for the new API’s Get Sensor History endpoint indicates that the “10 minute average history maximum time span is three (3) days”. Will this limitation exist in the final API? It amounts to a limit of 432 records per call; one month of two-minute data would therefore mean 50 requests per sensor. By comparison, ThingSpeak limits rows per call to around 8,000.
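Working within that cap means paging the requests. A sketch of splitting a longer range into three-day windows, one per call (the three-day span comes from the documentation quoted above; everything else here is illustrative):

```python
from datetime import datetime, timedelta, timezone

MAX_SPAN = timedelta(days=3)  # documented cap for 10-minute average history

def windows(start, end, span=MAX_SPAN):
    """Split [start, end) into consecutive chunks no longer than `span`."""
    out = []
    t = start
    while t < end:
        nxt = min(t + span, end)
        out.append((t, nxt))
        t = nxt
    return out

# A 30-day month splits into exactly 10 three-day request windows.
month = windows(datetime(2022, 6, 1, tzinfo=timezone.utc),
                datetime(2022, 7, 1, tzinfo=timezone.utc))
print(len(month))  # 10
```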
Hi Adrian. The preview API page no longer displays properly in any browser, so I can’t really review it as I have in the past. I was going to check whether any fields have been added to the history data. Also, how much warning can we expect before the june2022 URL goes away? Will there be a switchover period where both URLs work, or will it all happen at once on a single day?
FYI… I also lost some time trying to use an ISO date without a time; I assumed a bare start date would cover the whole day. So it appears not all ISO formats are valid. I’m now just using the numeric UNIX timestamp for date and time, which works reliably, so I’ve stopped using the ISO format.
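For the “whole day” case while sticking with numeric timestamps, the start and end of a UTC day can be computed directly; a small sketch (whether the endpooint treats these bounds as inclusive or exclusive is not confirmed here):

```python
from datetime import datetime, timedelta, timezone

# Midnight-to-midnight UTC bounds for a single day, as UNIX timestamps.
day = datetime(2022, 6, 27, tzinfo=timezone.utc)
start_ts = int(day.timestamp())                      # 2022-06-27 00:00:00 UTC
end_ts = int((day + timedelta(days=1)).timestamp())  # 2022-06-28 00:00:00 UTC

print(start_ts, end_ts)  # 1656288000 1656374400
```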
In the email you just sent announcing the update, it sounds like you’re saying that sensor data will no longer be available via api.thingspeak.com at all. Is that true? If so, a little more time and/or a clearer hard-stop date would help ensure that developers using ThingSpeak can migrate without downtime.
Would it be possible to have the sensor index/memberID included in the output when requesting sensor history? As it stands, the rows of data I retrieve have no unique identifier attached to them. If I were to simply combine the outputs from two different sensors, I would not know which sensor each row came from. That has so far made it very challenging to compile the data into a useful format.
For example, my current R script pulls the most recent two weeks of hourly data for all of the sensors we have deployed. I’ve had to save that data as separate CSV outputs with the sensor name as the file name. If each row of data were attributed to one sensor index, I could instead bind all the outputs into one file and manipulate that as needed. My ideal situation would be a single file with the headers sensor_index, pm2.5_cf_1_a, pm2.5_cf_1_b, temperature, humidity, etc., with the sensor data fields as hourly averages.
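Until the API attaches an identifier to each row, one workaround is to tag each sensor’s rows with its index before binding them together. The script above is in R; this Python sketch shows the same idea, with made-up sensor indexes and a trimmed-down set of columns:

```python
import csv
import io

def tag_rows(csv_text, sensor_index):
    """Attach a sensor_index field to every row of one sensor's history CSV."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    for row in rows:
        row["sensor_index"] = sensor_index
    return rows

# Two per-sensor downloads, then bound into one table.
a = "time_stamp,pm2.5_cf_1_a\n1656500000,7.1\n"
b = "time_stamp,pm2.5_cf_1_a\n1656500000,9.3\n"
combined = tag_rows(a, 1234) + tag_rows(b, 5678)  # 1234/5678 are made-up indexes

print([r["sensor_index"] for r in combined])  # [1234, 5678]
```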
What I’m doing is making the API calls individually, but running them in a loop with a 2-minute pause between calls, so it’s only a few lines of code. That hasn’t (yet) resulted in my being rate limited.
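The loop-with-pause approach is roughly this shape (the fetch function here is a stub standing in for the real API call, the sensor indexes are made up, and the pause is shortened for illustration):

```python
import time

def fetch_history(sensor_index):
    """Stub for the real per-sensor history request (hypothetical)."""
    return f"csv for {sensor_index}"

results = {}
for i, sensor in enumerate([1234, 5678, 9012]):  # made-up sensor indexes
    if i:
        time.sleep(0.01)  # in practice: ~120 seconds between calls
    results[sensor] = fetch_history(sensor)

print(len(results))  # 3
```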
I need help downloading the PurpleAir sensor data via the new path. Please, I need access to the data so I can download it.
Do I need to download some application to get access, now that ThingSpeak is no longer working? Do I need to email firstname.lastname@example.org to get API keys? I’m lost… There’s guidance saying the historical data is accessible through the new API at https://api.purpleair.com, but I can’t get it to work.
Thank you very much!
I was wondering whether these time constraints are still in place. It would be great to be able to query averages and real-time data over longer time periods instead of making multiple GET requests. Is there a reason the time span is so limited?