Preview of API that includes the ability to get sensor history

Here is the preview url for an upcoming API change:

You will need to use that full URL instead of the standard one to test it.

The documentation there includes two new history endpoints, “Get Sensor History” and “Get Member History”.
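As a rough illustration, here is how a “Get Sensor History” request URL might be assembled in Python. The path and parameter names are taken from a request quoted later in this thread; the base URL is a placeholder, since the real one is omitted above, so treat this as a sketch rather than official usage.

```python
from urllib.parse import urlencode

# Placeholder host -- the real API base URL is not shown in this thread.
BASE_URL = "https://api.example.com"

def sensor_history_csv_url(sensor_index, api_key, start_ts, end_ts, fields):
    """Build a 'Get Sensor History' CSV request URL.

    Path and parameter names mirror the request shown later in this
    thread; this is an illustration, not official documentation.
    """
    params = urlencode({
        "api_key": api_key,
        "start_timestamp": start_ts,  # Unix timestamp, in seconds
        "end_timestamp": end_ts,
        "fields": ",".join(fields),
    })
    return f"{BASE_URL}/v1/sensors/{sensor_index}/history/csv?{params}"

url = sensor_history_csv_url(29633, "YOUR_KEY", 1648623600, 1648796400,
                             ["pm2.5_atm_a", "pm2.5_atm_b"])
```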

We welcome any feedback and if you find something wrong, please let us know!


Thanks for the preview of the upcoming API change. Are there plans for adding something like a “GetMembersHistory” endpoint that would allow for retrieval of historic data from multiple members of a group?


At the moment, we are working on adding more and more planned features to the API. We see how GetSensorsHistory and GetMembersHistory could prove helpful, and both will be considered in the future.

Any ETA for when other values for the ‘average’ parameter will be available (especially the 1440 / 1-day value)? I note the API docs say:

Coming soon: 360 (6 hour), 1440 (1 day), 10080 (1 week), 44640 (1 month), 525600 (1 year).

Great, can’t wait to see what else you’ll add! Is there a way to retrieve the historic data in JSON format (like we got from Thingspeak) or in some format other than CSV?


We are planning on returning the data in JSON format instead of a string. This is being worked on, and should be one of the sooner updates that we see.

We are unsure of the exact ETA for the averages. However, the team is actively working on this, and they should start releasing them within the next few weeks.

Thank you for the updates!

Are there limits to how much data you can request at a given time? My use case is to eventually get 1 day average data for the past 365 days for the 0.3_um_count field for outside sensors in the US. Any thoughts on what kind of throttling this may run into, and how much time it might take to get that data?

Hi Adrian,

I tried this and found that the data download period for a single run is limited to only 2 days. This period should be increased to at least 7 to 14 days.

Thank you,

The max span at the moment depends on the average you are using.

Real-time: 2 days
10 minute: 3 days
30 minute: 7 days
1 hour: 14 days
6 hour: 90 days
1 day: 1 year
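For longer pulls, like the 365-day use case above, the range can be split into windows no longer than the listed maximums. A minimal Python sketch, assuming the mapping below (which just restates the table above and may be revised):

```python
# Max request span (seconds) per averaging interval, per the list above.
# Subject to change; 0 stands in for the real-time (no averaging) case.
MAX_SPAN_SECONDS = {
    0: 2 * 86400,       # real-time: 2 days
    10: 3 * 86400,      # 10 minute: 3 days
    30: 7 * 86400,      # 30 minute: 7 days
    60: 14 * 86400,     # 1 hour: 14 days
    360: 90 * 86400,    # 6 hour: 90 days
    1440: 365 * 86400,  # 1 day: 1 year
}

def split_range(start_ts, end_ts, average):
    """Split [start_ts, end_ts) into windows within the max span."""
    span = MAX_SPAN_SECONDS[average]
    windows = []
    t = start_ts
    while t < end_ts:
        windows.append((t, min(t + span, end_ts)))
        t += span
    return windows
```

Each `(start, end)` pair can then be issued as its own history request.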

This may be revised and expanded though. It is based on the default time span users find most useful in the map.


I am getting the following message/error while downloading the data. Looks like I exceeded the limits.

Can this limit be exceeded while downloading historical data?

Thank you!

File "C:\ProgramData\Anaconda3\lib\site-packages\requests\", line 655, in send
r = adapter.send(request, **kwargs)

File "C:\ProgramData\Anaconda3\lib\site-packages\requests\", line 514, in send
raise SSLError(e, request=request)

SSLError: HTTPSConnectionPool(host='', port=443): Max retries exceeded with url: /v1/sensors/29633/history/csv?api_key=&start_timestamp=1648623600&end_timestamp=1648796400&fields=pm2.5_atm_a%2Cpm2.5_atm_b%2Cpm2.5_cf_1_a%2Cpm2.5_cf_1_b%2Chumidity_a%2Chumidity_b%2Ctemperature_a%2Ctemperature_b%2Cpressure_a%2Cpressure_b (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1051)')))

Is it still not working? This does not look like a limit error. Also, note that the start and end timestamps can be either a Unix timestamp or an ISO 8601 datetime string. Documentation will be updated soon.
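Since both timestamp forms are accepted, here is a small Python sketch producing both representations of the same instant, using only the standard library:

```python
from datetime import datetime, timezone

# One instant, expressed two ways: Unix timestamp and ISO 8601 string.
start = datetime(2022, 3, 30, tzinfo=timezone.utc)

unix_ts = int(start.timestamp())  # seconds since the epoch
iso_str = start.isoformat()       # e.g. '2022-03-30T00:00:00+00:00'
```

Either value could then be passed as `start_timestamp` / `end_timestamp`.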

When I restarted the run with the same sensor_index and dates, it worked.

The error message says “Max retries exceeded with url: /v1/sensors/29633/history/csv?api_key=***&start_timestamp=1648623600&end_timestamp=1648796400&fields=pm2.5_atm_a%2Cpm2.5_atm_b%2Cpm2.5_cf_1_a%2Cpm2.5_cf_1_b%2Chumidity_a%2Chumidity_b%2Ctemperature_a%2Ctemperature_b%2Cpressure_a%2Cpressure_b”

My Python script is working though. I am using Unix timestamps. If any user wants the script, I can send it.

The URL is shown without my keys.

I found the problem: for this sensor_index and timestamp (Unix), the return is ‘0’, which causes the error and the message I posted earlier:

***&start_timestamp=1646899200&end_timestamp=1647072000&fields=pm2.5_atm_a%2Cpm2.5_atm_b%2Cpm2.5_cf_1_a%2Cpm2.5_cf_1_b%2Chumidity_a%2Chumidity_b%2Ctemperature_a%2Ctemperature_b%2Cpressure_a%2Cpressure_b

The question is: why is the return ‘0’?

A try/except block worked for this scenario.
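A generic sketch of that try/except approach, where `fetch` is a hypothetical stand-in for whatever function performs the actual HTTP request:

```python
import time

def fetch_with_retry(fetch, attempts=3, delay=2.0):
    """Call `fetch` up to `attempts` times, pausing between failures.

    `fetch` is any zero-argument callable that performs the download;
    a transient error (e.g. an SSL or connection error) triggers a retry.
    """
    last_error = None
    for _ in range(attempts):
        try:
            return fetch()
        except Exception as err:  # e.g. requests.exceptions.SSLError
            last_error = err
            time.sleep(delay)
    raise last_error
```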


Feel free to test out the script to find bugs.
Thank you!!

Excited for these updates, thank you all!

This may be a known issue, but it seems like the API preview isn’t backwards-compatible with the “Get Sensors Data” endpoint. I have some working requests on that endpoint, but swapping in the june2022 base URL gives me an “InvalidHistoryFieldValueError”.

{
	"api_version": "V1.0.10-0.0.23",
	"time_stamp": 1655477951,
	"error": "InvalidHistoryFieldValueError",
	"description": "A provided history fields value (model) was not found."
}

The error sort of makes sense - I’m requesting the sensor’s model name, but that’s not something that changes over time. The trouble is that I’m only requesting the latest data, not the history. I get a similar error when only requesting a single field like pm2.5.

Hi @r1yk, once the historical endpoints are officially released, the june2022 base URL will be removed, and the standard URL will need to be used. The june2022 URL is temporary to allow access to the preview version.

Thank you @Ethan_Breinholt, fair enough! One more question for now: after the new history endpoint is fully live and returning JSON, would the history/csv endpoint still be available?

Yes, the CSV variant will stay. We will add a JSON one.

I will look into this, it may be a bug where we are validating the fields incorrectly.
EDIT: This is a bug. We will fix it.