Data is ingested into Warp 10 via HTTP POST requests to the update endpoint.

The update endpoint is https://HOST:PORT/api/vX/update, where HOST:PORT is a valid endpoint for the Warp 10 instance and vX is the version of the API you want to use (currently v0).

The Ingress API must be accessed using the POST method.

All requests MUST be authenticated, as described below.

Response status code

HTTP status codes indicate the outcome of the request (success, redirection, client error, server error). The following statuses are common:

200  Successful request. All datapoints pushed were stored successfully.
500  Error: details in the response body. Errors can occur due to syntax or quota issues.

Update POST request

POST /api/v0/update


To be authenticated you need to add an X-Warp10-Token header with a valid WRITE token.

X-Warp10-Token  A valid WRITE token


The body of the POST request contains the Geo Time Series data in GTS input format, one reading per line.

The general format is:

TS/LAT:LON/ELEV NAME{LABELS} VALUE

Please refer to the detailed documentation of the input format for a more thorough tour of the GTS input format.
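As a rough sketch, a single GTS input line can be assembled like this (the helper name and signature are my own, not part of Warp 10):

```python
def gts_line(ts, name, labels, value, lat=None, lon=None, elev=None):
    """Build one GTS input line: TS/LAT:LON/ELEV NAME{LABELS} VALUE."""
    latlon = f"{lat}:{lon}" if lat is not None and lon is not None else ""
    elev_s = "" if elev is None else str(elev)
    label_s = ",".join(f"{k}={v}" for k, v in labels.items())
    return f"{ts}/{latlon}/{elev_s} {name}{{{label_s}}} {value}"

print(gts_line(1380475081000000, "foo", {"label0": "val0", "label1": "val1"}, 123))
# 1380475081000000// foo{label0=val0,label1=val1} 123
```

Note that the latitude:longitude and elevation slots are simply left empty when the reading carries no location.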

Successful response

Upon receiving a successful response from the endpoint (HTTP 200), the data has been stored to disk (in the case of the Standalone version), kept in memory (if the Standalone was configured with an in-memory storage backend), or pushed to Kafka (in the case of the Distributed version).


POST /api/v0/update HTTP/1.1
Host: host
X-Warp10-Token: TOKEN
Content-Type: text/plain

1380475081000000// foo{label0=val0,label1=val1} 123
/48.0:-4.5/ bar{label0=val0} 3.14
1380475081123456/45.0:-0.01/10000000 foobar{label1=val1} T

Example using curl

curl -H 'X-Warp10-Token: TOKEN_WRITE' -H 'Transfer-Encoding: chunked' -T METRICS_FILE 'https://HOST:PORT/api/v0/update'

Where TOKEN_WRITE is a valid write token for the platform, METRICS_FILE is a text file in which every line is in GTS input format, and HOST:PORT is a valid endpoint for the public Warp 10 API.

The Transfer-Encoding header is needed; otherwise curl will try to load the whole file in memory, which will lead to failures when you attempt to push millions or billions of datapoints at once.
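The same streaming upload can be sketched in Python with only the standard library; ENDPOINT and TOKEN are placeholders you must supply, and passing an iterable body makes urllib send the request with Transfer-Encoding: chunked (Python 3.6+), so the file is never fully materialized in memory:

```python
import io
import urllib.request

def chunks(fileobj, size=65536):
    """Yield the file in fixed-size pieces so it is never fully in memory."""
    while True:
        piece = fileobj.read(size)
        if not piece:
            return
        yield piece

def push(endpoint, token, fileobj):
    """POST a metrics file to the update endpoint with chunked encoding."""
    req = urllib.request.Request(
        endpoint,
        data=chunks(fileobj),  # iterable body => Transfer-Encoding: chunked
        method="POST",
        headers={"X-Warp10-Token": token, "Content-Type": "text/plain"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 200 when all datapoints were stored

# Usage (not executed here):
# with open("METRICS_FILE", "rb") as f:
#     push("https://HOST:PORT/api/v0/update", "TOKEN_WRITE", f)
```

This is a sketch, not an official client; for production use, Warp 10 client libraries or curl as shown above are equally valid.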

Here is a sample METRICS_FILE:

1382441207762000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 79.16
1382441237727000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 75.87
1382441267504000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 74.46
1382441267504000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 73.55
1382441297664000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 72.30
1382441327765000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 70.73
1382441327765000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 69.50
1382441357724000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 68.24
1382441387792000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 66.66
1382441387792000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 65.73

Performance tip: batch your data. If you make one ingress call per line, the HTTP overhead will be huge. Ingress size has no limit: you can curl a 200 GB file and the ingestion rate will remain very high. Batching your data per series is even faster, because you don't need to repeat the class name and labels on each line; just start the line with '='.

# no latitude, no longitude, no elevation; same series for the following lines.
# starting a line with '=' means "keep the class name and labels from the previous line".
1521444669000000// tool_speed{plant=paris,machine=47} 104.392492
=1521444669000020// 105.240093
=1521444669000040// 105.336131
=1521444669000060// 106.168065
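The '=' compaction above can be sketched as a small encoder (the helper name and input shape are my own, not a Warp 10 API; readings are assumed to carry no location or elevation and to arrive grouped by series):

```python
def compact_gts(readings):
    """Emit GTS input lines, starting a line with '=' whenever the
    class name and labels are the same as on the previous line."""
    out, prev = [], None
    for ts, name, labels, value in readings:
        if (name, labels) == prev:
            out.append(f"={ts}// {value}")
        else:
            out.append(f"{ts}// {name}{{{labels}}} {value}")
        prev = (name, labels)
    return out

lines = compact_gts([
    (1521444669000000, "tool_speed", "plant=paris,machine=47", "104.392492"),
    (1521444669000020, "tool_speed", "plant=paris,machine=47", "105.240093"),
])
print("\n".join(lines))
```

Only the first line of each run repeats the class name and labels; every following reading of the same series costs just a timestamp and a value.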


The curl command has no size limit. However, if your files are small enough to be handled by JavaScript, you can use the WarpStudio tool to push data to your platform:

WarpStudio - example