I am migrating from InfluxDB to QuestDB and I have exported my data (using influxd inspect) as a large file containing all my ILP points. It looks something like this (just several gigabytes of it):
diagnostics,device_version=v1.0,driver=Albert,fleet=East,model=F-150,name=truck_1027 current_load=2658 1451612300000000000
diagnostics,device_version=v1.0,driver=Albert,fleet=East,model=F-150,name=truck_1027 current_load=3436 1451612310000000000
readings,driver=Trish,fleet=West,model=H-2,name=truck_972 velocity=89 1451831680000000000
Please note I exported a whole bucket, so the ILP file contains entries for several measurements/tables.
I want to load this into QuestDB, but as far as I can see the HTTP endpoint only supports loading CSV files. I know QuestDB supports ingesting ILP, but the official clients don't accept sending an ILP file: it seems that with the client libraries I have to compose an object representing each point and then send it over. I could read the file line by line, parse it, and then use the Python client to send the points, but I am wondering if there is a better way.
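For reference, this is roughly what I mean by parsing each line myself. A naive split (which ignores ILP escaping rules for commas and spaces inside values) would be something like:

def parse_ilp_line(line):
    # e.g. "diagnostics,driver=Albert,fleet=East current_load=2658 1451612300000000000"
    head, fields_part, timestamp = line.strip().rsplit(' ', 2)
    measurement, *tag_pairs = head.split(',')
    tags = dict(pair.split('=', 1) for pair in tag_pairs)
    fields = dict(pair.split('=', 1) for pair in fields_part.split(','))
    return measurement, tags, fields, int(timestamp)

and then I would feed each parsed point into the client, which feels like a lot of unnecessary work.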
CodePudding user response:
QuestDB does support the ILP protocol natively: you can send your ILP points directly over a TCP socket connection (port 9009 by default).
The official client libraries do exactly that under the hood; they just wrap it in a more convenient API so you don't have to compose the raw messages yourself, which would be error-prone.
In your case, since you already have valid ILP points, you can simply iterate over your file and send each line over the socket. Here is a basic example using Python:
import socket
import sys

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

def send_utf8(msg):
    # ILP is plain text, so encode each line as UTF-8 before sending
    # print(msg)  # noisy for a multi-gigabyte file; uncomment only for debugging
    sock.sendall(msg.encode())

if __name__ == '__main__':
    try:
        # 9009 is QuestDB's default port for ILP over TCP
        sock.connect(('localhost', 9009))
        with open("YOUR_FILE") as infile:
            for line in infile:
                # each line is already a valid ILP point (newline included), send it as-is
                send_utf8(line)
    except socket.error as e:
        sys.stderr.write(f'Got error: {e}')
    finally:
        sock.close()
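If throughput matters for a file that size, you also don't have to send one line at a time. TCP is a byte stream, so you can read the file in binary mode and push it in larger chunks; the server still sees the same newline-delimited ILP. A rough sketch of that variant (the chunk size below is just an illustrative value, not something tuned):

import socket

CHUNK_SIZE = 64 * 1024  # illustrative buffer size, adjust as you see fit

def stream_ilp_file(path, host='localhost', port=9009):
    # Stream the file as raw bytes; chunk boundaries do not need to
    # line up with ILP line boundaries because TCP preserves ordering.
    with socket.create_connection((host, port)) as sock:
        with open(path, 'rb') as infile:
            while True:
                chunk = infile.read(CHUNK_SIZE)
                if not chunk:
                    break
                sock.sendall(chunk)

stream_ilp_file('YOUR_FILE')

Either way, keep an eye on the server side: ILP over TCP does not send acknowledgements back, so any parsing errors are only reported in the QuestDB server logs.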