I have a simple task:
- Subscribe to messages on a Redis channel
- Transform each message, e.g. a hash keyed
  '<user_id>|<user_type>|<event_type>|...'
  with items { 'param_1': 'param_1_value', 'param_2': 'param_2_value', ... }
  into tabular form:
  user_id | event_type | param_1 | param_2 | ... |
  ---|---|---|---|---|
  <user_id> | <event_type> | cleaned(param_1_value) | cleaned(param_2_value) | ... |
- Append the result to an existing table in Postgres
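For concreteness, a minimal Python sketch of the transform step (the exact key layout and what `cleaned()` does are illustrative assumptions on my side):

```python
import re

def cleaned(value):
    """Illustrative cleaning: collapse control characters, trim whitespace."""
    return re.sub(r"[\x00-\x1f]+", " ", value).strip()

def transform(key, items):
    """Flatten one Redis hash into a row dict.

    key:   '<user_id>|<user_type>|<event_type>|...'
    items: {'param_1': 'param_1_value', 'param_2': 'param_2_value', ...}
    """
    user_id, _user_type, event_type, *_ = key.split("|")
    row = {"user_id": user_id, "event_type": event_type}
    row.update((name, cleaned(value)) for name, value in items.items())
    return row
```

The row dict can then be turned into a parameterized `INSERT` against the Postgres table.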
Additional context:
- The volume of events is rather small
- The target table must be refreshed at least every ~15 minutes; near-real-time is not required
- The solution must be deployable on premises
- Using anything other than Redis as the queue is not an option
The best solution I came up with is Kafka, with the Kafka Redis Source Connector (https://github.com/jaredpetersen/kafka-connect-redis) and the Kafka Postgres Sink Connector (https://github.com/ibm-messaging/kafka-connect-jdbc-sink). It seems reasonable, but the task looks like a generic Redis-to-Postgres ETL, and I wonder whether there really is no simpler out-of-the-box solution.
CodePudding user response:
You could just write a script and run it via cron. But take a look at the Benthos project: it runs easily on premises, and what you describe can be done entirely via configuration for Redis -> Postgres.
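To give an idea of the shape, a rough Benthos config sketch for this pipeline (the DSN, channel, table and column names are placeholders, the mapping assumes the published message is JSON with `key` and `items` fields, and field names should be checked against the current Benthos docs):

```yaml
input:
  redis_pubsub:
    url: redis://localhost:6379
    channels: [ events ]

pipeline:
  processors:
    - mapping: |
        # Key format: '<user_id>|<user_type>|<event_type>|...'
        let parts = this.key.split("|")
        root.user_id = $parts.index(0)
        root.event_type = $parts.index(2)
        root.param_1 = this.items.param_1.trim()
        root.param_2 = this.items.param_2.trim()

output:
  sql_insert:
    driver: postgres
    dsn: postgres://user:pass@localhost:5432/db
    table: events
    columns: [ user_id, event_type, param_1, param_2 ]
    args_mapping: root = [ this.user_id, this.event_type, this.param_1, this.param_2 ]
```

That's the whole deployment: one binary plus one config file, which fits the small-scale, on-prem constraints well.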