Lambda function code:
import json
import datetime
import redshift_module.pygresql_redshift_common as db_handler
from decouple import config
import pg

host = config('host')
port = config('port')
db_name = config('db_name')
db_user = config('db_user')
db_password = config('db_password')

def get_connection(host, port, dbname, user, password):
    rs_conn_string = "host=%s port=%s dbname=%s user=%s password=%s" % (
        host, port, dbname, user, password)
    rs_conn = pg.connect(dbname=rs_conn_string)
    rs_conn.query("set statement_timeout = 1200000")
    return rs_conn

def query(con, query):
    res = con.query(query)
    return res

rs_conn = db_handler.get_connection(host, port, db_name, db_user, db_password)
query_string = "call XYZ_warehouse.ABC_revenue_last_7_days();"
res = query(rs_conn, query_string)
print(res.getresults())

def lambda_handler(event, context):
    return {'statusCode': 200, 'body': json.dumps('Hello from Lambda!')}
So the code runs a query function against a Redshift database. The output of the Lambda contains strings like the following:
DEUS revenue is completely matching and revenue is : 5139 and Revenue without shipping is: 4987
What I need to do is capture lines like these from the execution results, maybe store them in a variable or in a log file somewhere so I can then email them. I've tried running .getresults()
on the query object, but it returns an empty string, maybe because it's not returning a record or a table?
Is there a way I can capture the complete output of a Lambda function, or send it to a file somewhere?
CodePudding user response:
Create a dictionary and save all the output you want to send via email.
Then convert it to JSON and send it via SNS or SES.
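A minimal sketch of that approach, assuming an existing SNS topic with email subscribers (the topic ARN and the collected strings below are placeholders, not part of the original code):
import json
import boto3

sns = boto3.client('sns')

def lambda_handler(event, context):
    # Collect the lines you want to email in a dictionary
    results = {
        'DEUS': 'DEUS revenue is completely matching and revenue is : 5139 and Revenue without shipping is: 4987',
    }
    # Publish the JSON-encoded results; email subscribers of the topic receive the message
    sns.publish(
        TopicArn='arn:aws:sns:us-east-1:123456789012:revenue-report',  # placeholder ARN
        Subject='Revenue report',
        Message=json.dumps(results),
    )
    return {'statusCode': 200, 'body': json.dumps('report sent')}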
CodePudding user response:
Your question is a bit unclear; do you also want help with retrieving your query results?
If so, I suggest you read from here.
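For reference, a minimal sketch of pulling rows out of a PyGreSQL query object (this assumes the statement actually returns rows; a stored-procedure CALL may not, which would explain the empty result; the table name is hypothetical):
res = rs_conn.query("select revenue, revenue_without_shipping from XYZ_warehouse.revenue_check")  # hypothetical table
rows = res.getresult()        # list of tuples
dict_rows = res.dictresult()  # list of dicts keyed by column name
for row in dict_rows:
    print(row)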
Is there a way I can capture the complete output of a Lambda function, or send it to a file somewhere?
One option is to send the output of your Lambda function to an S3 bucket, e.g.:
import boto3

# Create an S3 resource (inside Lambda you would normally rely on the
# function's execution role rather than passing explicit credentials)
s3 = boto3.resource(
    's3',
    region_name='us-east-1',
    aws_access_key_id=KEY_ID,
    aws_secret_access_key=ACCESS_KEY
)

# Write the captured output to a new object in the bucket
content = "String content to write to a new S3 file"
s3.Object('my-bucket-name', 'newfile.txt').put(Body=content)
You can then set up an S3 trigger that invokes a second Lambda as soon as a new object is added to the bucket.
That second Lambda can transform the content to JSON and send it via SNS/SES, as in Vaquar Khan's answer; a sketch of such a function follows.
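A minimal sketch of the S3-triggered second Lambda, assuming SES is set up with verified addresses (the sender and recipient addresses are placeholders):
import boto3

s3 = boto3.client('s3')
ses = boto3.client('ses')

def lambda_handler(event, context):
    # The S3 put event carries the bucket and key of the newly created object
    record = event['Records'][0]['s3']
    bucket = record['bucket']['name']
    key = record['object']['key']

    # Read the text the first Lambda wrote
    body = s3.get_object(Bucket=bucket, Key=key)['Body'].read().decode('utf-8')

    # Email it via SES
    ses.send_email(
        Source='reports@example.com',                      # placeholder sender
        Destination={'ToAddresses': ['me@example.com']},   # placeholder recipient
        Message={
            'Subject': {'Data': 'Lambda query output'},
            'Body': {'Text': {'Data': body}},
        },
    )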
CodePudding user response:
If I understood correctly, you want to store the output of the Lambda in a separate place/file so you can use it in an email.
If it is pure Lambda output, you can simply use Lambda destinations.
Destinations allow you to segregate results based on success or failure: if the invocation succeeds, you can send the result to a success destination (SNS, SQS, EventBridge, or a different Lambda); if it fails, you can send it to a failure destination.
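A minimal sketch of wiring up destinations with boto3 (the function name and topic ARNs are placeholders; note that destinations only fire for asynchronous invocations):
import boto3

lambda_client = boto3.client('lambda')

# Route the function's return value to one SNS topic on success
# and the error record to another on failure
lambda_client.put_function_event_invoke_config(
    FunctionName='my-redshift-report-function',  # placeholder function name
    DestinationConfig={
        'OnSuccess': {'Destination': 'arn:aws:sns:us-east-1:123456789012:report-success'},
        'OnFailure': {'Destination': 'arn:aws:sns:us-east-1:123456789012:report-failure'},
    },
)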