I have the metric response in a JSON file and I want to send the metrics "Size" and "fileCreatedAt" to CloudWatch. In the future, I would like to get the metrics from the S3 bucket in AWS, but for now I wanted to write some sample code.
This is the JSON file:
{
    "nextToken": "sample-token",
    "files": [
        {
            "id": "xxxa",
            "fileCreatedAt": "2021",
            "size": 1234,
            "dataSourceId": "xx32"
        },
        {
            "id": "xxxb",
            "fileCreatedAt": "2022",
            "size": 3560,
            "dataSourceId": "xx33"
        },
        {
            "id": "xxxa",
            "fileCreatedAt": "2021",
            "size": 1234,
            "dataSourceId": "xx32"
        },
        {
            "id": "xxxb",
            "fileCreatedAt": "2022",
            "size": 3560,
            "dataSourceId": "xx33"
        }
    ]
}
The boto3 sample code:
import json
import boto3
import botocore
import logging
from datetime import datetime
from datetime import timedelta

def lambda_handler(event, context):
    # TODO implement
    s3_client = boto3.client('s3')
    cw_client = boto3.client('cloudwatch')
    response = s3_client.get_object(Bucket='myawsjsonbucket', Key='sample.json')
    metric_response = cw_client.put_metric_data(
        Namespace='JSON/AWS',
        MetricData=[
            {
                'MetricName': 'filesize',
                'Dimensions': [
                    {
                        "Name": "size",
                        "Value": "some unique value"
                    },
                ],
                'Value': size,
                'Timestamp': datetime.now()
            },
            {
                'MetricName': 'fileCreatedAt',
                'Dimensions': [
                    {
                        "Name": "fileCreatedAt",
                        "Value": "some unique value"
                    },
                ],
                'Value': fileCreatedAt,
                'Timestamp': datetime.now()
            },
        ]
    )
    print(response)
I tried to write the code to read the metrics from the metric response in the JSON file and transfer them using boto3. Unfortunately, I got stuck with the boto3 code and don't know how to proceed further.
CodePudding user response:
As mentioned in the docs, you can send the metrics in a single request or across multiple requests, whichever you find more efficient based on the input frequency:
You can publish either individual data points in the Value field, or arrays of values and the number of times each value occurred during the period by using the Values and Counts fields in the MetricDatum structure. Using the Values and Counts method enables you to publish up to 150 values per metric with one PutMetricData request, and supports retrieving percentile statistics on this data.
Each PutMetricData request is limited to 1 MB in size for HTTP POST requests. You can send a payload compressed by gzip. Each request is also limited to no more than 1000 different metrics.
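For illustration, a batched publish using the Values and Counts fields could look roughly like this (the numbers are placeholders, not parsed from your file):

from datetime import datetime
import boto3

cw_client = boto3.client('cloudwatch')

# One entry per distinct value; Counts[i] records how often Values[i] occurred.
cw_client.put_metric_data(
    Namespace='JSON/AWS',
    MetricData=[
        {
            'MetricName': 'filesize',
            'Timestamp': datetime.now(),
            'Values': [1234.0, 3560.0],
            'Counts': [2.0, 2.0],
            'Unit': 'Bytes'
        },
    ]
)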
As per your snippet, you have to make the request like this:
cw_client.put_metric_data(
    Namespace='JSON/AWS',
    MetricData=[
        {
            'MetricName': 'filesize',
            'Dimensions': [
                {
                    "Name": "Size",
                    "Value": "some unique value"
                },
            ],
            'Value': size,              # numeric size parsed from the JSON file
            'Timestamp': datetime.now()
        },
        {
            'MetricName': 'fileCreatedAt',
            'Dimensions': [
                {
                    "Name": "fileCreatedAt",
                    "Value": "some unique value"
                },
            ],
            'Value': fileCreatedAt,     # numeric value parsed from the JSON file
            'Timestamp': datetime.now()
        },
    ]
)
The dimension has two properties, Name and Value, which together act as a unique identifier of the dimension (I have kept "some unique value" so that it can be used for some grouping logic; otherwise, for each file size you would end up with a separate dimension). The value of the dimension has to be passed in the Value property.
An explanation of dimensions is given in the docs:
A dimension is a name/value pair that is part of the identity of a metric. Because dimensions are part of the unique identifier for a metric, whenever you add a unique name/value pair to one of your metrics, you are creating a new variation of that metric. For example, many Amazon EC2 metrics publish InstanceId as a dimension name, and the actual instance ID as the value for that dimension.
You can assign up to 30 dimensions to a metric.
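Putting it together, a rough sketch of the handler (assuming the bucket and key from your snippet, that fileCreatedAt can be treated as a numeric year, and using dataSourceId as the dimension value so metrics are grouped per data source) could look like this:

import json
from datetime import datetime

import boto3

s3_client = boto3.client('s3')
cw_client = boto3.client('cloudwatch')

def lambda_handler(event, context):
    # Read and parse the JSON file from S3
    response = s3_client.get_object(Bucket='myawsjsonbucket', Key='sample.json')
    payload = json.loads(response['Body'].read())

    metric_data = []
    for f in payload['files']:
        # Group both metrics by dataSourceId instead of a per-file value
        dimensions = [{'Name': 'dataSourceId', 'Value': f['dataSourceId']}]
        metric_data.append({
            'MetricName': 'filesize',
            'Dimensions': dimensions,
            'Value': float(f['size']),
            'Unit': 'Bytes',
            'Timestamp': datetime.now()
        })
        metric_data.append({
            'MetricName': 'fileCreatedAt',
            'Dimensions': dimensions,
            'Value': float(f['fileCreatedAt']),  # assumes the field is a numeric year
            'Timestamp': datetime.now()
        })

    # A single request accepts up to 1000 metric datums; chunk if the file grows larger
    metric_response = cw_client.put_metric_data(
        Namespace='JSON/AWS',
        MetricData=metric_data
    )
    print(metric_response)

With the dimension set to dataSourceId, each data source gets its own time series for filesize and fileCreatedAt instead of a new series per file.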