Proper way to deploy a node/express app to a lambda function


I am looking to deploy a standard node/express app to a lambda function and use it as the back end of my app. I have the API Gateway set up, and everything seems to be working fine, as I get a "hello world" response when I hit the API endpoint for the back end.

The issue is that I need to upload new iterations of the back end, and I don't know how to push the code from my local repo, GitHub, or anywhere else onto the lambda function.

This page says that you should zip it and push it, but that isn't very descriptive. It also looks like it would create a new lambda each time you run it:

zip function.zip index.js

aws lambda create-function --function-name my-function \
--zip-file fileb://function.zip --handler index.handler --runtime nodejs12.x \
--role arn:aws:iam::123456789012:role/lambda-ex

Then, there's using SAM to build and deploy the node app. This seems like a lot of work and setup for a simple app like this. I can set up the front end to deploy to an S3 bucket each time I push to master. Can I do something like that with lambda?

CodePudding user response:

Instead of using the aws lambda create-function command, you can use aws lambda update-function-code to update an existing lambda:

aws lambda update-function-code \
--function-name <function-name> \
--zip-file fileb://function.zip
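If you want something closer to the push-to-master workflow you described for the front end, you can wrap the zip and update steps in a small script and call it from a git hook or CI job. A minimal sketch (the function name and the file list are placeholders for your own):

#!/usr/bin/env bash
# deploy.sh - package the current code and push it to an existing Lambda function.
# FUNCTION_NAME and the zipped files are placeholders; adjust for your app.
set -euo pipefail

FUNCTION_NAME=my-function

# Zip the handler plus the runtime dependencies it needs.
zip -r function.zip index.js node_modules

aws lambda update-function-code \
  --function-name "$FUNCTION_NAME" \
  --zip-file fileb://function.zip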

CodePudding user response:

I personally would use the SAM CLI. This allows you to build, package, and deploy your lambda locally with a CloudFormation template - IMHO a better way to go for your application stack.
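As a rough sketch of the local build-and-test part of that workflow, once the template shown further down is in place (sam local runs your function in a Docker container, so Docker must be installed; event.json is just a placeholder sample payload):

# Build the function and its dependencies as defined in template.yml
sam build

# Serve the API locally (default http://127.0.0.1:3000) so you can hit your routes
sam local start-api

# Or invoke the single function directly with a sample event payload
sam local invoke --event event.json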

You define all of your resources - API Gateway, routes, methods, integrations and permissions - in the template. Then use sam deploy to not only create your infrastructure, but also deploy your application code when it changes.

Here is an example SAM CloudFormation template that you place in the root of your repo (this one assumes a lambda function that needs to access a DynamoDB table):

AWSTemplateFormatVersion: 2010-09-09
Transform: AWS::Serverless-2016-10-31
Description: Your Stack Name


Resources:

  DynamoTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: my-table
      # "id" is assumed to be a string partition key
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH
      BillingMode: PAY_PER_REQUEST


  Lambda:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: your-lambda-name
      CodeUri: app/
      Handler: app.handler
      Runtime: nodejs14.x
      Environment:
        Variables:
          DYNAMO_TABLE: !Ref DynamoTable
      Policies:
        - AWSLambdaExecute
        - Version: 2012-10-17
          Statement:
            - Effect: Allow
              Action:
                - dynamodb:GetItem
                - dynamodb:Query
                - dynamodb:Scan
                - dynamodb:PutItem
                - dynamodb:UpdateItem
              Resource:
                - !Sub 'arn:aws:dynamodb:${AWS::Region}:${AWS::AccountId}:table/${DynamoTable}'
      Events:
        ApiEvent:
          Type: Api
          Properties:
            Path: /path
            Method: get
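One note, since the question is about an Express app: the Handler above (app.handler) is assumed to export a Lambda handler that wraps the Express instance. A common way to do that is with an adapter library such as serverless-http (the file layout and route here are just illustrative):

// app/app.js - assumed layout matching CodeUri: app/ and Handler: app.handler
const serverless = require('serverless-http');
const express = require('express');

const app = express();

app.get('/path', (req, res) => {
  res.json({ message: 'hello world' });
});

// Lambda entry point: serverless-http translates API Gateway events into
// normal HTTP requests for Express, and the responses back again.
module.exports.handler = serverless(app);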

Now deploying your code (or updating the stack as you add or change resources) is as simple as:

sam deploy --template-file template.yml --stack-name your-stack-name --resolve-s3 --region your-region 

That one sam deploy line does a lot:

I include the --template-file parameter, but it is really not necessary here because the file is called "template.yml", which is the name the deploy command looks for by default.

The --resolve-s3 option automatically creates an S3 bucket to upload your lambda function code to, versus you having to define (and create) a bucket outside of the template. The alternative is to specify a bucket with --s3-bucket. However, you would have to create that bucket BEFORE attempting to create the stack. This is fine, but take it into account when you go to delete your stack: the bucket will not be deleted as part of that process, and you need to remove it in addition to the stack.

I typically add a Makefile and run my build, test, prune, etc. as part of the build and deploy. For example:

publish:
   npm run build && \
   npm test && \
   npm prune --production && \
   sam deploy --stack-name my-stack --resolve-s3 --region us-east-1

then

make publish
   

More than you asked for, I realize - but perhaps someone else will find utility in this.
