PKI Hacking for Fun and Profit - Part III

This is the final article in my PKI Hacking for Fun and Profit series. It follows up on Part II and shows how you can use AWS API Gateway and AWS Lambda to handle mTLS traffic from your factory devices. The result is a service that scales as you need.

In PKI Hacking for Fun and Profit - Part II, I created a simple example of an event logging agent that lets devices use a customer-owned mTLS server.

This article recreates the same API as the blog-log-server, but on AWS.

DISCLAIMER - I am not an AWS expert. The settings below are probably not optimal, especially the IAM roles.

Create a log group

We need a CloudWatch Logs log group:

  $ aws logs create-log-group --log-group-name pki-blog

Configure permissions

The Lambda needs a few IAM pieces in place: a role it can assume, plus permissions to create CloudWatch log streams and upload events.

  $ aws iam create-role \
      --role-name pki-blog-lambda \
      --assume-role-policy-document '{"Version": "2012-10-17","Statement": [{ "Effect": "Allow", "Principal": {"Service": "lambda.amazonaws.com"}, "Action": "sts:AssumeRole"}]}'

  $ aws iam attach-role-policy \
      --role-name pki-blog-lambda \
      --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole

  $ cat >policy.json <<EOF
  {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogStream",
                "logs:DescribeLogStreams",
                "logs:PutLogEvents"
            ],
            "Resource": "*"
        }
    ]
  }
  EOF

  $ aws iam put-role-policy \
      --role-name pki-blog-lambda \
      --policy-name LogPermissions \
      --policy-document file://policy.json
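As a quick local sanity check before the put-role-policy call, the policy document can be parsed and its actions compared against the three CloudWatch Logs permissions above (a minimal sketch; the inline JSON simply mirrors policy.json so the check is self-contained):

```python
import json

# Mirror of policy.json from above, inlined for a self-contained check.
policy_json = '''{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "VisualEditor0",
    "Effect": "Allow",
    "Action": ["logs:CreateLogStream", "logs:DescribeLogStreams", "logs:PutLogEvents"],
    "Resource": "*"
  }]
}'''

policy = json.loads(policy_json)
needed = {"logs:CreateLogStream", "logs:DescribeLogStreams", "logs:PutLogEvents"}
# The role should grant exactly what the Lambda needs, nothing more.
assert set(policy["Statement"][0]["Action"]) == needed
```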

Create the Lambda

Create the Lambda payload, blog-log-server.zip, with:

$ cat >lambda_function.py <<'EOF'
import json
import time
import traceback

import boto3

def lambda_handler(event, context):

    try:
        group = 'pki-blog'
        client = boto3.client('logs')

        # The device's UUID is the CN of the mTLS client certificate,
        # which API Gateway passes through in payload format 2.0.
        subj = event["requestContext"]["authentication"]["clientCert"]["subjectDN"]
        for item in subj.split(","):
            k, v = item.split("=", 1)
            if k == "CN":
                uuid = v
                break
        else:
            return {"statusCode": 400, "body": "Client certificate missing CN="}

        # CloudWatch expects millisecond timestamps; the agent sends seconds.
        events = []
        items = json.loads(event["body"])
        for item in items:
            events.append({
                "timestamp": item["Time"] * 1000,
                "message": item["Msg"],
            })

        if not events:
            return {"statusCode": 200, "body": ""}

        try:
            # Look up this device's log stream; create it on first contact.
            logdescribe = client.describe_log_streams(logGroupName=group, logStreamNamePrefix=uuid)
            logStream = logdescribe['logStreams'][0]
        except IndexError:
            client.create_log_stream(logGroupName=group, logStreamName=uuid)
            logdescribe = client.describe_log_streams(logGroupName=group, logStreamNamePrefix=uuid)
            logStream = logdescribe['logStreams'][0]

        # A stream that has never received events has no uploadSequenceToken
        # yet; dict.get handles that without a try/except.
        sequenceToken = logStream.get('uploadSequenceToken', '0')

        resp = client.put_log_events(
            logGroupName=group,
            logStreamName=uuid,
            logEvents=events,
            sequenceToken=sequenceToken,
        )

        resp['uuid'] = uuid

        return {
            'statusCode': resp['ResponseMetadata']["HTTPStatusCode"],
            'body': json.dumps(resp),
        }
    except Exception as e:
        data = {
            "error": traceback.format_exc(),
            "event": event,
        }
        return {"statusCode": 500, "body": json.dumps(data, indent=2)}
EOF
$ zip blog-log-server.zip lambda_function.py
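Before wiring anything up in AWS, the handler's CN extraction can be exercised locally with a stub event. The field names below mirror API Gateway's HTTP API payload format 2.0, and the sample subjectDN is made up:

```python
# Stand-alone copy of the CN-extraction logic from lambda_handler,
# runnable locally against a hand-built event.
def extract_cn(event):
    subj = event["requestContext"]["authentication"]["clientCert"]["subjectDN"]
    for item in subj.split(","):
        k, v = item.split("=", 1)
        if k == "CN":
            return v
    return None  # the real handler returns a 400 here

event = {
    "requestContext": {
        "authentication": {
            "clientCert": {"subjectDN": "CN=f81d4fae-7dec-11d0-a765-00a0c91e6bf6,OU=prod"}
        }
    }
}
print(extract_cn(event))  # f81d4fae-7dec-11d0-a765-00a0c91e6bf6
```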

Now we can create a Lambda:

  # NOTE: export AWS_ID to match your account
  $ aws lambda create-function \
      --function-name blog-log-server \
      --runtime python3.8 \
      --zip-file fileb://blog-log-server.zip \
      --handler lambda_function.lambda_handler \
      --role arn:aws:iam::${AWS_ID}:role/pki-blog-lambda

Create API Gateway

$ aws apigatewayv2 create-api \
    --name pki-cli \
    --protocol-type HTTP \
    --target arn:aws:lambda:us-east-2:${AWS_ID}:function:blog-log-server

# capture the "ApiId" from the previous command ^
$ export API_ID=<TODO>

# NOTE: the lambda ARN will be slightly different for your AWS account:
$ aws apigatewayv2 create-integration \
    --api-id $API_ID \
    --integration-type AWS_PROXY \
    --payload-format-version 2.0 \
    --integration-uri arn:aws:lambda:us-east-2:${AWS_ID}:function:blog-log-server

# Capture the "IntegrationId" from the previous command ^
$ export INTEGRATION_ID=<TODO>
$ aws apigatewayv2 create-route \
    --api-id $API_ID \
    --route-key "POST /" \
    --target integrations/$INTEGRATION_ID

Create custom domain for mTLS

This part is painful and requires some ability to configure DNS. In my case, I did the following from API Gateway.

This required me to create an Amazon-issued certificate using AWS Certificate Manager. I also had to upload my CA certificates to an S3 bucket. You can get these by copying the "Device Authentication Certificates" section of the fioctl keys ca show output.

These commands got my "andy-corp" domain working:

$ aws apigateway create-domain-name \
    --domain-name andy-corp.foundries.io \
    --mutual-tls-authentication truststoreUri=s3://andy-corp-demo/truststore.pem \
    --security-policy TLS_1_2 \
    --endpoint-configuration types=REGIONAL \
    --certificate-arn arn:aws:acm:us-east-2:${AWS_ID}:certificate/9b8ee4a8-3985-4764-a3e5-aa32cb147d98
$ aws apigatewayv2 create-api-mapping \
    --domain-name andy-corp.foundries.io --api-id $API_ID --stage \$default

Testing it out

$ curl -v -X POST -H "content-type: application/json" \
    --cert ./client.pem --key ./pkey.pem  \
    -d '[{"Time": 1627336731, "Msg": "pki-blog-test"}]' \
    https://andy-corp.foundries.io/
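The request body can also be built programmatically instead of hard-coding a timestamp; a small sketch matching the record format the curl example and the Lambda use (`make_body` is a hypothetical helper, not part of the agent):

```python
import json
import time

def make_body(messages):
    # One record per message: epoch seconds plus the text, matching
    # what lambda_handler expects in event["body"].
    now = int(time.time())
    return json.dumps([{"Time": now, "Msg": m} for m in messages])

body = make_body(["pki-blog-test"])
```

The result can be handed to curl with -d "$body".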

You can then watch log entries start to appear in CloudWatch:

# List log streams (i.e. devices):
$ aws logs describe-log-streams --log-group-name pki-blog

# Get log events:
$ aws logs get-log-events --log-group-name pki-blog --log-stream-name <logStreamName from above>
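The get-log-events output is JSON, so the messages are easy to pull out in a few lines; a sketch against a hand-written sample response (trimmed, with the forward/backward tokens omitted):

```python
import json

# Hand-written sample of the shape `aws logs get-log-events` returns.
sample = json.loads('''
{
  "events": [
    {"timestamp": 1627336731000, "message": "pki-blog-test", "ingestionTime": 1627336732000}
  ]
}
''')

for ev in sample["events"]:
    # CloudWatch timestamps are milliseconds since the epoch.
    print(ev["timestamp"] // 1000, ev["message"])  # 1627336731 pki-blog-test
```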
