AWS PowerShell Module

Amazon Web Services (AWS) is a full-featured cloud platform, serving the needs of application and infrastructure developers, architects, engineers, and even hobbyist users and developers. Amazon offers a PowerShell module to interact with the vast number of services that AWS provides. There are a couple of different “editions” of the AWS PowerShell module.

Editions

  • Standard Edition – The standard AWS PowerShell module is the one that most users are accustomed to. The Standard Edition of the AWS PowerShell module only runs on the Desktop edition of Microsoft PowerShell.
  • Core Edition – The “core” edition of the AWS PowerShell module was rebuilt to work with the Microsoft .NET Core framework. .NET Core was designed as a cross-platform alternative to the “full” .NET Framework that has been included with the Windows operating system for years. Microsoft introduced a new “edition” of PowerShell, called PowerShell Core, which runs cross-platform, on top of the .NET Core framework.

Concepts

.NET String Formatting

Similar to many other languages, such as JavaScript (Node.js) or Python, PowerShell supports string formatting, sometimes also called “template strings.” While this feature is incredibly powerful, not to mention extensible by developers, its most basic form is often used for variable substitution. As with most things in PowerShell, there are several different ways of invoking string formatting, but the most “PowerShell” way of doing this is using the -f operator.

Let’s start by taking a look at a very simplistic example.

'{0} {1}' -f 'Trevor', 'Sullivan'
# Result: "Trevor Sullivan"

Notice how, on the left side of the -f operator, I’m specifying my template string with placeholders inside the curly braces. On the right-hand side of the -f operator, I’m specifying my input arguments to the template string, starting with argument 0, then 1, and so on.

I could also reuse each placeholder anywhere inside the string. Therefore, the following use is valid.

'{0} {1} {0}' -f 'Trevor', 'Sullivan'
# Result: "Trevor Sullivan Trevor"

In these last couple of examples, I’m using static strings as arguments to the template strings, but I could just as well use variables instead.

$FirstName = 'Trevor'
$LastName = 'Sullivan'
'My first name is {0}, and my last name is {1}.' -f $FirstName, $LastName
# Result: "My first name is Trevor, and my last name is Sullivan."

Over the course of this documentation, you’ll notice that I use .NET String Formatting in a variety of examples to dynamically build strings, such as resource names, ARNs, URLs, and so on. I strongly encourage you to get familiar with this PowerShell and .NET feature, and leverage it wherever you need to build dynamic strings.

PowerShell Splatting

PowerShell supports a really nice feature called Splatting. This technique helps you write cleaner looking code, and helps you log / audit what your code is doing before actually invoking it. Basically, any time that you call a PowerShell command / function, you pass in all of the named parameters as a HashTable object. I’ve created a video on this topic, so please check it out, so that you understand the examples in the rest of this document.
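To make the technique concrete, here is the same command invoked the traditional way and then with splatting. The bucket, file, and key names below are hypothetical placeholders.

```powershell
# Traditional invocation, with named parameters in-line
Write-S3Object -BucketName mybucket -File myfile.txt -Key mys3file.txt

# The same call using splatting: collect the named parameters in a HashTable,
# then pass the HashTable to the command with the @ sigil
$WriteParams = @{
    BucketName = 'mybucket'
    File       = 'myfile.txt'
    Key        = 'mys3file.txt'
}
Write-S3Object @WriteParams
```

Because the HashTable is an ordinary object, you can log it, inspect it, or modify it before the command is actually invoked.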

Amazon Resource Name (ARN)

Amazon Resource Names (ARN) are used to uniquely identify any AWS cloud resource. The ARN of a resource includes useful information such as:

  • Cloud partition that the resource belongs to (eg. AWS public cloud, government cloud, China cloud)
  • AWS account number that the resource belongs to
  • Region that the resource resides in
  • Type of resource (eg. S3 Bucket, DynamoDB Table, SQS Queue, etc.)
  • Resource sub-type (if applicable)
  • Resource name

Many of the AWS PowerShell commands either:

  • Return an ARN as a string
  • Return an ARN as a property of an object
  • Accept an ARN as an input parameter to a function / command call
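The general ARN format, and a few illustrative examples, are shown below. The resource names and account number are made up; some services (such as S3) omit the region and account ID fields entirely.

```powershell
# General ARN format:
#   arn:partition:service:region:account-id:resource

'arn:aws:s3:::mybucket'                                  # S3 Bucket (no region / account)
'arn:aws:sqs:us-east-2:123456789012:myqueue'             # SQS Queue
'arn:aws:dynamodb:us-east-2:123456789012:table/mytable'  # DynamoDB Table
```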

Common AWS PowerShell Parameters

There are a few different parameters on AWS PowerShell commands that are considered “common parameters” across nearly all commands in the AWS PowerShell module, with few exceptions. These parameters include some of the following:

  • ProfileName – the AWS Profile Name in your AWS credentials file that will be used to authenticate you to AWS. You can get a list of configured AWS Profiles by using the Get-AWSCredential -ListProfileDetail command.
  • Region – the AWS Region that you want to issue an operation against. You can list out all AWS Regions using the Get-AWSRegion command.
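For example, you can list your configured profiles, and then pass an explicit profile and region to a single command invocation. The profile name below is a hypothetical placeholder.

```powershell
# List the AWS profiles configured in your credentials file
Get-AWSCredential -ListProfileDetail

# Run a single command against a specific profile and region
Get-S3Bucket -ProfileName myprofile -Region us-west-2
```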

PowerShell Functions

PowerShell functions are sometimes referred to as “cmdlets” or “commands.” While there are differences between basic functions, cmdlets, advanced functions, and commands in PowerShell parlance, we’ll use these terms interchangeably when referring to the commands in the AWS PowerShell module.

  • Function – a basic PowerShell function, defined using the function keyword, but without the [CmdletBinding()] attribute. Avoid creating basic functions, as they don’t support pipelining or PowerShell common parameters.
  • Advanced Function – a PowerShell function, defined using the function keyword, and containing the [CmdletBinding()] attribute and param ( ) block.
  • Cmdlet – a PowerShell cmdlet is written in a .NET language such as C# and inherits from the Cmdlet or PSCmdlet class in the PowerShell SDK.
  • Command – in PowerShell, a “command” can have different types. Aliases are considered to be commands, just like functions, cmdlets, and native binaries (applications).
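As a minimal illustration of the “advanced function” shape described above (the function name is hypothetical):

```powershell
# A minimal advanced function: the [CmdletBinding()] attribute and param ( ) block
# enable common parameters such as -Verbose and -ErrorAction
function Get-Greeting {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory = $true)]
        [string] $Name
    )
    'Hello, {0}!' -f $Name
}

Get-Greeting -Name 'Trevor'
# Result: "Hello, Trevor!"
```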

For more information about PowerShell commands, check out the help for Get-Command by running:

Get-Help -Name Get-Command -Parameter CommandType

Installation & Configuration

Installation of AWS PowerShell Module

Installing the AWS PowerShell module is very easy, because it’s hosted in the publicly accessible PowerShell Gallery. The PowerShellGet module is built into PowerShell 5 and later, including PowerShell Core Edition, and provides a simple Install-Module command to install modules directly from the PowerShell Gallery, or even a private NuGet feed.

# Install the AWS PowerShell module for PowerShell Desktop Edition (Windows-only)
Install-Module -Name AWSPowerShell -Scope CurrentUser -Force

# Install the AWS PowerShell module for PowerShell Core Edition (cross-platform)
Install-Module -Name AWSPowerShell.NetCore -Scope CurrentUser -Force

Validating Installation

After installing the AWS PowerShell module, you’ll want to make sure that it was installed successfully. Normally, if you don’t see any errors on your screen after installation, that’s a good indication that it’s present, but generally it’s good to verify its presence on your system. To verify the installation, we use the Get-Module command. When you run Get-Module without any parameters, you get back only a list of modules that have been “imported” into your current PowerShell session. Because we haven’t explicitly imported the AWS PowerShell module, or called any commands inside the module that would invoke module auto-loading, we need to use the -ListAvailable parameter instead.

Get-Module -ListAvailable -Name aws*

You should see some results similar to the following.

PS C:\Users\trevor> Get-Module -ListAvailable -Name aws*

    Directory: C:\Users\trevor\Documents\WindowsPowerShell\Modules

ModuleType Version    Name                                ExportedCommands
---------- -------    ----                                ----------------
Binary     3.3.113.0  AWSPowerShell                       {Add-AASScalableTarget, Add-ACMCertificateTag, Add-ADSConf...
Binary     3.3.48.0   AWSPowerShell                       {Clear-AWSHistory, Set-AWSHistoryConfiguration, Initialize...

Authentication

One of the first operations you’ll need to perform, when setting up the AWS PowerShell module, is to authenticate to AWS. When you log in to AWS as an individual developer, you most likely have what’s called a “root account.” This account owns the AWS billing account, and has full access to all of the cloud resources inside of the account. In order to utilize the AWS PowerShell module, you’ll need to configure an Identity and Access Management (IAM) user account. IAM is an AWS identity service that you gain access to automatically, when you sign up for an AWS account. You can create IAM user accounts and generate access keys which can be used by client applications, including the AWS PowerShell module, to invoke operations against your AWS account.

Your IAM user credentials are stored in a file directly underneath your user profile root, named .aws/credentials. On most non-Windows systems, this can be resolved as ~/.aws/credentials; on Windows you can use %USERPROFILE%\.aws\credentials, or from PowerShell, $env:USERPROFILE\.aws\credentials.

The credentials file doesn’t have any file extension, so if your operating system automatically appends an extension, go ahead and remove it. The format of the credentials file is as follows:

[default]
aws_access_key_id = ...
aws_secret_access_key = ...
[namedprofile1]
aws_access_key_id = ...
aws_secret_access_key = ...

Once you’ve configured the AWS credentials file, set your desired profile by using the Set-AWSCredential PowerShell command in the AWS PowerShell module.

Set-AWSCredential -ProfileName namedprofile1

In addition to setting up your AWS authentication, you’ll also need to configure the default AWS region that AWS PowerShell commands operate against. This is done using the Set-DefaultAWSRegion command.

Set-DefaultAWSRegion -Region us-east-2

Keep in mind that nearly every AWS PowerShell command has -Region and -ProfileName parameters, but for the sake of brevity, this article assumes that you have configured a default ProfileName and Region.

For an authoritative reference on configuring your AWS credentials, see the AWS PowerShell documentation.

Discovering Commands

The AWS PowerShell module is a single PowerShell module with thousands of commands. To discover the commands in the AWS PowerShell module, use the typical PowerShell Get-Command command.

Get-Command -Module AWSPowerShell

You can easily list out the number of commands in the AWS PowerShell module, by inspecting the Count property on the object array, returned by Get-Command.

(Get-Command -Module AWSPowerShell).Count

Because there is an overwhelming number of commands available in the AWS PowerShell module, you can filter the list of commands using the Get-AWSCmdletName command. Using this command, you can get the AWS PowerShell commands that pertain to a specific service, such as Lambda, S3, CloudWatch, DynamoDB, SQS, EC2, and others.

# Retrieve all of the Lambda related commands
Get-AWSCmdletName -Service Lambda

# Retrieve all of the SQS related commands
Get-AWSCmdletName -Service SQS

# Retrieve all of the IAM related commands
Get-AWSCmdletName -Service iam

# Retrieve all of the CloudWatch related commands
Get-AWSCmdletName -Service CloudWatch

You can also search for a PowerShell command based on the corresponding AWS web API, using the -ApiOperation parameter of the Get-AWSCmdletName command.

# Retrieve the command for a specific AWS API
Get-AWSCmdletName -ApiOperation CreateStack

Get-AWSCmdletName -ApiOperation PutLogEvents

If you prefer to search for AWS APIs using regular expressions, you can use the -MatchWithRegex parameter. Let’s take a look at a couple examples of how you’d use this parameter.

# Find all of the AWS APIs that start with "List"
Get-AWSCmdletName -MatchWithRegex ^List

# Find all of the AWS APIs that match "queue"
Get-AWSCmdletName -MatchWithRegex queue

# Find all of the AWS APIs that match "vpc"
Get-AWSCmdletName -MatchWithRegex vpc

# Find all of the AWS APIs that end with "zone"
Get-AWSCmdletName -MatchWithRegex zone$

Once you’ve located the command that you want to call, you can use the well-known Get-Help command to learn more detailed information about the command. The -Full parameter for the Get-Help command will return additional details about the command, including example command invocations and detailed help for each of the input parameters.

# Get the help for the New-CFNStack command, which creates a new CloudFormation stack
Get-Help -Name New-CFNStack -Full

Services

Under this section, we’ll explore how to work with various AWS services using the AWS PowerShell module. Each of the major section headers will cover a specific service. Some service examples might require interaction with other services, for example CloudFormation with S3. In these cases, please make sure you review the documentation for both services so that you have a complete understanding of how to utilize them.

Identity & Access Management (IAM)

Create an IAM User

The New-IAMUser command enables you to create an IAM User account. The only required parameter is the -UserName parameter.

New-IAMUser -UserName trevorsullivan

However, if you’d like to categorize your IAM User accounts, you can prepend a path onto the username by adding the -Path parameter. Keep in mind that the value passed into the -Path parameter must start and end with a forward slash, according to the IAM CreateUser API documentation.

New-IAMUser -Path /category1/ -UserName trevor

Remove an IAM User

Deleting an IAM User is easy using the Remove-IAMUser command. The only required parameter is the -UserName parameter, just like New-IAMUser. Because you’re deleting an entity here, by default you’ll be prompted to confirm the operation. You can override the confirmation by adding the -Force parameter.

Remove-IAMUser -UserName trevor -Force

Create an IAM Role
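As a hedged sketch of this step: the New-IAMRole command creates an IAM Role, and requires a trust (assume-role) policy document alongside the role name. The role name and trust policy below are examples only; this one allows the Lambda service to assume the role.

```powershell
# Trust policy allowing the AWS Lambda service to assume this role (example)
$TrustPolicy = @'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
'@

New-IAMRole -RoleName trevortestrole -AssumeRolePolicyDocument $TrustPolicy
```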

Remove an IAM Role
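A hedged sketch of this step, mirroring Remove-IAMUser: the role name below is an example, and the role must have no attached policies or instance profiles before it can be deleted.

```powershell
# Delete the IAM Role; -Force suppresses the confirmation prompt
Remove-IAMRole -RoleName trevortestrole -Force
```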

CloudWatch Events

CloudWatch Metrics

CloudWatch Logs

Create a CloudWatch Log Group

Create a CloudWatch Log Stream

Write a CloudWatch Log Event
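The three CloudWatch Logs steps above can be sketched together as follows. This is a hedged outline, assuming the group and stream names shown; note that after the first write to a stream, subsequent Write-CWLLogEvent calls also require the -SequenceToken parameter.

```powershell
# Create a Log Group, and a Log Stream inside it (names are examples)
New-CWLLogGroup -LogGroupName mygroup
New-CWLLogStream -LogGroupName mygroup -LogStreamName mystream

# Construct an InputLogEvent object and write it to the Log Stream
$Event = [Amazon.CloudWatchLogs.Model.InputLogEvent]::new()
$Event.Timestamp = [DateTime]::UtcNow
$Event.Message = 'Hello from PowerShell'
Write-CWLLogEvent -LogGroupName mygroup -LogStreamName mystream -LogEvent $Event
```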

CloudFormation

Deploy a CloudFormation Stack

Deploying a CloudFormation Stack is simple with the New-CFNStack command.

There are a couple of different ways to use this command. First, you can embed the CloudFormation template itself into the command invocation, as a string, using the -TemplateBody parameter. Second, you can upload your CloudFormation template to an S3 bucket, and point New-CFNStack at the S3 URL using the -TemplateURL parameter. The latter option is a bit more work, since you have to create an S3 Bucket and upload your CloudFormation template to it. However, if you choose to specify the CloudFormation template in-line, you’ll be limited to 51,200 bytes, instead of the 460,800 bytes allowed with the -TemplateURL parameter. You can learn more about these details in the AWS CloudFormation CreateStack API documentation.

Aside from the template itself, which we’ve already covered, the only parameter you are required to specify is the -StackName parameter. This parameter simply sets the name of your CloudFormation Stack that you’ll see when you view it in the AWS Management Console, or via the other CloudFormation APIs.

Let’s start by taking a look at an example of how to specify a CloudFormation template in-line. To simplify our command invocation, we’ll use a PowerShell “here-string” (aka. multi-line string) to specify our YAML CloudFormation template in the -TemplateBody parameter.

New-CFNStack -StackName trevortest -TemplateBody @'
Resources:
  mybucket:
    Type: AWS::S3::Bucket
'@

We can refactor the same example into using PowerShell Splatting, which looks a bit cleaner, other than the fact that the indentation in my CloudFormation template makes it look weird.

$Stack = @{
    StackName = 'trevortest2'
    TemplateBody = @'
Resources:
  mybucket:
    Type: AWS::S3::Bucket
'@
}
New-CFNStack @Stack

Awesome! Those are a couple different techniques we can use to deploy CloudFormation templates with PowerShell in a basic manner.

Let’s switch gears and take a look at a slightly more complex example of uploading our CloudFormation template to a S3 Bucket and deploying a new stack from that S3 Object URL. Since we have to create a CloudFormation template file on the filesystem, create a S3 Bucket, upload our CloudFormation template file to the Bucket, and then deploy from there, you’ll notice that there are a few extra steps compared to our previous examples. Bear with me though, as the process is quite straightforward. Examine the in-line comments in the code to follow along step-by-step.

# Define variables for our S3 Bucket name and CloudFormation filename
$BucketName = 'cfn-trevor-test'
$CloudFormationFile = 'cloudformation.yml'

# Write out a local file with our YAML CloudFormation template, using a PowerShell "here-string"
Set-Content -Path $CloudFormationFile -Value @'
Resources:
    mybucket:
        Type: AWS::S3::Bucket
'@

# Create an AWS S3 Bucket
New-S3Bucket -BucketName $BucketName

# Upload our YAML CloudFormation template to the S3 Bucket
Write-S3Object -BucketName $BucketName -File $CloudFormationFile

# Deploy the CloudFormation Stack from the S3 Object URL.
# NOTE: We construct the S3 Object URL using the .NET String Formatting technique.
$Stack = @{
    StackName = 'trevortest-froms3'
    TemplateUrl = 'https://{0}.s3.amazonaws.com/{1}' -f $BucketName, $CloudFormationFile
}
New-CFNStack @Stack

See? That wasn’t so bad, was it?

Delete a CloudFormation Stack

If you want to delete a CloudFormation Stack, look no further than the Remove-CFNStack command. All you need to do is specify the -StackName parameter and pass in the name of the CloudFormation stack that you want to delete. Because you’re deleting a CloudFormation stack, which is a potentially damaging operation, you’ll be prompted to confirm that you want to delete the stack. To override this behavior, simply tack on the -Force parameter.

Remove-CFNStack -StackName trevortest -Force

Get CloudFormation Stack Events

When CloudFormation deploys a stack, it generates a series of chronological events. You can retrieve these events at any stage during deployment, once you’ve created the stack, using the Get-CFNStackEvent command. The results from this command aren’t guaranteed to be in chronological order, but it’s easy to sort and/or filter the events using PowerShell. When you run Get-CFNStackEvent, the results written to the output stream are an array of StackEvent objects. Each StackEvent object has a variety of useful properties including:

  • EventID
  • LogicalResourceID – the resource name you specified in your CloudFormation template
  • PhysicalResourceID (ARN)
  • CloudFormation StackID (ARN)
  • CloudFormation StackName
  • Resource Type (eg. AWS::S3::Bucket)
  • Resource Status (eg. CREATE_COMPLETE)
  • Timestamp

Get-CFNStackEvent -StackName trevortest

If you’d rather see the StackEvent objects sorted in chronological order, you can use the built-in Sort-Object PowerShell command and sort the objects based on their Timestamp property.

Get-CFNStackEvent -StackName trevortest | Sort-Object -Property Timestamp

To take this one step further, you can take the sorted objects and display them in a tabular view using the Format-Table command, and limit which properties are displayed.

Get-CFNStackEvent -StackName trevortest |
  Sort-Object -Property Timestamp |
  Format-Table -AutoSize -Property Timestamp, StackName, ResourceType, LogicalResourceId, ResourceStatus, ResourceStatusReason

This output is much more useful than what we started with! Here’s what the formatted results look like, which is much more readable than the default list output.

Timestamp            StackName  ResourceType               LogicalResourceId ResourceStatus     ResourceStatusReason
---------            ---------  ------------               ----------------- --------------     --------------------
6/28/2017 8:21:48 PM trevortest AWS::CloudFormation::Stack trevortest        CREATE_IN_PROGRESS User Initiated
6/28/2017 8:21:52 PM trevortest AWS::S3::Bucket            mybucket          CREATE_IN_PROGRESS
6/28/2017 8:21:54 PM trevortest AWS::S3::Bucket            mybucket          CREATE_IN_PROGRESS Resource creation In...
6/28/2017 8:22:15 PM trevortest AWS::S3::Bucket            mybucket          CREATE_COMPLETE
6/28/2017 8:22:17 PM trevortest AWS::CloudFormation::Stack trevortest        CREATE_COMPLETE

Wait for a CloudFormation Stack Status
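This section is a stub in the original. As a hedged sketch: recent versions of the AWS PowerShell module include a Wait-CFNStack command that blocks until the stack reaches a given status, or a timeout expires. The stack name and timeout below are examples.

```powershell
# Block until the stack reaches CREATE_COMPLETE, or 300 seconds elapse
Wait-CFNStack -StackName trevortest -Status CREATE_COMPLETE -Timeout 300
```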

DynamoDB

Create a DynamoDB Table
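As a hedged sketch of this step: the module provides schema-building helper commands that pipeline into New-DDBTable. The table name, key name, and capacity values below are examples.

```powershell
# Build a table schema, mark "Id" (a string attribute) as the hash key,
# then create the table with the specified provisioned throughput
New-DDBTableSchema |
  Add-DDBKeySchema -KeyName Id -KeyDataType S -KeyType HASH |
  New-DDBTable -TableName trevortest -ReadCapacity 5 -WriteCapacity 5
```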

Remove a DynamoDB Table

Write an Object to a DynamoDB Table

Scan a DynamoDB Table

Query a DynamoDB Table

EC2 Container Service (ECS)

Create an Empty ECS Cluster

You can create a new, empty ECS Cluster using the New-ECSCluster command. The ECS Cluster will not contain any compute instances until you deploy a stand-alone EC2 instance or set up an Auto Scaling Group (ASG) from a custom Launch Configuration.

New-ECSCluster -ClusterName trevortest

Create an ECS Task Definition

Key Management Service (KMS)

Create a new KMS Key

Remove a KMS Key

Encrypt a Value with a KMS Key

Decrypt a Value with a KMS Key

Simple Storage Service (S3)

Create an S3 Bucket

It’s easy to create a new AWS S3 Bucket using the AWS PowerShell module. To do this, we’ll simply call the New-S3Bucket command. The -BucketName parameter is the only required parameter, although there are some other useful parameters available.

New-S3Bucket -BucketName s3trevortest

By default, your new AWS S3 Bucket will only be readable and writable by you, but there are a few “canned” access control lists (ACL) for S3 that you can utilize if you want to open up the permissions a bit. Use the -CannedACLName parameter to specify one of these canned ACLs.

The following example creates a new S3 Bucket and enables public read access on it.

New-S3Bucket -BucketName s3trevortest -CannedACLName public-read

Remove an S3 Bucket

You can delete an S3 Bucket using the Remove-S3Bucket command. The -BucketName parameter is the only one that’s required.

Remove-S3Bucket -BucketName mybucket

If your S3 Bucket contains objects, and you’re okay with deleting all of them, then the optional -DeleteBucketContent parameter will remove all of the S3 objects from the Bucket, and then delete the Bucket. If you don’t specify this parameter when you call Remove-S3Bucket, and the S3 Bucket isn’t already empty, then an exception will be thrown.

# Exception if the bucket isn't empty, and -DeleteBucketContent is omitted:
# Remove-S3Bucket : The bucket you tried to delete is not empty

# Force deletion of the objects contained in the S3 Bucket, along with the Bucket itself
Remove-S3Bucket -BucketName mybucket -DeleteBucketContent

Upload Data to an S3 Bucket

It’s really easy to upload content to an S3 Bucket. We’ll use the Write-S3Object command to do this. However, depending on your use case, there are several different approaches.

First, let’s look at how we can upload a file from our local filesystem to an S3 Bucket. Simply specify the -BucketName parameter to specify which S3 Bucket you want to upload to, and the -File parameter to specify the path to the file on your local filesystem that you want to upload to the S3 Bucket.

Write-S3Object -BucketName mybucket -File myfile.txt

If you want to give the file a different name, once it has been uploaded to the S3 Bucket, you can use the -Key parameter to specify its new name.

Write-S3Object -BucketName mybucket -File myfile.txt -Key mys3file.txt

You don’t necessarily have to have a file on the local filesystem in order to create an S3 Object though. Instead of using the -File parameter, you can use the -Content parameter to specify the contents of the new S3 Object directly, as a string, without reading from your local filesystem.

Write-S3Object -BucketName mybucket -Key mystring.txt -Content 'hello world'

Taking it even one step further, you can upload an entire directory of files directly to S3 using the -Folder parameter. In this scenario, you’ll also be required to specify the -KeyPrefix parameter. Even though AWS S3 doesn’t support the notion of “folders,” you can still add a prefix to all of your S3 Object keys which gives a similar feel to creating folders on a filesystem.

# Create a folder containing a few test files
mkdir -Path s3test
Set-Location -Path s3test
Set-Content -Path testfile1.txt, testfile2.txt, testfile3.txt -Value 'hello world'

# Upload the entire folder to an AWS S3 Bucket
Write-S3Object -BucketName mybucket -Folder ./ -KeyPrefix s3test

Thought we were done with Write-S3Object? Not quite yet! There’s one other way you can utilize this command, which is to write an object to S3 from any type of .NET Stream. For example, let’s say you have an in-memory stream of data, and wanted to write it directly to S3. You can do this using the -Stream parameter on the Write-S3Object command. Note that the stream you pass in will have its Close() method called after being written to S3, so you won’t be able to replay the stream or change the stream’s pointer.

# Create an in-memory stream of bytes
$MyStream = [System.IO.MemoryStream]::new([Byte[]](150, 155, 200, 201, 87))

# Write the MemoryStream to an S3 Object
Write-S3Object -BucketName mybucket -Key memorystream.bytes -Stream $MyStream

As you can see, there are quite a few different ways to upload content to AWS S3 Buckets using the Write-S3Object command! Play around with the command and see how it best suits your needs.

List S3 Objects in S3 Bucket

If you need to get a list of S3 objects inside of one of your S3 Buckets, you can use the Get-S3Object command. Simply specify the -BucketName parameter to specify which S3 Bucket you want to retrieve the objects from.

Get-S3Object -BucketName mybucket

It might be easier to view the results as a table of output instead of the default list format. You can pipe the results into the Format-Table -AutoSize command to achieve this.

Get-S3Object -BucketName mybucket | Format-Table -AutoSize

Copy an S3 Object to a new S3 Object

You can make copies of S3 objects directly in AWS, without having to download an object and then re-upload it again. This saves bandwidth and significant amounts of time, especially for large objects. When copying S3 objects, the source and destination bucket do not have to be the same; you can copy objects across buckets.

The Copy-S3Object command enables you to make S3-to-S3 object copies. At a minimum, you’ll need to specify the following parameters:

  • BucketName – the name of the AWS S3 Bucket that the source object resides in
  • Key – the S3 Object key that you want to make a copy of
  • DestinationKey – the S3 Object key of the newly copied S3 Object

Copy-S3Object -BucketName mybucket -Key s3test/test.txt -DestinationKey s3test2/test.txt

If your desired destination bucket is different than the source bucket, simply add the -DestinationBucket parameter and specify another bucket that you own.
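For example, the following cross-bucket copy uses hypothetical bucket and key names:

```powershell
# Copy an S3 Object from "mybucket" to "myotherbucket", keeping the same key
Copy-S3Object -BucketName mybucket -Key s3test/test.txt -DestinationBucket myotherbucket -DestinationKey s3test/test.txt
```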

Enable S3 Bucket Versioning

AWS S3 supports versioning S3 objects inside S3 buckets. However, before you can do this, you need to enable versioning on the S3 Bucket where you want to store versioned objects. You can enable versioning on your S3 Bucket by using the Write-S3BucketVersioning command. The -BucketName parameter specifies the S3 Bucket that you want to configure with versioning, and the -VersioningConfig_Status parameter should be set to Enabled.

Write-S3BucketVersioning -BucketName mybucket -VersioningConfig_Status Enabled

Versioning S3 Objects

Once you’ve enabled an AWS S3 Bucket with versioning support, it automatically keeps track of new versions of S3 Objects that you write to it. You can view S3 Object versions by using the Get-S3Version command. The returned object should have a Versions property which contains all of the individual object versions. Be careful, if you have a bucket with lots of objects and object versions, you might get a huge set of results!

(Get-S3Version -BucketName mybucket).Versions

Lambda

Create a Lambda Function

Creating a new AWS Lambda function is easy, with the Publish-LMFunction command. One important dependency for creating a Lambda function is to have an IAM Role that will be assumed to execute your Lambda code. You also have a couple of options to upload your Lambda function code / payload:

  1. As a ZIP file in an S3 Bucket
  2. As a ZIP file directly to AWS Lambda

In the latter case, you can simply specify the -ZipFilename parameter on Publish-LMFunction, and your ZIP file will be uploaded directly to Lambda. Let’s take a look at an example of this method, since it requires the fewest dependencies.

$LambdaZip = 'lambda.zip'

### Derive the AWS account ID from the OwnerId property of a security group
$AWSAccountId = @(Get-EC2SecurityGroup)[0].OwnerId

### Create a Python handler
Set-Content -Path index.py -Value @'
def handler():
    print("Handled")
'@

### Create a ZIP file with the Python Lambda function
### (requires the external "zip" utility; Compress-Archive works as an alternative)
zip $LambdaZip index.py

### Create / Publish the Lambda function
$LambdaFunction = @{
    FunctionName = 'awspowershelltest'
    Runtime = 'Python3.6'
    ZipFilename = $LambdaZip
    Handler = 'index.handler'
    Role = 'arn:aws:iam::{0}:role/service-role/lambda' -f $AWSAccountId
}
Publish-LMFunction @LambdaFunction

### Clean up the ZIP file and Python handler
Remove-Item -Path lambda.zip, index.py

In the second scenario, your code will need to first be published to an S3 Bucket as a ZIP file. Let’s take a look at an end-to-end example of how to create an S3 Bucket, Lambda ZIP archive, and then publish (create) the Lambda function.
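The end-to-end flow can be sketched as follows. This is a hedged outline: the S3-location parameter names on Publish-LMFunction (-BucketName / -Key here) are an assumption that varies between module versions, so check Get-Help Publish-LMFunction -Full for your version. The bucket, function name, and role ARN are example placeholders.

```powershell
$BucketName = 'lambda-trevor-test'
$LambdaZip = 'lambda.zip'

### Create a Python handler, and package it into a ZIP file
Set-Content -Path index.py -Value @'
def handler():
    print("Handled")
'@
zip $LambdaZip index.py

### Create the S3 Bucket, and upload the ZIP archive to it
New-S3Bucket -BucketName $BucketName
Write-S3Object -BucketName $BucketName -File $LambdaZip

### Publish the Lambda function from the S3 Object
### NOTE: -BucketName / -Key are assumed parameter names; verify against your module version
$LambdaFunction = @{
    FunctionName = 'awspowershelltests3'
    Runtime = 'Python3.6'
    BucketName = $BucketName
    Key = $LambdaZip
    Handler = 'index.handler'
    Role = 'arn:aws:iam::123456789012:role/service-role/lambda'  # example role ARN
}
Publish-LMFunction @LambdaFunction
```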

 

Get a list of Lambda Functions

You can retrieve a list of AWS Lambda functions from a specific AWS account & region by using the Get-LMFunctionList command. Each of the returned objects will contain detailed information about the Lambda function, such as:

  • Lambda timeout
  • Runtime (Python, Java, Node.js, etc.)
  • Size (in bytes) of the code package
  • Handler file and function name
  • Version
  • Memory size
  • VPC (if applicable)
  • KMS key ARN

Get-LMFunctionList

Remove a Lambda Function

To delete an AWS Lambda function, use the Remove-LMFunction command. You’ll need to specify the Lambda function’s name, using the -FunctionName parameter.

Remove-LMFunction -FunctionName MyLambdaFunction

Exception Messages

CloudFormation

New-CFNStack throws “Requires capabilities” exception

I came across a scenario where I was deploying an AWS CloudFormation template using the AWS PowerShell module’s New-CFNStack command. Upon running the command, I received an exception:

New-CFNStack : One or more errors occurred. (Requires capabilities : [CAPABILITY_NAMED_IAM])

It turns out that, if you declare certain IAM resources in your CloudFormation template, you must specifically request an IAM “capability” in the API call to create a CloudFormation stack. My CloudFormation template declared a new IAM user with a username property. You can read more about these capabilities in the AWS documentation for the CreateStack API operation. If you’re deploying CloudFormation templates with the New-CFNStack PowerShell command, make sure you append the -Capability parameter, and specify the CAPABILITY_NAMED_IAM and/or CAPABILITY_IAM capabilities as a string array.

$CloudFormation = @{
 TemplateBody = @'
{
  "Resources": {
    "TrevorUser": {
      "Type": "AWS::IAM::User",
      "Properties": {
        "UserName": "TrevorSullivan"
      }
    }
  }
}
'@
 ProfileName = 'AWSProfile'
 Region = 'us-east-1'
 StackName = 'MyCoolCFNStack'
 Capability = @('CAPABILITY_NAMED_IAM')
}
New-CFNStack @CloudFormation
