
I have just started working on a project that is hosted on an AWS EC2 Windows instance running IIS. I want to move this setup to a more reliable place, and one of the first things I wanted to do was move away from snowflake servers that are set up and configured by hand.

So I started looking at Terraform from HashiCorp. My thought was that I could define the entire setup, including the network etc., in Terraform and that way make sure it was configured correctly.

I thought I would start by defining a server: a simple Windows Server instance with IIS installed. But this is where I ran into my first problem. I thought I could configure IIS from Terraform, but it seems you can't. So my next thought was to combine Terraform with PowerShell Desired State Configuration (DSC).

I can set up an IIS server on a box using DSC, but I am stuck on invoking DSC from Terraform. I can provision a vanilla server easily. I have tried looking for a good blog post on how to use DSC in combination with Terraform, but I can't find one that explains how to do it.
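
For context, the DSC part itself works when I run it interactively on the box. A minimal configuration along these lines installs IIS (the configuration name and output path here are just illustrative):

Configuration IISSetup {
    # WindowsFeature is a built-in resource in the PSDesiredStateConfiguration module
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node "localhost" {
        # Installs the IIS web server role
        WindowsFeature IIS {
            Ensure = "Present"
            Name   = "Web-Server"
        }
    }
}

# Compile the MOF and apply it locally
IISSetup -OutputPath "C:\Dsc\IISSetup"
Start-DscConfiguration -Path "C:\Dsc\IISSetup" -Wait -Verbose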

Can anyone point me towards a good place to read up on this? Or alternatively, if the reason I can't find this is that it is just bad practice and I should do it another way, then please educate me.

Thanks

How can I provision IIS on EC2 Windows with a resource?

  • Yes, you should chain Terraform into a software provisioner; Terraform should not itself be used for software provisioning. You can do it either directly from Terraform via a `provisioner` block, or with a `null_resource`, for example. The problem you are facing finding examples for DSC is that it is not a commonly used tool. Typically people use Ansible for this, or sometimes Puppet or Chef, and there will be many guides and examples for those tools in conjunction with Terraform. Alternatively, bake IIS into a custom AMI via Packer and a software provisioner. – Matthew Schuchard May 17 '20 at 11:11
  • Thanks Matt! I will definitely look at Ansible, Puppet and Chef. One of my previous projects used Chef, but I didn't have anything to do with that tool then. I have heard good things about DSC. But would you say it might be too early to adopt it? – Jay Pete May 17 '20 at 19:08

1 Answer


You can run arbitrary PowerShell scripts on startup as follows:

resource "aws_instance" "windows_2016_server" {
//...
user_data = <<-EOF
<powershell>
$file = $env:SystemRoot + "\Temp\${var.some_variable}" + (Get-Date).ToString("MM-dd-yy-hh-mm")
New-Item $file -ItemType file
</powershell>
EOF
//...
}

You'll need a variable like this defined to use that (I'm providing a slightly more complex example so there's a more useful starting point):

variable "some_variable" {
    type = string
    default = "UserDataTestFile"
}

Instead of creating a timestamp file like the example above, you can invoke DSC to set up IIS as you normally would interactively from PowerShell on a server.
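
For example, a rough sketch of a user_data script that applies a DSC configuration instead of writing a timestamp file could look like this (the configuration name and output path are illustrative, not something your setup requires):

<powershell>
# Sketch only: define a DSC configuration that installs IIS, then apply it at first boot
Configuration IISOnBoot {
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node "localhost" {
        WindowsFeature IIS {
            Ensure = "Present"
            Name   = "Web-Server"
        }
    }
}

# Compile the MOF and apply it on this instance
IISOnBoot -OutputPath "C:\Dsc\IISOnBoot"
Start-DscConfiguration -Path "C:\Dsc\IISOnBoot" -Wait -Verbose
</powershell>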

You can read more about user_data on Windows here: https://docs.aws.amazon.com/AWSEC2/latest/WindowsGuide/ec2-windows-user-data.html

user_data will include your PowerShell directly.

You can use templatefile("${path.module}/user-data.ps1", { some_variable = var.some_variable }) instead of an inline script as above.

Have user-data.ps1 in the same directory as the TF file that references it:

<powershell>
$file = $env:SystemRoot + "\Temp\${some_variable}" + (Get-Date).ToString("MM-dd-yy-hh-mm")
New-Item $file -ItemType file
</powershell>

You still need the <powershell></powershell> tags around your script source code. That's a requirement of how Windows on EC2 expects PowerShell user-data scripts.

And then update your TF file as follows:

resource "aws_instance" "windows_2016_server" {
//...
user_data = templatefile("${module.path}/user-data.ps1, {
  some_variable = var.some_variable
})

//...
}

Note that the file read by templatefile references variables as some_variable and NOT var.some_variable.

Read more about templatefile here:

https://www.terraform.io/docs/configuration/functions/templatefile.html

Alain O'Dea
  • Thanks, Alain. So I would use a `File Provisioner` to upload my DSC files to the server and then invoke them using the user data. Just a stupid question: it is called user data, so when I first looked at this I thought it would run at login and not at boot. But that isn't the case, is it? Preferably I would never have to log into the instance. – Jay Pete May 18 '20 at 08:49
  • There is no need to upload the script separately. The content provided to **user_data** is automatically associated with your EC2 metadata and loaded by the instance when it starts. I've written a lot of Terraform and I never use file provisioners. I often write user-data which fetches files from S3, the web, or services like GitHub (there's a sketch of the S3 variant after this comment thread). – Alain O'Dea May 18 '20 at 10:38
  • I got it to work by creating a user in a PowerShell script in user data, and using that user to first upload the DSC files and afterwards using remote-exec to execute the DSC configuration. But after I got it all to work I can see that it isn't really a good way of doing it, just as you also say, Alain. I like the DSC way of setting up the server, so I think I will continue down that route. But I will set up DSC Pull in the user data instead and try to see if I can make it pull directly from an S3 bucket. – Jay Pete May 18 '20 at 13:04
  • To pull from the S3 bucket, the instance profile of your EC2 Windows instance will need s3:GetObject for the S3 object ARN, and possibly kms:Decrypt for the KMS key ARN if you are using a customer master key to encrypt objects in S3. If you run into issues with that, post another question with your Terraform code and mention me in a comment :) – Alain O'Dea May 18 '20 at 13:08
  • Thanks a lot. I will try that out. – Jay Pete May 18 '20 at 13:10
  • So I did a little digging and found that you need a DSC pull server set up, and that cannot be a simple S3 bucket. Oh well... Azure has a cheap service called Azure Automation that serves this purpose. It can be connected to my GitHub repo, where I can store my DSC configurations under version control. What I do then is use the user data to register my newly provisioned EC2 instance with Azure Automation... and presto, I have it all up and running. There are Terraform modules for this if you run your VMs in Azure, but we don't. I will clean this up and then provide a step-by-step run-through here. – Jay Pete May 19 '20 at 23:06
  • Thanks for pointing me towards the user data, @alain-odea. It has simplified the setup A LOT. – Jay Pete May 19 '20 at 23:07
  • I'm so glad user-data is helpful for you. Paying it forward: I had a colleague do the same for me :) – Alain O'Dea May 20 '20 at 23:22
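
To illustrate the S3 approach from the comments above: a user-data sketch that downloads a DSC script from a bucket and runs it could look roughly like this (the bucket name, key, and local paths are placeholders; Read-S3Object is part of the AWS Tools for PowerShell, which Amazon's Windows AMIs typically ship with, and the instance profile needs the s3:GetObject permission mentioned above).

<powershell>
# Placeholders: replace the bucket name, key, and local paths with your own values
Read-S3Object -BucketName "my-dsc-bucket" -Key "dsc/IISSetup.ps1" -File "C:\Dsc\IISSetup.ps1"

# Run the downloaded script, which is assumed to both define and apply the DSC configuration
& "C:\Dsc\IISSetup.ps1"
</powershell>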