Terraform AWS EFS WebServer

Abhishek Chouhan
7 min readFeb 24, 2021

Hello Everyone!!

I am here with a new interesting task, i.e. Hybrid Multi Cloud Computing Task 2.

Task Description:-

Perform task 1 using the EFS service instead of EBS on AWS, as follows.

Task 1 Link:- Git Jenkins Terraform Aws

Create/launch the application using Terraform:

1. Create a security group which allows port 80.

2. Launch an EC2 instance.

3. In this EC2 instance, use an existing or provided key and the security group which we created in step 1.

4. Launch one volume using the EFS service and attach it to your VPC, then mount that volume onto /var/www/html.

5. The developer has uploaded the code into a GitHub repo; the repo also has some images.

6. Copy the GitHub repo code into /var/www/html.

7. Create an S3 bucket, copy/deploy the images from the GitHub repo into the S3 bucket, and change the permission to public readable.

8. Create a CloudFront distribution using the S3 bucket (which contains the images) and use the CloudFront URL to update the code in /var/www/html.

Pre-requisites:-

  • An IAM user on AWS, so that using its credentials we can create a profile to log in and use it in the future.
  • Here I am showing you how to create a profile. You will get the credentials once you create an IAM user on AWS, so just provide those credentials here.

Run this command:

aws configure --profile profile_name
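
When you run it, the AWS CLI asks for the IAM user's access key, secret key, and a default region. The values below are placeholders, so put your own credentials there:

aws configure --profile Abhi
AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Default region name [None]: ap-south-1
Default output format [None]: json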

Now we can use this profile in our Terraform (HashiCorp) script as the credential to log in and do further things.

Provider

It will log in to our AWS account in the ap-south-1 (Mumbai) region using the credentials we configured in the above steps.

provider "aws" {
  region  = "ap-south-1"
  profile = "Abhi"
}

Before explaining how I performed this task, let me create the environment, so that in between I can explain how it works.

So first we need to install some plugins for the provider; the

terraform init

command will do this for you.
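
If the provider plugin downloads correctly, the command ends with a success message, roughly like this (the exact version will differ):

Initializing provider plugins...
- Finding latest version of hashicorp/aws...
- Installing hashicorp/aws v3.x.x...

Terraform has been successfully initialized!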

Now that the plugins are installed properly, we will check the code with

terraform validate

Everything is good, so we can proceed further and create the environment in one click, i.e.

terraform apply --auto-approve

This will create a key pair so that we can use it in the future.

resource "tls_private_key" "Abhikey" {
  algorithm = "RSA"
}

resource "aws_key_pair" "Abhikey" {
  key_name   = "Abhi_key"
  public_key = tls_private_key.Abhikey.public_key_openssh
}
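
If you also want the private key saved on your own machine for manual SSH, a minimal sketch using the hashicorp/local provider would look like this (this resource is my addition, not part of the original script):

resource "local_file" "Abhikey_pem" {
  # Assumption: the hashicorp/local provider is installed; saves the key for manual SSH
  content         = tls_private_key.Abhikey.private_key_pem
  filename        = "Abhi_key.pem"
  file_permission = "0400"
}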

— — — — — — — — — -

Creating Security Group:-

resource "aws_security_group" "abhitf_sg" {
  name        = "Abhi_sg"
  description = "port 22 and port 80"
  vpc_id      = "vpc-c8879aa0"

  ingress {
    description = "ssh, port 22"
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  ingress {
    description = "http, port 80"
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }

  tags = {
    Name = "allow_ssh_webserver"
  }
}
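
One caution: this same security group is reused for the EFS mount target later, but it opens only ports 22 and 80, so the NFS mount can hang. An extra ingress rule for NFS port 2049 inside the aws_security_group block (my addition, not in the original script) avoids that:

  ingress {
    description = "nfs, port 2049"
    from_port   = 2049
    to_port     = 2049
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }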

— — — — — — — — — — — -

Creating the instance and, using SSH, downloading some important software inside it, like the httpd webserver and git.

resource "aws_instance" "AbhiOs1" {
  ami             = "ami-0732b62d310b80e97"
  instance_type   = "t2.micro"
  key_name        = "Abhi_key"
  security_groups = [aws_security_group.abhitf_sg.name]

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.Abhikey.private_key_pem
    host        = self.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo yum install httpd php git -y",
      "sudo systemctl restart httpd",
      "sudo systemctl enable httpd"
    ]
  }

  tags = {
    Name = "AbhiOs1"
  }
}
Some packages are installed and the instance is launched.

— — — — — — — — — — -

Creating and mounting the AWS Elastic File System.

resource "aws_efs_file_system" "efs_plus" {
  depends_on     = [aws_security_group.abhitf_sg, aws_instance.AbhiOs1]
  creation_token = "efs"

  tags = {
    Name = "abhiefs"
  }
}

resource "aws_efs_mount_target" "mount_efs" {
  depends_on      = [aws_efs_file_system.efs_plus]
  file_system_id  = aws_efs_file_system.efs_plus.id
  subnet_id       = aws_instance.AbhiOs1.subnet_id
  security_groups = [aws_security_group.abhitf_sg.id]
}

resource "null_resource" "cluster" {
  depends_on = [
    aws_efs_mount_target.mount_efs,
  ]

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.Abhikey.private_key_pem
    host        = aws_instance.AbhiOs1.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "echo '${aws_efs_file_system.efs_plus.dns_name}:/ /var/www/html nfs4 defaults,_netdev 0 0' | sudo tee -a /etc/fstab",
      "sudo mount -t nfs4 ${aws_efs_file_system.efs_plus.dns_name}:/ /var/www/html",
      "sudo rm -rf /var/www/html/*",
      "sudo git clone https://github.com/a1-s2/Hybrid_Multi_Cloud_Task2.git /var/www/html"
    ]
  }
}
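
To confirm the mount worked, you can SSH into the instance and run:

df -h /var/www/html

The filesystem column should show the EFS DNS name instead of a local device.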

— — — — — — — -

Creating the S3 bucket (bucket names must be globally unique).

resource "aws_s3_bucket" "abhibucket" {
  bucket        = "abhibucket13"
  acl           = "public-read"
  force_destroy = true

  tags = {
    Name = "abhibucket"
  }
}

— — — — — — — — —

Putting an image into the bucket created in the above step, and creating a CloudFront origin access identity.

resource "aws_s3_bucket_object" "abhitf_image" {
  depends_on = [
    aws_s3_bucket.abhibucket,
  ]
  key          = "abhiimage"
  bucket       = "abhibucket13"
  content_type = "image/png"
  source       = "C:/Users/abc/Desktop/Abhi.png"
  acl          = "public-read"
}

resource "aws_cloudfront_origin_access_identity" "abhicf" {
  comment = "cloud_front"
}

locals {
  s3_origin_id = aws_s3_bucket.abhibucket.id
}

— — — — — — — — —

This will create the CloudFront distribution for us.

resource "aws_cloudfront_distribution" "cf_abhi" {
  origin {
    domain_name = aws_s3_bucket.abhibucket.bucket_regional_domain_name
    origin_id   = local.s3_origin_id

    s3_origin_config {
      origin_access_identity = aws_cloudfront_origin_access_identity.abhicf.cloudfront_access_identity_path
    }
  }

  enabled             = true
  is_ipv6_enabled     = true
  default_root_object = "abhiimage"

  logging_config {
    include_cookies = false
    bucket          = aws_s3_bucket.abhibucket.bucket_domain_name
  }

  default_cache_behavior {
    allowed_methods  = ["DELETE", "GET", "HEAD", "OPTIONS", "PATCH", "POST", "PUT"]
    cached_methods   = ["GET", "HEAD"]
    target_origin_id = local.s3_origin_id

    forwarded_values {
      query_string = false

      cookies {
        forward = "none"
      }
    }

    viewer_protocol_policy = "allow-all"
    min_ttl                = 0
    default_ttl            = 3600
    max_ttl                = 86400
  }

  ordered_cache_behavior {
    path_pattern     = "/content/immutable/*"
    allowed_methods  = ["GET", "HEAD", "OPTIONS"]
    cached_methods   = ["GET", "HEAD", "OPTIONS"]
    target_origin_id = local.s3_origin_id

    forwarded_values {
      query_string = false
      headers      = ["ORIGIN"]

      cookies {
        forward = "none"
      }
    }

    viewer_protocol_policy = "allow-all"
    min_ttl                = 0
    default_ttl            = 3600
    max_ttl                = 86400
    compress               = true
  }

  ordered_cache_behavior {
    path_pattern     = "/content/*"
    allowed_methods  = ["GET", "HEAD"]
    cached_methods   = ["GET", "HEAD"]
    target_origin_id = local.s3_origin_id

    forwarded_values {
      query_string = false
      headers      = ["ORIGIN"]

      cookies {
        forward = "none"
      }
    }

    viewer_protocol_policy = "allow-all"
    min_ttl                = 0
    default_ttl            = 3600
    max_ttl                = 86400
    compress               = true
  }

  price_class = "PriceClass_200"

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }
}
resource "null_resource" "null" {
depends_on = [
aws_cloudfront_distribution.cf_abhi,
]
connection {
type = "ssh"
user = "ec2-user"
private_key = tls_private_key.Abhikey.private_key_pem
host = aws_instance.AbhiOs1.public_ip
}
provisioner "remote-exec" {
inline = [
"sudo su << EOF",
"echo \"<img src='http://${aws_cloudfront_distribution.cf_abhi.domain_name}/${aws_s3_bucket_object.abhitf_image.key}' height='500' width='500'>\" >> /var/www/html/index.html","EOF","sudo systemctl restart httpd",
]
}
}
data "aws_iam_policy_document" "abhis3_policy" {
statement {
actions = ["s3:GetObject"]
resources = ["${aws_s3_bucket.abhibucket.arn}/*"]principals {type = "AWS"identifiers = [aws_cloudfront_origin_access_identity.abhicf.iam_arn]}}statement {actions = ["s3:ListBucket",]resources = [aws_s3_bucket.abhibucket.arn]principals {type = "AWS"identifiers = [aws_cloudfront_origin_access_identity.abhicf.iam_arn]}}}resource "aws_s3_bucket_policy" "abhibucket_policy" {bucket = aws_s3_bucket.abhibucket.idpolicy = data.aws_iam_policy_document.abhis3_policy.json}output "AbhiOs_Ip" {value = aws_instance.AbhiOs1.public_ip}output "domain_name" {value = aws_cloudfront_distribution.cf_abhi.domain_name}

And finally our environment is created in just one click; here is the output.
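
You can also reprint these outputs at any time with:

terraform output AbhiOs_Ip
terraform output domain_name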

When we type this IP into the browser and hit Enter, the webpage opens, with the image coming from the S3 bucket through CloudFront.

And the same environment can be destroyed with one single command, i.e.

terraform destroy --auto-approve

All the code and files are uploaded on GitHub.
GitHub Link :- https://github.com/a1-s2/Hybrid_Multi_Cloud_Task2.git

All thanks to my mentor, the world record holder Mr. Vimal Daga Sir, for giving us the right knowledge and the right education, so that today I can create projects like this by integrating Terraform, AWS, and many more.

Thank you everyone for reading!!

For further queries and suggestions, feel free to connect with me on LinkedIn.
