Automating user creation in the AWS SFTP service (AWS Transfer for SFTP)


If you have ever used the AWS managed SFTP service, you will know that user creation takes a lot of manual effort:

1. You have to select the scope-down policy (or paste it in after reading up on it in the docs).

2. Manually select the bucket and create a folder matching the user name.

3. Create an SSH key for the user beforehand, so you can upload the public key to the public-key data section in AWS SFTP.

Doing this repeatedly whenever a new user needs access is a lot of pain for a DevOps developer like me :)

So I had to automate it. Well, boto3 to the rescue:

import boto3
import click
import os
import subprocess
import time

client = boto3.client('transfer')
s3 = boto3.client('s3')
# since we have only one server in ap-southeast-1, change this accordingly
server_id = client.list_servers().get('Servers')[0].get('ServerId')

bucket_name = "your-bucket-name"


@click.command()
@click.option('--user', '-u', prompt="user name", help="name of the user to create as an sftp user")
def create_user(user):
    # expand ~ so open() can find the key files below
    path = os.path.expanduser(f'~/Downloads/pems/{user}')
    # generate an RSA key pair with an empty passphrase;
    # ssh-keygen writes the private key to {path} and the public key to {path}.pub
    subprocess.call(f'ssh-keygen -t rsa -f {path} -N ""', shell=True)
    time.sleep(3)
    public_key_data = ""
    with open(f'{path}.pub', 'r') as file:
        public_key_data = file.read()
    scope_down_policy='''{
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowListingOfUserFolder",
                "Action": [
                    "s3:ListBucket"
                ],
                "Effect": "Allow",
                "Resource": [
                    "arn:aws:s3:::${transfer:HomeBucket}"
                ],
                "Condition": {
                    "StringLike": {
                        "s3:prefix": [
                            "${transfer:HomeFolder}/*",
                            "${transfer:HomeFolder}"
                        ]
                    }
                }
            },
            {
                "Sid": "AWSTransferRequirements",
                "Effect": "Allow",
                "Action": [
                    "s3:ListAllMyBuckets",
                    "s3:GetBucketLocation"
                ],
                "Resource": "*"
            },
            {
                "Sid": "HomeDirObjectAccess",
                "Effect": "Allow",
                "Action": [
                    "s3:PutObject",
                    "s3:GetObject",
                    "s3:DeleteObjectVersion",
                    "s3:DeleteObject",
                    "s3:GetObjectVersion"
                ],
                "Resource": "arn:aws:s3:::${transfer:HomeDirectory}/*"
            }
        ]
    }'''
    tags = [
        {
            'Key': 'user',
            'Value': user
        },
    ]
    home_dir = f'/{bucket_name}/{user}'

    client.create_user(
        HomeDirectory=home_dir,
        Policy=scope_down_policy,
        Role='arn:aws:iam::123456789123:role/sftp_role',
        ServerId=server_id,
        SshPublicKeyBody=public_key_data,
        Tags=tags,
        UserName=user
    )

    # create the user's home folder in the bucket
    s3.put_object(Bucket=bucket_name, Key=f'{user}/')
    

if __name__ == '__main__':
    create_user()

This code creates an SFTP user, creates a folder in S3 for the user, and attaches a scope-down policy so the user can only navigate to his/her own folder.
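
To verify the user was created, you can list the users attached to the server; here is a quick check using the same boto3 transfer client (again assuming a single server in the account):

import boto3

client = boto3.client('transfer')
server_id = client.list_servers()['Servers'][0]['ServerId']

# print each SFTP user on the server along with their home directory
for u in client.list_users(ServerId=server_id)['Users']:
    print(u['UserName'], u['HomeDirectory'])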

The path where the private key will be stored: ~/Downloads/pems/{user}

I am calling a shell command that creates an SSH key pair; the public key data is then passed to the SFTP user-creation call:

ssh-keygen -t rsa -f {path} -N ""

The -N "" flag sets an empty passphrase; ssh-keygen writes the private key to {path} and the public key to {path}.pub.

I hope this small code snippet is of some reference to you in the future when automating user creation in the AWS managed SFTP service (AWS Transfer for SFTP).

You need to create the IAM role beforehand. I would have automated that as well, but since I am using the same role every time it was one-time work for me, and afterwards I can simply refer to the ARN.
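
If you do want to script the role too, here is a minimal sketch with boto3. The role name, the inline policy name, and the bucket-wide S3 permissions are my assumptions rather than part of the original setup, so adjust them to your own security requirements:

import json
import boto3

iam = boto3.client('iam')

# AWS Transfer must be allowed to assume this role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "transfer.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

# hypothetical role name; pick your own
role = iam.create_role(
    RoleName='sftp_role',
    AssumeRolePolicyDocument=json.dumps(trust_policy)
)

# inline policy granting access to the SFTP bucket (assumption: access to
# this one bucket is enough, since the per-user scope-down policy narrows it)
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": [
            "s3:ListBucket",
            "s3:GetBucketLocation",
            "s3:GetObject",
            "s3:PutObject",
            "s3:DeleteObject",
            "s3:GetObjectVersion",
            "s3:DeleteObjectVersion"
        ],
        "Resource": [
            "arn:aws:s3:::your-bucket-name",
            "arn:aws:s3:::your-bucket-name/*"
        ]
    }]
}

iam.put_role_policy(
    RoleName='sftp_role',
    PolicyName='sftp_s3_access',
    PolicyDocument=json.dumps(bucket_policy)
)

print(role['Role']['Arn'])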

HOW TO USE THIS:

prerequisites/assumptions:

  1. you have an AWS SFTP server in place, with an S3 bucket already created for this purpose

  2. you have manually created the IAM role (the sftp_role above) that grants the Transfer service access to the bucket

  3. you have AWS access keys on your local PC for boto3 to pick up when calling the API endpoints (see the quick check below)
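
A quick way to confirm the credentials are being picked up (a sanity check I am adding here, not part of the original script):

import boto3

# prints your account ID and caller ARN if credentials are configured correctly
print(boto3.client('sts').get_caller_identity())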

usage

Save this file as aws_sftp_user.py. The code is implemented with Click, so you can pass the user name on the command line (or let Click prompt you for it) and everything else is a breeze :p

python aws_sftp_user.py --user arjundandagi

The private key to use for SFTP will be stored at ~/Downloads/pems/{user}. You can modify this path to store the files somewhere else (make sure the path exists).
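
To connect as the new user, point sftp at your server's endpoint with the generated private key. The server ID below is a placeholder; the hostname follows the standard AWS Transfer format of {server-id}.server.transfer.{region}.amazonaws.com:

sftp -i ~/Downloads/pems/arjundandagi arjundandagi@s-1234567890abcdef0.server.transfer.ap-southeast-1.amazonaws.com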

Thanks for reading,

cheers.
