Arlekin M.


122 euro
3 years

My experience


BIG BOTS (PRIVATE LIMITED), October 2020 - January 2021

DevOps Engineer
(Remote contractor)
- Built multi-tenant shared Kubernetes clusters on GKE and EKS to deploy a SaaS Django machine-learning chatbot application.
- Met the agreed customer service levels, keeping production environments online at least 95 percent of the time.
- Established a monitoring mechanism that detected performance bottlenecks in the application early.
- Dramatically reduced the time from source-code update to deployment through a fully automated CI/CD pipeline.
- Improved response capacity to failures in the application code or the infrastructure by handling all configuration from the central Git repository, increasing transparency.
- Fully automated the configuration and deployment of all resources needed to stand up the application for new customers.
- Applied the GitOps operational framework to infrastructure automation, using Terraform to deploy AWS and GCP resources. The app can be deployed to EKS or GKE depending on the value of the GitLab variable "CLOUD".
- Deployed a GKE cluster with three node pools for staging, production, and complementary workloads; the staging nodes are preemptible virtual machines.
- Created public and private buckets for each customer stack in both clouds.
- Wrote Kubernetes YAML manifests (Service, Ingress, ConfigMap, Secret, HorizontalPodAutoscaler) for the components of the apps.
- Deployed three Cloud SQL instances for the staging, production, and complementary databases.
- Used Terraform to implement and configure GCP Cloud Functions and Cloud Scheduler for database backups, because backups created by the Cloud SQL automatic-backup feature are lost when the instance is deleted.
- Created CI/CD pipelines in GitLab CI to deploy the app components to either cloud.
- Created GitLab pipelines to create or delete a customer stack automatically, e.g. creating databases, database users, passwords, Kubernetes objects, and infrastructure resources.
- Created GitLab pipelines to deploy and configure complementary apps such as Keycloak and Kubecost.
- Dockerized the Django app and related components, including the Vue.js frontend, PgBouncer, and the GCP Cloud SQL Proxy, among others.
- Deployed and configured the AWS CodeGuru profiler.
- Deployed an EKS cluster with three node groups for staging, production, and complementary workloads.
- Configured the EKS cluster autoscaler component.
- Deployed AWS Elasticsearch Service with Terraform.
- Implemented and configured GitLab managed apps, such as the NGINX ingress controller, certbot, and the Kubernetes gitlab-runner, with Terraform.
- Deployed and configured Rancher; imported the EKS and GKE managed Kubernetes clusters.
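The per-customer Kubernetes manifests mentioned above (Service, Ingress, ConfigMap, Secret, HorizontalPodAutoscaler) lend themselves to templating. A minimal illustrative sketch of that idea, with hypothetical naming conventions (`<customer>-<env>-web`) since the real scheme is not shown here:

```python
import json

def hpa_manifest(customer: str, env: str,
                 min_replicas: int = 1, max_replicas: int = 5) -> dict:
    """Render a HorizontalPodAutoscaler manifest for one customer stack.

    Naming scheme and CPU target are placeholder assumptions, not the
    project's actual conventions.
    """
    name = f"{customer}-{env}-web"
    return {
        "apiVersion": "autoscaling/v2",
        "kind": "HorizontalPodAutoscaler",
        "metadata": {"name": name, "namespace": f"{customer}-{env}"},
        "spec": {
            "scaleTargetRef": {"apiVersion": "apps/v1",
                               "kind": "Deployment", "name": name},
            "minReplicas": min_replicas,
            "maxReplicas": max_replicas,
            "metrics": [{
                "type": "Resource",
                "resource": {"name": "cpu",
                             "target": {"type": "Utilization",
                                        "averageUtilization": 80}},
            }],
        },
    }

if __name__ == "__main__":
    # Emit the manifest as JSON (a valid subset of YAML for kubectl apply)
    print(json.dumps(hpa_manifest("acme", "staging"), indent=2))
```

Generating manifests from one function per object kind keeps the per-customer stack creation pipeline DRY: each new customer is just another set of parameters.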

BUSINESS IT SYSTEMS PTE LTD, October 2017 - January 2019

(Remote contractor)
- Multi-tenant shared platform to support the CI/CD of customized customer Odoo-PostgreSQL-MongoDB stacks in the AWS cloud.

- Implemented a Marathon/Mesos (DC/OS) cluster in AWS using Ansible playbooks and Terraform. AWS services used: security groups, IAM, EC2 (public and private cluster nodes), VPC, ELB, and S3.
- Developed JSON DC/OS deployment description template files.
- Deployed Marathon-LB (HAProxy plus dynamic configuration) as the SSL-termination and layer-7 edge load balancer where all the vhosts belonging to each customer were configured. This service was set up in high availability.
- Dockerized custom Postgres, MongoDB, and Odoo apps.
- Deployed an ELK stack (Filebeat, Logstash, Elasticsearch, and Kibana) for central logging of apps and DC/OS services.
- Implemented a Nexus server in DC/OS to host a private Docker registry and a Python pip repository.
- Deployed Marathon ACME for the automatic generation of free Let's Encrypt SSL certificates.
- Implemented a self-managed GitLab server in DC/OS with automatic scheduled backups to an S3 bucket.
- Deployed and configured Nagios and NRPE clients with Ansible. Configured checks for disk space, memory, load, swap space, DC/OS services, and app HTTP endpoint health.
- Deployed a self-healing, autoscaling Jenkins cluster on AWS. This deployment was fully automated with a CloudFormation template and uses an ECS cluster, Auto Scaling groups, CloudWatch alarms, EFS storage, a load balancer, and a VPC.
- Created Jenkins pipeline jobs to automate every task related to the installation and configuration of complementary apps such as the ELK stack, Nagios, GitLab, Nexus, and the DC/OS cluster itself.
- Developed a CI/CD pipeline that automates the whole process of building, testing, and deploying the customer stacks in the staging and production environments. The pipeline was developed using Jenkins best practices, such as the scripted pipeline language and shared libraries, to keep the pipeline code DRY; it is triggered automatically after a git push via the GitLab-Jenkins integration.
- Developed Jenkins pipelines to back up and restore, on a schedule following a backup plan, the databases of all customers (PostgreSQL), the ELK stack, GitLab, Jenkins, MongoDB, and any other source of data. The status results of the backups are sent by SMS or email using the AWS SNS service.
- Improved security by configuring the AWS Systems Manager Session Manager service on every instance, replacing bastion-based SSH access with terminal sessions governed by IAM user permissions. Also used Systems Manager to deploy triggers replacing the function of the Linux scp command.
- Deployed AWS Ops Automator using CloudFormation and set up automated, scheduled snapshot backups of important EBS volumes. Ops Automator is based on Amazon CloudWatch, SNS, Lambda, DynamoDB, an SQS queue, and S3.
- Created a Jenkins pipeline to fully automate the creation of a customer stack. The scripted pipeline creates or clones a project in GitLab (Odoo addons) using the GitLab REST API, creates the staging and production branches, creates the Python pip requirements file inside the addons repository, assigns members and global deploy keys using the python-gitlab library, stores the customer credentials (PostgreSQL password) in the Jenkins credentials store, generates the Jenkinsfile with custom properties to deploy Odoo and the custom Dockerfile, and creates a fully configured Jenkins multibranch job with the Job DSL language.
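Two of the steps in the customer-stack pipeline above, rendering the pinned pip requirements file and building the GitLab REST calls that create the staging/production branches, can be sketched in a few lines. This is an illustrative sketch, not the project's code; the instance URL and token are placeholders, and the request is built but not sent:

```python
import urllib.request

GITLAB_URL = "https://gitlab.example.com"  # hypothetical instance
API = f"{GITLAB_URL}/api/v4"

def requirements_content(addons: dict) -> str:
    """Render the pip requirements file for an addons repository (pinned versions)."""
    return "\n".join(f"{pkg}=={ver}" for pkg, ver in sorted(addons.items())) + "\n"

def branch_request(project_id: int, branch: str,
                   ref: str = "master") -> urllib.request.Request:
    """Build the GitLab REST API request that creates a branch (not sent here)."""
    url = f"{API}/projects/{project_id}/repository/branches?branch={branch}&ref={ref}"
    return urllib.request.Request(url, method="POST",
                                  headers={"PRIVATE-TOKEN": "<token>"})

if __name__ == "__main__":
    print(requirements_content({"requests": "2.31.0", "psycopg2": "2.9.9"}))
    for env in ("staging", "production"):
        print(branch_request(42, env).full_url)
```

In practice the python-gitlab library wraps these endpoints; the raw-request form just makes the API shape visible.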

- Developed an Azure ARM template to deploy the VeChain toolchain app on the Azure Marketplace.
- Defined parameters and variables in the ARM template.
- Configured the location of the setup bash script and related Docker files in Blob Storage.
- Configured the public IP address, network security group, network interface, load balancer, and virtual machine resources.
- Wrote the setup script, which installs the blockchain app dependencies: Docker, Docker Compose, Nginx, Redis, and MySQL. The script also generates and applies the Nginx configuration, the app-related configuration, and the docker-compose file, which is executed at the end of the script.
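The config-generation step of such a setup script boils down to filling a vhost template with the deployment's values. A minimal sketch of that idea (server name and port are invented examples, and the real script was bash, not Python):

```python
# Reverse-proxy vhost template; doubled braces are literal nginx braces.
NGINX_TEMPLATE = """\
server {{
    listen 80;
    server_name {server_name};

    location / {{
        proxy_pass http://127.0.0.1:{app_port};
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }}
}}
"""

def render_nginx_conf(server_name: str, app_port: int) -> str:
    """Render a reverse-proxy vhost for the app (values are placeholders)."""
    return NGINX_TEMPLATE.format(server_name=server_name, app_port=app_port)

if __name__ == "__main__":
    print(render_nginx_conf("vechain.example.com", 8080))
```

The rendered file would then be written under `/etc/nginx/conf.d/` and nginx reloaded, which is what "generates and applies the Nginx configuration" amounts to.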


Automate a Laravel application Deployment to Managed DigitalOcean
Kubernetes with CircleCI
Jan / 2020
DevOps expert
Project Details:
Laravel application for which the client wanted an automated deployment to DigitalOcean Managed Kubernetes. After deployment, the application is served through the NGINX ingress controller.

Technologies used: Linux, CircleCi, Github, Kubernetes, Docker, Laravel, PHP, Nginx

freelancer.com, March 2017 - March 2017

Printing remote files over SSH to a local printer
I need to print files from the server to my local machine using an SSH tunnel (PuTTY).
For this job, at least an SSH server, CUPS, and PuTTY (local machine) have to be set up.
The job is done once I can print on my local PDFCreator printer by typing an "lp -d xxxx xxxx.txt" instruction in my SSH terminal.
The remote server is running Linux Mandriva 2006.
The local machine is running Windows 7.

Technologies used: Linux, Unix, CUPS, SSH tunnel
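The finished setup amounts to running `lp` on the remote host against a CUPS queue that reaches the local printer through the SSH tunnel. A small sketch of how a client script could assemble that command (host, queue, and file names are placeholders, and the command is only built, not executed):

```python
import shlex

def remote_print_command(host: str, printer: str, path: str) -> list:
    """Build the ssh invocation that prints a remote file on the tunneled queue.

    Assumes the remote CUPS queue `printer` is already configured to reach
    the local printer through the SSH tunnel, as in the job description.
    """
    remote_cmd = f"lp -d {shlex.quote(printer)} {shlex.quote(path)}"
    return ["ssh", host, remote_cmd]

if __name__ == "__main__":
    # Would be passed to subprocess.run(...) to actually print
    print(" ".join(remote_print_command("user@server", "PDFCreator", "report.txt")))
```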

freelancer.com, March 2017 - March 2017

Write a Python or bash script to organize files
I need a script to move files from one directory to the right subdirectory under another directory.
There are files in directory Alpha. Under directory Beta there are many subdirectories with names.
I need a script, either Python or bash, to evaluate each file in directory Alpha and find the subdirectory in Beta whose name matches one of the words in the file's name. When it finds a match, it moves the file to that subdirectory. If there is no match, the file stays in Alpha.
Technologies used: Linux, Python

Ministry of Planning and Development of Venezuela, April 2009 - December 2009


- Structured cabling deployment.
- Deployment of staging and production LAMP (Linux + Apache + MySQL + PHP) servers for developer work.
- Implementation of an IP telephony system with Mitel smart telephones.
- Segmentation of a single broadcast-domain network of more than 1000 devices using Cisco equipment and VLAN technology.
- Deployment of DNS, web server, DHCP, backup procedures, a Samba domain controller, and email delivery on the Debian Linux OS.
- Management of Microsoft SQL Server and Windows Server.

Unidad Clinica Quirurgica Noreste, May 2006 - November 2006

- Managed two servers under a Debian Linux environment that provided file and printer sharing, domain controller, DNS, and DHCP services.
- Deployment of the Sybase Adaptive Server Anywhere 9 database.


Create bash script on Ubuntu to receive and post-process email

My stack

Virtualization, UNIX, Ubuntu, Terraform, SonarQube, Samba, Redis, PostgreSQL, Node.js, Nginx, Nagios, MySQL, MongoDB, Machine Learning, Logstash, Linux, Laravel, Kubernetes, Kibana, JSON, Jenkins, HTTPS, HAProxy, Groovy, GitLab CI, GitLab, GitHub, Git, Elastic Stack, DSL, Docker, DevOps, Cisco Switches/Routers, CircleCI, Blockchain, Bitbucket, Bash scripting, Ansible, Amazon Web Services, Active Directory