elishatheodore@resume:~
$ ./resume.sh --init
drwxr-xr-x
Elisha Theodore
/
[ACTIVE]
-rw-r--r--
title.txt
"Cloud & DevOps Engineer | Building Scalable, Cost-Efficient Infrastructure"
-rw-r--r--
location.conf
"Cape Town, South Africa | Open to Remote"
drwxr-xr-x
contact/
[2 items]
-rw-r--r--
email.txt
"elisha@elisha.app"
drwxr-xr-x
links/
[3 links]
-rw-r--r--
portfolio.link
"www.elisha.app/projects"
-rw-r--r--
github.link
"github.com/elishatheodore"
-rw-r--r--
linkedin.link
"linkedin.com/in/elishatheodore"
$ cat profile.txt
Elisha Theodore

Professional Summary

Cloud, DevOps & Data Engineering leader architecting and implementing enterprise-grade Azure & AWS solutions with Kubernetes, Docker, CI/CD, Terraform, and advanced data pipelines. Proven track record of delivering 99.9% uptime, reducing infrastructure costs by 40%, and automating 80%+ of manual processes. Specialized in cloud-native architectures, microservices, serverless computing, and DevSecOps practices across distributed systems and high-availability environments.

I help businesses:

Reduce cloud costs

Optimize infrastructure spending while maintaining performance and reliability

Deploy faster with CI/CD

Automate deployment pipelines and accelerate time-to-market

Build scalable infrastructure

Design production-ready systems that grow with your business

What I Can Do For You

Kubernetes Orchestration

Container orchestration and microservices management

Cloud Architecture (Azure)

Design and implement scalable cloud solutions

DevOps Automation (CI/CD)

Build automated deployment pipelines

Infrastructure as Code (Terraform)

Manage infrastructure with code

Cost Optimization

Reduce cloud spending significantly

System Scalability

Design systems that scale effortlessly

Technical Skills

Comprehensive expertise across cloud platforms, DevOps automation, and data engineering technologies

Cloud Platforms

Azure AWS Azure Functions Azure Logic Apps Azure Storage Azure SQL

Container Orchestration

Docker Kubernetes AKS Helm Container Registry

Infrastructure as Code

Terraform Bicep ARM Templates Azure Policy

CI/CD & DevOps

GitHub Actions Azure DevOps Jenkins Azure Pipelines GitLab CI

Data Engineering

Python SQL ETL/ELT Microsoft Fabric Power BI Azure Synapse Data Factory

Development & Programming

TypeScript JavaScript FastAPI React Node.js REST APIs

Security & Monitoring

Azure Security Center Sentinel Application Insights Monitor Log Analytics

Featured Projects

Enterprise solutions showcasing cloud architecture, DevOps automation, and data engineering

Kubernetes Platform

Production System

LIVE
Docker AKS CI/CD

Enterprise container orchestration with automated deployments, scaling, and comprehensive monitoring delivering 99.9% uptime.

deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: microservice-api
spec:
  replicas: 3

AI DevOps Agent

Intelligent System

ACTIVE
AI Python Terraform

Intelligent automation agent that orchestrates DevOps workflows and manages infrastructure deployments using machine-learning insights.

agent.py
from azure.identity import DefaultAzureCredential
from fastapi import FastAPI

class DevOpsAutomationAgent:
    def __init__(self, ml_model):
        self.credential = DefaultAzureCredential()  # auth for Azure deployments
        self.app = FastAPI()
        self.ml_model = ml_model  # model that recommends deployment configs

    async def deploy_infrastructure(self, config):
        # Let the model propose an optimized configuration before deploying
        optimized_config = await self.ml_model.predict(config)
        return {"status": "success", "config": optimized_config}

Azure Infrastructure

Cloud Platform

LIVE
Azure Terraform Bicep

Comprehensive Azure cloud infrastructure with automated provisioning, monitoring, and cost optimization achieving 40% cost reduction.

main.bicep
resource storageAccount 'Microsoft.Storage/storageAccounts@2022-09-01' = {
  name: 'st${uniqueString(resourceGroup().id)}'
  location: resourceGroup().location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
  properties: {
    minimumTlsVersion: 'TLS1_2'
    allowBlobPublicAccess: false
  }
}
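The cost-optimization claim above can be illustrated with a minimal right-sizing heuristic in Python. This is a hypothetical sketch, not the actual tooling: the SKU ladder, the 30% CPU threshold, and the function name are all illustrative assumptions.

```python
# Hypothetical VM right-sizing heuristic (illustrative only).
# Cheapest SKU first; sustained low CPU suggests stepping one size down.
SKU_ORDER = ["Standard_D2s_v5", "Standard_D4s_v5", "Standard_D8s_v5"]

def recommend_sku(current: str, avg_cpu_pct: float) -> str:
    """Suggest one size down when average CPU stays under 30%."""
    i = SKU_ORDER.index(current)
    if avg_cpu_pct < 30 and i > 0:
        return SKU_ORDER[i - 1]
    return current
```

In practice a recommendation like this would be fed by real utilization metrics (e.g. from Azure Monitor) and reviewed before any resize.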

Data Pipeline

Analytics System

ACTIVE
Spark Databricks SQL

Real-time data processing pipeline with ETL workflows, analytics, and visualization processing 10M+ records daily.

pipeline.py
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

def process_data():
    spark = SparkSession.builder.appName("DataPipeline").getOrCreate()
    
    # Read source data
    df = spark.read.parquet("/data/source/")
    
    # Transform data
    transformed = df.filter(col("status") == "active")
    
    # Write to destination
    transformed.write.parquet("/data/processed/")
    
    return "Processing completed"

Professional Experience

Enterprise-level cloud architecture, DevOps leadership, and data engineering expertise

Consultant – Cloud, Data & Infrastructure Systems

2022 – Present
Current Role
  • Led the design and deployment of Azure-based cloud infrastructure (VMs, VNets, subnets, NSGs, secure connectivity), overseeing cross-functional teams to support $4M+ in business-critical systems while improving application uptime by 35%. Implemented infrastructure monitoring and observability to enhance system visibility and reduce incident response times.
  • Directed the architecture and optimization of cloud-native compute and Kubernetes-based containerized workloads, guiding engineering teams to build scalable, resilient environments that handled 50% more client traffic and reduced operational costs by $250,000 annually. Integrated observability (metrics, logging, and tracing) to ensure high availability and proactive performance management.
  • Managed end-to-end data engineering initiatives, leading the design and implementation of ETL pipelines that reduced reporting cycle times by 60% and enabled data-driven decisions impacting $2M in revenue. Introduced data pipeline observability to improve data quality, reliability, and monitoring of critical workflows.
  • Spearheaded data platform modernization, coordinating teams to build and orchestrate workflows using Microsoft Fabric, Azure-native services, and Kubernetes pipelines, while integrating CI/CD pipelines and DevOps practices to automate 80% of manual processes and save over 500 hours annually.
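The pipeline-observability pattern described in this role can be sketched in plain Python. Every name and stage below is a hypothetical stand-in, not the actual client code: the idea is simply to time and log each ETL stage so failures and slowdowns are visible.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def run_stage(name, fn, data):
    """Run one pipeline stage and record row count and duration."""
    start = time.perf_counter()
    out = fn(data)
    log.info("stage=%s rows=%d duration=%.3fs",
             name, len(out), time.perf_counter() - start)
    return out

def extract(_):
    # Stand-in for reading from a source system
    return [{"id": 1, "status": "active"}, {"id": 2, "status": "inactive"}]

def transform(rows):
    # Keep only active records
    return [r for r in rows if r["status"] == "active"]

def load(rows):
    # Stand-in for a warehouse write
    return rows

def run_pipeline():
    data = run_stage("extract", extract, None)
    data = run_stage("transform", transform, data)
    return run_stage("load", load, data)
```

A real deployment would ship these stage metrics to a monitoring backend (e.g. Application Insights) rather than the local log.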

IT Project Analyst / Project Manager

Heritage Bank Plc | 2016 – 2017
  • Directed the deployment of IT systems and infrastructure, coordinating engineering and operations teams to enable seamless integration with business processes, reducing system downtime by 30% and strengthening operational continuity.
  • Led legacy data recovery project, converting 130,000+ customer records from manual paper to electronic databases; implemented and operated an electronic document management system (EDMS), recovering lost information and enabling debt recovery worth over $600,000.
  • Oversaw technical implementation and environment configuration, collaborating with engineering teams to accelerate delivery timelines by 25% while improving reliability across 10+ mission-critical applications.

Operations Manager

CHISAAP Global Link Ltd | 2012 – 2013
  • Implemented performance tracking systems, improving operational efficiency across multiple departments by 30% and enabling faster, data-driven decision-making.
  • Coordinated workflow improvements, streamlining processes and reducing reporting cycle times by 40%, supporting more timely management actions and operational cost savings of $200,000 annually.

QAQC Analyst

NNPC Limited | 2009
  • Conducted quality assurance analysis on engineering and operational data, ensuring accuracy and compliance in reporting processes.

Education

Executive MBA

Specialized in strategic leadership, technology-driven innovation, and digital transformation to scale organizations and drive efficient, data-informed business performance.

Bachelor of Engineering

Trained in designing and optimizing complex, large-scale systems, with strong emphasis on efficiency, reliability, and data-driven decision-making across interconnected processes.

Technical Metrics & KPIs

99.9%

System Uptime

40%

Cost Reduction

80%+

Automation Rate

50%

Performance Boost

Cloud Infrastructure

Azure Resources Managed: 500+
Kubernetes Clusters: 15+
Docker Containers: 2,000+

Data Engineering

ETL Pipelines: 50+
Data Processed Daily: 10TB+
API Integrations: 100+

DevOps Excellence

CI/CD Pipelines: 75+
Deployments/Day: 200+
IaC Modules: 300+

Interactive Terminal

elishatheodore@cloud-engineer:~$
bash
$ whoami
elisha_theodore - Senior Cloud, DevOps & Data Engineer
$ cat skills.txt
🔹 Cloud Platforms: Azure, AWS
🔹 Container Orchestration: Kubernetes, Docker
🔹 Infrastructure as Code: Terraform, Bicep, ARM
🔹 CI/CD: GitHub Actions, Azure DevOps
🔹 Data Engineering: Python, SQL, ETL/ELT
🔹 Development: TypeScript, FastAPI, React
$ systemctl status cloud-infrastructure
● cloud-infrastructure.service - Enterprise Cloud Infrastructure
Loaded: loaded (/etc/systemd/system/cloud-infrastructure.service; enabled; vendor preset: enabled)
Active: active (running) since Thu 2026-01-01 00:00:00 UTC; 365 days ago
Main PID: 1337 (cloud-engineer)
CPU: 99.9% | Memory: 256GB | Network: 10Gbps
$ kubectl get nodes
NAME                  STATUS   ROLES    AGE    VERSION
aks-master-node-001   Ready    master   365d   v1.28.3
aks-worker-node-001   Ready    worker   365d   v1.28.3
aks-worker-node-002   Ready    worker   365d   v1.28.3
$ terraform apply -auto-approve
module.aks-cluster.azurerm_kubernetes_cluster.main: Creating...
module.vnet.azurerm_virtual_network.main: Creation complete after 2s [id=/subscriptions/...]
Apply complete! Resources: 42 added, 0 changed, 0 destroyed.
$ docker ps --format "table {{.Names}}\t{{.Status}}\t{{.Ports}}"
NAMES              STATUS        PORTS
api-gateway        Up 365 days   0.0.0.0:80->80/tcp
data-processor     Up 365 days   0.0.0.0:8080->8080/tcp
monitoring-stack   Up 365 days   0.0.0.0:3000->3000/tcp
$ _

Get In Touch

I'm always interested in hearing about new opportunities and exciting projects.