Hello, I'm

Pareshkumar Mahyavanshi

Software Developer

Specializing in Python, DevOps with AWS Cloud Platform, Data Engineering, and Web Development

About Me

I'm a Python Developer with extensive experience in DevOps, cloud infrastructure, and end-to-end automation. I specialize in designing scalable, secure, and production-grade systems that streamline complex business workflows using Python and cloud-native technologies.

I have deep expertise in AWS services such as EC2, S3, RDS, IAM, Lambda, Secrets Manager, and VPC networking. I build and deploy infrastructure using Terraform and CloudFormation, with configuration management handled by Ansible.

My DevOps toolkit includes containerization with Docker, orchestration with Kubernetes, and CI/CD pipeline automation using GitLab CI/CD, GitHub Actions, Jenkins, and Apache Airflow. I also integrate Databricks for data pipeline and analytics workflows.

I develop robust backend solutions using Django and Flask, incorporating role-based access, secure authentication, and modular architecture. My database experience spans PostgreSQL and MongoDB, and I’ve worked extensively on API development, data processing automation, and cross-functional collaboration in cloud environments.

6+ Years Experience

20+ Projects

3 Degrees

My Skills

Languages

  • Python (Backend, DevOps, Django/Flask)
  • Shell/Bash (System & Automation Scripts)
  • YAML/HCL (Terraform, K8s, Config Files)
  • SQL (PostgreSQL, ORM, Manual Queries)
  • HTML/CSS/Jinja2 (Web & Templating)
  • R (Data Analytics & Visualization)

Cloud & DevOps

  • AWS (EC2, S3, RDS, Lambda, IAM, EFS, VPC)
  • Docker, Kubernetes, Helm
  • GitLab CI/CD, GitHub Actions, Jenkins
  • Terraform, AWS CDK, Ansible
  • Secrets Manager, CloudWatch, Route 53

Data Engineering

  • Apache Airflow (DAG Scheduling)
  • ETL Pipeline Design & Automation
  • PostgreSQL & MongoDB
  • Data Validation & Schema Design
  • Secure Data Transfer (SFTP)
  • Databricks Analysis & Processing

Web Development

  • Django (ORM, Auth, Forms, Views)
  • Flask (Blueprints, REST Routes)
  • HTML5/CSS3/JavaScript
  • Jinja2 Templates
  • Responsive Design
  • Pagination & Dynamic UI

Technology Ecosystem

Complete tech stack I work with across different domains

  • AWS: EC2, S3, RDS, Lambda, IAM, EFS, Secrets Manager, SSM, VPC, Route 53, CloudWatch
  • Containers: Docker & Docker Compose, Kubernetes (kubeadm), kubectl, Kind, Minikube, Helm
  • CI/CD: GitLab CI/CD, GitHub Actions, Jenkins, Apache Airflow
  • IaC & Config: Terraform (HCL), AWS CDK (Python), Ansible, dotenv
  • Security: Tailscale VPN, AWS Secrets Manager, Security Groups/NACLs, SSM Session Manager
  • Data: Airflow, Databricks, PostgreSQL, MongoDB
  • Web: Django (Auth, ORM), Flask (Blueprints), HTML5/CSS3/JavaScript, Jinja2 Templates
  • Web UX: Pagination, Prefilled Forms, Conditional Fields, Autocomplete
  • Other: Jira & Confluence, Markdown & Documentation, Git & GitHub, Logging/Monitoring
  • Databases: PostgreSQL (RDS), MongoDB (NoSQL), SQL, Document DB Modeling
  • Data Validation: YAML Schema Definitions, Custom Python Validators, Nested Object Validation, Auto-increment ID Logic
  • Data Pipelines: Apache Airflow (DAGs), Jenkins (SFTP/Files), Databricks Processing, Containerized Workflows
  • Data Security: Secrets Masking, .env Configuration, AWS Secrets Manager, Secure Data Transfer
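To illustrate the data-validation items above (custom Python validators, nested-object validation, auto-increment IDs), here is a minimal, dependency-free sketch; the schema and record shapes are hypothetical, not production definitions:

```python
# Minimal sketch of a custom nested-object validator with auto-increment IDs.
# The schema and record shapes here are hypothetical illustrations.
from itertools import count

_next_id = count(1)  # auto-increment ID source

SCHEMA = {
    "name": str,
    "contact": {"email": str, "phone": str},
}

def validate(record: dict, schema: dict = SCHEMA, path: str = "") -> list[str]:
    """Return a list of validation errors; an empty list means the record is valid."""
    errors = []
    for key, expected in schema.items():
        where = f"{path}.{key}" if path else key
        if key not in record:
            errors.append(f"missing field: {where}")
        elif isinstance(expected, dict):  # nested object: recurse
            if isinstance(record[key], dict):
                errors.extend(validate(record[key], expected, where))
            else:
                errors.append(f"expected object at: {where}")
        elif not isinstance(record[key], expected):
            errors.append(f"wrong type at: {where}")
    return errors

def ingest(record: dict) -> dict:
    """Attach an auto-incremented ID to a record that passes validation."""
    errors = validate(record)
    if errors:
        raise ValueError("; ".join(errors))
    return {"id": next(_next_id), **record}
```

A rejected record raises `ValueError` listing every failing path, which keeps pipeline error reports actionable.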

Work Experience

Software Developer / DevOps Engineer

Pythonwise Inc. (End Client: Prealize Health Inc.) Apr 2021 – Oct 2024

Designed and deployed secure, automated data pipelines to process sensitive healthcare data in compliance with HIPAA and HITRUST standards.

Migrated legacy systems (Shell, SQL, R) to a modern, cloud-native architecture using Python, Apache Airflow, and AWS.

Led end-to-end automation of data ingestion, transformation, analytics integration, and result delivery pipelines.

Enhanced CI/CD workflows using GitLab, Jenkins, Apache Airflow, and Databricks to support rapid and reliable deployments.

Developed a secure SFTP automation system for encrypted client file transfers and result distribution.

Managed production and staging environments, ensuring high availability and operational efficiency across all pipelines.

Collaborated with cross-functional teams to deliver scalable, reliable, and performant systems.

Implemented infrastructure as code using Terraform, and containerized services with Docker and Kubernetes.

Built dynamic configuration systems using YAML, CSV, and Jinja2 templates to support reusable and flexible workflow automation.
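The CSV-driven part of such a configuration system can be sketched with only the standard library; `string.Template` stands in for Jinja2 here, and the column names and template are hypothetical:

```python
# Sketch of CSV-driven config rendering: each CSV row produces one rendered
# config document. string.Template is a stdlib stand-in for Jinja2.
import csv
import io
from string import Template

CONFIG_TEMPLATE = Template(
    "client: $client\n"
    "bucket: s3://$bucket/incoming\n"
    "schedule: $cron\n"
)

def render_configs(csv_text: str) -> dict[str, str]:
    """Render one config document per CSV row, keyed by client name."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return {row["client"]: CONFIG_TEMPLATE.substitute(row) for row in rows}
```

Adding a client then becomes a one-line CSV change rather than a hand-edited config file.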

Achieved a 90% reduction in turnaround time by transitioning manual workflows to fully automated pipelines capable of handling 10x data volume growth.

Python, Apache Airflow, Databricks, GitLab, Jenkins, AWS, Terraform, Docker, Kubernetes, DevOps Automation, SFTP

Junior Engineer

Ceramic Tech Inc., Fremont, California, USA Mar 2019 – Oct 2020

Developed a robust internal web application to streamline workflow tracking and improve visibility across production stages.

Built custom job tracking systems using Django, Python, HTML, CSS, and SQLite to enhance automation and reduce manual effort.

Designed and implemented a Manufacturing Operating System (MOS) to manage the entire lifecycle — from raw material input to final product quality checks.

Collaborated with multiple teams to analyze CNC and manual machine performance, ensuring compliance with production standards.

Conducted quality inspections using pneumatic gauges, hand tools, CMMs, and vision systems to ensure products met engineering requirements.

Improved efficiency, traceability, and team collaboration by introducing process documentation and machine operation tracking systems.

Django, Python, HTML, CSS, SQLite, Manufacturing, MOS, Quality Assurance

Education

MS in Learning Technologies and Media Systems

Harrisburg University of Science and Technology

2018 – 2020

MS in Electrical Engineering

Northwestern Polytechnic University

2015 – 2017

BE in Mechanical Engineering

Veer Narmad South Gujarat University (VNSGU)

2008 – 2012

Featured Projects

Here are some of my recent projects. Check out my GitHub for more!

Automated Data Processing Pipeline

End-to-end data processing system built with Apache Airflow for orchestrating complex ETL workflows. Features include schema validation, secure SFTP transfers, AWS S3 storage, event-driven Lambda processing, and data normalization with MongoDB and PostgreSQL databases.
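The normalization step in a pipeline like this can be sketched in plain Python; the field names are illustrative, not the production schema:

```python
# Sketch of a normalization step: raw records arrive with inconsistent
# casing and blank fields; normalized rows are ready for a relational load.
def normalize(raw: dict) -> dict:
    return {
        "member_id": raw["member_id"].strip(),
        "state": raw.get("state", "").strip().upper() or None,  # empty -> NULL
        "enrolled": raw.get("enrolled", "").strip().lower() in ("y", "yes", "true", "1"),
    }
```

In the pipeline, a function like this would run per record between extraction and the PostgreSQL load.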

Python, Apache Airflow, MongoDB/PostgreSQL, AWS Services

DevOps Infrastructure Automation

Infrastructure as Code implementation using Terraform (HCL) and Ansible playbooks to provision and manage AWS resources. Features include multi-environment deployments, security group management, automated CI/CD pipeline setup with GitLab CI/CD, and code quality checks via SonarQube.
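Multi-environment provisioning of this kind is commonly structured by parameterizing resources with an environment variable; a minimal, hypothetical Terraform fragment (all names, ports, and CIDRs are placeholders):

```hcl
variable "environment" {
  type    = string
  default = "staging"
}

variable "vpc_id" {
  type = string
}

# Hypothetical security group; the name is suffixed with the environment
# so staging and production can coexist in one account.
resource "aws_security_group" "app" {
  name   = "app-${var.environment}"
  vpc_id = var.vpc_id

  ingress {
    from_port   = 443
    to_port     = 443
    protocol    = "tcp"
    cidr_blocks = ["10.0.0.0/16"]
  }
}
```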

Terraform, GitLab CI/CD, AWS Services, Ansible

Django Manufacturing System

A complete web-based Manufacturing Operating System built with Django and Python. Features include role-based user authentication, workflow tracking, conditional form fields, prefilled update forms, and responsive design for shop floor tablets and mobile devices.
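Role-based access like this is usually enforced per view; a framework-agnostic Python sketch of the idea (in Django this check would typically live in a view decorator; the `User` shape and role names are illustrative):

```python
# Framework-agnostic sketch of role-based view access.
from dataclasses import dataclass
from functools import wraps

@dataclass
class User:
    name: str
    role: str  # e.g. "operator", "supervisor", "admin"

class PermissionDenied(Exception):
    pass

def require_role(*allowed):
    """Decorator: only users holding one of the allowed roles may call the view."""
    def decorator(view):
        @wraps(view)
        def wrapper(user, *args, **kwargs):
            if user.role not in allowed:
                raise PermissionDenied(f"{user.name} lacks a required role")
            return view(user, *args, **kwargs)
        return wrapper
    return decorator

@require_role("supervisor", "admin")
def close_job(user, job_id):
    return f"job {job_id} closed by {user.name}"
```

Centralizing the check in one decorator keeps each view free of repeated permission logic.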

Django, Python, PostgreSQL, HTML/CSS/JS

Kubernetes Microservices Platform

Containerized microservices architecture deployed on a Kubernetes cluster (kubeadm). Features include auto-scaling deployments, service discovery, ingress configuration, and persistent volume management.
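A service in such a cluster can be sketched as a manifest like the following; the image, names, and probe path are placeholders rather than the project's actual configuration:

```yaml
# Hypothetical Deployment manifest; values are illustrative placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-api
spec:
  replicas: 2
  selector:
    matchLabels:
      app: orders-api
  template:
    metadata:
      labels:
        app: orders-api
    spec:
      containers:
        - name: orders-api
          image: registry.example.com/orders-api:1.0.0
          ports:
            - containerPort: 8000
          readinessProbe:
            httpGet:
              path: /healthz
              port: 8000
```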

Kubernetes, Docker, Prometheus, Microservices

Get In Touch

Feel free to reach out for collaborations or just a friendly chat

Location

Toronto, Ontario, Canada