Career Profile

AI Scientist with 8+ years of experience designing robust ML systems and intelligent infrastructure at scale. I specialize in LLM safety, content moderation, user behavior modeling, and infrastructure automation. I architected the Chagu protocol, built predictive ETL pipelines, and deployed customer-facing personalization systems serving millions. I thrive at the intersection of AI, infrastructure, and trust, building end-to-end pipelines that are safe, scalable, and impactful.

Education

MS (equivalent), Data Science

Y-Data

MBA, Banking and Technology

Bologna University

Computer and Information Sciences and Support Services

Stavropol State Agricultural University

Experience

Sr. DevOps & MLOps Engineer

2024 – Present
Checkpoint
  • Designed, implemented, and maintained scalable infrastructure using Infrastructure as Code (Terraform, Ansible), and container orchestration systems like Kubernetes (AKS, EKS, GKE).
  • Developed and optimized continuous integration and delivery pipelines for seamless code deployments, reducing manual intervention and increasing efficiency across environments.
  • Managed cloud infrastructures on AWS, Azure, and GCP, focusing on high availability, security, and cost-efficiency in production environments.
  • Built ML workflows on platforms such as SageMaker, Kubeflow, and MLflow; automated model training, validation, and deployment processes, ensuring efficient scaling of machine learning models in production.
  • Worked closely with data science and development teams to integrate ML models smoothly into production environments, ensuring performance, reliability, and scalability.
  • Implemented monitoring solutions (Prometheus, Grafana, Datadog) for both infrastructure and ML models to ensure proactive issue detection and resolution, including anomaly detection for model behavior.
  • Enforced security best practices across infrastructure, including the setup of automated vulnerability scans, access management, and data encryption.

Sr. DevOps & MLOps Engineer

2022 – 2024
Ezbob
  • Built infrastructure for executing ML algorithms over big data.
  • Optimized and parallelized ML algorithms to meet latency constraints on a parallel compute grid.
  • Created integrated ML solutions for DevOps.

Principal DevOps Software Engineer

2021 - 2022
Pagaya
  • Integrated ML algorithms into the decision-making process.
  • Created ETLs with Apache Airflow and AWS Glue.
  • Managed capacity planning, high availability engineering, and performance tuning.
  • Created alerts, implemented monitoring tools, and improved visibility into operational metrics to support prioritization and tracking across the team.
  • Developed and optimized build, release, and deployment toolchain.

DevOps, DataOps, Automation - Big Data

2020 - 2021
Pyramid Analytics
  • Updated and deployed machine learning models to production.
  • Worked on data acquisition and developed solutions capable of handling large datasets.
  • Developed data visualization tools for future exploration and model verification.

DevOps, DataOps Engineer

2019 – 2020
Infibond
  • Developed and maintained infrastructure for deploying ML models to production.
  • Created tools for data visualization and model verification.
  • Implemented data validation, security, access, and backup solutions.

Automation and Infrastructure Engineer

2016 – 2018
Ping Identity
  • Designed and tested automated processes.
  • Conducted integration and system tests using Java, Python, Ruby, Appium, and Selenium.

Certifications

AWS Certified Solutions Architect

AWS

Machine Learning Certificate

AlgoExpert

Big Data Engineer

NAYA Technologies

Projects

Here are some of the projects I have worked on, showcasing my skills in Machine Learning, DevOps, and Software Engineering.

  • Chagu - Secure Data Transformation and Transfer Protocol: a project focused on secure data transformation and transfer using blockchain technology, integrating principles from neuroscience and AI.
  • Price Predictor - Machine Learning for Price Estimation: a machine learning model that predicts prices from historical data, leveraging advanced algorithms and data preprocessing techniques.
  • Infrastructure as Code with Terraform: automated infrastructure setup using Terraform, including VPC, EC2, and RDS instances.
  • Platform tool: implemented a secure file transfer protocol using blockchain technology to ensure data integrity.
  • Additional projects are available on my GitHub profile.

Skills & Proficiency

Python, Pandas & NumPy

Scikit-learn, XGBoost & LightGBM

Deep Learning (Keras, TensorFlow, PyTorch)

Machine Learning Ops (SageMaker, MLflow, Kubeflow)

Data Wrangling & Feature Engineering

SQL, BigQuery & Data Lakes

Data Visualization (Plotly, Seaborn, Dash)

Cloud Platforms (AWS, GCP, Azure)

Security-Aware AI (Prompt Injection Defense, Blockchain Auditing)