Fullstack · AI Engineering

Saad Tachrimant

Portfolio

Projects & case studies

A selection of production systems and applied ML work. Each project is written as a lightweight case study: context, role, technical depth, and what “done” looks like in production (tests, monitoring, handoff).

Software systems: Microservices · Integrations

Quanoni Platform

Context: Platform for on-demand and scheduled consultations with routing, notifications, and payments. Role: Built and integrated core backend services and production-ready workflows (auth, messaging, payments, async processing).

Technical depth: OAuth2 + JWT authentication, event-driven flows with Kafka, external API integrations (WhatsApp + payments), and operable services with predictable failure handling (retries, idempotency where needed).
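The retry and idempotency handling above is what keeps notification and payment events safe to redeliver. As a rough illustration of that idempotent-consumer pattern, here is a minimal Python sketch; the production services are Spring Boot consumers on Kafka, and the event shape, store, and handler here are hypothetical:

```python
# Illustrative sketch only (the real services run on Spring Boot / Kafka).
# Shows the idempotent-consumer pattern: each event carries a unique ID and a
# handler is applied at most once, even if the broker redelivers the message.
import time

class IdempotentConsumer:
    def __init__(self, handler, max_retries=3):
        self.handler = handler
        self.max_retries = max_retries
        self.processed_ids = set()  # in production: a durable store (DB table, Redis)

    def consume(self, event):
        event_id = event["id"]  # hypothetical event shape
        if event_id in self.processed_ids:
            return "skipped (duplicate)"
        for attempt in range(1, self.max_retries + 1):
            try:
                self.handler(event["payload"])
                self.processed_ids.add(event_id)
                return "processed"
            except Exception:
                time.sleep(2 ** attempt * 0.1)  # exponential backoff before retrying
        return "failed (would go to a dead-letter queue in a real system)"

# Usage: a redelivered event with the same ID is applied exactly once.
consumer = IdempotentConsumer(handler=lambda p: print("notify:", p))
evt = {"id": "msg-42", "payload": {"template": "booking_confirmed"}}
print(consumer.consume(evt))  # processed
print(consumer.consume(evt))  # skipped (duplicate)
```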

Outcome: Production workflows designed for reliability, observability, and maintainable handoff (tests + operational considerations).

Spring Boot · Spring AI · OAuth2 · JWT · PostgreSQL · Angular · Kafka · WhatsApp API · Payment API

AI product engineering: LLM integration · Auth

Zynerator

Context: Tooling that accelerates application generation and feature delivery through AI-assisted workflows. Role: Integrated LLM capabilities and implemented core platform features around authentication and payments.

Technical depth: Spring AI + LLM API integration, secure JWT-based sessions, MySQL persistence, Angular-based UI, and payment provider integration to support monetized usage.
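One way to read "clear interfaces and controlled flows" is that every model call goes through a single narrow contract that can be stubbed in tests and wrapped with validation and limits. The sketch below illustrates that idea in Python; the actual project uses Spring AI on the JVM, and all names here are hypothetical:

```python
# Illustrative sketch only (the project integrates LLMs via Spring AI on the JVM).
# Shows LLM calls hidden behind a narrow interface so the platform depends on a
# stable contract, not on a specific provider SDK.
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Completion:
    text: str
    model: str

class LlmClient(Protocol):
    def complete(self, prompt: str, max_tokens: int = 256) -> Completion: ...

class StubLlmClient:
    """Deterministic stand-in for tests; a real client would wrap the provider SDK."""
    def complete(self, prompt: str, max_tokens: int = 256) -> Completion:
        return Completion(text=f"[stub response to: {prompt[:40]}]", model="stub-1")

def generate_feature_scaffold(client: LlmClient, spec: str) -> str:
    # Controlled flow: one entry point where validation, logging, and cost
    # limits can be enforced around every model call.
    result = client.complete(f"Generate a CRUD scaffold for: {spec}")
    return result.text

print(generate_feature_scaffold(StubLlmClient(), "invoice entity with PDF export"))
```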

Outcome: AI features integrated in a way that is operationally safe for production (clear interfaces, controlled flows, and maintainable code paths).

Spring Boot · Spring AI · JWT · MySQL · Angular · Payment API · LLM API integration

Software systems: Scheduling · Platform

EngFlexy Platform

Context: Adaptive learning platform with dynamic scheduling, instructor ranking, and daily usage workflows. Role: Implemented backend services and integrations supporting scheduling and operational consistency.

Technical depth: Service registration with Eureka, integrations (payments + calendar), Angular frontend, and WordPress touchpoints where needed.
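For readers unfamiliar with Eureka-style discovery, the toy sketch below shows the underlying idea: services register their instances, and callers resolve a logical service name to a live address. It is an in-memory Python stand-in, not the project's Spring Cloud / Eureka setup, and all names are hypothetical:

```python
# Toy registry illustrating client-side service discovery (the real platform
# uses Netflix Eureka on the JVM).
import itertools

class ToyRegistry:
    def __init__(self):
        self.instances = {}  # service name -> list of "host:port"
        self._cursors = {}   # simple round-robin cursors per service

    def register(self, service, address):
        self.instances.setdefault(service, []).append(address)

    def resolve(self, service):
        pool = self.instances.get(service, [])
        if not pool:
            raise LookupError(f"no registered instance of {service}")
        cursor = self._cursors.setdefault(service, itertools.cycle(pool))
        return next(cursor)

registry = ToyRegistry()
registry.register("scheduling-service", "10.0.0.5:8081")
registry.register("scheduling-service", "10.0.0.6:8081")
print(registry.resolve("scheduling-service"))  # alternates between instances
print(registry.resolve("scheduling-service"))
```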

Outcome: Stable scheduling and integration flows designed for day-to-day operations and maintainability.

Spring Boot · Angular · WordPress · Eureka · Payment API · Calendar API

AI / ML · Research: Edge ML · Distributed

P2P Thermal Forecasting

Context: End-to-end ML system running under real deployment constraints (edge devices, sensor integration, resource limits). Role: Built the training/inference pipeline and distributed communication layer, including data ingestion from sensors.

Technical depth: gRPC-based coordination, time-series storage in InfluxDB, PyTorch models (LSTM/GRU), and robust ingestion from sensor APIs with attention to reliability and reproducibility.
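A minimal sketch of what the forecasting side of such a pipeline can look like, assuming a univariate temperature series feeding an LSTM; layer sizes, window length, and training details are illustrative rather than the project's actual configuration:

```python
# Minimal LSTM forecaster for a univariate temperature series (illustrative).
import torch
import torch.nn as nn

class ThermalLSTM(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # predict the next reading

model = ThermalLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Toy data: sliding windows of 24 readings predicting the 25th.
series = torch.sin(torch.linspace(0, 20, 500)) * 5 + 21  # fake indoor temps (°C)
x = torch.stack([series[i:i + 24] for i in range(400)]).unsqueeze(-1)
y = series[24:424].unsqueeze(-1)

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```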

Outcome: Demonstrates the full ML engineering loop (data → training → evaluation → inference) with distributed and edge constraints in mind.

Python · gRPC · PyTorch · InfluxDB · Sensor API integration · LSTM · GRU

Data · AI / ML: Data quality · Benchmarking

THERMODSET

Context: Dataset and experimentation pipeline designed for trustworthy ML work on thermal/building data. Role: Implemented the preprocessing, quality checks, visualization, and modeling baselines for repeatable experimentation.

Technical depth: Data visualization and automated quality checks, anomaly detection/removal, and model baselines using XGBoost and PyTorch (LSTM).
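A small sketch of the kind of automated quality checks and z-score anomaly removal this describes, in Python with pandas; column names, thresholds, and the toy data are hypothetical, and the real pipeline can use more robust detectors:

```python
# Illustrative quality checks and anomaly removal for a sensor column.
import numpy as np
import pandas as pd

def quality_report(df: pd.DataFrame) -> pd.DataFrame:
    # Per-column summary used to decide whether a sensor stream is usable.
    return pd.DataFrame({
        "missing_ratio": df.isna().mean(),
        "constant": df.nunique() <= 1,
        "min": df.min(),
        "max": df.max(),
    })

def drop_anomalies(df: pd.DataFrame, column: str, z_thresh: float = 4.0) -> pd.DataFrame:
    # Simple z-score filter: keep rows within z_thresh standard deviations.
    z = (df[column] - df[column].mean()) / df[column].std()
    return df[z.abs() < z_thresh]

# Toy example: 50 plausible indoor temperatures plus one obviously bad reading.
rng = np.random.default_rng(0)
temps = rng.normal(21.0, 0.5, size=50).tolist() + [95.0]
data = pd.DataFrame({"indoor_temp": temps})

print(quality_report(data))
clean = drop_anomalies(data, "indoor_temp")   # the 95.0 reading is removed
print(len(data), "->", len(clean))
```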

Outcome: Cleaner datasets and reproducible experiments that make model performance comparisons more reliable.

Python · PyTorch · XGBoost · LSTM · Data visualization · Quality checks · Anomaly removal