Open to AI + Data Engineering roles
Building production-ready systems as a Data Engineer
I design end-to-end data products where interaction data becomes actionable insight in near real time.
About Me
From raw data to clear, trusted decisions.
I build reliable data systems that combine analytics engineering, platform thinking, and GenAI workflows to turn fragmented data into measurable business outcomes.
Skills
Python · SQL · Databricks · Airflow · AWS · Azure · Spark · Spark SQL · PySpark · Delta Lake · Kafka · ClickHouse · dbt · Power BI · FastAPI · RAG · Data Quality · ETL Pipelines
Focus
Data products, platform observability, and analytics engineering with production-first reliability.
Explore
Projects, experience, certifications, contact, telemetry, and a local RAG assistant.
Pipeline
Streaming architecture in the product
How interactions move from the website into the analytics pipeline and back into the telemetry dashboard.
1
Capture · Web App
User interaction layer
Captures user actions and event context.
2
Validate · Event Ingest
API validation layer
Validates payloads and standardizes event shape.
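A minimal sketch of what this validation layer could look like, using only the standard library; the field names and the `validate_and_standardize` helper are illustrative assumptions, not the site's actual code:

```python
from dataclasses import dataclass
import time
import uuid

# Fields a payload must carry to be accepted (illustrative choice).
REQUIRED_FIELDS = {"event_type", "session_id"}

@dataclass(frozen=True)
class Event:
    """Standardized event shape passed downstream to the stream buffer."""
    event_id: str
    event_type: str
    session_id: str
    ts: float

def validate_and_standardize(payload: dict) -> Event:
    """Reject malformed payloads; normalize valid ones into Event."""
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return Event(
        event_id=payload.get("event_id") or str(uuid.uuid4()),
        event_type=str(payload["event_type"]),
        session_id=str(payload["session_id"]),
        ts=float(payload.get("ts", time.time())),
    )
```

Standardizing the shape here means every consumer downstream can rely on the same fields being present and typed.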
3
Stream · Stream Buffer
Kafka / Redpanda
Decouples producers and analytics consumers.
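The decoupling idea can be shown with a stdlib queue standing in for the Kafka/Redpanda topic; this is a sketch of the pattern, not the real broker client:

```python
import queue

# Stand-in for a topic: producers enqueue at their own pace,
# consumers dequeue at theirs, with no direct coupling between them.
buffer: queue.Queue = queue.Queue()

def produce(events: list) -> None:
    """Ingest-side: hand events off and return immediately."""
    for ev in events:
        buffer.put(ev)

def consume(n: int) -> list:
    """Analytics-side: pull a batch whenever the consumer is ready."""
    return [buffer.get() for _ in range(n)]
```

With a real broker, the buffer also survives restarts and lets ingest keep accepting events while analytics consumers catch up.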
4
Query · Analytics Store
ClickHouse warehouse
Stores events for low-latency analytical queries.
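The query shape can be sketched with an in-memory SQLite database standing in for ClickHouse; the table schema and `events_per_type` helper are illustrative assumptions:

```python
import sqlite3

# SQLite stand-in for the warehouse, just to show the aggregate query shape.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_type TEXT, session_id TEXT, ts REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("page_view", "s1", 1.0),
    ("click", "s1", 2.0),
    ("page_view", "s2", 3.0),
])

def events_per_type(conn: sqlite3.Connection) -> dict:
    """Count events by type, the kind of rollup a telemetry view reads."""
    rows = conn.execute(
        "SELECT event_type, COUNT(*) FROM events GROUP BY event_type"
    ).fetchall()
    return dict(rows)
```

In ClickHouse the same GROUP BY stays fast at much larger volumes because of its columnar storage, which is why it fits the low-latency dashboard role here.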
5
Insight · Telemetry View
Operational dashboard
Presents trends, metrics, and recent activity.
Ordering
Session-level ordering is preserved for reliable behavior traces.
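One common way to get session-level ordering from a partitioned buffer is to key every event by its session ID, so all of a session's events land on one partition and keep their relative order; a sketch, with the partition count chosen arbitrarily:

```python
import zlib

NUM_PARTITIONS = 4  # illustrative; real topics choose this per workload

def partition_for(session_id: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Route all events of one session to the same partition.

    Brokers like Kafka guarantee ordering only within a partition,
    so a stable session-keyed hash is what preserves behavior traces.
    """
    return zlib.crc32(session_id.encode("utf-8")) % num_partitions
```

The hash must be deterministic (hence `crc32` rather than Python's salted `hash`), or a session's events would scatter across partitions between runs.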
Idempotency
Unique event IDs protect metrics from retry duplicates.
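The dedup step this describes can be sketched as a first-wins filter on the event ID; the `deduplicate` helper is illustrative, not the production implementation:

```python
def deduplicate(events: list[dict]) -> list[dict]:
    """Keep the first occurrence of each event_id, dropping retry duplicates
    so downstream counts and metrics are not inflated."""
    seen: set[str] = set()
    unique = []
    for ev in events:
        if ev["event_id"] in seen:
            continue
        seen.add(ev["event_id"])
        unique.append(ev)
    return unique
```

In a long-running consumer the `seen` set would be bounded (e.g. a TTL'd store), since retries only arrive within a limited window.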
Freshness
Near real-time ingestion keeps dashboard signals actionable.
