
Building Data You Can Trust.
Hello, I'm Thalibar Rifqi. I'm a data engineer who builds scalable data environments that teams can trust. Leveraging Airflow, ClickHouse, and dbt, I design systems where quality, context, and history matter as much as the numbers themselves.
About me
While my formal academic background is in Computer Engineering, my expertise in building scalable data environments was forged entirely through hands-on industry experience and continuous self-learning. I discovered my passion for making data practical, and quickly specialized in the Modern Data Stack. Rather than simply moving data from point A to B, I build automated systems that transform fragmented operational records into reliable semantic layers. I thrive on collaborating with cross-functional teams, replacing manual reporting bottlenecks with highly performant models that drive daily business decisions.
My core engineering tools include Apache Airflow, dbt, ClickHouse, Postgres, and Google Cloud Platform, complemented by a strong focus on serving accessible insights through Metabase and Power BI.
I’m currently open to full-time roles and high-impact freelance collaborations. Let’s connect and build a data foundation your team can trust!
My Projects
Enterprise Dimensional Modeling
Architected a centralized data warehouse utilizing Kimball’s dimensional modeling methodology. Designed and deployed robust star schemas structuring complex manufacturing data into intuitive Fact and Dimension tables.
- ClickHouse
- dbt
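As a small illustration of the approach (all table and column names here are hypothetical, not taken from the actual warehouse), a dbt fact model in a star schema joins staged operational records to conformed dimensions on their natural keys:

```sql
-- models/marts/fct_production_orders.sql (illustrative sketch)
-- Fact table: one row per production order, with foreign keys
-- into the date, product, and machine dimensions plus measures.
select
    po.order_id,
    d.date_key,
    p.product_key,
    m.machine_key,
    po.quantity_produced,
    po.scrap_quantity
from {{ ref('stg_production_orders') }} as po
left join {{ ref('dim_date') }}    as d on po.order_date = d.date_day
left join {{ ref('dim_product') }} as p on po.product_id = p.product_id
left join {{ ref('dim_machine') }} as m on po.machine_id = m.machine_id
```

Keeping measures in narrow fact tables and descriptive attributes in dimensions is what lets BI tools slice the same numbers by any attribute without rewriting queries.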

Enterprise Inventory Orchestration
Architected an automated pipeline using Apache Airflow and dbt. Transformed raw warehouse data into a performant ClickHouse semantic layer, empowering operations with Metabase dashboards.
- ClickHouse
- dbt
- Apache Airflow
- Metabase

Historical Data Tracking via dbt Snapshots
Engineered a robust Slowly Changing Dimension (SCD Type 2) architecture using dbt snapshots. Overcame complex ClickHouse configuration constraints to accurately capture historical data mutations.
- ClickHouse
- dbt
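In outline (source and column names are illustrative, not from the production project), a dbt snapshot for SCD Type 2 tracking looks roughly like this:

```sql
-- snapshots/products_snapshot.sql (illustrative sketch)
{% snapshot products_snapshot %}

{{
    config(
        target_schema='snapshots',
        unique_key='product_id',
        strategy='timestamp',
        updated_at='updated_at'
    )
}}

-- dbt compares each run against the last snapshot and appends a new
-- row when a record changes, managing dbt_valid_from / dbt_valid_to.
select * from {{ source('erp', 'products') }}

{% endsnapshot %}
```

The `dbt_valid_from` and `dbt_valid_to` columns dbt maintains are what allow point-in-time queries over dimension history.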

Modular Data Transformations via dbt Macros
Engineered custom dbt macros to enforce the DRY (Don't Repeat Yourself) principle across the data warehouse. Refactored repetitive SQL transformations into centralized, reusable components.
- ClickHouse
- dbt
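A minimal sketch of the idea (the macro name and column are hypothetical examples): a dbt macro wraps a recurring SQL fragment so every model reuses one definition instead of copy-pasting it.

```sql
-- macros/nullify_empty.sql (illustrative sketch)
-- Normalizes blank or whitespace-only strings to NULL.
{% macro nullify_empty(column_name) %}
    nullif(trim({{ column_name }}), '')
{% endmacro %}
```

A model then calls `{{ nullify_empty('customer_name') }}`, which compiles to `nullif(trim(customer_name), '')`; changing the cleaning rule later means editing one macro, not dozens of models.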

My skills
- Airflow
- dbt
- ClickHouse
- Dimensional Modeling
- SQL
- Python
- Postgres
- MySQL
- Google Cloud Platform
- Docker
- Linux
- Metabase
- Power BI
- Git
My experience
Data Engineer — Gresik, East Java
Architected and deployed a centralized Enterprise Data Warehouse to unify fragmented ERP systems. I engineered 20+ automated pipelines using Apache Airflow and implemented Kimball Dimensional Modeling via dbt to create a performant semantic layer for executive decision-making.
2023 - present

Data Quality Operator — Remote
Maintained a 90%+ data quality threshold for a global music database. I specialized in auditing and validating high-volume datasets, collaborating with international teams to ensure the integrity and reliability of platform analytics.
2021 - 2022

Software Engineer Intern — Pasuruan, East Java
Digitized legacy paper-based reporting workflows into a web-based monitoring application. Developed a centralized platform using MySQL and JavaScript to streamline internal problem-reporting and operational monitoring.
2020

Contact me
Please contact me directly at thalibarrifqi@gmail.com or through this form.