- Build and maintain robust ETL pipelines to handle large datasets.
- Develop data models that support analytical needs.
- Collaborate with data scientists and analysts to understand requirements.
- Optimise query performance and storage efficiency.
- Ensure compliance with data governance and security standards.
- Stay up to date with emerging technologies and apply best practices.
- 8+ years of experience in data engineering or a closely related field.
- Strong expertise in SQL and Python.
- Proficiency in tools such as Apache Spark, Kafka, and dbt.
- Familiarity with cloud platforms such as AWS or Azure.
- Understanding of distributed systems and parallel processing.
- Excellent problem-solving skills and the ability to lead technical initiatives.
- Experience with Snowflake or similar data warehouses.
- Knowledge of data lake architectures.