- Design, build, and maintain batch and streaming data pipelines using Airflow (orchestration), dbt (transformations), and AWS S3 for storage.
- Develop data models and ELT jobs for Snowflake and/or ClickHouse; write efficient SQL and Python.
- Implement data quality checks, tests, and documentation; contribute to data governance and security practices.
- Monitor, troubleshoot, and optimize pipeline performance and cost (CloudWatch and warehouse observability).
- Participate in sprint ceremonies and manage your work in JIRA; contribute to backlog refinement.
- Follow and improve team standards for version control, code reviews, CI/CD, and code quality using GitHub.
- Collaborate with analysts, data scientists, and business partners to understand requirements and deliver usable datasets.
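As a concrete illustration of the data-quality-checks responsibility, a minimal dbt schema test might look like the following (the model and column names are hypothetical, not taken from this posting):

```yaml
version: 2

models:
  - name: orders            # hypothetical model name for illustration
    columns:
      - name: order_id
        tests:
          - unique          # built-in dbt test: flags duplicate keys
          - not_null        # built-in dbt test: flags missing keys
```

Running `dbt test` then fails the pipeline run if either check is violated, which is one common way such checks are wired into CI/CD.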
...