Role - Data Engineer
Skill - Snowflake, Apache Airflow, dbt, Spark/PySpark, SQL, and data engineering concepts
Exp - 4-6 years
JD - "Design, build, test and operationalize scalable data pipelines and cloud-native data platforms leveraging Snowflake, Apache Airflow, dbt, and Spark/PySpark
...
-Build scalable data processing frameworks using Spark / PySpark for large-volume structured datasets
-Design, implement, and optimize cloud data warehouse solutions on Snowflake
-Develop modular, testable transformations using dbt, implementing reusable models, snapshots, and data tests. Perform robust testing across multiple layers of the data processing pipeline
-Enable CI/CD for data pipelines integrating Git and deployment workflows"
Duration - 3 months