KEY RESPONSIBILITIES...
● Build and operationalize cloud-based platform components
● Develop production-quality data ingestion pipelines with automated quality checks using GCP and Astronomer (a minimal illustrative sketch appears after this list)
● Assess existing systems architecture and recommend technical improvements
● Develop Python-based automation to support product development and data analytics
● Resolve technical problems as they arise
● Research and propose emerging technologies and platform enhancements
● Evaluate the business impact of technical decisions
● Participate in a collaborative, peer-review-based environment fostering new ideas via cross-team guilds / specialty groups
● Maintain comprehensive documentation of our processes and decision making
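As a concrete illustration of the ingestion work described above, here is a minimal sketch of the kind of pipeline Astronomer runs: an Airflow DAG that loads a daily file from Cloud Storage into BigQuery and then runs a simple row-count quality check. It assumes Airflow 2.x with the Google provider installed; the project, bucket, dataset, and table names are hypothetical placeholders, not details from this posting.

# Minimal sketch only: a daily GCS-to-BigQuery load followed by an automated
# row-count quality check. All resource names below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator


def check_row_count(**_):
    # Fail the task (and alert downstream) if the load produced no rows.
    hook = BigQueryHook(gcp_conn_id="google_cloud_default", use_legacy_sql=False)
    row = hook.get_first("SELECT COUNT(*) FROM `example_project.raw.orders`")
    if not row or row[0] == 0:
        raise ValueError("Quality check failed: example_project.raw.orders is empty")


with DAG(
    dag_id="orders_ingestion",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="example-landing-bucket",                  # hypothetical bucket
        source_objects=["orders/{{ ds }}.csv"],           # one file per run date
        destination_project_dataset_table="example_project.raw.orders",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
    )
    quality_check = PythonOperator(
        task_id="check_row_count",
        python_callable=check_row_count,
    )
    load >> quality_check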
YOUR QUALIFICATIONS...
● Bachelor's Degree in Computer Science or related field
● 3+ years of related information technology experience
● 2+ years building complex ETL pipelines with dependency management (e.g., File Watchers, APIs, Airflow)
● 2+ years of experience with Big Data technologies (Python, Data Lake, BigQuery SQL)
● Industry certifications such as GCP Certified Data Engineer or Solutions Architect (preferred)
● Demonstrated experience with the Scrum Agile methodology
● Strong ability to learn new technologies in a short time
● Must possess well-developed verbal and written communication skills