Data Engineer
About Woodside Energy
We are a global energy company, providing reliable and affordable energy to help people lead better lives. Join our team at Woodside Global Solutions in Bengaluru where talent, digital expertise, and operational excellence converge to solve complex energy challenges, accelerate change, and reimagine business capabilities to support Woodside's global operations and our role in the energy transition.
Founded in 1954, Woodside established the liquefied natural gas (LNG) industry in Australia 40 years ago and supplies customers around the globe. 70 years on, Woodside continues to be driven by a spirit of innovation and determination.
At Woodside, we know great results come from our people feeling valued, getting the support they need to reach their full potential and working in a psychologically and physically safe work environment. We believe in nurturing talent and providing opportunities for continuous learning and career advancement.
Refer to our corporate website for more information about our different locations and projects: What We Do
About Woodside Global Solutions
Woodside Global Solutions in Bengaluru is being built as a hub of excellence to drive innovation, digital transformation, and global collaboration.
Working as one Global team, the Woodside Digital team is a trusted partner driving transformation within the organisation. We are bold in our ambitions and resolute in our actions. Through cutting-edge AI, robust cyber security, and advanced data solutions we drive innovation and influence every part of our business.
We are looking for talented professionals who are passionate about technology and eager to make a global impact, helping to shape the future of Woodside together.
About the role
The Data Engineer applies technical expertise and domain knowledge to design, build, and maintain efficient and robust data solutions. The Data Engineer is responsible for building, testing, and deploying data pipelines on the Enterprise Data Platform (EDP), ensuring that pipelines are developed and deployed with a secure-by-design approach and deliver robust, thoroughly tested, and maintainable solutions.
Duties & Responsibilities:
Build
Pipeline Development
Develop pipelines and workflows using standard patterns with StreamSets, Kestra, dbt, and Git
Design and implement data storage and processing solutions using Snowflake
Utilize AWS services for cloud-based platform and tooling infrastructure, including but not limited to Lambda, ECS, MSK, RDS, EC2, Secrets Manager, ALB, CloudWatch, and EventBridge
Utilize Terraform for AWS and Azure deployments
Leverage and integrate APIs for data access and manipulation
Write Python scripts for common data processing and automation tasks
Leverage platform APIs and web applications to enforce platform security
Development experience with Go, SQL, C#, .NET, JavaScript, shell scripts, and container platforms such as Docker
Experience integrating with time-series source systems such as Honeywell Plant Historian Database and OSI PI
Experience with authentication mechanisms including but not limited to OAuth 2.0, OIDC, Microsoft Entra, key pair authentication, certificate-based authentication, and SAML-based SSO
Test
Quality Assurance
Create and execute comprehensive test plans to ensure pipeline functionality and performance
Develop unit tests, integration tests, and end-to-end tests for data pipelines and workflows
Ensure data accuracy and consistency through rigorous testing processes
Leverage automated testing processes to enhance efficiency
Governance
Compliance & Risk
Due to the crown-jewel nature of the enterprise data platform, Data Engineers may have access to PII, Confidential, and Most Confidential data
This role requires strict adherence to access processes and procedures to maintain data privacy and security
Identify and report any potential breaches of data, information, and systems processes
Operate
Platform Maintenance
Monitor and manage the platform to ensure optimal performance and uptime
Conduct regular maintenance tasks such as updates, patches, and backups
Resolve any issues or incidents related to the platform in a timely manner
Continuously improve platform operations through automation and optimization
Strong experience with Windows and Unix-like operating systems
Security
Secure by Design
Implement security best practices throughout the pipeline development and deployment process
Conduct regular security reviews and vulnerability assessments
Ensure data encryption, access control, and other security measures are enforced
Use credential management platforms such as Thycotic Secret Server and AWS Secrets Manager
Support
Technical Guidance
Assist in troubleshooting and resolving intricate technical issues
Deliverables
Robust and scalable data pipelines with well-documented code and processes
Comprehensive test plans and automation scripts ensuring platform reliability
Regular security assessments and compliance reports
Technical support and guidance documentation for delivery data engineers
Deliver secure, robust, and maintainable data pipelines
Ensure high-quality and thoroughly tested data solutions
Maintain compliance with security standards and best practices
Maintain compliance with the Data Lifecycle Management Process
Maintain compliance with the Data Privacy standards and best practices
Skills & Experience:
Bachelor’s or Master’s degree in Computer Science, Data Science, Information Technology, or a related field with a focus on data engineering or data analytics
Technical Expertise: Strong proficiency in programming languages such as Python, SQL, Java, or Scala for data processing and analysis
Data Engineering Skills: Experience with data modeling, ETL processes, data warehousing, data integration, and data pipeline development
Database Knowledge: Proficiency in relational databases (e.g., SQL Server, PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra)
Cloud Platform Experience: Working knowledge of cloud platforms such as Snowflake, AWS, Azure, or Google Cloud Platform for data storage and processing
Data Visualization: Experience with data visualization tools (e.g., Power BI) to create meaningful insights from data
Data Quality Assurance: Understanding of data quality principles, data governance, and data validation processes
Analytical Skills: Ability to analyze complex data sets, identify trends, patterns, and insights to drive data-driven decision-making
Problem-Solving Abilities: Proficiency in troubleshooting data-related issues, identifying root causes, and implementing solutions
Project Management Knowledge: Familiarity with project management methodologies to contribute effectively to project planning and execution
Communication Skills: Strong verbal and written communication skills to collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders
Continuous Learning: Willingness to stay updated with the latest data technologies, tools, and industry trends to enhance data engineering skills
Experience: Prior experience in data engineering, data analytics, or related roles with a track record of successful data project delivery
Technical Leadership: Ability to make informed, strategic decisions that align technology with business objectives, while balancing short-term and long-term trade-offs
Customer Focus: Deep understanding of customer needs and how to translate them into effective technical solutions that drive business value
Collaboration: Encourages collaboration across teams and stakeholders, breaking down silos and ensuring alignment
Problem-Solving: Strong analytical and problem-solving skills, capable of addressing complex technical challenges and delivering innovative solutions
Innovation: Ability to lead innovation in technology while maintaining an eye on product-market fit and user experience
Agility: Adaptability in a fast-moving environment, with a mindset focused on delivering high-impact solutions quickly and iteratively
If you think you can do this job but don’t meet all the criteria, that’s OK! Please apply. At Woodside, we value people with diverse experiences and backgrounds, as they provide unique perspectives that help us innovate.
Recognition & Reward:
What you can expect from us:
Commitment to your ongoing development, including on-the-job opportunities and formal programs
Inclusive parental leave entitlements for both parents
Values led culture
Flexible work options
Generous annual leave, sick leave and casual leave
Cultural and religious leave with flexible public holiday opportunities
A competitive remuneration package featuring performance-based incentives and an uncapped Employer Provident Fund
Woodside is committed to fostering an inclusive and diverse workforce culture, which is supported by our Values. Inclusion centres on all employees creating a climate of trust and belonging, where people feel comfortable to bring their whole self to work. We also offer supportive pathways for all employees to grow and develop leadership skills.
If you are ready to take your career to the next level and be part of a global organisation that values innovation and excellence, apply now through our careers portal or connect with our recruitment team.