mid-senior data engineers in melbourne cbd

job details

melbourne cbd, victoria
03 8319 1271

job description

Our client is a highly recognised player in the financial sector and is going through an enterprise-wide transformation.

As part of this transformation, they are actively seeking a highly experienced Enterprise Software Data Engineer to join the team.

Key focus of the role includes:

A solid understanding of programming languages such as Python, Java and Scala, as well as shell scripting. Experience with infrastructure tools such as Terraform and Ansible, and with CI/CD pipelines, is highly regarded. Hands-on experience with AWS services is also required, particularly with a focus on configuring a secure environment.

Responsibilities will involve:

  • Write and optimise complex queries using both SQL and NoSQL paradigms.
  • Work with different data types, e.g. streaming, real-time, file-based, RDBMS, unstructured data, etc.
  • Promote AWS and cloud best practices to maximise compute performance while minimising infrastructure costs.
  • Design and develop ‘infrastructure as code’.
  • Keep up to date with AWS and other cloud providers.
  • Work on software and data engineering for bespoke projects.

Key skills and experience:

  • Experience working with large datasets in a restrictive environment (considerations around privacy, limits around user access, encryption, etc.).
  • Demonstrable experience with Hadoop-ecosystem technologies, e.g. Spark, PySpark, Kafka, Hive, Flume, Hue, Sqoop, etc.
  • Experience working with, and configuring, AWS services, preferably in a production environment.
  • Experience with DevOps and Agile – with demonstrated ability to drive continuous improvement.
  • Ability to mentor less experienced colleagues.
  • Excellent communication and interpersonal skills, both oral and written.
  • Ability to assess and implement new technologies and processes.
  • An open mindset and proven ability to innovate and influence.

Preference will be given to candidates with the following:

  • Administration of Hadoop ecosystem.
  • Data ingestion technologies and capturing meta-data and data lineage.
  • Experience with ‘infrastructure as code’, e.g. Terraform, Ansible.
  • Experience with shell scripting.
  • Experience with configuring secure environments with AD groups, SSO, SAML, etc.
  • Experience of Reporting and Analytics, and/or experience working with analysts and data scientists.
  • Experience using productivity and collaboration tools such as JIRA and Confluence in a software delivery environment.
  • Postgrad qualifications and self-learning courses and certifications (Coursera, Udacity, AWS, etc.) highly regarded.

Key skills required for the role:

  • More than 6 years' experience in the build and deployment of high-volume transactional systems
  • Strong experience in AWS
  • Strong experience in continuous integration environments
  • Well versed in CI tools such as Jenkins and Bamboo
  • Experience with, or strong knowledge of, Chef, Puppet and Ansible
  • Strong experience with automated/scripted infrastructure
  • Experience working in educational institutions highly advantageous
  • Experience building with containerisation technologies such as Docker, and similar technologies such as Kubernetes or OpenShift, ideal

This is an exciting opportunity to join a team of talented peers who are professional and passionate about what they do.

Previous experience working in a high-pressure, delivery-oriented environment is a must, along with the ability to work on technical builds and deal with vendors.

If you think you have the right skills and experience, please submit a cover letter justifying your suitability for the role with regard to the above-mentioned points (e.g. 7yrs Agile Scrum, 2yrs BPMN) together with your resume (both in MS Word format) and click on the ‘apply now’ tab.

Ref: 90M0364533

At Randstad, we are passionate about providing equal employment opportunities and embracing diversity to the benefit of all. We actively encourage applications from any background.


Data Engineer, Python, Hadoop, Java, AWS, CI / CD


educational requirements

Bachelor Degree (Bachelor of Science)