
DevOps Engineer, Big Data Cloud (AWS)

Discipline: Data and Analytics
Business unit: Enterprise Services
Location: Melbourne
Employment type: Full-time, Permanent

We are looking for a DevOps Engineer to join an established data, analytics and cloud practice across a number of projects. You will be responsible for building, improving and maintaining the cloud and big data infrastructure, enabling services to be deployed, run and operated from development through to production. You will have the chance to work on a variety of cloud and big data projects with highly robust and scalable infrastructure, using some of the most cutting-edge systems and tools, including AWS, Terraform and Docker.

Essential Skills and Experience

  • Expert skills in Amazon Web Services (EC2, S3, EMR, ELB, Elastic Beanstalk, VPC, Kinesis, Route 53, security groups)
  • Solid background in DevOps engineering (load testing, continuous integration, change management, application monitoring, production support)
  • Strong experience with automation and configuration management tools (Jenkins, Ansible, Chef)
  • Experience with scripting languages (Python, Scala, Go)
  • Expert skills in UNIX
  • Good understanding of networks and information security
  • Ability to manage and work in a fast-paced, dynamic work environment
  • Any Data Warehousing, Big Data or Data Analytics experience is a significant advantage, but not essential


Our Consultants are experts in big data, data engineering and connected cloud systems. All possess excellent high-level consulting skills, combined with the ability to understand a client's business problem, design solutions, and guide the implementation of those solutions.

If you are interested in gaining experience on leading-edge Cloud and Analytics projects, or would like to know more about the work we do at Infoready, please apply online with your resume and we will be in contact to discuss your application.
