DevOps Engineer – Data

We expect candidates to have in-depth experience with a subset of the skills and technologies below, along with a willingness to fill any knowledge gaps on the job. More importantly, we seek people who are highly logical, who balance respect for best practices with their own critical thinking, who adapt to new situations, and who can work independently and deliver projects end-to-end. Other essential attributes include fluent communication in English, strong collaboration with teammates and stakeholders alike, eagerness to be part of a high-performing team, and a desire to take their career to the next level with us.

Responsibilities
  • Build modern, automated, cloud-native, continuously delivered distributed data infrastructure.

  • Collaborate closely with our data, development and product teams in our fast-paced delivery environment.

  • Actively engage in improvement and maintenance of our data storage infrastructure.

  • Participate in the on-call rotation to ensure the monitoring and reliability of our data infrastructure.

Requirements
  • Knowledge of technology best practices for building a modern automated DevOps platform.

  • Good understanding of relevant technologies and experience building highly scalable, fault-tolerant systems.

  • Self-starter, capable of working with minimal supervision and able to deliver projects from scratch.

  • DevOps toolchain experience: GitHub, CircleCI, Artifactory, SonarQube, etc.

  • Experience with configuration management and continuous deployment tools – for example Spinnaker, Ansible, or Chef.

  • Strong scripting skills – Python and Bash.

  • Cloud computing deployment and management experience – ideally AWS with HashiCorp Terraform, or another public cloud such as GCP or Azure.

  • Container management and container orchestration experience – Docker, Kubernetes.

  • Experience with monitoring tools – Elastic Stack, Prometheus, Grafana, Datadog.

  • Good practical knowledge of relational database administration, PostgreSQL preferred.

  • Strong knowledge of Linux/UNIX and TCP/IP networking.

  • Strong understanding of and experience with Agile/Lean methodologies – Scrum, Kanban, etc.

  • Practical knowledge of Git Flow, trunk-based development, and GitHub Flow branching strategies.

  • Strong verbal and written English communication skills.


Nice to have

  • Experience with messaging systems, preferably Kafka, alternatively Solace/RabbitMQ/AWS Kinesis.

  • Exposure to other modern cloud data technologies – Spark/EMR, DynamoDB/Cassandra, S3/blob stores, Redis/ElastiCache.