Data Engineer

Join us in building a virtual bank from scratch using cloud-native technologies. We're a fast-growing team solving exciting problems and delivering high-quality products in small, interdisciplinary teams. We're applying lessons learned at Google, Twitter and Netflix to build the next generation of banking.

Responsibilities:

  • Collaborate closely with our development and product teams in our fast-paced delivery environment
  • Design, build and maintain modern, automated, cloud-native analytics infrastructure
  • Build and manage data warehouses, databases and data pipelines
  • Understand and translate business needs into data models that support long-term solutions. Work with the development team to implement data strategies, build data flows and develop conceptual, logical and physical data models that ensure high data quality and reduce redundancy
Requirements:

  • Experience with container management and orchestration tools such as Docker and Kubernetes
  • Experience with monitoring tools such as the Elastic Stack, Prometheus and Grafana
  • Breadth of knowledge across operating systems, networking, distributed computing and cloud computing
  • Familiarity with big data technologies (e.g. Amazon Redshift, Panoply), ETL tools (e.g. StitchData, Segment) and machine learning technologies and environments
  • Knowledge of best practices for building modern data lakes, data warehouses and data pipelines
  • Good understanding of the relevant technologies and experience in building a highly scalable, fault-tolerant cloud data platform
  • Self-starter, capable of working without direction and able to deliver projects from scratch
  • Good practical experience and knowledge of building and maintaining data warehousing/big data tools: Hadoop and MapReduce, Apache Spark and Spark SQL, Hive
  • In-depth database knowledge of RDBMSs (PostgreSQL, MySQL) and NoSQL stores (HBase)
  • Strong experience building and maintaining cloud big data and ETL tools: Google Bigtable, BigQuery and Apache Airflow (Google Cloud Composer)
  • Strong knowledge of and experience with Apache Beam for implementing batch and streaming data processing jobs; strong development background in Python or Java
  • Strong knowledge of messaging systems such as Apache Kafka, RabbitMQ and Google Cloud Pub/Sub
  • Experience with Agile/Lean projects (Scrum, Kanban, etc.)
  • Practical knowledge of branching strategies such as Git flow, trunk-based development and GitHub flow
  • Strong English communication skills