Data Engineer with GCP

The CHI Software team never stands still. We love our work and give it one hundred percent! Every new project is a challenge we take on successfully. The only thing that can stop us is… Wait, nothing can! The number of projects keeps growing, and our team is growing with it. Right now, we need a Data Engineer with GCP experience.

We are looking for an experienced Data Engineer to apply their skill set in a fast-paced financial trading environment. You will be responsible for designing, developing, and maintaining data pipelines and infrastructure across GCP and Oracle-based systems, ensuring high-performance data flows that support trading, risk, and compliance functions. This role requires a deep understanding of cloud architecture, large-scale data processing, and a strong grasp of agile delivery practices.


Requirements

About the client

An international consulting firm that helps companies of all sizes have a better impact on the world. Company capabilities focus on supporting the private and public sectors with their people, processes, and digital technology challenges.

What You Will Do

● Design, build, and maintain scalable data pipelines primarily within Google Cloud Platform
● Develop integration and transformation workflows between cloud data services and on-prem Oracle databases
● Work closely with trading, risk, and analytics teams to understand data requirements and deliver real-time and batch data solutions
● Optimise and monitor performance of data systems to support latency-sensitive trading applications
● Collaborate with cross-functional teams using Agile/Scrum methodologies to deliver business-critical data projects
● Ensure robust data governance, lineage, and compliance (including MiFID II, FCA, and other regulatory standards)
● Automate data workflows using Terraform, CI/CD pipelines, and containerisation tools (Docker/Kubernetes)

What you will bring

● Strong experience with Google Cloud Platform (GCP), e.g. BigQuery, Dataplex, Pub/Sub, Dataflow, and dbt
● Expertise in Oracle SQL, PL/SQL, and working with complex stored procedures and large datasets
● Proficiency in programming languages such as Python, Java, or Scala
● Experience with CI/CD tooling (e.g. Jenkins, GitLab) and container orchestration (Kubernetes)
● Experience working with Infrastructure as Code, preferably Terraform, to manage cloud data infrastructure
● Deep understanding of data modelling, data warehousing, and ETL/ELT design patterns
● Familiarity with Agile development practices (Scrum, Kanban, Jira)
● Exposure to financial markets, trading systems, or related high-performance environments is a strong plus

Nice to have

● GCP certification (e.g. Professional Data Engineer or similar)
● Experience with Amazon Web Services (AWS)
● Knowledge of regulatory reporting, market data, or trade surveillance systems
● Experience with Apache Airflow, DBT, or similar orchestration tools
● Understanding of data security practices and compliance frameworks

Our perks

● Covered vacation period: 20 business days and 5 days off
● Free English classes
● Flexible working schedule
● Truly friendly and supportive atmosphere
● Working remotely or in one of our offices
● Medical insurance for employees from Ukraine
● Legal support

Your dream job awaits you

Apply now!