Middle/Senior Data Engineer with AWS

The CHI Software team is not standing still. We love our job and give it one hundred percent! Every new project is a challenge we face successfully. The only thing that can stop us is… Wait, it's nothing! The number of projects keeps growing, and our team is growing with it. And now we need a Middle/Senior Data Engineer with AWS.

About the Project

You will join a team working on cutting-edge data solutions for enterprise-scale analytics. The project involves developing and optimizing data pipelines, cloud-based data lakehouses, and modern data platforms. It focuses on high-performance, scalable infrastructure using AWS and Snowflake to support robust data processing and analytics capabilities.

Requirements:

  • 4–5+ years of commercial experience as a Data Engineer;

  • Solid experience with AWS cloud infrastructure, including S3, EC2, RDS, Redshift, Glue, Lambda, and CloudWatch;

  • Hands-on development of ETL/ELT pipelines;

  • Strong proficiency in Python and SQL;

  • Experience with Snowflake (Streams, Tasks, Performance tuning);

  • Hands-on experience with Apache Spark / PySpark;

  • Workflow orchestration with Airflow or similar tools;

  • Understanding of CI/CD practices (GitHub Actions, GitLab CI, AWS CodePipeline);

  • Solid knowledge of DWH and Data Lake architectures;

Will be a plus:

  • Knowledge of dbt;

  • Experience with Presto, ClickHouse, Kafka, Kinesis, or Flink;

  • Familiarity with MLflow or other ML pipeline frameworks;

  • Experience with Docker, Kubernetes, Terraform, AWS CDK;

  • Skills in BI/analytics tools such as Superset, Tableau;

  • Understanding of stream data processing;

  • Relevant certifications:

    • AWS Certified Data Analytics – Specialty

    • Snowflake SnowPro Core

    • Databricks Certified Data Engineer Associate

Responsibilities:

  • Design and build scalable ETL/ELT pipelines;

  • Optimize data processing workflows and automate key processes;

  • Develop and maintain data models for analytics and reporting;

  • Integrate new data sources and ensure data quality;

  • Monitor and troubleshoot data pipeline performance;

  • Collaborate with data analysts, engineers, and business stakeholders;

  • Contribute to architecture decisions and infrastructure improvements;

  • Support continuous improvement through CI/CD and DevOps best practices.

Our perks

  • Covered vacation period: 20 business days and 5 days off
  • Free English classes
  • Flexible working schedule
  • Truly friendly and supportive atmosphere
  • Work remotely or in one of our offices
  • Medical insurance for employees from Ukraine
  • Compensation of psychological counseling
  • Legal support
  • Relocation assistance

Your dream job awaits you
Apply now!
