Middle/Senior Data Engineer - CHI Software - Closed
The CHI Software team is not standing still. We love our work and give it one hundred percent! Every new project is a challenge we take on successfully. The only thing that can stop us is… Wait, it’s nothing! The number of projects keeps growing, and our team is growing with it. Right now we are looking for a Middle/Senior Data Engineer.
The ideal candidate will be deeply involved in discovering, analyzing, and assembling large and complex data sets. They will design and build our big data infrastructure, ensure data security, and optimize the performance of our big data platforms, primarily within the AWS or Azure cloud environment.
Requirements:
- Bachelor’s degree in Computer Science or a related field.
- 3-4 years of experience in data engineering and building data platforms.
- Demonstrated success in delivering cloud and SaaS products, with expertise in building optimized processing pipelines for streaming analytics applications.
- Proficiency in databases and query optimization (PostgreSQL, Elasticsearch, MongoDB, Redis, Druid), including experience with NoSQL and graph databases.
- Experience in horizontally scaling databases.
- Expertise in Kafka and Airflow, with a strong understanding of runtime profiling tools (see the short Kafka sketch after this list).
- Experience with big data processing systems such as Apache Spark, Flink, or Beam.
- Skills in build automation and continuous integration/deployment (CI/CD) tools such as Webpack, Buddy, Jenkins, and Docker.
- Expert-level Python coding skills.
- Cloud Platform: AWS or Azure.
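To give a flavor of the streaming side of this stack, here is a minimal sketch of a Python Kafka consumer feeding a lightweight transform. It assumes the kafka-python client; the topic, broker address, consumer group, and enrichment logic are hypothetical placeholders used only to illustrate the skill set, not part of CHI Software's actual pipeline.

```python
# Minimal sketch: consume JSON events from Kafka and apply a simple transform.
# Assumes the kafka-python client; topic, broker, and group names are hypothetical.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "public-events",                      # hypothetical topic name
    bootstrap_servers="localhost:9092",   # hypothetical broker address
    group_id="streaming-analytics-demo",  # hypothetical consumer group
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

def enrich(event: dict) -> dict:
    """Example transform: tag each event with a coarse source category."""
    source = event.get("source", "unknown")
    event["source_category"] = "social" if source in {"twitter", "reddit"} else "web"
    return event

for message in consumer:          # blocks and yields messages as they arrive
    record = enrich(message.value)
    print(record)                 # a real pipeline would forward this downstream
```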
Will be a plus:
- Technical background in AI and ML.
- Experience in designing and implementing interactive, query-driven human-machine intelligence systems.
- Experience in working with distributed teams.
Responsibilities:
- Design and implement real-time distributed data processing systems that analyze public data and detect emergent threats.
- Develop and optimize ETL processes for various data formats from social media, news, and web sources (a rough Airflow-style sketch follows this list).
- Design and implement robust database systems and develop tools for query and analytic processing, focusing on real-time streaming applications.
- Spearhead build automation, continuous integration, deployment, and performance optimization efforts, upholding our strict security requirements.
- Design test suites and implement inline instrumentation to ensure data correctness.
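As a rough illustration of the ETL responsibility above, the sketch below outlines an hourly Airflow DAG with extract, transform, and load steps. The DAG id, schedule, and task functions are hypothetical placeholders, and the syntax assumes Airflow 2.x; it shows the shape of the work rather than an actual CHI Software pipeline.

```python
# Minimal sketch of an hourly ETL DAG (Airflow 2.x style).
# DAG id, schedule, and the three callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    """Pull raw items from social media, news, and web sources (stubbed)."""
    return ["raw item 1", "raw item 2"]

def transform(**context):
    """Normalize the raw items into a common schema (stubbed)."""
    raw = context["ti"].xcom_pull(task_ids="extract")
    return [item.upper() for item in raw]

def load(**context):
    """Write the normalized records to the target store (stubbed)."""
    records = context["ti"].xcom_pull(task_ids="transform")
    print(f"loading {len(records)} records")

with DAG(
    dag_id="public_data_etl_demo",       # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",                  # 'schedule_interval' in older Airflow versions
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```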