December 17, 2025

Senior Data Engineer

Senior • Hybrid

150 - 175 PLN/hr

Warsaw, Poland

Information about the project:

Location: Hybrid (3 days per week in the Gdansk/Gdynia or Warsaw office)

Industry: Banking

Rate: 175 PLN/h net + VAT


Introduction & Summary


We are seeking an experienced Senior Data Engineer to support a highly available application that delivers data to multiple downstream systems. The ideal candidate will possess a strong development background along with expertise in Oracle DB, SQL/stored procedures, ETL processes, modern integration patterns, and DevOps practices.


Main Responsibilities


  • Design, build, and maintain scalable ELT pipelines.

  • Develop and optimize PL/SQL queries and database objects for enhanced performance.

  • Write Python-based automation scripts for data processing.

  • Implement CI/CD pipelines for data infrastructure and workflow automation.

  • Manage database changes using effective change management practices.

  • Work with one or more major cloud platforms.

  • Integrate and manage the modern data stack efficiently.

  • Monitor, debug, and optimize data workflows and infrastructure performance.

  • Ensure data quality, governance, and compliance with industry best practices.


Key Requirements


  • Proficiency in designing OLAP systems using techniques such as star schema modeling or denormalization.

  • Expertise in Python and PL/SQL development.

  • Experience in ELT/ETL development using GUI-based tools (e.g., IBM DataStage, ADF, Stitch, Matillion).

  • Solid knowledge of database management with Oracle, MSSQL, and Snowflake.

  • Understanding of DevOps and CI/CD practices using tools such as Git, Docker, Kubernetes, Terraform, Jenkins, Bamboo, and Azure DevOps.

  • Experience with test automation tools such as Robot Framework and Selenium, including familiarity with various test scenarios.

  • Familiarity with version control technologies such as Git, GitHub, and GitLab.

  • Experience in workflow orchestration tools (e.g., Airflow, Prefect).

  • Knowledge of streaming and batch processing with Kafka.


Nice to Have


  • Experience in data governance frameworks.

  • Knowledge of advanced data modeling techniques.

  • Experience with other cloud platforms.