
December 17, 2025

Senior Data Engineer

Senior • Hybrid

170 - 180 PLN/hr

Warsaw, Poland

Who are we?

Lumicode Sp. z o.o. is part of the Pentacomp Group — a producer of IT solutions and a provider of professional IT services for large enterprises and the public sector.

As Pentacomp, we create IT solutions that combine innovation with decades of experience — and we have quite a lot of it. We have been present on the market for nearly 30 years and can proudly point to numerous successfully delivered projects.



What we offer:

  • B2B contract

  • Rate up to 180 PLN/h net (B2B)

  • Hybrid work model from Gdańsk/Gdynia/Warsaw/Łódź — 2 days per week in the office

  • Access to a sports card and private medical care

  • Long-term project



Preferred locations in Poland: Gdańsk, Gdynia, or Warsaw

Work model: Hybrid — 2 days per week in the office.


We are seeking a Senior Data Engineer to support a highly available application responsible for delivering data to multiple downstream systems. The ideal candidate will have a solid software engineering background combined with in-depth expertise in Oracle databases, SQL/stored procedures, ETL/ELT processes, modern integration patterns, and DevOps methodologies.


Must-have skills and experience:


  • Strong knowledge of OLAP systems, including STAR schema design and denormalization techniques.

  • Proficiency in Python and PL/SQL.

  • Hands-on experience with ETL/ELT tools (GUI-based), such as IBM Datastage, ADF, Stitch, Matillion.

  • Experience in database management with Oracle, MSSQL, and Snowflake.

  • Understanding of DevOps and CI/CD practices: Git, Docker, Kubernetes, Terraform, and tools like Jenkins, Bamboo, or Azure DevOps.

  • Test automation experience (e.g., Robot Framework, Selenium) and familiarity with various testing scenarios.

  • Solid understanding of version control using Git (GitHub, GitLab).

  • Experience with workflow orchestration tools (Airflow, Prefect).

  • Knowledge of both streaming and batch processing, including technologies such as Kafka.
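To give candidates a concrete sense of the list above, here is a minimal, self-contained Python sketch of the core move behind STAR-schema reporting: denormalizing a fact table against a dimension table. All table names, columns, and data are hypothetical, not taken from the project.

```python
# Hypothetical example: denormalize fact rows against a dimension
# (the join at the heart of a STAR schema), done here as a hash lookup.

fact_sales = [  # fact rows keyed to the dimension by product_id
    {"date": "2025-01-01", "product_id": 1, "units": 3},
    {"date": "2025-01-02", "product_id": 2, "units": 5},
]

dim_product = {  # dimension table keyed by surrogate id
    1: {"name": "widget", "category": "hardware"},
    2: {"name": "gizmo", "category": "hardware"},
}

def denormalize(facts, dim):
    """Attach each fact row's dimension attributes, producing wide rows."""
    out = []
    for row in facts:
        attrs = dim[row["product_id"]]
        out.append({**row, **attrs})
    return out

wide = denormalize(fact_sales, dim_product)
print(wide[0]["name"])  # widget
```

In a real pipeline this join would typically run as SQL in Oracle or Snowflake, or as an ETL-tool step, but the shape of the transformation is the same.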


Key responsibilities:


  • Design, build, and maintain scalable ELT pipelines.

  • Develop and optimize PL/SQL queries and database components to ensure efficient data storage in Oracle.

  • Create Python-based data engineering scripts for data processing.

  • Implement CI/CD pipelines to support data infrastructure and workflow automation.

  • Manage and execute database change processes.

  • Work with one or more major cloud providers.

  • Integrate and maintain components of a modern data stack.

  • Monitor, troubleshoot, and improve the performance of data workflows and related infrastructure.

  • Ensure adherence to data quality standards, governance practices, and compliance requirements.