Dinko Jantoš

Senior Data Engineer
Suchy Dwór

Summary

Dynamic Senior Data Engineer with expertise in ETL development and cloud architecture, honed at GlobalLogic Poland. Proven track record in optimizing data workflows and enhancing database performance, driving impactful AI solutions. Strong problem-solving skills complemented by effective cross-functional collaboration, ensuring seamless data integration and migration across diverse platforms.

Overview

9 years of professional experience
5 years of post-secondary education
3 languages

Work History

Senior Data Engineer

GlobalLogic Poland
Gdansk, Pomerania
01.2024 - Current
  • Nov 2024 - Apr 2025 Senior Data Engineer - As a Senior Data Engineer at GlobalLogic, I worked on a Generative AI Platform designed to provide chatbot-based insights into ongoing company projects. My role involved optimizing database structures (PostgreSQL, Neo4j), building ETL pipelines using Python, SQL, Azure, and GCP, and managing data workflows to support AI-driven solutions.
    Technologies: SQL, Python, Azure, GCP, GitHub, Docker.

    Key responsibilities:
    Developed ETL pipelines to transform and integrate data across Azure and GCP environments.
    Optimized and maintained database schemas (PostgreSQL, Neo4j) designed by the architecture team.
    Automated data processing workflows using Python.
    Managed version control and collaboration using GitHub, Jira, and Confluence.
    Created and maintained technical documentation in Confluence for streamlined knowledge sharing.
    This role was part of a client project under GlobalLogic, where I contributed to building an AI-powered chatbot capable of answering project-related queries. My work ensured efficient data structuring and processing to support the chatbot’s performance and accuracy.
  • Jan 2024 - Oct 2024 Senior Software Engineer - Contractor at Itoworld (contracted through GlobalLogic Poland).

    As a Senior Software Engineer focused on Data Warehouse Development, I specialize in building, optimizing, and maintaining data solutions that drive frontend applications and support business analytics. My responsibilities include developing robust data infrastructure, fixing and improving pipelines, managing data migrations, and implementing custom features using PostgreSQL and Amazon Redshift.

    Key Responsibilities:
    Database Development & Optimization: Build and maintain scalable, high-performance databases in PostgreSQL and Amazon Redshift, including creating stored procedures and optimizing SQL queries to meet frontend data requirements.
    ETL & Data Pipelines: Design, manage, and troubleshoot ETL workflows and DAGs, ensuring seamless data ingestion, transformation, and loading. Quickly address and resolve pipeline issues to maintain data integrity across systems.
    Data Migration & Transformation: Plan and execute complex data migrations with minimal downtime, adapting data structures and formats as needed to ensure compatibility and system stability.
    Feature & Reporting Enablement: Collaborate with cross-functional teams to support new feature rollouts, create new tables and structures for reporting and analysis, and deliver ad-hoc data solutions to meet specific requirements.
    Database Maintenance & Testing: Perform routine maintenance, troubleshoot performance issues, and conduct thorough testing to ensure reliable and secure data environments.
    My role is centered on enabling seamless data flows and supporting data-driven decision-making through innovative and reliable data warehouse solutions.
    Technologies: SQL, Python, AWS, Airflow, GitLab, Alembic, Docker.

AWS Data Engineer

Hapag-Lloyd AG
Gdansk, Pomerania
12.2021 - 12.2023
  • Developed and executed a comprehensive migration strategy from on-premise DB2/Informatica to a cloud-based architecture utilizing Snowflake, Informatica Intelligent Cloud Services (IICS), and AWS.
  • Key Responsibilities:
  • - Adapted and optimized existing IICS workflows for seamless data integration and ETL processes during the migration from on-premise systems, ensuring minimal disruption and enhanced performance in the cloud environment.
  • - Troubleshot and resolved issues, ensuring smooth data flow and system performance.
  • - Implemented a wide range of functionalities within Snowflake to enhance data operations and accessibility.
  • - Created Airflow DAGs for automated workflow scheduling and management.
  • - Utilized a variety of AWS services for serverless computing, event-driven architecture, and data storage, effectively leveraging the AWS ecosystem for diverse business needs.
  • - Performed testing of workflows to ensure data accuracy and compliance with business requirements.
  • - Contributed to the continuous integration and deployment pipeline using GitLab and Terraform.
  • Technologies: AWS, Snowflake, Airflow, Informatica IICS, SQL, Python, Qlik Replicate, GitLab, Terraform.

Business Intelligence Engineer

Amazon
Gdansk, Pomerania
11.2019 - 11.2021
  • Established strong working relationships with clients through exceptional communication skills, fostering trust and collaboration.
  • Achieved successful project outcomes by maintaining accurate documentation and meeting strict deadlines.
  • Developed positive working relationships with stakeholders to effectively coordinate work activities.
  • Presented technical findings to stakeholders, ensuring clear understanding of project status and goals.
  • Built and implemented automation solutions, delivering business process improvements using data analysis, NLP, API integrations, data modeling, and machine learning services.
  • Technologies: Python, MySQL, AWS, machine learning tools (Amazon Rekognition, Textract).

Automation & Emerging Technology Analyst

Refinitiv
Gdansk, Pomerania
09.2019 - 11.2019
  • Resolved complex technical issues through rigorous troubleshooting and root-cause analysis, minimizing downtime and disruptions to business operations.
  • Streamlined processes by automating repetitive tasks, saving time and resources.
  • Developed call-to-action plans to improve IT service effectiveness.
  • Provided technical and functional recommendations based on project requirements.
  • Designed complex automation solutions combining data and technology to address stakeholders' requirements.
  • Delivered business process improvements, ad-hoc reports, and dashboards using a variety of data.
  • Technologies: Python, SQLite, Cyberquery language, Neo4j graph database, Jira.

Junior Data Engineer

AirHelp
Gdansk, Pomerania
12.2016 - 09.2019
  • Member of the data team at a leader in legal tech, serving as a contact point for data analysts and extracting data for all company departments.
  • Transferred knowledge about company procedures across the IT team.
  • Managed the ELT process, scripted data quality/sanity checks, and prepared dashboards, reports, and ad-hoc analyses supporting product improvement projects.
  • Integrated internal data sources with external APIs and optimized data warehouse and BI tool performance.
  • Reengineered legacy systems with modern frameworks, reducing technical debt without compromising operational stability or performance.
  • Technologies: PostgreSQL, Periscope (BI), DataGrip, GitHub, Jira, BigQuery, Redshift, Python for API integration (Zendesk), MS Office.

Apprentice Data Engineer

AirHelp
Gdansk, Pomerania
06.2016 - 11.2016
  • Built strong relationships with team members and supervisors, fostering a positive work environment.
  • Enhanced practical skills by assisting experienced professionals in various tasks, gaining valuable insights into best practices within the field.

Education

Postgraduate Studies - Data Engineering - Data Science

Gdansk University of Technology
Gdansk, Poland
10.2019 - 06.2020

Master's Degree - Slavic Philology

University of Gdansk
Gdansk, Poland
10.2014 - 06.2016

Licentiate Degree - Slavic Philology

University of Gdansk
Gdansk, Poland
10.2011 - 06.2014

Skills

ETL development

Data warehousing

Cloud architecture

Data migration

Python scripting

Data pipeline design

Advanced SQL

NoSQL databases

Data quality assurance

Business intelligence

Relational databases

Timeline

Senior Data Engineer

GlobalLogic Poland
01.2024 - Current

AWS Data Engineer

Hapag-Lloyd AG
12.2021 - 12.2023

Business Intelligence Engineer

Amazon
11.2019 - 11.2021

Postgraduate Studies - Data Engineering - Data Science

Gdansk University of Technology
10.2019 - 06.2020

Automation & Emerging Technology Analyst

Refinitiv
09.2019 - 11.2019

Junior Data Engineer

AirHelp
12.2016 - 09.2019

Apprentice Data Engineer

AirHelp
06.2016 - 11.2016

Master's Degree - Slavic Philology

University of Gdansk
10.2014 - 06.2016

Licentiate Degree - Slavic Philology

University of Gdansk
10.2011 - 06.2014