Join a global airline where your career can truly take off. Here, you’ll be part of an international team, shaping the future of travel while enjoying growth opportunities, training, and a dynamic work environment. Be part of a company that values people as much as destinations.
We are looking for a Data Engineer who can build data pipelines, industrialize machine learning and operations research models, and replace legacy data warehousing systems with state-of-the-art data lake solutions. As a Data Engineer, you will apply your skills to change the way our partner company works, driving innovation and transforming it into a truly data-driven organization. We value both technical expertise (analysis, design and architecture, engineering, quality assurance, and DevOps) and the interpersonal parts of the job (coaching, consulting, coordination, and support).
You will be part of the SkyAI team, one of the company's first data teams: a young, ambitious group within the Flight business platform. The team consists of data scientists, data engineers, and BI developers who collaborate according to the scaled agile way of working.
• Proven experience with one of the major cloud platforms: Google Cloud Platform (GCP), Azure, or AWS. (must have)
• Demonstrated expertise in writing, optimizing, and troubleshooting complex SQL queries for large-scale data processing and analytics. (must have)
• Strong proficiency in Python, with experience developing automation scripts for data engineering workflows. (must have)
• Hands-on expertise in building data pipelines using Dataform or dbt.
• Hands-on experience with BigQuery and other GCP data services. (a plus)
• Solid understanding of data modeling concepts, including Data Vault and dimensional modeling techniques.
• Experience implementing and maintaining data quality frameworks and data testing.
• Proficiency in writing and maintaining unit tests for data pipelines and transformations.
• Experience with CI/CD processes and deployment automation.
• Working knowledge of Terraform and Infrastructure as Code (IaC) for deploying and managing cloud data infrastructure. (must have)
• Experience with data governance, security, and privacy best practices (e.g., policy tags, taxonomy IDs).
• Proficiency with Git for version control and collaborative development.
• Excellent communication and stakeholder management skills, with the ability to collaborate effectively with product owners and cross-functional teams.
• Experience working in Agile environments, with proficiency in using Jira for project and task management.
• Ability to write clear technical documentation and communicate technical concepts to both technical and non-technical stakeholders.
It is considered a plus (not a requirement) if you have:
- Knowledge of, and hands-on experience with, the Hadoop ecosystem
- Knowledge of Enterprise Architect, ArchiMate, or a similar modeling technique
- Experience in building production applications with Apache Spark
- Affinity with Machine Learning and/or Operations Research concepts