Desired profile:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a similar technical field
• 5+ years of experience as a data engineer, data architect, or in a similar role
• Fluent in English (German / Czech / Slovak is a plus)
• Advanced database systems knowledge (SQL, NoSQL, database optimization techniques)
• Strong knowledge of data modeling techniques and methodologies (dimensional, relational, etc.)
• Solid hands-on experience with ETL/ELT processes and data pipeline development
• Extensive experience with modern data warehouses (BigQuery, Snowflake, Redshift)
• Advanced Python knowledge (OOP, design patterns, testing, good coding practices and popular data processing libraries)
• Experience using and/or building APIs and integrating data from various sources
• Knowledge of orchestration tools such as Dagster or Airflow
• Knowledge of cloud data solutions (preferably GCP, but AWS or Azure are also fine)
• Strong analytical skills and the ability to understand complex data structures and produce efficient data solutions
• Excellent communication skills and the ability to explain complex technical concepts to non-technical stakeholders
Beneficial:
• Experience with agile ways of working (Scrum, Kanban)
• Working knowledge of GitLab CI/CD
• Working knowledge of containerization technologies (Docker, Kubernetes) and related deployment tools (Helm, Jenkins)
• Experience with data visualization tools (e.g. Power BI, MicroStrategy)
• Experience developing and operating machine learning applications