
Professional Platform Engineer – Azure Data Engineering
- Remote, Hybrid
- Wien, Austria
As a Professional Platform Engineer – Azure Data Engineering, you independently design and implement scalable data pipelines on Azure or in Microsoft Fabric.
Job description
We digitize decisions with data—would you like to get involved?
We shape our customers' data-driven future with scalable, secure, and automated Azure platforms. As a Professional Platform Engineer – Azure Data Engineering, you will be responsible for complex data pipelines: from zero-ETL replication of operational data (mirroring) to Delta Lake-based lakehouses. Our vision: self-service data access for all departments, supported by DataOps and governance.
Tasks & Responsibilities:
Planning, development, and maintenance of robust batch and streaming pipelines with Azure Data Factory, Microsoft Fabric, and Databricks
Integration of new platform features such as mirroring (zero ETL) in Microsoft Fabric, as well as Delta Live Tables and Unity Catalog in Databricks for centralized governance
Automation of deployments and testing using CI/CD (Azure DevOps or GitHub Actions) and infrastructure as code (Bicep/Terraform)
Ensuring data quality, lineage, and security—using Microsoft Purview and role-based access control
Collaborating with data scientists, product owners, and customers to translate requirements into scalable data solutions
Evaluating new services such as Lakeflow or Microsoft Fabric for productive use
What we offer you
Flexible working: Trust-based working hours, hybrid working, and remote working possible (residence in Austria)
State-of-the-art technology stack: Work with Fabric, Delta Lake, Databricks, Mirroring—ideal for tech-savvy juniors
Targeted further training: Working hours for training, certifications, and mentoring
Innovation space: Opportunity to test new tools and frameworks and develop proof-of-concepts
Diversity & inclusion: We welcome all applicants and promote an inclusive environment; your ideas are important to us
Real influence: You can help shape technology decisions and contribute your ideas directly to product roadmaps
If you want to take on responsibility, enjoy working with the latest Azure technology, and value an open, learning-oriented team culture, we look forward to receiving your application!
Job requirements
2–5 years of experience in data engineering, preferably with Azure Data Services (Data Factory, Databricks, MS Fabric)
Very good SQL and Python skills; experience with Delta Lake, streaming (Event Hubs, Kafka), and data modeling
Knowledge of CI/CD and DevOps processes as well as cloud security and governance
Solution-oriented thinking, strong communication skills, and willingness to learn new things
Very good German and good English skills
