Databricks Platform Engineer
GIT Consult
- Prague
- Permanent employment
- Full-time
- Manage workspace-level settings including clusters, libraries, compute policies, and access controls.
- Define and maintain catalogs, schemas, and tables across workspaces using Unity Catalog.
- Ensure the data platform is secure, scalable, and highly available.
- Contribute to the implementation of the Data Mesh paradigm across domains.
- Act as a technical advisor to data scientists, analysts, and business users.
- Drive onboarding, training, and enablement programs to scale platform adoption.
- Maintain technical documentation and contribute to Agile ceremonies and sprint planning.

Requirements

- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- 5+ years of experience in cloud big data platforms, preferably Azure or GCP.
- Hands-on experience with Databricks, Delta Lake, and Spark; familiarity with Kafka, Flink, or similar frameworks is a plus.
- Experience building data pipelines using tools like Apache Airflow, Data Factory, or Apache Beam.
- Proficiency in Terraform, Python, and CI/CD tools such as GitHub Actions.
- Strong understanding of data management, monitoring, security, and privacy.
- Excellent communication and collaboration skills across cross-functional teams.
- Fluent in English; additional language skills are a plus.

Our offer

- Bonuses
- Work mostly from home
- Flexible start/end of working hours
- Contributions to pension / life insurance
- Contributions to sport / culture / leisure
- Education allowance
- Individual budget for personal growth
- Educational courses and training
- Transport allowance
- Meal tickets / catering allowance
- Cafeteria
- Refreshments in the workplace
- Corporate events
- 5 weeks of holiday
- Sick days
- Company laptop