Databricks with SAP BO:
Key Responsibilities
Design, build, and maintain scalable ETL/ELT pipelines using Databricks (PySpark, Delta Lake, SQL Warehouse)
Transform and curate data into bronze, silver, and gold layers following medallion architecture best practices
Publish and expose gold layer datasets through Databricks SQL Warehouse for consumption by SAP BO
Collaborate with BO developers to ensure semantic layer alignment
Conduct data validation and reconciliation between Databricks outputs and BO report datasets
Optimize data models, queries, and partitions for performance, cost, and scalability
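The validation and reconciliation responsibility above can be illustrated with a minimal sketch: comparing row counts and an order-independent checksum between a gold-layer extract and a BO report extract. The dataset contents, function names, and the two-check report structure here are illustrative assumptions, not part of the posting; in practice the extracts would come from a SQL Warehouse query and a BO data provider.

```python
from hashlib import sha256

# Hypothetical extracts: one from the Databricks gold layer, one pulled
# from the corresponding SAP BO report. Real extracts would be queried,
# not hard-coded.
gold_rows = [("ACC-001", 1250.00), ("ACC-002", 980.50), ("ACC-003", 430.25)]
bo_rows = [("ACC-001", 1250.00), ("ACC-002", 980.50), ("ACC-003", 430.25)]

def checksum(rows):
    """Order-independent fingerprint of a dataset for reconciliation."""
    digest = sha256()
    for row in sorted(rows):
        digest.update(repr(row).encode())
    return digest.hexdigest()

def reconcile(source, target):
    """Compare row counts and content checksums between two extracts."""
    return {
        "row_count_match": len(source) == len(target),
        "checksum_match": checksum(source) == checksum(target),
    }

print(reconcile(gold_rows, bo_rows))
# → {'row_count_match': True, 'checksum_match': True}
```

Sorting before hashing makes the comparison insensitive to row order, which matters because BO reports and warehouse queries rarely return rows in the same sequence.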
Required Skills and Experience
5 years of experience with Azure Databricks, PySpark, Delta Lake, and SQL Warehouse
Proficiency in SQL and data modelling (star and snowflake schemas)
Familiarity with SAP BusinessObjects universes and report structures; able to validate and support BO data consumption
Experience working in banking or financial data environments preferred