1. Design, develop, and maintain scalable data pipelines and ETL processes to ingest, transform, and load data from various sources into our data warehouse or data lake.
2. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and implement solutions that support business insights and decision-making.
3. Optimize data pipelines and processes for performance, reliability, and scalability.
4. Design and implement data models, schemas, and metadata to support data governance and analytics requirements.
5. Monitor and troubleshoot data processing jobs, performance issues, and data quality problems.
6. Ensure data security, privacy, and compliance with regulatory requirements.
7. Stay updated on emerging technologies and best practices in data engineering, and contribute to the continuous improvement of our data infrastructure and processes.
8. Document technical specifications, data flows, and system architecture.
9. Experience with data security in the data management domain, especially at the data lake level.
10. Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
11. Experience with cloud platforms (AWS, Azure, or GCP) and familiarity with DevOps practices.
Required Skills:
- Fluent in Arabic
- 8+ years of experience in data engineering
- Strong understanding of data pipelines, ETL processes, and data modeling
- Proficiency in SQL, Python, and tools such as Apache Airflow, Spark, or Power BI
- Experience with cloud platforms (AWS, GCP, or Azure)
- Familiarity with data warehousing and data lake concepts
- Knowledge of data quality, governance, and security best practices
- Ability to work with both structured and unstructured data
- Strong problem-solving and communication skills
- A detailed record of relevant experience must be attached to the CV
Qualifications:
Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
Arabic speaker
5+ years of experience in data engineering or a related field.
Proficiency in programming languages such as .NET Core, Java, Scala, or equivalent.
Strong understanding of data modeling, database design, and SQL.
Experience with big data technologies such as Hadoop, Spark, Kafka, or equivalent.
Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud.