ETL Developer

Dubai, United Arab Emirates

Job Description

CAFU has the vision to create a better world through connected mobility, serving both consumers and businesses. Every day, we create convenience in motion for our customers in communities across the region. We started as an on-demand fuel delivery app and have since extended our services to car wash, tire change, and battery change... but this is only the beginning. We have big global plans to become a super app with products and services that are agnostic of fuel and mobility.

CAFU is a technology business passionate about sustainability and about positively impacting the communities we operate in. We have already won awards for best AI platform and were voted number one in LinkedIn's Top Startups 2021! We encourage community involvement by using in-house technologies to work for the environment. CAFU signed up to the SDG Ambition Accelerator program, setting ambitious sustainability goals and developing a technology strategy to improve measurement of, and performance towards, those goals, in line with United Nations Sustainable Development Goal 13 on Climate Action.

If you want to make a real impact on our communities, are forever curious about how and why things work, and want to join a company that moves with purpose and collaborates, then we want to hear from you!

You will be an individual contributor on the Data Engineering team, with data pipeline development as your primary responsibility: writing new pipelines and revamping existing ones. You will collaborate with other members of the data team, and with other teams, to gather requirements and understand source systems. You will work in a cloud-based modern data stack and should be able to adopt best practices for different aspects of data, such as security, consistency, and governance.
Responsibilities
  • Write new ETL/ELT pipelines with necessary data quality checks and handle edge cases
  • Revamp existing data pipelines to accommodate new technologies or business changes
  • Data modeling for specific use cases in the data lake and warehouse
  • Evaluate and adopt the best storage strategy (file format, table format, compression, partitioning, etc.) in the data lake for each use case
  • Debug pipelines end to end for any data quality or consistency issues
Requirements

Essential Skills:
  • SQL – Proficient in SQL-based ELT and familiar with analytics/reporting queries
  • Python – Proficient in scripting and familiar with OOP
  • Airflow – Familiar with the basic architecture of, and development in, Airflow 2
  • AWS – Proficient in Data Engineering – S3, EC2, Lambda, Athena, Glue, and familiar with basics like IAM, CloudWatch, VPC, etc.
  • Familiar with common tools – Git, Docker, Jira
Desirable Skills:
  • Familiar with lakehouse architectures with transaction support, such as Delta Lake or Iceberg
  • Experience with open-source distributed query engines such as Hive or Presto
  • ML model deployment and monitoring
  • Familiar with the basics of any reporting/visualization or BI tool, such as Power BI or Tableau
Benefits
  • A collaborative environment where diversity is celebrated
  • A flexible workforce. Our people are based in Dubai, but you can work from anywhere in the world.
  • A highly competitive market salary with housing and transport allowance.
  • A fast-moving supportive company where everybody takes ownership over their work.
  • An opportunity to share in the success of the business with stock options for qualifying employees.
  • The opportunity to work on a product with growing global appeal
  • Work with some of the most talented people in the industry from well-known digital brands.

Beware of fraud agents! Do not pay money to get a job.

MNCJobsGulf.com will not be responsible for any payment made to a third party. All Terms of Use are applicable.



Job Detail

  • Job ID
    JD1402538
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Dubai, United Arab Emirates
  • Education
    Not mentioned