[2874] Senior Cloud Data Developer

Mission start date: 17/10/2022
Mission duration: 660 workdays, with the option to stop the contract every 3 months, with one month's notice
Daily rate: Negotiable

Mission / function description:
Develop a multi-source dataflow with XMLs and DBs as distinct sources (orchestrated with Airflow).
The flow will consist of loading XMLs with a deep hierarchy and large volume,
then enriching them with data from several DBs and computing significant flags (some simple, others complex and requiring large compute resources).
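For illustration only, below is a minimal sketch of what such an Airflow-orchestrated flow could look like. The DAG name, task names, and empty task bodies are hypothetical placeholders, not part of the actual assignment; the real schedules, connections, and processing logic will depend on the project.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_xml(**context):
    # Placeholder: parse the deeply nested, high-volume XML sources into a
    # staging area (e.g. with Spark on Databricks) -- details are assumptions.
    pass


def enrich_with_dbs(**context):
    # Placeholder: join the staged XML data with reference data from several DBs.
    pass


def compute_flags(**context):
    # Placeholder: derive the simple and the compute-heavy flags on the enriched data.
    pass


with DAG(
    dag_id="xml_multi_source_flow",   # hypothetical name
    start_date=datetime(2022, 10, 17),
    schedule_interval=None,           # triggered manually or per business need
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load_xml", python_callable=load_xml)
    enrich = PythonOperator(task_id="enrich_with_dbs", python_callable=enrich_with_dbs)
    flags = PythonOperator(task_id="compute_flags", python_callable=compute_flags)

    load >> enrich >> flags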

Working hours:
5 days per week, Monday to Friday (7h36 per day).

Language skills:
• English: active knowledge (understanding/speaking/writing technical documentation)
• French / Dutch: a plus

Development Skills:
Technical knowledge of and practical experience with the technologies listed below, so that development can start quickly.

You are a Databricks developer with a strong focus on Azure cloud technology.

You have experience with:
• data warehousing and lakehouses; knowledge of Delta Lake is a plus.
• using different data technologies in an Azure cloud environment, such as Synapse, Azure SQL MI, Databricks and Azure DevOps.
You have strong technical skills in the following domains:
• Databricks: data engineering, performance tuning, Delta Lake
• Languages: Python (in combination with Spark), R
• Azure SQL: SQL MI, Azure SQL, Synapse
• Azure DevOps
• Apache Airflow: data pipelines

Engagement type: Contract
Location: Belgium
Daily rate: Negotiable
Reference: GPC002874
Contact: Luke Tebb (luke@gpc.work)