We are looking for data engineers who will build ETLT pipelines to ingest data into the data warehouse, as well as end-user-facing reporting applications.
Our main goal is to ensure data availability for internal and external processes and to deliver exceptional analyses that support excellent business decision-making.
The data engineer will be the sole custodian for formalizing and executing a roadmap for capturing, structuring and analyzing structured and unstructured big data across the Supply Chain functions.
A day in the life of...
Helping teams document end-to-end workflows and identifying opportunities to digitize them
Designing and documenting the ETLT system
Establishing a Staging Area, Data Warehouse and Data Marts that provide a common data repository
Defining methods of moving the data from various sources into the data warehouse
Setting up data transformation, aggregation and calculation rules within the data warehouse
Developing models for predictive and advanced analytics
Developing Self-Service Dashboards to enable key stakeholders to answer many of their data-based questions themselves - without sending ad hoc requests
Writing SQL queries with business logic to retrieve data from our Data Lake / Warehouse, monitoring KPIs and taking necessary actions
What will make you successful
University degree in a quantitative discipline (e.g., Mathematics, Computer Science, Data Science, Statistics) or equivalent work experience
1+ years of experience in data modeling, SQL, ETLT, data warehousing and data lakes
Excellence in the design, creation, and management of very large datasets
Experience with Azure Synapse Analytics
Experience writing SQL scripts, alongside expert knowledge of an enterprise-class RDBMS
Experience building reports and dashboards in a BI tool (Power BI, Tableau)
Experience in statistical modeling and time series forecasting (e.g., with R, SAS, Python, Minitab)