Purpose of Role:
Inchcape is one of the world’s largest independent automotive distributors and retailers. We manage the end-to-end distribution, logistics and customer experience for the best car brands in the world.
We are recruiting a talented and driven Azure Engineer. As the successful Azure Engineer, you will collaborate with a range of stakeholders across the business on problem solving and project delivery, and use strong verbal and written communication skills to effectively consult, negotiate and engage with internal and external stakeholders at all levels.
Reporting to the Lead Data Engineer, this role provides technical support for the BI & analytics environment. This includes assisting with the design, development and maintenance of the Data Lake, Delta Lake, the Data Warehouses, reporting datasets and all data and analytical requirements of the Inchcape Group.
The primary focus of the role is to work with business users and analysts to design and develop effective technical solutions for our organisation’s data and information needs. The role is also responsible for monitoring and maintaining the analytics environment and for supporting other specialists within the team.
Job Role and Responsibilities:
- Design and build robust, scalable, reusable data pipelines and data migrations using the Azure stack
- Produce high-quality code to put solutions into production
- Execute high-performance data processing and data harmonization for structured and unstructured data
- Schedule, orchestrate, and validate pipelines
- Design exception handling and log monitoring for debugging
- Work closely with the Architects to develop data pipelines on Azure Data Platforms
- Perform data wrangling, manipulation, cleansing and blending across multiple data sources, with exposure to various data types and storage paradigms, including structured and unstructured data.
- Prepare raw data from multiple sources and apply modelling techniques consistent with existing methodologies.
- Provision and configure effective Azure environments including resource creation and maintenance, platform security, and database management services.
- Conduct issue triage and investigation across the BI / Analytics stack.
- Develop ETL bug fixes using Azure Synapse Analytics, Azure Databricks, Azure Data Factory, Azure Data Lake v2 and Azure Purview.
- Update and maintain the Quality Assurance framework to ensure data quality, integrity, and lineage across the existing BI solutions.
- Undertake unit testing of all fix developments.
- Utilize Azure DevOps for CI/CD across environment and platform deployments.
- Manage Production deployments across the Azure and Power BI platforms.
Skills Required:
- Demonstrated extensive experience working with Azure data solutions and supporting environments involving Azure Synapse Analytics, Azure Databricks, Azure Data Factory, Azure Data Lake v2 and Power BI.
- Demonstrated experience in using Logic Apps and Data Factory.
- Demonstrated experience in using SQL and Python in Databricks.
- Experience with JIRA project management software is an advantage.
- Experience with Microsoft Power BI software is an advantage.
Leadership and Communication:
- Ability to work with technical teams to translate requirements into solutions.
- Experience in supporting internal stakeholders, with a proven track record of customer excellence.
- A highly organized nature with high attention to detail.
- Foster open communication and actively listen.
- Practice information sharing and collaboration.
- Present ideas & concepts logically with accurate documentation.
- Plan and prioritize own work and guide others to ensure all goals are met.
Experience Required:
- Hands-on experience in developing data pipelines using Azure Data Factory, Azure Logic Apps, Azure Functions, Azure Synapse and Azure Data Lake.
- Design and develop the logic for end-to-end data pipelines, including performance optimization.
- ETL processing using Azure Databricks and Delta Lake, including optimization.
- Experience working with structured and unstructured data.
To apply for this role, select “Apply Here” and you will be taken to the SkillsNow platform.