Purpose of Role
Inchcape is one of the world’s largest independent automotive distributors and retailers. We manage the end-to-end distribution, logistics and customer experience for the best car brands in the world.
We are recruiting a talented and driven DB Admin & Azure Engineer. The successful candidate will collaborate with a range of stakeholders across the business on problem solving and project delivery, and will use strong verbal and written communication skills to consult, negotiate and engage effectively with internal and external stakeholders at all levels.
Reporting to the Lead Data Engineer, this role provides technical support for the BI & analytics environment. This includes maintaining and administering the SQL databases, as well as assisting with the design, development and maintenance of the Data Lake, Delta Lake, the Data Warehouses, reporting datasets and all data and analytical requirements of the Inchcape Group.
The primary responsibility is to maintain and administer the database infrastructure used by the platform. The role holder will also work with business users and analysts to design and develop effective technical solutions for our organisation’s data and information needs, and is responsible for monitoring and maintaining the analytics environment and supporting other specialists within the team.
Job Role and Responsibilities:
- Carry out database administration, including security, installations, upgrades, patching, backup and recovery, space management, performance tuning and capacity planning
- Establish and maintain Microsoft SQL database architecture best practices, including procedures, guides, templates, and relevant documentation
- Provide operational support and development of the Microsoft SQL Server infrastructure across production and non-production environments, including database servers, database services, and implementation of SQL-based tools and infrastructure
- Ensure that all database environments are deployed, maintained and managed through proactive performance monitoring and capacity planning of the RDBMS
- Ensure 24/7 server availability and plan maintenance activities with stakeholders
- Maintain backup and recovery processes, including disaster recovery, and performance tuning of databases and applications
- Manage database security, roles, and profiles
- Help resolve any performance tuning issues
- Assist the teams in deployment activities
- Ensure data requirements have been identified, the design is fully normalized, and databases comply with industry best practices and strict security requirements
- Participate in design sessions with application engineers and work closely with the business analysts who develop the business and software specifications for project objectives
- Design and build robust scalable data pipelines and data migration using the Azure Stack
- Responsible for creating reusable and scalable data pipelines
- Produce high-quality code to put solutions into production
- Execute high-performance data processing for structured and unstructured data, and data harmonization
- Schedule, orchestrate, and validate pipelines
- Design exception handling and log monitoring for debugging
- Work closely with the Architects to develop data pipelines on Azure Data Platforms
- Perform data wrangling, manipulation, cleansing and blending across multiple data sources, working with a variety of data types and storage paradigms, including structured and unstructured data.
- Prepare raw data from multiple sources and apply modelling techniques consistent with existing methodologies.
- Provision and configure effective Azure environments including resource creation and maintenance, platform security, and database management services.
- Conduct issue triage and investigation across the BI / Analytics stack.
- Develop ETL bug fixes using Azure Synapse Analytics, Azure Databricks, Azure Data Factory, Azure Data Lake v2 and Azure Purview.
- Update and maintain the Quality Assurance framework to ensure data quality, integrity, and lineage across the existing BI solutions.
- Undertake unit testing of all fix developments.
- Utilize Azure DevOps for CI/CD across environment and platform deployments.
Skills and Experience Required:
- Hands on experience in developing data pipelines using Azure Data Factory, Azure Logic Apps, Azure Functions, Azure Synapse, Azure Data Lake
- Design and develop the logic for end-to-end data pipelines, including performance optimization
- ETL processing using Azure Databricks and Delta Lake including optimization
- Experience working with structured and unstructured data
- Degree from a good university.
- Agile delivery qualifications (e.g. SAFe, Six Sigma) are an advantage.
- Project Management qualifications (e.g. APM, PRINCE2, CAPM) are an advantage.
Communication and Interpersonal Skills:
- Ability to work with technical teams to translate requirements into solutions
- Experience in supporting internal stakeholders, with a proven track record of customer excellence
- A highly organized nature with high attention to detail
- Foster open communication and listen actively
- Practice information sharing and collaboration
- Present ideas & concepts logically with accurate documentation
- Plan and prioritize own work and guide others to ensure all goals are met
- Demonstrated extensive experience working with Azure data solutions and supporting environments involving Azure Synapse Analytics, Azure Databricks, Azure Data Factory, Azure Data Lake v2 and Power BI.
- Experience with JIRA project management software is an advantage.
- Experience with Microsoft Power BI software is an advantage.
To apply for this role, select “Apply Here” and you will be taken to the SkillsNow Platform.