Data Solution Architect
Our client, a leading global supplier of IT services, requires experienced Data Solution Architects to be based in their client’s office in Coventry, UK.
This is a hybrid role – you can work remotely within the UK, attending the Coventry office 3 days per week.
This is a 3-month temporary contract, starting ASAP.
Day rate: competitive market rate.
The client is seeking a highly experienced Cloud Solution Architect to join their growing team. The ideal candidate will have extensive experience in cloud data architecture, infrastructure, and real-time data integration, particularly on Azure, Databricks, Synapse, and Fabric, with proficiency in streaming platforms, infrastructure automation, and data governance/MDM tools (e.g., Microsoft Purview, Profisee, Informatica, Databricks Unity Catalog). A proven track record of managing complex, large-scale data environments (1000TB+) is essential; prior experience in the utilities or telemetry/SCADA systems domain is a strong plus.
You will lead initiatives that support real-time analytics, AI/ML, secure data management, and enterprise-wide integration strategies especially within complex environments such as IoT and telemetry systems. You will be responsible for designing, implementing, and optimising large-scale data and cloud solutions across Azure, Databricks, and Microsoft Fabric.
Key Responsibilities:
- Architect and implement scalable, secure, and high-performance data platforms using Azure, Databricks, and Microsoft Fabric for real-time and batch processing
- Lead integration across structured and unstructured data sources such as SCADA, SAP, APIs, telemetry, and time-series data using modern ETL/ELT patterns
- Establish robust data governance, security, and compliance frameworks including data masking, encryption, lineage, access controls, and regulatory adherence (e.g., GDPR, ISO 27001)
- Design and optimise cloud-native storage and compute architectures using Delta Lake, ADLS Gen2, Synapse, Azure SQL, Cosmos DB, and NoSQL for petabyte-scale workloads
- Implement event-driven and real-time streaming architectures using Kafka, Azure Event Hubs, IoT Hub, and Lambda architecture patterns
- Drive DevOps and IaC practices through automated CI/CD pipelines using tools such as Terraform, Bicep, Azure DevOps, and GitHub Actions
- Collaborate with cross-functional stakeholders including business leads, engineers, security, and vendors to align technology with strategic business outcomes
- Implement monitoring, observability, and disaster recovery strategies ensuring high availability, system resilience, and proactive issue resolution
- Lead AI/ML and analytics integrations with Databricks, Power BI, and MDM platforms to enable advanced reporting and insights
- Mentor and enable internal teams through technical training, knowledge-sharing sessions, and architectural best practices to promote a data-driven culture
Key Requirements:
- Azure Data Factory, Synapse Analytics, Azure Databricks
- ADLS, Azure Blob, Azure SQL DB, Cosmos DB, Delta Lake, Oracle DB
- Azure Event Hubs, Kafka on Azure, Azure IoT Hub, ADX (Azure Data Explorer)
- Microsoft Fabric, including OneLake
- CI/CD using Azure DevOps, GitHub, ARM templates
- Experience with unstructured data
- Data Modelling, Data Mapping, ETL Mapping
- Data Governance (Microsoft Purview, Databricks Unity Catalog)
- Data Profiling, Data Quality, Security
- MDM (Profisee, Informatica)
- HADR (high availability and disaster recovery), AI/ML
- Compute and network strategies (Private Endpoint, VNet, ExpressRoute)
- Security frameworks, IAM, RBAC, firewall rules, Zero Trust architecture
- Threat modelling, risk assessments
- Monitoring, Logging
- Cost and performance management
Desirable:
- Data Governance (Collibra)
- Utilities experience
- Experience with SCADA, eSCADA, telemetry, SAP PM, GIS
- Terraform
- Bicep
- Python, Kusto
Due to the volume of applications received, we are unfortunately unable to respond to every applicant.
If you do not hear back from us within 7 days of sending your application, please assume that you have not been successful on this occasion.
Please do keep an eye on our website https://projectrecruit.com/jobs/ for future roles.