Rochdale, Greater Manchester, North West, United Kingdom Hybrid / WFH Options
Footasylum Ltd
knowledge sharing sessions and self-development. About You:
- Experience with finance/financial systems and concepts
- Azure Databricks and Azure Data Factory
- Excellent SQL skills
- Good Python/Spark/PySpark skills
- Experience of the Kimball methodology and star schemas (dimensional modelling)
- Experience of working with enterprise data warehouse solutions
- Experience of working with structured and unstructured data
- Experience of …
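The Kimball star schema mentioned above keeps measures in a fact table keyed by surrogate keys into descriptive dimension tables. A minimal in-memory sketch (table names, fields, and values are hypothetical, not from the advert):

```python
# Hypothetical star schema: one fact table, two dimensions, joined on
# surrogate keys (product_key, date_key).
dim_product = {
    1: {"product_key": 1, "sku": "SKU-001", "category": "Footwear"},
    2: {"product_key": 2, "sku": "SKU-002", "category": "Apparel"},
}
dim_date = {
    20240101: {"date_key": 20240101, "year": 2024, "month": 1},
}
fact_sales = [
    {"date_key": 20240101, "product_key": 1, "units": 3, "revenue": 149.97},
    {"date_key": 20240101, "product_key": 2, "units": 1, "revenue": 59.99},
]

def revenue_by_category(facts, products):
    """Join facts to the product dimension and aggregate revenue by category."""
    totals = {}
    for row in facts:
        category = products[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["revenue"]
    return totals
```

In a real warehouse the same shape is expressed as a SQL join between the fact and dimension tables; the point of the dimensional model is that every analytical query follows this one join pattern.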
Reading, Berkshire, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
tools to build robust data pipelines and applications that process complex datasets from multiple operational systems. Key Responsibilities:
- Build and maintain AWS-based ETL/ELT pipelines using S3, Glue (PySpark/Python), Lambda, Athena, Redshift, and Step Functions
- Develop backend applications to automate and support compliance reporting
- Process and validate complex data formats including nested JSON, XML, and CSV …
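Processing nested JSON usually means flattening it into tabular columns before loading to a warehouse such as Redshift. A stdlib-only sketch of that step (the record shape is invented for illustration; a Glue job would do the equivalent with PySpark or DynamicFrames):

```python
import json

def flatten(record, parent_key="", sep="."):
    """Recursively flatten nested dicts into dotted column names."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Recurse into nested objects, carrying the prefix down.
            items.update(flatten(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

# Hypothetical source record with two levels of nesting.
raw = json.loads('{"id": 1, "customer": {"name": "A", "region": {"code": "UK"}}}')
flat = flatten(raw)
# Keys become "id", "customer.name", "customer.region.code" - ready for CSV columns.
```

Arrays need an extra decision (explode to rows vs. serialise to a string), which is why validation of these formats is called out as its own responsibility.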
include:
- Multiple Databricks projects delivered
- Excellent consulting and client-facing experience
- 7-10+ years' experience of consulting in Data Engineering, Data Platforms and Analytics
- Deep experience with Apache Spark and PySpark
- CI/CD for production deployments
- Working knowledge of MLOps
- Strong experience with optimisations for performance and scalability
These roles will be paid at circa £600 - £700 per day …
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
IR35 Start Date: ASAP
Key Skills Required:
- Azure Data Factory
- Azure Functions
- SQL
- Python
Desirables:
- Experience with Copilot or Copilot Studio
- Experience designing, developing, and deploying AI solutions
- Familiarity with PySpark, PyTorch, or other ML frameworks
- Exposure to M365, D365, and low-code/no-code Azure AI tools
If interested, please send a copy of your most recent CV …
London, South East, England, United Kingdom Hybrid / WFH Options
Involved Solutions
Key Responsibilities - Azure Data Engineer:
- Design, build and maintain scalable and secure data pipelines on the Azure platform
- Develop and deploy data ingestion processes using Azure Data Factory, Databricks (PySpark), and Azure Synapse Analytics
- Optimise ETL/ELT processes to improve performance, reliability and efficiency
- Integrate multiple data sources including Azure Data Lake (Gen2), SQL-based systems and APIs
… GDPR and ISO standards).
Required Skills & Experience - Azure Data Engineer:
- Proven commercial experience as a Data Engineer delivering enterprise-scale solutions in Azure
- Azure Data Factory
- Azure Databricks (PySpark)
- Azure Synapse Analytics
- Azure Data Lake Storage (Gen2)
- SQL & Python
- Understanding of CI/CD in a data environment, ideally with tools like Azure DevOps
- Experience working within consultancy …
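Ingestion pipelines of the kind described here are typically incremental: each run loads only rows changed since a stored watermark. A minimal stand-alone sketch of the pattern (the in-memory rows and watermark stand in for what ADF/Databricks would read from a source system and a control table):

```python
from datetime import datetime, timezone

def incremental_load(source_rows, last_watermark):
    """Return rows modified after the watermark, plus the new watermark.

    Hypothetical sketch: in a real pipeline the watermark lives in a
    control table and source_rows come from the source system query.
    """
    new_rows = [r for r in source_rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

rows = [
    {"id": 1, "modified": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "modified": datetime(2024, 1, 2, tzinfo=timezone.utc)},
]
# Only row 2 is newer than the stored watermark of 2024-01-01.
batch, wm = incremental_load(rows, datetime(2024, 1, 1, tzinfo=timezone.utc))
```

Persisting `wm` only after the batch commits is what makes the load safely re-runnable, which is the "reliability" half of the optimisation responsibility above.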