Contract PySpark Jobs in the South East

4 of 4 Contract PySpark Jobs in the South East

Data Engineer

London, South East, England, United Kingdom
Huxley
Collaborate with cross-functional teams to translate business needs into technical solutions.

Core Skills:
Cloud & Platforms: Azure, AWS, SAP
Data Engineering: ELT, Data Modeling, Integration, Processing
Tech Stack: Databricks (PySpark, Unity Catalog, DLT, Streaming), ADF, SQL, Python, Qlik
DevOps: GitHub Actions, Azure DevOps, CI/CD pipelines
Employment Type: Contractor
Rate: £850 - £900 per day

AWS Data Engineer

Reading, Berkshire, England, United Kingdom
Hybrid / WFH Options
Opus Recruitment Solutions Ltd
tools to build robust data pipelines and applications that process complex datasets from multiple operational systems.

Key Responsibilities:
Build and maintain AWS-based ETL/ELT pipelines using S3, Glue (PySpark/Python), Lambda, Athena, Redshift, and Step Functions
Develop backend applications to automate and support compliance reporting
Process and validate complex data formats including nested JSON, XML, and CSV
Employment Type: Contractor
Rate: £500 - £550 per day
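
As a rough illustration of the nested-JSON processing mentioned in the listing above, the sketch below flattens a nested JSON feed with PySpark and writes a curated copy back to S3. The bucket paths and field names (orders, items, customer) are hypothetical, and a production Glue job would normally use a GlueContext and job bookmarks rather than a bare SparkSession.

```python
# Illustrative sketch only: flatten nested JSON records with PySpark.
# Paths and field names are hypothetical, not taken from the advert.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, explode

spark = SparkSession.builder.appName("flatten-nested-json").getOrCreate()

# Read a nested JSON export landed in S3 by an upstream operational system
orders = spark.read.json("s3://example-bucket/raw/orders/")

# Explode the nested items array and project flat, typed columns
order_items = (
    orders
    .withColumn("item", explode(col("items")))
    .select(
        col("order_id"),
        col("customer.id").alias("customer_id"),
        col("item.sku").alias("sku"),
        col("item.quantity").cast("int").alias("quantity"),
    )
)

# Persist a curated, query-friendly copy for Athena/Redshift consumers
order_items.write.mode("overwrite").parquet("s3://example-bucket/curated/order_items/")
```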

Azure Data Engineer- 12 month Contract

London, South East, England, United Kingdom
Hybrid / WFH Options
Opus Recruitment Solutions Ltd
IR35
Start Date: ASAP

Key Skills Required:
Azure Data Factory
Azure Functions
SQL
Python

Desirables:
Experience with Copilot or Copilot Studio
Experience designing, developing, and deploying AI solutions
Familiarity with PySpark, PyTorch, or other ML frameworks
Exposure to M365, D365, and low-code/no-code Azure AI tools

If interested, please send a copy of your most recent CV …
Employment Type: Contractor
Rate: £300 - £350 per day

Databricks Engineer

London, South East, England, United Kingdom
Harnham - Data & Analytics Recruitment
will be on enabling analytics and machine learning at scale, using best-in-class tools across the Azure stack.

Your responsibilities will include:
Designing and deploying ETL pipelines using PySpark and Delta Lake on Databricks.
Supporting the deployment and operationalisation of ML models with MLflow and Databricks Workflows.
Building out reusable data products and feature stores for data science … and cloud-native tools.
Collaborating with data scientists, analysts, and product teams to improve data usability and performance.

KEY SKILLS AND REQUIREMENTS
Advanced experience with Databricks, Delta Lake, and PySpark.
Strong background in data engineering and distributed processing.
Hands-on knowledge of Azure Data Lake, Data Factory, or similar orchestration tools.
Experience with ML model deployment, preferably …
Employment Type: Contractor
Rate: £450 - £550 per day
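
For context only, here is a minimal sketch of the first responsibility in the listing above: writing a PySpark transformation out as a partitioned Delta table. The paths, table and column names are hypothetical, and it assumes a Databricks runtime (or an environment with the delta-spark package) where the Delta format is available.

```python
# Illustrative sketch only: a small PySpark -> Delta Lake ETL step.
# Assumes a Databricks runtime where the "delta" format is available;
# paths, table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("delta-etl-sketch").getOrCreate()

# Read raw event files from the lake and derive a partition column
events = (
    spark.read.json("/mnt/raw/events/")
    .withColumn("event_date", to_date(col("event_ts")))
)

# Append into a partitioned Delta table used by analytics and ML workloads
(
    events.write
    .format("delta")
    .mode("append")
    .partitionBy("event_date")
    .saveAsTable("analytics.events_curated")
)
```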

PySpark contract daily rates in the South East

10th Percentile: £409
25th Percentile: £413
Median: £505
75th Percentile: £575
90th Percentile: £590