Collaborate with cross-functional teams to translate business needs into technical solutions.
Core Skills:
- Cloud & Platforms: Azure, AWS, SAP
- Data Engineering: ELT, Data Modeling, Integration, Processing
- Tech Stack: Databricks (PySpark, Unity Catalog, DLT, Streaming), ADF, SQL, Python, Qlik
- DevOps: GitHub Actions, Azure DevOps, CI/CD pipelines
… the ability to write ad-hoc and complex queries to perform data analysis. Experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. You will be able to develop solutions in a hybrid data environment (on-prem and cloud). Hands-on experience with developing data pipelines for structured, semi-structured …
Belfast, County Antrim, Northern Ireland, United Kingdom
Hays
… strategic change projects. You'll work across multiple workstreams, delivering high-impact data solutions that drive efficiency and compliance for Markets and its clients.
Key Responsibilities:
- Build and optimize PySpark and SQL queries to analyze, reconcile, and interrogate large datasets.
- Recommend improvements to reporting processes, data quality, and query performance.
- Contribute to the architecture and design of Hadoop environments. …
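As context for the reconciliation responsibilities above, here is a minimal PySpark sketch of the kind of query such roles involve: aggregating two feeds to a common grain, then outer-joining to surface breaks. All table and column names are hypothetical, not taken from the posting.

```python
# Hypothetical reconciliation between a trading-system extract and a control feed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("recon").getOrCreate()

trades = spark.table("markets.trades")   # invented source table
ledger = spark.table("finance.ledger")   # invented control table

# Aggregate both sides to the same grain (date x book), then full-outer-join
# so that rows missing on either side still appear as breaks.
src = trades.groupBy("trade_date", "book").agg(F.sum("notional").alias("src_notional"))
ctl = ledger.groupBy("trade_date", "book").agg(F.sum("notional").alias("ctl_notional"))

breaks = (
    src.join(ctl, ["trade_date", "book"], "full_outer")
       .withColumn(
           "diff",
           F.coalesce(F.col("src_notional"), F.lit(0.0))
           - F.coalesce(F.col("ctl_notional"), F.lit(0.0)),
       )
       .filter(F.abs(F.col("diff")) > 0.01)  # small tolerance for rounding
)
breaks.show()
```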
… extensive Data Development experience in a commercial or Agile environment. To be successful in this role it's essential that you have experience of SQL, Python, AWS, Git and PySpark. Desirable experience: SSIS or SAS; Quality Assurance and Test Automation; database technologies; experience in a Financial Services organisation. About us: We're one of …
… include: multiple Databricks projects delivered; excellent consulting and client-facing experience; 7-10+ years' experience of consulting in Data Engineering, Data Platform and Analytics; deep experience with Apache Spark and PySpark; CI/CD for production deployments; working knowledge of MLOps; strong experience with optimisation for performance and scalability. These roles will be paid at circa £600-£700 per day …
Belfast, County Antrim, Northern Ireland, United Kingdom
McGregor Boyall
… Execution & Transformation - Data Acquisition Team at a leading Investment Bank. You'll work across regulatory and transformation initiatives that span multiple trading desks, functions, and stakeholders. You'll build PySpark and SQL queries to interrogate, reconcile and analyse data, contribute to Hadoop data architecture discussions, and help improve reporting processes and data quality. You'll be hands-on across …
… a separate team to release all development through Azure DevOps pipelines, maintaining a strong understanding of Git and code-release best practices. Technology Requirements:
- Proficient in Python 3 and PySpark 3/4.
- Experience with Python Behave for Behaviour-Driven Development and testing (see the sketch below).
- Familiarity with Python Coverage for code coverage analysis.
- Strong knowledge of Databricks, specifically with Delta Parquet …
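For illustration only, a minimal Behave step module of the sort this stack implies: Gherkin steps driving a small PySpark transform. Every name here is invented and the matching feature file is shown as a comment; under the Coverage requirement it could be run with `coverage run -m behave`.

```python
# steps/dedupe_steps.py — hypothetical Behave steps for a PySpark transform.
#
# Matching feature file (features/dedupe.feature):
#   Feature: Deduplication
#     Scenario: Duplicate rows are removed
#       Given a DataFrame with duplicate rows
#       When the dedupe transform runs
#       Then the result contains only distinct rows

from behave import given, when, then
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").appName("bdd").getOrCreate()

@given("a DataFrame with duplicate rows")
def given_duplicates(context):
    context.df = spark.createDataFrame([(1, "a"), (1, "a"), (2, "b")], ["id", "val"])

@when("the dedupe transform runs")
def run_dedupe(context):
    context.result = context.df.dropDuplicates()

@then("the result contains only distinct rows")
def check_distinct(context):
    assert context.result.count() == 2
```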
I am currently on the lookout for a Contract AWS Data Engineer with a scale-up who have a number of greenfield projects coming up. Tech Stack: AWS, Databricks Lakehouse, PySpark, SQL, ClickHouse/MySQL/DynamoDB. If you are interested in this position, please click apply with an updated copy of your CV and I will call you to …
… understanding of data integration, data quality, and data governance. Extensive experience working with big data technology tools and platforms such as Microsoft Azure Data Factory, Databricks, Unity Catalog, PySpark, Power BI, Synapse, SQL Server, Cosmos DB, Python. Understanding and application of cloud architectures and microservices in big data solutions. Understanding of the commodities industry. Rate/Duration …
WE NEED THE PYTHON/DATA ENGINEER TO HAVE:
- Current DV Security Clearance (Standard or Enhanced)
- Experience with big data tools such as Hadoop, Cloudera or Elasticsearch
- Python/PySpark experience
- Experience with Palantir Foundry (nice to have)
- Experience working in an Agile Scrum environment with tools such as Confluence/Jira
- Experience in design, development, test and …
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
… IR35. Start Date: ASAP. Key Skills Required: Azure Data Factory, Azure Functions, SQL, Python. Desirables: experience with Copilot or Copilot Studio; experience designing, developing, and deploying AI solutions; familiarity with PySpark, PyTorch, or other ML frameworks; exposure to M365, D365, and low-code/no-code Azure AI tools. If interested, please send a copy of your most recent CV …
East London, London, England, United Kingdom Hybrid / WFH Options
Fusion People Ltd
UK role (WFH) - 6 months initial contract - Top rates - Major consultancy urgently requires a Microsoft Fabric specialist with in-depth experience of MS Fabric (tech stack is Microsoft Fabric, PySpark/RSpark and GitHub) for an initial 6-month contract (WFH) who is passionate about building new capabilities from the ground up and wants to help explore the full …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
- Proven experience as a Programme or Delivery Manager on data-centric programmes
- Solid understanding of data ingestion processes and Snowflake data warehousing
- Familiarity with AWS Glue, S3, DBT, SnapLogic, PySpark (not hands-on, but able to converse technically)
- Strong governance and delivery background in a data/tech environment
- Excellent communication and stakeholder management skills (must be assertive)
- Pharma …
London, South East, England, United Kingdom Hybrid / WFH Options
Involved Solutions
Key Responsibilities - Azure Data Engineer: Design, build and maintain scalable and secure data pipelines on the Azure platform. Develop and deploy data ingestion processes using Azure Data Factory, Databricks (PySpark), and Azure Synapse Analytics. Optimise ETL/ELT processes to improve performance, reliability and efficiency. Integrate multiple data sources including Azure Data Lake (Gen2), SQL-based systems and APIs. … GDPR and ISO standards). Required Skills & Experience - Azure Data Engineer: Proven commercial experience as a Data Engineer delivering enterprise-scale solutions in Azure: Azure Data Factory, Azure Databricks (PySpark), Azure Synapse Analytics, Azure Data Lake Storage (Gen2), SQL & Python. Understanding of CI/CD in a data environment, ideally with tools like Azure DevOps. Experience working within consultancy …
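To ground the ingestion responsibilities above, a minimal Databricks-style PySpark cell landing raw JSON from ADLS Gen2 into a Delta table with an audit column. The path, container and table names are hypothetical; in a Databricks notebook the `spark` session is predefined.

```python
# Hypothetical bronze-layer ingestion: raw JSON from ADLS Gen2 into Delta.
from pyspark.sql import functions as F

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/events/"  # invented path
bronze_table = "bronze.events"                                     # invented table

df = (
    spark.read.format("json")
         .load(raw_path)
         .withColumn("_ingested_at", F.current_timestamp())  # audit column
)

# Append keeps earlier loads; a real pipeline would add schema checks and dedup.
df.write.format("delta").mode("append").saveAsTable(bronze_table)
```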
… and managing project changes and interventions to achieve project outputs. Documenting all aspects of the project for future reference and audits. Technical Responsibilities: Developing SQL scripts (stored procedures) and PySpark notebooks. Creating and managing ingestion, ETL & ELT processes. Designing and configuring Synapse pipelines. Data modelling in various storage systems. Analysing existing data designs and suggesting improvements for performance, stability … Experience in Project Management within the Defence & Security sector. Strong technical skills in API, Java, Python, Web Development, SQL, and Azure. Proficiency in developing and managing SQL scripts and PySpark notebooks. Understanding of ETL & ELT processes and Synapse pipeline design and configuration. Experience in data modelling and improving existing data designs. Knowledge of real-time data processing. Capable of …
… time zone). Technical Business Analyst & Subject Matter Expert who will join the Data Solutions team. Must have a solid understanding of data engineering concepts such as ETL pipelines, PySpark transformations and data lakes.
- ETL pipelines
- Data lakes
- PySpark transformations
- Public Financial Markets
- Bachelor's degree in IT or Finance
- Advanced SQL skills
- Solid understanding of data modelling, including dimensional …
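As a small illustration of the dimensional-modelling concept this listing mentions, a hypothetical PySpark lookup joining a fact table to a dimension on a surrogate key; every table and column name is invented.

```python
# Hypothetical star-schema lookup: fact joined to dimension on a surrogate key.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dims").getOrCreate()

fact_trades = spark.table("dw.fact_trades")        # grain: one row per trade
dim_instrument = spark.table("dw.dim_instrument")  # one row per instrument version

# Resolve the surrogate key into readable attributes for reporting.
report = (
    fact_trades
    .join(dim_instrument, "instrument_key")  # surrogate key, not the natural ID
    .select("trade_date", "instrument_name", "notional")
)
report.show()
```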
TECHNICAL PROGRAMME MANAGER - DATA INGESTION (PHARMA/SNOWFLAKE) UP TO £560 PER DAY HYBRID (1/2 DAYS PER WEEK IN SPAIN & GERMANY) 6 MONTHS THE COMPANY: A global data and analytics consultancy is delivering a large-scale data ingestion …