Leeds, England, United Kingdom Hybrid / WFH Options
Mastek
… OBIEE, Workato and PL/SQL. Design and build data solutions on Azure, leveraging Databricks, Data Factory, and other Azure services. Use Python and PySpark for data transformation, analysis, and real-time streaming. Collaborate with cross-functional teams to gather requirements, design solutions, and deliver insights. Implement and maintain … Technologies: Databricks and Data Factory (expertise in data engineering and orchestration); DevOps, Storage Explorer and Data Studio (competence in deployment, storage management, and development tools); Python and PySpark (advanced coding skills, including real-time data streaming through Autoloader). Development tools: VS Code, Jira, Confluence, Bitbucket. Service management: experience with ServiceNow. API integration …
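The listing above mentions real-time data streaming through Databricks Autoloader. Autoloader's core idea is incremental ingestion: remember which files have already been processed and pick up only newly arrived ones. A minimal stdlib-Python sketch of that idea follows; the JSON checkpoint format and file layout here are invented for illustration and are not Autoloader's actual implementation:

```python
import json
import os


def load_new_files(data_dir, checkpoint_path):
    """Sketch of the incremental-ingestion idea behind Databricks
    Autoloader: remember which files were already processed (here in
    an illustrative JSON checkpoint) and return only the contents of
    files not seen before."""
    # Load the set of already-processed file names, if a checkpoint exists.
    seen = set()
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path) as f:
            seen = set(json.load(f))

    # Read only files that have appeared since the last run.
    new_records = {}
    for name in sorted(os.listdir(data_dir)):
        if name not in seen:
            with open(os.path.join(data_dir, name)) as f:
                new_records[name] = f.read()
            seen.add(name)

    # Persist the updated checkpoint so the next run skips these files.
    with open(checkpoint_path, "w") as f:
        json.dump(sorted(seen), f)
    return new_records
```

Calling this twice against a directory returns each file exactly once, which is the contract Autoloader provides (at much larger scale, with schema inference and notification-based discovery on top).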
Lead Data Engineer: We need some strong Lead Data Engineer profiles… they need good experience with Python, SQL and ADF, and preferably Azure Databricks experience. Job description: building new data pipelines and optimising data flows using the Azure cloud stack. Building …
Newcastle upon Tyne, Tyne & Wear Hybrid / WFH Options
DWP
… scalable, automated ETL pipelines in an AWS cloud environment using AWS S3 Cloud Object Storage. Strong coding skills using Hive SQL, Spark SQL, Python, PySpark and Bash. Experience of working with a wide variety of structured and unstructured data. You and your role: as a Data Engineer at DWP … you'll be executing code across a full tech stack, including Azure, Databricks, PySpark and Pandas, helping the department to move towards a cloud computing environment, working with huge data sets as part of our DataWorks platform - a system that provides Universal Credit data to our Data Science team …
Leeds, West Yorkshire, Yorkshire and the Humber Hybrid / WFH Options
DWP
… scalable, automated ETL pipelines in an AWS cloud environment using AWS S3 Cloud Object Storage. Strong coding skills using Hive SQL, Spark SQL, Python, PySpark and Bash. Experience of working with a wide variety of structured and unstructured data. You and your role: as a Data Engineer at DWP … you'll be executing code across a full tech stack, including Azure, Databricks, PySpark and Pandas, helping the department to move towards a cloud computing environment, working with huge data sets as part of our DataWorks platform - a system that provides Universal Credit data to our Data Science team …
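The DWP listings above repeatedly ask for Hive SQL and Spark SQL skills. The kind of transformation they describe - grouping and aggregating structured records with SQL - can be sketched with Python's built-in sqlite3 standing in for the Spark SQL engine (the `claims` table and its columns are invented for illustration; the SELECT itself is syntax that Spark SQL and Hive SQL share):

```python
import sqlite3


def aggregate_by_region(rows):
    """Group (region, amount) records and aggregate per region,
    using stdlib sqlite3 as a stand-in SQL engine. Table and column
    names are illustrative, not from any real DWP schema."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE claims (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO claims VALUES (?, ?)", rows)
    # Count records and sum amounts per region - a typical shape for
    # the Spark SQL aggregations the listings mention.
    cur = conn.execute(
        "SELECT region, COUNT(*) AS n, SUM(amount) AS total "
        "FROM claims GROUP BY region ORDER BY region"
    )
    result = cur.fetchall()
    conn.close()
    return result
```

In PySpark the same statement would run via `spark.sql(...)` against a registered table; the SQL text would be unchanged.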
Data Lead – Palantir experience + Python, PySpark, SQL, Big Data. 12-month initial contract. Hybrid (Maidenhead) – 2 days onsite per week. Inside IR35. My client, a top global company, is currently looking to recruit a Data Lead with Palantir experience to join their team on a contract basis. Please … in maintaining data warehouse systems and working on large-scale data transformation, and knowledge of at least one MDM. Development skills: experience using Python, PySpark, SQL, Big Data (Palantir) and DataOps. Certified Data Management Professional and TOGAF. Interpersonal skills and the ability to communicate complex technology solutions to …
Job Description POSITION OVERVIEW This is an exciting opportunity to join a leading games publisher as a Data Developer. Working within the Data Services team, you will be collaborating with team members at our company and across our studios. You will …
Senior Data Engineer (Azure, Kubernetes, Databricks) Tech Industry - Oxfordshire (1 day per month in office) Up to £82,000 + benefits OVERVIEW We are partnered with a rapidly expanding digital platform in the property and legal tech sector, backed by …
City of London, London, United Kingdom Hybrid / WFH Options
Nigel Frank International
Lead Data Engineer - London - Azure - Hybrid - £95k. Great opportunity for an experienced data engineering lead to join a leading company within the legal sector, who are putting data at the forefront of every step they take in their industry! If …
… a Data Engineer, with a focus on AWS. Proficiency in AWS services like Redshift, S3, Glue, and Lambda. Strong programming skills in Python or PySpark. Nice to have: AWS certifications. Interviews are already underway with limited slots remaining, so don't miss out on your opportunity to secure this amazing … Get in touch ASAP by contacting me at (url removed) or on (phone number removed)! Data Engineer, Senior Data Engineer, Developer, AWS, Apache, Python, PySpark …
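The listing above pairs S3 with Lambda and Glue. A common pattern behind that stack is an S3-event-triggered transform: a handler receives an S3 notification, reads the uploaded object, filters or reshapes it, and writes the result to a destination bucket. A minimal sketch follows, with the S3 client injected so it can be exercised without AWS; the bucket names, the `amount` column, and the filter rule are all invented for illustration, and a real Lambda handler would build its client with boto3:

```python
import csv
import io


def handle_s3_event(event, s3_client, dest_bucket):
    """Sketch of an S3-triggered CSV transform in the style of an AWS
    Lambda handler: read each uploaded CSV named in the event, keep
    rows whose (illustrative) 'amount' column is positive, and write
    the filtered file to a destination bucket."""
    results = []
    for record in event["Records"]:
        # The S3 notification format nests bucket and key like this.
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3_client.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = list(csv.DictReader(io.StringIO(body.decode("utf-8"))))
        if not rows:
            continue
        kept = [r for r in rows if float(r["amount"]) > 0]
        # Re-serialise the surviving rows and write them out.
        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(kept)
        s3_client.put_object(
            Bucket=dest_bucket, Key=key, Body=out.getvalue().encode("utf-8")
        )
        results.append((key, len(kept)))
    return results
```

Injecting the client keeps the transform unit-testable with a fake in-memory store, which is a common design choice for Lambda code.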
Machine Learning Engineer (Data Engineering Background). Paying up to £80,000 + 10% bonus. Remote-first policy - office in Central London if preferred. Two-stage interview process. One of La Fosse's best clients, who are an industry leader within …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
83zero Limited
… to make data-driven decisions. Microsoft Fabric is a cutting-edge, SaaS-based platform that unifies various data capabilities, including lakehouses for data storage, PySpark notebooks and SQL for data engineering, Power BI for real-time analytics and reporting, and AI integration to provide advanced insights. With a collaborative … performance. Skills, knowledge and expertise: degree in a numerate subject (e.g., computer science, engineering, mathematics); Microsoft BI or relevant certifications are advantageous; SQL and PySpark expertise is critical. Problem-solving and innovation: ability to work within an evolving environment and propose effective solutions. Strong communication: able to convey technical concepts …