Richmond, London, Coulsdon, Gloucester, Ruislip, Carshalton, Twickenham, and Harrow on the Hill, England, and Wick, Scotland, United Kingdom Hybrid / WFH Options
ZipRecruiter
data assets. Learn from and collaborate with senior engineers. Capabilities: 2+ years of relevant work experience. Foundational experience with Python and SQL. Eagerness to work with tools like dbt, Spark, Databricks, and Kedro. Understanding of version control, testing, and documentation. Familiarity with Agile, Jira, and Confluence. Proficiency with AWS cloud services is required. Mid-Level Data Engineer Responsibilities: Build and …
programs, functions, object classes, WebDynpro). Strong SQL skills with experience in query performance tuning. Knowledge of MS Power BI, Azure Data Lake, and Data Factory is an advantage. Databricks experience is a strong plus, complemented by proficiency in Python. Additional Information: The position is located in Saltaire, West Yorkshire. Our most important asset is our People. Vantiva’s success …
strong technical expertise and a passion for solving complex business problems. You'll bring: Strong experience with SQL, SQL Server DB, Python, and PySpark. Proficiency in Azure Data Factory and Cloudsmith; Databricks is a must. Background in data warehousing and data engineering. Solid project management capabilities. Outstanding communication skills, translating technical concepts into clear business value. A collaborative, solution-oriented …
with DevOps culture and agile software delivery methods. Nice to have: AWS and/or FME certifications; familiarity with ETL tools such as AWS Glue, Azure Data Factory, or Databricks. Why join? This role offers the best of both worlds: the autonomy of a principal technical lead and the purpose of delivering public-good technology. You’ll be joining …
desirable. Expertise in Gen AI, including Large Language Models (LLMs) and RAG-based solution approaches. Strong understanding of AI/ML tools and platforms such as Azure, AWS, GCP, Databricks, MLflow, and Streamlit. Proficiency in Python and experience with frameworks like Airflow, Plotly Dash, or similar tools. Deep understanding of CPGR challenges, including supply chain dynamics, consumer behavior analytics, and …
or undergraduate degree in Computer Science, Statistics, Math, or Engineering • Experience in financial services or operations fields • Experience in gathering and documenting requirements with full testing traceability • Experience with Snowflake, Databricks, and Legend Studio platforms • Data governance and modelling experience ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we …
validation, with 3+ years in a leadership role. Proven expertise in testing data platforms, data pipelines, and cloud-based big data solutions. Strong experience with Databricks, Azure Data Factory, Synapse, and cloud infrastructure testing. Deep understanding of data governance, data security, and compliance best practices. Expertise in test automation frameworks (e.g., Great Expectations, DBT …
City of London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
or internal clients within large organisations, through e.g. the RFI/RFP process, as preferred bidder, documented bids, and face-to-face presentations. Experience of data science platforms (e.g. Databricks, Dataiku, AzureML, SageMaker) and machine learning frameworks (e.g. Keras, TensorFlow, PyTorch, scikit-learn). Cloud platforms – demonstrable experience of building and deploying solutions to the cloud (e.g. AWS, Azure, Google Cloud), including …
Wilmslow, England, United Kingdom Hybrid / WFH Options
The Citation Group
technologies like Snowflake, BigQuery, and Redshift. Experience in open-source technologies like Spark, Kafka, and Beam. Good understanding of cloud providers – AWS, Microsoft Azure, Google Cloud. Familiarity with dbt, Delta Lake, and Databricks. Experience working in an agile environment. Here’s a taste of the perks we roll out for our extraordinary team members: 25 Days of Holiday, plus 8 Bank Holidays: We …
London, England, United Kingdom Hybrid / WFH Options
Smart Communications group
to learn and stay updated with the latest technologies in the field of data engineering. Strong organizational and time management skills. Advantageous skills/experience: Hands-on experience with Databricks, including development, implementation, and optimization in large-scale environments. Experience with AWS. Experience with SQL Server SSIS packages. Experience with reporting tools such as Amazon QuickSight and Power BI. SMART Values: Speak Openly …
London, England, United Kingdom Hybrid / WFH Options
Howden Group Holdings
engineering, software testing, or test automation. Familiarity with testing data pipelines, ETL/ELT workflows, and big data solutions. Experience with cloud-based data platforms, preferably Azure Databricks, Azure Data Factory, or Synapse Analytics. Basic experience with test automation frameworks (e.g., PyTest, Great Expectations, DBT tests, or similar). Proficiency in SQL and scripting languages (e.g., Python …
and AI-102 - DESIRABLE. Years of Experience: 5+ years working with production data workloads in Azure - ESSENTIAL. Other Requirements: Proficiency in Azure data services (Azure Data Factory, Azure Databricks, Azure Synapse Analytics, etc.) - ESSENTIAL. Experience with AI development using Azure Machine Learning - ESSENTIAL. Strong programming skills in languages such as Python, SQL, or C# - ESSENTIAL. CORE COMPETENCIES & SKILLS: Expertise in …
working with third-party providers. Knowledge of Lloyd's Insurance and Reinsurance, Finance GAAP, Actuarial Reserves, Regulatory Reports, databases (Data Warehouse, T-SQL, SSIS, SQL Server 2019), ETL processes with Databricks, programming languages (e.g., Python), AWS, Agile methodologies, Jira, and MS Office products. Educational background with a BSc degree or higher in a relevant field. What is a Must Have? Bachelor …
data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking, and matching. Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle cloud, R, and Python. Additional Information: At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being …
languages such as Python or R, with extensive experience with LLMs, ML algorithms, and models. Experience with cloud services like Azure ML Studio, Azure Functions, Azure Pipelines, MLflow, Azure Databricks, etc., is a plus. Experience working in Azure/Microsoft environments is considered a real plus. Proven understanding of data science methods for analyzing and making sense of research data …