City Of London, England, United Kingdom Hybrid / WFH Options
CipherTek Recruitment
Expertise in Azure DevOps and GitHub Actions. Familiarity with Databricks CLI and Databricks Job Bundle. Strong programming skills in Python and SQL; familiarity with Scala is a plus. Solid understanding of AI/ML algorithms, model training, evaluation (including hyperparameter tuning), deployment, monitoring, and governance. Experience in handling large datasets More ❯
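A minimal sketch of the kind of model training and hyperparameter tuning work this listing describes, assuming a Spark MLlib stack in Scala; the table name, feature columns, and output path are hypothetical.

```scala
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.evaluation.BinaryClassificationEvaluator
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.ml.tuning.{CrossValidator, ParamGridBuilder}
import org.apache.spark.sql.SparkSession

object TuningSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("hyperparameter-tuning-sketch").getOrCreate()

    // Hypothetical training table with two numeric features and a binary label.
    val training = spark.table("ml.training_events")

    val assembler = new VectorAssembler()
      .setInputCols(Array("feature_a", "feature_b"))
      .setOutputCol("features")

    val lr = new LogisticRegression().setLabelCol("label")
    val pipeline = new Pipeline().setStages(Array(assembler, lr))

    // Grid of regularisation settings to search over.
    val grid = new ParamGridBuilder()
      .addGrid(lr.regParam, Array(0.01, 0.1))
      .addGrid(lr.elasticNetParam, Array(0.0, 0.5))
      .build()

    val cv = new CrossValidator()
      .setEstimator(pipeline)
      .setEvaluator(new BinaryClassificationEvaluator().setLabelCol("label"))
      .setEstimatorParamMaps(grid)
      .setNumFolds(3)

    // Picks the best parameter combination by cross-validation, then persists the model.
    val model = cv.fit(training)
    model.write.overwrite().save("/models/lr_tuned")
    spark.stop()
  }
}
```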
London, England, United Kingdom Hybrid / WFH Options
DEPOP
and real-time streaming architectures. Deep understanding of data warehousing concepts, ETL/ELT processes, and analytics engineering. Strong programming skills, particularly in Python, Scala or Java, and a solid understanding of cloud platforms (AWS, GCP, or Azure). Experience working with privacy-sensitive datasets and implementing robust data observability More ❯
London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Expertise in Azure DevOps and GitHub Actions. Familiarity with Databricks CLI and Databricks Job Bundle. Strong programming skills in Python and SQL; familiarity with Scala is a plus. Solid understanding of AI/ML algorithms, model training, evaluation (including hyperparameter tuning), deployment, monitoring, and governance. Experience in handling large datasets More ❯
role. Strong background in financial services, particularly in market risk, counterparty credit risk, or risk analytics. Proficiency in modern programming languages (e.g., Java, Python, Scala) and frameworks. Experience with cloud platforms (AWS, Azure, or GCP). Deep understanding of the software development lifecycle (SDLC), agile methodologies, and DevOps practices. Preferred Skills More ❯
experience as a Data Engineer/Software Engineer/ML Engineer/AI Apps on an open-source tech stack with Python, Java/Scala, Spark, cloud platforms, and MPP database technologies. Location: GB-GB-London. Work Locations: GB London, 20 Gracechurch Street, London EC3V 0BG. Job Field: Project & Change Management More ❯
optimal solutions - Strong SQL skills with experience across relational and non-relational databases - Proficiency in a modern programming language such as Python, Java or Scala - Hands-on experience using dbt for pipeline development and transformation - Familiarity with cloud platforms such as AWS, Azure or GCP - Knowledge of orchestration tooling (e.g. More ❯
problem-solving, and communication skills. Comfort working with remote teams and distributed delivery models. Additional skills that are a plus: programming languages such as Scala, Rust, Go, Angular, React, Kotlin; database management with PostgreSQL; experience with ElasticSearch and observability tools like Grafana and Prometheus. What this role can offer: Opportunity to More ❯
databases. Solid experience in building scalable and reliable data pipelines for data transformation and feature extraction. Solid proficiency in scientific programming languages like Python, Scala, Java, or SQL. Solid experience with cloud platforms (AWS, GCP, Azure) and cloud services for data storage, processing, and analytics. Nice-to-Have Requirements: Familiarity More ❯
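A minimal sketch of the kind of data transformation and feature extraction pipeline described above, assuming a Spark/Scala stack; the input path, output path, and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object FeatureExtractionSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("feature-extraction-sketch").getOrCreate()

    // Hypothetical raw events dataset; column names are illustrative only.
    val events = spark.read.parquet("s3://example-bucket/raw/events/")

    // Derive simple per-user features: event counts and recency.
    val features = events
      .groupBy("user_id")
      .agg(
        count("*").as("event_count"),
        max("event_ts").as("last_event_ts")
      )
      .withColumn("days_since_last_event", datediff(current_date(), col("last_event_ts")))

    // Persist the feature table for downstream training or analytics.
    features.write.mode("overwrite").parquet("s3://example-bucket/features/user_activity/")
    spark.stop()
  }
}
```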
Leeds, England, United Kingdom Hybrid / WFH Options
KPMG United Kingdom
a client delivery team, working to the highest technical standards. What will you need to do it? Experience in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc. Experience with the design, build and maintenance More ❯
with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar Strong programming skills in languages such as SQL, Python, Go or Scala Demonstrable use and an understanding of effective use of AI tooling in your development process A growth mindset and eagerness to work in a fast More ❯
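A minimal sketch of stream processing with the Spark and Kafka tooling named above, written in Scala; the broker address, topic name, and output paths are hypothetical, and the spark-sql-kafka connector is assumed to be on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("kafka-stream-sketch").getOrCreate()

    // Read a hypothetical "orders" topic as a streaming DataFrame.
    val orders = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "orders")
      .load()
      .selectExpr("CAST(value AS STRING) AS payload", "timestamp")

    // Count events per one-minute window, tolerating five minutes of lateness.
    val counts = orders
      .withWatermark("timestamp", "5 minutes")
      .groupBy(window(col("timestamp"), "1 minute"))
      .count()

    // Append windowed counts to a Parquet sink with checkpointing.
    counts.writeStream
      .format("parquet")
      .outputMode("append")
      .option("checkpointLocation", "/tmp/checkpoints/orders")
      .start("/tmp/tables/order_counts")
      .awaitTermination()
  }
}
```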
DynamoDB, Cassandra) Knowledge of node-based architecture, graph databases and languages – Neptune, Neo4j, Gremlin, Cypher. Experience: 5+ years of experience with Databricks, Spark, Scala, PySpark, Python. 5+ years of experience in SQL and database technologies like Snowflake or equivalent. 3+ years of experience with data and ETL programming (Ab More ❯
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
PA Consulting
ll have: Experience in the design and deployment of production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, Scala, Spark, SQL. Perform tasks such as writing scripts, extracting data using APIs, writing SQL queries etc. Work closely with other engineering teams to integrate data More ❯
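A minimal sketch of an ingestion-to-consumption pipeline of the kind this listing describes, assuming Spark with Scala; the file paths, view name, and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object IngestToConsumeSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ingest-to-consume-sketch").getOrCreate()

    // Ingest: hypothetical landing-zone CSV extract.
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("s3://example-bucket/landing/trades.csv")

    // Transform: basic cleansing, then expose the result to SQL.
    raw.dropDuplicates("trade_id")
      .na.drop(Seq("trade_id", "notional"))
      .createOrReplaceTempView("trades_clean")

    // Consume: an analyst-style SQL query over the curated view.
    val summary = spark.sql(
      """SELECT book, SUM(notional) AS total_notional
        |FROM trades_clean
        |GROUP BY book
        |ORDER BY total_notional DESC""".stripMargin)

    summary.write.mode("overwrite").parquet("s3://example-bucket/curated/trade_summary/")
    spark.stop()
  }
}
```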
with data modeling, warehousing and building ETL pipelines - Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) - Experience with one or more scripting languages (e.g., Python, KornShell) PREFERRED QUALIFICATIONS - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience with More ❯
with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar Strong programming skills in languages such as SQL, Python, Go or Scala Demonstrable use and an understanding of effective use of AI tooling in your development process A growth mindset and eagerness to work in a fast More ❯
databases. Solid experience in building scalable and reliable data pipelines for data transformation and feature extraction. Solid proficiency in scientific programming languages like Python, Scala, Java, or SQL. Solid experience with cloud platforms (AWS, GCP, Azure) and cloud services for data storage, processing, and analytics. Nice-to-Have Requirements Familiarity More ❯
London, England, United Kingdom Hybrid / WFH Options
Datatech Analytics
experience writing ETL scripts and code to ensure ETL processes perform optimally. Experience in other programming languages for data manipulation (e.g., Python, Scala). Extensive experience of data engineering and the development of data ingestion and transformation routines and services using modern, cloud-based approaches and technologies. Understanding More ❯
minimum of 2 years of Ops or DevOps experience in trading environments, with expertise in large-scale distributed systems; recent work in Java, Go, or Scala is preferred. Hands-on experience with container orchestration (Kubernetes, Docker, etc.) and cloud infrastructure, especially AWS; familiarity with Infrastructure-as-Code tools like Terraform or More ❯
Experience of using agile delivery tools such as JIRA, Pivotal, Collab, Confluence. Experience of engineering based on the likes of SQL, SSIS, Python, Java, Scala, XML/FpML and Power BI. Data architecture and data lineage, including an understanding of AI. Testing/quality engineering; experience of test automation will be More ❯
Services, Microsoft Azure, Databricks, Snowflake. Architectural and/or feature knowledge of one or more of the following Programming Languages/Packages: Python, Java, Scala, Spark, SQL, NoSQL databases. Experience working within Agile delivery methodologies. Proven ability to be successful in a matrixed organisation, and to enlist support and commitment More ❯
Manchester, England, United Kingdom Hybrid / WFH Options
Capgemini
experience) AWS Certified Data Analytics – Specialty (preferred). Experience with multi-cloud environments (Azure, GCP) is a plus. Strong programming skills in Python, SQL, or Scala for data engineering use cases. Good to have: Currently working in a major Consulting firm and/or in industry, but having a Consulting background More ❯
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Capgemini
experience) AWS Certified Data Analytics – Specialty (preferred). Experience with multi-cloud environments (Azure, GCP) is a plus. Strong programming skills in Python, SQL, or Scala for data engineering use cases. Good to have: Currently working in a major Consulting firm and/or in industry, but having a Consulting background More ❯
London, England, United Kingdom Hybrid / WFH Options
Capgemini
experience) AWS Certified Data Analytics – Specialty (preferred). Experience with multi-cloud environments (Azure, GCP) is a plus. Strong programming skills in Python, SQL, or Scala for data engineering use cases. Good to have: Currently working in a major Consulting firm and/or in industry, but having a Consulting background More ❯
as a Data Engineer or in a similar role focused on data pipeline development. Strong programming skills in languages such as Python, Java, or Scala. Experience with SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, MySQL, Cassandra). Familiarity with cloud data platforms such as AWS, Google Cloud, or Azure. Experience with More ❯
in data engineering or related roles. Bachelor’s in CS, Engineering, Mathematics, Finance, etc. Proficiency in Python, SQL, and one or more: R, Java, Scala. Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB). Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools More ❯
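A minimal sketch of bridging a relational database and big data tooling as described above, assuming Spark with Scala and a PostgreSQL JDBC driver on the classpath; the connection details, table name, and output path are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object JdbcReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("jdbc-read-sketch").getOrCreate()

    // Hypothetical PostgreSQL connection; credentials are read from the environment.
    val orders = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://db-host:5432/analytics")
      .option("dbtable", "public.orders")
      .option("user", "reporting_user")
      .option("password", sys.env.getOrElse("DB_PASSWORD", ""))
      .load()

    // Hand the relational extract to downstream big data tooling as Parquet.
    orders.write.mode("overwrite").parquet("s3://example-bucket/extracts/orders/")
    spark.stop()
  }
}
```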