frameworks and practices. Understanding of machine learning workflows and how to support them with robust data pipelines. DESIRABLE LANGUAGES/TOOLS: Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks More ❯
deployment in secure and scalable environments to include AI/ML frameworks such as TensorFlow, PyTorch, or scikit-learn. Proven expertise in programming languages such as Python, Java, or Scala, with demonstrated experience in software engineering practices (e.g., version control, CI/CD pipelines, containerization). Experience building and optimizing data pipelines, ETL processes, and real-time streaming solutions using More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
CV TECHNICAL LTD
on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience More ❯
into usable formats and support project team to scale, monitor, and operate data platforms TS/SCI clearance Bachelor's degree Nice If You Have: Experience with Python, SQL, Scala, or Java Experience with UNIX or Linux, including basic commands and Shell scripting Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud Experience with distributed More ❯
business to deliver value-driven solutions What we're looking for: London/Lloyd's Market experience is essential Strong programming skills in Python and SQL; knowledge of Java or Scala is a plus Solid experience with relational databases and data modelling (Data Vault, Dimensional) Proficiency with ETL tools and cloud platforms (AWS, Azure or GCP) Experience working in Agile and More ❯
and support project team to scale, monitor, and operate data platforms TS/SCI clearance Bachelor's degree Nice If You Have: Experience in application development utilizing SQL or Scala Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud Experience with distributed data or computing tools such as Spark, Databricks, Hadoop, Hive, AWS EMR, or More ❯
of data into usable formats and support project team to scale, monitor, and operate data platforms Secret clearance Bachelor's degree Nice If You Have: Experience with Python, SQL, Scala, or Java Experience with UNIX or Linux, including basic commands and Shell scripting Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud Experience with distributed More ❯
computer science, Information Systems , Engineering , or a related field. Hands-on experience in data architecture , data engineering , or a similar role. Deep expertise in Databricks , including Spark (PySpark/Scala) , Delta Lake , and orchestration within Databricks workflows. Strong understanding of cloud infrastructure and data services on at least one major cloud platform (Azure preferred, but AWS or GCP also accepted More ❯
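As a rough, hedged illustration of the Databricks-style Spark and Delta Lake work these listings describe, below is a minimal Scala sketch; the paths, column names, and schema are hypothetical, and it assumes a SparkSession with the Delta Lake package on the classpath (provided by default on Databricks).

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, to_date}

object DeltaIngestSketch {
  def main(args: Array[String]): Unit = {
    // On Databricks a SparkSession already exists; running locally assumes
    // the Delta Lake (delta-spark) package is available.
    val spark = SparkSession.builder()
      .appName("delta-ingest-sketch")
      .getOrCreate()

    // Raw landing zone (hypothetical path and schema).
    val raw = spark.read.json("/mnt/raw/events")

    // Light cleansing plus a derived partition column.
    val cleaned = raw
      .filter(col("event_id").isNotNull)
      .withColumn("event_date", to_date(col("event_ts")))

    // Append into a curated Delta table, partitioned for query pruning.
    cleaned.write
      .format("delta")
      .mode("append")
      .partitionBy("event_date")
      .save("/mnt/curated/events")

    spark.stop()
  }
}
```

Partitioning on a derived date column is a common convention for downstream query pruning; in a real role the schema, merge/upsert logic, and orchestration would come from the platform's own requirements.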
Stevenage, Hertfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
your team through problem-solving with strong technical leadership. What you'll need Proven track record of leading engineering teams on data-intensive projects. Strong programming skills in Java, Scala, or Python. Proficiency in SQL (including extensions for analytical workloads). Deep knowledge of distributed data stores, frameworks, and ETL/ELT platforms (e.g. Azure Databricks, Informatica). Experience applying More ❯
Solr is a strong advantage. Bachelor's or master's degree in computer science, Information Systems , Engineering , or a related field. Deep expertise in Databricks , including Spark (PySpark/Scala) , Delta Lake , and orchestration within Databricks workflows. Strong understanding of cloud infrastructure and data services on at least one major cloud platform (Azure preferred, but AWS or GCP also accepted More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Azure Data Lake Storage Azure SQL Database Solid understanding of data modeling, ETL/ELT, and warehousing concepts Proficiency in SQL and one or more programming languages (e.g., Python, Scala) Exposure to Microsoft Fabric, or a strong willingness to learn Experience using version control tools like Git and knowledge of CI/CD pipelines Familiarity with software testing methodologies and More ❯
Systems, Engineering, or equivalent experience in a scientific/technical discipline. Strong experience with Python and its data ecosystem (e.g., NumPy, pandas, scikit-learn). Proficiency in Java or Scala for developing scalable backend systems and data pipelines. Solid understanding of SQL and relational databases (e.g., MySQL, PostgreSQL, Hive). Familiarity with the Apache Hadoop ecosystem (HDFS, MapReduce, YARN). More ❯
monitor, and operate data platforms Secret clearance Bachelor's degree in a Computer Science, Analytics, or Mathematics field Nice If You Have: Experience in application development utilizing SQL or Scala Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud Experience with distributed data or computing tools such as Spark, Databricks, Hadoop, Hive, AWS EMR, or More ❯
on experience with cloud platforms like AWS or Azure, including relevant services like S3, EMR, Glue, Data Factory, etc. Proficiency in SQL and one or more programming languages (Python, Scala, or Java) for data manipulation and transformation. Knowledge of data security and privacy best practices, including data access controls, encryption, and data masking techniques. Strong problem-solving and analytical skills More ❯
resolve flow issues, optimize performance, and implement error handling strategies. • Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: • Proficiency in Java and SQL. • Experience with C# and Scala is a plus. • Experience with ETL tools and big data platforms. • Knowledge of data modeling, replication, and query optimization. • Hands-on experience with SQL and NoSQL databases is desirable. • Familiarity More ❯
Reston, Virginia, United States Hybrid / WFH Options
ICF
for IaC. Emergency management domain knowledge a plus Advanced proficiency in data engineering and analytics using Python; expert-level SQL skills for data manipulation and analysis; experience with Scala preferred but not required (Python expertise can substitute) Proven experience breaking down complex ideas into manageable components Demonstrable experience developing rapid POCs and prototypes History of staying current with evolving More ❯
South West London, London, United Kingdom Hybrid / WFH Options
Client Server
a week with flexibility to work from home once a week. About you: You have experience as a Data Engineer, working on scalable systems You have Python, Java or Scala coding skills You have experience with Kafka for data streaming including Kafka Streams and KTables You have strong SQL skills (e.g. PostgreSQL, MySQL) You have strong AWS knowledge, ideally including More ❯
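As a loose illustration of the Kafka Streams and KTable usage mentioned in the listing above, here is a minimal Scala sketch; the topic names and application id are hypothetical, and it assumes the kafka-streams-scala DSL (a recent 3.x release) is on the classpath.

```scala
import java.util.Properties

import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.serialization.Serdes._
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.scala.kstream.{KStream, KTable}
import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}

object OrderCountsSketch extends App {
  val builder = new StreamsBuilder()

  // Stream of order events keyed by customer id (hypothetical topic).
  val orders: KStream[String, String] = builder.stream[String, String]("orders")

  // Continuously aggregate the stream into a KTable of counts per key.
  val orderCounts: KTable[String, Long] = orders.groupByKey.count()

  // Emit the table's changelog to an output topic (hypothetical name).
  orderCounts.toStream.to("order-counts-by-customer")

  val props = new Properties()
  props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-counts-sketch")
  props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")

  val streams = new KafkaStreams(builder.build(), props)
  streams.start()
  sys.addShutdownHook(streams.close())
}
```

The groupByKey.count() step is what turns the unbounded stream into a continuously updated KTable; writing its changelog back out to a topic is one common way to expose the aggregate to downstream consumers.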
large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar Strong programming skills in languages such as SQL, Python, Go or Scala Demonstrable use and an understanding of effective use of AI tooling in your development process A growth mindset and eagerness to work in a fast-paced, mission-driven environment Good More ❯
such as Hadoop, Spark, and Kafka, with a strong emphasis on Java development. Proficiency in data modeling, ETL processes, and data warehousing concepts. Experience with data processing languages like Scala, Python, or SQL. Familiarity with containerization technologies (Docker) and orchestration tools (Kubernetes). Strong knowledge of software development principles, including object-oriented design, design patterns, and clean code practices. Excellent More ❯
and external software solutions. Evaluate and recommend third-party vendors and partners. Support development of standards, methods, and procedures for product release readiness. Qualifications: Proficiency in Java, Python, Angular, Scala, or Go. Strong expertise in Kubernetes and Airflow. Experience with relational and NoSQL databases, plus cloud-native data services. Proven ability to work across the full software development lifecycle. Familiarity with More ❯
remote services: EC2, EMR, RDS, Redshift Experience with stream-processing systems: Kafka, Storm, Spark-Streaming, etc. Experience with object-oriented/functional scripting languages: Python, Julia, Java, C++, Scala, etc. Experience with data encryption/security features applied to data-in-transit Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working More ❯
wait. You Have: Experience in programming clean, secure, and efficient code using established principles for scripting, data analysis, automation, and data warehousing Experience programming in JavaScript, Java, Python, SQL, Scala, or Bash/Shell scripting Experience building scalable ETL/ELT workflows for reporting and analytics Experience with CI/CD practices to automate builds, testing, and deployments Experience writing More ❯