write clean, scalable, robust code using Python or similar programming languages. Background in software engineering a plus. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks …
technical skills, knowledge and professional qualifications provided: Databases – has a deep understanding of relational databases. Use DevOps methods and work in an Agile environment. Programming – expertise in Python, Java, Scala, or other programming languages used to build data pipelines, implement data transformations, and automate data workflows. Has expertise in SQL. AI and Machine Learning knowledge as it applies to end …
Strong understanding of DevOps methodologies and principles. Solid understanding of data warehousing, data modeling and data integration principles. Proficiency in at least one scripting/programming language (e.g. Python, Scala, Java). Experience with SQL and NoSQL databases. Familiarity with data quality and data governance best practices. Strong analytical and problem-solving skills. Excellent communication, interpersonal and presentation skills. Desired …
GeoServer). Technical Skills: Expertise in big data frameworks and technologies (e.g., Hadoop, Spark, Kafka, Flink) for processing large datasets. Proficiency in programming languages such as Python, Java, or Scala, with a focus on big data frameworks and APIs. Experience with cloud services and technologies (AWS, Azure, GCP) for big data processing and platform deployment. Strong knowledge of data warehousing …
Relational Databases and Data Warehousing concepts. Experience of Enterprise ETL tools such as Informatica, Talend, DataStage or Alteryx. Project experience using any of the following technologies: Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross- and multi-platform experience. Team building and leading. You must be: Willing to work on client sites, potentially for extended periods. Willing to travel for work …
degree or higher in an applicable field such as Computer Science, Statistics, Maths or a similar Science or Engineering discipline. Strong Python and other programming skills (Java and/or Scala desirable). Strong SQL background. Some exposure to big data technologies (Hadoop, Spark, Presto, etc.) NICE TO HAVES OR EXCITED TO LEARN: Some experience designing, building and maintaining SQL databases (and …
are looking for data engineers who have a variety of different skills which include some of the below. Strong proficiency in at least one programming language (Python, Java, or Scala). Extensive experience with cloud platforms (AWS, GCP, or Azure). Experience with: data warehousing and lake architectures; ETL/ELT pipeline development; SQL and NoSQL databases; distributed computing frameworks (Spark, Kinesis …
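As a rough illustration of the ETL/ELT pipeline development these listings call for, below is a minimal PySpark sketch of a read–transform–write job. The bucket paths and column names (orders.csv, order_id, order_ts, amount) are hypothetical placeholders, not taken from any particular employer's stack.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal ETL sketch: read raw CSV, apply simple transformations, write partitioned Parquet.
# All paths and column names are hypothetical placeholders.
spark = SparkSession.builder.appName("example-etl").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders.csv")

cleaned = (
    raw
    .withColumn("amount", F.col("amount").cast("double"))  # enforce a numeric type
    .withColumn("order_date", F.to_date("order_ts"))       # derive a partition column
    .dropDuplicates(["order_id"])                          # basic de-duplication
)

(
    cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/orders/")
)
```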
technical skills and knowledge Proficiency in Programming Languages: Strong proficiency in Python is essential, along with experience in Bash/Shell scripting. Familiarity with additional languages such as Java, Scala, R, or Go is a plus. Understanding of Machine Learning Fundamentals: A solid understanding of machine learning concepts, including algorithms, data pre-processing, model evaluation, and training. Familiarity with ML …
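To make the machine learning fundamentals item above concrete, here is a minimal scikit-learn sketch covering pre-processing, training, and evaluation; the dataset and model choice are arbitrary and purely illustrative.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Toy end-to-end example: split, pre-process, train, evaluate.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)  # fit the scaler on training data only
X_test = scaler.transform(X_test)        # reuse the same scaling for the test set

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```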
Mandatory: Proficient in either GCP (Google) or AWS cloud. Hands-on experience in designing and building data pipelines using Hadoop and Spark technologies. Proficient in programming languages such as Scala, Java, or Python. Experienced in designing, building, and maintaining scalable data pipelines and applications. Hands-on experience with Continuous Integration and Deployment strategies. Solid understanding of Infrastructure as Code tools.
to their growth and development. Apply agile methodologies (Scrum, pair programming, etc.) to deliver value iteratively. Essential Skills & Experience: Extensive hands-on experience with programming languages such as Python, Scala, Spark, and SQL. Strong background in building and maintaining data pipelines and infrastructure. In-depth knowledge of cloud platforms and native cloud services (e.g., AWS, Azure, or GCP). Familiarity with …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Azure Data Lake Storage. Azure SQL Database. Solid understanding of data modeling, ETL/ELT, and warehousing concepts. Proficiency in SQL and one or more programming languages (e.g., Python, Scala). Exposure to Microsoft Fabric, or a strong willingness to learn. Experience using version control tools like Git and knowledge of CI/CD pipelines. Familiarity with software testing methodologies and …
flow issues, optimize performance, and implement error-handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity …
Farnborough, Hampshire, South East, United Kingdom
Peregrine
flow issues, optimize performance, and implement error-handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity …
data platform that powers smarter decisions, better insights, and streamlined operations. Key skills and responsibilities: * Proven experience in data engineering and data platform development * Strong programming skills in Python, Java, Scala, or similar * Advanced SQL and deep knowledge of relational databases * Hands-on experience with ETL tools and building robust data pipelines * Familiarity with data science, AI/ML integration, and …
Leeds, England, United Kingdom Hybrid / WFH Options
Anson McCade
review standards and agile delivery. Contributing to technical strategy and mentoring other engineers. What You’ll Need: Strong experience in data engineering with expertise in languages such as Python, Scala, or Spark. Proficiency in designing and building data pipelines, working with both structured and unstructured data. Experience with cloud platforms (AWS, Azure, or GCP), using native services for data workloads.
serving layers. Experience with data lakehouse architecture, schema design, and GDPR-compliant solutions. Working knowledge of DevOps tools and CI/CD processes. Bonus Points For: Development experience in Scala or Java. Familiarity with Cloudera, Hadoop, Hive, and the Spark ecosystem. Understanding of data privacy regulations, including GDPR, and experience working with sensitive data. Ability to learn and adapt to new technologies …
driven APIs, and designing database schemas and queries to meet business requirements. A passion and proven background in picking up and adopting new technologies on the fly. Exposure to Scala, or functional programming generally. Exposure to highly concurrent, asynchronous backend technologies, such as Ktor, http4k, http4s, Play, RxJava, etc. Exposure to DynamoDB or similar NoSQL databases, such as Cassandra, HBase …
knowledge sharing across the team. What We're Looking For: Strong hands-on experience across AWS Glue, Lambda, Step Functions, RDS, Redshift, and Boto3. Proficient in one of Python, Scala or Java, with strong experience in Big Data technologies such as Spark and Hadoop. Practical knowledge of building real-time event streaming pipelines (e.g., Kafka, Spark Streaming, Kinesis). Proven …
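For context on the AWS services named above, here is a minimal Boto3 sketch that starts a Glue job run and triggers a Step Functions execution. The job name and state machine ARN are invented placeholders, and a real pipeline would add error handling and configuration.

```python
import json
import boto3

# Hypothetical resource identifiers, for illustration only.
GLUE_JOB_NAME = "example-nightly-etl"
STATE_MACHINE_ARN = "arn:aws:states:eu-west-1:123456789012:stateMachine:example-pipeline"

glue = boto3.client("glue")
sfn = boto3.client("stepfunctions")

# Kick off a Glue job run, passing a runtime argument.
run = glue.start_job_run(JobName=GLUE_JOB_NAME, Arguments={"--run_date": "2024-01-01"})
print("Glue job run id:", run["JobRunId"])

# Start a Step Functions execution that could orchestrate downstream steps.
execution = sfn.start_execution(
    stateMachineArn=STATE_MACHINE_ARN,
    input=json.dumps({"source": "orders", "run_date": "2024-01-01"}),
)
print("Execution ARN:", execution["executionArn"])
```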
large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go or Scala. Demonstrable understanding and effective use of AI tooling in your development process. A growth mindset and eagerness to work in a fast-paced, mission-driven environment. Good …
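As an illustration of the orchestration tooling listed above (Airflow alongside dbt), below is a minimal Airflow 2-style DAG sketch; the task ids, scripts, and dbt selector are hypothetical and would differ in any real project.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal daily DAG: an ingestion step followed by a dbt transformation step.
# Commands and identifiers are placeholders for illustration.
with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="python ingest.py --date {{ ds }}",
    )
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --select staging+",
    )

    ingest >> transform  # run the dbt models only after ingestion completes
```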
of emerging technology and trends. Provides out-of-hours support for applications to ensure the shop stays open and fully functional. Essential knowledge and skills: Proficient in Python or Scala. Familiarity with Java. Experience in a Marketing technical stack and 3rd-party tools. Broad experience of working within AWS, including infrastructure (VPC, EC2, security groups, S3, etc.) to AWS data …
such as Hadoop, Spark, and Kafka, with a strong emphasis on Java development. Proficiency in data modeling, ETL processes, and data warehousing concepts. Experience with data processing languages like Scala, Python, or SQL. Familiarity with containerization technologies (Docker) and orchestration tools (Kubernetes). Strong knowledge of software development principles, including object-oriented design, design patterns, and clean code practices. Excellent …