… GeoServer). Technical Skills: Expertise in big data frameworks and technologies (e.g., Hadoop, Spark, Kafka, Flink) for processing large datasets. Proficiency in programming languages such as Python, Java, or Scala, with a focus on big data frameworks and APIs. Experience with cloud services and technologies (AWS, Azure, GCP) for big data processing and platform deployment. Strong knowledge of data warehousing …
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
… security, compliance, and data governance standards (e.g. GDPR, RBAC). Mentor junior engineers and guide technical decisions on client engagements. ✅ Ideal Experience: Strong in Python and SQL, plus familiarity with Scala or Java. Experience supporting AI/ML workflows and working with Data Scientists. Exposure to cloud platforms: AWS, Azure, or GCP. Hands-on with modern data tooling: Spark, Databricks, Snowflake …
… Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL and at least one programming language such as Python, Scala, or Java. Demonstrated experience owning and delivering complex systems from architecture through implementation. Excellent communication skills with the ability to explain technical concepts to both technical and non-technical …
… complex real-world problems. Drive innovation in fraud analytics, data engineering, and AI applications. What You Bring: Proven experience in Java or a similar OOP stack (e.g., C#, Kotlin, Scala). Strong grasp of REST APIs, microservices, and containerisation (Docker/Kubernetes). Agile mindset with strong software engineering fundamentals. Passion for innovation, learning, and leading by example. Familiarity with DevOps tools …
… to their growth and development. Apply agile methodologies (Scrum, pair programming, etc.) to deliver value iteratively. Essential Skills & Experience: Extensive hands-on experience with programming languages and frameworks such as Python, Scala, Spark, and SQL. Strong background in building and maintaining data pipelines and infrastructure. In-depth knowledge of cloud platforms and native cloud services (e.g., AWS, Azure, or GCP). Familiarity with …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
… Azure Data Lake Storage, Azure SQL Database. Solid understanding of data modeling, ETL/ELT, and warehousing concepts. Proficiency in SQL and one or more programming languages (e.g., Python, Scala). Exposure to Microsoft Fabric, or a strong willingness to learn. Experience using version control tools like Git and knowledge of CI/CD pipelines. Familiarity with software testing methodologies and …
Leeds, England, United Kingdom Hybrid / WFH Options
Anson McCade
… review standards and agile delivery. Contributing to technical strategy and mentoring other engineers. What You’ll Need: Strong experience in data engineering with expertise in languages and frameworks such as Python, Scala, or Spark. Proficiency in designing and building data pipelines, working with both structured and unstructured data. Experience with cloud platforms (AWS, Azure, or GCP), using native services for data workloads. …
… estimation, and solution assurance activities. Required Experience: Strong experience leading data engineering teams in a consulting or delivery environment. Hands-on expertise in modern data engineering languages - Python, Scala, or Java. Deep knowledge of SQL and distributed data processing frameworks. Experience with cloud-based platforms such as Azure, AWS, or GCP, particularly Databricks, Informatica, or similar. Understanding of …
… define requirements and deliver innovative solutions. Support internal capability development by sharing your expertise and experience. What We're Looking For: Strong hands-on experience with Python, Java, or Scala. Proficiency in cloud environments (AWS, Azure, or GCP) and big data tech (Spark, Hadoop, Airflow). Solid understanding of SQL, ETL/ELT approaches, and data modelling techniques. Experience building CI…
City of London, London, United Kingdom Hybrid / WFH Options
Eden Smith Group
… as Actuarial and Finance, to support their data presentation requirements and deliver data for visualisation solutions such as Power BI. Key skills required: Expertise in Python, Java, Scala, or other programming languages used to build data pipelines, implement data transformations, and automate data workflows. Strong knowledge of Power BI. Strong SQL experience. Familiarity with technical data structures, data …
London, South East, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Investment Banking - London/Hybrid (Data Engineer, SQL Data Engineer, Java, Python, Spark, Scala, SQL, Snowflake, OO programming, Databricks, Data Fabric, design patterns, SOLID principles, ETL, Unit testing, NUnit, MSTest, JUnit, Microservices Architecture, Continuous Integration, Azure DevOps, AWS, Jenkins, Agile) We have several fantastic new roles including a Data Engineer position to …
… solving and logical thinking skills. Ability to work independently in a remote environment. Preferred Qualifications: Exposure to tools like Apache Airflow, Spark, or Kafka. Basic experience with Python or Scala for scripting and automation. Knowledge of cloud platforms like AWS, Azure, or GCP. Previous academic or personal projects related to data pipeline development or big data handling. Internship Details: Duration …
Bethesda, Maryland, United States Hybrid / WFH Options
Gridiron IT Solutions
… Statistics, Mathematics, or related field. 7+ years of experience in data science or related field, plus an additional 3+ years' experience in a complementary function. Strong programming skills in Python, Scala, and/or UNIX shell scripting. Expertise in machine learning techniques and statistical analysis. Proficiency in SQL and NoSQL databases. Experience with big data platforms such as Hadoop, Spark, and …
… be helpful in this role. While SQL is a language everybody on the team knows very well, Snowflake is fast evolving to allow native integrations with Python, Java, and Scala via Snowpark; if you have experience with Spark, you will definitely be able to put those skills to good use (see the sketch below). DataOps or DevOps for data teams is getting ever more …
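The Spark-to-Snowpark point above is easy to see in code. Here is a minimal sketch using Snowpark's Python DataFrame API, assuming hypothetical connection parameters and an illustrative ORDERS table (neither comes from the listing); the chained filter/group_by style is deliberately close to PySpark, which is why Spark experience transfers so directly:

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

# Placeholder credentials -- every value here is a hypothetical stand-in.
connection_parameters = {
    "account":   "<account_identifier>",
    "user":      "<user>",
    "password":  "<password>",
    "warehouse": "<warehouse>",
    "database":  "<database>",
    "schema":    "<schema>",
}

# Open a Snowpark session (the rough analogue of a SparkSession).
session = Session.builder.configs(connection_parameters).create()

# ORDERS is a hypothetical table used only for illustration; the
# filter / group_by / count chain reads almost exactly like PySpark.
high_value_by_region = (
    session.table("ORDERS")
    .filter(col("AMOUNT") > 100)
    .group_by("REGION")
    .count()
)
high_value_by_region.show()
```

Apart from the session setup, the same pipeline in PySpark would differ only in minor naming (groupBy vs group_by), with the computation pushed down to Snowflake rather than a Spark cluster.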
… education) in a STEM discipline. Proven experience in software engineering and development, and a strong understanding of computer systems and how they operate. Hands-on experience in Java, Spark, and Scala. Production-scale, hands-on experience writing data pipelines using Spark or other distributed real-time/batch processing frameworks. Strong skill set in SQL/databases. Strong …
… governance. Strong expertise in balancing trade-offs within complex distributed systems, focusing on data quality, performance, reliability, availability, and security. Proficient in software engineering with modern languages (e.g. Python, Scala, Java), applying best practices to create maintainable, scalable, and robust code. A continuous learner, up-to-date with the latest technology trends, with the ability to assess new technologies pragmatically …
… Azure DevOps). Excellent communication and stakeholder management skills. Bonus Points for: Previous experience in a consultancy environment. Hands-on coding experience in additional languages like Python, Ruby, Scala, PHP, or C++. Knowledge of performance testing tools like JMeter, Gatling, K6, or Neoload. What's in It for You? At Ten10, we believe in recognizing and rewarding great …
… Purview or equivalent for data governance and lineage tracking. Experience with data integration, MDM, governance, and data quality tools. Hands-on experience with Apache Spark, Python, SQL, and Scala for data processing. Strong understanding of Azure networking, security, and IAM, including Azure Private Link, VNETs, Managed Identities, and RBAC. Deep knowledge of enterprise-scale data architecture patterns, including …
… champion a culture of innovation and challenge. We have over 300 tech experts across our teams, all using the latest tools and technologies including Docker, Kubernetes, AWS, Kafka, Java, Scala, Python, iOS, Android, .NET Core, Swift, Kotlin, Node.js, and MongoDB. There's something for everyone. We're a place of opportunity. You'll have the tools and autonomy to drive your own career …
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
… in designing and implementing data warehousing solutions using Snowflake and AWS. The ideal candidate must have: Strong experience as an AWS Data Engineer. Software engineering background. Coding languages: Java, Scala, C#, or Python. Lead the design and implementation of Snowflake [and Redshift] based data warehousing solutions within an AWS environment. Extensive experience in AWS services, e.g. EC2, S3, RDS, DynamoDB …
Manchester, North West, United Kingdom Hybrid / WFH Options
IO Associates
… tools and services (preferably Azure). Required Skills & Experience: Proven experience as a Data Engineer on cloud-based projects. Strong hands-on skills with Databricks, Apache Spark, and Python or Scala. Proficient in SQL and working with large-scale data environments. Experience with Delta Lake, Azure Data Lake, or similar technologies. Familiarity with version control, CI/CD, and infrastructure-as-code …
Columbia, South Carolina, United States Hybrid / WFH Options
Systemtec Inc
… SageMaker, Unified Studio, R Studio/Posit Workbench, R Shiny/Posit Connect, Posit Package Manager, AWS Data Firehose, Kafka, Hive, Hue, Oozie, Sqoop, Git/GitHub Actions, IntelliJ, Scala. Responsibilities of the Data Engineer (AWS): Act as an internal consultant, advocate, mentor, and change agent providing expertise and technical guidance on complex projects. Work closely with customers, business analysts …