write clean, scalable, robust code using Python or similar programming languages. Background in software engineering a plus. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks …
and Java Expertise in building modern data pipelines and ETL (extract, transform, load) processes using tools such as Apache Kafka and Apache NiFi Proficient in programming languages like Java, Scala, or Python Experience or expertise using, managing, and/or testing API Gateway tools and REST APIs Experience in traditional database and data warehouse products such as Oracle, MySQL, etc. …
the payments industry. Experience using Azure Databricks. Experience using containerised services such as Docker/Kubernetes. Experience using IaC tools such as Terraform/Bicep. Experience using programming languages: Scala, PowerShell, YAML. Comprehensive payments industry training by in-house and industry experts. Excellent performance-based earning opportunity, including OKR-driven bonuses. Future opportunity for equity, rewarded to high performers. Personal …
Strong understanding of DevOps methodologies and principles. Solid understanding of data warehousing, data modeling, and data integration principles. Proficiency in at least one scripting/programming language (e.g., Python, Scala, Java). Experience with SQL and NoSQL databases. Familiarity with data quality and data governance best practices. Strong analytical and problem-solving skills. Excellent communication, interpersonal, and presentation skills. Desired …
services (AWS, GCP, or Azure) Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures Proficiency in SQL and at least one programming language (e.g., Python, Scala, or Java) Demonstrated experience owning complex technical systems end-to-end, from design through production Excellent communication skills with the ability to explain technical concepts to both technical and non …
Databricks on Azure or AWS. Databricks Components: Proficient in Delta Lake, Unity Catalog, MLflow, and other core Databricks tools. Programming & Query Languages: Strong skills in SQL and Apache Spark (Scala or Python). Relational Databases: Experience with on-premises and cloud-based SQL databases. Data Engineering Techniques: Skilled in Data Governance, Architecture, Data Modelling, ETL/ELT, Data Lakes, Data …
Relational Databases and Data Warehousing concepts. Experience of Enterprise ETL tools such as Informatica, Talend, DataStage or Alteryx. Project experience using any of the following technologies: Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross and multi-platform experience. Team building and leading. You must be: Willing to work on client sites, potentially for extended periods. Willing to travel for work …
degree or higher in an applicable field such as Computer Science, Statistics, Maths or similar Science or Engineering discipline Strong Python and other programming skills (Java and/or Scala desirable) Strong SQL background Some exposure to big data technologies (Hadoop, Spark, Presto, etc.) NICE TO HAVES OR EXCITED TO LEARN: Some experience designing, building and maintaining SQL databases (and …
5+ years of hands-on experience with big data technology, systems and tools such as AWS, Hadoop, Hive, and Snowflake Expertise with common Software Engineering languages such as Python, Scala, Java, SQL and a proven ability to learn new programming languages Experience with workflow orchestration tools such as Airflow Detailed problem-solving approach, coupled with a strong sense of ownership …
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
security, compliance, and data governance standards (e.g. GDPR, RBAC) Mentor junior engineers and guide technical decisions on client engagements ✅ Ideal Experience Strong in Python and SQL, plus familiarity with Scala or Java Experience supporting AI/ML workflows and working with Data Scientists Exposure to cloud platforms: AWS, Azure, or GCP Hands-on with modern data tooling: Spark, Databricks, Snowflake …
Job Description We are seeking an experienced and visionary Head of Data Engineering to lead our dynamic Data Engineering Team. The successful candidate will oversee a team of approximately 40 Data …
Associate's degree or higher in engineering, computer science, or related field and 5+ years of experience as a DevOps/Cloud/Software engineer -OR- 8+ years of experience as a DevOps/Cloud/Software engineer Proficiency in programming …
Senior Data Engineer Hybrid 3 days per week in the office Base salary up to £70,000 My client, a financial services institution, is seeking a dynamic Senior Data Engineer to help shape the future of data engineering in an …
South East London, London, United Kingdom Hybrid / WFH Options
TEN10 SOLUTIONS LIMITED
for data pipelines, data quality, and data transformation logic. Use tools like Azure Deequ, Spark, and Databricks to ensure data accuracy and completeness. Write robust, scalable test scripts in Scala, Python, and Java. Integrate testing into CI/CD pipelines and support infrastructure automation. Collaborate with data engineers, architects, and analysts to align QA efforts with data strategies. What … and a passion for quality in data. If you meet most of these, we'd love to hear from you: Must-Have Skills: Eligible for SC clearance. Development background in Scala, Python, or Java Solid understanding of data engineering practices and data validation techniques. Experience using test automation frameworks for data pipelines and ETL workflows Strong communication and stakeholder management skills.
London, South East, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Investment Banking - London/Hybrid (Data Engineer, SQL Data Engineer, Java, Python, Spark, Scala, SQL, Snowflake, OO programming, Snowflake, Databricks, Data Fabric, design patterns, SOLID principles, ETL, Unit testing, NUnit, MSTest, JUnit, Microservices Architecture, Continuous Integration, Azure DevOps, AWS, Jenkins, Agile, Data Engineer, SQL Data Engineer) We have several fantastic new roles including a Data Engineer position to …
focus on cloud-based data pipelines and architectures Strong expertise in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management Proficiency in Python, SQL, Scala, or Java Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure Data Factory Strong understanding of data architecture principles, data modelling, and data governance Experience with …
GitHub proficiency Strong organizational, analytical, problem-solving, and communication skills Comfort working with remote teams and distributed delivery models Additional skills that are a plus: Programming languages such as Scala, Rust, Go, Angular, React, Kotlin Database management with PostgreSQL Experience with Elasticsearch, observability tools like Grafana and Prometheus What this role can offer Opportunity to deepen understanding of AI and …
you: 5+ years of experience in software development, with a strong focus on backend technologies and building distributed services. Proficiency in one or more programming languages including Java, Python, Scala or Golang. Experience with columnar, analytical cloud data warehouses (e.g., BigQuery, Snowflake, Redshift) and data processing frameworks like Apache Spark is essential. Experience with cloud platforms like AWS, Azure, or …
3+ years of experience in framework development and building integration layers to solve complex business use cases. Technical Skills Strong coding skills in one or more programming languages - Python, Scala, Spark or Java Experience in working with petabyte-scale data sets and developing integration layer solutions in Databricks, Snowflake or similar large platforms. Experience with cloud-based data warehousing, transformation …
strategy. Data Science or AI/ML solution architecture. Proficiency in development/languages related to data processing, data manipulation, sampling, and reporting (Python/SQL/Java/Scala). Experience working with imbalanced datasets and applying appropriate techniques. Experience with time series data, including preprocessing, feature engineering, and forecasting. Experience with outlier detection and anomaly detection. Experience working …
nature. You know how to branch, commit, and collaborate cleanly. Bonus Skills (nice to have): Apache Hadoop, Spark/Docker, Kubernetes/Grafana, Prometheus, Graylog/Jenkins/Java, Scala/Shell scripting Team Our Tech Stack We build with the tools we love (and we love good tools): TypeScript, Node.js, React, Python, SQL, Scala, Java, Docker, Kubernetes, AWS, GCP …