new sales activities We are looking for experience in the following skills: Relevant work experience in data science, machine learning, and business analytics Practical experience in coding languages e.g. Python, R, Scala, etc. (Python preferred) Strong proficiency in database technologies e.g. SQL, ETL, No-SQL, DW, and Big Data technologies e.g. PySpark, Hive, etc. Experienced working with structured and unstructured data e.g. text, PDFs, JPGs, call recordings More ❯
of data engineering experience - Experience with data modeling, warehousing and building ETL pipelines - Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) - Experience with one or more scripting languages (e.g., Python, KornShell) PREFERRED QUALIFICATIONS - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience with any ETL tool like Informatica More ❯
of the Hadoop ecosystem and associated technologies (e.g. Apache Spark, MLlib, GraphX, IPython, scikit-learn, Pandas, etc.) Working knowledge of writing and optimizing efficient SQL queries with Python, Hive and Scala, handling large data sets in big data environments. Experience managing large-scale datasets using Big Data and distributed processing platforms such as Hadoop, Hive and Spark Experience with Visualization Tools More ❯
ensure cost efficiency and high performance. Design and manage data lakes, data warehouses, and associated infrastructure to ensure data accuracy, integrity, and availability. Write high-quality, efficient code in Scala, Python, and Spark, following best practices for maintainability and scalability. Develop and implement robust monitoring, alerting, and error-handling mechanisms to ensure pipeline reliability. Partner with business stakeholders, product managers … Provide on-call support as part of a shared team rota to ensure platform availability. ABOUT YOU Proven expertise working with Databricks, including Unity Catalog. Strong programming skills in Scala, Python, Spark & SQL/MySQL. Solid experience with version control systems, particularly Git. Strong background in designing and optimizing complex data pipelines and infrastructure. Experience leading and mentoring technical teams More ❯
efficiency. Mentor and support junior developers, sharing knowledge and encouraging technical growth across the team. What We're Looking For: Proficiency in a JVM language such as Kotlin, Java, or Scala (or a comparable language such as C#), alongside experience with Python. Strong background in building solutions within SOA and/or microservices environments. Hands-on experience in implementing CI/CD pipelines for production services. More ❯
Full Stack Senior Software Engineer - Scala/TypeScript - 6 months - Hybrid (2 days pw) - Central LDN A Full Stack Senior Software Engineer is required to join a highly skilled engineering team with a renowned media client. You will be working with functional Scala and, while this is a backend-leaning role, you will also have experience working with TypeScript (React … preferable) and running with AWS. Scala (Essential) TypeScript (Essential) React (Preferred) AWS This is a two-stage interview process: 1 hour paired programming & 1 hour face-to-face. This is for a July start - please send on your updated CV and let's discuss More ❯
South East London, London, United Kingdom Hybrid / WFH Options
TEN10 SOLUTIONS LIMITED
Proficiency in performance testing tools like JMeter, Gatling, K6, NeoLoad, or WebLOAD. Strong coding skills in at least one language such as Java, TypeScript/JavaScript, Python, C#, Scala, or PHP. Experience designing and building automation frameworks. Familiarity with Agile development environments (SCRUM, Kanban, TDD, BDD). Implementing pipelines using common tooling such as Jenkins, ADO, GitHub Actions More ❯
West London, London, United Kingdom Hybrid / WFH Options
Client Server
experience with Agile processes and TDD You have a thorough understanding of Computer Science fundamentals such as OOP, Design Patterns, Data Structures, Algorithms Other tech in the stack includes Scala, React, Spring, Oracle, Redis, Kubernetes, Docker and Linux so previous exposure to any of these would be beneficial You're collaborative with good communication skills What's in it for More ❯
GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Understanding of data warehousing and data lake concepts and best practices. Experience with version control systems (e.g., Git). 5+ years of advanced expertise … Scala is mandated in some cases. Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies. Strong proficiency in SQL More ❯
GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Understanding of data warehousing and data lake concepts and best practices. Experience with version control systems (e.g., Git). - Proficiency in SQL, Python, and More ❯
GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some cases. Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies. Strong proficiency in SQL More ❯
Proven track record in Data Engineering and supporting the business to gain true insight from data. Experience in data integration and modelling including ELT pipelines. Strong SQL, Python/Scala/R; experience with ELT pipelines and Power BI Solid knowledge of data Lakehouse architecture and Azure services Insurance or MGA data experience preferred Strong communication, stakeholder engagement, and problem More ❯
experience working in enterprise data warehouse and analytics technologies Hands-on experience building and training machine learning models. Experience writing software in one or more languages such as Python, Scala, R, or similar with strong competencies in data structures, algorithms, and software design. Experience working with recommendation engines, data pipelines, or distributed machine learning. Experience working with deep learning frameworks More ❯
one stack”. You’ll be expected to work across a broad tech landscape: Big Data & Distributed Systems: HDFS, Hadoop, Spark, Kafka Cloud: Azure or AWS Programming: Python, Java, Scala, PySpark – you’ll need two or more, Python preferred Data Engineering Tools: Azure Data Factory, Databricks, Delta Lake, Azure Data Lake SQL & Warehousing: Strong experience with advanced SQL and database More ❯
practice. We are looking for experience in the following skills: Relevant work experience in data science, machine learning, and business analytics Practical experience in coding languages e.g. Python, R, Scala, etc. (Python preferred) Proficiency in database technologies e.g. SQL, ETL, No-SQL, DW, and Big Data technologies e.g. PySpark, Hive, etc. Experienced working with structured and unstructured data e.g. More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Atarus
data solutions 💡 What You’ll Bring Strong hands-on experience building streaming data platforms Deep understanding of tools like Kafka , Flink , Spark Streaming , etc. Proficiency in Python , Java , or Scala Cloud experience with AWS , GCP , or Azure Familiarity with orchestration tools like Airflow , Kubernetes Collaborative, solutions-focused mindset and a willingness to lead from the front 🌟 What’s on Offer More ❯
enterprise-wide integrations. What we’re looking for: At least 3 years of experience in data engineering, ideally in a large-scale or regulated environment Strong Python, Java or Scala skills, with solid SQL knowledge Hands-on experience with AWS and tools such as Glue, Spark, Redshift or Hadoop Proven experience with relational and NoSQL databases such as SQL Server More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Atarus
real-time data processing tools (e.g. Kafka, Flink, Spark Streaming) Solid hands-on knowledge of cloud platforms (AWS, GCP or Azure) Strong proficiency in languages like Python, Java or Scala Familiarity with orchestration tools such as Airflow or Kubernetes Strong stakeholder management and communication skills Passion for mentoring and developing engineering talent 🎁 What’s On Offer Competitive salary + quarterly More ❯
Senior Scala Developer Hybrid, West London: 4 days on-site (pick your day) Salary: Up to £90,000 + benefits Fast process Scala developer with a love of sports? Looking to take ownership in a fast-moving scale-up? Passion for data, real-time feedback and saying "I created that" - something visible to a global audience? This is an opportunity … and more. The opportunity would have you delivering products and statistical models for some of the top sporting teams and associations across the globe. About you: Strong background in Scala development as a senior developer - working in fast-paced environments Deep technical knowledge and ownership - a willingness to guide and give feedback and direction to your peers and stakeholders Passionate More ❯