Bradford, Yorkshire and the Humber, United Kingdom Hybrid / WFH Options
Anson McCade
review standards and agile delivery. Contributing to technical strategy and mentoring other engineers. What You’ll Need: Strong experience in data engineering with expertise in languages and frameworks such as Python, Scala, or Spark. Proficiency in designing and building data pipelines, working with both structured and unstructured data. Experience with cloud platforms (AWS, Azure, or GCP), using native services for data workloads. …
driven APIs, and designing database schemas and queries to meet business requirements. A passion and proven background in picking up and adopting new technologies on the fly. Exposure to Scala, or functional programming generally. Exposure to highly concurrent, asynchronous backend technologies, such as Ktor, http4k, http4s, Play, RxJava, etc. Exposure to DynamoDB or similar NoSQL databases, such as Cassandra, HBase …
knowledge sharing across the team. What We're Looking For: Strong hands-on experience across AWS Glue, Lambda, Step Functions, RDS, Redshift, and Boto3. Proficient in one of Python, Scala, or Java, with strong experience in Big Data technologies such as Spark and Hadoop. Practical knowledge of building real-time event streaming pipelines (e.g., Kafka, Spark Streaming, Kinesis). Proven …
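Several of these postings, including the one above, ask for practical knowledge of building real-time event streaming pipelines with Kafka and Spark Streaming. Purely as an illustrative sketch, not a detail from any posting (the broker address, the topic name "events", and the per-minute count aggregation are all assumptions), a minimal Spark Structured Streaming job in Scala consuming a Kafka topic might look like this:

```scala
// Requires the spark-sql-kafka-0-10 connector on the classpath.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventStreamSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("event-stream-sketch")
      .getOrCreate()
    import spark.implicits._

    // Read raw events from a Kafka topic (broker and topic are placeholders)
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()

    // Kafka delivers binary key/value pairs; cast the payload to a string
    // and count events per one-minute window, tolerating 10 minutes of lateness
    val counts = raw
      .selectExpr("CAST(value AS STRING) AS event", "timestamp")
      .withWatermark("timestamp", "10 minutes")
      .groupBy(window($"timestamp", "1 minute"))
      .count()

    // Console sink keeps the sketch self-contained; a real pipeline would
    // typically write to Kafka, a lake table, or a warehouse instead
    counts.writeStream
      .outputMode("append")
      .format("console")
      .option("truncate", "false")
      .start()
      .awaitTermination()
  }
}
```

In practice the sink would be Kinesis, Kafka, or a lake/warehouse table rather than the console, but the read-transform-write shape is the same.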
large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go, or Scala. Demonstrable understanding and effective use of AI tooling in your development process. A growth mindset and eagerness to work in a fast-paced, mission-driven environment. Good …
of emerging technology and trends. Provides out-of-hours support for applications to ensure the shop stays open and fully functional. Essential knowledge and skills: Proficient in Python or Scala. Familiarity with Java. Experience with a marketing technology stack and 3rd-party tools. Broad experience of working within AWS, including infrastructure (VPC, EC2, security groups, S3, etc.) to AWS data …
practices, cloud data platforms, and the full data lifecycle (ingestion, processing, modelling, analytics, and ML/AI deployment). Strong programming skills in languages such as Python, Java, or Scala, and experience with data pipeline frameworks (e.g., Spark, Airflow, dbt) and cloud infrastructure (e.g., AWS, Azure, GCP). A track record of delivering innovative, data-driven products from concept to …
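Where a role lists pipeline frameworks such as Spark alongside the ingestion-to-analytics lifecycle, the day-to-day work can be pictured with a small batch ETL job. The following is only a hedged sketch: the bucket paths and the order columns (status, created_at, amount) are invented for illustration, not taken from the posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyRevenueEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-revenue-etl")
      .getOrCreate()

    // Ingestion: read raw order files landed in object storage (placeholder path)
    val orders = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("s3a://raw-bucket/orders/")

    // Processing/modelling: keep completed orders and roll up revenue per day
    val daily = orders
      .filter(col("status") === "COMPLETED")
      .withColumn("order_date", to_date(col("created_at")))
      .groupBy("order_date")
      .agg(sum("amount").as("revenue"), count("*").as("order_count"))

    // Analytics-ready output: partitioned Parquet for downstream consumers
    daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://curated-bucket/daily_revenue/")

    spark.stop()
  }
}
```

An orchestrator such as Airflow, or a dbt project for the modelling layer, would typically schedule and test jobs like this rather than change their basic shape.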
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Arm Limited
re solving for reliability, compliance, performance, and speed - at once. You'll be key to making it work. Required Skills: Knowledge of one or more programming languages (Java/Scala, TypeScript, Python). Validated experience operating distributed systems at scale in production. Cloud: AWS (primary), Kubernetes (future), Docker (current), Terraform. Excellent debugging skills across the network, systems, and data stack. Observability …
platforms, preferably with GCP expertise. Deep knowledge of cloud architecture, data engineering, pipelines, and big data technologies (e.g., BigQuery, Dataflow, Pub/Sub). Proficiency in Python, Java, or Scala; familiarity with microservices, Docker, Kubernetes, CI/CD tools (e.g., Jenkins, GitLab CI), and cloud monitoring. Proven experience in digital transformation and Agile environments. Preferred understanding of banking risk management …
in motivated teams, collaborating effectively and taking pride in your work. Strong problem-solving skills, viewing technology as a means to solve challenges. Proficiency in a programming language (e.g., Scala, Python, Java, C#) with an understanding of domain modelling and application development. Knowledge of data management platforms (SQL, NoSQL, Spark/Databricks). Experience with modern engineering tools (Git, CI/…
3+ years of experience in framework development and building integration layers to solve complex business use cases. Technical Skills: Strong coding skills in one or more programming languages (Python, Scala, Spark, or Java). Experience working with petabyte-scale data sets and developing integration-layer solutions in Databricks, Snowflake, or similar large platforms. Experience with cloud-based data warehousing, transformation …
tested, and fault-tolerant data engineering solutions. Support and mentor junior engineers, contributing to knowledge sharing across the team. What We're Looking For: Proficient in one of Python, Scala, or Java, with strong experience in Big Data technologies such as Spark and Hadoop. Practical knowledge of building real-time event streaming pipelines (e.g., Kafka, Spark Streaming, Kinesis). Proficiency …
Out in Science, Technology, Engineering, and Mathematics
years of professional software development experience. Excellent problem-solving skills and a proactive approach to addressing challenges. Strong programming skills in languages such as Go, SQL, Python, Java, or Scala. Experience in the effective application of AI tooling in your development process. A growth mindset and eagerness to work in a fast-paced, mission-driven environment. Excellent communication skills, with …
understanding of various data engineering technologies, including Apache Spark, Databricks, and Hadoop. Strong understanding of agile ways of working. Up-to-date understanding of various programming languages, including Python, Scala, R, and SQL. Up-to-date understanding of various databases and cloud-based datastores, including SQL and NoSQL. Up-to-date understanding of cloud platforms, including AWS and/or …
expectations, navigate complex requirements, and deliver tailored solutions across diverse industries. 5+ years' experience working with Databricks, including Spark and Delta Lake. Strong skills in Python and/or Scala for data engineering tasks. Comfortable working with cloud platforms like Azure, AWS, and/or Google Cloud. A problem-solving mindset and the ability to simplify complex data challenges. Excellent …
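For the Databricks/Delta Lake requirement above, a common task is upserting a batch of changes into a Delta table. The snippet below is an assumption-laden sketch (the table path and the id/amount columns are made up), showing the standard Delta merge pattern in Scala.

```scala
// Assumes the io.delta:delta-spark dependency is on the classpath.
import io.delta.tables.DeltaTable
import org.apache.spark.sql.SparkSession

object DeltaUpsert {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("delta-upsert")
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
      .getOrCreate()
    import spark.implicits._

    // A hypothetical batch of updated records (id and amount are made-up columns)
    val updates = Seq((1L, 120.0), (2L, 75.5)).toDF("id", "amount")

    // Upsert into an existing Delta table: update matching ids, insert new ones
    val target = DeltaTable.forPath(spark, "/mnt/lake/accounts")
    target.as("t")
      .merge(updates.as("u"), "t.id = u.id")
      .whenMatched().updateAll()
      .whenNotMatched().insertAll()
      .execute()
  }
}
```

On Databricks itself the session configuration lines are unnecessary, since Delta support is built into the runtime.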
Position Level: Junior/Mid. Position Type: Full-time (Onsite). About Intelmatix: Intelmatix is a deep-tech artificial intelligence (AI) company founded in July 2021 by a group of MIT scientists with the vision of transforming enterprises to become cognitive. …
ll need: Demonstrated experience as a Senior Data Engineer in complex enterprise environments. Deep understanding of technology fundamentals and experience with languages like Python, or functional programming languages like Scala. Demonstrated experience in the design and development of big data applications using tech stacks like Databricks, Apache Spark, HDFS, HBase, and Snowflake. Commendable skills in building data products by integrating large …
London (City of London), South East England, United Kingdom
Sahaj Software
ll need: Demonstrated experience as a Senior Data Engineer in complex enterprise environments. Deep understanding of technology fundamentals and experience with languages like Python, or functional programming languages like Scala. Demonstrated experience in the design and development of big data applications using tech stacks like Databricks, Apache Spark, HDFS, HBase, and Snowflake. Commendable skills in building data products by integrating large …
thrive. What you'll need: Validated expertise in delivering and supporting high-availability, high-performance systems. Coding/scripting experience developed in a commercial/industry setting (Python, Java, Scala, or Go, and SQL). Strong experience working with Kafka technologies. Working experience with operational data stores, data warehouses, big-data technologies, and data lakes. Experience working with relational and non …
Bethesda, Maryland, United States Hybrid / WFH Options
Gridiron IT Solutions
Statistics, Mathematics, or related field. 7+ years of experience in data science or a related field, plus an additional 3+ years' experience in a complementary function. Strong programming skills in Python, Scala, and/or UNIX shell scripting. Expertise in machine learning techniques and statistical analysis. Proficiency in SQL and NoSQL databases. Experience with big data platforms such as Hadoop, Spark, and …
data engineering experience - Experience with SQL - Experience with data modeling, warehousing, and building ETL pipelines - Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS - Experience mentoring team members on best practices PREFERRED QUALIFICATIONS - Experience with big data technologies such as Hadoop, Hive, Spark, EMR - Experience operating large data warehouses. Amazon is an …