we're looking for great people, not just those who simply check off all the boxes. What you'll do: Work with technologies like Apache Lucene, Apache Flink, Apache Beam, and Kubernetes to build core components of Yelp's search infrastructure. Design, build, and maintain scalable real … and complexity analysis. Comprehensive understanding of systems and application design, including operational and reliability trade-offs. Experience with distributed data processing frameworks such as Apache Flink or Apache Beam. Familiarity with search technologies like Apache Lucene or Elasticsearch is a plus. Experience working with containerized environments and …
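For context on the search stack named above, a minimal sketch of querying an Elasticsearch index from Python follows. The cluster URL and the "businesses" index are hypothetical, and it assumes the official elasticsearch-py 8.x client; this is an illustration, not any employer's code.

```python
# Minimal full-text query sketch; assumes elasticsearch-py 8.x,
# a local cluster, and a hypothetical "businesses" index.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

response = es.search(
    index="businesses",
    query={"match": {"name": "coffee"}},  # simple match query on a name field
)

for hit in response["hits"]["hits"]:
    print(hit["_score"], hit["_source"])
```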
technologies like Docker and Kubernetes. Ideally, some familiarity with data workflow management tools such as Airflow, as well as big data technologies such as Apache Spark/Ignite or other caching and analytics technologies. A working knowledge of FX markets and financial instruments would be beneficial. What we'll …
diagram of proposed tables to enable discussion. Good communicator and comfortable with presenting ideas and outputs to technical and non-technical users. Experience using Apache Airflow to create DAGs (see the sketch below). Ability to work within Agile, considering minimum viable products, story pointing, and sprints. More information: Enjoy fantastic perks like …
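Since several of these listings ask for hands-on Airflow DAG work, a minimal sketch of one follows. The DAG id, task bodies, and schedule are made up, and it assumes Airflow 2.4+ (for the schedule argument); treat it as an illustration of the shape of a DAG, not any employer's pipeline.

```python
# A minimal two-task Airflow DAG sketch; all names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step, for illustration only.
    print("extracting...")


def load():
    # Placeholder load step, for illustration only.
    print("loading...")


with DAG(
    dag_id="example_etl",             # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run once per day
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task         # extract runs before load
```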
lakehouse platforms like Databricks, Redshift, and Snowflake. Experience using Git for version control. Ability to automate tasks using Python or workflow orchestration tools like Apache Airflow. Skill in integrating data from various sources, including APIs, databases, and third-party systems. Ensuring data quality through monitoring and validation throughout pipelines. …
experience working with relational and non-relational databases (e.g. Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one programming language (e.g. Python, Scala, Java, R). Experience deploying and maintaining cloud infrastructure …
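Apache Spark comes up repeatedly in these requirements, so a minimal PySpark sketch may help calibrate what "hands-on" means here. The file path and column names are hypothetical; it assumes a local pyspark installation.

```python
# Minimal PySpark aggregation sketch; path and columns are made up.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example").getOrCreate()

# Read a Parquet dataset and count events per day.
events = spark.read.parquet("events.parquet")       # hypothetical input
daily_counts = (
    events.groupBy("event_date")
    .agg(F.count("*").alias("event_count"))
    .orderBy("event_date")
)
daily_counts.show()

spark.stop()
```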
team of developers globally. The platform is a greenfield build using standard modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, and Apache Ignite. The platform runs in a hybrid mode, both on-premise and in AWS, utilising technologies such as EKS, S3, and FSx. The main purpose …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
3+ years' data engineering experience. Snowflake experience. Proficiency across an AWS tech stack. DBT expertise. Terraform experience. Nice to have: Data Modelling, Data Vault, Apache Airflow. Benefits: Up to 10% bonus, up to 14% pension contribution, 29 days annual leave + bank holidays, free company shares. Interviews ongoing don…
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Peaple Talent
and familiar with setting up CI/CD workflows using platforms like Azure DevOps or similar tools. Hands-on experience with orchestration tools like Apache Airflow for managing complex data workflows. Practical familiarity with low-code or no-code platforms such as Talend and SnapLogic for streamlined pipeline development. …
Snowflake. Understanding of cloud platform infrastructure and its impact on data architecture. Data Technology Skills: A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes …
Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (Python, Data Modelling, ETL/ELT, Apache Airflow, DBT, AWS) Enterprise-scale tech firm Up to £70,000 plus benefits - FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a …
Python. Experience in data modelling and design patterns; in-depth knowledge of relational databases (PostgreSQL) and familiarity with data lakehouse formats (storage formats, e.g. Apache Parquet, Delta tables). Experience with Spark, Databricks, data lakes/lakehouses. Experience working with external data suppliers (defining requirements for suppliers, defining Service …
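Parquet is the storage format most of these lakehouse listings mention; a minimal sketch of writing and re-reading a Parquet file with pyarrow follows. The data and file name are made up for illustration.

```python
# Minimal Parquet round trip with pyarrow; data and path are hypothetical.
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({"id": [1, 2, 3], "price": [9.5, 3.2, 7.1]})
pq.write_table(table, "prices.parquet")   # columnar, compressed on disk

# Read it back to confirm the round trip.
print(pq.read_table("prices.parquet").to_pydict())
```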
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Low Carbon Contracts Company
design principles for usability, maintainability and extensibility. Experience working with Git in a version-controlled environment. Good knowledge of parallel computing techniques (Python multiprocessing, Apache Spark), and performance profiling and optimisation. Good understanding of data structures and algorithms. An enthusiastic problem-solving mindset with a desire to solve technical …
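For the multiprocessing technique this listing names, a minimal sketch of fanning a CPU-bound function out over a process pool follows; the workload is a toy example.

```python
# Minimal process-pool sketch; the square() workload is illustrative.
from multiprocessing import Pool


def square(n: int) -> int:
    return n * n


if __name__ == "__main__":
    # Four worker processes split the inputs between them.
    with Pool(processes=4) as pool:
        results = pool.map(square, range(10))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```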
West Midlands, England, United Kingdom Hybrid / WFH Options
Aubay UK
migration projects, particularly large-scale migrations to distributed database platforms. Hands-on experience with big data processing technologies, including Spark (PySpark and Spark Scala) and Apache Airflow. Expertise in distributed databases and computing environments. Familiarity with Enterprise Architecture methodologies, ideally TOGAF. Strong leadership experience, including managing technology teams and delivering …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Duel
Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You have no issue working with either structured or semi-structured data. You are comfortable working with data engineering and scripting …
GitHub integration and automation). Experience with scripting languages such as Python or R. Working knowledge of message queuing and stream processing (see the consumer sketch below). Experience with Apache Spark or similar technologies. Experience with Agile and Scrum methodologies. Familiarity with dbt and Airflow is an advantage. Experience working in a start-up …
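For the message-queuing and stream-processing requirement, a minimal Kafka consumer sketch follows. It assumes the kafka-python client, a broker on localhost:9092, and a hypothetical "events" topic.

```python
# Minimal Kafka consumer sketch; broker address and topic are hypothetical.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",                   # start from the oldest record
    value_deserializer=lambda v: v.decode("utf-8"),
)

for message in consumer:
    # Each record carries its partition offset plus the decoded payload.
    print(message.offset, message.value)
```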
or all of the services below would put you at the top of our list: Google Cloud Storage, Google Data Transfer Service, Google Dataflow (Apache Beam), Google Pub/Sub, Google Cloud Run, BigQuery or any RDBMS, Python, Debezium/Kafka, dbt (Data Build Tool). Interview process: Interviewing is a two-way …
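Dataflow pipelines are written against the Apache Beam SDK; a minimal word-count-style sketch, runnable locally on Beam's DirectRunner, follows. The data and transform names are made up.

```python
# Minimal Apache Beam pipeline sketch; runs locally on the DirectRunner.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(["alpha", "beta", "alpha"])
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "CountPerWord" >> beam.CombinePerKey(sum)   # (alpha, 2), (beta, 1)
        | "Print" >> beam.Map(print)
    )
```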
Central London, London, United Kingdom Hybrid / WFH Options
167 Solutions Ltd
Develop and manage data warehouse and lakehouse solutions for analytics, reporting, and machine learning. Implement ETL/ELT processes using tools such as Apache Airflow, AWS Glue, and Amazon Athena. Work with cloud-native technologies to support scalable, serverless architectures. Collaborate with data science teams to streamline …
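Of the AWS services this listing names, Athena is the one usually driven from code; a minimal boto3 sketch of submitting a query follows. The region, database, table, and S3 output location are all hypothetical.

```python
# Minimal Athena query submission via boto3; all names are hypothetical.
import boto3

athena = boto3.client("athena", region_name="eu-west-1")

response = athena.start_query_execution(
    QueryString="SELECT count(*) FROM events",            # hypothetical table
    QueryExecutionContext={"Database": "analytics"},      # hypothetical database
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)

# The execution id can be polled later with get_query_execution().
print(response["QueryExecutionId"])
```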
London, South East England, United Kingdom Hybrid / WFH Options
Aventis Solutions
services experience is desired but not essential. API development (FastAPI, Flask). Tech stack: Azure, Python, Databricks, Azure DevOps, ChatGPT, Groq, Cursor AI, JavaScript, SQL, Apache Spark, Kafka, Airflow, Azure ML, Docker, Kubernetes and many more. Role Overview: We are looking for someone who is as comfortable developing AI/…
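FastAPI, named in the API-development requirement above, takes very little code to stand up; a minimal sketch follows. The /health endpoint and module name are hypothetical.

```python
# Minimal FastAPI app sketch; the endpoint is a made-up example.
from fastapi import FastAPI

app = FastAPI()


@app.get("/health")
def health() -> dict:
    # Trivial liveness check, for illustration only.
    return {"status": "ok"}

# Run with: uvicorn main:app --reload   (assuming this file is main.py)
```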
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Noir
in a data-focused role, with a strong passion for working with data and delivering value to stakeholders. Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. Experience with Databricks and Microsoft Azure is highly desirable. Financial Services experience …
pipelines. Exceptional troubleshooting and debugging skills. Good to have: Experience with designing and implementing integration solutions for event/data streaming. Experience working with Apache Superset. Start-up experience. Exposure to Payments domain, ideally sanctions screening. What You Get in Return: Impactful Work: Be part of a growing startup …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English …