technologies like Docker and Kubernetes. Ideally, some familiarity with data workflow management tools such as Airflow as well as big data technologies such as Apache Spark/Ignite or other caching and analytics technologies. A working knowledge of FX markets and financial instruments would be beneficial. What we'll …
experience working with relational and non-relational databases (e.g. Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one programming language (e.g. Python, Scala, Java, R). Experience deploying and maintaining cloud infrastructure …
team of developers globally. The platform is a greenfield build using standard modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, and Apache Ignite. The platform runs in a hybrid mode, both on-premise and in AWS, utilising technologies such as EKS, S3, and FSx. The main purpose …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
3+ years' data engineering experience; Snowflake experience; proficiency across an AWS tech stack; dbt expertise; Terraform experience. Nice to have: data modelling, Data Vault, Apache Airflow. Benefits: up to 10% bonus, up to 14% pension contribution, 29 days annual leave + bank holidays, free company shares. Interviews ongoing …
Snowflake. Understanding of cloud platform infrastructure and its impact on data architecture. Data technology skills: a solid understanding of big data technologies such as Apache Spark and knowledge of the Hadoop ecosystem. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes …
GitHub integration and automation). Experience with scripting languages such as Python or R. Working knowledge of message queuing and stream processing. Experience with Apache Spark or similar technologies. Experience with Agile and Scrum methodologies. Familiarity with dbt and Airflow is an advantage. Experience working in a start-up …
Central London, London, United Kingdom Hybrid / WFH Options
167 Solutions Ltd
Develop and manage data warehouse and lakehouse solutions for analytics, reporting, and machine learning. Implement ETL/ELT processes using tools such as Apache Airflow, AWS Glue, and Amazon Athena. Work with cloud-native technologies to support scalable, serverless architectures. Collaborate with data science teams to streamline …
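Purely as an illustration of the ETL/ELT building blocks named in the listing above, the sketch below uses boto3 to start an AWS Glue job run and then fire an Amazon Athena query; the job name, database, table, and S3 results bucket are hypothetical placeholders, and in a production pipeline these calls would typically be wrapped in orchestration tasks (for example Airflow operators).

```python
# Illustrative ETL/ELT building blocks on AWS: trigger a Glue job, then query with Athena.
# The job name, database, table, and S3 bucket below are hypothetical placeholders.
import time

import boto3

glue = boto3.client("glue")
athena = boto3.client("athena")

# 1. Start a (hypothetical) Glue job that lands cleaned data in S3.
run_id = glue.start_job_run(JobName="clean_orders_job")["JobRunId"]

# 2. Poll until the Glue job reaches a terminal state.
while True:
    state = glue.get_job_run(JobName="clean_orders_job", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED"):
        break
    time.sleep(30)

# 3. Query the resulting table with Athena, writing results to a scratch S3 location.
athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS n FROM orders_clean GROUP BY status",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
```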
pipelines. Exceptional troubleshooting and debugging skills. Good to have: experience with designing and implementing integration solutions for event/data streaming; experience working with Apache Superset; start-up experience; exposure to the payments domain, ideally sanctions screening. What You Get in Return: Impactful Work: Be part of a growing startup …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English …
and scaling data systems. Highly desired experience with Azure, particularly Lakehouse and Eventhouse architectures. Experience with relevant infrastructure and tools including NATS, Power BI, Apache Spark/Databricks, and PySpark. Hands-on experience with data warehousing methodologies and optimization libraries (e.g., OR-Tools). Experience with log analysis, forensic …
Python. Experience in data modelling and design patterns; in-depth knowledge of relational databases (PostgreSQL) and familiarity with data lakehouse formats (storage formats, e.g. Apache Parquet, Delta tables). Experience with Spark, Databricks, data lakes/lakehouses. Experience working with external data suppliers (defining requirements for suppliers, defining Service …
or all of the services below would put you at the top of our list: Google Cloud Storage, Google Data Transfer Service, Google Dataflow (Apache Beam), Google Pub/Sub, Google Cloud Run, BigQuery or any RDBMS, Python, Debezium/Kafka, dbt (data build tool). Interview process: interviewing is a two-way …
London, England, United Kingdom Hybrid / WFH Options
WA Consultants
brokers such as AWS SQS. Own and evolve containerised deployment pipelines using Docker and CI/CD principles. Develop and manage data pipelines with Apache Airflow, with data transformation using Python and Pandas. Guide and mentor a team of engineers, setting high standards for clean code, testing, and technical …
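To make the Airflow-plus-Pandas responsibility above a little more concrete, here is a minimal DAG sketch written against a recent Airflow 2.x Python API; the DAG id, file paths, and filtering rule are hypothetical placeholders rather than anything from the role itself.

```python
# Minimal Airflow DAG sketch: one task that runs a Pandas transformation.
# The dag_id, file paths, and filtering rule are hypothetical placeholders.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_orders(input_path: str, output_path: str) -> None:
    """Read raw orders, keep completed rows, and write a cleaned Parquet file."""
    df = pd.read_csv(input_path)
    cleaned = df[df["status"] == "completed"]
    cleaned.to_parquet(output_path, index=False)


with DAG(
    dag_id="example_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="transform_orders",
        python_callable=transform_orders,
        op_kwargs={
            "input_path": "/data/raw/orders.csv",
            "output_path": "/data/clean/orders.parquet",
        },
    )
```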
London, South East England, United Kingdom Hybrid / WFH Options
Noir
robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices …
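As a small sketch of the "Python and SQL on GCP" work described above, the snippet below runs a query through the google-cloud-bigquery client; the project, dataset, and table names are hypothetical placeholders, and credentials are assumed to come from application-default authentication.

```python
# Minimal sketch: run a SQL query against BigQuery from Python and iterate over the rows.
# The project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumes application-default credentials

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `example-project.analytics.events`
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(query).result():
    print(row.event_date, row.events)
```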
City of London, London, United Kingdom Hybrid / WFH Options
Cathcart Technology
Data Scientist with machine learning experience. Strong understanding of and experience with ML models and ML observability tools. Strong Python and SQL experience. Spark/Apache Airflow. ML framework experience (PyTorch/TensorFlow/scikit-learn). Experience with cloud platforms (preferably AWS). Experience with containerisation technologies. Useful information: Their …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English …
processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the …
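To give a flavour of the streaming part of a stack like the one above (Kafka feeding downstream stores such as PostgreSQL or Apache Iceberg tables), here is a minimal consumer sketch using the kafka-python library; the topic, broker address, group id, and record fields are hypothetical placeholders.

```python
# Minimal Kafka consumer sketch using the kafka-python library.
# Topic, broker address, group id, and record fields are hypothetical placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                # hypothetical topic
    bootstrap_servers="localhost:9092",
    group_id="example-pipeline",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    record = message.value
    # In a real pipeline the record would be validated here and written onward,
    # e.g. to PostgreSQL or an Apache Iceberg table.
    print(record.get("order_id"), record.get("status"))
```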
A passion for building and participating in highly effective teams and development processes. Strong debugging, testing/validation, and analytics/SQL skills; Apache experience. Experience working with Agile methodologies (Scrum) and cross-functional teams. Desirable: experience in the CDN, datacenter, hyperscaler, media, news, and/or entertainment industry. Knowledge …
London, South East England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
ll Bring: Strong experience in AWS cloud platform architecture and solving complex business issues. Proficiency in Java programming and Linux, with desirable knowledge of Apache NiFi, Node.js, JSON/XML, Jenkins, Maven, Bitbucket, or JIRA. Hands-on experience with scripting (Shell, Bash, Python) and a solid understanding of Linux …
the user experience. Key skills: senior Data Scientist experience; commercial experience in generative AI and recommender systems; strong Python and SQL experience; Spark/Apache Airflow; LLM experience; MLOps experience; AWS. Additional information: This role offers a strong salary of up to £95,000 (depending on experience/skill …