Central London, London, United Kingdom Hybrid / WFH Options
167 Solutions Ltd
… Develop and manage data warehouse and lakehouse solutions for analytics, reporting, and machine learning. Implement ETL/ELT processes using tools such as Apache Airflow, AWS Glue, and Amazon Athena. Work with cloud-native technologies to support scalable, serverless architectures. Collaborate with data science teams to …
… robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services, including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code …
… data flow, storage, and processing • Good logical thinking and attention to detail ⸻ 🌟 Nice-to-Have (But Not Required): • Experience with data pipeline tools like Apache Airflow, DBT, or Kafka • Knowledge of cloud data services (AWS S3/Glue/Redshift, GCP BigQuery, Azure Data Factory) • Exposure to Spark …
… or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal …
… and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: Lead post-incident reviews, perform root cause analysis for data …
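The automated anomaly detection this listing describes can be sketched in miniature. The stdlib-only example below flags outliers in a pipeline metric via z-score; the function name, threshold, and sample data are illustrative inventions, not anything specified by the listing, and a production system would use rolling windows and alerting integrations rather than a flat check:

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=3.0):
    """Return indices of values whose z-score against the series exceeds threshold.

    A toy stand-in for automated data-anomaly monitoring: compute the series
    mean and sample standard deviation, then flag points far from the mean.
    """
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:  # constant series: nothing can be an outlier
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

# Hourly row counts from a hypothetical pipeline run; the final value
# (a near-empty load) is the kind of anomaly monitoring should surface.
counts = [1000, 1020, 990, 1010, 980, 1005, 10]
print(zscore_anomalies(counts, threshold=2.0))  # flags index 6
```

In practice such a check would run as a scheduled task and feed an alerting channel; the point here is only the shape of the detection logic.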
… implement elegant solutions for them. Are a data enthusiast who wants to be surrounded by brilliant teammates and huge challenges. Bonus Points: Experience with Apache Airflow, including designing, managing, and troubleshooting DAGs and data pipelines. Experience with CI/CD pipelines and tools like Jenkins, including automating the …
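The DAG concept at the heart of Airflow, tasks executed only after their dependencies complete, can be illustrated with the standard library alone. This is a conceptual sketch using `graphlib` (Python 3.9+), not Airflow's actual API; the task names are invented for illustration:

```python
from graphlib import TopologicalSorter

# Task dependencies in the style of an Airflow DAG: each key maps to the
# set of tasks that must finish before it may run.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
    "notify": {"load"},
}

# static_order() yields a valid execution order (and raises CycleError if
# the graph is not acyclic, one common class of DAG troubleshooting).
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['extract', 'transform', 'validate', 'load', 'notify']
```

Airflow layers scheduling, retries, and operators on top of exactly this ordering idea, which is why a cycle or a missing upstream task breaks a pipeline.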
… with NoSQL databases (e.g., MongoDB) and relational databases (e.g., PostgreSQL, MySQL). 5+ years of experience working with Python and SQL. Knowledge of ETL tools (e.g., Apache Airflow) and cloud platforms (e.g., AWS, Azure, GCP). Understand data modelling concepts and best practices. Experience with healthcare data standards (e.g., HL7 …
… SQL and experienced with various database technologies. Knowledge of Python or Java, with the ability to leverage either in building scalable solutions. Experience with Airflow and other big data technologies is useful. Familiarity with DevOps practices and tools, including CI/CD pipelines. Previous experience with reporting tools is helpful. …
… with a knack for translating complex tech into clear business insights. Proven leadership and mentoring experience. Nice to have: Hands-on experience with Snowflake, Airflow, AWS Glue, Spark, and S3. Familiarity with open-source data libraries (e.g., Pandas, DBT). Experience with modern data stacks and AWS cloud services. …
… with Snowflake, building full solutions, ideally from scratch, with security and user access. DBT/general data modelling, with Data Vault experience being desirable. Airflow and Python experience. Proficient with AWS (Lambda, S3, SNS, CDK) and DevOps. Need to be able to build, deploy, and use Terraform. Benefits: Bonus opportunity …
… Services: Glue, Lambda, IAM, Service Catalog, CloudFormation, Lake Formation, SNS, SQS, EventBridge. Language & Scripting: Python and Spark. ETL: DBT. Good to Have: Airflow, Snowflake, Big Data (Hadoop), and Teradata. Responsibilities: Serve as the primary point of contact for all AWS-related data initiatives and projects. Responsible for …
Southampton, Hampshire, South East, United Kingdom Hybrid / WFH Options
Spectrum It Recruitment Limited
… functional teams. Tech You'll Work With: ML & Data Science: Python (primary language); TensorFlow, PyTorch, or Keras; NumPy, pandas; data pipelines (Azure Data Factory, Airflow, etc.); applied ML: NLP, CV, transformers, GANs, time series, etc. Engineering & Cloud: Azure (or similar cloud platforms like AWS, GCP); microservices and event-driven …
… in dbt or similar tools. Good understanding of cloud platforms such as GCP, AWS, or Azure. Experience configuring orchestration of SQL and Python via Airflow or similar tools. Experience working with data pipelines, defining problems, crafting and launching solutions, and practicing continuous improvement. Experience with process improvement frameworks and …
… Software Engineering company. You're not fazed by the prospect of working autonomously. It's a bonus if you have experience using workflows like Airflow or Temporal, especially in a distributed-system environment. Interview Process: Recruiter Screen & Intro to Monolith (30 mins); Take-Home Assignment (120 mins); Interview …
… processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the …
… Tech: SQL & Python are your native languages, with a dash of Scala when needed. DBT, data modeling, and analytics are your go-to tools; Airflow is your daily companion. BigQuery/GCP hold no secrets for you, and AWS is a trusted friend. You know when to build real …
… work cross-functionally in an Agile environment. Exposure to data product management principles (SLAs, contracts, ownership models). Familiarity with orchestration tools and observability platforms (Airflow, dbt, Monte Carlo, etc.). Exposure to real-time/streaming pipelines. Understanding of information security best practices. Familiarity with BI tools (QuickSight, Power BI …
… it would be a plus: Experience working with external engagements, technical architecture forums, etc. Experience in workflow orchestration with tools such as Argo Workflows and Airflow, and scientific workflow tools such as Nextflow, Snakemake, VisTrails, or Cromwell. Experience with specialized data architecture (e.g., optimizing physical layout for access patterns, including …
… Engineer, you will need experience of: A track record leading high-performing engineering teams. Advanced demonstrable commercial experience across Azure Data Factory, Databricks, Auto Loader, Apache Airflow. Commercial experience of real-time and batch (synchronous and asynchronous) integrations into a Salesforce environment is required. Advanced database skills in SQL, Stored … Infrastructure as Code). Certification in Azure data integration is highly desired. Strong understanding of data flow and message services such as Event Hub, Apache Kafka. Please note: This role requires candidates to be onsite 3 days a week; therefore, a reasonable commutable domicile is expected. Where Tunbridge Wells … pension, and an extensive employee benefits programme. Data Integration Engineer, Integration Engineer, Salesforce, .Net, Azure, Azure Data Factory, RDBMS, SQL, Stored Procedures, Triggers, Salesforce, Apache Airflow, Databricks, Auto Loader, CI/CD, IaC, Infrastructure as Code, DevOps, Agile, Event Hub, Apache Kafka. We are Disability Confident and neurodiverse aware. …
Altrincham, Cheshire, North West, United Kingdom Hybrid / WFH Options
Chroma Recruitment Ltd
Strong proficiency in SQL and experience with database management systems (e.g., MySQL, PostgreSQL, MongoDB). Experience with data pipeline and workflow management tools (e.g., Apache Airflow, Luigi). Experience working with Google Cloud Platform tools. Proficiency in programming languages such as Python, Java, or Scala. Desirable skills: Experience …
City of London, London, United Kingdom Hybrid / WFH Options
Cathcart Technology
Data Scientist with Machine Learning experience ** Strong understanding and experience with ML models and ML observability tools ** Strong Python and SQL experience ** Spark/Apache Airflow ** ML framework experience (PyTorch/TensorFlow/Scikit-Learn) ** Experience with cloud platforms (preferably AWS) ** Experience with containerisation technologies. Useful information …
… field. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. There's no place quite like BFS and we're …