Cloud Composer, Cloud Run, Cloud Monitoring & Logging, Dataplex, Beam, Tentacles and Pub/Sub; fluent Python and SQL skills with real-life project experience; experience with orchestration tools such as Airflow and DBT; experience with one of the major analytical DWHs is a plus: BigQuery, Redshift, Snowflake, Databricks, Synapse; work experience with the following technologies is noteworthy and might be seen
and well-tested solutions to automate data ingestion, transformation, and orchestration across systems. Own data operations infrastructure: Manage and optimise key data infrastructure components within AWS, including Amazon Redshift, Apache Airflow for workflow orchestration and other analytical tools. You will be responsible for ensuring the performance, reliability, and scalability of these systems to meet the growing demands of … pipelines, data warehouses, and leveraging AWS data services. Strong proficiency in DataOps methodologies and tools, including experience with CI/CD pipelines, containerized applications, and workflow orchestration using Apache Airflow. Familiarity with ETL frameworks; bonus experience with Big Data processing (Spark, Hive, Trino) and data streaming. Proven track record - You've made a demonstrable impact
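To make the Airflow orchestration requirement above concrete, here is a minimal sketch of a daily ETL DAG; the task bodies, IDs, schedule, and DAG name are illustrative placeholders rather than details from the listing.

```python
# A minimal sketch of a daily extract-transform-load DAG of the kind this
# role would own. All names and task logic are hypothetical placeholders.
# Note: schedule= requires Airflow 2.4+; older versions use schedule_interval=.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a source system.
    print("extracting raw data")


def transform(**context):
    # Placeholder: clean and reshape the extracted records.
    print("transforming data")


def load(**context):
    # Placeholder: write the transformed records into Redshift.
    print("loading into Redshift")


with DAG(
    dag_id="daily_ingest_to_redshift",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```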
to-end, scalable data and AI solutions using the Databricks Lakehouse (Delta Lake, Unity Catalog, MLflow). Design and lead the development of modular, high-performance data pipelines using Apache Spark and PySpark. Champion the adoption of Lakehouse architecture (bronze/silver/gold layers) to ensure scalable, governed data platforms. Collaborate with stakeholders, analysts, and data scientists to … datasets. Promote CI/CD, DevOps, and data reliability engineering (DRE) best practices across Databricks environments. Integrate with cloud-native services and orchestrate workflows using tools such as dbt, Airflow, and Databricks Workflows. Drive performance tuning, cost optimisation, and monitoring across data workloads. Mentor engineering teams and support architectural decisions as a recognised Databricks expert. Essential Skills & Experience: Demonstrable … expertise with Databricks and Apache Spark in production environments. Proficiency in PySpark, SQL, and working within one or more cloud platforms (Azure, AWS, or GCP). In-depth understanding of Lakehouse concepts, medallion architecture, and modern data warehousing. Experience with version control, testing frameworks, and automated deployment pipelines (e.g., GitHub Actions, Azure DevOps). Sound knowledge of data governance
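As an illustration of the bronze/silver/gold medallion pattern this role centres on, here is a minimal PySpark sketch of a bronze-to-silver step; the paths, column names, and use of Delta Lake are assumptions for the example, not specifics from the listing.

```python
# A minimal bronze-to-silver step in a medallion pipeline. Assumes Delta Lake
# is available on the cluster; paths and columns are invented for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Bronze layer: raw records landed as-is.
bronze = spark.read.format("delta").load("/lake/bronze/orders")

# Silver layer: deduplicated, typed, and filtered for downstream consumers.
silver = (
    bronze.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_id").isNotNull())
)

silver.write.format("delta").mode("overwrite").save("/lake/silver/orders")
```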
manage systems for ingesting data from various sources, such as IoT devices, logs, and operational systems. Implement and maintain data pipelines for batch and streaming data, using tools like Airflow or NiFi. Monitor, tune, and scale data storage and processing environments for optimal efficiency. Apply industry best practices and the customer's compliance policies to safeguard sensitive O&M data … knowledge for database querying. Proficiency with big data tools (Hadoop, Spark) and familiarity with big data file formats (Parquet, Avro). Skilled in data pipeline and workflow management tools (Apache Airflow, NiFi). Strong background in programming (Python, Scala, Java) for data pipeline and algorithm development. Skilled in data visualization (Tableau, Power BI) and BI reporting. Experience with
be considered in lieu of a degree. Proven experience as an ETL Engineer, Data Engineer, or in a similar data engineering role. Strong knowledge of ETL tools and frameworks (e.g., Apache NiFi, Talend, Informatica, Microsoft SSIS, Apache Airflow, etc.). Proficiency in SQL and experience working with relational databases (e.g., MySQL, PostgreSQL, Oracle). Experience with cloud-based … of data integration, data transformation, and data quality practices. Familiarity with version control systems (e.g., Git). Preferred Skills & Qualifications: Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, AWS Kinesis, etc.). Familiarity with data governance and compliance requirements (e.g., GDPR, HIPAA). Knowledge of data visualization tools (e.g., Tableau, Power BI) and their integration with
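Since the listing calls out streaming technologies such as Apache Kafka, here is a minimal producer sketch using the kafka-python client; the broker address, topic name, and event payload are assumptions for the example.

```python
# A minimal Kafka producer sketch with the kafka-python client. The broker
# address, topic, and payload below are invented for illustration.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# send() is asynchronous; flush() blocks until the message is delivered.
producer.send("user-events", {"user_id": 42, "action": "login"})
producer.flush()
```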
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
In Technology Group
warehousing. Proficiency in Python or another programming language used for data engineering. Experience with cloud platforms (e.g., Azure, AWS, or GCP) is highly desirable. Familiarity with tools such as Apache Airflow, Spark, or similar is a plus. What’s On Offer: Competitive salary between £45,000 – £55,000, depending on experience. Flexible hybrid working – 3 days on-site
for performance, reliability, and scalability. Collaborate with engineers and analysts to understand data requirements and deliver high-quality data solutions. Implement and manage ETL processes using advanced tools like Apache Airflow, Spark, or similar. Ensure data quality and consistency through rigorous testing, validation, and governance practices. Deploy, monitor, and maintain data infrastructure in cloud environments (AWS, GCP, Azure … Security+). Experience or expertise using, managing, and/or testing API Gateway tools and REST APIs. Experience or expertise configuring an LDAP client to connect to IPA. Experience with Apache Hadoop and ETL. Who we are: Reinventing Geospatial, Inc. (RGi) is a fast-paced small business that has the environment and culture of a start-up, with the stability
future-proofing of the data pipelines. ETL and Automation Excellence: Lead the development of specialized ETL workflows, ensuring they are fully automated and optimized for performance using tools like Apache Airflow, Snowflake, and other cloud-based technologies. Drive improvements across all stages of the ETL cycle, including data extraction, transformation, and loading. Infrastructure & Pipeline Enhancement: Spearhead the upgrading
between systems. Experience with Google Cloud Platform (GCP) is highly preferred. (Experience with other cloud platforms like AWS or Azure can be considered.) Familiarity with data pipeline scheduling tools like Apache Airflow. Ability to design, build, and maintain data pipelines for efficient data flow and processing. Understanding of data warehousing best practices and experience in organising and cleaning up
of distributed systems, databases (PostgreSQL, Databricks, ClickHouse, Elasticsearch), and performance tuning. Familiarity with modern web frameworks and front-end technologies (React, Vue, Angular, etc.). Experience with data processing frameworks (Apache Spark, Kafka, Airflow, Dagster, or similar). Experience with cloud platforms (AWS, GCP, Azure) and containerization technologies (Docker/Kubernetes). Experience with testing frameworks. Strong analytical skills and a
Senior Data Engineer - Up to £90K - Central London (1 day in the office) Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics. Are you passionate about building scalable data solutions that drive real business impact? I am looking for a Senior Data … Excellent experience with DBT, SQL and Python. Good customer-facing/pitching experience and being a self-sufficient person. A proactive mindset with excellent problem-solving skills. Experience with Airflow and medallion architecture is desirable. A degree in computer science or a related field is beneficial. Benefits: Company bonus scheme (based on annual profit made by the company) Pension … with data? Apply now and become part of a business where data drives every decision. Please send your CV to peter.hutchins @ circlerecruitment.com Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics. Circle Recruitment is acting as an Employment Agency in relation to
in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … a strong financial backing, and the chance to make a real impact! We're looking for the following experience: Extensive hands-on experience with Snowflake. Extensive experience with dbt, Airflow, AWS and Terraform. Excellent scripting skills in SQL. Experience developing solutions entirely from scratch. Great communication skills, with the ability to understand and translate complex requirements into technical solutions
SQL, craft new features. Modelling sprint: run hyper-parameter sweeps or explore heuristic/greedy and MIP/SAT approaches. Deployment: ship a model as a container, update an Airflow (or Azure Data Factory) job. Review: inspect dashboards, compare control vs. treatment, plan next experiment. Tech stack Python (pandas, NumPy, scikit-learn, PyTorch/TensorFlow) SQL (Redshift, Snowflake or similar) AWS SageMaker → Azure ML migration, with Docker, Git, Terraform, Airflow/ADF Optional extras: Spark, Databricks, Kubernetes. What you'll bring 3-5+ years building optimisation or recommendation systems at scale. Strong grasp of mathematical optimisation (e.g., linear/integer programming, meta-heuristics) as well as ML. Hands-on cloud ML experience (AWS or Azure). Proven … Terraform. SQL mastery for heavy-duty data wrangling and feature engineering. Experimentation chops - offline metrics, online A/B test design, uplift analysis. Production mindset: containerise models, deploy via Airflow/ADF, monitor drift, automate retraining. Soft skills: clear comms, concise docs, and a collaborative approach with DS, Eng & Product. Bonus extras: Spark/Databricks, Kubernetes, big-data panel
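For the mathematical optimisation side of this role, here is a toy linear-programming example with scipy; the objective and constraints are invented purely to show the mechanics, and a real MIP would typically use a dedicated solver.

```python
# A toy linear programme: maximise 3x + 2y subject to x + y <= 4 and
# x + 3y <= 6, with x, y >= 0. linprog minimises, so the objective is negated.
import numpy as np
from scipy.optimize import linprog

c = np.array([-3.0, -2.0])                 # negated objective coefficients
A_ub = np.array([[1.0, 1.0], [1.0, 3.0]])  # constraint matrix
b_ub = np.array([4.0, 6.0])                # constraint right-hand sides

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(result.x, -result.fun)  # optimal point (4, 0) and maximised value 12
```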
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and analysing A/B tests. Strong storytelling and stakeholder-management skills. Full UK work authorization. Desirable: Familiarity with cloud data platforms (Snowflake, Redshift, BigQuery) and ETL tools (dbt, Airflow). Experience with loyalty-programme analytics or CRM platforms. Knowledge of machine-learning frameworks (scikit-learn, TensorFlow) for customer scoring. Technical Toolbox: Data & modeling: SQL, Python/R, pandas, scikit-learn. Dashboarding: Tableau or Power BI. ETL & warehousing: dbt, Airflow, Snowflake/Redshift/BigQuery. Experimentation: A/B testing platforms (Optimizely, VWO). Desired Skills and Experience: 8+ years in retail/FMCG customer insights and analytics. Built customer segmentation, CLV, and propensity models in Python/R. Designed and analysed A/B and multivariate tests for pricing
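Given the emphasis on designing and analysing A/B tests, here is a minimal sketch of a two-variant analysis using a chi-square test of independence; the conversion counts are invented for illustration.

```python
# A minimal two-variant A/B test analysis. Rows are control/treatment,
# columns are converted/not converted; the counts are invented.
import numpy as np
from scipy.stats import chi2_contingency

observed = np.array([
    [120, 880],   # control: 12.0% conversion
    [150, 850],   # treatment: 15.0% conversion
])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.2f}, p={p_value:.4f}")  # a small p suggests a real uplift
```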
tools (e.g., Cursor, Gemini, Claude) into development workflows. Curate and manage datasets across structured and unstructured formats and diverse domains. Contribute to metadata enrichment, lineage, and discoverability using DBT, Airflow, and internal tooling. Skills & Experience We value both traditional and non-traditional career paths. You'll ideally bring: Technical Skills 3-5 years of experience in data or analytics … Python and SQL, with strong debugging and performance tuning skills. Experience building pipelines with AWS services such as Glue, S3, Athena, Redshift, and Lambda. Familiarity with orchestration tools (e.g., Airflow, Step Functions) and DevOps practices (e.g., CI/CD, Infrastructure as Code). Interest in Generative AI and a willingness to grow your skills in LLM integration and prompt
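To ground the AWS pipeline experience mentioned above, here is a minimal boto3 sketch that starts an Athena query; the region, database, query, and output bucket are invented for the example.

```python
# A minimal sketch of launching an Athena query with boto3. All identifiers
# below (region, database, table, bucket) are hypothetical.
import boto3

athena = boto3.client("athena", region_name="eu-west-1")

response = athena.start_query_execution(
    QueryString="SELECT count(*) FROM events",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
print(response["QueryExecutionId"])  # poll get_query_execution for status
```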
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies Holdings Inc
and learning new technologies quickly. Preferred Qualifications: Experience with software development. Experience with geospatial data. Experience building data-streaming processes. Experience using PostGIS. Experience with any of the following: Apache Hive, Trino, Presto, Starburst, OpenMetadata, Apache Superset, Terraform, dbt, Tableau, Fivetran, Airflow. Experience implementing resilient, scalable, and supportable systems in AWS. Experience using a wide variety of open
and secure data handling. Requirements: 5+ years in data engineering or a similar role. Strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake. Proficiency in SQL, Python, and tools like dbt and Airflow. Familiarity with DevOps practices in a data context. Benefits: Work on impactful, enterprise-wide data projects. Collaborate with architects, analysts, and data scientists. Be part of a supportive, innovative
and data visualisation. BSc in Computer Science or a related discipline. Good working knowledge of SQL. Comfortable using Git for version control. Exposure to workflow orchestration tools (e.g. Prefect, Airflow, Dagster). Experience with cloud data warehouses (Azure SQL, Snowflake) or dbt. Basic familiarity with Docker and BI tools (Power BI, Tableau). Interest in shipping, financial markets, or commodities. Package