Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
In Technology Group
warehousing. Proficiency in Python or another programming language used for data engineering. Experience with cloud platforms (e.g., Azure, AWS, or GCP) is highly desirable. Familiarity with tools such as Apache Airflow, Spark, or similar is a plus. What’s On Offer: Competitive salary between £45,000 – £55,000, depending on experience. Flexible hybrid working – 3 days on-site More ❯
for performance, reliability, and scalability. Collaborate with engineers and analysts to understand data requirements and deliver high-quality data solutions. Implement and manage ETL processes using advanced tools like Apache Airflow, Spark, or similar. Ensure data quality and consistency through rigorous testing, validation, and governance practices. Deploy, monitor, and maintain data infrastructure in cloud environments (AWS, GCP, Azure … Security+) Experience or expertise using, managing, and/or testing API Gateway tools and Rest APIs Experience or expertise configuring an LDAP client to connect to IPA Experience with Apache Hadoop and ETL Who we are: Reinventing Geospatial, Inc. (RGi) is a fast-paced small business that has the environment and culture of a start-up, with the stability More ❯
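The ETL duties listed above (build, validate, load) can be sketched in miniature. The following is an illustrative pure-Python sketch only, not the employer's actual pipeline: the record layout and field names are invented, and in a real deployment each function would typically run as an orchestrated task (for example an Airflow operator).

```python
# Minimal extract-transform-load sketch in plain Python.
# In production each function would typically be wrapped as an
# orchestrator task (e.g. an Airflow PythonOperator); the record
# layout here is invented for illustration.

def extract():
    # Stand-in for reading from a source system or API.
    return [
        {"id": 1, "amount": "10.50", "region": "UK"},
        {"id": 2, "amount": "n/a", "region": "US"},
        {"id": 3, "amount": "7.25", "region": "UK"},
    ]

def transform(rows):
    # Validate and normalise: drop rows whose amount cannot be parsed.
    clean = []
    for row in rows:
        try:
            clean.append({**row, "amount": float(row["amount"])})
        except ValueError:
            continue  # a real pipeline would log or quarantine this row
    return clean

def load(rows, target):
    # Stand-in for writing to a warehouse table.
    target.extend(rows)
    return len(rows)

def run_pipeline():
    warehouse = []
    loaded = load(transform(extract()), warehouse)
    return warehouse, loaded
```

Keeping extract, transform, and load as small pure functions is what makes the "rigorous testing and validation" mentioned above practical: each stage can be exercised in isolation before it is wired into a scheduler.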
future-proofing of the data pipelines. ETL and Automation Excellence: Lead the development of specialized ETL workflows, ensuring they are fully automated and optimized for performance using tools like Apache Airflow, Snowflake, and other cloud-based technologies. Drive improvements across all stages of the ETL cycle, including data extraction, transformation, and loading. Infrastructure & Pipeline Enhancement: Spearhead the upgrading More ❯
between systems Experience with Google Cloud Platform (GCP) is highly preferred. (Experience with other cloud platforms like AWS, Azure can be considered.) Familiarity with data pipeline scheduling tools like Apache Airflow Ability to design, build, and maintain data pipelines for efficient data flow and processing Understanding of data warehousing best practices and experience in organising and cleaning up More ❯
of distributed systems, databases (PostgreSQL, Databricks, ClickHouse, Elasticsearch), and performance tuning. Familiarity with modern web frameworks and front-end technologies (React, Vue, Angular, etc.) Experience with data processing frameworks (Apache Spark, Kafka, Airflow, Dagster or similar) Experience with cloud platforms (AWS, GCP, Azure) and containerization technologies (Docker/Kubernetes) Experience with testing frameworks Strong analytical skills and a More ❯
Senior Data Engineer - Up to £90K - Central London (1 day in the office) Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics Are you passionate about building scalable data solutions that drive real business impact? I am looking for a Senior Data … Excellent experience of DBT, SQL and Python Good customer-facing/pitching experience and a self-sufficient approach A proactive mindset with excellent problem-solving skills Experience with Airflow and the medallion architecture is desirable A degree in computer science or a related degree is beneficial Benefits: Company bonus scheme (based on annual profit made by the company) Pension … with data? Apply now and become part of a business where data drives every decision. Please send your CV to peter.hutchins @ circlerecruitment.com Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics Circle Recruitment is acting as an Employment Agency in relation to More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … a strong financial backing, and the chance to make a real impact! We're looking for the following experience: Extensive hands-on experience with Snowflake Extensive experience with dbt, Airflow, AWS and Terraform Excellent scripting skills in SQL Experience developing solutions entirely from scratch Great communication skills, with the ability to understand and translate complex requirements into technical solutions More ❯
SQL, craft new features. Modelling sprint: run hyper-parameter sweeps or explore heuristic/greedy and MIP/SAT approaches. Deployment: ship a model as a container, update an Airflow (or Azure Data Factory) job. Review: inspect dashboards, compare control vs. treatment, plan next experiment. Tech stack Python (pandas, NumPy, scikit-learn, PyTorch/TensorFlow) SQL (Redshift, Snowflake or … similar) AWS SageMaker → Azure ML migration, with Docker, Git, Terraform, Airflow/ADF Optional extras: Spark, Databricks, Kubernetes. What you'll bring 3-5+ years building optimisation or recommendation systems at scale. Strong grasp of mathematical optimisation (e.g., linear/integer programming, meta-heuristics) as well as ML. Hands-on cloud ML experience (AWS or Azure). Proven … Terraform. SQL mastery for heavy-duty data wrangling and feature engineering. Experimentation chops - offline metrics, online A/B test design, uplift analysis. Production mindset: containerise models, deploy via Airflow/ADF, monitor drift, automate retraining. Soft skills: clear comms, concise docs, and a collaborative approach with DS, Eng & Product. Bonus extras: Spark/Databricks, Kubernetes, big-data panel More ❯
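The heuristic/greedy approaches named in the modelling sprint above can be illustrated with a value-density greedy for a knapsack-style selection problem. This is a generic textbook baseline, not the team's actual formulation; the items and capacity below are invented for illustration.

```python
# Greedy value-density heuristic for a 0/1 knapsack-style selection:
# the kind of fast baseline one might benchmark against a MIP/SAT
# formulation. Items and capacity are invented for illustration.

def greedy_knapsack(items, capacity):
    """items: list of (name, value, weight). Returns (chosen names, total value)."""
    chosen, total_value, used = [], 0.0, 0.0
    # Consider items by value per unit weight, best first.
    for name, value, weight in sorted(items, key=lambda it: it[1] / it[2], reverse=True):
        if used + weight <= capacity:
            chosen.append(name)
            total_value += value
            used += weight
    return chosen, total_value

items = [("a", 60, 10), ("b", 100, 20), ("c", 120, 30)]
```

On this data, `greedy_knapsack(items, 50)` selects a and b for a total value of 160, while the exact optimum is b and c at 220; that gap is precisely why greedy baselines get compared against exact MIP/SAT solutions in a modelling sprint.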
London, England, United Kingdom Hybrid / WFH Options
Harnham
and analysing A/B tests Strong storytelling and stakeholder-management skills Full UK work authorization Desirable Familiarity with cloud data platforms (Snowflake, Redshift, BigQuery) and ETL tools (dbt, Airflow) Experience with loyalty-programme analytics or CRM platforms Knowledge of machine-learning frameworks (scikit-learn, TensorFlow) for customer scoring Technical Toolbox Data & modeling: SQL, Python/R, pandas, scikit-learn … Dashboarding: Tableau or Power BI ETL & warehousing: dbt, Airflow, Snowflake/Redshift/BigQuery Experimentation: A/B testing platforms (Optimizely, VWO) Desired Skills and Experience 8+ years in retail/FMCG customer insights and analytics Built customer segmentation, CLV, and propensity models in Python/R Designed and analysed A/B and multivariate tests for pricing More ❯
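The A/B-test design and analysis called for above often reduces to a two-proportion comparison. Below is a minimal standard-library sketch; the conversion counts in the example are invented, and a production analysis would add power calculations, guardrail metrics, and multiple-testing corrections.

```python
# Two-proportion z-test for an A/B conversion experiment, using only
# the standard library. Counts passed in are invented for illustration.
import math

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Return (uplift, z, two-sided p-value) comparing variant B against A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal tail via erfc.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return p_b - p_a, z, p_value
```

For example, 200/1000 conversions in control against 260/1000 in treatment gives a 6-point uplift with a p-value well under 0.01, so the difference would typically be treated as significant at conventional thresholds.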
business constraints into mathematical formulations. Tech stack Python (pandas, NumPy, scikit-learn, PyTorch/TensorFlow) SQL (Redshift, Snowflake or similar) AWS SageMaker → Azure ML migration, with Docker, Git, Terraform, Airflow/ADF Optional extras: Spark, Databricks, Kubernetes. What you'll bring 3-5+ years building optimisation or recommendation systems at scale. Strong grasp of mathematical optimisation (e.g., linear … Terraform. SQL mastery for heavy-duty data wrangling and feature engineering. Experimentation chops - offline metrics, online A/B test design, uplift analysis. Production mindset: containerise models, deploy via Airflow/ADF, monitor drift, automate retraining. Soft skills: clear comms, concise docs, and a collaborative approach with DS, Eng & Product. Bonus extras: Spark/Databricks, Kubernetes, big-data panel More ❯
tools (e.g., Cursor, Gemini, Claude) into development workflows. Curate and manage datasets across structured and unstructured formats and diverse domains. Contribute to metadata enrichment, lineage, and discoverability using dbt, Airflow, and internal tooling. Skills & Experience We value both traditional and non-traditional career paths. You'll ideally bring: Technical Skills 3-5 years of experience in data or analytics … Python and SQL, with strong debugging and performance tuning skills. Experience building pipelines with AWS services such as Glue, S3, Athena, Redshift, and Lambda. Familiarity with orchestration tools (e.g., Airflow, Step Functions) and DevOps practices (e.g., CI/CD, Infrastructure as Code). Interest in Generative AI and a willingness to grow your skills in LLM integration and prompt More ❯
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies Holdings Inc
and learning new technologies quickly. Preferred Qualifications: Experience with software development. Experience with geospatial data. Experience building data-streaming processes. Experience using PostGIS. Experience with any of the following: Apache Hive, Trino, Presto, Starburst, OpenMetadata, Apache Superset, Terraform, dbt, Tableau, Fivetran, Airflow. Experience implementing resilient, scalable, and supportable systems in AWS. Experience using a wide variety of open More ❯
and secure data handling Requirements: 5+ years in data engineering or Strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake Proficiency in SQL, Python, and tools like dbt and Airflow Familiarity with DevOps practices in a data context Benefits: Work on impactful, enterprise-wide data projects Collaborate with architects, analysts, and data scientists Be part of a supportive, innovative More ❯
and data visualisation. BSc in Computer Science or a related discipline Good working knowledge of SQL Comfortable using Git for version control Exposure to workflow orchestration tools (e.g. Prefect, Airflow, Dagster) Experience with cloud data warehouses (Azure SQL, Snowflake) or dbt Basic familiarity with Docker and BI tools (Power BI, Tableau) Interest in shipping, financial markets, or commodities Package More ❯
Computer Science or a related discipline Solid Python programming skills Good working knowledge of SQL Comfortable using Git for version control Desirables: Exposure to workflow orchestration tools (e.g. Prefect, Airflow, Dagster) Experience with cloud data warehouses (Azure SQL, Snowflake) or dbt Basic familiarity with Docker and BI tools (Power BI, Tableau) Interest in shipping, financial markets, or commodities Package More ❯
analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong More ❯
code and building robust processes for scale. Nice-to-have Experience shipping at pace on AWS S3 environments. Experience working with the Dagster orchestrator or similar data orchestration systems (e.g. Airflow, Databricks). Software Engineering background. Benefits Competitive salary & equity options Unlimited holiday Benefits package Career development opportunities as the company scales Ownership of ambitious, mission-driven work with real More ❯
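Orchestrators such as Dagster and Airflow, named above, ultimately execute tasks in dependency order over a directed acyclic graph. A minimal standard-library sketch of that core idea follows; the task names are invented, and it assumes Python 3.9+ for the `graphlib` module.

```python
# Orchestrators like Dagster and Airflow run tasks in dependency order
# over a DAG. This stdlib sketch shows only the scheduling core;
# task names are invented for illustration.
from graphlib import TopologicalSorter

# Each key maps a task to the set of tasks that must run before it.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

def run_order(dag):
    # static_order() yields the tasks in a valid execution order,
    # raising CycleError if the graph contains a cycle.
    return list(TopologicalSorter(dag).static_order())
```

Real orchestrators layer scheduling, retries, and observability on top of exactly this dependency-resolution step, which is why experience with one system usually transfers to another.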
bash. Ability to document code, architectures, and experiments. Preferred Qualifications Experience with databases and data warehousing (Hive, Iceberg). Data transformation skills (SQL, dbt). Experience with orchestration platforms (Airflow, Argo). Knowledge of data catalogs, metadata management, vector databases, relational/object databases. Experience with Kubernetes. Understanding of computational geometry (meshes, boundary representations). Ability to analyze data More ❯
applications to the Cloud (AWS) We'd love to hear from you if you Have strong experience with Python & SQL Have experience developing data pipelines using dbt, Spark and Airflow Have experience with data modelling (building optimised and efficient data marts and warehouses in the cloud) Work with Infrastructure as code (Terraform) and containerising applications (Docker) Work with AWS, S3 More ❯
concepts to diverse audiences and collaborate effectively across teams. Bonus Points For: Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker). Experience with specific orchestration tools (e.g., Airflow, dbt). Experience working in Agile/Scrum development methodologies. Experience with Big Data Technologies & Frameworks Join Us! This role can be based in either of our amazing offices More ❯