engineering role and graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field ● Proficiency in writing SQL queries and knowledge of cloud-based databases like Snowflake, Redshift, BigQuery or other big data solutions ● Experience in data modelling and tools such as dbt, ETL processes, and data warehousing ● Experience with at least one of the programming languages
London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
strategic business decisions. Responsibilities for the AWS Data Engineer: Design, build and maintain scalable data pipelines and architectures within the AWS ecosystem Leverage services such as AWS Glue, Lambda, Redshift, EMR and S3 to support data ingestion, transformation and storage Work closely with data analysts, architects and business stakeholders to translate requirements into robust technical solutions Implement and optimise
London, South East, England, United Kingdom Hybrid/Remote Options
Arc IT Recruitment
based architecture. You will need to have skills and experience in the following: Cloud Data Architecture: Strong knowledge of modern, cloud-based data architecture and tooling (e.g., S3, Glue, Redshift, Athena, Lake Formation, Iceberg/Delta). AWS Platform Build: Demonstrable experience designing and building modern data platforms in AWS. ETL/Orchestration Expertise: Expertise in ETL/ELT
with analysts and architects to ensure compliance with government security standards. Troubleshoot and resolve issues in complex cloud environments. Essential Skills Strong experience with AWS services (Glue, Lambda, S3, Redshift, IAM). Proficiency in Python and SQL for data engineering tasks. Knowledge of data modelling, ETL frameworks, and best practices. Familiarity with security and compliance in government or regulated
and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) would be beneficial. Data Platform Management: Comfortable operating in hybrid environments (cloud and on-prem). Experience integrating diverse data sources and systems. Understanding of secure data transfer
London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
persuasively Desirable Skills for the Senior Data Engineer: Experience with event sourcing, dbt, or related data transformation tools Familiarity with PostgreSQL and cloud-native data services (Azure Event Hub, Redshift, Kinesis, S3, Blob Storage, OneLake, or Microsoft Fabric) Understanding of machine learning model enablement and operationalisation within data architectures Experience working within Agile delivery environments If you are an
Hands-on experience with data orchestration tools (Airflow, dbt, Dagster, or Prefect). Solid understanding of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift). Experience with streaming technologies (Kafka, Kinesis, or similar). Strong knowledge of data modelling, governance, and architecture best practices. Excellent leadership, communication, and stakeholder management skills. NICE
the organisation. What you'll be doing: Design, develop, and maintain scalable data architectures and ETL pipelines Build and manage data models and data warehouse solutions (Airflow, dbt, and Redshift) Write clean, efficient Python and SQL code for data processing and transformation Integrate data from internal and third-party APIs and services Optimise data pipelines for performance, scalability, and
teams as needed. Skills required to contribute: Years of overall Data and Analytics experience with 2. Minimum 10+ years in AWS data platform including AWS S3, AWS Glue, AWS Redshift, AWS Athena, AWS SageMaker, AWS QuickSight and AWS MLOps 3. Snowflake DWH architecture, Snowflake Data Sharing, Snowpipe, Polaris catalog and data governance (metadata/business catalogs). 4.
may be a great fit if you have experience with any of the following... Workflow orchestration tooling (e.g. Prefect, Airflow, Dagster) Experience with cloud data warehouses (e.g. BigQuery, Snowflake, Redshift) Data transformation tools (e.g. dbt) and data quality frameworks (e.g. Great Expectations) Backend Python frameworks (e.g. Django, FastAPI, Flask) for API development Modern data processing libraries (e.g. Polars, DuckDB
leveraging data to improve the OTM user and agent experience. This role would be suited to an individual with a background in Python, SQL, AWS, Looker, PyTorch, Airflow and Redshift/Postgres. The candidate must be proficient in working with large volumes of data and automating data loads, and able to understand end user requirements with … goal of building a solution to provide high quality data processing and analysis as well as building APIs. RESPONSIBILITIES Develop data pipelines using Python and Airflow Data warehousing (Redshift/Spectrum) Develop data solutions to address business needs Maintain existing reports and improve their performance QUALIFICATIONS Bachelor's degree in Computer Science, Data Science, Mathematics, Engineering or equivalent experience … query performance. Strong skills in Python Experience with AWS or GCP Demonstrated experience and responsibility with data, processes, and building ETL pipelines. Experience with cloud data warehouses such as Amazon Redshift and Google BigQuery. Building visualizations using tools such as Looker Studio or equivalent Experience in designing ETL/ELT solutions, preferably using tools like Airflow or AWS
London, South East, England, United Kingdom Hybrid/Remote Options
INTEC SELECT LIMITED
Engineer, particularly in data modelling. Strong SQL and hands-on dbt experience. Ability to convert business requirements into logical, scalable data models. Knowledge of cloud data platforms (e.g., Snowflake, Redshift, BigQuery). Strong communication and documentation skills. Structured, detail-oriented mindset. Desirable: Experience with semantic modelling tools (e.g., dbt SL, LookML). Familiarity with workflow orchestration and BI tooling.
such as: Hadoop, Kafka, Apache Spark, Apache Flink, object, relational and NoSQL data stores. Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery) Expertise in building data architectures that support batch and streaming paradigms Experience with standards such as JSON, XML, YAML, Avro, Parquet Strong communication skills Open to
South East London, London, United Kingdom Hybrid/Remote Options
Stepstone UK
what to do, run some whiteboard design sessions (realising you've used a permanent marker, you quickly leave the room) Qualifications Data Warehousing, Data Modelling, Database Design, ETL AWS Redshift, AWS Glue, S3, SQL Server, Power BI Cloud/DevOps AWS, Docker, Terraform Bitbucket, Bamboo, CI/CD deploy pipeline Agile, pair programming, code reviews Additional Information We're a
models, tests, documentation, version control. Understanding of data warehousing concepts (star schemas, snowflake, slowly changing dimensions, partitioning, clustering). Experience working in a modern data stack (e.g. BigQuery, Snowflake, Redshift, Databricks, etc.) Comfortable working downstream (with BI/analytics users) and upstream (pipelines, ingestion) contexts. Familiarity with BI tools (we use Thoughtspot and Power BI). Proficient in Python.
London, South East England, United Kingdom Hybrid/Remote Options
Zefr
experience (AWS or GCP) with services for compute, data, and AI/ML. Proven skills in data engineering (ETL pipelines, APIs, SQL/NoSQL, data warehouses like BigQuery/Redshift). Experience with AI agents, LLMs and orchestration frameworks (LangChain, AutoGen, LlamaIndex, etc.). Demonstrated ability to ship production-ready internal tools or SaaS integrations. Strong problem-solver who
Watford, Hertfordshire, South East, United Kingdom
CV Screen Ltd
the team, promoting best practices and knowledge sharing. What Experience is Required Proven experience as a Data Engineer, ideally 4+ years in a similar role. Strong proficiency with SQL, Amazon Redshift, and preferably MySQL and QuickSight. Experience with AWS Glue, Python scripting, and building ETL/ELT data processes. Salary & Benefits £85,000 per annum plus excellent
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
in commercial analytics, ideally with marketing or customer analytics exposure. Strong SQL skills with experience in dbt or similar modelling frameworks. Experience with modern cloud data warehouses (BigQuery, Snowflake, Redshift, etc.). Skilled in dashboarding tools (Looker, Power BI, etc.). Comfortable working with Marketing/User Acquisition teams. Understanding of advertising platform constraints (Meta, Google, mobile tracking).
London, South East, England, United Kingdom Hybrid/Remote Options
Holland & Barrett International Limited
Key Requirements About You Proficient in SQL, with experience querying and transforming large datasets across marketing, sales, and customer data sources. Comfortable working with modern data stacks, ideally including Redshift, BigQuery, Matillion, Metabase, and Retool. Strong grasp of performance marketing metrics such as ROAS, CAC, CTR, CPM, and attribution models. Experienced in building dashboards and visualisations that enable data
South East London, London, United Kingdom Hybrid/Remote Options
Stepstone UK
IQ, Adobe Analytics, GTM and Adobe Dynamic Tag Manager) changes. Integrate data sources via web and REST APIs. Data pipelining and modelling using SQL, dbt, Airflow, ETL, Data Warehousing, Redshift and Python. Transfer knowledge of the business processes and requirements to the development teams. Collaborate with Product, Marketing and Development teams to collect business requirements and translate them into