Stack: Python and Scala; Starburst and Athena; Kafka and Kinesis; DataHub; MLflow and Airflow; Docker and Terraform; Kafka, Spark, Kafka Streams and KSQL; DBT; AWS, S3, Iceberg, Parquet, Glue and EMR for our Data Lake; Elasticsearch and DynamoDB. More information: Enjoy fantastic perks like private healthcare & dental insurance, a …
working autonomously. It would be a real bonus, but not a requirement, if: You've worked in a start-up environment. You've got DBT experience. You're familiar with MLOps principles and practices and their application in a production setting. Interview Process: You'll have a 20-minute conversation …
Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (Python, Data Modelling, ETL/ELT, Apache Airflow, DBT, AWS) Enterprise-scale tech firm Up to £70,000 plus benefits - FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly …
Snowflake/Databricks/Redshift/BigQuery, including performance tuning and optimisation. Understanding of best practices for designing scalable and efficient data models, leveraging dbt for transformations. Familiarity with CircleCI, Terraform or similar tools for deployment and infrastructure as code. In this role, you will be responsible for: Shipping and …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
as well as 30 Engineers in the business's data arm. Requirements: 3+ years' data engineering experience; Snowflake experience; proficiency across an AWS tech stack; DBT expertise; Terraform experience. Nice to Have: Data Modelling; Data Vault; Apache Airflow. Benefits: Up to 10% bonus; up to 14% pension contribution; 29 days annual …
London, South East England, United Kingdom Hybrid / WFH Options
Winston Fox
the Data space. This role will also allow the successful individual to cross-train into modern Data Engineering tools and technologies such as Airflow, dbt and Snowflake, as well as further develop their skills in Python, SQL and Market Data platforms. The firm works on a hybrid working schedule (three days per week …
Salisbury, England, United Kingdom Hybrid / WFH Options
Ascentia Partners
practices. Nice-to-Have Skills: Exposure to AWS Redshift, Glue, or Snowflake. Familiarity with BigQuery and Google Analytics APIs. Proficiency in Python, PySpark, or dbt for data transformations. Background in insurance, especially in pricing analytics or actuarial data.
a leader within the business over time. The current tech stack is varied, made up of TypeScript, Python, PostgreSQL, Redis and DBT on AWS. You'll be encouraged to take ownership of products, and you'll be given this autonomy from the co-founders. The products handle …
Team The Data team is a cross-functional team of experienced and passionate data enthusiasts. We use and own modern data tools (Fivetran, Snowflake, dbt, Looker) and cover a diverse range of data problems and stakeholders. What we're offering you: Flexible hours and summer hours; competitive holiday benefits …
Python, or Java; comfortable reviewing code, guiding design decisions, and supporting technical excellence. Big Data Expertise: In-depth experience with tools like Kafka, Flink, dbt, Airflow, and Airbyte, and a solid understanding of building and operating modern data ecosystems. Cloud Experience: Hands-on experience with AWS, GCP, or both; comfortable …
and coaching others to succeed. Have a strong background in building and managing data infrastructure at scale, with expertise in Python, SQL, BigQuery, AWS, dbt and Airflow. Have a strong background in data modelling and building scalable data pipelines. Are naturally curious and enthusiastic about experimenting with new tech to …
Desired Skills & Experience: Core Technical Skills: Expert in Python, SQL, and modern data science toolkits (e.g. scikit-learn, XGBoost, statsmodels). Solid grasp of dbt for data transformation. Experience with modern cloud data stacks - Snowflake, BigQuery, Redshift, etc. Comfortable working in agile environments with tools like Git, Jupyter, Airflow. Domain …
features they implement. What we need from you: At least 3 years of relevant data engineering experience Strong Python and SQL skills Experience with dbt Experience with AWS Experience working with a columnar database such as Redshift Strong experience with ETL/ELT and the management of data pipelines Familiarity …
Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s) and data pipelines (SQL, dbt, Airbyte) Love automation, process improvement and finding ways to help others work efficiently Are comfortable working autonomously and taking responsibility for the delivery of large …
timelines and quality standards are met. Required Skills & Experience: 5+ years' experience as a Data Analyst Strong skills in Python, SQL and tools like dbt, Snowflake, AWS S3 and SQL Server. Solid understanding of financial instruments such as Equities, Futures, Forwards, CDS, IRS and ETFs with deep knowledge in at …
Role Job Description: AWS Services: Glue, Lambda, IAM, Service Catalog, CloudFormation, Lake Formation, SNS, SQS, EventBridge. Language & Scripting: Python and Spark. ETL: DBT. Good to Have: Airflow, Snowflake, Big Data (Hadoop), and Teradata. Responsibilities: Serve as the primary point of contact for all AWS-related data initiatives and …
Tools). Experience with one or more of the following is a plus: Kubernetes, Prometheus, Argo workflows, GitHub Actions, Elasticsearch/OpenSearch, PostgreSQL, BigQuery, DBT data pipelines, Fastly, Storybook, Contentful, Deno, Bun. Benefits: We want to give you a great work environment; contribute back to both your personal and professional …