Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
Delta Lake, and other enterprise-scale data stores. Data Orchestration - Enterprise-scale usage of technologies such as Azure Data Factory, Apache Airflow, Logic Apps, dbt, SnapLogic, Spark or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure as code and other DevOps practices. more »
Python/JavaScript/C# Familiarity with statistical/machine learning/AI concepts and techniques Understanding of data pipeline/orchestration tools e.g. dbt, Dataform Appreciation of GCP’s serverless technologies e.g. Cloud Run/Workflows Understanding of Google’s marketing stack: Google Analytics, Google Tag Manager, Google Ads more »
LookML Experience working with numerous AWS services A detailed understanding of CI/CD practices & tooling A research or mathematical background Experience working with dbt & dbt Cloud Experience working with data orchestration tooling (e.g. Dagster, Prefect, etc.) Experience working with data ingestion tooling (e.g. Fivetran, Keboola, etc.) Experience working with more »
Manchester, Greater Manchester, United Kingdom Hybrid / WFH Options
AutoTrader UK
and scalable web hosting and data platforms. Our platform is a layer on top of core Open Source technologies such as Kubernetes, Istio, Airflow, dbt, running in Public Cloud. It is the glue that allows our teams to deploy into production environments 100s of times per day with the least more »
Manchester, Greater Manchester, United Kingdom Hybrid / WFH Options
AutoTrader UK
are at the cutting edge of applying data technologies to solve problems, and you can expect to work with a range of technologies including dbt, Kotlin/Java, Python, Apache Spark and Kafka. Join us as a Principal Software Engineer and, as well as shaping and creating the foundations for insight more »
hands-on software engineering using a broad range of technologies including the following: Java or Python Microservices Data pipelines and database programming such as dbt, SQL, BigQuery, Cloud Composer etc. CI/CD/DevOps tooling experience e.g. Git, Jenkins etc. What you'll get to learn (any previous experience more »
M50, Trafford Park, Trafford, Greater Manchester, United Kingdom
Hoist Finance
SQL Server, Oracle, MySQL, PostgreSQL) Knowledge of BI stack design and implementation Any knowledge in some of the following areas is an advantage: Snowflake, dbt, Azure technologies including Azure Data Factory, Azure Data Lake, Azure DevOps, PowerShell, Git, Python. Excellent communication and interpersonal skills As a Data Engineer you will more »
Employment Type: Permanent
Salary: £40000 - £45000/annum + Car Allowance + Bonus
greenfield re-architecture from the ground up in Microsoft Azure. The tech you'll be playing with: Azure Data Factory Azure Databricks PySpark SQL dbt What you need to bring: 1-3 years' experience in building data pipelines SQL experience in data warehousing Python experience would be desirable Experience more »
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry standards like CRISP more »
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Deloitte
Snowpark, Snowpipe, Tasks & Streams, Dynamic Tables. Demonstrable experience with the integration of technologies across the wider Snowflake partner ecosystem such as Informatica, Matillion, Fivetran, dbt, Monte Carlo, Collibra, Alation and Tableau. Experience with Data Acquisition, Integration & Transformation solutions leveraging batch, micro-batch, CDC and event-driven patterns. Broad knowledge across Cloud architecture, DevOps more »
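The Tasks, Streams and Dynamic Tables mentioned in the listing above are Snowflake's in-warehouse mechanisms for automated, incremental refresh. As a minimal sketch, a Dynamic Table declares a freshness target and Snowflake schedules the refresh itself (the table, column, and warehouse names here are illustrative assumptions, not taken from any listing):

```sql
-- Hypothetical example: keep a per-customer order summary fresh
-- within one minute of new rows landing in raw_orders.
-- All object names are illustrative only.
CREATE OR REPLACE DYNAMIC TABLE order_summary
  TARGET_LAG = '1 minute'      -- Snowflake refreshes incrementally to meet this lag
  WAREHOUSE  = transform_wh    -- compute used for the refresh
AS
SELECT
    customer_id,
    COUNT(*)         AS order_count,
    SUM(order_total) AS lifetime_value
FROM raw_orders
GROUP BY customer_id;
```

Compared with a hand-rolled Task + Stream pair, a Dynamic Table replaces the explicit schedule and change-tracking plumbing with a single declarative freshness target.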
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
code. Implement TMS (Tealium IQ, GTM and Adobe Dynamic Tag Manager) changes. Integrate data sources via web and REST APIs. Data pipelining and modelling using SQL, dbt, Airflow, ETL, data warehousing, Redshift and Python. Transfer knowledge of the business processes and requirements to the development teams. Collaborate with Product, Marketing and Development more »
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Legal & General
positions are to focus on the retirements side of the Retail division and will build out new data pipelines utilising tools such as Synapse, dbt, Azure DevOps and Snowflake. This role will see you responsible for designing, building, and implementing a variety of data solutions using modern ETL techniques and more »
Queens Road, Teddington, Middlesex, England Hybrid / WFH Options
LGC LIMITED
opportunity to experience the ins and outs of working in a data team and exposure to market-leading technologies such as Tableau, Snowflake, dbt, and SAP Business Objects. Your duties and responsibilities in this role will consist of: Supporting in handling user access requests and the approval process for Business more »
and offline model delivery. Identifying business hypotheses worth pursuing. Hypothesis testing whilst including real-world constraints. Data structures, databases, and ETL processes. AWS, Snowflake, dbt, Jupyter notebooks, Spark, Mongo, and Postgres or similar. This is a pragmatic and humble organisation who are looking for like-minded people to help them more »
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Neogen Recruitment
products. Oversee the evaluation of third-party data sets to enhance our offerings. Tools: Proficiency in scripting and querying languages. Experience with AWS, Snowflake, dbt, Jupyter notebooks, and other relevant tools. What You Will Need: Advanced degree in statistics, mathematics, computer science, or a related field. Strong leadership skills with commercial more »
hands-on with projects; previous experience in this is essential SKILLS AND EXPERIENCE NEEDED: Experience in Redshift database (AWS) Experience with dbt, Airflow and Fivetran Working collaboratively with multiple teams across the business Management/mentoring experience of a team INTERVIEW PROCESS: 1st Stage - Initial Chat 2nd more »
of IR35 Required experience will include: Expertise in designing and implementing data pipelines using Azure services such as Azure Data Factory, Azure Databricks and dbt. Hands-on experience with SQL database design Experience of product lifecycle management principles and tools (e.g. DevOps, Terraform) and relational database manipulation and interrogation. more »
Coventry, England, United Kingdom Hybrid / WFH Options
WEG Tech
Science, Engineering, etc.) or equivalent Extensive experience in data engineering, data governance, and data management roles. Hands-on experience with Azure Data Factory and dbt, testing tools such as Great Expectations or Soda, and familiarity with the Snowflake data platform or similar cloud-based data warehousing solutions is also required. Additionally more »
Milton Keynes, England, United Kingdom Hybrid / WFH Options
Addition+
for optimisation. Experience & Skills Required 5+ years of experience as a Data Modeller Experience with dimensional modelling. Strong experience using Snowflake Experience with Informatica, dbt or Power BI A background working in insurance would be desirable A background in engineering would be desirable What’s in it For You? Amazing company to more »
Performing patient-derived organoids quality control steps (cell density, proliferation rate and cytotoxicity evaluation). Setting up and performing multimodality experiments according to the DBT lab project (cell plating, treatment, and organoids coculture with immune cells). Collecting and reporting data to the lab head. Executing and delivering of the more »
Abingdon-On-Thames, England, United Kingdom Hybrid / WFH Options
Mirus Talent
experience with the following: Snowflake: Proficiency in Snowflake, including its setup, configuration, and optimisation, is essential to drive the data platform forward. dbt (Data Build Tool): Solid understanding and experience with dbt for managing transformations and orchestrating data pipelines. Python: Strong programming skills in Python for scripting, automation, and data more »
Data Engineer – Hybrid (3 days in Office) Stratford-Upon-Avon £56k + 5% bonus & benefits Do you want to grow your expertise and experience and use your skills in a vibrant environment where teamwork, creativity, diversity, inclusivity, and technical excellence more »
Coventry, England, United Kingdom Hybrid / WFH Options
WEG Tech
the UK’s leading universities. As one of the Data Engineers within the team, you will be responsible for leveraging Azure Data Factory and dbt to design, develop, and maintain robust data pipelines and scalable data models which integrate data from a wide variety of structured and unstructured data sources … backlog of work in planning tools like Jira Utilise Azure Data Factory to design and develop scalable and efficient data pipelines Utilise dbt (Data Build Tool) to create and manage data transformation processes, ensuring consistent and reliable data output Design, implement and maintain Data Vault and Kimball-style data models to … and Kimball-style data warehousing methodologies. Proficient in SQL and data querying languages for data manipulation and analysis. Proficiency in Azure Data Factory and dbt, with a demonstrated ability to build scalable and reliable data pipelines and transformation processes. Familiarity with data modelling concepts and techniques, including dimensional modelling. Strong more »
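The dbt transformation work described in listings like the one above typically amounts to version-controlled SQL models whose dependencies dbt infers for itself. A minimal sketch of a staging model (the model, source, and column names are illustrative assumptions, not taken from any listing):

```sql
-- models/staging/stg_orders.sql
-- Hypothetical dbt model; all names are illustrative only.
-- The {{ source() }} Jinja call registers a dependency on a raw table,
-- letting dbt build models in dependency order and render full lineage.
SELECT
    order_id,
    customer_id,
    CAST(order_total AS NUMERIC(18, 2)) AS order_total,
    order_date
FROM {{ source('raw', 'orders') }}
WHERE order_id IS NOT NULL
```

Downstream models would reference this one via `{{ ref('stg_orders') }}`, and `dbt run` followed by `dbt test` builds the graph and checks declared constraints such as `unique` and `not_null` on key columns.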