Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the link and more »
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
code. Implement TMS (Tealium iQ, GTM and Adobe Dynamic Tag Manager) changes. Integrate data sources via web and REST APIs. Data piping and modelling using SQL, DBT, Airflow, ETL, Data Warehousing, Redshift and Python. Transfer knowledge of the business processes and requirements to the development teams. Collaborate with Product, Marketing and Development more »
are looking for a dynamic Analytics Engineer to join their Finance Team. Requirements To qualify for this role, you will require: · Strong experience with dbt and SQL · Experience working within Cloud Environments (Redshift, BigQuery, Snowflake) Salary A successful candidate will receive: · A Salary of up to £75,000 · Excellent progression more »
accurate, up-to-date data We have a modern data stack already in place, comprising a Snowplow data pipeline, a Snowflake data warehouse, dbt as the data transformation tool and our BI tool, Looker. We don’t expect you to be fluent with all these technologies, but we do more »
you will need: experience with owning the data visualisation platform within Tableau experience in data mining, statistical analysis and predictive modelling experience using SQL, exposure to DBT, BigQuery or other cloud data warehouses worked cross-functionally with a wide range of data teams with a commercial outlook and working with subscription more »
with owning the data visualisation platform within Tableau Proven experience in data mining, statistical analysis, and predictive modelling Strong experience using SQL Exposure to DBT, BigQuery, or other cloud data warehouses Previously worked cross-functionally with a wide range of data teams with a commercial outlook and working with more »
Python. SKILLS AND EXPERIENCE Commercial experience of implementing best coding practices using CI/CD. Strong knowledge of working within Azure. Strong understanding of DBT and Airflow. Experience building and maintaining Data Pipelines using Python. THE BENEFITS Flexible Working Private Healthcare Generous Holiday package. HOW TO APPLY Please register your more »
data from our data warehouse Work with other data analysts and more junior members of the team to further our data capability Tech Stack; DBT, Looker, SQL & Python Apply with your CV and we'll get in touch. more »
data from our data warehouse Work with other data analysts and more junior members of the team to further our data capability Tech Stack; DBT, Looker, SQL & Python If this sounds like something that could be of interest please Apply Now. more »
Manchester Area, United Kingdom Hybrid / WFH Options
Oscar
alongside a team of engineers to create, maintain, and extend the analytics platform. Promote best practice and coding standards. Must Have Skills: GCP Python DBT Google Analytics 4 Terraform is a bonus! Apply Now If you are an experienced Data Engineer skilled in GCP, Python and Google Analytics and you more »
you to deliver the more strategic business solutions and projects. As a business they have invested in tools such as Google Analytics 4, BigQuery, dbt and Google Cloud but are open to other toolings you see necessary. Strong skills in SQL are essential, as you may be required to more »
looking for a dynamic Analytics Engineering Manager to own their AE function. Requirements To qualify for this role, you will require: · Strong experience with dbt, SQL, Fivetran, AWS · Experience managing a team Salary A successful candidate will receive: · A Salary of up to £100,000 · Excellent progression opportunities Process- two more »
leading on the delivery of data insight projects. Main duties include: Managing and developing the BI Developers Maintaining and managing the BI platform - Looker, dbt, AWS, Redshift etc. Ultimately own the long-term BI roadmap. Create a self-serve data culture across the business Skills & Experience If this sounds like more »
hands on with projects but not a requirement; previous experience in this is essential SKILLS AND EXPERIENCE NEEDED: Experience in Redshift database (AWS) Experience with DBT, Airflow and Fivetran Working collaboratively with multiple teams across the business Management/mentoring experience of a team INTERVIEW PROCESS: 1st Stage- Initial Chat 2nd more »
Data Modelling. The primary focus of this role will be the development of a Snowflake data warehouse environment and will include using ETL tools, DBT and Power BI for reporting and data visualisations. As the ideal candidate you will be adept at working with large data sets to develop advanced more »
Stay hands on to support your team. We're just looking for someone who has come from an AWS-focused background, who has experience with DBT and has led a team before. This is a fantastic opportunity for someone with leadership experience looking to take that next step in their career. more »
Milton Keynes, England, United Kingdom Hybrid / WFH Options
Addition+
for optimisation. Experience & Skills Required 5+ years of experience as a Data Modeller Experience with Dimensional Modelling. Strong experience using Snowflake Experience with Informatica, DBT or Power BI Background working in Insurance would be desirable Background in engineering would be desirable What’s in it For You? Amazing company to more »
Performing patient-derived organoids quality control steps (cell density, proliferation rate and cytotoxicity evaluation). Setting up and performing multimodality experiments according to the DBT lab project (cell plating, treatment, and organoids coculture with immune cells). Collecting and reporting data to the lab head. Executing and delivering of the more »
There is a place for you at T. Rowe Price to grow, contribute, learn, and make a difference. We are a premier asset manager focused on delivering global investment management excellence and retirement services that investors can rely on today more »
best practices and also needs prior experience leading a data engineering team. Key Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL This is a 6 month initial contract with more »
Abingdon-On-Thames, England, United Kingdom Hybrid / WFH Options
Mirus Talent
experience with the following: Snowflake : Proficiency in Snowflake, including its setup, configuration, and optimisation, is essential to drive the data platform forward. DBT (Data Build Tool) : Solid understanding and experience with DBT for managing transformations and orchestrating data pipelines. Python : Strong programming skills in Python for scripting, automation, and data more »
distributed cloud-hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services This role would focus on various areas of Data Engineering including: End to End ETL pipeline … Requirements: Strong Python/Java Software Engineering skills Excellent AWS knowledge, ideally with exposure to Airflow, Glue, Iceberg and Snowflake Previous experience with Dremio, dbt, EMR or Dagster Good Computer Science fundamentals knowledge with strong knowledge of software and data architecture. If you would like more information on the above more »
you will: Automate and scale datasets aligned with specific use cases, facilitating seamless integration into operational workflows. Design, develop, and maintain data models using dbt, ensuring the efficient transformation of raw data into actionable insights. Build and optimize query pipelines in dbt and SQL to extract, transform, and load data … EXPERIENCE To qualify for this Analytics Engineer role, you will require: Proficiency in Advanced SQL for data querying and manipulation. Experience managing dbt (Data Build Tool) for building and maintaining data transformation pipelines. Ability to build Tableau dashboards for data visualization and reporting. Familiarity with GCP (Google Cloud Platform) for more »
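Several of these listings describe building transformation pipelines that turn raw warehouse tables into analytics-ready models with dbt and SQL. As an illustrative sketch only (not taken from any listing, and using hypothetical table and column names), the pattern can be shown in plain Python with SQLite, where the view plays the role of a dbt staging model:

```python
import sqlite3

# Hypothetical raw source table, and a dbt-style "staging" transformation.
# In dbt, the SELECT below would live in its own model file (e.g. stg_orders.sql)
# and be materialised as a view or table by `dbt run`.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (id INTEGER, amount_pence INTEGER, status TEXT);
    INSERT INTO raw_orders VALUES
        (1, 1250, 'completed'),
        (2,  900, 'cancelled'),
        (3, 3100, 'completed');
""")

# The "model": clean, filter, and reshape raw data into an analytics-ready view.
conn.execute("""
    CREATE VIEW stg_orders AS
    SELECT id,
           amount_pence / 100.0 AS amount_gbp,
           status
    FROM raw_orders
    WHERE status = 'completed'
""")

rows = conn.execute("SELECT id, amount_gbp FROM stg_orders ORDER BY id").fetchall()
print(rows)  # → [(1, 12.5), (3, 31.0)]
```

In a real dbt project the raw-to-staging dependency would be declared with `ref()`/`source()` so dbt can order and test the models; the sketch above only shows the transformation step itself.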
Senior Analytics Engineer 💰 £75,000-£90,000 🖥️ dbt, SQL, Snowflake 🌍 London, twice a week in office 💵 FinTech influencing global markets We are partnered exclusively with a technology company transforming payments and whose global reach spans across 120 markets to deliver a better experience for guests, shoppers, and consumers everywhere. … maintenance of data models and ELT processes within a data mesh architecture using Python for data pipeline development and automation, following best practices with dbt and Snowflake. You will play a pivotal role in the delivery of key analytics projects, ensuring they are executed efficiently and effectively, meeting project timelines … best practices to foster their growth and development within the team. You will participate in the monitoring, optimization, and innovation of data pipelines in dbt and Snowflake, aiming for cost-efficiency and alignment with business needs for data timeliness. You will support the development and adherence to data governance policies more »
Coventry, England, United Kingdom Hybrid / WFH Options
WEG Tech
the UK’s leading universities. As one of the Data Engineers within the team, you will be responsible for leveraging Azure Data Factory and DBT to design, develop, and maintain robust data pipelines and scalable data models which integrate data from a wide variety of structured and unstructured data sources … backlog of work in planning tools like Jira Utilise Azure Data Factory to design and develop scalable and efficient data pipelines Utilise DBT (Data Build Tool) to create and manage data transformation processes, ensuring consistent and reliable data output Design, implement and maintain Data Vault and Kimball-style data models to … and Kimball-style data warehousing methodologies. Proficient in SQL and data querying languages for data manipulation and analysis. Proficiency in Azure Data Factory and DBT, with a demonstrated ability to build scalable and reliable data pipelines and transformation processes. Familiarity with data modelling concepts and techniques, including dimensional modelling. Strong more »
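The Kimball-style dimensional modelling mentioned in the listing above centres on fact tables joined to descriptive dimension tables (a star schema). A minimal sketch in Python with SQLite, using hypothetical table and column names purely for illustration:

```python
import sqlite3

# Minimal Kimball-style star schema: one fact table (measurements) joined to
# one dimension table (descriptive attributes) via a surrogate key.
# All names here are hypothetical, not drawn from the listing.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,   -- surrogate key
        customer_name TEXT,
        region        TEXT
    );
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount       REAL                     -- additive measure
    );
    INSERT INTO dim_customer VALUES (1, 'Acme Ltd', 'UK'), (2, 'Widgets Co', 'DE');
    INSERT INTO fact_sales VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# Typical dimensional query: aggregate the fact's measure, sliced by a
# dimension attribute.
totals = conn.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d USING (customer_key)
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(totals)  # → [('DE', 75.0), ('UK', 150.0)]
```

A Data Vault model, by contrast, would split this into hubs, links, and satellites before deriving star schemas like the one above for reporting.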