Greater London, England, United Kingdom Hybrid / WFH Options
CommuniTech Recruitment Group
Lead Data Engineer (Java/Python/AWS/Glue/Dremio). Investment Management. £160,000 - £170,000 + 25% Bonus Target and Benefits. Hybrid, 2 days a week in St Paul's office. My client is a top-tier …
data from our data warehouse. Work with other data analysts and more junior members of the team to further our data capability. Tech Stack: DBT, Looker, SQL & Python. Apply with your CV and we'll get in touch. …
data from our data warehouse. Work with other data analysts and more junior members of the team to further our data capability. Tech Stack: DBT, Looker, SQL & Python. If this sounds like something that could be of interest, please Apply Now. …
with accessible data and foster a data-driven culture. As a BI Analyst, you will help the team build an organised data infrastructure using dbt/Redshift/Tableau to feed the business with insights. You will model raw data into finalised tables, provide in-depth analysis and build …
be at the forefront of shaping innovative solutions and providing crucial insights to tackle complex challenges. We need someone who is an expert in: DBT, Looker, SQL, Python. What you'll be doing: Collaborate with business analysts and product owners to comprehend the data needs of business users. Partner with …
Coventry, England, United Kingdom Hybrid / WFH Options
WEG Tech
Science, Engineering, etc.) or equivalent. Extensive experience in data engineering, data governance, and data management roles. Hands-on experience with Azure Data Factory and DBT, testing tools such as Great Expectations or SODA, and familiarity with the Snowflake data platform or similar cloud-based data warehousing solutions is also required. Additionally …
customers with the most in-depth analytics Working with some of the world's most recognised brands Fantastic partnerships with businesses like Databricks, Snowflake and dbt Role: Sales Engineering/Pre-sales role where you will be supporting the sales function, engaging with enterprise-level customers, educating them on the technical …
City Of London, England, United Kingdom Hybrid / WFH Options
Harnham
Gaming or Entertainment experience. Experience managing, mentoring, or coaching a small team of analytics engineers. Advanced knowledge and commercial experience with tools such as DBT, Redshift, and other AWS tools. Good educational background is preferred. Strong communication skills. A passion for gaming! THE BENEFITS A salary of up to …
the successful candidate will be instrumental in setting up and automating data cloud platforms such as Databricks and Snowflake, along with services like Cube.dev and DBT, contributing to the development and optimisation of our cloud-based platforms and services. Key Duties: Configure, manage, and optimise AWS cloud resources to support application …
Senior Data Engineer. Java/Python/AWS. Investment Management. £135,000 - £145,000 + Discretionary Bonus and Benefits. The Data Office team at my client is playing a key role in helping build the future of financial services, working …
solid understanding of, and demonstrable experience in: PostgreSQL (other relational databases are fine, but this is the preference); Tableau for BI reporting and visualisations; DBT for building data pipelines and warehouse work. Why you would want this job: There’s a huge amount of scope. It’s a new …
Milton Keynes, England, United Kingdom Hybrid / WFH Options
Addition+
for optimisation. Experience & Skills Required: 5+ years of experience as a Data Modeller. Experience with Dimensional Modelling. Strong experience using Snowflake. Experience with Informatica, DBT or Power BI. Background working in Insurance would be desirable. Background in engineering would be desirable. What’s in it for You? Amazing company to …
key requirements helps you to deter irrelevant candidates from applying for your role. We would also recommend making your key requirements screening questions: GCP, DBT, BigQuery; PowerBI or Looker; Communication & Stakeholder Management. If you would like to be considered for the role and feel you would be an ideal fit …
Performing patient-derived organoids quality control steps (cell density, proliferation rate and cytotoxicity evaluation). Setting up and performing multimodality experiments according to the DBT lab project (cell plating, treatment, and organoids coculture with immune cells). Collecting and reporting data to the lab head. Executing and delivering of the …
to-end independently - Recommendations to clients on the tech stack/working with modern stack Skills/Requirements: - Experience with GCP (required) - Looker experience - DBT experience (required) - Consultancy background (required) - Fivetran experience Interview Process: 1st stage - initial chat; 2nd stage - technical test, paid by client; 3rd stage - meet with the …
a range of civil and criminal legislation, and statutory decision-making. In doing so, you will liaise with both Department for Business and Trade (DBT) and other government departments' legal advisors (GLD) as well as other regulators and commercial entities as appropriate. Key Responsibilities: You will have responsibility for a …
stakeholder engagement activities acting as point of contact for techUK members Build and hold relationships within and outside Government, including with DSIT, HM Treasury, DBT, the Labour Party and other stakeholders such as influential Members of Parliament and other industry bodies and partners such as DIGITAL EUROPE Design research plans …
WHAT YOU'LL DO Design, document & implement the data pipelines to feed data models for subsequent consumption in Snowflake using dbt and Airflow. Ensure correctness and completeness of the data being transformed via engineering pipelines for end consumption in Analytical Dashboards. Actively monitor and triage technical challenges in critical situations … Bitbucket ) Extremely talented in applying SCD, CDC and DQ/DV frameworks. Familiar with JIRA & Confluence. Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake. Desire to continually keep up with advancements in data engineering practices. Knowledge of AWS cloud and Python is a plus. … Job Requirements: 5+ years of IT experience with a major focus on data warehouse/database-related projects. Must have exposure to technologies such as dbt, Apache Airflow, Snowflake. Experience in data platforms: Snowflake, Oracle, SQL Server, MDM etc. Expertise in writing SQL and database objects: stored procedures, functions, and views. …
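The listing above asks for experience applying SCD (slowly changing dimension) patterns in warehouse pipelines. As a minimal illustration only — not taken from any of these roles — here is a sketch of an SCD Type 2 upsert using Python's built-in sqlite3; the `dim_customer` table and its columns are hypothetical names chosen for the example:

```python
import sqlite3

# Hypothetical dimension table; all names here are illustrative, not from the listing.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,
        is_current  INTEGER
    )
""")

def scd2_upsert(conn, customer_id, city, as_of):
    """SCD Type 2: never overwrite history; expire the old row and insert a new version."""
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    ).fetchone()
    if row is not None and row[0] == city:
        return  # attribute unchanged, nothing to do
    if row is not None:
        # Close out the current version as of the change date
        conn.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (as_of, customer_id),
        )
    # Insert the new current version with an open-ended validity window
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, as_of),
    )

scd2_upsert(conn, 1, "London", "2024-01-01")
scd2_upsert(conn, 1, "Coventry", "2024-06-01")  # change: history row kept, new current row
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id = 1 ORDER BY valid_from"
).fetchall()
print(rows)
```

In a dbt project this logic would typically live in a snapshot or incremental model rather than hand-written Python; the sketch only shows the expire-and-insert shape that the SCD Type 2 requirement refers to.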
Would you like to play a key role in helping build the future of financial services to change the way people invest? With opportunities to learn and grow, and a collaborative culture that encourages every team member to bring their …
Abingdon-On-Thames, England, United Kingdom Hybrid / WFH Options
Mirus Talent
experience with the following: Snowflake: Proficiency in Snowflake, including its setup, configuration, and optimisation, is essential to drive the data platform forward. DBT (Data Build Tool): Solid understanding and experience with DBT for managing transformations and orchestrating data pipelines. Python: Strong programming skills in Python for scripting, automation, and data …
distributed cloud-hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services. This role would focus on various areas of Data Engineering including: End-to-End ETL pipeline … Requirements: Strong Python/Java Software Engineering skills. Excellent AWS knowledge, ideally with exposure to Airflow, Glue, Iceberg and Snowflake. Previous experience with Dremio, dbt, EMR or Dagster. Good Computer Science fundamentals with strong knowledge of software and data architecture. If you would like more information on the above …
Data Engineer – Hybrid (3 days in Office) Stratford-Upon-Avon £56k + 5% bonus & benefits Do you want to grow your expertise and experience and use your skills in a vibrant environment where teamwork, creativity, diversity, inclusivity, and technical excellence …
Coventry, England, United Kingdom Hybrid / WFH Options
WEG Tech
the UK’s leading universities. As one of the Data Engineers within the team, you will be responsible for leveraging Azure Data Factory and DBT to design, develop, and maintain robust data pipelines and scalable data models which integrate data from a wide variety of structured and unstructured data sources … backlog of work in planning tools like Jira. Utilise Azure Data Factory to design and develop scalable and efficient data pipelines. Utilise DBT (Data Build Tool) to create and manage data transformation processes, ensuring consistent and reliable data output. Design, implement and maintain Data Vault and Kimball-style data models to … and Kimball-style data warehousing methodologies. Proficient in SQL and data querying languages for data manipulation and analysis. Proficiency in Azure Data Factory and DBT, with a demonstrated ability to build scalable and reliable data pipelines and transformation processes. Familiarity with data modelling concepts and techniques, including dimensional modelling. Strong …
processing systems that will handle very large amounts of market/exchange data, data lake development using AWS Glue, Apache Kafka/Airflow and dbt, and more traditional end-to-end Data Pipeline development. Lastly, part of this role would look at the onboarding and development of model data sets … Strong Java or Python skills - ability to write production-ready code. Experience deploying code to AWS utilising relevant tools. Previous experience with AWS Glue, dbt, Airflow and Dagster. Good communication skills, and ability to work both independently and collaboratively with a team. Excellent SQL skills. For more information, please apply …
infrastructure. You will work with some of the most innovative tools in the market including Snowflake, AWS (Glue, S3), Apache Spark, Apache Airflow and DBT! The role is hybrid, with 2 days in the office in central London, and the company is offering up to £83,000 with some excellent … and maintaining data pipelines from scratch. Data modelling, data integration and transformation experience. Hands-on work with tools such as Snowflake, AWS, Airflow, and DBT. Proficiency in data manipulation, scripting and automation with Python. Desirable: Experience leading teams; version control systems such as Git or Bitbucket; Agile methodologies. Don't …