Python/JavaScript/C#. Familiarity with statistical/machine learning/AI concepts and techniques. Understanding of data pipeline/orchestration tools, e.g. dbt, Dataform. Appreciation of GCP’s serverless technologies, e.g. Cloud Run/Workflows. Understanding of Google’s marketing stack: Google Analytics, Google Tag Manager, Google Ads more »
M50, Trafford Park, Trafford, Greater Manchester, United Kingdom
Hoist Finance
SQL Server, Oracle, MySQL, PostgreSQL). Knowledge of BI stack design and implementation. Any knowledge in some of the following areas is an advantage: Snowflake, DBT, Azure technologies including Azure Data Factory, Azure Data Lake, Azure DevOps, PowerShell, Git, Python. Excellent communication and interpersonal skills. As a Data Engineer you will more »
Employment Type: Permanent
Salary: £40,000 - £45,000/annum + Car Allowance + Bonus
Job Description: A world-leading hedge fund seeks a Senior Data Engineer with exceptional Python, SQL, and DBT skills for its Algorithmic Trading team. We seek an exceptional Financial Markets Data Engineer with front-office trading experience and a track record of building and managing large (TB+) quantitative research data pipelines. Ideally, you will more »
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry standards like CRISP more »
value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt TC: £85,000 + bonus + up to 22% pension Process: 2 stages No CV? No problem. Email me at athomas@trg-uk.com, and let more »
years of demonstrated commercial experience as a Data Engineer or similar role within large-scale environments dealing with large data sets. Expertise in SQL & dbt, and ideally Kafka. Significant Python coding skills. Containerisation experience (Docker, Kubernetes). Cloud Computing experience (GCP/AWS/Azure). Strong preference for a Snowflake background more »
business requirements spanning a number of systems. At least 10 years of relevant experience. Hands-on in-depth experience in the following: Snowflake/DBT/Airflow. Background/working experience in the following: Azure, Power BI/DAX, traditional SQL (SQL Server, MySQL, Postgres), JIRA, Confluence, (GitHub/Bitbucket), Azure more »
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders Experience of using tools including Snowflake, DBT, ADF and Azure Synapse Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information Legal more »
capabilities. Expert data engineering and database practitioner. Experience of designing and building AWS-based data solutions including pipelines, data warehouse/lake. Experience of DBT or an equivalent data modelling tool. Experience of administering SQL databases and highly proficient in SQL query optimisation. Broad experience of data analytics and reporting platforms more »
hands-on software engineering using a broad range of technologies including the following: Java or Python; microservices; data pipelines and database programming such as DBT, SQL, BigQuery, Cloud Composer, etc.; CI/CD/DevOps tooling experience, e.g. Git, Jenkins. What you'll get to learn (any previous experience more »
SQL), experience working with relational databases, and programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, DBT, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and tools. Understanding of DataOps, data mining, and data visualization/BI more »
with AWS, especially AWS services focused on data flow, pipelines, data transformation, storage and streaming. Excellent data engineering skills, for example with SQL, Python, DBT and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Confidence in coding, scripting, configuring, versioning more »
production via machine learning engineering. Hands-on knowledge of NoSQL and relational databases, alongside relevant data modelling techniques. ETL/ELT tooling. Knowledge of DBT, Luigi or similar orchestration tooling. Experience of managing and developing a team in an agile environment. Preferred: software development of a product/service provided as more »
modelling. Bonus: Experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, FiveTran, and Terraform. Join and be part of a dynamic team driving innovation and excellence in data engineering more »
greenfield re-architecture from the ground up in Microsoft Azure. The tech you'll be playing with: Azure Data Factory, Azure Databricks, PySpark, SQL, DBT. What you need to bring: 1-3 years' experience in building data pipelines; SQL experience in data warehousing; Python experience would be desirable; experience more »
smooth operations and reliability. YOUR EXPERIENCE From a technical perspective, you will require: Proficiency in Tableau, SQL and cloud data warehouses. Experience with BigQuery, DBT, or Python/R is beneficial. Your background should include: Experience leading BI capabilities as an individual contributor. Demonstrated commercial acumen and adaptability in dynamic environments. Proven ability to translate data insights into more »
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the link and more »
the ability to inspire and mentor engineers, teaching them best practices and fostering a culture of continuous improvement within an agile framework. Experience with dbt and Snowflake or other cloud-based data warehouses. Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams. Detail-oriented more »
that there should be learning in every role you do. However, some experience in the following is important for this position: Advanced SQL and dbt skills to clean, transform and validate data from Data warehouses or Data Lakes Experience and knowledge in data warehousing and data modelling best practices Experience more »
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
code. Implement TMS (Tealium iQ, GTM and Adobe Dynamic Tag Manager) changes. Integrate data sources via web and REST APIs. Data piping and modelling using SQL, DBT, Airflow, ETL, data warehousing, Redshift and Python. Transfer knowledge of the business processes and requirements to the development teams. Collaborate with Product, Marketing and Development more »
are looking for a dynamic Analytics Engineer to join their Finance Team. Requirements To qualify for this role, you will require: · Strong experience with dbt and SQL · Experience working within cloud environments (Redshift, BigQuery, Snowflake) Salary A successful candidate will receive: · A salary of up to £75,000 · Excellent progression more »