…for data modelling and analysis. Power BI experience in a business-facing environment. Nice to have: Python for data analysis or scripting; familiarity with cloud data environments (e.g. Azure, Snowflake, Databricks). If you're a BI developer with London Market experience looking to make a real impact, this is a rare opportunity to help shape a data-driven future from …
City of London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
…s, or PhD in Computer Science, Mathematics, or a related field. Familiarity with BI tools such as Looker for reporting and dashboarding. Exposure to other environments such as Databricks, Snowflake, AWS, Azure, or dbt. Understanding of observability, monitoring, and logging in GCP. GCP Professional Data Engineer certification (or similar). What’s on Offer: Competitive salary of £80,000 + bonus …
City of London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Proficiency with BI/reporting tools such as Looker or Power BI. Excellent communication and stakeholder management skills. Google Cloud Professional certifications. Experience in alternative cloud data platforms such as Snowflake, Databricks, Azure, or AWS. Understanding of DevOps/DataOps practices, CI/CD pipelines, and monitoring tools. Academic background in Computer Science, Mathematics, or a related technical field. What’s …
…data engineering (Kafka, Airflow, dbt). Track record turning research code into production services (CI/CD, containers etc.). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake highly desirable, but Redshift/BigQuery/ClickHouse etc. also welcome). Cloud platforms (AWS, GCP, Azure) and MLOps practices. Familiarity with data visualization tools (e.g., Tableau, Power BI). COMPENSATION & BENEFITS: Competitive salary + …
…cloud platforms (e.g., AWS, Azure, Google Cloud). Knowledge of machine learning techniques and frameworks. Experience with version control systems (e.g., Git). Familiarity with big data technologies (e.g., Snowflake, Hadoop, Spark).
…early-stage). ETL & Integration: Fivetran, with plans to scale. Architecture: Medallion. Ideal Background: Strong SQL skills with hands-on experience in dbt and modern cloud data warehouses (e.g. Redshift, Snowflake, BigQuery). Familiarity with ELT tools (e.g. Fivetran), BI tools (e.g. Looker, Tableau, Power BI), and Git-based workflows. Solid understanding of data modelling, warehousing principles, and analytics best practices. Experience …
…engineering community. Qualifications and Skills: Experience leading a small team of data engineers. Extensive knowledge as a Data Engineer. Proven success in designing and building data products on Databricks, Snowflake, GCP Big Data, Hadoop, Spark, etc. Excellent problem-solving, analytical, and troubleshooting skills. Strong communication and team collaboration abilities. Programming skills in Python (PySpark preferred), Scala, or SQL. Experience designing …
City of London, London, United Kingdom Hybrid / WFH Options
Hartree Partners
…data engineering (Kafka, Airflow, dbt). Track record turning research code into production services (CI/CD, containers etc.). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake highly desirable, but Redshift/BigQuery/ClickHouse etc. also welcome). PREFERRED QUALIFICATIONS: Meteorological understanding/experience with weather modelling. Prior knowledge or experience in the power markets or …
…other industry-leading data visualisation and BI tools such as Microsoft Power BI (Desktop, Mobile, Report Server). Project experience using any of the following technologies: SAS, Teradata, Python, Snowflake, Qlik Replicate, Qlik Compose, Hadoop, Spark, Scala, Oracle, Pega, Salesforce, Cloud (Azure, AWS, SAS Viya), RPA tools such as UiPath, Nintex, or Blue Prism. Knowledge of Enterprise ETL tools such …
South East London, England, United Kingdom Hybrid / WFH Options
Hartree Partners
…data engineering (Kafka, Airflow, dbt). Track record turning research code into production services (CI/CD, containers etc.). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake highly desirable, but Redshift/BigQuery/ClickHouse etc. also welcome). PREFERRED QUALIFICATIONS: Meteorological understanding/experience with weather modelling. Prior knowledge or experience in the power markets or …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
JR United Kingdom
…tolerant data pipelines. Solid development experience within a commercial environment creating production-grade ETL and ELT pipelines in Python. Comfortable implementing data architectures in analytical data warehouses such as Snowflake, Redshift or BigQuery. Hands-on experience with data orchestrators such as Airflow. Knowledge of Agile development methodologies. Awareness of cloud technology, particularly AWS. Knowledge of automated delivery processes. Hands-on …
…lead the data solution to meet our requirements now and in the future. Skills/Experience required: (Required) 3+ years of relevant Data Engineering experience (preferably Databricks/Azure, or Snowflake/Redshift/BigQuery); (Required) Experience with infrastructure as code (e.g. Terraform); (Required) Proficiency in using Python both for scheduling (e.g. Airflow) and manipulating data (PySpark); (Required) Experience building deployment …
…experience working on mission-critical data pipelines and ETL systems. 5+ years of hands-on experience with big data technology, systems and tools such as AWS, Hadoop, Hive, and Snowflake. Expertise with common software engineering languages such as Python, Scala, Java, SQL and a proven ability to learn new programming languages. Experience with workflow orchestration tools such as Airflow. Detailed …
…data ingestion, data processing, data transformation, business intelligence, AI, and advanced analytics. Proven hands-on capability with relevant technology: Azure Platform, Azure Data Services, Databricks, Power BI, SQL DW, Snowflake, BigQuery, and Advanced Analytics. Proven ability to understand low-level data engineering solutions and languages (Spark, MPP, Python, Delta, Parquet). Experience with Azure DevOps & CI/CD processes, software development …
…SSO and authentication. Platforms: Proprietary or third-party solutions (e.g., Charles River, BlackRock Aladdin). Languages: Java, Python, or C# with Spring Boot or .NET Core. Data Platforms: Warehouses: Snowflake, Google BigQuery, or Amazon Redshift. Analytics: Tableau, Power BI, or Looker for client reporting. Big Data: Apache Spark or Hadoop for large-scale processing. AI/ML: TensorFlow or Databricks …
…management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching. Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle cloud, R, Python. Company Description: Version 1 has celebrated over 28 years in Technology Services and continues to be trusted by global brands to deliver …
…SQL and (optionally) experience with Python. Experience building scalable, high-quality data models that serve complex business use cases. Knowledge of dbt and experience with cloud data warehouses (BigQuery, Snowflake, Databricks, Azure Synapse, etc.). Proficiency in building BI dashboards and self-service capabilities using tools like Tableau and Looker. Excellent communication skills and experience in managing stakeholder expectations across …
London, England, United Kingdom Hybrid / WFH Options
Citi
…for quantitative trading and analytics, with demonstrated expertise in: the Python data engineering stack (Polars, Parquet, FastAPI, Jupyter, Airflow, Streamlit, Ray); high-performance data stores and query engines (Starburst, Snowflake); real-time streaming analytics technologies (Kafka, Flink); cloud container technologies (AWS, Azure, GCP, Docker, Kubernetes). Proven success in enhancing developer experience that reduces friction in coding, building and deploying APIs …
…and efficiency. Highly Desirable: Experience with Informatica ETL, Hyperion Reporting, and intermediate/advanced PL/SQL. Desirable: Experience in a financial corporation; Lake House/Delta Lake and Snowflake; experience with Spark clusters, both elastic permanent and transitory clusters; familiarity with data governance, data security, and compliance requirements; Power Automate. Benefits of working at Canada Life: We believe in …
Blackpool, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
…data-related issues. Strong analytical and problem-solving skills. Strong teamwork, interpersonal and collaboration skills with colleagues and clients. Desirable: Experience with cloud ETL tools such as Databricks/Snowflake, Spark and Kafka. Experience using source control tools such as GitHub or Azure DevOps. Experience with Azure DevOps for CI/CD pipeline development and data operations (DataOps). Experience with …
Preston, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
…complex data-related issues. Strong analytical and problem-solving skills. Strong teamwork, interpersonal and collaboration skills with colleagues and clients. Experience with cloud ETL tools such as Databricks/Snowflake, Spark and Kafka. Experience using source control tools such as GitHub or Azure DevOps. Experience with Azure DevOps for CI/CD pipeline development and data operations (DataOps). Experience with …
…experience in advanced data modelling and cloud data warehouses, with a proven track record of designing and implementing complex, performant data models in enterprise-grade cloud environments (e.g., BigQuery, Snowflake, Redshift), consistently optimising for scale and cost-efficiency. Demonstrate mastery of SQL and dbt, exhibiting expertise in advanced SQL techniques and extensive experience in developing, optimising, and troubleshooting data transformations …
London, England, United Kingdom Hybrid / WFH Options
Tasman
…requirements and available budget. Some of the products and platforms that you are likely to come across at Tasman are: AWS, GCP and Azure cloud environments; Airflow and Prefect; Snowflake, BigQuery, Athena, and Redshift; Airbyte, Stitch, Fivetran, and Meltano; dbt (both Cloud and Core); Looker, Metabase, Tableau and Holistics; Docker and Kubernetes; Snowplow, Segment, Rudderstack, and mParticle; Metaplane and other …