South East London, England, United Kingdom Hybrid / WFH Options
83data
warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices in … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms …
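As an illustration of the orchestration responsibility described above, here is a minimal Airflow DAG sketch covering scheduling and failure alerting. The DAG name, task names, and alert hook are hypothetical placeholders, not this employer's actual pipeline:

```python
# Minimal sketch of a scheduled ELT DAG with failure alerting (Airflow 2.x).
# All names are hypothetical; the alert hook just prints, where a real setup
# might page on-call or post to Slack.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # Called by Airflow when a task fails; context carries run metadata.
    print(f"ALERT: task {context['task_instance'].task_id} failed")


def extract():
    ...  # pull raw data from source systems


def load():
    ...  # load into the warehouse (Snowflake/BigQuery/Redshift)


with DAG(
    dag_id="daily_elt",                  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # scheduling
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_on_failure,  # alerting
    },
):
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```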
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
PyTorch). Experience working with financial data, risk modelling, or algorithmic trading is a plus. Familiarity with cloud platforms (AWS, GCP, or Azure) and modern data stack tools (e.g., Apache Airflow, dbt, Snowflake). Excellent communication and stakeholder management skills. Must be available to work onsite in London 3 days per week. What's on Offer: Competitive salary up …
platform teams at scale, ideally in consumer-facing or marketplace environments. Strong knowledge of distributed systems and modern data ecosystems, with hands-on experience using technologies such as Databricks, Apache Spark, Apache Kafka, and dbt. Proven success in building and managing data platforms supporting both batch and real-time processing architectures. Deep understanding of data warehousing, ETL/…
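To ground the batch-plus-real-time requirement, a minimal PySpark Structured Streaming sketch that consumes a Kafka topic; the broker address, topic name, and console sink are hypothetical, and the spark-sql-kafka connector package would need to be on the classpath:

```python
# Illustrative only: a minimal PySpark Structured Streaming job consuming a
# Kafka topic, the kind of real-time path the listing contrasts with batch.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "user-events")                # hypothetical topic
    .load()
    .select(col("value").cast("string").alias("payload"))
)

# Write the stream out; a real pipeline would land this in a lakehouse table.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```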
e.g. Applied Intuition, CARLA, etc.) Experience with databases (e.g., SQL) Certification in cloud computing (e.g., GCP, AWS, Azure, etc.) Ability to develop dashboard UIs for publishing performance (e.g., Grafana, Apache Superset, etc.) Exposure to safety certification standards and processes We provide: Competitive salary, benchmarked against the market and reviewed annually Company share programme Hybrid and/or flexible work …
Full-stack development experience with front-end technologies like HTML and Vue.js. Proficiency in scalable database design (SQL, NoSQL, graph databases) such as SQL Server, MongoDB, Cassandra, Redis, and Apache Druid. Experience with REST APIs, GraphQL, and gRPC. Hands-on experience with version control (GitHub/GitLab) and testing frameworks like SonarQube, xUnit, Postman, Cucumber, Polaris, and Black Duck. Knowledge …
data programs. 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or … Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some cases. Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies. Strong proficiency in SQL and experience with schema design and query optimization for large datasets. Expertise …
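For a concrete sense of the Dataflow (Apache Beam) item, a minimal Beam word-count sketch; the bucket paths and transform logic are placeholders, and switching the runner to DataflowRunner (with project/region options) would execute the same pipeline on Google Cloud Dataflow:

```python
# A minimal Apache Beam pipeline sketch. Paths are hypothetical; the logic is
# a stock word count standing in for real data processing.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

opts = PipelineOptions(runner="DirectRunner")  # DataflowRunner on GCP

with beam.Pipeline(options=opts) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input.txt")
        | "Split" >> beam.FlatMap(lambda line: line.split())
        | "Pair" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Write" >> beam.io.WriteToText("gs://example-bucket/counts")
    )
```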
technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience on at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka … minimum of 5 years’ experience in a similar role. Ability to lead and mentor the architects. Required Skills: Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Preferred Skills: Designing Databricks-based solutions …
ability to explain complex data concepts to non-technical stakeholders. Preferred Skills: Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems. Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools. Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps. Understanding of regulatory data requirements such as Solvency II, Core …
opportunities to optimise data workflows, adopt emerging technologies, and enhance analytics capabilities. Requirements: Technical Proficiency: Hands-on experience building ETL/ELT pipelines with Python, SQL, or tools like Apache Airflow, and expertise in visualisation tools (Power BI, Tableau, or Looker). Cloud Expertise: Familiarity with cloud platforms like Snowflake, Databricks, or AWS/GCP/Azure for scalable …
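As a sketch of the "ETL/ELT pipelines with Python and SQL" requirement, a compact extract-transform-load script; the endpoint, schema, and connection string are hypothetical:

```python
# Compact ETL sketch: extract from an API, transform with pandas, load to a
# warehouse table. All names and credentials are illustrative placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine


def run_pipeline() -> None:
    # Extract: pull raw records (hypothetical endpoint)
    raw = requests.get("https://api.example.com/orders", timeout=30).json()

    # Transform: normalise into a tidy frame and derive a revenue column
    df = pd.DataFrame(raw)
    df["order_date"] = pd.to_datetime(df["order_date"])
    df["revenue"] = df["quantity"] * df["unit_price"]

    # Load: append into the analytics schema (hypothetical connection string)
    engine = create_engine("postgresql://user:pass@warehouse:5432/analytics")
    df.to_sql("orders_clean", engine, if_exists="append", index=False)


if __name__ == "__main__":
    run_pipeline()
```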
South East London, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
SQL, and Python. Requirements: 3+ years' data engineering experience; Snowflake experience; proficiency across an AWS tech stack; dbt expertise; Terraform experience; expert SQL and Python; data modelling; Data Vault; Apache Airflow. My client has very limited interview slots and is looking to fill this vacancy ASAP. I have limited slots for 1st-stage interviews next week, so if …
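Purely as illustration of the Snowflake-plus-Python pairing in this stack, a minimal query via the official snowflake-connector-python package; the account, credentials, warehouse, and table are placeholders, not the client's setup:

```python
# Minimal Snowflake connectivity sketch; every identifier is hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",        # hypothetical account identifier
    user="ANALYTICS_USER",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM raw.orders")  # hypothetical table
    print(cur.fetchone()[0])
finally:
    conn.close()
```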