similar (for real-time data streaming). Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. S3, EMR, Redshift, Glue, Azure Data Factory, Databricks, BigQuery, Dataflow, Dataproc). Would you like to join us as we work hard, have fun and make history?
such as MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language such as Java or Scala. Willingness to mentor more junior members of the team. Strong analytical …
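For illustration only, a brief PySpark sketch of the kind of Spark-based transformation such a role might involve; the input path, column names, and aggregation logic are assumptions, not details from the listing.

```python
# Illustrative PySpark job: roll raw events up to a daily per-customer summary.
# The paths and column names (event_ts, customer_id, amount) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_customer_rollup").getOrCreate()

# Read raw events landed as Parquet on object storage (hypothetical bucket)
events = spark.read.parquet("s3a://example-bucket/events/")

# Aggregate to one row per customer per day, ready for warehouse loading
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("customer_id", "event_date")
    .agg(
        F.count("*").alias("event_count"),
        F.sum("amount").alias("total_amount"),
    )
)

daily.write.mode("overwrite").parquet("s3a://example-bucket/marts/daily_customer/")
spark.stop()
```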
relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra) for efficient data storage and retrieval. Data Warehousing: Experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, Snowflake, or Azure Synapse Analytics, including data modelling and ETL processes. ETL Processes: Proficient in designing and implementing ETL (Extract, Transform, Load) processes using tools like Apache NiFi, Talend …
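To make the ETL point concrete, here is a minimal Python sketch under assumed conditions: extract from a PostgreSQL source, transform with pandas, and load a staging table in a warehouse reachable over the PostgreSQL protocol (as Redshift is). Connection strings, table names, and columns are placeholders.

```python
# Minimal ETL sketch: extract -> transform -> load. All identifiers are placeholders.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql+psycopg2://user:pass@source-host:5432/sales")
# Redshift is wire-compatible with PostgreSQL, so a plain psycopg2 URL is used here
warehouse = create_engine("postgresql+psycopg2://user:pass@warehouse-host:5439/dw")

# Extract: yesterday's orders from the operational database
orders = pd.read_sql(
    "SELECT order_id, customer_id, amount, created_at "
    "FROM orders WHERE created_at >= CURRENT_DATE - 1",
    source,
    parse_dates=["created_at"],
)

# Transform: derive a daily total per customer
daily = (
    orders.assign(order_date=orders["created_at"].dt.date)
          .groupby(["customer_id", "order_date"], as_index=False)["amount"]
          .sum()
)

# Load: append into a warehouse staging table
daily.to_sql("stg_daily_customer_sales", warehouse, if_exists="append", index=False)
```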
Liverpool, England, United Kingdom Hybrid / WFH Options
Intuita Consulting
One Big Table (OBT) methodologies. • Translate business requirements from stakeholders into robust, well-documented and tested dbt models. • Develop and own workflows within Google Cloud Platform environments, primarily using BigQuery and dbt. • Write high-quality, optimised SQL for data transformation and analysis. • Develop and maintain scalable data pipelines within the Google Cloud Platform, ensuring efficient and cost-effective data …
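As a rough sketch of the BigQuery-and-SQL work described above, the snippet below runs a transformation that materialises an OBT-style table via the google-cloud-bigquery client; the project, datasets, and table names are hypothetical.

```python
# Sketch: materialise a One Big Table style model in BigQuery from Python.
# Project, dataset, and table names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
CREATE OR REPLACE TABLE analytics.obt_orders AS
SELECT
  o.order_id,
  o.order_ts,
  c.customer_id,
  c.region,
  SUM(i.quantity * i.unit_price) AS order_value
FROM raw.orders AS o
JOIN raw.customers AS c USING (customer_id)
JOIN raw.order_items AS i USING (order_id)
GROUP BY o.order_id, o.order_ts, c.customer_id, c.region
"""

job = client.query(sql)  # submit the transformation job
job.result()             # block until it finishes; raises on SQL errors
print(f"Processed {job.total_bytes_processed} bytes")
```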
City of London, London, United Kingdom Hybrid / WFH Options
OTA Recruitment
engineers, data scientists, and business stakeholders. Familiarity with cloud-based data ecosystems such as AWS, Azure, or GCP, and working with data warehouse/lakehouse technologies such as Snowflake, BigQuery, Redshift, or Athena/Glue. Essential: Proficient in writing clean, efficient, and maintainable SQL and Python code, particularly for data transformation and analytics use cases. Strong understanding of data …
London, England, United Kingdom Hybrid / WFH Options
Noir
build, and maintain robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including: Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools like Terraform. Ensure data quality …
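A minimal Cloud Composer (Apache Airflow) DAG sketch of the load-then-transform pattern implied by the services listed above; the bucket, datasets, stored procedure, and schedule are hypothetical, and the `schedule` argument assumes Airflow 2.4 or later.

```python
# Sketch DAG: land raw files from Cloud Storage into BigQuery, then build a model.
# Bucket, dataset, and procedure names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the day's raw JSON files from Cloud Storage into a staging table
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-landing-bucket",
        source_objects=["orders/{{ ds }}/*.json"],
        destination_project_dataset_table="raw.orders_{{ ds_nodash }}",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staging data into the reporting model (hypothetical stored procedure)
    build_model = BigQueryInsertJobOperator(
        task_id="build_daily_orders_model",
        configuration={
            "query": {
                "query": "CALL analytics.build_daily_orders('{{ ds }}')",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_model
```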
London, England, United Kingdom Hybrid / WFH Options
Noir
build, and maintain robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including: Cloud Composer (Apache Airflow), BigQuery, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools like Terraform. Ensure data quality and compliance …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
industry best practices to continuously enhance data engineering capabilities. Looking at our current pipeline of work, we can also consider those with an Analytics Engineering lean; experience with BigQuery (GCP) and data modelling in dbt, along with mobile/telecoms industry experience, would be beneficial. A bit about YOU! As much as we just love working with great, fun …
Newbury, England, United Kingdom Hybrid / WFH Options
Intuita
industry best practices to continuously enhance data engineering capabilities. Looking at our current pipeline of work, we can also consider those with an Analytics Engineering lean; experience with BigQuery (GCP) and data modelling in dbt, along with mobile/telecoms industry experience, would be beneficial. A bit about YOU! As much as we just love working with great, fun …
City of London, London, United Kingdom Hybrid / WFH Options
FairPlay Sports Media
About FairPlay Sports Media: We’re a sports media network, focused on building and nurturing a portfolio of highly engaged and connected communities of sports fans and bettors to create value for our partners. We empower sports fans with real …
London, England, United Kingdom Hybrid / WFH Options
EXL
commercially viable, and aligned with client expectations. Enterprise Solution Design: Architect and lead the delivery of large-scale data platforms (including lakes, lakehouses, and warehouses) using GCP, Cloud Storage, BigQuery, Databricks, Snowflake. Cloud Data Strategy: Own cloud migration and modernisation strategy, leveraging GCP and tools such as Terraform, Azure DevOps, GitHub, and CI/CD pipelines. Data Modelling: Apply …
in Python or similar scripting language for test automation. Experience with cloud platforms (AWS, GCP, or Azure), especially in data-related services. Familiarity with data warehousing concepts (e.g., Snowflake, BigQuery, Redshift). Strong understanding of data governance, data profiling, and quality metrics. Excellent problem-solving and communication skills. Ability to work independently and as part of a distributed team. Nice …
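As one way to picture the test-automation and data-quality side of this role, a small pytest sketch against a warehouse table; the DSN (which assumes the snowflake-sqlalchemy dialect) and the table and column names are hypothetical.

```python
# Sketch of automated data-quality checks as pytest tests. DSN and names are placeholders.
import pandas as pd
import pytest
from sqlalchemy import create_engine

# The snowflake:// dialect is provided by the snowflake-sqlalchemy package
engine = create_engine("snowflake://user:pass@example-account/analytics_db/marts")


@pytest.fixture(scope="module")
def orders():
    return pd.read_sql("SELECT * FROM fct_orders", engine)


def test_primary_key_is_unique(orders):
    assert orders["order_id"].is_unique


def test_no_null_customer_ids(orders):
    assert orders["customer_id"].notna().all()


def test_amounts_are_non_negative(orders):
    assert (orders["amount"] >= 0).all()
```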
conceptual, logical, physical). • Proficiency in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra). • In-depth knowledge of data warehousing concepts and tools (e.g., Redshift, Snowflake, Google BigQuery). • Experience with big data platforms (e.g., Hadoop, Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). • Expertise in ETL tools …
East Midlands, England, United Kingdom Hybrid / WFH Options
James Adams
Data Expertise: A solid understanding of data warehousing concepts, ETL/ELT processes, data modeling, and various database technologies (e.g., SQL, and cloud data platforms like AWS Redshift, Google BigQuery, Snowflake). Strong Technical Acumen: The ability to grasp complex technical details and articulate them clearly to non-technical stakeholders. While not a hands-on coding role, a firm …
Nottingham, England, United Kingdom Hybrid / WFH Options
Digital Native
for downstream use. Developing foundational coding skills in Python and SQL to work with and manipulate data. Gaining hands-on experience with modern data platforms like Snowflake, Databricks, and BigQuery. Exploring cloud services such as AWS, Azure, or Google Cloud Platform (GCP) and their associated tools (e.g., Glue, Lambda, BigQuery, etc.). Being introduced to DataOps practices, understanding how …
Git. Excellent Problem-Solving and Communication Skills: The ability to troubleshoot complex data issues and communicate technical concepts effectively to both technical and non-technical audiences. Desirable: GCP and BigQuery Knowledge: Utilize your experience with GCP and BigQuery for analytical data workloads and to ensure seamless interoperability within our multi-cloud strategy. Additional Information: Along with your benefits …
for data models from the bronze layer upwards, promoting self-service analytics and data literacy. Technical Leadership & Excellence: Act as a subject matter expert in SQL (Postgres, Cloud SQL, BigQuery, Redshift), driving performance optimization and complex query development. Drive the adoption of best practices for dbt development, including modularity, testing, and documentation, across the team. Influence the selection and … design robust data structures. Exceptional knowledge and extensive experience with dbt for designing, building, and optimizing complex enterprise-level data models and transformations. Deep experience with cloud data warehouses (BigQuery, Redshift), including performance tuning and cost optimization. Strong proficiency with workflow orchestration tools like Airflow, capable of designing and implementing complex, production-grade DAGs. Extensive experience with multi-cloud …
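A short sketch of how dbt runs and tests might be orchestrated from Airflow, in the spirit of the practices listed above (modular models, tests, documentation); the project path, target name, and schedule are placeholders.

```python
# Sketch: run dbt models and then their tests from an Airflow DAG.
# The dbt project location and target are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/airflow/dbt/analytics"  # hypothetical project path

with DAG(
    dag_id="dbt_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --target prod",
    )

    # Run schema and data tests so failures block downstream consumers
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --target prod",
    )

    dbt_run >> dbt_test
```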
years in a Principal or Lead role. Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling …