with strong analytical and problem-solving skills. Excellent communication and stakeholder engagement skills. Experience with cloud data platforms (e.g., Azure Synapse, AWS Redshift, Google BigQuery). Knowledge of data governance frameworks and data quality management. Competence in data modelling and database design techniques. Experience working in agile or hybrid …
Expertise in data architecture, data modelling, data analysis, and data migration. Strong knowledge of SQL, NoSQL, and cloud-based databases (e.g., AWS Redshift, Snowflake, Google BigQuery, Azure Synapse). Experience in ETL development, data pipeline automation, and data integration strategies. Familiarity with AI-driven analytics. Strong understanding of data governance …
retention analytics, and customer segmentation models. Basic experience with predictive modeling techniques (regression models, clustering, or time-series analysis). Familiarity with Python, Snowflake ML, BigQuery ML, or Azure AI for foundational ML applications. Experience with BI tools (Tableau, Power BI) for CX reporting and visualization. Ability to optimize large …
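Purely to illustrate the kind of foundational ML application named in this listing (BigQuery ML for retention and churn modelling), here is a minimal sketch using the google-cloud-bigquery Python client. Every project, dataset, table, and column name is hypothetical, not taken from the advert.

```python
# Minimal sketch: train and score a churn classifier with BigQuery ML.
# Assumes the google-cloud-bigquery library and default credentials;
# all identifiers below are invented for illustration.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

train_model_sql = """
CREATE OR REPLACE MODEL `my-analytics-project.cx.churn_model`
OPTIONS (
  model_type = 'logistic_reg',
  input_label_cols = ['churned']
) AS
SELECT tenure_days, orders_last_90d, avg_order_value, churned
FROM `my-analytics-project.cx.customer_features`
"""

# CREATE MODEL runs as an ordinary query job; result() waits for training to finish.
client.query(train_model_sql).result()

# Score current customers with ML.PREDICT and read the results back in Python.
predict_sql = """
SELECT customer_id, predicted_churned
FROM ML.PREDICT(
  MODEL `my-analytics-project.cx.churn_model`,
  (SELECT * FROM `my-analytics-project.cx.customer_features_current`)
)
"""
for row in client.query(predict_sql).result():
    print(row.customer_id, row.predicted_churned)
```

BigQuery ML also exposes clustering and time-series model types (for example kmeans and ARIMA_PLUS), which map onto the segmentation and time-series analysis mentioned in the listing.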
data platforms on Google Cloud Platform, with a focus on data quality at scale. Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub. Strong programming skills in PySpark, Python, and SQL. Proficiency in ETL processes, data mining, and data storage …
common data science coding languages such as SQL, Python, and/or R. Practical experience with Google Cloud Platform and services such as BigQuery, Looker, and Dataproc. Additional Information: The Power of One starts with our people! To do powerful things, we offer powerful resources. Our best-in …
scale datasets. Jupyter, Databricks, or notebook-based workflows for experimentation. Data Access & Engineering Collaboration: Comfort working with cloud data warehouses (e.g., Snowflake, Databricks, Redshift, BigQuery). Familiarity with data pipelines and orchestration tools like Airflow. Work closely with Data Engineers to ensure model-ready data and scalable pipelines. Nice to …
technical issues and successes to team members and Product Owners. Nice to Have: Experience with any of these: Python, Spark, Kafka, Kinesis, Kinesis Analytics, BigQuery, Dataflow, Bigtable, and SQL. Enthusiastic about learning and applying new technologies (growth mindset). Ability to build new solutions and support our data solutions …
similar role. Ability to lead and mentor architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure; big data; Apache Spark and Beam on BigQuery/Redshift/Synapse; Pub/Sub/Kinesis/MQ/Event Hubs; Kafka; Dataflow/Airflow/ADF. Designing Databricks-based solutions for …
people management or desire to manage individuals on the team. Nice to Have: Experience with some of these: Python, Spark, Kafka, Kinesis, Kinesis Analytics, BigQuery, Dataflow, Bigtable, and SQL. Enthusiastic about learning and applying new technologies (growth mindset). Ability to build new solutions and support our data solutions …
Birmingham, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
Expect variety – you’ll work across cloud platforms like Azure, AWS, and GCP, leveraging tools such as Databricks, Data Factory, Synapse, Kafka, Glue, Redshift, BigQuery, and more. About You: You’re an engineer at heart, with a passion for building efficient, scalable data systems. To succeed, you’ll need …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
on experience or strong theoretical knowledge of AI and GenAI techniques and methodologies. Familiarity with cloud AI platforms such as GCP Vertex AI and BigQuery, Azure AI, or similar enterprise-level AI deployment environments. Experience or knowledge of regulatory requirements and frameworks relevant to AI, such as the EU …
of these. Their ideal candidate would have 10+ years' experience in Data Engineering/Architecture and have good knowledge within: Data Warehousing (Snowflake, Redshift, BigQuery); ETL (Data Fabric, Data Mesh); DevOps (IaC, CI/CD, Containers); Leadership/Line Management; Consulting/Client-Facing Experience. In return they would …
throughput backend systems. Experience with BI/reporting engines or OLAP stores. Deep Ruby/Rails & ActiveRecord expertise. Exposure to ClickHouse/Redshift/BigQuery. Event-driven or stream processing (Kafka, Kinesis). Familiarity with data-viz pipelines (we use Highcharts.js). AWS production experience (EC2, RDS, IAM, VPC). Contributions to …
Birmingham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
We’re a sports media network focused on building and nurturing a portfolio of highly engaged and connected communities of sports fans and bettors to create value for our partners. We …
into technical solutions to transform online shopping as we know it. We seek a highly skilled Business Intelligence (BI) Developer with expertise in Google BigQuery, Looker Studio, and PostgreSQL. The ideal candidate will be responsible for designing, developing, and optimising our data pipelines and reporting solutions to drive data-driven decision-making across the organisation. KEY RESPONSIBILITIES: This role demands a deep understanding of data modelling, SQL, and performance tuning in BigQuery and PostgreSQL, along with a proven ability to create insightful and visually appealing reports and dashboards in Looker Studio. BigQuery Development & Optimization: Design and implement … Data Visualisation: Looker Studio, Kibana. Programming Languages: Go, Python. Technologies: GCP, GitLab, Jira, Confluence. Skills: Essential: Strong proficiency in SQL with hands-on experience in Google BigQuery and PostgreSQL. Proven experience in optimising both BigQuery and PostgreSQL for performance and cost efficiency. Expertise in creating and managing reports and dashboards …
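As an aside, here is a minimal sketch of the BigQuery performance and cost optimisation this listing asks for: partitioning and clustering a reporting table, then using a dry run to confirm that a dashboard-style query prunes partitions and scans fewer bytes. It assumes the google-cloud-bigquery Python client, and every identifier is hypothetical.

```python
# Illustrative sketch only: partitioning + clustering to reduce scanned bytes.
# Assumes google-cloud-bigquery and default credentials; all names are made up.
from google.cloud import bigquery

client = bigquery.Client(project="shop-data-project")  # hypothetical project

# One-off DDL: partition the reporting table by order date and cluster by customer,
# so date-bounded dashboard queries only read the relevant partitions.
client.query("""
CREATE TABLE IF NOT EXISTS `shop-data-project.reporting.orders`
PARTITION BY DATE(order_ts)
CLUSTER BY customer_id
AS SELECT * FROM `shop-data-project.raw.orders`
""").result()

# A dry run reports how many bytes a query would scan without executing it,
# which is a cheap way to check that partition pruning is working.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    """
    SELECT customer_id, SUM(total_amount) AS revenue
    FROM `shop-data-project.reporting.orders`
    WHERE DATE(order_ts) BETWEEN '2024-01-01' AND '2024-01-31'
    GROUP BY customer_id
    """,
    job_config=job_config,
)
print(f"Estimated bytes scanned: {job.total_bytes_processed}")
```

Partition pruning on the date filter is what keeps both query latency and on-demand cost down as the reporting table grows.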
their team and assist with the continued scaling and optimisation of these. Their ideal candidate would have good knowledge within: Data Warehousing (Snowflake, Redshift, BigQuery); ETL (Data Fabric, Data Mesh); DevOps (IaC, CI/CD, Containers); Consulting/Client-Facing Experience. In return they would be offering Uncapped Progression …
Birmingham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Responsibilities: Design and build large-scale data warehouses, ETL pipelines, and reporting platforms that are robust and efficient (likely on Snowflake or BigQuery). Utilize your expertise in backend languages; the tech stack is flexible, including Java, Python, .NET, Ruby, and more. Implement strong coding principles, including CI …
Birmingham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
you’ll lead: Full strategic ownership of analytics across growth, health outcomes, customer journeys and commercial metrics. Building and evolving the modern data stack (BigQuery, dbt, Fivetran, Looker, Hex – with full autonomy to optimise). Managing and growing a high-performing analytics team. Driving business-wide decision-making through experimentation …
Birmingham, United Kingdom
Robert Walters …
Birmingham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
the Group, including several high-profile household names. What you'll bring: Experience with cloud and big data technologies (e.g., Spark, Databricks, Delta Lake, BigQuery). Familiarity with eventing technologies (e.g., Event Hubs, Kafka) and file formats such as Parquet, Delta, or Iceberg. Interested in learning more? Get in …
Birmingham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
and managing tickets. Work with Product and Engineers to assess and mitigate risks associated with front-end changes. Preferred Skills and Experience: Experience with BigQuery and working with large datasets. Understanding of complex systems and component interactions. Analytical, organized, and collaborative communication skills. Strong problem-solving abilities and a curious …
BDD. Up-to-date knowledge of Python coding, testing, and debugging. Experience working with large sets of data. Experience with FastAPI or Flask. Experience with BigQuery. This is a contract role outside IR35, so you must be UK-based and have a company registered in the UK. If you're …
Birmingham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
GCP security hardening measures (network segmentation, access controls, encryption, GDPR/ISO 27001 compliance). • Design and implement ETL pipelines for marketing data into BigQuery for Looker. • Optimise BigQuery data models and schemas for Looker exploration and reporting. • Implement robust monitoring, logging, and alerting for infrastructure and security. …
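To make the pipeline work in this last listing concrete, here is a minimal, hypothetical sketch of one ETL step it describes: loading an exported marketing CSV from Cloud Storage into a date-partitioned BigQuery table that Looker can then model. It assumes the google-cloud-bigquery Python client; the bucket, table, and field names are invented, not taken from the advert.

```python
# Minimal sketch of an ETL load step: GCS CSV export -> partitioned BigQuery table.
# Assumes google-cloud-bigquery and default credentials; all names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="marketing-data-project")  # hypothetical project
table_id = "marketing-data-project.marketing.ad_spend"      # hypothetical table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # header row in the export
    autodetect=True,       # infer a schema for the sketch; real pipelines pin one
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    time_partitioning=bigquery.TimePartitioning(field="report_date"),
)

# Load one day's export; an orchestrator would normally schedule this step.
load_job = client.load_table_from_uri(
    "gs://example-marketing-exports/ad_spend/2024-01-31.csv",  # hypothetical bucket
    table_id,
    job_config=job_config,
)
load_job.result()  # wait for the load to finish and surface any errors

table = client.get_table(table_id)
print(f"{table.num_rows} rows now available for Looker models on {table_id}")
```

In practice a pinned schema and an orchestrator such as Composer or Airflow would wrap this step, but the load call itself stays the same.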