London, South East England, United Kingdom (Hybrid / WFH Options)
Finatal
data architecture and processes to improve reliability, data quality and consistency. Requirements: Strong experience working across a modern cloud environment: a GCP stack with a BigQuery or Snowflake data warehouse (DWH), Fivetran for ingestion, and dbt for transformation. Strong skills in building dashboards and using visualisation tools such as Tableau (beneficial), Power BI or …
ability to develop data-driven hypotheses. Data Science Knowledge: Familiarity with concepts like clustering, predictive modelling, and basic statistical techniques. Technical Skills: Knowledge of BigQuery/SQL/Python is desirable but not essential. Strong communication skills - experience presenting to clients and collaborating with internal teams. Experience working with …
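As a minimal illustration of the clustering knowledge mentioned above, the sketch below segments synthetic customer data with scikit-learn's k-means; the feature names and cluster count are hypothetical assumptions, not taken from the listing.

```python
# Minimal customer-segmentation sketch using k-means clustering.
# The behavioural features and k=3 are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Fake features: monthly spend, visits per month, tenure in months.
features = rng.normal(loc=[50, 4, 24], scale=[20, 2, 12], size=(500, 3))

# Standardise so each feature contributes comparably to the distance metric.
scaled = StandardScaler().fit_transform(features)

# Fit k-means and inspect segment sizes and centres.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42).fit(scaled)
print("segment sizes:", np.bincount(kmeans.labels_))
print("segment centres (standardised):\n", kmeans.cluster_centers_)
```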
in hands-on testing of data pipelines and financial reporting systems. Must be comfortable conducting data reconciliation and accuracy testing independently. Proficiency in SQL (BigQuery preferred) to extract, validate, and compare data across systems. Experience testing dbt transformations and ensuring data integrity within cloud-based data environments. Experience supporting …
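As a sketch of the data reconciliation described above, the snippet below compares row counts and summed amounts between a raw table and its transformed counterpart using the google-cloud-bigquery client; the project, dataset, table, and column names are hypothetical.

```python
# Reconciliation sketch: compare counts and totals between a raw source table
# and the transformed warehouse table. All names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

query = """
SELECT
  (SELECT COUNT(*)    FROM `example-project.raw.invoices`)     AS source_rows,
  (SELECT COUNT(*)    FROM `example-project.finance.invoices`) AS warehouse_rows,
  (SELECT SUM(amount) FROM `example-project.raw.invoices`)     AS source_total,
  (SELECT SUM(amount) FROM `example-project.finance.invoices`) AS warehouse_total
"""

row = list(client.query(query).result())[0]

assert row.source_rows == row.warehouse_rows, "row counts diverge"
# Compare monetary totals with a small tolerance rather than exact equality.
assert abs((row.source_total or 0) - (row.warehouse_total or 0)) < 0.01, "totals diverge"
print("reconciliation passed:", dict(row.items()))
```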
a fast-paced startup or technology company. Technical Skills: Proficiency in SQL and Python. Strong statistical and experimental design expertise. Experience with tools like BigQuery, Fivetran, and dbt is desired. Collaboration: You thrive in cross-functional teams, working with engineers, marketers, designers, and leadership to drive impact together. Product …
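For the statistical and experimental-design expertise mentioned above, a minimal A/B test read-out might look like the sketch below; the conversion counts are invented for illustration.

```python
# Minimal A/B significance check: chi-square test on conversion counts.
# The counts are invented; in practice they would come from the warehouse.
from scipy.stats import chi2_contingency

# Rows = variants, columns = [converted, did_not_convert].
observed = [
    [420, 9580],   # variant A: 420 conversions out of 10,000 users
    [505, 9495],   # variant B: 505 conversions out of 10,000 users
]

chi2, p_value, dof, _ = chi2_contingency(observed)
print(f"chi2={chi2:.2f}, p={p_value:.4f}")
if p_value < 0.05:
    print("difference in conversion rate is significant at the 5% level")
else:
    print("no significant difference detected")
```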
native languages, with a dash of Scala when needed. dbt, data modeling, and analytics are your go-to tools; Airflow is your daily companion. BigQuery/GCP hold no secrets for you, and AWS is a trusted friend. You know when to build real-time pipelines, and when not …
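Since dbt and Airflow are called out together above, here is a minimal sketch of an Airflow 2.x DAG that runs a dbt build on a schedule; the project path, target, and schedule are assumptions.

```python
# Minimal Airflow DAG sketch that runs dbt models daily.
# The dbt project path, target name, and schedule are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )

    dbt_run >> dbt_test  # only test once the models have built successfully
```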
good. We're searching for an experienced Analytics Engineer to help us enhance our data warehouse. You'll primarily work in SQL with dbt (BigQuery) but have the opportunity to work in other programming languages, expanding your skillset and making a significant impact on our data capabilities. We're …
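Most of the dbt work described above would be SQL models; as a sketch of driving those builds from Python, dbt-core 1.5+ exposes a programmatic runner. The selector and project setup below are assumptions (a local project with a BigQuery profile already configured).

```python
# Programmatic dbt invocation sketch (dbt-core >= 1.5).
# Assumes a dbt project in the working directory with a BigQuery profile set up;
# the "staging+" selector is a hypothetical example.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# Build a selection of models, then run their tests, mirroring `dbt run` / `dbt test`.
for args in (["run", "--select", "staging+"], ["test", "--select", "staging+"]):
    res: dbtRunnerResult = dbt.invoke(args)
    if not res.success:
        raise RuntimeError(f"dbt {' '.join(args)} failed: {res.exception}")

print("dbt build and tests completed")
```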
quality control processes. What We're Looking For: Proven experience in a data analysis, reporting, or business intelligence role. Strong working knowledge of Google BigQuery and SQL to write and customise queries. Proficiency in data visualisation tools such as Power BI, Tableau, or Google Data Studio. Strong skills in …
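To illustrate the "write and customise queries" requirement above, a parameterised BigQuery query from Python might look like this sketch; the project, dataset, table, and column names are hypothetical.

```python
# Parameterised BigQuery query sketch; all names below are placeholders.
import datetime

from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT channel, COUNT(*) AS orders, SUM(revenue) AS revenue
FROM `example-project.reporting.orders`
WHERE order_date BETWEEN @start_date AND @end_date
GROUP BY channel
ORDER BY revenue DESC
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", datetime.date(2024, 1, 1)),
        bigquery.ScalarQueryParameter("end_date", "DATE", datetime.date(2024, 1, 31)),
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.channel, row.orders, row.revenue)
```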
and DAX. Knowledge of all laws relating to data. The ability to analyse data to identify exceptions/areas for attention. Knowledge of Google BigQuery, Looker and other GCP tools, or a desire to learn. Required Documents: CV/Resume. Application Process: Interested and qualified candidates should kindly CLICK HERE …
put you at the top of our list: Google Cloud Storage. Google Data Transfer Service. Google Dataflow (Apache Beam). Google Pub/Sub. Google Cloud Run. BigQuery or any RDBMS. Python. Debezium/Kafka. dbt (Data Build Tool). Interview process: Interviewing is a two-way process and we want you …
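A minimal sketch tying together several of the technologies listed above (Dataflow/Apache Beam, Pub/Sub, BigQuery): a streaming pipeline in the Beam Python SDK that reads messages and appends them to a table. The subscription, table, and schema are invented; it requires the apache-beam[gcp] extra.

```python
# Streaming sketch: Pub/Sub -> parse JSON -> BigQuery, using the Beam Python SDK.
# Subscription, table, and schema are placeholders; submit with --runner=DataflowRunner
# to execute on Dataflow.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/orders-sub"
        )
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="example-project:analytics.orders",
            schema="order_id:STRING,amount:FLOAT,created_at:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```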
access, storage, manipulation, and interpretation of media performance data across cloud infrastructure technologies, specifically Google Cloud and API tools. Proven experience with SQL, Google BigQuery, Snowflake or similar. Proven experience with technical and commercial stakeholder management. Nice to have: experience within a matrixed organization. Experience with Data Visualization software …
scalable, secure, and efficient. Database Expertise: Familiarity with both SQL and NoSQL database technologies, such as Firestore and PostgreSQL. Experience with data warehouses like BigQuery is a significant plus. Authentication and Authorization: Knowledge of authentication and authorization mechanisms, including JWT (JSON Web Tokens), and experience with custom Identity Providers …
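For the authentication and authorization knowledge mentioned above, a minimal JWT issue-and-verify round trip with PyJWT is sketched below; the secret, issuer, audience, and claims are placeholders, not a real configuration.

```python
# Minimal JWT sketch with PyJWT: issue a token, then verify it on the API side.
# The secret, issuer, audience, and claims are placeholders.
import datetime

import jwt

SECRET = "replace-with-a-real-secret"   # in production, load from a secret manager
AUDIENCE = "example-api"
ISSUER = "example-identity-provider"

def issue_token(user_id: str) -> str:
    claims = {
        "sub": user_id,
        "aud": AUDIENCE,
        "iss": ISSUER,
        "exp": datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(hours=1),
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")

def verify_token(token: str) -> dict:
    # Raises jwt.InvalidTokenError (e.g. ExpiredSignatureError) if verification fails.
    return jwt.decode(token, SECRET, algorithms=["HS256"], audience=AUDIENCE, issuer=ISSUER)

token = issue_token("user-123")
print(verify_token(token)["sub"])  # -> user-123
```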
proficiency in Python (NumPy, Pandas, Scikit-Learn, etc.), SQL, and cloud platforms such as GCP or AWS. Have experience working with modern databases like BigQuery, Snowflake, or Redshift. Have successfully deployed machine learning models or optimisation algorithms in production environments. Have a solid understanding of digital marketing and marketing …
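As a compact sketch of the model-deployment experience described above, assuming synthetic data and a joblib artefact as the hand-off format to a serving layer:

```python
# Train a small scikit-learn pipeline and persist it as a deployable artefact.
# The data is synthetic; in practice features would come from BigQuery/Snowflake/Redshift.
import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
]).fit(X_train, y_train)

print("holdout accuracy:", round(model.score(X_test, y_test), 3))

# Serialise the fitted pipeline; a serving layer (e.g. a container on Cloud Run) loads this file.
joblib.dump(model, "model.joblib")
reloaded = joblib.load("model.joblib")
assert (reloaded.predict(X_test) == model.predict(X_test)).all()
```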
East Hagbourne, Oxfordshire, United Kingdom (Hybrid / WFH Options)
CV Screen
At least 5 years of experience in data analysis with a strong focus on data extraction, analysis, and manipulation. Expertise in GTM/GA4, BigQuery, SQL, and Excel (VBA & Macros). Familiarity with tools like Power BI and Python is a significant advantage. Salary & Benefits: Salary of £60,000 per …
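To make the GTM/GA4 plus BigQuery requirement above concrete, the sketch below queries the daily GA4 export tables for event counts; the project and dataset IDs are placeholders, and the column names follow the standard GA4 BigQuery export layout.

```python
# Query the GA4 BigQuery export (events_YYYYMMDD tables) for daily event counts.
# Project and dataset IDs are placeholders; to_dataframe needs the pandas/db-dtypes extras.
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT
  event_date,
  event_name,
  COUNT(*) AS events,
  COUNT(DISTINCT user_pseudo_id) AS users
FROM `example-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
GROUP BY event_date, event_name
ORDER BY event_date, events DESC
"""

df = client.query(query).to_dataframe()
print(df.head())
```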
Experience with recommendation systems, collaborative filtering, and ideally Graph Neural Networks (GNNs). Proficient with SQL and working with large datasets (e.g., GCP BigQuery). Experience with cloud platforms (specifically GCP) and CI/CD processes (specifically GitHub Actions). Experience deploying models in containerized applications. Excellent problem …
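As a small illustration of the collaborative-filtering side of this role, an item-item similarity recommender over a toy interaction matrix might look like the sketch below; the data is synthetic, and the GNN and BigQuery pieces are out of scope here.

```python
# Toy item-item collaborative filtering: recommend items similar to those a user
# has already interacted with. The interaction matrix is synthetic.
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(7)
n_users, n_items = 50, 12
interactions = (rng.random((n_users, n_items)) > 0.8).astype(float)  # implicit feedback

# Item-item cosine similarity over the user-item matrix (items as rows).
item_sim = cosine_similarity(interactions.T)
np.fill_diagonal(item_sim, 0.0)  # don't recommend an item because of itself

def recommend(user_id: int, k: int = 3) -> list[int]:
    seen = interactions[user_id]
    scores = item_sim @ seen        # aggregate similarity to items the user has seen
    scores[seen > 0] = -np.inf      # exclude already-seen items
    return list(np.argsort(scores)[::-1][:k])

print("recommendations for user 0:", recommend(0))
```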
clients are invoiced correctly. We are looking for someone who ideally has experience working in a data team and is proficient with dbt, BigQuery and Looker. Key Responsibilities: Building new processes to map data to billable events and KPIs, working with data engineers to ensure this. Data Mapping …
SQL, including querying, optimizing, and managing databases. Experience with data processing platforms such as Spark and Hadoop. Demonstrated experience with GCP services such as Dataproc, BigQuery, GCS, IAM, and others, and/or their AWS equivalents. Work well as an individual and as part of a team that excels at …
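A minimal Spark job of the kind that might run on Dataproc is sketched below, reading from and writing back to GCS; the bucket paths and column names are invented, and a BigQuery load (or the Spark-BigQuery connector) could pick up the output.

```python
# Minimal PySpark job sketch (e.g. submitted to a Dataproc cluster).
# The GCS paths and column names are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-orders-rollup").getOrCreate()

orders = spark.read.parquet("gs://example-bucket/raw/orders/")  # hypothetical input path

daily = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "channel")
    .agg(
        F.count("*").alias("orders"),
        F.sum("amount").alias("revenue"),
    )
)

# Write partitioned output back to GCS for downstream loading into the warehouse.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://example-bucket/curated/daily_orders/"
)

spark.stop()
```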
environment. Enthusiastic about emerging technology and Insurtech. Desirable: experience with dbt; experience with version control software; experience within a modern cloud data warehouse, e.g. BigQuery, Snowflake, Databricks; experience working on data within Insurance and/or a B2B company. About Us: We’ve combined groundbreaking AI and industry expertise …
learn, TensorFlow/PyTorch). Statistical methods and machine learning (e.g., A/B testing, model validation). Data pipelining tools like SQL, dbt, BigQuery, or Spark. A strong communicator with the ability to translate technical concepts into layman's terms for a non-technical audience. You're not …
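For the model-validation piece mentioned above, a quick k-fold cross-validation sketch with scikit-learn on synthetic data might look like:

```python
# Model validation sketch: 5-fold cross-validation on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1500, n_features=15, random_state=1)

model = GradientBoostingClassifier(random_state=1)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")

print("ROC AUC per fold:", [round(s, 3) for s in scores])
print("mean:", round(scores.mean(), 3), "std:", round(scores.std(), 3))
```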
OPENMIND RESPONSIBILITIES: Performance: Build and maintain robust data pipelines to collect, transform, and load data from various sources into a centralized data warehouse (e.g., BigQuery). Implement and maintain data quality checks and monitoring systems to ensure data accuracy and reliability. Develop and maintain data models and schemas to … Strong understanding of data engineering principles and data warehousing concepts. Proven experience building and maintaining data pipelines using tools like Google Cloud Platform (e.g., BigQuery) or similar technologies. Experience with data integration platforms like Adverity, Funnel.io, or similar. Proficiency in SQL for data extraction, transformation, and loading (ETL) processes. …
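As a sketch of the "data quality checks and monitoring" responsibility above, a simple count, null-rate, and freshness check against a BigQuery table could look like this; the project, table, and column names are placeholders.

```python
# Simple data quality check sketch for a BigQuery warehouse table:
# row count, null rate on a key column, and freshness. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT
  COUNT(*) AS row_count,
  COUNTIF(campaign_id IS NULL) AS null_campaign_ids,
  TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(loaded_at), HOUR) AS hours_since_last_load
FROM `example-project.marketing.campaign_performance`
"""

checks = list(client.query(query).result())[0]

failures = []
if checks.row_count == 0:
    failures.append("table is empty")
if checks.null_campaign_ids > 0:
    failures.append(f"{checks.null_campaign_ids} rows missing campaign_id")
if checks.hours_since_last_load > 24:
    failures.append(f"data is {checks.hours_since_last_load}h stale")

if failures:
    raise RuntimeError("data quality checks failed: " + "; ".join(failures))
print("all data quality checks passed")
```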