Greater London, England, United Kingdom Hybrid / WFH Options
Ignite Digital Talent
• Strong hands-on experience with Python in a data context
• Proven skills in SQL
• Experience with Data Warehousing (DWH), ideally with Snowflake or similar cloud data platforms (Databricks or Redshift)
• Experience with DBT, Kafka, Airflow, and modern ELT/ETL frameworks
• Familiarity with data visualisation tools like Sisense, Looker, or Tableau
• Solid understanding of data architecture, transformation workflows, and …
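Purely as illustration of the ELT stack this listing names, here is a minimal Airflow-plus-dbt sketch, assuming the Airflow 2.x TaskFlow API; the S3 path, table name, connection handling, and dbt project directory are invented placeholders, not details from the role.

```python
# Hypothetical sketch of a daily ELT run: extract raw data, load it to a
# warehouse stage, then trigger dbt to run the transformation models.
from datetime import datetime
from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def elt_example():
    @task
    def extract_orders() -> str:
        # In a real pipeline this would pull from an API or OLTP source
        # and land the raw file in object storage.
        return "s3://example-bucket/raw/orders.csv"  # placeholder path

    @task
    def load_to_warehouse(path: str) -> None:
        # Placeholder: issue a COPY/INSERT against Redshift or Snowflake
        # via the relevant connector or provider hook.
        print(f"COPY raw.orders FROM '{path}'")

    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # illustrative path
    )

    load_to_warehouse(extract_orders()) >> run_dbt

elt_example()
```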
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
JR United Kingdom
data architecture. Work with technologies such as Python, Java, Scala, Spark, and SQL to extract, clean, transform, and integrate data. Build scalable solutions using AWS services like EMR, Glue, Redshift, Kinesis, Lambda, and DynamoDB. Process large volumes of structured and unstructured data, integrating multiple sources to create efficient data pipelines. Collaborate with engineering teams to integrate data solutions into …
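As a hedged illustration of the duties above, a small PySpark job of the kind that might run on EMR or Glue: read raw JSON from S3, clean and deduplicate it, and write partitioned Parquet back. The bucket names and columns are assumptions made up for the example.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("clean-events").getOrCreate()

# Placeholder input location; on EMR/Glue this would be a real S3 prefix.
raw = spark.read.json("s3://example-raw-bucket/events/")

cleaned = (
    raw.filter(F.col("event_id").isNotNull())           # drop malformed rows
       .dropDuplicates(["event_id"])                     # deduplicate on the key
       .withColumn("event_date", F.to_date("event_ts"))  # derive a partition key
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-curated-bucket/events/"))  # placeholder output
```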
London, England, United Kingdom Hybrid / WFH Options
Builder.ai
new approaches. Extensive software engineering experience with Python (no data science background required). Experience with production microservices (Docker/Kubernetes) and cloud infrastructure. Knowledge of databases like Postgres, Redshift, or Neo4j is a plus.
Why You Should Join
This role sits at the intersection of data science and DevOps. You will support data scientists, design, deploy, and maintain microservices …
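A minimal sketch of the kind of containerised Python microservice this role describes, assuming Flask behind a Docker/Kubernetes deployment; the endpoints and scoring logic are illustrative stand-ins, not the company's actual services.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/healthz")
def healthz():
    # Kubernetes liveness/readiness probes typically hit an endpoint like this.
    return jsonify(status="ok")

@app.route("/score", methods=["POST"])
def score():
    payload = request.get_json(force=True)
    # Placeholder scoring logic; in practice this would call a trained model.
    value = float(payload.get("value", 0.0))
    return jsonify(score=value * 0.5)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```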
London, England, United Kingdom Hybrid / WFH Options
Prolific
hands-on experience deploying production-quality code, with proficiency in Python for data processing and related packages.
• Data Infrastructure Knowledge: Deep understanding of SQL and analytical data warehouses (Snowflake, Redshift preferred), with proven experience implementing ETL/ELT best practices at scale.
• Pipeline Management: Hands-on experience with data pipeline tools (Airflow, dbt) and strong ability to optimise for …
Spotfire. Shape schema design, enrich metadata, and develop APIs for reliable and flexible data access. Optimize storage and compute performance across data lakes and warehouses (e.g., Delta Lake, Parquet, Redshift). Document data contracts, pipeline logic, and operational best practices to ensure long-term sustainability and effective collaboration.
Required Qualifications
Demonstrated experience as a data engineer in biopharmaceutical or …
new technologies essential for automating models and advancing our engineering practices.
• You're familiar with cloud technologies. You have experience working with data in a cloud data warehouse (Redshift, Snowflake, Databricks, or BigQuery).
• Experience with a modern data modeling technology (DBT).
• You document and communicate clearly. Some experience with technical content writing would be a plus.
• You …
business problems.
• Comfort with rapid prototyping and disciplined software development processes.
• Experience with Python, ML libraries (e.g., spaCy, NumPy, SciPy, Transformers, etc.), data tools and technologies (Spark, Hadoop, Hive, Redshift, SQL), and toolkits for ML and deep learning (SparkML, TensorFlow, Keras).
• Demonstrated ability to work on multi-disciplinary teams with diverse skillsets.
• Deploying machine learning models and systems …
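To ground the TensorFlow/Keras end of that toolkit list, a toy sketch that trains a small binary classifier on random data; the shapes, layers, and hyperparameters are arbitrary choices for the example.

```python
import numpy as np
import tensorflow as tf

# Synthetic data: 256 samples of 20 features, labelled by a simple threshold.
X = np.random.rand(256, 20).astype("float32")
y = (X.sum(axis=1) > 10).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))  # [loss, accuracy] on the toy data
```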
performing data analytics on AWS platforms
• Experience in writing efficient SQL and implementing complex ETL transformations on big data platforms.
• Experience with Big Data technologies (Spark, Impala, Hive, Redshift, Kafka, etc.)
• Experience in data quality testing; adept at writing test cases and scripts, presenting and resolving data issues
• Experience with Databricks, Snowflake, and Iceberg is required
Preferred qualifications, capabilities …
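As an illustration of the data-quality testing mentioned here, a pytest-style sketch of simple checks against a loaded table; the table, columns, and allowed values are invented, and a real version would query the warehouse rather than build a toy DataFrame.

```python
import pandas as pd

def load_orders() -> pd.DataFrame:
    # Placeholder: in practice this would query the warehouse (e.g. Redshift).
    return pd.DataFrame(
        {"order_id": [1, 2, 3], "amount": [10.0, 25.5, 7.2], "status": ["paid"] * 3}
    )

def test_order_id_is_unique():
    df = load_orders()
    assert df["order_id"].is_unique

def test_amount_is_non_negative():
    df = load_orders()
    assert (df["amount"] >= 0).all()

def test_status_in_allowed_values():
    df = load_orders()
    assert set(df["status"]).issubset({"paid", "pending", "refunded"})
```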
• Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms.
• Hands-on experience with SQL and cloud data platforms (e.g., Snowflake, Azure, AWS Redshift, GCP BigQuery).
• Experience with BI/analytics tools (e.g., Power BI, Tableau) and data visualization best practices.
• Strong knowledge of data governance, data privacy, and compliance frameworks (e.g., …
and manage DBT models for data transformation and modeling in a modern data stack.
• Proficiency in SQL, Python, and PySpark.
• Experience with AWS services such as S3, Athena, Redshift, Lambda, and CloudWatch.
• Familiarity with data warehousing concepts and modern data stack architectures.
• Experience with CI/CD pipelines and version control (e.g., Git).
• Collaborate with data analysts …
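As a hedged example of the Python-and-AWS glue code implied by that stack, a boto3 sketch that runs an Athena query and polls for completion; the database, query, result bucket, and region are placeholders.

```python
import time
import boto3

athena = boto3.client("athena", region_name="eu-west-2")  # placeholder region

resp = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) FROM analytics.events GROUP BY 1",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = resp["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"Fetched {len(rows)} rows (including header)")
```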
if you have 4+ years of relevant work experience in Analytics, Business Intelligence, or Technical Operations
• Mastery of SQL, Python, and ETL using big data tools (HIVE/Presto, Redshift)
• Previous experience with web frameworks for Python such as Django/Flask is a plus
• Experience writing data pipelines using Airflow
• Fluency in Looker and/or Tableau
• Strong …
London, England, United Kingdom Hybrid / WFH Options
Aldermore Bank PLC
SQL, and Python or similar programming languages for data analysis and manipulation.
• Cloud engineering experience with AWS and Azure
• Experience working with cloud-based data platforms such as AWS Redshift or Azure Synapse, with an understanding of automated Terraform deployment methods
• Strong analytical and problem-solving skills with a detail-oriented approach to data interpretation
• Expertise in data visualization …
Functions. Strong knowledge of scripting languages (e.g., Python, Bash, PowerShell) for automation and data transformation. Proficient in working with databases, data warehouses, and data lakes (e.g., SQL, NoSQL, Hadoop, Redshift). Familiarity with APIs and web services for integrating external systems and applications into orchestration workflows. Hands-on experience with data transformation and ETL (Extract, Transform, Load) processes. Strong …
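As an illustration of the scripted API-to-warehouse integration described here, a short Python sketch that pulls records from a REST endpoint, applies a small transformation, and stages them as CSV for loading; the URL, fields, and output file are assumptions, not a real system.

```python
import csv
import requests

# Placeholder endpoint; a real job would handle auth and pagination.
resp = requests.get("https://api.example.com/v1/tickets", timeout=30)
resp.raise_for_status()
records = resp.json()

with open("tickets_stage.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["id", "status", "opened_at"])
    writer.writeheader()
    for rec in records:
        writer.writerow({
            "id": rec["id"],
            "status": rec.get("status", "unknown").lower(),  # simple transform
            "opened_at": rec.get("opened_at"),
        })
```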
NiFi)
• Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms
• Hands-on experience with SQL and cloud data platforms (e.g., Snowflake, Azure, AWS Redshift, GCP BigQuery)
• Experience with BI/analytics tools (e.g., Power BI, Tableau) and data visualization best practices
• Strong knowledge of data governance, data privacy, and compliance frameworks (e.g., GDPR …
as-Code (IaC) and delivering data platform projects in iterative cycles.
• Non-Microsoft Data Tools – Exposure to or hands-on experience with tools such as Snowflake, Databricks, or AWS Redshift
• Cross-Platform Reporting Tools – Knowledge of BI tools beyond Power BI, such as Tableau or Qlik, for comparative understanding or hybrid deployments.
KEY COMPETENCIES REQUIRED FOR ROLE
Achievement Focus …
scalable product adoption datasets, ensuring ease of downstream integration and rapid onboarding of new events or features.
• Data Egestion: Develop and manage data pipelines for exporting curated datasets from Redshift to platforms like Salesforce and Gainsight using reverse ETL tools (e.g., Hightouch).
• Data Ingestion: Own end-to-end responsibility for ingesting key productivity data from platforms such as …
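A sketch, under assumptions, of the egestion step described above: materialise a curated product-adoption table in Redshift with plain SQL over psycopg2, ready for a reverse ETL tool such as Hightouch to sync onward. Host, credentials, schema, and column names are all placeholders, and the Hightouch sync itself is configured in that tool rather than shown in code.

```python
import psycopg2

# Placeholder connection details; a real pipeline would pull these from a
# secrets manager rather than hard-coding them.
conn = psycopg2.connect(
    host="example-cluster.abc123.eu-west-2.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="***",
)

with conn:
    with conn.cursor() as cur:
        # Rebuild the curated table that the reverse ETL sync reads from.
        cur.execute("DROP TABLE IF EXISTS curated.product_adoption")
        cur.execute(
            """
            CREATE TABLE curated.product_adoption AS
            SELECT account_id,
                   COUNT(DISTINCT user_id) AS active_users,
                   MAX(event_ts)           AS last_seen_at
            FROM raw.product_events
            GROUP BY account_id
            """
        )

conn.close()
```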
Wideopen, England, United Kingdom Hybrid / WFH Options
Working Families Party
in Python and SQL for data engineering and systems development
• Expertise in web application backends (Python/Flask/Django or Node.js)
• Experience with columnar database systems like BigQuery, Redshift, DuckDB, etc.
• Comfortable working in a git-based team environment with collaborative development practices
• Debugging skills across multiple layers of a system (source data, transformation layers, pipelines, infrastructure)
• Exceptional …
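To make the columnar-database requirement concrete, a small sketch using DuckDB, the lightest engine in that list, to run SQL over a Parquet file; the file name and columns are invented for the example, and the same query shape would apply on BigQuery or Redshift.

```python
import duckdb

con = duckdb.connect()  # in-memory database for the example

rows = con.execute(
    """
    SELECT region, COUNT(*) AS n, SUM(amount) AS total
    FROM read_parquet('events.parquet')   -- placeholder file
    GROUP BY region
    ORDER BY total DESC
    LIMIT 10
    """
).fetchall()

for region, n, total in rows:
    print(region, n, total)
```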
London, England, United Kingdom Hybrid / WFH Options
Tasman
Some of the products and platforms that you are likely to come across at Tasman are: AWS, GCP and Azure cloud environments; Airflow and Prefect; Snowflake, BigQuery, Athena, and Redshift; Airbyte, Stitch, Fivetran, and Meltano; dbt (both Cloud and Core); Looker, Metabase, Tableau and Holistics; Docker and Kubernetes; Snowplow, Segment, RudderStack, and mParticle; Metaplane and other observability tools; Census …
Cambridge, England, United Kingdom Hybrid / WFH Options
Bit Bio
bit.bio is an award-winning spinout from the University of Cambridge. Our breakthrough technology combines synthetic and stem cell biology for the precise, efficient and consistent reprogramming of human cells used in research, drug discovery, and cell therapy. At bit.bio …