Technical Expertise: Solid experience in Python programming, particularly using data manipulation and processing libraries such as Pandas, NumPy, and Apache Spark. Hands-on experience with open-source data frameworks like Apache Spark, Apache Kafka, and Apache Airflow. Strong proficiency in SQL, including advanced query development and performance tuning. Good understanding of distributed computing principles and big … automation pipelines. Experience working with relational databases such as PostgreSQL, MySQL, or equivalent platforms. Skilled in using containerization technologies including Docker and Kubernetes. Experience with workflow orchestration tools like Apache Airflow or Dagster. Familiar with streaming data pipelines and real-time analytics solutions. …
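The listing above repeatedly calls out Apache Airflow for workflow orchestration. As a rough illustration of the kind of pipeline code involved, here is a minimal sketch of an Airflow DAG, assuming Airflow 2.x; the DAG ID, task names, and the toy extract/transform callables are hypothetical, not taken from any employer's stack.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step: in practice this might read from Kafka or an API.
    return [{"id": 1, "value": 42}]


def transform(ti):
    # Pull the upstream task's return value from XCom and filter out empty rows.
    rows = ti.xcom_pull(task_ids="extract")
    return [r for r in rows if r["value"] is not None]


with DAG(
    dag_id="example_daily_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```

In a real pipeline the extract and transform callables would typically hand off to Spark jobs or warehouse loads rather than returning data through XCom.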
and managing cloud infrastructure as code. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. Excellent problem-solving skills and attention to detail. Inclusive and curious, continuously seeks to build knowledge …
role, you will: Lead, mentor, and inspire analytics and data engineering teams, fostering technical excellence and a culture of innovation. Architect and optimise ELT pipelines and workflows using dbt, Airflow, SQL, and Python. Oversee the design and evolution of large-scale analytics platforms and data warehouses (Snowflake, BigQuery, Redshift). Ensure best practices in governance, modelling, data quality, and …
Solid understanding of data modeling concepts (star/snowflake schemas, normalization). Experience with version control systems (e.g., Git) and CI/CD practices. Familiarity with orchestration tools (e.g., Airflow, dbt Cloud, Prefect) is a plus. About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable …
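Since the role above asks for star/snowflake schema modeling, here is a small, hedged sketch of how raw records might be split into dimension and fact tables using pandas; the table names, column names, and sample data are invented purely for illustration.

```python
import pandas as pd

# Raw "orders" extract (hypothetical sample data).
orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer_name": ["Ada", "Ada", "Grace"],
    "product_name": ["Widget", "Gadget", "Widget"],
    "amount": [10.0, 25.0, 10.0],
})

# Dimension tables: one row per distinct customer / product, with surrogate keys.
dim_customer = orders[["customer_name"]].drop_duplicates().reset_index(drop=True)
dim_customer["customer_key"] = dim_customer.index + 1

dim_product = orders[["product_name"]].drop_duplicates().reset_index(drop=True)
dim_product["product_key"] = dim_product.index + 1

# Fact table keeps the measures plus foreign keys pointing at the dimensions.
fact_sales = (orders
              .merge(dim_customer, on="customer_name")
              .merge(dim_product, on="product_name")
              [["order_id", "customer_key", "product_key", "amount"]])

print(fact_sales)
```

The same shape is what a warehouse-side star schema would hold, with the surrogate keys generated by the load process rather than by a DataFrame index.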
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
development efficiency and deployment effectiveness, including Azure DevOps or GitHub. Considerable experience designing and building operationally efficient pipelines, utilising core Cloud components, such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer and PySpark etc. Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use. Strong …
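To illustrate the medallion-based architecture mentioned above, the sketch below shows a bronze → silver → gold flow in PySpark; the lake paths, column names, and aggregation are assumptions made for the example, not details from the listing.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

# Bronze: raw ingest, landed as-is (hypothetical path).
bronze = spark.read.json("/lake/bronze/orders/")

# Silver: cleaned and conformed — drop malformed rows, normalise types.
silver = (bronze
          .dropna(subset=["order_id", "amount"])
          .withColumn("amount", F.col("amount").cast("double")))
silver.write.mode("overwrite").parquet("/lake/silver/orders/")

# Gold: curated, aggregated model built for analytical use.
gold = (silver
        .groupBy("customer_id")
        .agg(F.sum("amount").alias("total_spend"),
             F.count("order_id").alias("order_count")))
gold.write.mode("overwrite").parquet("/lake/gold/customer_spend/")
```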
as pytest. Preferred Qualifications Experience working with biological or scientific datasets (e.g. genomics, proteomics, or pharmaceutical data). Knowledge of bioinformatics or large-scale research data. Familiarity with Nextflow, Airflow, or Google Workflows. Understanding of NLP techniques and processing unstructured data. Experience with AI/ML-powered applications and containerised development (Docker). Contract Details Day Rate: £750 (Inside …
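The listing above mentions automated testing with pytest. A minimal, hedged example of testing a small data transformation is sketched below; the normalise_gene_ids helper and its gene-ID column are hypothetical stand-ins for the kind of scientific data the role describes.

```python
# test_transform.py — run with `pytest`
import pandas as pd


def normalise_gene_ids(df: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical transform: upper-case gene IDs and drop rows without one."""
    out = df.dropna(subset=["gene_id"]).copy()
    out["gene_id"] = out["gene_id"].str.upper()
    return out


def test_normalise_gene_ids_uppercases_and_drops_missing():
    raw = pd.DataFrame({"gene_id": ["brca1", None, "tp53"], "count": [3, 1, 7]})
    result = normalise_gene_ids(raw)
    assert list(result["gene_id"]) == ["BRCA1", "TP53"]
    assert len(result) == 2


def test_normalise_gene_ids_handles_empty_frame():
    empty = pd.DataFrame({"gene_id": pd.Series(dtype="object"),
                          "count": pd.Series(dtype="int64")})
    assert normalise_gene_ids(empty).empty
```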
Slough, South East England, United Kingdom Hybrid / WFH Options
Montash
S3 and Redshift, particularly around storage, computation and security. Familiarity with modern BI tools such as Power BI or AWS QuickSight. Experience with open-source data stack tools like Airflow, dbt, Airbyte or similar. Strong grasp of software development best practices and CI/CD processes. Skilled in performance tuning, testing, and automation within data engineering environments. Excellent communication …
Certified Data Analytics – Specialty, Solutions Architect). Snowflake certifications. Experience in leading agile teams or projects using Scrum/Kanban methodologies. Familiarity with modern data stack tools (e.g., dbt, Airflow, Fivetran) is a plus. Why Join Us? Be part of a high-impact team shaping the next generation of data capabilities in the cloud. Work with cutting-edge technology …
Slough, South East England, United Kingdom Hybrid / WFH Options
Oscar
PyTorch). Solid knowledge of SQL and data warehousing concepts. Demonstrated experience with cloud platforms (GCP & Azure preferred). Strong understanding of data modelling, transformation, and orchestration (e.g. dbt, Airflow, Azure Data Factory). Familiarity with digital marketing data flows (e.g. GA4, ad servers, programmatic platforms, CRMs). Proven ability to manage or mentor junior team members, providing technical …
NoSQL. Specialization in designing and implementing large-scale data pipelines, ETL processes, and distributed systems. Should be able to work independently with minimal help/guidance. Good understanding of Airflow, Data Fusion and Dataflow. Strong background and experience in data ingestion, transformation, modeling and performance tuning. Migration experience from Cornerstone to GCP will be an added advantage. Support …
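For the Airflow/Data Fusion/Dataflow requirement above, the sketch below shows a tiny Apache Beam pipeline of the kind that runs on Dataflow; the in-memory source, CSV layout, and per-customer aggregation are illustrative assumptions rather than anything specified in the role.

```python
import apache_beam as beam


def parse_row(line: str) -> dict:
    # Hypothetical CSV layout: customer_id,amount
    customer_id, amount = line.split(",")
    return {"customer_id": customer_id, "amount": float(amount)}


# Runs on the local DirectRunner here; on GCP the same pipeline would target DataflowRunner.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "ReadRaw" >> beam.Create(["c1,10.0", "c2,5.5", "c1,2.5"])  # stand-in for a real source
        | "Parse" >> beam.Map(parse_row)
        | "KeyByCustomer" >> beam.Map(lambda r: (r["customer_id"], r["amount"]))
        | "SumPerCustomer" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```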
Slough, South East England, United Kingdom Hybrid / WFH Options
Experis
Proven experience in developing and deploying machine learning models in production. Solid understanding of data structures, algorithms, and software engineering principles. Experience with ML pipelines and orchestration tools (e.g., Airflow, Kubeflow, MLflow). Proficiency in working with cloud services (AWS, GCP, or Azure). Strong understanding of CI/CD, containerisation (Docker), and orchestration (Kubernetes). Excellent problem-solving …
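As a rough illustration of the ML pipeline tooling listed above, here is a minimal MLflow tracking sketch; the experiment name, model choice, and synthetic dataset are assumptions chosen so the example is self-contained, not details from the listing.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data so the example runs anywhere.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("demo-experiment")  # hypothetical experiment name

with mlflow.start_run():
    n_estimators = 100
    model = RandomForestClassifier(n_estimators=n_estimators, random_state=0)
    model.fit(X_train, y_train)

    accuracy = accuracy_score(y_test, model.predict(X_test))

    # Record parameters, metrics, and the trained model for later comparison or serving.
    mlflow.log_param("n_estimators", n_estimators)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, artifact_path="model")
```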
Slough, South East England, United Kingdom Hybrid / WFH Options
twentyAI
critical client outcomes. Analytics Engineering Leaders: Architect and optimise data pipelines, ELT workflows, and cloud warehouse platforms (Snowflake, BigQuery, Redshift). Lead teams working with dbt, SQL, Python, and Airflow to drive data transformation at scale. Ensure data governance, quality, and modelling standards are upheld across solutions. Work closely with data scientists and stakeholders to turn clean data into … engineering. Have led engineering teams and mentored technical talent in high-performance environments. Are proficient in either modern software stacks (Python, React, cloud-native) or analytics tooling (SQL, dbt, Airflow, cloud warehouses). Bring a strategic mindset, with the ability to connect technical execution to business value. Are committed to innovation, collaboration, and data-driven transformation. Meet eligibility requirements …
fine-tuning, or deployment. Technical Skills: Proficiency in Python and frameworks such as PyTorch, TensorFlow, or JAX. Strong familiarity with distributed computing and data engineering tools (e.g., SLURM, Apache Spark, Airflow). Hands-on experience with LLM training, fine-tuning, and deployment (e.g., Hugging Face, LLamafactory, NVIDIA NeMo). Preferred Qualifications: Advanced degree (MS/PhD) in …
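To make the LLM fine-tuning requirement concrete, below is a hedged sketch using the Hugging Face Trainer API on a small public dataset; the base model (gpt2), dataset slice, and hyperparameters are stand-ins chosen purely for illustration.

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"  # small stand-in; a real project would use a larger base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Tiny public dataset slice as a stand-in for proprietary fine-tuning data.
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")


def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)


tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=tokenized,
    # Causal LM objective: labels are the shifted inputs, no masked-LM.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

At production scale the same training loop would typically be sharded across nodes (e.g. via SLURM or a framework such as NVIDIA NeMo) rather than run single-process as here.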