London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
Desirable Skills for the AWS Data Engineer:
- Experience with Databricks, Kafka, or Kinesis for real-time data streaming
- Knowledge of containerisation (Docker, ECS) and modern orchestration tools such as Airflow
- Familiarity with machine learning model deployment pipelines or data lakehouse architectures
Azure, or GCP, with hands-on experience in cloud-based data services.
- Proficiency in SQL and Python for data manipulation and transformation.
- Experience with modern data engineering tools, including Apache Spark, Kafka, and Airflow.
- Strong understanding of data modelling, schema design, and data warehousing concepts.
- Familiarity with data governance, privacy, and compliance frameworks (e.g., GDPR, ISO27001).
- Hands-on …
architectures (e.g. Databricks, Snowflake)
- Collaborating with multidisciplinary teams to deliver real business value
What we're looking for:
- Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow
- Proven background in data modelling, warehousing, and performance optimisation
- Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.)
- A consultancy mindset: adaptable, collaborative, and delivery-focused …
research and technology teams.
- Exposure to low-latency or real-time systems.
- Experience with cloud infrastructure (AWS, GCP, or Azure).
- Familiarity with data engineering tools such as Kafka, Airflow, Spark, or Dask.
- Knowledge of equities, futures, or FX markets.
Company: Rapidly growing hedge fund with offices globally, including London.
Salary & Benefits: The salary range/rates of pay is …
London, South East, England, United Kingdom Hybrid/Remote Options
Lorien
data storytelling and operational insights. Optimise data workflows across cloud and on-prem environments, ensuring performance and reliability.
Skills & Experience:
- Strong experience in ETL pipeline development using tools like Apache Airflow, Informatica, or similar.
- Advanced SQL skills and experience with large-scale relational and cloud-based databases.
- Hands-on experience with Tableau for data visualisation and dashboarding.
- Exposure …
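The ETL pipeline development and advanced SQL skills this listing describes can be sketched in miniature with Python's standard library alone. This is an illustrative toy, not any employer's stack: the table and column names (`raw_events`, `key`, `amount`) are hypothetical, and SQLite stands in for the large-scale relational databases mentioned above.

```python
import sqlite3

def run_pipeline(conn: sqlite3.Connection, raw_rows: list[tuple[str, int]]) -> list[tuple[str, int]]:
    """Extract raw rows, load them into a staging table, and transform
    them with a set-based SQL aggregate (the 'T' done in the database)."""
    # Load: idempotent table creation, then bulk insert of the extracted rows
    conn.execute("CREATE TABLE IF NOT EXISTS raw_events (key TEXT, amount INTEGER)")
    conn.executemany("INSERT INTO raw_events VALUES (?, ?)", raw_rows)
    # Transform: aggregate per key in SQL rather than row-by-row in Python
    return conn.execute(
        "SELECT key, SUM(amount) FROM raw_events GROUP BY key ORDER BY key"
    ).fetchall()

conn = sqlite3.connect(":memory:")
totals = run_pipeline(conn, [("a", 1), ("a", 2), ("b", 5)])
print(totals)  # [('a', 3), ('b', 5)]
```

In a real deployment the load and transform steps would be separate, scheduled tasks (for example Airflow operators) rather than one function.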
London, South East, England, United Kingdom Hybrid/Remote Options
Searchability
are giving express consent for us to process (subject to required skills) your application to our client in conjunction with this vacancy only. KEY SKILLS: GCP | Python | SQL | MongoDB | Airflow | dbt | Terraform | Docker | ETL | AI | Machine Learning
Machine Learning: Experience supporting and understanding ML pipelines and models in a production setting.
- Direct experience with Google Cloud Platform, BigQuery, and associated tooling.
- Experience with workflow tools like Airflow or Kubeflow.
- Familiarity with dbt (Data Build Tool).
Please send your CV for more information on these roles.
Reasonable Adjustments: Respect and equality are core values to us.
London, South East, England, United Kingdom Hybrid/Remote Options
Sanderson
for individuals with:
- Experience: Proven background as a Machine Learning Engineer.
- Technical Skills: Strong in SQL and Python (Pandas, Scikit-learn, Jupyter, Matplotlib).
- Data transformation & manipulation: experience with Airflow, dbt and Kubeflow.
- Cloud: Experience with GCP and Vertex AI (developing ML services).
- Expertise: Solid understanding of computer science fundamentals and time-series forecasting.
- Machine Learning: Strong grasp …
Poole, Dorset, United Kingdom Hybrid/Remote Options
Aspire Personnel Ltd
stakeholders, understanding and translating their needs into technical requirements. Possess outstanding communication and interpersonal skills, facilitating clear and effective collaboration within and outside the team.
Desirables:
- Familiarity with the Apache Airflow platform.
- Basic knowledge of BI tools such as Power BI to support data visualisation and insights.
- Experience with version control using Git for collaborative and organised code …
cross-functional teams to build scalable data pipelines and contribute to digital transformation initiatives across government departments.
Key Responsibilities
- Design, develop and maintain robust data pipelines using PostgreSQL and Airflow or Apache Spark
- Collaborate with frontend/backend developers using Node.js or React
- Implement best practices in data modelling, ETL processes and performance optimisation
- Contribute to containerised deployments … essential
- Operate within Agile teams and support DevOps practices
What We're Looking For
- Proven experience as a Data Engineer in complex environments
- Strong proficiency in PostgreSQL and either Airflow or Spark
- Solid understanding of Node.js or React for integration and tooling
- Familiarity with containerisation technologies (Docker/Kubernetes) is a plus
- Excellent communication and stakeholder engagement skills
- Experience …
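The core idea behind the orchestration tools these listings keep naming (Airflow, Kubeflow) is running tasks in dependency order over a directed acyclic graph. Here is a minimal stand-in using only Python's standard-library `graphlib`, with no Airflow dependency; the task names (`extract`, `transform`, `load`) are hypothetical and chosen only to mirror a typical ETL pipeline.

```python
from graphlib import TopologicalSorter

results: list[str] = []

# Three toy pipeline tasks; a real DAG's tasks would hit databases or APIs.
def extract():   results.append("extract")
def transform(): results.append("transform")
def load():      results.append("load")

tasks = {"extract": extract, "transform": transform, "load": load}

# Edges read "task: set of tasks it depends on" —
# transform needs extract's output, load needs transform's.
deps = {"transform": {"extract"}, "load": {"transform"}}

# static_order() yields a valid execution order for the DAG.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(results)  # ['extract', 'transform', 'load']
```

An orchestrator like Airflow adds scheduling, retries, and observability on top of this ordering, but the dependency graph is the same concept.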