driven decision-making. What we'd like to see from you: 3–5 years of experience in data integration, orchestration, or automation roles. Solid experience with orchestration tools (e.g., Apache Airflow, MuleSoft, Dell Boomi, Informatica Cloud). Familiarity with cloud data platforms (e.g., AWS, Microsoft Azure, Google Cloud Platform) and related data movement technologies, including AWS Lambda and Azure …
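Several listings in this digest ask for Apache Airflow orchestration experience. For readers unfamiliar with it, here is a minimal DAG sketch, assuming Airflow 2.x; the DAG id, schedule, and the extract/load callables are illustrative placeholders, not taken from any listing.

```python
# Minimal Apache Airflow 2.x DAG sketch (hypothetical task names and callables).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull records from a source system.
    print("extracting")


def load():
    # Placeholder: write records to the warehouse.
    print("loading")


with DAG(
    dag_id="example_integration",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # run once per day (Airflow 2.4+ keyword)
    catchup=False,       # don't backfill missed runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # extract runs before load
```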
HuggingFace, scikit-learn, PyTorch, Pandas, NumPy, SciPy. Experience with AWS (principally EC2, S3, SageMaker) or Azure/GCP equivalents. Some experience of designing, developing and deploying scalable infrastructure (e.g., Apache Airflow, Luigi or other cluster management software). Object-oriented concepts and design. The ability to design and build unit-tested and well-documented modular code. Understanding of Agile software …
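The listing above centres on scikit-learn/PyTorch modelling and unit-tested modular code. As a rough illustration of that style, here is a minimal scikit-learn sketch; the synthetic data and the choice of model are assumptions for demonstration only.

```python
# Minimal, testable modelling sketch with scikit-learn (synthetic data; illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))              # synthetic features
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # synthetic binary target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A pipeline keeps preprocessing and the model as one unit-testable object.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```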
Bash, Ansible. DevOps & CI/CD: Jenkins, GitLab CI/CD, Terraform. Cloud & Infrastructure: AWS. Testing & Quality: Cucumber, SonarQube. Monitoring & Logging: ELK Stack (Elasticsearch, Logstash, Kibana), Grafana. Dataflow & Integration: Apache NiFi. Experience across multiple areas is desirable; we don't expect you to know everything, but a willingness to learn and contribute across the stack is key. …
APIs within AWS (S3, Lambda, Glue, Athena, EventBridge, DynamoDB, API Gateway, AppSync) and Snowflake. Possess strong IaC (preferably AWS CDK) and Python skills. Any experience with AWS Kinesis/Apache Flink or other real-time streaming analytics setups is a bonus. Ideally, you come from a software engineering background and are familiar with common practices such as environment separation …
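This listing names AWS CDK as the preferred IaC tool. As a minimal sketch of what that looks like in Python, assuming CDK v2, here is a stack with one bucket and one Lambda; the stack, bucket, and function names are hypothetical.

```python
# Minimal AWS CDK v2 sketch in Python (hypothetical stack and resource names).
import aws_cdk as cdk
from aws_cdk import aws_lambda as _lambda, aws_s3 as s3
from constructs import Construct


class DataPlatformStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Landing bucket for raw files.
        bucket = s3.Bucket(self, "LandingBucket")

        # Small Lambda that could react to new objects (inline handler for brevity).
        fn = _lambda.Function(
            self,
            "IngestFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="index.handler",
            code=_lambda.Code.from_inline(
                "def handler(event, context):\n    return {'ok': True}\n"
            ),
        )
        bucket.grant_read(fn)  # least-privilege read access


app = cdk.App()
DataPlatformStack(app, "DataPlatformStack")
app.synth()
```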
data into a data platform using Fivetran. Experience of developing BI dashboards using Power BI. Knowledge of security concepts relevant to Azure. Experience of workflow management tools such as Apache Airflow. Interested in the role? Complete the online application. We look forward to getting to know you. Discover more about LGT Wealth Management. A message from our CEO Ben …
best practices. Automation & Monitoring: Implement and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: Lead post-incident reviews, perform root cause analysis for data disruptions, and implement corrective actions, while …
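The automation-and-monitoring duties above typically boil down to small scripted checks run on a schedule. A sketch of a data-freshness check follows; sqlite3 stands in for the real warehouse client, and the audit table name and SLA threshold are assumptions.

```python
# Hypothetical data-freshness check of the kind an Airflow task or cron job might run.
import datetime as dt
import sqlite3  # stand-in for the real warehouse driver

MAX_LAG = dt.timedelta(hours=2)  # assumed freshness SLA


def latest_load_time(conn: sqlite3.Connection) -> dt.datetime:
    # Assumes an audit table recording each pipeline load as a naive-UTC ISO timestamp.
    row = conn.execute("SELECT MAX(loaded_at) FROM load_audit").fetchone()
    return dt.datetime.fromisoformat(row[0])


def check_freshness(conn: sqlite3.Connection) -> None:
    lag = dt.datetime.utcnow() - latest_load_time(conn)
    if lag > MAX_LAG:
        # In production this would page or post to an incident channel.
        raise RuntimeError(f"data is stale: last load {lag} ago exceeds {MAX_LAG}")
    print(f"fresh: last load {lag} ago")
```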
on healthcare data preferred. Familiarity with NoSQL databases (e.g., MongoDB) and relational databases (e.g., PostgreSQL, MySQL). 5+ years in Python and SQL work. Knowledge of ETL tools (e.g., Apache Airflow) and cloud platforms (e.g., AWS, Azure, GCP). Understanding of data modelling concepts and best practices. Experience with healthcare data standards (e.g., HL7, FHIR, ICD, SNOMED, DICOM) preferred. Excellent …
London, England, United Kingdom (Hybrid / WFH Options)
FDM Group
Node.js and React.js. Utilise serverless functions and event-driven architecture. Lead the design and development of real-time streaming and batch data pipelines using technologies such as Apache Spark, Kafka, Lambda, Step Functions and Snowflake. Lead the design and development of infrastructure using technologies such as Kubernetes, Lambda, Terraform, Cloud Custodian and AWS Transit Gateway. Utilise serverless …
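For the Spark/Kafka streaming pipelines this role leads, here is a minimal PySpark Structured Streaming sketch. It assumes the spark-sql-kafka connector is on the classpath, and the broker address and topic name are placeholders.

```python
# PySpark Structured Streaming sketch reading from Kafka (broker/topic are placeholders).
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("stream-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

# Kafka keys and values arrive as bytes; cast to string before any parsing.
decoded = events.select(col("key").cast("string"), col("value").cast("string"))

query = (
    decoded.writeStream.format("console")  # console sink for demonstration
    .outputMode("append")
    .start()
)
query.awaitTermination()
```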
physical), metadata management, and master data management (MDM). Deep understanding of data integration, transformation, and ingestion techniques using modern tools (e.g., Azure Data Factory, Boomi, Informatica, Talend, dbt, Apache NiFi). Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud data platforms (e.g., Snowflake, Azure …
London, England, United Kingdom (Hybrid / WFH Options)
Made Tech Limited
strategies. Strong experience in IaC and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes). Ability to create data pipelines on a …
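The batch side of "handling and transforming various data types with Apache Spark" often looks like the sketch below: read semi-structured input, clean it, write columnar output. The file paths and the order_ts column are hypothetical.

```python
# PySpark sketch transforming CSV into partitioned Parquet (paths/columns are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql.functions import to_date

spark = SparkSession.builder.appName("batch-sketch").getOrCreate()

# inferSchema is convenient for a sketch; production jobs usually declare an
# explicit schema instead.
raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("/data/raw/orders.csv")
)

cleaned = (
    raw.dropDuplicates()
       .withColumn("order_date", to_date("order_ts"))  # assumes an order_ts column
)

# Columnar output, partitioned for downstream query engines.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders")
```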
fields. Familiarity with cloud platforms (AWS, Azure, or GCP) for ML model deployment. Knowledge of MLOps practices and tools for experiment tracking. Experience with big data processing frameworks like Apache Spark. Knowledge of NoSQL databases (MongoDB, Elasticsearch) for handling unstructured data. Experience with data versioning and feature stores for machine learning. Proficiency in model deployment using containerization (Docker, Kubernetes …
models on domain-specific data. Experience with cloud platforms like Azure, AWS, or GCP for machine learning workflows. Understanding of data engineering pipelines and distributed data processing (e.g., Databricks, Apache Spark). Strong analytical skills, with the ability to transform raw data into meaningful insights through AI techniques. Ability to work both independently and collaboratively in a dynamic, multicultural …
Salford, England, United Kingdom (Hybrid / WFH Options)
QinetiQ Limited
are some things we've worked on recently that might give you a better sense of what you'll be doing day to day: improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking; implementing AWS Service Control Policies to manage global access privileges; validating and converting data into a common data format …
data engineering or a related field. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. There's no place quite like BFS and we're proud of that. And it's …
London, England, United Kingdom (Hybrid / WFH Options)
The Remote Job Journal
Azure cloud services across a mixture of Enterprise and SME environments. Proficiency in Python, SQL, Azure Data Factory, Azure Synapse Analytics, Azure Data Lakes, and big data technologies like Apache Spark. Experience with DevOps practices and CI/CD pipelines in an Azure environment is a plus. Certification in Azure (e.g., Microsoft Certified: Azure Data Engineer Associate) is highly …
logical, physical), metadata management, and master data management (MDM). Deep understanding of data integration, transformation, and ingestion techniques using modern tools (e.g., Azure Data Factory, Boomi, Informatica, Talend, dbt, Apache NiFi). Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud data platforms (e.g., Snowflake, Azure, AWS …
Azure). Experience managing PKI/X.509 certificate infrastructure. Extensive experience supporting and implementing TLS/SSL certificate management systems. Proficient with token-based authentication services, Perfect Forward Secrecy (PFS), Apache, Nginx, HAProxy. Solid knowledge of Linux security and system operations. Benefits: Roku is committed to offering a diverse range of benefits as part of our compensation package to support …
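Certificate lifecycle work of the kind this listing describes often starts with simple expiry monitoring. Here is a standard-library Python sketch of such a check; the host and port are placeholders.

```python
# Standard-library sketch: days until a server's TLS certificate expires (endpoint is a placeholder).
import datetime as dt
import socket
import ssl

HOST, PORT = "example.com", 443  # placeholder endpoint


def cert_days_remaining(host: str, port: int) -> int:
    ctx = ssl.create_default_context()  # verifies the chain and hostname
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # Parse the certificate's notAfter field into an aware UTC datetime.
    expires = dt.datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), dt.timezone.utc
    )
    return (expires - dt.datetime.now(dt.timezone.utc)).days


if __name__ == "__main__":
    print(f"{HOST}: certificate expires in {cert_days_remaining(HOST, PORT)} days")
```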
London, England, United Kingdom (Hybrid / WFH Options)
Aker Systems Limited
exploring new technologies and methodologies to solve complex data challenges. Proven experience leading data engineering projects or teams. Expertise in designing and building data pipelines using frameworks such as Apache Spark, Kafka, Glue, or similar. Solid understanding of data modelling concepts and experience working with both structured and semi-structured data. Strong knowledge of public cloud services, especially AWS …