managing databases (we use Elasticsearch/MongoDB/PostgreSQL). Experience with SQL. Experience with data versioning tools. Experience developing and maintaining data infrastructure for ETL pipelines, such as Apache Airflow. EPIC JOB + EPIC BENEFITS = EPIC LIFE We pay 100% for benefits except for PMI (for dependents). Our current benefits package includes pension, private medical insurance, health …
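The ETL-pipeline work this listing describes reduces to an extract/transform/load pattern that Airflow would schedule. A minimal sketch in Python, using the stdlib `sqlite3` module as a stand-in for PostgreSQL (the table and column names here are invented for illustration, not taken from the listing):

```python
import sqlite3

def extract(conn):
    """Pull raw rows from a staging table."""
    return conn.execute("SELECT name, amount FROM raw_orders").fetchall()

def transform(rows):
    """Normalise names and drop non-positive amounts."""
    return [(name.strip().lower(), amount) for name, amount in rows if amount > 0]

def load(conn, rows):
    """Write cleaned rows to the reporting table."""
    conn.executemany("INSERT INTO clean_orders VALUES (?, ?)", rows)
    conn.commit()

# An in-memory database stands in for PostgreSQL in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (name TEXT, amount REAL)")
conn.execute("CREATE TABLE clean_orders (name TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(" Alice ", 10.0), ("BOB", -5.0), ("Carol", 7.5)])

load(conn, transform(extract(conn)))
cleaned = conn.execute("SELECT name, amount FROM clean_orders ORDER BY name").fetchall()
```

In a real Airflow deployment, each of the three functions would typically become its own task so failures can be retried independently.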
Experience applying QM techniques to synthesis prediction, including using QM toolkits (e.g., PSI4, ORCA, Gaussian). Experience with data curation and processing from heterogeneous sources; familiarity with tools like Apache Spark or Hadoop. Proficiency with cloud platforms (AWS, GCP, Azure). Familiarity with major machine learning frameworks (e.g., scikit-learn, TensorFlow, PyTorch). Open-source contributions or publications demonstrating …
driven decision-making. What we'd like to see from you: 3–5 years of experience in data integration, orchestration, or automation roles. Solid experience with orchestration tools (e.g., Apache Airflow, MuleSoft, Dell Boomi, Informatica Cloud). Familiarity with cloud data platforms (e.g., AWS, Microsoft Azure, Google Cloud Platform) and related data movement technologies, including AWS Lambda and Azure …
Hugging Face, scikit-learn, PyTorch, Pandas, NumPy, SciPy. Experience with AWS (principally EC2, S3, SageMaker) or Azure/GCP equivalents. Some experience of designing, developing and deploying scalable infrastructure (e.g., Apache Airflow, Luigi or other workflow orchestration software). Object-Oriented concepts and design. The ability to design and build unit-tested and well-documented modular code. Understanding of Agile software …
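"Unit-tested and well-documented modular code", as this listing asks for, can be illustrated with one small self-contained function: a docstring states the contract, input validation fails fast, and the function is trivially testable in isolation (the function itself is invented for illustration):

```python
def rolling_mean(values, window):
    """Return the rolling mean of `values` over `window` consecutive points.

    Raises ValueError if `window` is smaller than 1. Returns an empty
    list when there are fewer values than the window requires.
    """
    if window < 1:
        raise ValueError("window must be >= 1")
    out = []
    for i in range(len(values) - window + 1):
        out.append(sum(values[i:i + window]) / window)
    return out
```

Because the function takes plain sequences and returns a plain list, a unit test needs no fixtures or mocks.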
Bash, Ansible. DevOps & CI/CD: Jenkins, GitLab CI/CD, Terraform. Cloud & Infrastructure: AWS. Testing & Quality: Cucumber, SonarQube. Monitoring & Logging: ELK Stack (Elasticsearch, Logstash, Kibana), Grafana. Dataflow & Integration: Apache NiFi. Experience across multiple areas is desirable; we don't expect you to know everything, but a willingness to learn and contribute across the stack is key.
APIs within AWS (S3, Lambda, Glue, Athena, EventBridge, DynamoDB, API Gateway, AppSync) and Snowflake. Possess strong IaC (preferably AWS CDK) and Python skills. Any experience with AWS Kinesis/Apache Flink or other real-time streaming analytics setups is a bonus. Ideally, you come from a software engineering background and are familiar with common practices such as environment separation …
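The environment separation mentioned at the end of this listing is often handled by resolving a per-environment configuration object before any infrastructure is synthesised. A minimal Python sketch of that pattern (the names and values are hypothetical, and this is not actual CDK API usage):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StackConfig:
    """Settings that differ between deployment environments."""
    env_name: str
    bucket_prefix: str
    lambda_memory_mb: int

# Hypothetical per-environment values; real projects would load these
# from context, parameter store, or a config file.
CONFIGS = {
    "dev":  StackConfig("dev",  "myapp-dev",  256),
    "prod": StackConfig("prod", "myapp-prod", 1024),
}

def get_config(env: str) -> StackConfig:
    """Look up the config for one environment, failing loudly on typos."""
    try:
        return CONFIGS[env]
    except KeyError:
        raise ValueError(f"unknown environment: {env}") from None
```

Keeping the config frozen and resolved up front means dev and prod stacks are built from the same code path and differ only in data.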
London, England, United Kingdom Hybrid / WFH Options
Modo Energy Limited
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
data into a data platform using Fivetran. Experience of developing BI dashboards using Power BI. Knowledge of security concepts relevant to Azure. Experience of workflow management tools such as Apache Airflow. Interested in the role? Complete the online application. We look forward to getting to know you.
best practices. Automation & Monitoring: Implement and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: Lead post-incident reviews, perform root cause analysis for data disruptions, and implement corrective actions, while …
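A common building block for the anomaly detection this listing describes is a z-score filter over a metric series: flag any point far from the series mean in standard-deviation units. A stdlib-only sketch (the threshold value is illustrative):

```python
import statistics

def detect_anomalies(series, threshold=3.0):
    """Return indices of points more than `threshold` standard
    deviations from the mean of `series`.

    A constant series has zero spread, so nothing is flagged.
    """
    mean = statistics.fmean(series)
    stdev = statistics.pstdev(series)
    if stdev == 0:
        return []
    return [i for i, x in enumerate(series) if abs(x - mean) / stdev > threshold]
```

In production this would run over a rolling window of recent metrics, since a single global mean hides gradual drift.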
on healthcare data preferred. Familiarity with NoSQL databases (e.g., MongoDB) and relational databases (e.g., PostgreSQL, MySQL). 5+ years of experience with Python and SQL. Knowledge of ETL tools (e.g., Apache Airflow) and cloud platforms (e.g., AWS, Azure, GCP). Understanding of data modelling concepts and best practices. Experience with healthcare data standards (e.g., HL7, FHIR, ICD, SNOMED, DICOM) preferred. Excellent …
London, England, United Kingdom Hybrid / WFH Options
FDM Group
Node.js and React. Utilise serverless functions and event-driven architecture. Lead the design and development of real-time streaming and batch data pipelines using technologies such as Apache Spark, Kafka, Lambda, Step Functions and Snowflake. Lead the design and development of infrastructure using technologies such as Kubernetes, Lambda, Terraform, Cloud Custodian and AWS Transit Gateway. Utilise serverless …
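The tumbling-window aggregation at the heart of many Spark Structured Streaming or Kafka streaming pipelines can be sketched in plain Python: assign each event to a fixed-width time window and count per key (the event shape and window size here are invented for illustration):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping
    windows and count occurrences per key in each window.

    Mimics, in miniature, what a streaming engine's tumbling-window
    aggregation computes over a micro-batch.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Align the timestamp down to the start of its window.
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)
```

A real streaming job adds what this sketch omits: watermarking for late events and incremental state rather than one pass over a list.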
physical), metadata management, and master data management (MDM). Deep understanding of data integration, transformation, and ingestion techniques using modern tools (e.g., Azure Data Factory, Boomi, Informatica, Talend, dbt, Apache NiFi). Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud data platforms (e.g., Snowflake, Azure …
London, England, United Kingdom Hybrid / WFH Options
Made Tech Limited
strategies. Strong experience in IaC and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create data pipelines on a …
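Transforming between data types such as JSON and CSV, as this listing asks for, needs only the standard library for small cases; Spark applies the same idea at cluster scale. A sketch (the field names below are invented):

```python
import csv
import io
import json

def json_records_to_csv(json_text, fields):
    """Convert a JSON array of objects into CSV text, keeping only
    `fields`. Missing keys become empty cells; extra keys are ignored
    rather than raising.
    """
    records = json.loads(json_text)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    for rec in records:
        writer.writerow({f: rec.get(f, "") for f in fields})
    return buf.getvalue()
```

Declaring the schema up front (`fields`) is what makes heterogeneous records land in a consistent tabular shape.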
fields. Familiarity with cloud platforms (AWS, Azure, or GCP) for ML model deployment. Knowledge of MLOps practices and tools for experiment tracking. Experience with big data processing frameworks like Apache Spark. Knowledge of NoSQL databases (MongoDB, Elasticsearch) for handling unstructured data. Experience with data versioning and feature stores for machine learning. Proficiency in model deployment using containerization (Docker, Kubernetes) …
models on domain-specific data. Experience with cloud platforms like Azure, AWS, or GCP for machine learning workflows. Understanding of data engineering pipelines and distributed data processing (e.g., Databricks, Apache Spark). Strong analytical skills, with the ability to transform raw data into meaningful insights through AI techniques. Ability to work both independently and collaboratively in a dynamic, multicultural …
Salford, England, United Kingdom Hybrid / WFH Options
QinetiQ Limited
are some things we’ve worked on recently that might give you a better sense of what you’ll be doing day to day: Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking. Implementing AWS Service Control Policies to manage global access privileges. Validating and converting data into a common data format …
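The chunking mentioned for the NiFi flow splits a stream into bounded batches so that per-batch memory stays predictable and back-pressure can be balanced against throughput. A generic Python sketch of the technique:

```python
def chunked(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`.

    Works on any iterable, including generators, so the full input
    never has to be held in memory at once.
    """
    chunk = []
    for item in iterable:
        chunk.append(item)
        if len(chunk) == size:
            yield chunk
            chunk = []
    if chunk:  # emit the final, possibly short, chunk
        yield chunk
```

Tuning `size` is the trade-off the listing alludes to: larger chunks amortise per-batch overhead, smaller ones reduce latency and memory pressure.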
Manchester, England, United Kingdom Hybrid / WFH Options
QinetiQ Limited
are some things we’ve worked on recently that might give you a better sense of what you’ll be doing day to day: Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking. Implementing AWS Service Control Policies to manage global access privileges. Validating and converting data into a common data format …
data engineering or a related field. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. There’s no place quite like BFS and we’re proud of that. And it’s …
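Workflow management tools such as Apache Airflow model a pipeline as a DAG of tasks and execute them in dependency order. The stdlib `graphlib` module demonstrates the core idea without an Airflow installation (the task names are invented):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
dag = {
    "transform": {"extract"},
    "validate":  {"transform"},
    "load":      {"transform", "validate"},
}

# static_order() yields every task after all of its dependencies,
# which is exactly the execution order a scheduler needs.
order = list(TopologicalSorter(dag).static_order())
```

Airflow layers retries, scheduling, and distributed execution on top, but the dependency resolution shown here is the same.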
London, England, United Kingdom Hybrid / WFH Options
Morgan Advanced Materials
Azure cloud services across a mixture of Enterprise and SME environments. Proficiency in Python, SQL, Azure Data Factory, Azure Synapse Analytics, Azure Data Lakes, and big data technologies like Apache Spark. Experience with DevOps practices and CI/CD pipelines in an Azure environment is a plus. Certification in Azure (e.g., Microsoft Certified: Azure Data Engineer Associate) is highly …