Belfast, Northern Ireland, United Kingdom (Hybrid/WFH options)
Data Intellect
Python, SQL and/or Scala. Knowledge of two or more common cloud ecosystems (Azure, AWS, GCP), with expertise in at least one. Deep experience with distributed computing using Apache Spark. Working knowledge of CI/CD for production deployments. Working knowledge of MLOps. Familiarity with designing and deploying performant end-to-end data architectures. Experience with technical project delivery …
London, England, United Kingdom (Hybrid/WFH options)
FIND | Creating Futures
… Engineering (open to professionals from various data engineering backgrounds: data pipelines, ML engineering, data warehousing, analytics engineering, big data, cloud, etc.). Technical Exposure: experience with tools like SQL, Python, Apache Spark, Kafka, cloud platforms (AWS/GCP/Azure), and modern data stack technologies. Formal or Informal Coaching Experience: any previous coaching, mentoring, or training experience, formal or informal …
City of London, England, United Kingdom (Hybrid/WFH options)
ACLED
… English, problem-solving skills, attention to detail, and the ability to work remotely. Desirable: cloud architecture certification (e.g., AWS Certified Solutions Architect). Experience with Drupal CMS, geospatial/mapping tools, Apache Airflow, serverless architectures, API gateways. Interest in conflict data, humanitarian tech, open data platforms; desire to grow into a solution architect or technical lead role. Application Process: submit CV …
… on domain-specific data. Experience working with cloud platforms like Azure, AWS, or GCP for machine learning workflows. Understanding of data engineering pipelines and distributed data processing (e.g., Databricks, Apache Spark). Strong analytical skills, with the ability to transform raw data into meaningful insights through AI techniques. Experience with SQL, ETL processes, and data orchestration tools (e.g., Azure …
… agile environment to deliver data solutions that support key firm initiatives. Build scalable and efficient batch and streaming data workflows within the Azure ecosystem. Apply distributed processing techniques using Apache Spark to handle large datasets effectively. Help drive improvements in data quality, implementing validation, cleansing, and monitoring frameworks. Contribute to the firm’s efforts around data security, governance, and …
… knowledge of ETL processes. Ability to write production-grade, automated testing code. Experience deploying via CI/CD platforms like GitHub Actions or Jenkins. Proficiency with distributed frameworks like Apache Spark. Experience with cloud platforms (AWS, Azure, GCP) and services (S3, Redshift, BigQuery). Knowledge of data modelling, database systems, and SQL optimisation. Other key criteria: knowledge of UK …
… writing, optimization techniques, data modeling, and database performance tuning. Skilled in working with large datasets, building stored procedures, functions, and triggers, and implementing ETL processes. Have used products like Apache Airflow, dbt, GitLab/GitHub, and BigQuery. Demonstrable experience in data modelling, including working with denormalised data structures, testing, asserts, and data cleansing. The other stuff we are looking for …
… to ensure code is fit for purpose. Experience that will put you ahead of the curve: experience using Python on Google Cloud Platform for big data projects (BigQuery, Dataflow (Apache Beam), Cloud Run functions, Cloud Run, Cloud Workflows, Cloud Composer). SQL development skills. Experience using Dataform or dbt. Demonstrated strength in data modelling, ETL development, and data warehousing. Knowledge …
… between systems. Experience with Google Cloud Platform (GCP) is highly preferred (experience with other cloud platforms like AWS and Azure can be considered). Familiarity with data pipeline scheduling tools like Apache Airflow. Ability to design, build, and maintain data pipelines for efficient data flow and processing. Understanding of data warehousing best practices and experience in organising and cleaning up messy …
… Machine Learning tech stack includes: Python; ML libraries (TensorFlow, PyTorch, scikit-learn, transformers, XGBoost, ResNet); geospatial libraries (shapely, geopandas, rasterio); CV libraries (scikit-image, OpenCV, YOLO, Detectron2); AWS, Postgres, Apache Airflow, Kafka, Spark. Mandatory requirements: at least 5 years of experience in data science, with deployment into production; proven experience delivering end-to-end ML solutions that create business …
… managing databases (we use Elasticsearch/MongoDB/PostgreSQL). Experience with SQL. Experience with data versioning tools. Experience developing and maintaining data infrastructure for ETL pipelines, such as Apache Airflow. EPIC JOB + EPIC BENEFITS = EPIC LIFE. We pay 100% for benefits except for PMI (for dependents). Our current benefits package includes pension, private medical insurance, health …
Experience applying QM techniques to synthesis prediction, including using QM toolkits (e.g., Psi4, ORCA, Gaussian). Experience with data curation and processing from heterogeneous sources; familiarity with tools like Apache Spark or Hadoop. Proficiency with cloud platforms (AWS, GCP, Azure). Familiarity with major machine learning frameworks (e.g., scikit-learn, TensorFlow, PyTorch). Open-source contributions or publications demonstrating …
… driven decision-making. What we'd like to see from you: 3–5 years of experience in data integration, orchestration, or automation roles. Solid experience with orchestration tools (e.g., Apache Airflow, MuleSoft, Dell Boomi, Informatica Cloud). Familiarity with cloud data platforms (e.g., AWS, Microsoft Azure, Google Cloud Platform) and related data movement technologies, including AWS Lambda and Azure …
HuggingFace, scikit-learn, PyTorch, Pandas, NumPy, SciPy Experience with AWS (principally EC2, S3, SageMaker) or Azure/GCP equivalents Some experience of designing, developing and deploying scalable infrastructure (eg Apache Airflow, Luigi or other cluster management software) Object Orientated concepts and design The ability to design and build unit-tested and well documented modular code Understanding of Agile software More ❯
… Bash, Ansible. DevOps & CI/CD: Jenkins, GitLab CI/CD, Terraform. Cloud & Infrastructure: AWS. Testing & Quality: Cucumber, SonarQube. Monitoring & Logging: ELK Stack (Elasticsearch, Logstash, Kibana), Grafana. Dataflow & Integration: Apache NiFi. Experience across multiple areas is desirable; we don't expect you to know everything, but a willingness to learn and contribute across the stack is key. #LI-JS2
… APIs within AWS (S3, Lambda, Glue, Athena, EventBridge, DynamoDB, API Gateway, AppSync) and Snowflake. Possess strong IaC (preferably AWS CDK) and Python skills. Any experience with AWS Kinesis/Apache Flink or other real-time streaming analytics setups is a bonus. Ideally, you come from a software engineering background and are familiar with common practices such as environment separation …
… data into a data platform using Fivetran. Experience of developing BI dashboards using Power BI. Knowledge of security concepts relevant to Azure. Experience of workflow management tools such as Apache Airflow. Interested in the role? Complete the online application. We look forward to getting to know you.
… best practices. Automation & Monitoring: Implement and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: Lead post-incident reviews, perform root cause analysis for data disruptions, and implement corrective actions, while …
… on healthcare data preferred. Familiarity with NoSQL databases (e.g., MongoDB) and relational databases (e.g., PostgreSQL, MySQL). 5+ years of Python and SQL work. Knowledge of ETL tools (e.g., Apache Airflow) and cloud platforms (e.g., AWS, Azure, GCP). Understanding of data modelling concepts and best practices. Experience with healthcare data standards (e.g., HL7, FHIR, ICD, SNOMED, DICOM) preferred. Excellent …