with cloud platforms (GCP preferred). Experience with CI/CD pipelines and version control. Proficiency in data visualisation tools (e.g. Tableau, Power BI). Exposure to tools like dbt, Apache Airflow, Docker. Experience working with large-scale datasets (terabyte-level or higher). Excellent problem-solving capabilities. Strong communication and collaboration skills. Proficiency in Python and SQL (or similar) …
Leeds, England, United Kingdom Hybrid / WFH Options
Fruition Group
best practices for data security and compliance. Collaborate with stakeholders and external partners. Skills & Experience: Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda). Proficient in Python, Apache Spark, and SQL. Experience in data warehouse design and data migration projects. Cloud data platform development and deployment. Expertise across data warehouse and ETL/ELT development in AWS …
London, England, United Kingdom Hybrid / WFH Options
Source
GCP) and their relevant data and ML services. Has experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) and data lake technologies (e.g., S3, ADLS). Has experience with Apache Spark (PySpark). Is familiar with workflow orchestration tools (e.g., Airflow, Prefect, Dagster). Is proficient with Git and GitHub/GitLab. Has a strong understanding of relational, NoSQL …
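The Spark (PySpark) experience these listings ask for centres on one idea: transformations over keyed, partitioned collections. That core can be sketched in plain Python with no PySpark dependency (the function names deliberately echo Spark's `groupByKey`/`reduceByKey`, but nothing here is Spark's actual API):

```python
from collections import defaultdict

def group_by_key(pairs):
    """Shuffle step: gather values by key, like Spark's groupByKey."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_by_key(pairs, reducer):
    """Combine each key's values with a reducer, like reduceByKey/agg."""
    return {key: reducer(values) for key, values in group_by_key(pairs).items()}

# Toy dataset: (country, amount) sales records.
sales = [("UK", 10.0), ("US", 7.5), ("UK", 2.5), ("DE", 4.0)]
totals = reduce_by_key(sales, sum)
```

In real Spark the groups are computed per partition and merged across the cluster; the per-key logic is the same.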
London, England, United Kingdom Hybrid / WFH Options
Veeva Systems, Inc
recall, or cost savings. Requirements: Excellent communication skills, used to working in a remote environment. More than 5 years of experience. Expert skills in Python or Java. Experience with Apache Spark. Experience writing software in AWS. Nice to Have: Experience with Data Lakes, Lakehouses, and Warehouses (e.g., Delta Lake, Redshift). Previously worked in agile environments. Experience with expert systems …
Stroud, England, United Kingdom Hybrid / WFH Options
Data Engineer
for best practice and technical excellence, and be someone who actively looks for continual improvement opportunities. Knowledge and skills: Experience as a Data Engineer or Analyst; Databricks/Apache Spark; SQL/Python; Bitbucket/GitHub. Advantageous: dbt, AWS, Azure DevOps, Terraform, Atlassian (Jira, Confluence). About Us: What's in it for you... Healthcare plan, life assurance and …
London, England, United Kingdom Hybrid / WFH Options
Made Tech Limited
strategies. Strong experience in IaC and the ability to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create data pipelines on a …
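"Handling and transforming various data types (JSON, CSV)" in practice means parsing heterogeneous inputs into one common record shape before handing off to Spark, Databricks or similar. A stdlib-only sketch of that normalisation step (the sample field names are illustrative):

```python
import csv
import io
import json

def records_from_csv(text: str) -> list:
    """Parse CSV text into a list of dict records (all values as strings)."""
    return list(csv.DictReader(io.StringIO(text)))

def records_from_json_lines(text: str) -> list:
    """Parse newline-delimited JSON into the same dict-record shape."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

csv_rows = records_from_csv("id,name\n1,ada\n2,grace\n")
json_rows = records_from_json_lines('{"id": "3", "name": "edsger"}\n')
unified = csv_rows + json_rows  # one record shape, two source formats
```

Once everything is a list of uniform dicts, the downstream pipeline no longer cares which format a record arrived in.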
Gloucester, Gloucestershire, South West Hybrid / WFH Options
CGI
change management in production environments. Strong communication skills and a positive, solution-focused mindset, with the ability to adapt to changing client needs. Desirable Skills: Hands-on experience with Apache NiFi for data flow management. Exposure to Java, JavaScript/TypeScript, and Vue for full-stack understanding. Experience with BDD frameworks like Cucumber. Background in supporting or working on …
Manchester, England, United Kingdom Hybrid / WFH Options
QinetiQ Limited
are some things we’ve worked on recently that might give you a better sense of what you’ll be doing day to day: Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking. Implementing AWS Security Control Policies to manage global access privileges. Validating and converting data into a common data format …
Salford, England, United Kingdom Hybrid / WFH Options
Naimuri
are some things we’ve worked on recently that might give you a better sense of what you’ll be doing day to day: Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking. Implementing AWS Security Control Policies to manage global access privileges. Validating and converting data into a common data format …
London, England, United Kingdom Hybrid / WFH Options
Starling Bank
primarily GCP. Experience with some or all of the services below would put you at the top of our list: Google Cloud Storage, Google Data Transfer Service, Google Dataflow (Apache Beam), Google Pub/Sub, Google Cloud Run, BigQuery or any RDBMS, Python, Debezium/Kafka, dbt (Data Build Tool). Interview process: Interviewing is a two-way process and we …
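The Dataflow/Apache Beam model named in this stack is a pipeline of composable transforms applied to a stream of elements. The shape of that model can be sketched with the standard library only (the method names mimic Beam's `ParDo`/`Filter`, but this is not Beam's API, and the message fields are made up):

```python
import json

class Pipeline:
    """Minimal stand-in for a Beam pipeline: chain transforms over a collection."""

    def __init__(self, source):
        self.data = list(source)

    def par_do(self, fn):
        # Like Beam's ParDo: an element-wise, one-to-many transform.
        self.data = [out for element in self.data for out in fn(element)]
        return self

    def filter(self, predicate):
        # Like Beam's Filter: keep only matching elements.
        self.data = [element for element in self.data if predicate(element)]
        return self

# Pretend these arrived as Pub/Sub message payloads.
messages = ['{"user": "a", "clicks": 3}', '{"user": "b", "clicks": 0}']

result = (
    Pipeline(messages)
    .par_do(lambda m: [json.loads(m)])   # parse each message into a record
    .filter(lambda r: r["clicks"] > 0)   # drop empty events
    .data
)
```

In real Beam the same chain runs distributed and windowed on Dataflow; the pipeline-of-transforms structure is identical.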
London, England, United Kingdom Hybrid / WFH Options
Made Tech Limited
could deploy infrastructure into different environments. Owning the cloud infrastructure underpinning data systems through a DevOps approach. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of the possible architectures involved in modern data system design (e.g. Data Warehouses, Data Lakes and Data Meshes) and the different use cases …
London, England, United Kingdom Hybrid / WFH Options
Novo Nordisk
Python, data analytics, deep learning (Scikit-learn, Pandas, PyTorch, Jupyter, pipelines), and practical knowledge of data tools like Databricks, Ray, Vector Databases, Kubernetes, and workflow scheduling tools such as Apache Airflow, Dagster, and Astronomer. GPU Computing: Familiarity with GPU computing, both on-premises and on cloud platforms, and experience in building end-to-end scalable ML infrastructure with on …
people, Boeing Defence UK provides long-term support for more than 120 Boeing military rotary-wing and fixed-wing aircraft in the UK, for example the Chinook and Apache helicopters and the Poseidon and C-17 aircraft. Our support ranges from mission-critical Logistics Information Services and next-generation in-flight digital tools to aircraft and operational modelling and simulation …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
Solutions Architect or Machine Learning Specialty. Databricks Certified Machine Learning Professional. Agile/Scrum Master Certification. Specialized certifications in AI/ML tools or methodologies. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g. …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
Agile/Scrum, Python Programmer. Preferred Qualifications: DOD 8570 IAT Level II Certification may be required (GSEC, GICSP, CND, CySA+, Security+ CE, SSCP or CCNA-Security). Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g. …
Bracknell, Berkshire, United Kingdom Hybrid / WFH Options
Perforce Software, Inc
of Zend products. Participate in an on-call rotation. Requirements: Experience using the AWS EC2 web console and APIs. Deep understanding of the HTTP protocol, including web security and troubleshooting. Apache or Nginx web server administration and configuration experience. Linux system administration experience (Red Hat, Rocky, Alma, Debian, Ubuntu, et al.). Experience maintaining production RDBMS servers such as MySQL/…
London, England, United Kingdom Hybrid / WFH Options
Acord (association For Cooperative Operations Research And Development)
and implementation experience using Python or Java (Required). Bachelor's degree or equivalent in Computer Science, Mathematics or a Finance-related field (Required). Knowledge of workflow management frameworks such as Apache Airflow (Preferred). Knowledge of cloud computing infrastructure, such as AWS (Preferred). Knowledge of BI visualisation tools such as Looker or Power BI (Preferred). Are you the right candidate? …
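Workflow management frameworks like the Apache Airflow mentioned across these listings boil down to one core: running tasks in dependency order over a DAG. A minimal, stdlib-only sketch of that scheduling core (the task names are invented; this is not Airflow's API):

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Task graph: each key lists its upstream dependencies, Airflow-style.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

def run_dag(dag, tasks):
    """Execute the callables in `tasks` in a dependency-respecting order."""
    order = list(TopologicalSorter(dag).static_order())
    results = [tasks[name]() for name in order]
    return order, results

log = []
# Each task here just records that it ran; real tasks would move data.
tasks = {name: (lambda n=name: log.append(n) or n) for name in dag}
order, _ = run_dag(dag, tasks)
```

Airflow adds scheduling, retries, parallelism and observability on top, but a DAG definition plus topological execution is the irreducible part.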
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
verbal communication skills for effective team collaboration. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g. …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
a focus on innovation and continuous improvement. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g. …
Dunfermline, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
will develop scalable data pipelines, ensure data quality, and support business decision-making with high-quality datasets. -Work across the technology stack: SQL, Python, ETL, BigQuery, Spark, Hadoop, Git, Apache Airflow, Data Architecture, Data Warehousing -Design and develop scalable ETL pipelines to automate data processes and optimize delivery -Implement and manage data warehousing solutions, ensuring data integrity through rigorous testing and validation -Lead, plan and execute workflow migration and data orchestration using Apache Airflow -Focus on data engineering and data analytics Requirements: -5+ years of experience in SQL -5+ years of development in Python -Must have strong experience in Apache Airflow -Experience with ETL tools, data architecture, and data warehousing solutions This contract is …
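"Ensuring data integrity through rigorous testing and validation" in an ETL pipeline usually means a quality gate between extract and load: every record is checked against rules, and failures are routed to a rejects queue rather than silently loaded. A stdlib-only sketch (the field names and rules are invented for illustration):

```python
def validate_row(row: dict) -> list:
    """Return the list of data-quality violations for one record."""
    errors = []
    if not row.get("id"):
        errors.append("missing id")
    if not isinstance(row.get("amount"), (int, float)):
        errors.append("amount not numeric")
    elif row["amount"] < 0:
        errors.append("negative amount")
    return errors

def split_valid(rows):
    """Partition records into loadable rows and (row, errors) rejects."""
    good, bad = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            bad.append((row, errors))
        else:
            good.append(row)
    return good, bad

rows = [{"id": 1, "amount": 9.99}, {"id": None, "amount": -5}]
good, bad = split_valid(rows)
```

The rejects list, with its attached error reasons, is what makes the pipeline auditable: nothing is dropped without a recorded cause.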
Livingston, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
will develop scalable data pipelines, ensure data quality, and support business decision-making with high-quality datasets. -Work across the technology stack: SQL, Python, ETL, BigQuery, Spark, Hadoop, Git, Apache Airflow, Data Architecture, Data Warehousing -Design and develop scalable ETL pipelines to automate data processes and optimize delivery -Implement and manage data warehousing solutions, ensuring data integrity through rigorous testing and validation -Lead, plan and execute workflow migration and data orchestration using Apache Airflow -Focus on data engineering and data analytics Requirements: -5+ years of experience in SQL -5+ years of development in Python -Must have strong experience in Apache Airflow -Experience with ETL tools, data architecture, and data warehousing solutions This contract is …
tools, and statistical packages. Strong analytical, problem-solving, and critical thinking skills. Experience with social media analytics and an understanding of user behaviour. Familiarity with big data technologies such as Apache Hadoop, Apache Spark, or Apache Kafka. Knowledge of AWS machine learning services such as Amazon SageMaker and Amazon Comprehend. Experience with data governance and security best practices …
Cambourne, England, United Kingdom Hybrid / WFH Options
Remotestar
skills: Strong knowledge of Scala. Familiarity with distributed computing frameworks such as Spark, KStreams, and Kafka. Experience with Kafka and streaming frameworks. Understanding of monolithic vs. microservice architectures. Familiarity with the Apache ecosystem, including Hadoop modules (HDFS, YARN, HBase, Hive, Spark) and Apache NiFi. Experience with containerization and orchestration tools like Docker and Kubernetes. Knowledge of time-series or analytics …
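The streaming work described here (Kafka/KStreams over time-series data) typically centres on windowed aggregation. The arithmetic of a tumbling window, the simplest case, can be shown in plain Python (the Scala/Kafka machinery is omitted; only the windowing logic is sketched):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per key within fixed, non-overlapping time windows.

    events: iterable of (timestamp_ms, key) pairs.
    Returns {(window_start_ms, key): count}.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Floor each timestamp to the start of its window.
        window_start = (ts // window_ms) * window_ms
        counts[(window_start, key)] += 1
    return dict(counts)

# Events straddling two one-second windows.
events = [(1000, "a"), (1500, "a"), (2100, "a"), (2500, "b")]
counts = tumbling_window_counts(events, window_ms=1000)
```

Kafka Streams adds state stores, late-arrival handling and repartitioning on top; the floor-to-window-start bucketing is the same.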
Manchester, England, United Kingdom Hybrid / WFH Options
Gaming Innovation Group
You're really awesome at: Object-oriented programming (Java). Data modeling using various database technologies. ETL processes (transferring data in-memory, moving away from traditional ETLs) and experience with Apache Spark or Apache NiFi. Applied understanding of CI/CD in change management. Dockerized applications. Using distributed version control systems. Being an excellent team player. Meticulous and passionate …
Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo SpA
exciting and critical challenges to the UK's digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with the option of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and … experience working as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control. Familiarity with data governance …
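The anonymization requirement for sensitive data is usually implemented as a transform sitting in the NiFi/Logstash-style pipeline before documents reach the index. A minimal sketch of one common approach, pseudonymisation via salted hashing (the field names and the truncated-SHA-256 choice are illustrative assumptions, not a prescribed standard):

```python
import hashlib

SENSITIVE_FIELDS = {"email", "ip_address"}  # hypothetical field names

def pseudonymise(doc: dict, salt: str) -> dict:
    """Replace sensitive values with salted SHA-256 digests before indexing.

    Deterministic hashing keeps records joinable across the index without
    exposing raw values; true anonymisation would drop the fields entirely.
    """
    out = dict(doc)
    for field in SENSITIVE_FIELDS & doc.keys():
        digest = hashlib.sha256((salt + str(doc[field])).encode()).hexdigest()
        out[field] = digest[:16]  # truncated digest as a stable pseudonym
    return out

doc = {"email": "user@example.com", "message": "login ok"}
masked = pseudonymise(doc, salt="s3cret")
```

Because the mapping is deterministic for a fixed salt, the same user hashes to the same pseudonym in every document, which preserves analytics (counts, joins) while keeping raw identifiers out of the index.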