open-source ETL, and data pipeline orchestration tools such as Apache Airflow and NiFi. Experience with large-scale/Big Data technologies such as Hadoop, Spark, Hive, Impala, PrestoDB, and Kafka. Experience with workflow orchestration tools like Apache Airflow. Experience with containerisation using Docker and deployment on Kubernetes. Experience with …
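Many of these roles pair Airflow with ETL and containerised deployment. As a hedged illustration of what a minimal Airflow pipeline looks like, here is a sketch using Airflow 2.x's TaskFlow API; the DAG id, schedule, and task bodies are hypothetical placeholders, not taken from any listing above.

```python
# Minimal sketch of an Airflow 2.x DAG using the TaskFlow API.
# The dag_id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(dag_id="example_etl", schedule="@daily",
     start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list:
        # A real pipeline would pull from an API or database here.
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(rows: list) -> list:
        return [{**r, "value": r["value"] * 2} for r in rows]

    @task
    def load(rows: list) -> None:
        print(f"Loading {len(rows)} rows")

    load(transform(extract()))


example_etl()
```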
Platform (GCP). Strong proficiency in SQL and experience with relational databases such as MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at …
Azure, or GCP. Hands-on experience with AI/ML workflows or deploying machine learning models in production. Knowledge of big data technologies like Hadoop, Hive, or Spark. Familiarity with MLOps tools and practices, such as MLflow, Kubeflow, or DataRobot. Education: Bachelor’s degree in Computer Science, Software Engineering …
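Since the listing above groups MLflow with MLOps practice, a minimal experiment-tracking sketch may help make the requirement concrete. This assumes the mlflow and scikit-learn packages; the experiment name, toy dataset, and hyperparameters are hypothetical.

```python
# Minimal sketch of MLflow experiment tracking around a scikit-learn model.
# The experiment name, toy data, and hyperparameters are hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-experiment")
with mlflow.start_run():
    model = LogisticRegression(C=1.0, max_iter=200).fit(X_train, y_train)
    mlflow.log_param("C", 1.0)
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, "model")  # persist the fitted model
```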
a team-oriented environment. Preferred Skills: Experience with programming languages such as Python or R for data analysis. Knowledge of big data technologies (e.g., Hadoop, Spark) and data warehousing concepts. Familiarity with cloud data platforms (e.g., Azure, AWS, Google Cloud) is a plus. Certification in BI tools, SQL, or …
City of London, England, United Kingdom Hybrid / WFH Options
Delta Capita
problem-solving, and critical thinking skills. 8. Experience with social media analytics and understanding of user behaviour. 9. Familiarity with big data technologies, such as Apache Hadoop, Apache Spark, or Apache Kafka. 10. Knowledge of AWS machine learning services, such as Amazon SageMaker and Amazon Comprehend. 11. Experience with data governance and security …
London, England, United Kingdom Hybrid / WFH Options
Luupli
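The AWS services named in point 10 above are typically driven through boto3. A minimal sentiment-analysis sketch with Amazon Comprehend follows; it assumes AWS credentials and a default region are already configured, and the input text is a hypothetical example.

```python
# Minimal sketch: sentiment analysis with Amazon Comprehend via boto3.
# Assumes AWS credentials and a default region are already configured.
import boto3

comprehend = boto3.client("comprehend")
response = comprehend.detect_sentiment(
    Text="The new feed layout makes the app much easier to use.",  # hypothetical input
    LanguageCode="en",
)
print(response["Sentiment"], response["SentimentScore"])
```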
London, England, United Kingdom Hybrid / WFH Options
Trudenty
large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such as Apache Kafka or Flink. MLOps & Deployment: Experience deploying and maintaining large-scale …
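For the real-time processing frameworks mentioned above, a consumer loop is the usual starting point. Here is a minimal sketch using the kafka-python client; the broker address and topic name are hypothetical placeholders.

```python
# Minimal sketch of a real-time consumer using the kafka-python client.
# The broker address and topic name are hypothetical placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "transactions",                         # hypothetical topic
    bootstrap_servers="localhost:9092",     # hypothetical broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # A real pipeline would validate, enrich, and forward the event here.
    print(event)
```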
Oracle, SQL Server, PostgreSQL) and data warehousing technologies. Experience with cloud-based data solutions (AWS, Azure, GCP). Familiarity with big data technologies like Hadoop, Spark, and Kafka. Technical Skills: Proficiency in data modelling (ERD, normalization) and data warehousing concepts. Strong understanding of ETL frameworks and tools (e.g., Talend …
Data Analytics - Specialty or AWS Certified Solutions Architect - Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering on AWS. Benefits: Competitive salary and performance-based bonus …
or Django, Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow, or Argo. Experience with big data technologies such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality. Preparing data for predictive and prescriptive modelling. Hands-on coding experience …
data engineering roles with progressively increasing responsibility. Proven experience designing and implementing complex data pipelines at scale. Strong knowledge of distributed computing frameworks (Spark, Hadoop ecosystem). Experience with cloud-based data platforms (AWS, Azure, GCP). Proficiency in data orchestration tools (Airflow, Prefect, Dagster, or similar). Solid programming skills in …
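As a rough sketch of the kind of Spark pipeline work these roles describe, here is a minimal PySpark job; the input path, column names, and output location are hypothetical.

```python
# Minimal PySpark sketch: read, aggregate, and write a dataset.
# The paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

events = spark.read.parquet("s3://example-bucket/events/")  # hypothetical path
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))        # hypothetical column
    .groupBy("event_date")
    .agg(F.count("*").alias("event_count"))
)
daily.write.mode("overwrite").parquet("s3://example-bucket/daily_counts/")
spark.stop()
```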
in the Palantir Foundry platform is a must. • Experience designing and implementing data analytics solutions on enterprise data platforms and distributed computing (Spark/Hive/Hadoop preferred). • Proven track record of understanding and transforming customer requirements into a best-fit design and architecture. • Demonstrated experience in end-to-end …
Python and R, and ML libraries (TensorFlow, PyTorch, scikit-learn). Hands-on experience with cloud platforms (Azure ML) and big data ecosystems (e.g., Hadoop, Spark). Strong understanding of CI/CD pipelines, DevOps practices, and infrastructure automation. Familiarity with database systems (SQL Server, Snowflake) and API integrations. …
and other programming skills (Spark/Scala desirable). Experience both using and building APIs. Strong SQL background. Some exposure to big data technologies (Hadoop, Spark, Presto, etc.). Works well collaboratively and independently, with a proven ability to form and manage strong relationships within the organisation and with clients. …
Java. Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams. Deep knowledge of database technologies: distributed systems (e.g., Spark, Hadoop, EMR); RDBMS (e.g., SQL Server, Oracle, PostgreSQL, MySQL); NoSQL (e.g., MongoDB, Cassandra, DynamoDB, Neo4j). Solid understanding of software engineering best practices - code reviews, testing …
on mission-critical data pipelines and ETL systems. 5+ years of hands-on experience with big data technology, systems, and tools such as AWS, Hadoop, Hive, and Snowflake. Expertise with common software engineering languages such as Python, Scala, Java, and SQL, and a proven ability to learn new programming languages …
and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. Experience with data quality and/or data lineage …
with programming languages such as Python or Java. Understanding of data warehousing concepts and data modeling techniques. Experience working with big data technologies (e.g., Hadoop, Spark) is an advantage. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Benefits: Enhanced leave - 38 days inclusive of 8 UK …
London, England, United Kingdom Hybrid / WFH Options
ZILO™
of a team. Preferred Qualifications: Master's degree in Computer Science, Data Science, or a related field. Experience with big data technologies such as Hadoop, Spark, or Kafka. Experience with data visualization tools such as Power BI, Tableau, or Qlik. Certifications in Azure data and AI technologies. Benefits: Salary …
data modeling, and ETL/ELT processes. Proficiency in programming languages such as Python, Java, or Scala. Experience with big data technologies such as Hadoop, Spark, and Kafka. Familiarity with cloud platforms like AWS, Azure, or Google Cloud. Excellent problem-solving skills and the ability to think strategically. Strong …