… of the following: Python, SQL, Java
- Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams
- Deep knowledge of database technologies:
  - Distributed systems (e.g., Spark, Hadoop, EMR)
  - RDBMS (e.g., SQL Server, Oracle, PostgreSQL, MySQL)
  - NoSQL (e.g., MongoDB, Cassandra, DynamoDB, Neo4j)
- Solid understanding of software engineering best practices - code reviews, testing frameworks, CI/CD, and …
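As a small illustration of the testing-framework expectation listed above, here is a minimal pytest sketch (not taken from the posting); the `normalise_email` helper and its behaviour are invented purely for the example.

```python
# Hypothetical helper that a data team might cover with unit tests.
def normalise_email(raw: str) -> str:
    """Lower-case an email address and strip surrounding whitespace."""
    return raw.strip().lower()


# pytest discovers functions prefixed with `test_` automatically;
# running `pytest` locally or in a CI pipeline executes them.
def test_normalise_email_strips_and_lowercases():
    assert normalise_email("  Alice@Example.COM ") == "alice@example.com"


def test_normalise_email_handles_already_clean_input():
    assert normalise_email("bob@example.com") == "bob@example.com"
```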
… pipelines and ETL - Informatica
- Experience in SQL and database management systems
- Knowledge of data modelling, warehousing concepts, and ETL processes
- Experience with big data technologies and frameworks such as Hadoop, Hive, and Spark; programming experience in Python or Scala
- Demonstrated analytical and problem-solving skills
- Familiarity with cloud platforms (e.g., Azure, AWS) and their data-related services
- Proactive and detail …
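To make the Hadoop/Hive/Spark and Python requirements above concrete, below is a minimal PySpark ETL sketch; the file paths, column names, and partition layout are assumptions made up for illustration.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal ETL sketch: read raw CSV, clean it, and write a partitioned Parquet table.
spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: the input path is a placeholder, not a real dataset.
raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Transform: drop obviously bad rows, fix types, and derive a date partition column.
cleaned = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("created_at"))
)

# Load: partitioned Parquet is a common layout for downstream Hive/Spark queries.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders")

spark.stop()
```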
Liverpool, Merseyside, North West, United Kingdom Hybrid / WFH Options
Forward Role
… both relational databases (SQL Server, MySQL) and NoSQL solutions (MongoDB, Cassandra)
- Hands-on knowledge of AWS S3 and associated big data services
- Extensive experience with big data technologies including Hadoop and Spark for large-scale dataset processing
- Deep understanding of data security frameworks, encryption protocols, access management and regulatory compliance
- Proven track record building automated, scalable ETL frameworks and …
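For the AWS S3 requirement above, here is a hedged boto3 sketch that lists and downloads objects under a prefix; the bucket name and keys are placeholders, and credentials are assumed to come from the standard AWS environment/config chain.

```python
import boto3

# The client picks up credentials from environment variables, ~/.aws/config, or an IAM role.
s3 = boto3.client("s3")

BUCKET = "example-data-lake"   # placeholder bucket name
PREFIX = "raw/events/2024/"    # placeholder key prefix

# List objects under a prefix (paginated, so large buckets are handled correctly).
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        print(obj["Key"], obj["Size"])

# Download a single object to local disk for inspection; the key is hypothetical.
s3.download_file(BUCKET, PREFIX + "part-0000.json", "/tmp/part-0000.json")
```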
london (city of london), south east england, united kingdom
Hedge Fund
English is required.
Preferred Skills:
- Experience in commodities markets or broader financial markets
- Knowledge of quantitative modeling, risk management, or algorithmic trading
- Familiarity with big data technologies like Kafka, Hadoop, Spark, or similar
Why Work With Us?
- Impactful Work: Directly influence the profitability of the business by building technology that drives trading decisions
- Innovative Culture: Be part of a …
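For the Kafka familiarity mentioned above, here is a minimal consumer sketch using the kafka-python library; the topic name, broker address, and group id are assumptions invented for the example.

```python
import json
from kafka import KafkaConsumer

# Minimal consumer sketch: broker address, topic, and group id are placeholders.
consumer = KafkaConsumer(
    "trades",                                   # hypothetical topic name
    bootstrap_servers="localhost:9092",
    group_id="example-consumer",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

# Each message value is deserialized from JSON by the deserializer above.
for message in consumer:
    trade = message.value
    print(message.topic, message.partition, message.offset, trade)
```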
london (city of london), south east england, united kingdom
Computappoint
… data architecture, integration, governance frameworks, and privacy-enhancing technologies
- Experience with databases (SQL & NoSQL - Oracle, PostgreSQL, MongoDB), data warehousing, and ETL/ELT tools
- Familiarity with big data technologies (Hadoop, Spark, Kafka), cloud platforms (AWS, Azure, GCP), and API integrations
- Desirable: data certifications (TOGAF, DAMA), government/foundational data experience, cloud-native platforms knowledge, AI/ML data requirements …
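To illustrate the ETL/ELT and API-integration points above, here is a small hedged sketch that pulls records from a REST endpoint and loads them into PostgreSQL; the URL, table, columns, and connection details are all invented for the example.

```python
import requests
import psycopg2

# Extract: the API endpoint is a placeholder, not a real service.
response = requests.get("https://api.example.com/v1/customers", timeout=30)
response.raise_for_status()
records = response.json()

# Load: connection parameters are placeholders; in practice they come from config/secrets.
conn = psycopg2.connect(host="localhost", dbname="analytics", user="etl", password="secret")
with conn, conn.cursor() as cur:
    cur.executemany(
        "INSERT INTO staging.customers (id, name, created_at) VALUES (%s, %s, %s)",
        [(r["id"], r["name"], r["created_at"]) for r in records],
    )
conn.close()
```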
Portsmouth, Hampshire, England, United Kingdom Hybrid / WFH Options
Computappoint
… with Trino/Starburst Enterprise/Galaxy administration and CLI operations
- Container Orchestration: Proven track record with Kubernetes/OpenShift in production environments
- Big Data Ecosystem: Strong background in Hadoop, Hive, Spark, and cloud platforms (AWS/Azure/GCP)
- Systems Architecture: Understanding of distributed systems, high availability, and fault-tolerant design
- Security Protocols: Experience with LDAP, Active Directory …
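As a companion to the Trino/Starburst skills listed above, here is a minimal sketch using the open-source `trino` Python client to run a query against a cluster; the host, catalog, schema, and table name are assumptions for illustration.

```python
import trino

# Connection details are placeholders; a Trino/Starburst coordinator is assumed on port 8080.
conn = trino.dbapi.connect(
    host="trino.example.internal",
    port=8080,
    user="analyst",
    catalog="hive",
    schema="default",
)

cur = conn.cursor()
# Simple aggregate query; the orders table is hypothetical.
cur.execute("SELECT region, count(*) AS orders FROM orders GROUP BY region")
for region, orders in cur.fetchall():
    print(region, orders)
```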
london (city of london), south east england, united kingdom
JD.COM
… transform traditional business models with cutting-edge digital solutions. Know more about us: https://corporate.jd.com/
Key Responsibilities:
- Build real-time/offline data pipelines (ETL) using Hadoop/Spark/Flink
- Design data storage solutions (warehouses/lakes) for multi-source heterogeneous data
- Implement data quality monitoring and SQL optimization
Minimum Qualifications:
- Bachelor's+ in Data …
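To ground the data quality monitoring responsibility above, here is a hedged PySpark sketch that computes null rates and duplicate counts for a table; the input path, key column, and other column names are invented for the example.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# The source path and columns are placeholders for whatever table is being monitored.
df = spark.read.parquet("/data/curated/orders")

total = df.count() or 1  # avoid division by zero on an empty table

# Null-rate check per key column.
for column in ["order_id", "customer_id", "amount"]:
    nulls = df.filter(F.col(column).isNull()).count()
    print(f"{column}: {nulls / total:.2%} null")

# Duplicate-key check on the assumed primary key.
duplicates = df.count() - df.dropDuplicates(["order_id"]).count()
print(f"duplicate order_id rows: {duplicates}")

spark.stop()
```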
Reading, England, United Kingdom Hybrid / WFH Options
HD TECH Recruitment
… Services (e.g., Azure Data Factory, Synapse, Databricks, Fabric)
- Data warehousing and lakehouse design
- ETL/ELT pipelines
- SQL, Python for data manipulation and machine learning
- Big Data frameworks (e.g., Hadoop, Spark)
- Data visualisation (e.g., Power BI)
- Understanding of statistical analysis and predictive modelling
Experience:
- 5+ years working with Microsoft data platforms
- 5+ years in a customer-facing consulting or …
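For the statistical analysis and predictive modelling line above, here is a minimal scikit-learn sketch showing a train/test split and a simple model fit; the data is synthetic, generated purely for the example.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Synthetic data purely for illustration: 1,000 rows, 5 numeric features, binary target.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

# Hold out 20% of the rows for evaluation.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression()
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```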
London, South East, England, United Kingdom Hybrid / WFH Options
Advanced Resource Managers Limited
… work experience).
- Proven experience with Trino/Starburst Enterprise/Galaxy administration/CLI
- Implementation experience with container orchestration solutions (Kubernetes/OpenShift)
- Knowledge of Big Data (Hadoop/Hive/Spark) and Cloud technologies (AWS, Azure, GCP)
- Understanding of distributed system architecture, high availability, scalability, and fault tolerance
- Familiarity with security authentication systems such as …
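To illustrate the Kubernetes/OpenShift point above, here is a minimal sketch using the official `kubernetes` Python client to list pods in a namespace; the namespace name is a placeholder and a local kubeconfig is assumed.

```python
from kubernetes import client, config

# Assumes a kubeconfig is available locally (e.g., ~/.kube/config);
# inside a cluster you would call config.load_incluster_config() instead.
config.load_kube_config()

v1 = client.CoreV1Api()

# "trino" is a placeholder namespace for a hypothetical Trino/Starburst deployment.
pods = v1.list_namespaced_pod(namespace="trino")
for pod in pods.items:
    print(pod.metadata.name, pod.status.phase)
```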
… DataStage, Talend and Informatica
- Ingestion mechanisms like Flume & Kafka
- Data modelling – dimensional & transactional modelling using RDBMS, NoSQL and Big Data technologies
- Data visualization – tools like Tableau
- Big data – Hadoop ecosystem, distributions like Cloudera/Hortonworks, Pig and Hive
- Data processing frameworks – Spark & Spark Streaming …
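As an illustration of the Kafka ingestion and Spark Streaming skills above, here is a hedged Spark Structured Streaming sketch that reads a Kafka topic and writes to the console; the broker address and topic are placeholders, and running it assumes the spark-sql-kafka connector package is on the Spark classpath.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

# Broker and topic are placeholders; the spark-sql-kafka-0-10 package must be available.
stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "localhost:9092")
         .option("subscribe", "events")
         .load()
)

# Kafka keys and values arrive as bytes; cast to string for simple inspection.
decoded = stream.select(
    F.col("key").cast("string"),
    F.col("value").cast("string"),
    "timestamp",
)

# The console sink is only for demos; a real pipeline would write to Parquet, Delta, or a warehouse.
query = decoded.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```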