City of London, London, United Kingdom Hybrid / WFH Options
Empresaria Group plc
or Digital Experience). Strong SQL skills for data extraction, transformation, and pipeline development. Proficiency in Tableau, Qlik, or similar visualization tools. Experience with big data tools (Snowflake, Databricks, Spark) and ETL processes. Exposure to Python or R for automation, experimentation, or analytics. Excellent communication and storytelling skills with both technical and non-technical audiences. Proactive, growth-oriented mindset …
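As a rough illustration of the SQL extraction-and-transformation skills this listing asks for, here is a minimal, self-contained sketch using Python's built-in sqlite3; the table, columns, and values are invented purely for the example and are not taken from the posting.

```python
import sqlite3

# Hypothetical in-memory orders table standing in for a warehouse source.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL, created_at TEXT);
    INSERT INTO orders VALUES
        (1, 'acme',   120.0, '2024-01-05'),
        (2, 'acme',    80.0, '2024-01-06'),
        (3, 'globex',  45.5, '2024-01-06');
""")

# Extraction + transformation: aggregate revenue per customer per day.
query = """
    SELECT customer,
           date(created_at) AS day,
           SUM(amount)      AS revenue,
           COUNT(*)         AS order_count
    FROM orders
    GROUP BY customer, date(created_at)
    ORDER BY day, customer
"""
for row in conn.execute(query):
    print(row)
```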
role involves structuring analytical solutions that address business objectives and problem solving. We are looking for hands-on experience in writing code for AWS Glue in Python, PySpark, and Spark SQL. The successful candidate will translate stated or implied client needs into researchable hypotheses, facilitate client working sessions, and be involved in recurring project status meetings. You will develop …
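For context on the hands-on AWS Glue work mentioned above, the following is a minimal sketch of a Glue job written in Python/PySpark with a Spark SQL step. The database, table, and S3 bucket names are hypothetical, and the script assumes it runs inside a Glue job where the awsglue libraries are available.

```python
import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions

# Standard Glue job bootstrap.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a source table registered in the Glue Data Catalog (names are illustrative).
source = glue_context.create_dynamic_frame.from_catalog(
    database="analytics_db", table_name="raw_orders"
)

# Simple column mapping / typing step.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("amount", "string", "amount", "double"),
        ("created_at", "string", "created_at", "timestamp"),
    ],
)

# Spark SQL on the same data for an aggregate view.
mapped.toDF().createOrReplaceTempView("orders")
daily = spark.sql(
    "SELECT date(created_at) AS day, sum(amount) AS revenue "
    "FROM orders GROUP BY date(created_at)"
)

# Write results back out as partitioned Parquet.
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3://example-bucket/curated/daily_revenue/"
)

job.commit()
```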
and ETL/ELT processes. Proficiency in AWS data platforms and services. Solid understanding of data governance principles (data quality, metadata, access control). Familiarity with big data technologies (Spark, Hadoop) and distributed computing. Advanced SQL skills and proficiency in at least one programming language (Python, Java). Additional Requirements: Immediate availability for an October start. Must be UK …
with ML frameworks such as PyTorch, TensorFlow, and JAX. Skilled in using tools like scikit-learn, XGBoost, and LightGBM. Data Engineering & Infrastructure Skills - Comfortable working with big data technologies (Spark, Dask), SQL/NoSQL databases, and cloud platforms (AWS, GCP, Azure). Able to build scalable ML pipelines for large-scale financial data. Model Optimisation & Deployment Experience - Proven track …
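As an illustrative sketch of the modelling toolkit listed here (scikit-learn alongside XGBoost), the snippet below trains a gradient-boosted classifier on synthetic tabular data; the features, labels, and parameters are placeholders and not anything specified by the posting.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier

# Synthetic stand-in for a large tabular financial feature set.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 20))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=10_000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Gradient-boosted trees: a common baseline for structured/tabular problems.
model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)

print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```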
London (City of London), South East England, United Kingdom
oryxsearch.io
deep learning architectures (e.g., attention models, transformers, retrieval models). Hands-on experience with LLMs and GenAI technologies. Strong programming and problem-solving skills with proficiency in Python, SQL, Spark, and Hive. Deep understanding of classical and modern ML techniques, A/B testing methodologies, and experiment design. Solid background in ranking, recommendation, and retrieval systems. Familiarity with large …
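The A/B-testing and experiment-design experience mentioned above can be illustrated with a minimal two-proportion z-test; the conversion counts below are made up purely for the example.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical conversion counts for a control vs. treatment experiment.
conversions = [520, 580]        # successes in each arm
impressions = [10_000, 10_000]  # users exposed in each arm

z_stat, p_value = proportions_ztest(conversions, impressions)
print(f"z = {z_stat:.3f}, p = {p_value:.4f}")
# A small p-value suggests the observed difference in conversion rate
# is unlikely under the null hypothesis of equal rates.
```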
London (City of London), South East England, United Kingdom
TRIA
cloud data platforms, Lakehouse architecture, and data engineering frameworks. Required Qualifications: 6+ years of experience in data engineering. 3+ years of hands-on experience with Databricks, Delta Lake, and Spark (PySpark preferred). Proven track record implementing Medallion Architecture (Bronze, Silver, Gold layers) in production environments. Strong knowledge of data modeling, ETL/ELT design, and data lakehouse concepts. Proficiency in Python, SQL, and Spark optimization techniques. Experience working with cloud data platforms such as Azure Data Lake, AWS S3, or GCP BigQuery. Strong understanding of data quality frameworks, testing, and CI/CD pipelines for data workflows. Excellent communication skills and ability to collaborate across teams. Preferred Qualifications: Experience with Databricks Unity Catalog and Delta Live Tables.
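For reference, a minimal PySpark/Delta Lake sketch of the Bronze → Silver → Gold (Medallion) flow this role centres on; paths, schemas, table names, and columns are hypothetical, and the code assumes a Databricks-style environment where Delta Lake is available.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw files as-is, preserving source fidelity.
bronze = spark.read.format("json").load("/mnt/raw/orders/")
bronze.write.format("delta").mode("append").saveAsTable("bronze.orders")

# Silver: cleanse and conform - dedupe, cast types, drop bad records.
silver = (spark.table("bronze.orders")
          .dropDuplicates(["order_id"])
          .withColumn("amount", F.col("amount").cast("double"))
          .filter(F.col("amount").isNotNull()))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: business-level aggregate ready for reporting and BI.
gold = (spark.table("silver.orders")
        .groupBy(F.to_date("created_at").alias("day"))
        .agg(F.sum("amount").alias("revenue")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_revenue")
```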
City of London, London, United Kingdom Hybrid / WFH Options
Hexegic
to create, test, and validate data models and outputs. Set up monitoring and ensure data health for outputs. What we are looking for: Proficiency in Python, with experience in Apache Spark and PySpark. Previous experience with data analytics software. Ability to scope new integrations and translate user requirements into technical specifications. What’s in it for you? Base …
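As a rough sketch of the "set up monitoring and ensure data health" responsibility, the checks below validate row volume, null rates, and freshness on a PySpark table; the table and column names are assumptions made for illustration only.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("silver.customer_outputs")  # hypothetical output table

# Basic health checks: row volume, null rate on a key field, and data freshness.
total = df.count()
null_ids = df.filter(F.col("customer_id").isNull()).count()
latest = df.agg(F.max("updated_at")).first()[0]

checks = {
    "row_count_nonzero": total > 0,
    "null_id_rate_ok": (null_ids / total) < 0.01 if total else False,
    "has_recent_data": latest is not None,
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    # In a real pipeline this would raise an alert rather than just an error.
    raise ValueError(f"Data health checks failed: {failed}")
```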
City of London, London, England, United Kingdom Hybrid / WFH Options
Ada Meher
days a week, based on business need. To Be Considered: Demonstrable expertise and experience working on large-scale Data Engineering projects. Strong experience in Python/PySpark, Databricks & Apache Spark. Hands-on experience with both batch & streaming pipelines. Strong experience in AWS and associated tooling (e.g., S3, Glue, Redshift, Lambda, Terraform, etc.). Experience designing Data Engineering platforms from scratch …
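To illustrate the batch and streaming pipeline experience called out above, here is a minimal PySpark sketch combining a batch backfill with a Structured Streaming ingest from Kafka into the same Delta table; the brokers, topic, paths, and schema are invented for the example, and a Databricks-style runtime with the Kafka and Delta connectors is assumed.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Batch: backfill historical events from object storage (illustrative path).
batch = spark.read.parquet("s3://example-bucket/events/history/")
batch.write.format("delta").mode("append").saveAsTable("events_curated")

# Streaming: continuously ingest new events from Kafka into the same target table.
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load()
          .select(F.from_json(
              F.col("value").cast("string"),
              "event_id STRING, ts TIMESTAMP, payload STRING").alias("e"))
          .select("e.*"))

(stream.writeStream
 .format("delta")
 .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
 .outputMode("append")
 .toTable("events_curated"))
```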
London (City of London), South East England, United Kingdom
Space Executive
consulting. 8+ years leading technical teams in data engineering or analytics. Expertise in modern data platforms such as Databricks, Snowflake, GCP, AWS, or Azure. Strong understanding of tools like Apache Spark, Kafka, and Kubernetes. Deep knowledge of data governance, strategy, and privacy regulations (GDPR, etc.). Strategic mindset with the ability to balance technical depth and business insight. Passion …