in Computer Science, Software Engineering, or equivalent technical discipline. 8+ years of hands-on experience building large-scale distributed data pipelines and architectures. Expert-level knowledge of Apache Spark, PySpark, and Databricks, including experience with Delta Lake, Unity Catalog, MLflow, and Databricks Workflows. Deep proficiency in Python and SQL, with proven experience building modular, testable, reusable pipeline components. Strong experience …
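The "modular, testable, reusable pipeline components" asked for above can be sketched as small composable transform functions. This is a hedged illustration in plain Python (no Spark cluster needed to run it); in real PySpark each step would take and return a DataFrame. All names (`clean_amounts`, `add_vat`, `run_pipeline`) and the VAT example are invented for illustration.

```python
from functools import reduce
from typing import Callable, Iterable

Row = dict
Transform = Callable[[list], list]

def clean_amounts(rows: list) -> list:
    """Drop rows with missing or negative amounts (one testable unit)."""
    return [r for r in rows if r.get("amount") is not None and r["amount"] >= 0]

def add_vat(rate: float) -> Transform:
    """Parameterised, reusable component: returns a transform adding a VAT field."""
    def _apply(rows: list) -> list:
        return [{**r, "vat": round(r["amount"] * rate, 2)} for r in rows]
    return _apply

def run_pipeline(rows: list, steps: Iterable[Transform]) -> list:
    """Compose small, individually unit-testable steps into one pipeline."""
    return reduce(lambda acc, step: step(acc), steps, rows)

data = [{"amount": 100.0}, {"amount": None}, {"amount": -5.0}]
result = run_pipeline(data, [clean_amounts, add_vat(0.2)])
print(result)  # → [{'amount': 100.0, 'vat': 20.0}]
```

Because each step is a pure function, it can be unit-tested in isolation with tiny in-memory inputs, which is the point of the "modular, testable" requirement.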
to logically analyse complex requirements, processes, and systems to deliver solutions. Solid understanding of data modelling, ETL/ELT processes, and data warehousing. Proficiency in SQL and Python (especially PySpark), as well as other relevant programming languages. Passion for using data to drive key business decisions. Skills we'd love to see / Amazing Extras: Experience with Microsoft Fabric. Familiarity …
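The ELT pattern mentioned above (load raw data first, transform inside the warehouse with SQL) can be shown in miniature. This is a hedged sketch using sqlite3 as a stand-in for a real warehouse such as Synapse or Databricks SQL; the table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Extract/Load: land the raw records untouched in a staging table.
conn.execute("CREATE TABLE raw_sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)",
                 [("north", 10.0), ("north", 5.0), ("south", 7.5)])

# Transform: build the curated table with SQL inside the warehouse itself.
conn.execute("""
    CREATE TABLE sales_by_region AS
    SELECT region, SUM(amount) AS total
    FROM raw_sales
    GROUP BY region
""")

totals = conn.execute(
    "SELECT region, total FROM sales_by_region ORDER BY region").fetchall()
print(totals)  # → [('north', 15.0), ('south', 7.5)]
```

In ETL the aggregation would instead happen in Python/PySpark before loading; ELT defers it to SQL so the warehouse engine does the heavy lifting.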
Azure cloud data lakes and services (Data Factory, Synapse, Databricks). Skilled in ETL/ELT pipeline development and big data tools (Spark, Hadoop, Kafka). Strong Python/PySpark programming and advanced SQL with query optimisation. Experience with relational, NoSQL, and graph databases. Familiar with CI/CD, version control, and infrastructure as code (Terraform). Strong analytical …
City of London, Greater London, UK Hybrid / WFH Options
Formula Recruitment
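The "advanced SQL with query optimisation" skill above usually starts with reading query plans. This hedged sketch uses sqlite3 (so it runs anywhere) to compare the plan for a filtered query before and after adding an index; real engines such as Synapse or Spark SQL expose similar EXPLAIN output, and exact plan wording varies by engine and version.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(i % 100, "x") for i in range(1000)])

def plan(sql: str) -> str:
    # The human-readable detail is the 4th column of EXPLAIN QUERY PLAN rows.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"
before = plan(query)   # full table scan of events
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(query)    # index search on idx_events_user
print(before, "->", after)
```

The optimisation workflow is the same at warehouse scale: find the scan in the plan, then add an index, partition, or clustering key so the filter becomes a seek.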
proactive problem-solver with strong analytical and communication skills, capable of working both independently and collaboratively. Desirable: Experience working with Azure (or other major cloud platforms). Familiarity with PySpark or other big data technologies. Understanding of version control systems (e.g., Git). Knowledge of pricing or modelling workflows and the impact of engineering decisions on model performance. Why …
City of London, Greater London, UK Hybrid / WFH Options
Ascentia Partners
Factory for data ingestion and transformation Work with Azure Data Lake, Synapse, and SQL DW to manage large volumes of data Develop data transformation logic using SQL, Python, and PySpark code Collaborate with cross-functional teams to translate business requirements into data solutions Create mapping documents, transformation rules, and ensure quality delivery Contribute to DevOps processes, CI/CD … development and big data solutions Recent experience within Insurance Technology essential Solid expertise with Azure Databricks, Data Factory, ADLS, Synapse, and Azure SQL Strong skills in SQL, Python, and PySpark Solid understanding of DevOps, CI/CD, and Agile methodologies Excellent communication and stakeholder management skills …
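The "mapping documents and transformation rules" duty above is often implemented by keeping the rules themselves as data, so analysts can review them without reading code. This is a hedged sketch in sqlite3; the insurance-flavoured codes and table names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_policies (policy_id INTEGER, product_code TEXT);
    CREATE TABLE mapping_rules (source_code TEXT, target_code TEXT);
""")
conn.executemany("INSERT INTO src_policies VALUES (?, ?)",
                 [(1, "MOT"), (2, "HOM"), (3, "UNK")])
# The mapping document, loaded as a rules table rather than hard-coded logic.
conn.executemany("INSERT INTO mapping_rules VALUES (?, ?)",
                 [("MOT", "MOTOR"), ("HOM", "HOME")])

# Unmapped codes fall through to 'UNMAPPED' so data-quality gaps stay visible.
rows = conn.execute("""
    SELECT p.policy_id, COALESCE(m.target_code, 'UNMAPPED') AS product
    FROM src_policies p
    LEFT JOIN mapping_rules m ON m.source_code = p.product_code
    ORDER BY p.policy_id
""").fetchall()
print(rows)  # → [(1, 'MOTOR'), (2, 'HOME'), (3, 'UNMAPPED')]
```

Surfacing `UNMAPPED` rows instead of silently dropping them is what makes the "ensure quality delivery" part auditable.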
ll architect and maintain robust, scalable data pipelines and infrastructure that power our analytics, machine learning, and business intelligence initiatives. You'll work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and collaborate across teams to ensure data is clean, reliable, and actionable. Responsibilities: - Build and maintain scalable ETL pipelines using Python and PySpark to … reviews, and foster a culture of continuous learning and knowledge-sharing Mandatory Skills Description: - 5+ years of experience in data engineering or software development - Strong proficiency in Python and PySpark - Hands-on experience with AWS services, especially EMR, S3, Lambda, and Glue - Deep understanding of Snowflake architecture and performance tuning - Solid grasp of data modeling, warehousing concepts, and SQL …
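The "scalable ETL pipelines" responsibility above hinges on streaming records rather than materialising whole datasets in memory, which is also how Spark processes partitions. This hedged sketch shows the idea with plain Python generators; the extract/load stand-ins and all names are illustrative assumptions, not a real EMR or Snowflake integration.

```python
from typing import Iterable, Iterator

def extract() -> Iterator[dict]:
    # Stand-in for reading from S3 or a source table, one record at a time.
    for i in range(5):
        yield {"id": i, "value": i * 10}

def transform(rows: Iterable[dict]) -> Iterator[dict]:
    # Filter and derive lazily; nothing is held in memory beyond one record.
    for r in rows:
        if r["value"] > 0:
            yield {**r, "value_gbp": r["value"] / 100}

def load(rows: Iterable[dict]) -> list:
    # Stand-in for writing to Snowflake / S3; here we just collect the output.
    return list(rows)

out = load(transform(extract()))
print(out)  # 4 records; id 0 is filtered out because its value is 0
```

Swapping the generator stages for DataFrame transformations gives the same shape of pipeline in PySpark, where laziness and partitioning provide the scalability.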
Role: Sr. Technology Architect. Technology: Data Modelling, ERWIN Modelling, Azure Architect, PySpark. Location: UK (London). Job Description: The Data Architect should have a strong understanding of building Data Models, with experience in the Data Architecting and Engineering space on Spark, PySpark, ADF, Azure Synapse, and Data Lake. Your role: In the role of a Sr Technology Architect, you will primarily be responsible … Hands-on experience in technology consulting, enterprise and solutions architecture and architectural frameworks, and Data Modelling; experience with ERWIN Modelling. Hands-on experience in ADF, Azure Databricks, Azure Synapse, Spark, PySpark, Python/Scala, SQL. Hands-on experience designing and building a Data Lake from multiple source systems/data providers. Experience in data modelling, architecture, implementation & testing. Experienced in …
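The data-modelling work described above typically produces a star schema: fact tables keyed to dimension tables. This hedged sketch builds a minimal one-dimension, one-fact model in sqlite3; in practice the model would be designed in ERWIN and deployed to Synapse or Databricks, and all table and column names here are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT
    );
    CREATE TABLE fact_orders (
        order_id     INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount       REAL
    );
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme Ltd')")
conn.executemany("INSERT INTO fact_orders VALUES (?, 1, ?)",
                 [(10, 99.5), (11, 0.5)])

# The canonical star-schema query: join fact to dimension, then aggregate.
row = conn.execute("""
    SELECT c.customer_name, SUM(f.amount)
    FROM fact_orders f
    JOIN dim_customer c USING (customer_key)
    GROUP BY c.customer_name
""").fetchone()
print(row)  # → ('Acme Ltd', 100.0)
```

Keeping descriptive attributes in the dimension and measures in the fact is what lets the same model serve many aggregations without restructuring.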