in Azure Cloud technologies, e.g. MLOps, MLflow, Azure Data Factory, Azure Functions, Databricks, Event Hubs, microservices/APIs, Python/PySpark/R, or SQL. 2+ years of experience designing, developing, and implementing Big Data platforms using Azure Cloud architecture with structured and unstructured data …
following technologies: Azure Synapse, Data Factory, Databricks, SQL DB, Data Lake, Key Vault; Azure DevOps and CI/CD pipelines; coding in SQL and PySpark/Python; DW/Data Vault concepts; Power BI. Experience with core Finance reporting (Projects, GL, AP, AR etc.) highly desirable. Preferred experience: knowledge …
an Agile way. Who are we looking for? • Degree in Computer Science, Information Systems, Data Science, or a related field. • Experience with Databricks, Dataverse, PySpark, Synapse, and Power Automate. • Experience with end-to-end SAP integration advantageous. • Experience with data warehouse technologies and data integration processes. • Knowledge of ETL (Extract …
levels of experience within data engineering. Experience deploying pipelines within Azure Databricks in line with the medallion architecture framework. Experience using SQL, Python, and PySpark to build data engineering pipelines. Understanding of how to define best practices for both documentation and code standards. Understanding of …
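The medallion architecture mentioned above layers data as bronze (raw), silver (cleaned), and gold (aggregated for reporting). In Databricks this is normally written as PySpark DataFrame transformations over Delta tables; as a library-free illustration of the layering idea only, here is a minimal pure-Python sketch (the field names and rules are hypothetical):

```python
# Minimal pure-Python sketch of medallion layering (bronze -> silver -> gold).
# In practice these would be PySpark DataFrames written to Delta tables;
# the record fields and cleaning rules here are illustrative assumptions.

def to_silver(bronze_rows):
    """Clean the raw (bronze) rows: drop malformed records, normalise types."""
    silver = []
    for row in bronze_rows:
        if row.get("amount") is None or not row.get("customer"):
            continue  # a real pipeline would quarantine these, not drop them
        silver.append({"customer": row["customer"].strip().lower(),
                       "amount": float(row["amount"])})
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned rows into a reporting (gold) view: total per customer."""
    totals = {}
    for row in silver_rows:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + row["amount"]
    return totals

bronze = [
    {"customer": " Acme ", "amount": "10.5"},
    {"customer": "acme", "amount": 2},
    {"customer": "", "amount": 99},        # dropped: no customer
    {"customer": "Beta", "amount": None},  # dropped: no amount
]
print(to_gold(to_silver(bronze)))  # {'acme': 12.5}
```

The point of the pattern is that each layer is reproducible from the one below it, so cleaning rules can change without re-ingesting raw data.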
SQL Server and relational databases. Solid understanding of the Azure data engineering stack, including Azure Synapse and Azure Data Lake. Programming skills in Python, PySpark, and T-SQL. Nice to haves: Familiarity with broader Azure Data Solutions, such as Azure ML Studio. Previous experience with Azure DevOps and knowledge …
London (Chiswick), South East England, United Kingdom
Square One Resources
related field. Certifications such as Azure Data Engineer Associate are desirable. Knowledge of data ingestion methods for real-time and batch processing. Proficiency in PySpark and debugging Apache Spark workloads. What’s in it for you? Annual bonus scheme (up to 10%), excellent pension scheme, flexible working, enhanced family …
the insurance domain is advantageous. - Education: A degree in Computer Science, Data Science, Engineering, or a related discipline. Technical Skills: Proficient in Python, SQL, PySpark, and Databricks. Demonstrated proficiency in modern NLP techniques and tools. Proven track record in developing and managing data quality metrics and dashboards. Experience collaborating …
Azure Cloud platform. Knowledge of orchestrating workloads in the cloud. Ability to set and lead the technical vision while balancing business drivers. Strong experience with PySpark and Python programming. Proficiency with APIs, containerization, and orchestration is a plus. Qualifications: Bachelor's and/or master’s degree. About you: You are …
West Midlands, Dudley, West Midlands (County), United Kingdom Hybrid / WFH Options
Concept Resourcing
standardise the Data Warehouse environment and solutions. Required Skills, Knowledge, and Experience: Experience designing, developing, and testing Azure Data Factory/Fabric pipelines, including PySpark Notebooks and Dataflow Gen2 workflows. Previous experience with Informatica PowerCenter is desired. Significant experience in designing, writing, editing, debugging, and testing SQL code …
Cycle · Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or relational DBs · Experience in the Financial Services industry is …
understand consumers. Hands-on data engineering/development experience, preferably in a cloud/big data environment. Skilled in at least one of Python, PySpark, SQL, or similar. Experience in guiding or managing roles in insight or data functions, delivering data projects and insight to inspire action and drive …
applied machine learning, probability, statistics, and quantitative risk modelling. High proficiency in Python and SQL. Experience with big data technologies and tools, particularly Databricks and PySpark, is highly desirable. Experience in agile software development processes is a plus. Experience in insurance, cyber, or a related domain is ideal. Understanding of …
Exeter, Devon, South West, United Kingdom Hybrid / WFH Options
Staffworx Limited
Ability to optimise workflows and analysis for MapReduce processing. Experience with BI software (Power BI, Tableau, Qlik Sense). Any experience with data engineering, PySpark, Databricks, or Delta Lake beneficial. Confident presenting complex problems in ways suited to the target audience. Experience leading or managing a small analysis team. Familiarity working …
South Harting, England, United Kingdom Hybrid / WFH Options
Adecco
learning, probability, statistics, and quantitative risk modelling. High proficiency in Python and SQL. Experience with big data technologies and tools such as Databricks and PySpark is highly desirable. Essential experience in Probabilistic Risk Modelling. Highly desirable experience with Monte Carlo, copula, and gamma models, as well as statistical, financial, and stochastic modelling. …
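For context on the Monte Carlo modelling asked for above: a frequency/severity loss simulation is a common shape for probabilistic risk models. Here is a minimal stdlib-only sketch; the Poisson-via-arrivals sampler, exponential severities, and all parameter values are illustrative assumptions, not a production model:

```python
import random

def simulate_annual_loss(n_trials=100_000, freq=2.0, sev_mean=10.0, seed=42):
    """Toy Monte Carlo loss model (hypothetical parameters).

    Event counts per year are drawn by summing exponential inter-arrival
    times (a simple Poisson sampler); severities are exponential.
    Returns (mean annual loss, 99% Value-at-Risk estimate).
    """
    rng = random.Random(seed)  # seeded for reproducibility
    losses = []
    for _ in range(n_trials):
        # count events whose cumulative arrival time falls within one year
        events, t = 0, rng.expovariate(freq)
        while t < 1.0:
            events += 1
            t += rng.expovariate(freq)
        losses.append(sum(rng.expovariate(1.0 / sev_mean) for _ in range(events)))
    losses.sort()
    var_99 = losses[int(0.99 * n_trials)]  # empirical 99th percentile
    return sum(losses) / n_trials, var_99

mean_loss, var_99 = simulate_annual_loss()
print(f"mean annual loss ~ {mean_loss:.1f}, 99% VaR ~ {var_99:.1f}")
```

With frequency 2 and mean severity 10, the expected annual loss is about 20; the heavy right tail is why the 99% VaR sits well above the mean.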
and industry standards for the organization. Strong experience with Azure cloud services such as ADF, ADLS, and Synapse. Proficiency in SQL, PySpark, and Python, and familiarity with data visualization tools (e.g. Power BI). Strong communication skills to gather business requirements from stakeholders and propose best …
can offer you exposure to the latest technologies. We are looking for a senior Data Engineer who has solid Python skills as well as PySpark, Databricks, and SQL, plus data modelling and Azure Data Factory. Azure DevOps would be a distinct advantage. Strong communication and business …