Requirements: 3+ years as a Business Analyst. Proficiency in ERP/CRM solutions and data, including Workday HCM. Strong Azure data skills. Proficiency in PySpark, Java, or Python. Familiarity with Kimball data modelling and SQL. Experience with Power BI and CI/CD practices. Nice to Have: B2B supply more »
Senior Databricks Migration Consultant Remote This role demands in-depth knowledge of data engineering, cloud technologies (preferably AWS), and a successful track record of enterprise-level data migrations into Databricks. You will ensure the efficient and secure transition of our data more »
engineering leaders/stakeholders in decision-making and implementing the models into production. You will need to have hands-on skills in Python and PySpark, experience working in a cloud environment, and knowledge of development tools like Git or Docker. You can also expect to work with the latest more »
experience as a Data Engineer. Strong proficiency in AWS and relevant technologies. Good communication skills. Preferably experience with some or all of the following: PySpark, Kubernetes, Terraform. Experience in a start-up/scale-up or fast-paced environment is desirable. HOW TO APPLY Please register your interest by more »
to the table. Key Responsibilities Engineer and orchestrate data flows & pipelines in a cloud environment using a progressive tech stack e.g. Databricks, Spark, Python, PySpark, Delta Lake, SQL, Logic Apps, Azure Functions, ADLS, Parquet, Neo4J, Flask Ingest and integrate data from a large number of disparate data sources Design … Spark/Databricks or similar Experience working in a cloud environment (Azure, AWS, GCP) Experience in at least one of: Python (or similar), SQL, PySpark Experience in building data pipeline/ETL/ELT solutions Ability and strong desire to research and learn new technologies and languages Interest in more »
Engineer you will be pivotal in designing, developing, and maintaining data architecture and infrastructure. The ideal candidate should have a strong foundation in Python, PySpark, SQL, and ETL processes, along with proven experience in implementing solutions in a cloud environment. Roles & Responsibilities: Experienced Data Engineer with a background in … and mastering, through to management and distribution of large datasets. Mandatory Skills: 6+ years of experience designing, building, and maintaining data pipelines using Python, PySpark and SQL. Develop and maintain ETL processes to move data from various data sources to our data warehouse on AWS. Collaborate with data scientists more »
Lead Azure Data Engineer | PySpark (Python) & Synapse | Tech for Good/Charity Rate: £500-650 per day Duration: 6-12 months IR35: Outside Location: Remote with occasional travel to London (once every two months max) Essential skills required: Azure – solid experience of the Azure Data ecosystem Python – ESSENTIAL, as PySpark is used heavily; you will be tested on PySpark Azure Synapse – ESSENTIAL, as it is used heavily Spark Azure Data Lake/Databricks/Data Factory Be happy to act as a lead and mentor to the other permanent Azure Data Engineers This is the chance … tool experience. Familiar with building catalogs and lineage This is an urgent contract, so if you are interested apply ASAP. Lead Azure Data Engineer | PySpark (Python), Synapse, Data Lake | Tech for Good/Charity more »
Bristol, England, United Kingdom Hybrid / WFH Options
Adecco
experience in developing data ingestion pipelines with advanced ML elements. Degree in Computer Science, Data Science, Engineering, or related field. Proficient in Python, SQL, PySpark, and Databricks. Experience with modern NLP techniques and tools. Familiarity with Git for version control. Knowledge of the insurance sector is advantageous but not … Expertise in applied machine learning, probability, statistics, and quantitative risk modelling. High proficiency in Python and SQL, with experience in big data technologies (Databricks, PySpark). Experience in the insurance, cyber, or related domain preferred. Strong problem-solving and communication skills. Preferred Skills (Nice to Have): Knowledge of Monte Carlo more »
Lead Azure Data Engineer | PySpark (Python) & Synapse | Tech for Good/Charity Essential skills required: Azure – solid experience of the Azure Data ecosystem Python – ESSENTIAL, as PySpark is used heavily; you will be tested on PySpark Azure Synapse – ESSENTIAL, as it is used heavily Spark Azure Data Lake/Databricks/Data Factory … people comprising developers, data engineers, QA and DevOps … architecture Familiar with Synapse CI/CD Azure Purview or another governance tool experience. Familiar with building catalogs and lineage Lead Azure Data Engineer | PySpark (Python), Synapse, Data Lake | Tech for Good/Charity more »
up with a high-performance Data Architecture Good to have: Retail functional knowledge Must have good knowledge and hands-on experience in Python, PySpark, ADF and ADB Good to have knowledge of ADF CI/CD Experience in designing, architecting and implementing large-scale data processing, data storage, data … delivery teams Supporting business development and ensuring high levels of client satisfaction during delivery Skills Must have strong hands-on technical skills in Python, PySpark, Azure Databricks, Spark, ETL Cloud: Azure preferred Good to have knowledge of ADF CI/CD more »
get the most from their data. They are looking for someone with core skills in SQL, complemented with Azure experience (Azure Data Factory, Databricks, PySpark etc.) This is a very exciting time to join as they shake things up across the industry, so please get in touch asap to … Working with and modelling data warehouses. Skills & Qualifications Strong technical expertise using SQL Server and Azure Data Factory for ETL Solid experience with Databricks, PySpark etc. Understanding of Agile methodologies, including use of Git Experience with Python and Spark Benefits: £65,000 - £75,000 plus bonus To apply for this more »
Guildford, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
Data Engineer We're partnering with a global financial services institution that is embarking on an ambitious project: to build an industry-leading data platform. The long-term vision is to migrate all data within the coming years more »
I have placed quite a few candidates with this organisation now, and all have given glowing reviews about it. They're the leader in legal representation comparison and have rather interesting, unique data to work with. They are using modern Azure more »
Sheffield, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
DWP Digital
prem, but the direction of travel is cloud engineering. You'll be executing code in different places across the following tech stack: Azure, Databricks, PySpark and Pandas. You will steer the data engineering function within a wider product team. There'll be lots of connecting and interaction with stakeholders … inclusive environment where you can grow your career and make a real difference. Essential criteria: Enterprise-scale experience with Azure data engineering tools, Databricks, PySpark and Pandas Experience of data modelling and transforming raw data into datasets Experience of building team capability through role modelling, mentoring, and coaching Able more »
Manchester, North West, United Kingdom Hybrid / WFH Options
DWP Digital
prem, but the direction of travel is cloud engineering. You'll be executing code in different places across the following tech stack: Azure, Databricks, PySpark and Pandas. You will steer the data engineering function within a wider product team. There'll be lots of connecting and interaction with stakeholders … inclusive environment where you can grow your career and make a real difference. Essential criteria: Enterprise-scale experience with Azure data engineering tools, Databricks, PySpark and Pandas Experience of data modelling and transforming raw data into datasets Experience of building team capability through role modelling, mentoring, and coaching Able more »
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
DWP Digital
prem, but the direction of travel is cloud engineering. You'll be executing code in different places across the following tech stack: Azure, Databricks, PySpark and Pandas. You will steer the data engineering function within a wider product team. There'll be lots of connecting and interaction with stakeholders … inclusive environment where you can grow your career and make a real difference. Essential criteria: Enterprise-scale experience with Azure data engineering tools, Databricks, PySpark and Pandas Experience of data modelling and transforming raw data into datasets Experience of building team capability through role modelling, mentoring, and coaching Able more »
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
DWP Digital
prem, but the direction of travel is cloud engineering. You'll be executing code in different places across the following tech stack: Azure, Databricks, PySpark and Pandas. You will steer the data engineering function within a wider product team. There'll be lots of connecting and interaction with stakeholders … inclusive environment where you can grow your career and make a real difference. Essential criteria: Enterprise-scale experience with Azure data engineering tools, Databricks, PySpark and Pandas Experience of data modelling and transforming raw data into datasets Experience of building team capability through role modelling, mentoring, and coaching Able more »
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom Hybrid / WFH Options
DWP Digital
prem, but the direction of travel is cloud engineering. You'll be executing code in different places across the following tech stack: Azure, Databricks, PySpark and Pandas. You will steer the data engineering function within a wider product team. There'll be lots of connecting and interaction with stakeholders … inclusive environment where you can grow your career and make a real difference. Essential criteria: Enterprise-scale experience with Azure data engineering tools, Databricks, PySpark and Pandas Experience of data modelling and transforming raw data into datasets Experience of building team capability through role modelling, mentoring, and coaching Able more »
Blackpool, Lancashire, Marton, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
prem, but the direction of travel is cloud engineering. You'll be executing code in different places across the following tech stack: Azure, Databricks, PySpark and Pandas. You will steer the data engineering function within a wider product team. There'll be lots of connecting and interaction with stakeholders … inclusive environment where you can grow your career and make a real difference. Essential criteria: Enterprise-scale experience with Azure data engineering tools, Databricks, PySpark and Pandas Experience of data modelling and transforming raw data into datasets Experience of building team capability through role modelling, mentoring, and coaching Able more »