SR3, New Silksworth, Sunderland, Tyne & Wear, United Kingdom Hybrid / WFH Options
Avanti Recruitment
… record in data warehousing, ETL processes, and data modelling. Azure certifications in Database Administration or Data Engineering. Nice to have: experience with Python/PySpark and MySQL on Azure; additional Azure certifications (e.g. Fabric Analytics Engineer); knowledge of data visualisation platforms (Tableau/Power BI). This is an exciting opportunity …
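Several of these listings pair PySpark with MySQL on Azure for ETL work. As a minimal sketch of what that combination typically looks like, the following reads a table over JDBC and writes a filtered extract to Parquet; the server name, table, credentials and output path are hypothetical, and the MySQL Connector/J driver is assumed to be on the Spark classpath.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mysql-extract").getOrCreate()

# Hypothetical Azure Database for MySQL connection details, for illustration only
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://example-server.mysql.database.azure.com:3306/sales")
    .option("dbtable", "orders")
    .option("user", "etl_user")
    .option("password", "********")
    .option("driver", "com.mysql.cj.jdbc.Driver")
    .load()
)

# A simple transform-and-load step typical of the ETL processes described above
orders.filter("order_status = 'COMPLETE'").write.mode("overwrite").parquet("/mnt/warehouse/orders")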
Sunderland, Tyne and Wear, Tyne & Wear, United Kingdom
Nigel Frank International
… Working with and modelling data warehouses. Skills & Qualifications: strong technical expertise using SQL Server and Azure Data Factory for ETL; solid experience with Databricks, PySpark, etc.; understanding of Agile methodologies, including use of Git; experience with Python and Spark. Benefits: £60,000 - £65,000. To apply for this role …
Liverpool, England, United Kingdom Hybrid / WFH Options
Maxwell Bond
… level database development and admin in both Azure cloud and on-prem. Proven working experience with Azure Data Warehouse, Databricks, SQL and Python/PySpark. ETL & ETL frameworks. Experience of consuming and combining data for near real-time answers. This position does not offer visa sponsorship; please refrain …
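"Consuming and combining data for near real-time answers" usually means enriching a stream with reference data as it arrives. Below is a minimal, hedged sketch of a Spark Structured Streaming stream-static join; the paths, schema and sink are hypothetical placeholders, not details from the listing.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("near-real-time").getOrCreate()

# Static reference data (hypothetical path)
customers = spark.read.parquet("/mnt/reference/customers")

# Streaming transactions arriving as JSON files (hypothetical path and schema)
events = (
    spark.readStream
    .schema("customer_id INT, amount DOUBLE, ts TIMESTAMP")
    .json("/mnt/landing/transactions")
)

# Stream-static join: each micro-batch is enriched with customer attributes
enriched = events.join(customers, "customer_id", "left")

query = (
    enriched.writeStream
    .queryName("enriched_transactions")
    .format("memory")  # in-memory sink for demonstration; a real job would target Delta, Kafka, etc.
    .outputMode("append")
    .start()
)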
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom Hybrid / WFH Options
Nigel Frank International
… Working with and modelling data warehouses. Skills & Qualifications: strong technical expertise using SQL Server and Azure Data Factory for ETL; solid experience with Databricks, PySpark, etc.; understanding of Agile methodologies, including use of Git; experience with Python and Spark. Benefits: £60,000 - £65,000. This role is an urgent …
Liverpool, England, United Kingdom Hybrid / WFH Options
Maxwell Bond
… level database development and admin in both Azure cloud and on-prem. Proven working experience with Azure Data Warehouse, Databricks, SQL and Python/PySpark. Experience of consuming and combining data for near real-time answers. This position does not offer visa sponsorship; please refrain from applying if …
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom
Nigel Frank International
To be successful in the role you will have: strong Azure Data Platform experience; a strong understanding of Databricks; coding experience with both Python/PySpark and SQL; experience working with Azure Data Factory for creating ETL solutions. This is just a brief overview of the role. For the full …
Sunderland, Tyne and Wear, Tyne & Wear, United Kingdom
Nigel Frank International
… SQL Server and Azure. Experience creating data pipelines with Azure Data Factory. Databricks experience would be beneficial. Experience working with Python/Spark/PySpark. This is just a brief overview of the role. For the full information, simply apply to the role with your CV, and I will …
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom
Nigel Frank International
… SQL Server and Azure. Experience creating data pipelines with Azure Data Factory. Databricks experience would be beneficial. Experience working with Python/Spark/PySpark. This is just a brief overview of the role. For the full information, simply apply to the role with your CV, and I will …
Leeds, England, United Kingdom Hybrid / WFH Options
Fruition IT
… technical stakeholders. Experience required: extensive experience in training, deploying and maintaining Machine Learning models; Data Warehousing and ETL tools; Python and surrounding ML tech (PySpark, Snowflake, scikit-learn, TensorFlow, PyTorch, etc.); Infrastructure as Code (Terraform, Ansible); Stakeholder Management, technical and non-technical. The Offer: Base Salary …
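For the train-deploy-maintain cycle this listing describes, the core loop is small: fit a model, measure it on held-out data, and persist it for a serving job. A minimal sketch using scikit-learn (one of the libraries named above) on synthetic stand-in data, with a hypothetical output filename:

import joblib
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data stands in for features pulled from a warehouse
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"hold-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")

# Persist the fitted model so a separate deployment job can reload it
joblib.dump(model, "example_model.joblib")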
… GBMs, Elastic Net GLMs, GAMs, Decision Trees, Random Forests, Neural Nets and Clustering. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL). A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science). Experience of WTW’s Radar and Emblem software is preferred …
… GBMs, Elastic Net GLMs, GAMs, Decision Trees, Random Forests, Neural Nets and Clustering. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL). A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science). Experience of WTW’s Radar software is preferred. Proficient at …
… GBMs, Elastic Net GLMs, GAMs, Decision Trees, Random Forests, Neural Nets and Clustering. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL). A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science). Proven experience of modelling using the WTW toolkit (Emblem and Radar …
… between different packages for some of these model types would be an advantage. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL). A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science). Experience of WTW’s Radar software is preferred. Proficient at …
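The pricing listings above all ask for GLM experience in languages including PySpark. As an illustration of that overlap, here is a minimal sketch of a Poisson frequency GLM fitted with Spark's built-in GeneralizedLinearRegression; the toy policy data is invented and stands in for the kind of rating-factor tables these roles work with (WTW's Emblem and Radar are separate proprietary tools, not shown).

from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import GeneralizedLinearRegression
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("glm-demo").getOrCreate()

# Invented policy-level data: (driver_age, vehicle_value, claim_count)
df = spark.createDataFrame(
    [(25, 12000.0, 2), (40, 8000.0, 0), (33, 15000.0, 1), (58, 9500.0, 0)],
    ["driver_age", "vehicle_value", "claim_count"],
)

assembler = VectorAssembler(inputCols=["driver_age", "vehicle_value"], outputCol="features")
glm = GeneralizedLinearRegression(family="poisson", link="log", labelCol="claim_count")

model = glm.fit(assembler.transform(df))
print(model.coefficients, model.intercept)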
Leeds, England, United Kingdom Hybrid / WFH Options
Mastek
… OBIEE, Workato and PL/SQL. Design and build data solutions on Azure, leveraging Databricks, Data Factory, and other Azure services. Utilize Python and PySpark for data transformation, analysis, and real-time streaming. Collaborate with cross-functional teams to gather requirements, design solutions, and deliver insights. Implement and maintain … Technologies: Databricks and Data Factory (expertise in data engineering and orchestration); DevOps, Storage Explorer and Data Studio (competence in deployment, storage management, and development tools); Python and PySpark (advanced coding skills, including real-time data streaming through Autoloader). Development tools: VS Code, Jira, Confluence, Bitbucket. Service management: experience with ServiceNow. API integration …
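The "real-time data streaming through Autoloader" requirement refers to Databricks Auto Loader, which incrementally ingests files as they land in cloud storage. A minimal sketch, assuming a recent Databricks runtime (where spark is supplied by the notebook environment) and hypothetical mount paths and table name:

# Databricks notebook context assumed: `spark` is provided by the runtime.
# All paths and the target table name are hypothetical.
events = (
    spark.readStream.format("cloudFiles")          # Auto Loader source
    .option("cloudFiles.format", "json")           # format of the landing files
    .option("cloudFiles.schemaLocation", "/mnt/checkpoints/events/_schema")
    .load("/mnt/landing/events")
)

(
    events.writeStream
    .option("checkpointLocation", "/mnt/checkpoints/events")
    .trigger(availableNow=True)                    # drain the backlog, then stop
    .toTable("bronze.events")
)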
Lead Data Engineer: We need some strong lead data engineer profiles… They need good experience with Python, SQL and ADF, and preferably Azure Databricks experience. Job description: Building new data pipelines and optimizing data flows using the Azure cloud stack. Building …
Newcastle upon Tyne, Tyne & Wear Hybrid / WFH Options
DWP
… scalable, automated ETL pipelines in an AWS cloud environment using AWS S3 Cloud Object Storage. Strong coding skills using Hive SQL, Spark SQL, Python, PySpark and Bash. Experience of working with a wide variety of structured and unstructured data. You and your role: As a Data Engineer at DWP … you'll be executing code across a full tech stack, including Azure, Databricks, PySpark and Pandas, helping the department to move towards a cloud computing environment, working with huge data sets as part of our DataWorks platform - a system that provides Universal Credit data to our Data Science team …
Leeds, West Yorkshire, Yorkshire and the Humber Hybrid / WFH Options
DWP
… scalable, automated ETL pipelines in an AWS cloud environment using AWS S3 Cloud Object Storage. Strong coding skills using Hive SQL, Spark SQL, Python, PySpark and Bash. Experience of working with a wide variety of structured and unstructured data. You and your role: As a Data Engineer at DWP … you'll be executing code across a full tech stack, including Azure, Databricks, PySpark and Pandas, helping the department to move towards a cloud computing environment, working with huge data sets as part of our DataWorks platform - a system that provides Universal Credit data to our Data Science team …
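The DWP listings above combine Spark SQL over S3 data with Pandas further down the stack. A minimal sketch of that hand-off, with an invented bucket, table and column names; it assumes S3 credentials and the hadoop-aws connector are already configured for the Spark session.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3-etl").getOrCreate()

# Hypothetical bucket and dataset
claims = spark.read.parquet("s3a://example-bucket/raw/claims/")
claims.createOrReplaceTempView("claims")

monthly = spark.sql("""
    SELECT date_trunc('month', claim_date) AS month, COUNT(*) AS n_claims
    FROM claims
    GROUP BY 1
    ORDER BY 1
""")

# Small aggregates can be pulled into Pandas for downstream analysis
monthly_pd = monthly.toPandas()
print(monthly_pd.head())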