Leeds, England, United Kingdom Hybrid / WFH Options
Fruition IT
technical stakeholders. Experience Required: Extensive experience in training, deploying and maintaining machine learning models. Data warehousing and ETL tools. Python and surrounding ML tech: PySpark, Snowflake, scikit-learn, TensorFlow, PyTorch, etc. Infrastructure as Code - Terraform, Ansible. Stakeholder management - technical and non-technical. The Offer: Base salary …
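To make the "training, deploying and maintaining" requirement concrete, here is a minimal sketch using scikit-learn, one of the libraries named above; the dataset, model choice and output file name are illustrative assumptions, not details from the listing.

```python
# Minimal train/evaluate/persist sketch; dataset, model and file name
# are illustrative placeholders, not taken from the listing.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
import joblib

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"Held-out R^2: {model.score(X_test, y_test):.3f}")

# Persisting the fitted model is the hand-off point to the
# "deploying and maintaining" half of the role.
joblib.dump(model, "model.joblib")
```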
Leeds, England, United Kingdom Hybrid / WFH Options
Mastek
OBIEE, Workato and PL/SQL. Design and build data solutions on Azure, leveraging Databricks, Data Factory and other Azure services. Use Python and PySpark for data transformation, analysis and real-time streaming. Collaborate with cross-functional teams to gather requirements, design solutions and deliver insights. Implement and maintain … Technologies: Databricks, Data Factory: expertise in data engineering and orchestration. DevOps, Storage Explorer, Data Studio: competence in deployment, storage management and development tools. Python, PySpark: advanced coding skills, including real-time data streaming through Autoloader. Development Tools: VS Code, Jira, Confluence, Bitbucket. Service Management: experience with ServiceNow. API Integration …
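The "real-time data streaming through Autoloader" item refers to Databricks Auto Loader. A minimal sketch follows, assuming it runs in a Databricks notebook (where `spark` is predefined); the paths, schema location and target table name are invented for illustration.

```python
# Sketch of a Databricks Auto Loader ingestion stream; all paths and the
# target table are hypothetical. Assumes a Databricks runtime where
# `spark` is already defined.
events = (
    spark.readStream
    .format("cloudFiles")                                 # Auto Loader source
    .option("cloudFiles.format", "json")                  # raw file format
    .option("cloudFiles.schemaLocation", "/mnt/bronze/_schemas/events")
    .load("/mnt/landing/events/")
)

(
    events.writeStream
    .option("checkpointLocation", "/mnt/bronze/_checkpoints/events")
    .trigger(availableNow=True)   # drain pending files, then stop
    .toTable("bronze_events")
)
```

`trigger(availableNow=True)` processes everything outstanding and stops; swapping in a `processingTime` trigger would keep the stream running continuously for the real-time case the ad mentions.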
SR3, New Silksworth, Sunderland, Tyne & Wear, United Kingdom Hybrid / WFH Options
Avanti Recruitment
record in data warehousing, ETL processes and data modelling. Azure certifications in Database Administration or Data Engineering. Nice to have: experience with Python/PySpark and MySQL on Azure; additional Azure certifications (e.g. Fabric Analytics Engineer); knowledge of data visualization platforms (Tableau/Power BI). This is an exciting opportunity …
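As a sketch of the "Python/PySpark and MySQL on Azure" combination, the snippet below reads a table over JDBC; the server, database, table and credentials are hypothetical, and the MySQL JDBC driver is assumed to be on the Spark classpath.

```python
from pyspark.sql import SparkSession

# Hypothetical Azure Database for MySQL connection details; requires the
# MySQL connector jar, e.g. spark.jars.packages=mysql:mysql-connector-java:8.0.33
spark = SparkSession.builder.appName("mysql-extract").getOrCreate()

orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://myserver.mysql.database.azure.com:3306/sales")
    .option("dbtable", "orders")
    .option("user", "etl_user")
    .option("password", "********")  # use a secret store in practice
    .option("driver", "com.mysql.cj.jdbc.Driver")
    .load()
)

orders.groupBy("status").count().show()
```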
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom Hybrid / WFH Options
Nigel Frank International
Working with and modelling data warehouses. Skills & Qualifications: Strong technical expertise using SQL Server and Azure Data Factory for ETL; solid experience with Databricks, PySpark etc.; understanding of Agile methodologies, including use of Git; experience with Python and Spark. Benefits: £60,000 - £65,000. This role is an urgent …
data and BI software tools and systems. Designing, coding, testing and documenting all new or modified applications and programs, using languages such as Python, PySpark, SQL, Java and other industry-standard tools. Analysing user requirements and, based on findings, designing functional specifications for front-end applications. Modelling a collaborative … Extensive knowledge of integration and transformation within cloud data warehouses (e.g. Azure, AWS, GCP). Experience - Essential: Experience with programming languages such as Python, SQL, PySpark, Java. Experience of using data engineering expertise to architect the most appropriate solution design for a particular requirement or set of requirements. Experience …
Blackpool, Lancashire, Marton, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
scalable, automated ETL pipelines in an AWS cloud environment using AWS S3 cloud object storage. Strong coding skills using Hive SQL, Spark SQL, Python, PySpark and Bash. Experience of working with a wide variety of structured and unstructured data. You and your role: As a Data Engineer at DWP … you'll be executing code across a full tech stack, including Azure, Databricks, PySpark and Pandas, helping the department move towards a cloud computing environment and working with huge data sets as part of our DataWorks platform - a system that provides Universal Credit data to our Data Science team …
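For the DWP listings here (this advert is repeated below for several locations), a hedged sketch of the S3-based PySpark ETL pattern they describe follows; bucket names, columns and file formats are invented, and the cluster is assumed to have the S3A connector and AWS credentials configured.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative read-transform-write pipeline; buckets and columns are
# placeholders, not anything from the DataWorks platform itself.
spark = SparkSession.builder.appName("s3-etl").getOrCreate()

raw = spark.read.json("s3a://example-landing-bucket/claims/2024/")

curated = (
    raw.filter(F.col("status").isNotNull())
       .withColumn("claim_date", F.to_date("claim_date"))
       .dropDuplicates(["claim_id"])
)

(
    curated.write.mode("overwrite")
           .partitionBy("claim_date")
           .parquet("s3a://example-curated-bucket/claims/")
)
```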
Newcastle upon Tyne, Tyne and Wear, High Heaton, Tyne & Wear, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
scalable, automated ETL pipelines in an AWS cloud environment using AWS S3 cloud object storage. Strong coding skills using Hive SQL, Spark SQL, Python, PySpark and Bash. Experience of working with a wide variety of structured and unstructured data. You and your role: As a Data Engineer at DWP … you'll be executing code across a full tech stack, including Azure, Databricks, PySpark and Pandas, helping the department move towards a cloud computing environment and working with huge data sets as part of our DataWorks platform - a system that provides Universal Credit data to our Data Science team …
Sheffield, South Yorkshire, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
scalable, automated ETL pipelines in an AWS cloud environment using AWS S3 cloud object storage. Strong coding skills using Hive SQL, Spark SQL, Python, PySpark and Bash. Experience of working with a wide variety of structured and unstructured data. You and your role: As a Data Engineer at DWP … you'll be executing code across a full tech stack, including Azure, Databricks, PySpark and Pandas, helping the department move towards a cloud computing environment and working with huge data sets as part of our DataWorks platform - a system that provides Universal Credit data to our Data Science team …
Leeds, West Yorkshire, Richmond Hill, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
scalable, automated ETL pipelines in an AWS cloud environment using AWS S3 cloud object storage. Strong coding skills using Hive SQL, Spark SQL, Python, PySpark and Bash. Experience of working with a wide variety of structured and unstructured data. You and your role: As a Data Engineer at DWP … you'll be executing code across a full tech stack, including Azure, Databricks, PySpark and Pandas, helping the department move towards a cloud computing environment and working with huge data sets as part of our DataWorks platform - a system that provides Universal Credit data to our Data Science team …
science. Experience and detailed technical knowledge of GLMs/Elastic Nets, GBMs, GAMs, Random Forests and clustering techniques. Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL). Proficient at communicating results in a concise manner, both verbally and in writing. Behaviours: motivated by technical excellence; team player; self-motivated …
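As an illustration of the elastic-net side of that stack, the snippet below fits a cross-validated elastic-net GLM with scikit-learn on synthetic data; nothing here comes from a real pricing dataset.

```python
# Elastic-net regression sketch on synthetic data only.
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Cross-validated choice of alpha (overall penalty strength) and
# l1_ratio (the lasso/ridge mix that makes it an "elastic net").
model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5)
model.fit(X_train, y_train)
print(f"Selected alpha={model.alpha_:.4f}, l1_ratio={model.l1_ratio_}")
print(f"Held-out R^2: {model.score(X_test, y_test):.3f}")
```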
predictive modelling techniques: Logistic Regression, GBMs, Elastic Net GLMs, GAMs, Decision Trees, Random Forests, Neural Nets and clustering. Experience in programming languages (e.g. Python, PySpark, SAS, SQL). A good quantitative degree in (but not limited to) Mathematics, Statistics, Engineering, Physics or Computer Science. Proficient at communicating results in a concise …
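Similarly, here is a minimal sketch of one more technique from that list, a GBM classifier, again assuming scikit-learn and synthetic data; the hyperparameters are placeholders rather than anything from the advert.

```python
# GBM classification sketch; data is synthetic, hyperparameters illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=15, random_state=1)

gbm = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3)
scores = cross_val_score(gbm, X, y, cv=5, scoring="roc_auc")
print(f"Mean cross-validated AUC: {scores.mean():.3f}")
```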