considering applicants with experience in Java, Scala, C# or C++. Practical experience with distributed processing of large volumes of data. Hands-on experience with PySpark/Spark. Experience working according to DevOps best practices (CI/CD, testing, familiarity with GitHub/GitLab). Good knowledge of at least more »
workflows, DLT pipelines, Unity Catalog. 3+ years working with data warehouses, relational databases and query languages. 2+ years building data pipelines in Databricks using PySpark, Scala and/or Spark SQL, with the ability to work across structured, semi-structured and unstructured data. 2+ years of data modelling (e.g., data vault more »
algorithms). Expert in SQL and Python, dbt, and standard data science libraries and frameworks such as pandas, NumPy, scikit-learn, TensorFlow/PyTorch, PySpark, etc. Experience with Infrastructure-as-Code and continuous integration & deployment patterns. Experience with PostgreSQL, Google Cloud Platform, and BigQuery. Excellent problem-solving skills and the more »
ML and AI models to get results. Use your extensive coding background to remain hands-on and deliver projects on our modern Azure, PySpark-based data infrastructure. Understand how the IB business works, what our stakeholders and clients really need, and how you can make a difference. You more »
Pipeline using AWS services (S3, Lambda, Glue, Athena, QuickSight, etc.) or similar native Azure or GCP services. Strong experience with Snowflake or Databricks (including PySpark), along with tools such as dbt, SnapLogic or the Informatica stack. Proactively seek to understand major industry trends and apply them to solutions and new initiatives more »
engineering concepts and technologies. Experience working in a cloud environment. Experience with modern and traditional data warehousing and data processing technologies and concepts (Hadoop, PySpark, streaming data) more »
Penrith, Cumbria, United Kingdom Hybrid / WFH Options
Computer Futures
Knowledge of Spark architecture and modern data warehouse/data lake/lakehouse techniques. Build transformation tables using SQL. Moderate knowledge of Python/PySpark or an equivalent programming language. Power BI Data Gateways and Dataflows, permissions. Creation, utilisation, optimisation and maintenance of relational SQL and NoSQL databases. Experienced working with more »
experience in data modelling within a cloud-based data platform. Strong experience with SQL Server. Azure data engineering stack, including Azure Synapse and Azure Data Lake. Python, PySpark and T-SQL. In return you will be offered a competitive salary and benefits package, remote working options and an opportunity to work with more »
cross-functional teams entrusted with business-critical platforms. Desirable skills & experience: working to an Agile methodology and familiarity with Azure DevOps; deep automation knowledge with Python; skilled in PySpark and Synapse; experience with data modelling and visualisation in Power BI (or an alternative); a strong understanding of architecting data platforms, BI, MI, or analytics solutions; strong more »
in retail/marketing but not required. Candidates should be looking to work in a fast-paced, startup-feel environment. Tech across: Python, SQL, AWS, Databricks, PySpark, A/B testing, MLflow, APIs. Apply below! CONTACT: If you can't see what you're looking for right now, send us your CV anyway – we're more »
end data solutions, delivering best-in-class experiences for their external clients. Technical background: SAS, SAS Base; Azure, AWS or GCP; Python/PySpark; proficiency in SQL and/or similar data technologies; familiarity with data pipeline tools and ETL processes; knowledge of cloud platforms and data architecture more »
Blackpool, Lancashire, Marton, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
prem, but the direction of travel is cloud engineering. You'll be executing code in different places across the following tech stack: Azure, Databricks, PySpark and Pandas. You will steer the data engineering function within a wider product team. There'll be lots of connecting and interaction with stakeholders … inclusive environment where you can grow your career and make a real difference. Essential criteria: enterprise-scale experience with Azure data engineering tools, Databricks, PySpark and Pandas; experience of data modelling and transforming raw data into datasets; experience of building team capability through role modelling, mentoring, and coaching; able more »
Sheffield, South Yorkshire, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
Newcastle upon Tyne, Tyne and Wear, High Heaton, Tyne & Wear, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
Leeds, West Yorkshire, Richmond Hill, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Department of Work & Pensions
performance, scalability, and reliability. Technical skills required: Redshift; Glue (inc. Glue Studio, Glue Data Quality, Glue DataBrew); Step Functions; Athena; Lambda; Kinesis; Python, Spark, PySpark, SQL. Your contributions as a Data Engineer will directly impact the organization's operations and revenue. In addition to a competitive annual salary, we more »
engineering leaders/stakeholders in decision-making and implementing the models into production. You will need hands-on skills in Python and PySpark, experience working in a cloud environment, and knowledge of development tools like Git or Docker. You can also expect to work with the latest more »
Blackpool, Lancashire, Marton, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
currently on-prem, but the direction of travel is cloud engineering, and you'll be executing code across the following tech stack: Azure, Databricks, PySpark and Pandas. DWP Digital is a great place to work; we offer a supportive and inclusive environment where you can grow your career and … make a real difference. Essential criteria: commercial experience of Databricks, PySpark and Pandas; commercial experience of Azure data engineering tools such as Azure Data Factory, dedicated SQL pools and ADLS Gen2; experience of working with data lakes; an understanding of dimensional modelling. Details. Wages. Perks. You'll join more »
Newcastle upon Tyne, Tyne and Wear, High Heaton, Tyne & Wear, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
City of London, London, Westminster Abbey, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
Leeds, West Yorkshire, Richmond Hill, United Kingdom Hybrid / WFH Options
Department of Work & Pensions