Job Description for Data Engineer - Coding skills: PySpark or Scala (at a level sufficient to troubleshoot issues) and SQL. Must have good PySpark knowledge to build data transformations. Hands-on experience with Azure data platforms, e.g. Azure Data Factory, Databricks, Synapse, SQL, Data Lake Storage. Experience building and optimizing 'big data' pipelines and architectures …
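The "PySpark knowledge to build data transformation" requirement can be sketched minimally. Since no Spark cluster is assumed here, this is a pure-Python stand-in for the PySpark logic, with the equivalent DataFrame calls noted in comments; the column names and rule are hypothetical, not taken from the ad.

```python
# Pure-Python stand-in for a typical PySpark transformation step.
# In PySpark this would be roughly:
#   df.filter(F.col("amount") > 0) \
#     .withColumn("amount_gbp", F.col("amount") * F.col("fx_rate"))
# Column names (amount, fx_rate) are illustrative assumptions.

def transform(rows):
    """Drop non-positive amounts and derive a GBP amount column."""
    out = []
    for row in rows:
        if row["amount"] > 0:  # equivalent of df.filter(...)
            enriched = dict(row)
            # equivalent of df.withColumn(...)
            enriched["amount_gbp"] = row["amount"] * row["fx_rate"]
            out.append(enriched)
    return out

rows = [
    {"amount": 100.0, "fx_rate": 0.5},
    {"amount": -5.0, "fx_rate": 0.5},  # filtered out
]
result = transform(rows)
```

Keeping the transformation as a named function, rather than inline notebook cells, is what makes it unit-testable on small in-memory fixtures before it runs against Data Lake Storage.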
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Anson McCade
Data Engineer • Location: Belfast based – Hybrid, flexible working • Salary: £32,500 - £40,000 • Package: 10% bonus + 11% pension. Overview: One of the UK's leading digital solution providers is searching for a Data Engineer to join their practice …
I have placed quite a few candidates with this organisation now, and all have given glowing reviews. They're the leader in legal-representation comparison and have rather interesting, unique data to work with. They are using modern Azure …
a week in the Liverpool office - rest remote**** Senior Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. ****Informatica Cloud (IICS & IDMC) is essential - NOT PowerCenter**** A top insurance firm is looking for a … e.g. Scrum, SAFe) and tools (e.g. Jira, Azure DevOps). Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. Seniority Level: Mid-Senior level. Industry: Insurance, Financial Services. Employment Type: Full-time. Job Functions: Information …
Liverpool office 2-3 days a week - rest remote* Senior Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. *Informatica (IICS & IDMC) is essential* A top reinsurance firm is looking for a Senior/Lead … cloud and tooling - Informatica and Azure are highly desired. Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. …
Hampshire, England, United Kingdom Hybrid / WFH Options
Parity Network Group
range of data sources. Most of their roles focus on the development of reproducible analytical pipelines using a range of coding languages (notably Python, PySpark and R) across a range of cloud-based and local platforms. The roles also provide an opportunity for cutting-edge data science analysis with … Job Responsibilities: Write code and functions in a way that is reusable, fits best-practice guidance, and is robustly unit tested (preferably in Python, PySpark or R). Create reproducible pipelines for the systematic processing of statistical data, based on initial design documents. This includes writing documentation, designing and conducting tests … projects and tasks on a day-to-day basis. Supporting colleagues in the development of their data science skills. Essential Skills: Experience with Python, PySpark and R. Experience with Agile or Waterfall methodologies. Experience building scalable data products for a range of users in technology (e.g. big data). …
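The "reusable, robustly unit-tested functions" responsibility can be illustrated with a minimal Python sketch; the function name, mapping and cleaning rule are hypothetical, not taken from the ad.

```python
def standardise_region(value, mapping=None):
    """Normalise free-text region labels to canonical codes.

    Reusable: the mapping is injectable rather than hard-coded,
    so the same function can serve multiple pipelines.
    """
    default = {"n. ireland": "NI", "northern ireland": "NI", "england": "ENG"}
    mapping = mapping or default
    key = value.strip().lower()
    return mapping.get(key, "UNKNOWN")

# Unit tests kept next to the function so the pipeline step is
# verifiable in isolation (pytest would collect these as-is).
def test_standardise_region():
    assert standardise_region("  Northern Ireland ") == "NI"
    assert standardise_region("france") == "UNKNOWN"

test_standardise_region()
```

The same pattern carries over to PySpark or R: small, pure, documented functions with tests, composed into a pipeline, rather than one monolithic script.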
months to begin with, and it's extendable. Location: Leeds, UK (min. 3 days onsite). Context: Legacy ETL code (for example DataStage) is being refactored into PySpark using Prophecy's low-code/no-code platform and available converters. Converted code is causing failures/performance issues. Skills: Spark Architecture – component understanding around Spark … Data Integration (PySpark, scripting, variable setting etc.), Spark SQL, Spark explain plans. Spark SME – able to analyse Spark code failures through Spark plans and make correcting recommendations. Spark SME – able to review PySpark and Spark SQL jobs and make performance-improvement recommendations. Spark SME – able … are cluster-level failures. Cloudera (CDP) – knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code. Prophecy – high-level understanding of low-code/no-code Prophecy set-up and its use to generate PySpark code. …
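Reviewing Spark explain plans for performance issues often starts with scanning the plan text for known red-flag operators. A toy illustration in plain Python follows; the flag list is a small, non-exhaustive assumption, and a real review would work from `df.explain()` output on the cluster rather than a hard-coded string.

```python
# Toy scan of a Spark physical-plan string for common red flags.
# Real plans come from df.explain(mode="formatted"); this only
# demonstrates the kind of checks an SME reviewer applies.

RED_FLAGS = {
    "CartesianProduct": "unintended cross join - check join keys",
    "SortMergeJoin": "consider a broadcast join if one side is small",
    "Exchange": "shuffle present - check partitioning and skew",
}

def review_plan(plan_text):
    """Return {operator: advice} for each red-flag operator found."""
    return {op: advice for op, advice in RED_FLAGS.items() if op in plan_text}

sample_plan = """== Physical Plan ==
*(5) SortMergeJoin [id#1], [id#2], Inner
+- Exchange hashpartitioning(id#1, 200)
"""
findings = review_plan(sample_plan)
```

For converter-generated code (as in the Prophecy context above), this kind of triage helps decide whether a failure is a logic defect in the converted job or a tuning problem such as skewed shuffles.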
Python, PySpark, AWS, Oracle, Kafka, Banking. Become a key member of an agile team designing and delivering a market-leading, secure and scalable reporting product. The team is migrating from Oracle to AWS cloud, so experience with either is nice to have. You serve as a seasoned member of … candidates able to work under these requirements without visa sponsorship will be considered at this time. Required skills: Either Python or PySpark experience; ideally you will be working as either a Python Developer (3+ years) or a Data Engineer using PySpark. Either AWS or Oracle. Candidates …