Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
Instances, Azure DB, Data Lakes, Azure Synapse and Power BI, Azure Data Factory). Proficiency in at least one scripting language (e.g. SQL, PySpark/Python). Proficiency in designing, building, and consuming APIs. Familiarity with data visualisation tools such as Power BI. Proficiency in ETL/… Model, Data Platforms, Data Models, Data Architecture, Data Integration, DIAS, Microsoft Azure, Azure DB, Data Lakes, Azure Synapse, Power BI, Azure Data Factory, SQL, PySpark, Python, APIs, ETL, ELT, CI/CD, Data Pipelines, Data-as-a-Service.
I have placed quite a few candidates with this organisation now, and all have given glowing reviews. They are the leader in legal representation comparison and have rather interesting, unique data to work with. They are using modern Azure …
a week in the Liverpool office - rest remote.**** Senior Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. ****Informatica Cloud (IICS & IDMC) is essential - NOT PowerCenter.**** A top insurance firm are looking for a … e.g. Scrum, SAFe) and tools (e.g. Jira, Azure DevOps). Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. Seniority Level: Mid-Senior level. Industry: Insurance, Financial Services. Employment Type: Full-time. Job Functions: Information …
worked as a senior data engineer on a data lake/lakehouse architecture platform and has hands-on experience of developing data pipelines using PySpark and SQL. The successful individual will have working experience of the AWS data ecosystem and toolsets, with a passion to learn and explore new technology … with hands-on delivery. Working experience of cloud technologies and the data ecosystem; AWS is preferable. Expert in developing data pipelines based on PySpark and SQL, with orchestration tools such as Airflow, Tivoli, etc. Knowledge of OLAP implementation on any database technology (Snowflake preferred). Excellent understanding of Data …
Data Engineer – SQL, Python, PySpark, Azure, Databricks, Lloyds, Insurance, Permanent – London. A fast-growing global insurance broker situated in the City of London is searching for x2 Data Engineers to join their newly formed Data & Analytics team and play a pivotal role in their data transformation. At present, they … both structured and unstructured data from the global business. Skills Required: Prior experience working with large datasets and proficiency in SQL, Python and PySpark are required. An understanding of medallion architecture and experience working within that framework on a Lakehouse platform. Experience of pipeline deployment within Azure Databricks …
Lead Data Engineer – SQL, Python, PySpark, Azure, Databricks, Lloyds, Insurance, Permanent – London. A fast-growing global insurance broker situated in the City of London is searching for a Lead Data Engineer to join their newly formed Data & Analytics team and play a pivotal role in their data transformation. At … Engineer in a previous business and can manage your own deliverables. Prior experience working with large datasets and proficiency in SQL, Python and PySpark are required. Experience of managing a team of engineers with varying degrees of experience. Experience of pipeline deployment within Azure Databricks in line with …
Are you looking for an exciting opportunity in Solution Architecture? Are you passionate about everything cloud and data? Join us as an AWS Cloud Data Platform Solution Architect. Careers at TCS: it means more. TCS is a purpose-led transformation …
London. Responsibilities: Collaborate with cross-functional teams to gather requirements and implement solutions. Develop and maintain data processing applications using Python. Optimise and tune PySpark jobs for performance and scalability. Ensure data quality, reliability, and integrity throughout the data processing pipelines. Technical Requirements: Python: Proficiency in Python programming. Object-Oriented Design: Solid understanding of object-oriented principles and design patterns. PySpark: Experience with PySpark for data processing and analytics. Azure: Familiarity with Azure services and cloud platforms. Financial Services Background: Knowledge of financial markets, instruments, and related data.