is essential and the role is 2-3 days a week in the Liverpool office - rest remote**** Senior Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. **** Informatica Cloud (IICS & IDMC) is essential - NOT PowerCenter … data. Essential/core experience required: 10+ years of experience and in-depth knowledge of data delivery and associated architecture principles, data modelling concepts, ETL procedures, and all steps of the data production process. Prior experience in insurance and/or reinsurance in support of specialty lines is a plus. Extensive … hands-on experience in SQL, Python, Data Integration/Ingestion and associated patterns - ETL tooling – Informatica IICS, warehousing technologies and associated patterns, cloud platforms – Azure preferred. Experience with on-prem and cloud versions of databases such as Oracle and SQL Server. Professional certifications in public cloud and tooling – Informatica and more »
Location: London – Hybrid Contract: 6 Months Inside IR35 (potential for additional 6 months) Skills: Data Engineer, DBT, Data Build Tool, GCP, Google Cloud Platform, ETL, ELT, SQL, PowerBI, Data Warehouse, Snowflake We are looking for a skilled Data Engineer to work on a dynamic and innovative project within a well … using state-of-the-art tools and technologies, including DBT, GCP, and PowerBI. - Design and implement efficient ELT/ETL processes to extract, transform, and load data from various sources into our data warehouse. - Collaborate with cross-functional teams to understand data requirements and deliver robust solutions to meet business … cloud environment. - Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, and Pub/Sub. - Strong understanding of ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) methodologies. - Familiarity with visualisation tools like PowerBI for creating insightful dashboards and reports. - Excellent problem-solving skills and ability to work more »
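The ELT vs ETL distinction this listing draws can be sketched in a few lines of plain Python. The functions and data below are hypothetical illustrations, not any employer's pipeline; the only point is the ordering of the steps (transform before load vs load raw, then transform in-warehouse, which is the pattern tools like DBT build on).

```python
# Illustrative contrast between ETL and ELT step ordering.
# All names and data here are invented for the example.

def extract(source):
    """Pull raw rows from a source system (a plain list here)."""
    return list(source)

def transform(rows):
    """Normalise rows: uppercase names, drop empty records."""
    return [r.upper() for r in rows if r]

def load(rows, warehouse):
    """Append rows to the warehouse table (modelled as a list)."""
    warehouse.extend(rows)
    return warehouse

source = ["alice", "", "bob"]

# ETL: transform *before* the data reaches the warehouse.
etl_wh = []
load(transform(extract(source)), etl_wh)

# ELT: load the raw data first, then transform inside the warehouse.
elt_wh = []
load(extract(source), elt_wh)
elt_wh = transform(elt_wh)

print(etl_wh)  # ['ALICE', 'BOB']
print(elt_wh)  # ['ALICE', 'BOB']
```

Both orderings end with the same clean table; the difference is where the transformation runs, which matters for warehouses like BigQuery or Snowflake that can do the heavy lifting in SQL.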
Carlisle, Cumbria, North West, United Kingdom Hybrid / WFH Options
Erin Associates
ETL Developer 1 day a month in their Cumbria offices ETL, JavaScript, Python, Data Modelling, Kafka 55k & benefits Join an award-winning software company nestled in the serene beauty of the Lake District. We're on the lookout for a dynamic ETL Developer to be part of their growing team … role is all about data, from crafting storage architectures to orchestrating extraction, formatting, and presenting data to end-users. Take charge of the entire ETL process, ensuring seamless data flow and management. Enjoy the flexibility of remote work, with occasional visits to their Penrith/Lake District offices (expenses covered … is the perfect time to hop aboard their success story, as they embark on exciting growth initiatives and groundbreaking projects. Core experience for this ETL role: Proven track record as an ETL developer or similar data-focused role Expertise in transferring data between transactional and warehouse solutions Consultation with data more »
business readiness, change & adoption. Able to assess complex situations, identify solutions, and strategically plan for implementation.* Knowledge of data principles including ETL (extract, transform and load) pipelines, integration methods and data migration techniques.* Excellent verbal and written communication skills to adapt to technical and non-technical audiences.* Capable of creating more »
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
a plus. DB: Azure SQL Database, Cosmos DB, NoSQL, MongoDB, and HBase are a plus Methodologies: Agile and DevOps (must have) Concepts: ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and integration testing more »
Databricks Spark Delta Lake SQL Python PySpark ADLS Day To Day Responsibilities: Extensive experience in designing, developing, and managing end-to-end data pipelines, ETL (Extract, Transform, Load), and ELT (Extract, Load, Transform) solutions. Maintains a proactive approach to staying updated with emerging technologies and a strong desire to continuously more »
Project Delivery Experience Face-to-face communication skills Day To Day Responsibilities: Extensive experience in designing, developing, and managing end-to-end data pipelines, ETL (Extract, Transform, Load), and ELT (Extract, Load, Transform) solutions. Maintains a proactive approach to staying updated with emerging technologies and a strong desire to continuously more »
Engineer Immediate Responsibilities Managing and delivering end to end BI reporting projects from initial brief/requirement gathering to data pipeline development (APIs/ETL/ELT/SQL) to visualization Undertaking ad hoc BI tasks based on business requirements and priorities As part of the BI team you will more »
data modelling, schema design, and database optimization techniques Tech Skills: Proficiency in modern cloud-based data platforms and technologies. Expertise in data analysis, modelling, ETL, data warehousing, and core data infrastructure services. Familiarity with technologies such as Microsoft Synapse, Azure Data Factory, Databricks Notebooks, Python/PySpark, and more. Proficiency more »
of new technologies Ability to maintain delivery momentum We’d Love If You Also Have These: Highly proficient in SQL Experience using Python-based ETL tools such as Databricks Experience using Data Ingestion tools such as FiveTran or Stitch Experience using Business Intelligence tools such as PowerBI, Looker or Tableau more »
City of London, London, United Kingdom Hybrid / WFH Options
The Health Foundation
our principal CRM system the successful candidate will have varied and demonstrable experience in a range of data and information systems, ETL (Extract, Transform and Load) concepts and a working knowledge of SQL. Beyond development, you will create user manuals, technical documentation, and training materials to empower end-users and more »
Should have strong implementation experience in all the below technology areas (breadth) and deep technical expertise in some of the below technologies: Data integration – ETL tools like Talend and Informatica. Ingestion mechanisms like Flume & Kafka Data modelling – Dimensional & transactional modelling using RDBMS, NoSQL and Big Data technologies. Experience in more »
DN4 5PL, Doncaster, South Yorkshire, United Kingdom Hybrid / WFH Options
Keepmoat
Business Intelligence/Analytics (Power BI). They will have experience with Microsoft BI stack, relational database and SQL experience alongside experience working with ETL and/or data integration tools (preferably Informatica). Experience working in an agile environment (preferably with Azure DevOps) is also essential. Experience within the more »
Greater London, England, United Kingdom Hybrid / WFH Options
Morgan McKinley
productivity across teams and unlocking the full potential of data within the organisation. Responsibilities Maintain and optimise available data warehouse infrastructure. Design, implement and document ETL procedures for intake of new data from relevant sources employing industry standards and best practices; as well as ensure data is verified and quality checked. Collaborate more »
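The "verified and quality checked" intake step these responsibilities describe can be sketched as a simple gate between extraction and load. The field names and rules below are hypothetical, invented purely to illustrate the shape of such a check; a real pipeline would encode its own schema and thresholds.

```python
# Minimal sketch of a data-quality gate in an ETL intake step.
# Field names ("id", "amount") and the rules are illustrative only.

def quality_check(rows, required=("id", "amount")):
    """Split rows into clean and rejected: a row passes only if all
    required fields are present and non-null and amount is non-negative."""
    clean, rejected = [], []
    for row in rows:
        has_fields = all(row.get(k) is not None for k in required)
        if has_fields and row["amount"] >= 0:
            clean.append(row)
        else:
            rejected.append(row)
    return clean, rejected

incoming = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": None},   # rejected: missing amount
    {"id": 3, "amount": -5.0},   # rejected: negative amount
]
clean, rejected = quality_check(incoming)
print(len(clean), len(rejected))  # 1 2
```

Keeping the rejected rows (rather than silently dropping them) is what makes the intake auditable: they can be logged or routed to a quarantine table for review.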
data modelling, schema design, and database optimization techniques Tech Skills: Familiarity with cloud-based data platforms and technologies. Good knowledge of data analysis, modelling, ETL processes, data warehousing, and core data infrastructure services. Exposure to technologies like Microsoft Synapse, Azure Data Factory, Databricks Notebooks, Python/PySpark, and similar tools. more »
encryptions, etc.). In-depth knowledge and experience with Azure data storage (SQL Server, Data Lake, Synapse, etc.) & access tools, APIs, cloud connectivity, and ETL processes. Knowledge and some experience of MS Office/MS Office 365 suite, SharePoint Online, Power Apps, GitHub, MS Teams, etc. In-depth knowledge & experience more »
Proficiency in Azure services (SQL Database, Data Lake Storage, Databricks, CosmosDB). Understanding of data modelling and governance principles. Experience with Python, Databricks, and ETL tools. Familiarity with Spark, Kafka, and Azure service integration. Skills in Power BI or Azure Data Explorer for visualisation. Knowledge of data security and compliance. more »
legacy data to our new cloud-based data platform. Design and implement master and transactional data models to ensure data integrity and consistency. Develop ETL pipelines and cloud data storage solutions that meet business requirements and non-functional needs. Collaborate closely with data analysts, visualization engineers, and project managers to more »
with relational databases and NoSQL databases Significant experience and in-depth knowledge of creating data pipelines and associated design principles, standards, Data modelling concepts, ETL procedures, and all steps of data production process Experience with unit and integration testing, and data quality frameworks such as Deequ, Great Expectations or Delta more »
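The "expectation" style of data-quality framework named here (Deequ, Great Expectations) can be sketched by hand. The function names and result shape below are invented for illustration; the real libraries have richer APIs, but the core idea is the same: each expectation runs over a batch of rows and reports success plus the failing rows.

```python
# Hand-rolled sketch of expectation-style data-quality checks, in the
# spirit of frameworks like Great Expectations or Deequ. The API shown
# is invented for this example and does not match either library.

def expect_column_values_not_null(rows, column):
    """Flag rows where the given column is missing or null."""
    failing = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"success": not failing, "failing_rows": failing}

def expect_column_values_between(rows, column, low, high):
    """Flag rows where the column is null or outside [low, high]."""
    failing = [i for i, r in enumerate(rows)
               if r.get(column) is None or not (low <= r[column] <= high)]
    return {"success": not failing, "failing_rows": failing}

batch = [{"age": 34}, {"age": None}, {"age": 150}]
r1 = expect_column_values_not_null(batch, "age")
r2 = expect_column_values_between(batch, "age", 0, 120)
print(r1["failing_rows"], r2["failing_rows"])  # [1] [1, 2]
```

Declaring checks this way, as a suite of named expectations with machine-readable results, is what lets a pipeline fail fast (or quarantine a batch) before bad data reaches downstream consumers.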
London, England, United Kingdom Hybrid / WFH Options
Amplifi Capital
working in virtual teams and being in constant collaboration Ability to work unsupervised and take ownership for key services Nice to Have: Experience with ETL architectures and tools, including integration with APIs and coding Python data pipelines Team leadership skills for managing tasks and conducting stand-ups Knowledge of reference more »
Manchester Area, United Kingdom Hybrid / WFH Options
Airtime Rewards
Requirements Bachelor’s degree in Computer Science, a relevant technical field, or equivalent experience. Experience designing cloud data warehouse solutions, data modelling, and building ETL/ELT processes, preferably on GCP or equivalent platforms (AWS, Azure, Snowflake). Proficiency with SQL, Python, Docker, and Terraform (or similar IaC tools). more »
implementation of appropriate observability (monitoring and alerting) across our Data platforms. We’re looking for a Lead Data Engineer who has: Experience working with ETL/ELT tools, including Azure Data Factory and dbt, developing a wide variety of integration solutions incorporating APIs, files, databases, etc. Experience with database platforms more »
to structure data and make databases work efficiently. Tech Skills: Familiarity with cloud-based data platforms and technologies. Good knowledge of data analysis, modelling, ETL processes, data warehousing, and core data infrastructure services. Exposure to technologies like Microsoft Synapse, Azure Data Factory, Databricks Notebooks, Python/PySpark, and similar tools. more »
and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, DBT, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and tools. Understanding of DataOps, data mining, and data visualisation/BI tools. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. more »
demonstrate working on large engagements * Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR) * Java, Scala, Python, Spark, SQL * Experience of developing enterprise grade ETL/ELT data pipelines. * Deep understanding of data manipulation/wrangling techniques * Demonstrable knowledge of applying Data Engineering best practices (coding practices to DS, unit more »