is essential and the role is 2-3 days a week in the Liverpool office, rest remote. Senior Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. Informatica Cloud (IICS & IDMC) is essential - NOT PowerCenter … data. Essential/core experience required: 10+ years of experience and in-depth knowledge of data delivery and associated architecture principles, data modelling concepts, ETL procedures, and all steps of the data production process. Prior experience in insurance and/or reinsurance in support of specialty lines is a plus. Extensive … hands-on experience in SQL, Python, Data Integration/Ingestion and associated patterns - ETL tooling – Informatica IICS, warehousing technologies and associated patterns, cloud platforms – Azure preferred. Experience with on-prem and cloud versions of databases such as Oracle and SQL Server. Professional certifications in public cloud and tooling – Informatica and …
Location: London – Hybrid Contract: 6 Months Inside IR35 (potential for additional 6 months) Skills: Data Engineer, DBT, Data Build Tool, GCP, Google Cloud Platform, ETL, ELT, SQL, PowerBI, Data Warehouse, Snowflake. We are looking for a skilled Data Engineer to work on a dynamic and innovative project within a well … using state-of-the-art tools and technologies, including DBT, GCP, and PowerBI. - Design and implement efficient ELT/ETL processes to extract, transform, and load data from various sources into our data warehouse. - Collaborate with cross-functional teams to understand data requirements and deliver robust solutions to meet business … cloud environment. - Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, and Pub/Sub. - Strong understanding of ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) methodologies. - Familiarity with visualisation tools like PowerBI for creating insightful dashboards and reports. - Excellent problem-solving skills and ability to work …
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
a plus. DB: Azure SQL Database, Cosmos DB, NoSQL, MongoDB, and HBase are a plus. Methodologies: Agile and DevOps must have. Concepts: ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and Integration testing. If this sounds like you, be sure to get …
Databricks, Spark, Delta Lake, SQL, Python, PySpark, ADLS. Day-to-Day Responsibilities: Extensive experience in designing, developing, and managing end-to-end data pipelines, ETL (Extract, Transform, Load), and ELT (Extract, Load, Transform) solutions. Maintains a proactive approach to staying updated with emerging technologies and a strong desire to continuously …
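The ETL vs ELT distinction these listings keep spelling out (transform before load, versus load raw and transform inside the warehouse) can be sketched in a few lines of Python, using an in-memory SQLite database as a stand-in warehouse. Table names and the source rows are invented for illustration:

```python
import sqlite3

# Hypothetical source rows; in a real pipeline these would come from an
# upstream system (API, file drop, CDC feed), not a literal list.
source_rows = [("alice", "2024-01-03", 120.0), ("bob", "2024-01-04", 95.5)]

warehouse = sqlite3.connect(":memory:")

# --- ETL: transform in the pipeline, then load the finished shape ---
transformed = [(name.title(), day, round(amount)) for name, day, amount in source_rows]
warehouse.execute("CREATE TABLE sales_etl (customer TEXT, day TEXT, amount INTEGER)")
warehouse.executemany("INSERT INTO sales_etl VALUES (?, ?, ?)", transformed)

# --- ELT: load the raw rows first, transform inside the warehouse with SQL ---
warehouse.execute("CREATE TABLE sales_raw (customer TEXT, day TEXT, amount REAL)")
warehouse.executemany("INSERT INTO sales_raw VALUES (?, ?, ?)", source_rows)
warehouse.execute(
    "CREATE TABLE sales_elt AS "
    "SELECT upper(substr(customer, 1, 1)) || substr(customer, 2) AS customer, "
    "day, CAST(round(amount) AS INTEGER) AS amount FROM sales_raw"
)

# Both routes end at the same shape; only where the transform runs differs.
print(warehouse.execute("SELECT * FROM sales_etl ORDER BY customer").fetchall())
print(warehouse.execute("SELECT * FROM sales_elt ORDER BY customer").fetchall())
```

In ELT setups (DBT on BigQuery or Snowflake, as in the listing above), that second pattern dominates: raw data lands first and SQL models materialise the transformed tables.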
Engineer Immediate Responsibilities: Managing and delivering end-to-end BI reporting projects from initial brief/requirement gathering to data pipeline development (APIs/ETL/ELT/SQL) to visualisation. Undertaking ad hoc BI tasks based on business requirements and priorities. As part of the BI team you will …
of new technologies. Ability to maintain delivery momentum. We'd Love If You Also Have These: Highly proficient in SQL. Experience using Python-based ETL tools such as Databricks. Experience using data ingestion tools such as Fivetran or Stitch. Experience using Business Intelligence tools such as PowerBI, Looker or Tableau.
Should have strong implementation experience in all the below technology areas (breadth) and deep technical expertise in some of the below technologies: Data integration – ETL tools like Talend and Informatica; ingestion mechanisms like Flume & Kafka. Data modelling – dimensional & transactional modelling using RDBMS, NoSQL and Big Data technologies. Experience in …
DN4 5PL, Doncaster, South Yorkshire, United Kingdom Hybrid / WFH Options
Keepmoat
Business Intelligence/Analytics (Power BI). They will have experience with the Microsoft BI stack, relational database and SQL experience, alongside experience working with ETL and/or data integration tools (preferably Informatica). Experience working in an agile environment (preferably with Azure DevOps) is also essential. Experience within the …
Greater London, England, United Kingdom Hybrid / WFH Options
Morgan McKinley
across teams and unlocking the full potential of data within the organisation. Responsibilities: Maintain and optimise available data warehouse infrastructure. Design, implement and document ETL procedures for intake of new data from relevant sources, employing industry standards and best practices, as well as ensure data is verified and quality checked …
encryptions, etc.). In-depth knowledge and experience with Azure data storage (SQL Server, Data Lake, Synapse, etc.) & access tools, APIs, cloud connectivity, and ETL processes. Knowledge and some experience of MS Office/MS Office 365 suite, SharePoint Online, Power Apps, GitHub, MS Teams, etc. In-depth knowledge & experience …
legacy data to our new cloud-based data platform. Design and implement master and transactional data models to ensure data integrity and consistency. Develop ETL pipelines and cloud data storage solutions that meet business requirements and non-functional needs. Collaborate closely with data analysts, visualisation engineers, and project managers to …
with relational databases and NoSQL databases. Significant experience and in-depth knowledge of creating data pipelines and associated design principles, standards, data modelling concepts, ETL procedures, and all steps of the data production process. Experience with unit and integration testing, and data quality frameworks such as Deequ, Great Expectations or Delta …
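Data quality frameworks such as Deequ or Great Expectations, named in the listing above, boil down to declarative expectations evaluated against a batch of data. A minimal hand-rolled sketch of that idea follows; the function names and sample rows are invented for illustration and are not the API of either library:

```python
# Each "expectation" is a predicate over a batch of rows (dicts here),
# mirroring the declarative style of Deequ / Great Expectations checks.

def expect_not_null(rows, column):
    """Every row must have a non-null value in `column`."""
    return all(row.get(column) is not None for row in rows)

def expect_unique(rows, column):
    """No duplicate values in `column` across the batch."""
    values = [row[column] for row in rows]
    return len(values) == len(set(values))

def expect_between(rows, column, low, high):
    """Every value in `column` must fall in [low, high]."""
    return all(low <= row[column] <= high for row in rows)

def run_checks(rows, checks):
    """Return a {description: passed} report for a batch of rows."""
    return {desc: check(rows) for desc, check in checks}

rows = [
    {"policy_id": 1, "premium": 450.0},
    {"policy_id": 2, "premium": 999.0},
]
report = run_checks(rows, [
    ("policy_id is never null", lambda r: expect_not_null(r, "policy_id")),
    ("policy_id is unique",     lambda r: expect_unique(r, "policy_id")),
    ("premium in [0, 10000]",   lambda r: expect_between(r, "premium", 0, 10_000)),
])
print(report)
```

In a real pipeline such checks run as a gate between ingestion and publication, failing the batch (or quarantining rows) rather than just printing a report.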
Advanced skills in SQL programming, with the ability to design, optimise, and maintain complex queries and databases. Strong experience in designing, implementing, and managing ETL processes using Azure Data Factory and other relevant tools. Proficiency in building and managing data warehouses using Azure Synapse Analytics and other data warehousing solutions.
London, England, United Kingdom Hybrid / WFH Options
Amplifi Capital
working in virtual teams and being in constant collaboration. Ability to work unsupervised and take ownership of key services. Nice to Have: Experience with ETL architectures and tools, including integration with APIs and coding Python data pipelines. Team leadership skills for managing tasks and conducting stand-ups. Knowledge of reference …
Manchester Area, United Kingdom Hybrid / WFH Options
Airtime Rewards
Requirements: Bachelor's degree in Computer Science, a relevant technical field, or equivalent experience. Experience designing cloud data warehouse solutions, data modelling, and building ETL/ELT processes, preferably on GCP or equivalent platforms (AWS, Azure, Snowflake). Proficiency with SQL, Python, Docker, and Terraform (or similar IaC tools).
development, Blazor, NServiceBus, CQRS and Domain-driven design, Scrum/Agile, API design and management skills. Data Engineering projects using Data Lakes, Data Factories, ETL techniques, Synapse and PowerBI. MS Logic Apps and Power Apps. Azure DevOps. Experience with the following would be beneficial: AI + Machine Learning Analytics: Analysis Services …
and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, DBT, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and tools. Understanding of DataOps, data mining, and data visualisation/BI tools. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities.
demonstrate working on large engagements * Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR) * Java, Scala, Python, Spark, SQL * Experience of developing enterprise-grade ETL/ELT data pipelines * Deep understanding of data manipulation/wrangling techniques * Demonstrable knowledge of applying Data Engineering best practices (coding practices to DS, unit …
Brighton, England, United Kingdom Hybrid / WFH Options
Legal & General
impact on internal stakeholders and customers is understood. Qualifications – What we're looking for: Proficient across many domains of data engineering, including ELT/ETL, metadata management, data integration, data management in transit and at rest, and data streaming. Experience of Kimball modelling techniques to design databases & data warehousing solutions …
and scalability. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Extensive experience in designing and implementing data pipelines, ETL processes, and data warehousing solutions. Proficiency in cloud platforms such as Azure, with hands-on experience in data lakes, Databricks, and Synapse Analytics. Strong programming …
data pipelines for data ingestion, processing, and transformation in Azure. Utilise Azure Data Factory, Azure Databricks and SAP BusinessObjects to create and maintain ETL operations. Deliver dashboard reporting and data sets which are interactive and user-friendly via PowerBI. Identifying and integrating external data into the business data model …
the different data architecture patterns: Data Fabric, Data Mesh, Data Warehouse, Data Marts, data modelling, ontologies & knowledge graphs, microservices • You have experience in implementing ETL data flows and data pipelines, and know one or more of the following tools: Informatica PowerCenter, SAS Data Integration Studio, Microsoft SSIS, Ab Initio, etc.
new technology • able to quickly learn new skills & technologies, sharing any insights gained with colleagues • technical testing experience testing data pipelines or batch and ETL type processes • experience of working in Agile/Scrum/Kanban and TDD environments and teams • willing to listen to the ideas of others and …
high-performance data systems that are foundational to driving business growth and success. Key responsibilities: Implementing scalable data architectures and systems. Developing and maintaining ETL (Extract, Transform, Load) pipelines. Managing data storage, backup, and recovery mechanisms. Writing complex SQL queries to extract data for analysis. Developing and implementing data security …
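The "complex SQL queries to extract data for analysis" that roles like this describe typically join fact and dimension tables and aggregate for a reporting layer. A small illustration using an in-memory SQLite database; the schema, table names, and data are invented:

```python
import sqlite3

# Toy fact (orders) and dimension (customers) tables standing in for a
# warehouse schema; names and rows are illustrative only.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (id INTEGER, region TEXT);
    INSERT INTO customers VALUES (1, 'North'), (2, 'South');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

# Revenue per region: the shape of query a BI tool such as PowerBI
# would sit on top of.
query = """
    SELECT c.region, SUM(o.amount) AS revenue, COUNT(*) AS n_orders
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY revenue DESC
"""
for region, revenue, n_orders in db.execute(query):
    print(region, revenue, n_orders)
```

The same join-then-aggregate shape carries over directly to warehouse engines like Synapse, BigQuery or Snowflake; only the connection layer changes.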