is essential and the role is 2-3 days a week in the Liverpool office, with the rest remote. Senior Data Engineer: data, data modelling, migration, ETL, ETL tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. Informatica Cloud (IICS & IDMC) is essential, NOT PowerCenter. Essential/core experience required: 10+ years of experience and in-depth knowledge of data delivery and associated architecture principles, data modelling concepts, ETL procedures, and all steps of the data production process. Prior experience in insurance and/or reinsurance in support of specialty lines is a plus. Extensive … hands-on experience in SQL, Python, data integration/ingestion and associated patterns; ETL tooling (Informatica IICS); warehousing technologies and associated patterns; cloud platforms (Azure preferred). Experience with on-prem and cloud versions of databases such as Oracle and SQL Server. Professional certifications in public cloud and tooling (Informatica and …)
a plus. DB: Azure SQL Database, Cosmos DB, NoSQL, MongoDB, and HBase are a plus. Methodologies: Agile and DevOps are must-haves. Concepts: ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and integration testing. If this sounds like you, be sure to get …
Databricks, Spark, Delta Lake, SQL, Python, PySpark, ADLS. Day-to-Day Responsibilities: Extensive experience in designing, developing, and managing end-to-end data pipelines, ETL (Extract, Transform, Load), and ELT (Extract, Load, Transform) solutions. Maintains a proactive approach to staying updated with emerging technologies and a strong desire to continuously …
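The ETL and ELT patterns named above differ only in where the transformation runs. As a minimal sketch of that distinction, here is a self-contained Python example using only the standard library's sqlite3 as a stand-in warehouse; all table and column names are hypothetical, not taken from any listing:

```python
import sqlite3

# Hypothetical source records, standing in for an on-prem extract.
source_rows = [
    ("alice", "2024-01-03", 120.0),
    ("bob", "2024-01-04", -5.0),    # invalid amount, dropped by the transform
    ("carol", "2024-01-05", 80.5),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (customer TEXT, sale_date TEXT, amount REAL)")

# ETL: transform in application code *before* loading into the target table.
transformed = [
    (name.title(), day, amt) for name, day, amt in source_rows if amt > 0
]
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", transformed)

# ELT: load the raw data first, then transform inside the warehouse with SQL.
conn.execute("CREATE TABLE sales_raw (customer TEXT, sale_date TEXT, amount REAL)")
conn.executemany("INSERT INTO sales_raw VALUES (?, ?, ?)", source_rows)
conn.execute(
    "CREATE TABLE sales_clean AS "
    "SELECT customer, sale_date, amount FROM sales_raw WHERE amount > 0"
)

print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0])        # 2
print(conn.execute("SELECT COUNT(*) FROM sales_clean").fetchone()[0])  # 2
```

Both paths end with the same two valid rows; the choice between them in practice hinges on whether the transformation logic lives in pipeline code (ETL) or in warehouse SQL (ELT).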
Engineer Immediate Responsibilities: Managing and delivering end-to-end BI reporting projects, from initial brief/requirement gathering to data pipeline development (APIs/ETL/ELT/SQL) to visualization. Undertaking ad hoc BI tasks based on business requirements and priorities. As part of the BI team you will …
of new technologies. Ability to maintain delivery momentum. We'd Love It If You Also Have These: Highly proficient in SQL. Experience using Python-based ETL tools such as Databricks. Experience using data ingestion tools such as Fivetran or Stitch. Experience using business intelligence tools such as Power BI, Looker or Tableau …
Should have strong implementation experience in all the below technology areas (breadth) and deep technical expertise in some of them: data integration – ETL tools like Talend and Informatica; ingestion mechanisms like Flume & Kafka. Data modelling – dimensional & transactional modelling using RDBMS, NoSQL and Big Data technologies. Experience in …
DN4 5PL, Doncaster, South Yorkshire, United Kingdom Hybrid / WFH Options
Keepmoat
Business Intelligence/Analytics (Power BI). They will have experience with the Microsoft BI stack, relational databases and SQL, alongside experience working with ETL and/or data integration tools (preferably Informatica). Experience working in an agile environment (preferably with Azure DevOps) is also essential. Experience within the …
2. Design and develop scalable and efficient business intelligence solutions, including data models, dashboards, reports, and analytics using industry-leading tools and technologies. 3. Extract, transform, and load (ETL) data from various sources into data warehouses or data lakes, ensuring data integrity, consistency, and quality. 4. Conduct data profiling, data … development using tools such as Tableau, Power BI, QlikView, or similar. 4. Strong SQL skills for data extraction, transformation, and analysis. 5. Experience with ETL tools and processes, data warehousing concepts, and data integration techniques. 6. Solid understanding of business processes and the ability to translate business requirements into technical …
knowledge of business intelligence concepts, data visualisation and analytic methods. * Data Engineering: A strong background in data engineering, with a thorough understanding of concepts like ETL (Extract, Transform, Load), data cleaning, data structures, and data warehousing. * Azure Data Services: Hands-on experience with Azure data services like Azure SQL Database, Azure …
legacy data to our new cloud-based data platform. Design and implement master and transactional data models to ensure data integrity and consistency. Develop ETL pipelines and cloud data storage solutions that meet business requirements and non-functional needs. Collaborate closely with data analysts, visualization engineers, and project managers to …
with relational databases and NoSQL databases. Significant experience and in-depth knowledge of creating data pipelines and associated design principles, standards, data modelling concepts, ETL procedures, and all steps of the data production process. Experience with unit and integration testing, and data quality frameworks such as Deequ, Great Expectations or Delta …
Advanced skills in SQL programming, with the ability to design, optimize, and maintain complex queries and databases. Strong experience in designing, implementing, and managing ETL processes using Azure Data Factory and other relevant tools. Proficiency in building and managing data warehouses using Azure Synapse Analytics and other data warehousing solutions. …
Manchester Area, United Kingdom Hybrid / WFH Options
Airtime Rewards
Requirements: Bachelor's degree in Computer Science, a relevant technical field, or equivalent experience. Experience designing cloud data warehouse solutions, data modelling, and building ETL/ELT processes, preferably on GCP or equivalent platforms (AWS, Azure, Snowflake). Proficiency with SQL, Python, Docker, and Terraform (or similar IaC tools). …
development, Blazor, NServiceBus, CQRS and Domain-driven design, Scrum/Agile, API design and management skills. Data Engineering projects using Data Lakes, Data Factories, ETL techniques, Synapse and Power BI. MS Logic Apps and Power Apps. Azure DevOps. Experience with the following would be beneficial: AI + Machine Learning, Analytics: Analysis Services …
Experience Essential: Significant experience of strategic planning to support wider organisational objectives. In-depth knowledge of BI tools and technologies, such as data warehousing, ETL (Extract, Transform, Load), and analytics platforms. Proficiency in SQL (Structured Query Language) and other data querying languages. Excellent problem-solving and analytical skills, with the ability to extract meaningful insights from complex datasets. Familiarity with data governance and data quality best practices. Strong project management skills, with the ability to prioritise tasks and deliver projects within specified timelines. Significant experience working at a senior level in a large and complex organisation. Experience of leading innovative approaches …
City Of London, England, United Kingdom Hybrid / WFH Options
Morgan Hunt
/SQL. Proven expertise with Microsoft SQL Server Integration Services and BI tools like Power BI, Power Pivot, and SSRS. Experience in data warehousing, ETL, data modelling, and data orchestration. Excellent stakeholder management and communication skills. A degree in Computer Science, IT, or a related STEM subject. Morgan Hunt is …
and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, dbt, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and tools. Understanding of DataOps, data mining, and data visualization/BI tools. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. …
open-source data engineering and scientific Python toolset. Our tech stack includes Airbyte for data ingestion, Prefect for pipeline orchestration, AWS Glue for managed ETL, along with Pandas and PySpark for pipeline logic implementation. We utilize Delta Lake and PostgreSQL for data storage, emphasizing the importance of data integrity and …
with programming languages such as Python or R is a plus. Knowledge of statistical analysis techniques and methodologies. Familiarity with data warehousing concepts and ETL processes.
and scalability. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Extensive experience in designing and implementing data pipelines, ETL processes, and data warehousing solutions. Proficiency in cloud platforms such as Azure, with hands-on experience in data lakes, Databricks, and Synapse Analytics. Strong programming …
data pipelines for data ingestion, processing, and transformation in Azure. Utilise Azure Data Factory, Azure Databricks and SAP BusinessObjects to create and maintain ETL operations. Deliver dashboard reporting and data sets which are interactive and user-friendly via Power BI. Identifying and integrating external data into the business data model …
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Anson McCade
consulting firm. Proficiency in Azure data services such as Azure SQL Database, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, etc. Strong understanding of ETL processes, data modeling, and data warehousing principles. Experience with programming languages such as SQL and Python. Familiarity with data visualization tools such as Power BI …
the different data architecture patterns: Data Fabric, Data Mesh, Data Warehouse, Data Marts, data modeling, ontologies & knowledge graphs, microservices. • You have experience in implementing ETL data flows and data pipelines, and know one or more of the following tools: Informatica PowerCenter, SAS Data Integration Studio, Microsoft SSIS, Ab Initio, etc. …
various teams to understand data requirements and implement solutions. > Optimizing data workflows and processes to enhance data quality, reliability, and performance. > Developing and managing ETL processes for data ingestion, processing, and transformation. > Implementing data governance practices to ensure data integrity, security, and compliance. > Monitoring and troubleshooting data infrastructure to address …
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
responsible for accessing, validating, and querying data from various repositories using available tools. Build and maintain data integration processes using SQL Services and other ETL/ELT processes and scripting tools, as well as ongoing requests and projects related to the data warehouse, MI, or fast-moving financial data. Designing … Architecting, building, testing, and maintaining the data platform. Develop and support a wide range of data transformations and migrations for the whole business. Construct custom ETL processes: design and implement data pipelines, data marts and schemas, access versatile data sources and apply data quality measures. Monitoring the complete process and applying … ML is a plus. Experience with Azure SQL Database, Cosmos DB, NoSQL, MongoDB. Experience with Agile, DevOps methodologies. Awareness and knowledge of ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and integration testing. Skills and Abilities: Knowledge of Python, SQL, SSIS, and …
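The "apply data quality measures" step mentioned in the role above can be sketched as a small row-level rule gate run before loading. This is a plain-Python illustration of the idea, not any specific framework's API; the rule names, fields, and threshold are hypothetical:

```python
# Minimal data-quality gate for a pipeline stage: each rule is a
# (name, predicate) pair applied to every row before loading.
rows = [
    {"policy_id": "P-100", "premium": 250.0},
    {"policy_id": "",      "premium": 99.0},   # fails the not-blank rule
    {"policy_id": "P-102", "premium": -10.0},  # fails the non-negative rule
]

rules = [
    ("policy_id_not_blank", lambda r: bool(r["policy_id"])),
    ("premium_non_negative", lambda r: r["premium"] >= 0),
]

def run_checks(rows, rules):
    """Return (passing_rows, failures) where failures maps rule name -> count."""
    failures = {name: 0 for name, _ in rules}
    passing = []
    for row in rows:
        ok = True
        for name, predicate in rules:
            if not predicate(row):
                failures[name] += 1
                ok = False
        if ok:
            passing.append(row)
    return passing, failures

clean, report = run_checks(rows, rules)
print(len(clean), report)
```

In a real pipeline the failures dictionary would feed monitoring or alerting, and frameworks such as Deequ or Great Expectations (mentioned in another listing above) provide richer, declarative versions of the same pattern.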