is essential and the role is 2-3 days a week in the Liverpool office, the rest remote. Senior Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. Informatica Cloud (IICS & IDMC) is essential - NOT PowerCenter … data. Essential/core experience required: 10+ years of experience and in-depth knowledge of data delivery and associated architecture principles, data modelling concepts, ETL procedures, and all steps of the data production process. Prior experience in insurance and/or reinsurance in support of specialty lines is a plus. Extensive … hands-on experience in SQL, Python, Data Integration/Ingestion and associated patterns – ETL tooling (Informatica IICS), warehousing technologies and associated patterns, cloud platforms (Azure preferred). Experience with on-prem and cloud versions of databases such as Oracle and SQL Server. Professional certifications in public cloud and tooling – Informatica and …
Location: London – Hybrid. Contract: 6 months inside IR35 (potential for an additional 6 months). Skills: Data Engineer, DBT, Data Build Tool, GCP, Google Cloud Platform, ETL, ELT, SQL, PowerBI, Data Warehouse, Snowflake. We are looking for a skilled Data Engineer to work on a dynamic and innovative project within a well … using state-of-the-art tools and technologies, including DBT, GCP, and PowerBI. - Design and implement efficient ELT/ETL processes to extract, transform, and load data from various sources into our data warehouse. - Collaborate with cross-functional teams to understand data requirements and deliver robust solutions to meet business … cloud environment. - Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, and Pub/Sub. - Strong understanding of ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) methodologies. - Familiarity with visualisation tools like PowerBI for creating insightful dashboards and reports. - Excellent problem-solving skills and ability to work …
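The ELT pattern this role centres on (land raw data in the warehouse first, then transform it in-place with SQL, which is the step DBT automates on platforms like BigQuery or Snowflake) can be sketched roughly as follows. This is an illustrative sketch only: sqlite3 stands in for the cloud warehouse, and all table and column names are invented.

```python
import sqlite3

# In-memory database standing in for the cloud warehouse (e.g. BigQuery).
conn = sqlite3.connect(":memory:")

# Extract + Load: land the source rows untransformed in a "raw" table.
conn.execute("CREATE TABLE raw_orders (order_id TEXT, amount TEXT, country TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [("o1", "10.50", "uk"), ("o2", "7.25", "uk"), ("o3", "99.00", "fr")],
)

# Transform: build a cleaned, typed model with SQL inside the warehouse --
# this in-warehouse SELECT is the step a DBT model would own.
conn.execute(
    """
    CREATE TABLE orders AS
    SELECT order_id,
           CAST(amount AS REAL) AS amount,
           UPPER(country)       AS country
    FROM raw_orders
    """
)

total_uk = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE country = 'UK'"
).fetchone()[0]
print(total_uk)  # 17.75
```

The point of the pattern is that the raw table is preserved untouched, so transformations can be re-run or revised without re-extracting from the source.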
a plus. DB: Azure SQL Database, Cosmos DB, NoSQL, MongoDB, and HBase are a plus. Methodologies: Agile and DevOps are must-haves. Concepts: ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and integration testing. If this sounds like you, be sure to get …
Databricks Spark Delta Lake SQL Python PySpark ADLS. Day-to-day responsibilities: Extensive experience in designing, developing, and managing end-to-end data pipelines, ETL (Extract, Transform, Load), and ELT (Extract, Load, Transform) solutions. Maintains a proactive approach to staying updated with emerging technologies and a strong desire to continuously …
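For contrast with the ELT pattern, classic ETL transforms the data *before* it reaches the target system. A minimal sketch, with invented data and sqlite3 standing in for the target store:

```python
import sqlite3

# Classic ETL: transform outside the target system, then load only the
# cleaned rows. Source rows and schema here are invented for illustration.
source = [("o1", "10.50", "uk"), ("o2", "7.25", "fr")]

# Transform in application code: cast types, normalise values.
transformed = [
    (order_id, float(amount), country.upper())
    for order_id, amount, country in source
]

# Load the cleaned rows into the target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT, amount REAL, country TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", transformed)

loaded = conn.execute(
    "SELECT order_id, amount, country FROM orders ORDER BY order_id"
).fetchall()
print(loaded)  # [('o1', 10.5, 'UK'), ('o2', 7.25, 'FR')]
```

In a Databricks/PySpark pipeline the transform step would be a DataFrame operation rather than a list comprehension, but the ordering of the stages is the same.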
of new technologies. Ability to maintain delivery momentum. We'd Love If You Also Have These: highly proficient in SQL; experience using Python-based ETL tools such as Databricks; experience using data ingestion tools such as Fivetran or Stitch; experience using business intelligence tools such as PowerBI, Looker or Tableau …
Should have strong implementation experience in all the below technology areas (breadth) and deep technical expertise in some of them: data integration – ETL tools like Talend and Informatica, ingestion mechanisms like Flume & Kafka; data modelling – dimensional & transactional modelling using RDBMS, NoSQL and Big Data technologies. Experience in …
DN4 5PL, Doncaster, South Yorkshire, United Kingdom Hybrid / WFH Options
Keepmoat
Business Intelligence/Analytics (Power BI). They will have experience with the Microsoft BI stack, relational databases and SQL, alongside experience working with ETL and/or data integration tools (preferably Informatica). Experience working in an agile environment (preferably with Azure DevOps) is also essential. Experience within the …
encryptions, etc.). In-depth knowledge and experience with Azure data storage (SQL Server, Data Lake, Synapse, etc.) & access tools, APIs, cloud connectivity, and ETL processes. Knowledge and some experience of the MS Office/MS Office 365 suite, SharePoint Online, Power Apps, GitHub, MS Teams, etc. In-depth knowledge & experience …
legacy data to our new cloud-based data platform. Design and implement master and transactional data models to ensure data integrity and consistency. Develop ETL pipelines and cloud data storage solutions that meet business requirements and non-functional needs. Collaborate closely with data analysts, visualization engineers, and project managers to …
with relational databases and NoSQL databases. Significant experience and in-depth knowledge of creating data pipelines and associated design principles, standards, data modelling concepts, ETL procedures, and all steps of the data production process. Experience with unit and integration testing, and data quality frameworks such as Deequ, Great Expectations or Delta …
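Data quality frameworks such as Great Expectations or Deequ formalise the kind of checks sketched below. This is a hand-rolled illustration in plain Python, in the spirit of those libraries' "expectations", not the actual API of either one; all function and rule names are invented.

```python
# Sample rows standing in for a pipeline's output; data is invented.
rows = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "b@example.com", "age": 29},
    {"id": 3, "email": None, "age": 41},
]

def expect_not_null(rows, column):
    """Check every row has a non-null value in `column`."""
    failures = [r for r in rows if r[column] is None]
    return {"column": column, "passed": not failures, "failures": len(failures)}

def expect_between(rows, column, low, high):
    """Check every value in `column` falls within [low, high]."""
    failures = [r for r in rows if not (low <= r[column] <= high)]
    return {"column": column, "passed": not failures, "failures": len(failures)}

# Run a small suite of checks and report per-rule pass/fail counts.
results = [
    expect_not_null(rows, "id"),
    expect_not_null(rows, "email"),   # fails: one row has a null email
    expect_between(rows, "age", 18, 120),
]
for result in results:
    print(result)
```

Real frameworks add what this sketch omits: declarative suites, profiling, scheduled validation runs, and reporting when a check fails mid-pipeline.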
Advanced skills in SQL programming, with the ability to design, optimize, and maintain complex queries and databases. Strong experience in designing, implementing, and managing ETL processes using Azure Data Factory and other relevant tools. Proficiency in building and managing data warehouses using Azure Synapse Analytics and other data warehousing solutions.
Manchester Area, United Kingdom Hybrid / WFH Options
Airtime Rewards
Requirements Bachelor’s degree in Computer Science, a relevant technical field, or equivalent experience. Experience designing cloud data warehouse solutions, data modelling, and building ETL/ELT processes, preferably on GCP or equivalent platforms (AWS, Azure, Snowflake). Proficiency with SQL, Python, Docker, and Terraform (or similar IaC tools).
development, Blazor, NServiceBus, CQRS and Domain-driven design, Scrum/Agile, API design and management skills. Data Engineering projects using Data Lakes, Data Factories, ETL techniques, Synapse and PowerBI. MS Logic Apps and Power Apps. Azure DevOps. Experience with the following would be beneficial: AI + Machine Learning Analytics: Analysis Services …
Experience Essential: Significant experience of strategic planning to support wider organisational objectives. In-depth knowledge of BI tools and technologies, such as data warehousing, ETL (Extract, Transform, Load), and analytics platforms. Proficiency in SQL (Structured Query Language) and other data querying languages. Excellent problem-solving and analytical skills, with the … ability to extract meaningful insights from complex datasets. Familiarity with data governance and data quality best practices. Strong project management skills, with the ability to prioritise tasks and deliver projects within specified timelines. Significant experience working at a senior level in a large and complex organisation. Experience of leading innovative approaches …
West London, London, United Kingdom Hybrid / WFH Options
Morgan Hunt UK Limited
/SQL. Proven expertise with Microsoft SQL Server Integration Services and BI tools like Power BI, Power Pivot, and SSRS. Experience in data warehousing, ETL, data modelling, and data orchestration. Excellent stakeholder management and communication skills. A degree in Computer Science, IT, or a related STEM subject. Morgan Hunt is …
and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, DBT, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and tools. Understanding of DataOps, data mining, and data visualization/BI tools. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities.
demonstrate working on large engagements * Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR) * Java, Scala, Python, Spark, SQL * Experience of developing enterprise-grade ETL/ELT data pipelines * Deep understanding of data manipulation/wrangling techniques * Demonstrable knowledge of applying Data Engineering best practices (coding practices to DS, unit …
and scalability. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Extensive experience in designing and implementing data pipelines, ETL processes, and data warehousing solutions. Proficiency in cloud platforms such as Azure, with hands-on experience in data lakes, Databricks, and Synapse Analytics. Strong programming …
data pipelines for data ingestion, processing, and transformation in Azure. Utilise Azure Data Factory, Azure Databricks and SAP Business Objects to create and maintain ETL operations. Deliver dashboard reporting and data sets which are interactive and user-friendly via PowerBI. Identify and integrate external data into the business data model …
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Anson McCade
consulting firm. Proficiency in Azure data services such as Azure SQL Database, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, etc. Strong understanding of ETL processes, data modeling, and data warehousing principles. Experience with programming languages such as SQL and Python. Familiarity with data visualization tools such as Power BI …
the different data architecture patterns: Data Fabric, Data Mesh, Data Warehouse, Data Marts, data modeling, ontologies & knowledge graphs, microservices. • You have experience in implementing ETL data flows and data pipelines, and know one or more of the following tools: Informatica PowerCenter, SAS Data Integration Studio, Microsoft SSIS, Ab Initio, etc.
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
responsible for accessing, validating, and querying data from various repositories using available tools. Build and maintain data integration processes using SQL Services and other ETL/ELT processes and scripting tools, as well as ongoing requests and projects related to the data warehouse, MI, or fast-moving financial data. Designing … Architecting, building, testing, and maintaining the data platform. Develop and support a wide range of data transformations and migrations for the whole business. Construct custom ETL processes: design and implement data pipelines, data marts and schemas, access versatile data sources and apply data quality measures. Monitoring the complete process and applying … ML is a plus. Experience with Azure SQL Database, Cosmos DB, NoSQL, MongoDB. Experience with Agile, DevOps methodologies. Awareness and knowledge of ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and integration testing. Skills and Abilities: knowledge of Python, SQL, SSIS, and …
high-performance data systems that are foundational to driving business growth and success. Key responsibilities: Implementing scalable data architectures and systems. Developing and maintaining ETL (Extract, Transform, Load) pipelines. Managing data storage, backup, and recovery mechanisms. Writing complex SQL queries to extract data for analysis. Developing and implementing data security …
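As a rough illustration of the "complex SQL queries to extract data for analysis" responsibility, the sketch below joins two tables and aggregates revenue per customer. The schema and data are invented, and sqlite3 stands in for the production database:

```python
import sqlite3

# Toy schema: customers and their orders. All names and values invented.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE customers (id INTEGER, name TEXT);
    CREATE TABLE orders    (customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (1, 100.0), (1, 50.0), (2, 75.0);
    """
)

# Analytical extract: revenue per customer, highest first. The LEFT JOIN
# plus COALESCE keeps customers with no orders at 0 instead of dropping them.
query = """
    SELECT c.name, COALESCE(SUM(o.amount), 0) AS revenue
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id, c.name
    ORDER BY revenue DESC
"""
rows = list(conn.execute(query))
for name, revenue in rows:
    print(name, revenue)  # Acme 150.0, then Globex 75.0
```

Real analytical queries layer on window functions, CTEs, and date-dimension joins, but the join-aggregate-order shape is the core of most of them.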
requirements into technical requirements & development experience using data discovery tools such as MS Power BI, QlikView, Tableau. Ability to model and transform data, build ETL solutions and present data in a useful business context. Experience developing data warehouses & data marts using the Kimball methodology. Experience or awareness of Big Data …
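The Kimball methodology mentioned above organises a warehouse into fact tables (measurable events) surrounded by dimension tables (descriptive context), forming a star schema. A minimal, invented example, again using sqlite3 as a stand-in warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Minimal Kimball-style star schema: one fact table holding measures,
# with foreign keys into dimension tables. All names are illustrative.
conn.executescript(
    """
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales  (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        amount      REAL
    );
    INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware');
    INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97);
    """
)

# A typical BI query slices the fact table by dimension attributes:
# revenue per product category for a given year.
row = conn.execute(
    """
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d    ON d.date_key = f.date_key
    WHERE d.year = 2024
    GROUP BY p.category
    """
).fetchone()
print(row)  # ('Hardware', 29.97)
```

The payoff of the star shape is that BI tools and analysts can filter and group by any dimension attribute without restructuring the fact table.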
Technologies such as Docker and orchestration tools like Kubernetes for containerized deployments. Workflow management tools such as Airflow for orchestrating complex data pipelines and ETL processes. Certifications in Azure cloud services and data engineering technologies, demonstrating expertise and proficiency in the Azure ecosystem. Rewards & Benefits: TCS is consistently voted a …