is essential and the role is 2-3 days a week in the Liverpool office, rest remote. Senior Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. Informatica Cloud (IICS & IDMC) is essential - NOT PowerCenter … data. Essential/core experience required: 10+ years of experience and in-depth knowledge of data delivery and associated architecture principles, data modelling concepts, ETL procedures, and all steps of the data production process. Prior experience in insurance and/or reinsurance in support of specialty lines is a plus. Extensive … hands-on experience in SQL, Python, Data Integration/Ingestion and associated patterns - ETL tooling – Informatica IICS, warehousing technologies and associated patterns, cloud platforms – Azure preferred. Experience with on-prem and cloud versions of databases such as Oracle and SQL Server. Professional certifications in public cloud and tooling – Informatica and more »
Location: London – Hybrid Contract: 6 Months Inside IR35 (potential for additional 6 months) Skills: Data Engineer, DBT, Data Build Tool, GCP, Google Cloud Platform, ETL, ELT, SQL, PowerBI, Data Warehouse, Snowflake We are looking for a skilled Data Engineer to work on a dynamic and innovative project within a well … using state-of-the-art tools and technologies, including DBT, GCP, and PowerBI. - Design and implement efficient ELT/ETL processes to extract, transform, and load data from various sources into our data warehouse. - Collaborate with cross-functional teams to understand data requirements and deliver robust solutions to meet business … cloud environment. - Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, and Pub/Sub. - Strong understanding of ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) methodologies. - Familiarity with visualisation tools like PowerBI for creating insightful dashboards and reports. - Excellent problem-solving skills and ability to work more »
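The listing above distinguishes ELT (Extract, Load, Transform) from ETL (Extract, Transform, Load). As a minimal sketch of the difference (the table names, sample rows, and in-memory SQLite "warehouse" here are illustrative assumptions, not part of any listed stack), the question is simply where the transform runs:

```python
import sqlite3

# Sample "source" rows: (customer, amount in pence) -- illustrative data only
source_rows = [("alice", 1250), ("bob", 995), ("alice", 400)]

warehouse = sqlite3.connect(":memory:")  # stand-in for a real warehouse

# --- ETL: transform in application code first, then load the result ---
etl_rows = [(name, pence / 100) for name, pence in source_rows]  # pence -> pounds
warehouse.execute("CREATE TABLE etl_sales (customer TEXT, amount_gbp REAL)")
warehouse.executemany("INSERT INTO etl_sales VALUES (?, ?)", etl_rows)

# --- ELT: load the raw data as-is, then transform inside the warehouse with SQL ---
warehouse.execute("CREATE TABLE raw_sales (customer TEXT, amount_pence INTEGER)")
warehouse.executemany("INSERT INTO raw_sales VALUES (?, ?)", source_rows)
warehouse.execute("""
    CREATE TABLE elt_sales AS
    SELECT customer, amount_pence / 100.0 AS amount_gbp FROM raw_sales
""")

# Both approaches end in the same modelled table
etl_total = warehouse.execute("SELECT SUM(amount_gbp) FROM etl_sales").fetchone()[0]
elt_total = warehouse.execute("SELECT SUM(amount_gbp) FROM elt_sales").fetchone()[0]
print(etl_total, elt_total)
```

Tools such as DBT sit on the ELT side of this split: raw data lands in the warehouse first (e.g. BigQuery), and transformations are expressed as SQL models run in place.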
a plus. DB: Azure SQL Database, Cosmos DB, NoSQL, MongoDB, and HBase are a plus. Methodologies: Agile and DevOps are must-haves. Concepts: ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and Integration testing. If this sounds like you, be sure to get more »
management and engineering • 8+ years as a Data Engineer working with Data Warehouses and Data Lakes • Strong hands-on experience with PL/SQL, ETL design and Orchestration • Experience in data profiling, data validation and performance improvement experience for Analytical and OLTP systems • In-depth Object Oriented software development experience more »
Databricks Spark Delta Lake SQL Python PySpark ADLS Day To Day Responsibilities: Extensive experience in designing, developing, and managing end-to-end data pipelines, ETL (Extract, Transform, Load), and ELT (Extract, Load, Transform) solutions. Maintains a proactive approach to staying updated with emerging technologies and a strong desire to continuously more »
of new technologies Ability to maintain delivery momentum We’d Love If You Also Have These: Highly proficient in SQL Experience using Python based ETL tools such as Databricks Experience using Data Ingestion tools such as FiveTran or Stitch Experience using Business Intelligence tools such as PowerBI, Looker or Tableau more »
manipulate data from diverse sources (SQL, Oracle, etc.). What You Will Do • Explore and analyze data from a wide range of sources to extract actionable insights and drive data-based decision making. • Work closely with cross-functional teams to understand business requirements and design effective data pipelines. • Prototype, automate … management, and administration. • Proficiency in SQL and other programming languages commonly used in data engineering (e.g., Python, Java, Scala). • Hands-on experience with ETL (extract, transform, load) processes and data integration tools and techniques such as Snowflake, Fivetran, dbt, etc. • Familiarity with cloud-based data platforms and services (e.g. more »
technical field (e.g. Computer Science, Statistics, Engineering). · 7+ years of relevant experience in building DW/BI systems · Demonstrated ability in data modeling, ETL development, and data warehousing. · Strong experience with Big Data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.) · Expertise in a BI solution like Power BI · Hands more »
Should have strong implementation experience in all the below technology areas (breadth) and deep technical expertise in some of the below technologies: Data integration – ETL tools like Talend and Informatica. Ingestion mechanisms like Flume & Kafka Data modelling – Dimensional & transactional modelling using RDBMS, NoSQL and Big Data technologies. Experience in more »
DN4 5PL, Doncaster, South Yorkshire, United Kingdom Hybrid / WFH Options
Keepmoat
Business Intelligence/Analytics (Power BI). They will have experience with Microsoft BI stack, relational database and SQL experience alongside experience working with ETL and/or data integration tools (preferably Informatica). Experience working in an agile environment (preferably with Azure DevOps) is also essential. Experience within the more »
encryptions, etc.). In-depth knowledge and experience with Azure data storage (SQL Server, Data lake, Synapse, etc.) & access tools, APIs, cloud connectivity, and ETL processes. Knowledge and some experience of MS Office/MS Office 365 suite, SharePoint Online, Power Apps, GitHub, MS Teams, etc. In-depth knowledge & experience more »
legacy data to our new cloud-based data platform. Design and implement master and transactional data models to ensure data integrity and consistency. Develop ETL pipelines and cloud data storage solutions that meet business requirements and non-functional needs. Collaborate closely with data analysts, visualization engineers, and project managers to more »
with relational databases and NoSQL databases Significant experience and in-depth knowledge of creating data pipelines and associated design principles, standards, data modelling concepts, ETL procedures, and all steps of the data production process Experience with unit and integration testing, and data quality frameworks such as Deequ, Great Expectations or Delta more »
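The data quality frameworks named above (Deequ, Great Expectations) are built around declarative "expectations" over a dataset. The sketch below is a hand-rolled illustration of that idea only; the row data and function names are invented for the example, and none of those frameworks' actual APIs are used:

```python
# Hand-rolled illustration of expectation-style data quality checks.
# Real frameworks (Deequ, Great Expectations) provide declarative,
# far richer versions of the same pattern.

rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": 35.5},
    {"order_id": 3, "amount": 9.99},
]

def expect_not_null(rows, column):
    # every row has a non-null value in the column
    return all(r.get(column) is not None for r in rows)

def expect_unique(rows, column):
    # no duplicate values in the column
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def expect_between(rows, column, low, high):
    # every value falls inside the closed interval [low, high]
    return all(low <= r[column] <= high for r in rows)

checks = {
    "order_id not null": expect_not_null(rows, "order_id"),
    "order_id unique": expect_unique(rows, "order_id"),
    "amount in range": expect_between(rows, "amount", 0, 10_000),
}
print(checks)
```

In a pipeline, a failing check would typically quarantine the batch or fail the run rather than silently load bad data downstream.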
Advanced skills in SQL programming, with the ability to design, optimize, and maintain complex queries and databases. Strong experience in designing, implementing, and managing ETL processes using Azure Data Factory and other relevant tools. Proficiency in building and managing data warehouses using Azure Synapse Analytics and other data warehousing solutions. more »
Livingston NJ, Livingston, Essex County, New Jersey
Nexus Jobs Limited
encryptions, etc.). In-depth knowledge and experience with Azure data storage (SQL Server, Data lake, Synapse, etc.) & access tools, APIs, cloud connectivity, and ETL processes. Knowledge and some experience of MS Office/MS Office 365 suite, SharePoint Online, Power Apps, GitHub, MS Teams, etc. In-depth knowledge & experience more »
Manchester Area, United Kingdom Hybrid / WFH Options
Airtime Rewards
Requirements Bachelor’s degree in Computer Science, a relevant technical field, or equivalent experience. Experience designing cloud data warehouse solutions, data modelling, and building ETL/ELT processes, preferably on GCP or equivalent platforms (AWS, Azure, Snowflake). Proficiency with SQL, Python, Docker, and Terraform (or similar IaC tools). more »
London, England, United Kingdom Hybrid / WFH Options
Version 1
Stream Analytics Direct experience in building data pipelines using Azure Data Factory and Apache Spark (preferably Databricks). Experience building data warehouse solutions using ETL/ELT tools such as SQL Server Integration Services (SSIS), Oracle Data Integrator (ODI), Talend, and Wherescape Red. Experience with Azure Event Hub, IoT Hub more »
development, Blazor, NServiceBus, CQRS and Domain-driven design, Scrum/Agile, API Design and management skills Data Engineering projects using Data Lakes, Data factories, ETL techniques, Synapse and PowerBI MS Logic and Power Apps Azure DevOps Experience with the following would be beneficial: AI + Machine Learning Analytics: Analysis Services more »
Experience Essential Significant experience of strategic planning to support wider organisational objectives In-depth knowledge of BI tools and technologies, such as data warehousing, ETL (Extract, Transform, Load), and analytics platforms. Proficiency in SQL (Structured Query Language) and other data querying languages Excellent problem-solving and analytical skills, with the … ability to extract meaningful insights from complex datasets. Familiarity with data governance and data quality best practices. Strong project management skills, with the ability to prioritise tasks and deliver projects within specified timelines Significant experience working at a senior level in a large and complex organisation Experience of leading innovative approaches more »
West London, London, United Kingdom Hybrid / WFH Options
Morgan Hunt UK Limited
/SQL Proven expertise with Microsoft SQL Server Integration Services and BI tools like Power BI, Power Pivot, and SSRS Experience in data warehousing, ETL, data modelling, and data orchestration Excellent stakeholder management and communication skills A degree in Computer Science, IT, or a related STEM subject Morgan Hunt is more »
and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, DBT, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and tools. Understanding of DataOps, data mining, and data visualization/BI tools. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. more »
encryptions, etc.). In-depth knowledge and experience with Azure data storage (SQL Server, Data lake, Synapse, etc.) & access tools, APIs, cloud connectivity, and ETL processes. In-depth knowledge & experience using Visual Studio, with one of the programming languages C#/Java/JavaScript/Python, PowerShell, and Postman, SOAPUI more »
demonstrate working on large engagements * Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR) * Java, Scala, Python, Spark, SQL * Experience of developing enterprise grade ETL/ELT data pipelines. * Deep understanding of data manipulation/wrangling techniques * Demonstrable knowledge of applying Data Engineering best practices (coding practices to DS, unit more »
only: technical proficiency rating of advanced beginner on the Dreyfus engineering scale Preferred Qualifications If we had our say, we'd also look for: ETL/ELT tooling – Ab Initio. Strong experience with SQL. Experience in supplemental tools and technologies involved in data integration (Unix/Linux, TWS/Control-M more »
PowerBI. Collaborate with stakeholders to understand reporting requirements and translate them into effective BI solutions. Data Integration and Modeling: Utilize strong SQL skills to extract, transform, and load (ETL) data from various sources into PowerBI and write complex measures in DAX. Design and implement data models that align with business more »