South West London, London, United Kingdom Hybrid / WFH Options
Verisure
preferred 2-3 years' experience working in a Consulting, Strategy, or Data Analytics role. Demonstrated knowledge of Excel (advanced, will be tested), SQL (PostgreSQL), ETL pipelines, Power BI (or other visualisation software). Strong communication and collaboration skills. Ability to translate and visualise data in a concise and accurate way that's more »
environment requires a more robust structure, so much of the role will focus on improving: - Data Governance - Data Warehousing - Developing and implementing automation processes - ETL activities - Advanced analytics (involving forecasting and predictive analytics) Clearly, these will not all be achieved by a single person, nor all at once, but it more »
model to meet new business requirements. 2+ years managing an analytics engineering team using a scrum/agile methodology. Experience working with commercial data warehouses (Redshift), ETL tools (dbt), data visualization (Python notebooks, ThoughtSpot, Looker, Tableau, Hex), and Data Dictionary tools (Atlan). Demonstrated experience leading 2 or more multi-department analytics projects more »
and data transformation. Strong experience in managing and leading data engineering teams. Strong proficiency in Scala, Python, SQL, Snowflake and DBT. In-depth understanding of data modelling, ETL processes, and data warehousing concepts. Experience with cloud-based data platforms (e.g., AWS, Azure, GCP) and containerisation technologies (e.g., Docker, Kubernetes) is a plus. Excellent problem-solving more »
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
responsible for accessing, validating, and querying data from various repositories using available tools. Build and maintain data integration processes using SQL Services and other ETL/ELT processes and scripting tools, as well as ongoing requests and projects related to the data warehouse, MI, or fast-moving financial data. Designing … Architecting, building, testing, and maintaining the data platform. Develop and support a wide range of data transformations and migrations for the whole business. Construct custom ETL processes: Design and implement data pipelines, data marts and schemas, access versatile data sources and apply data quality measures. Monitoring the complete process and applying … ML is a plus. Experience with Azure SQL Database, Cosmos DB, NoSQL, MongoDB. Experience with Agile, DevOps methodologies. Awareness and knowledge of ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and integration testing. Skills and Abilities: Knowledge of Python, SQL, SSIS, and more »
SSAS). Ability to translate business requirements into technical solutions. Skills in data discovery tools (Power BI, QlikView, Tableau). Experience with data modeling, ETL development, and Kimball methodology. Awareness of Big Data and Data Science technologies. UK applicants only. Contact Faye at 0203 800 0792 or submit your CV. more »
Isleworth, England, United Kingdom Hybrid / WFH Options
WNTD
of data governance concepts and practices. Expertise in data modeling, metadata management, and data lineage. Strong background in data warehouse design and SQL Server ETL processes. Proficiency in Power BI for reporting. Desirable Skills: Experience with Nexthink, SNOW, and JAMF software. Accreditations: Microsoft Certified Data Analyst Microsoft Certified Data Engineer more »
and Agile delivery methodologies. Degree educated in mathematics or a scientific/engineering discipline. CFA or similar industry qualification. Experience with Snowflake, Databricks, SQL, Python and cloud-native ETL/ELT tools. Supervisory responsibilities: Yes. Potential for growth: Mentoring, leadership development programs, regular training, career development services, continuing education courses. You will be expected to understand the regulatory obligations of the firm, and more »
Science or related fields. Proven experience with Insurance Broking Systems data migration (ideally Acturis). Proficiency in SQL and data manipulation languages. Experience with ETL tools. Strong analytical and critical thinking skills, with a focus on practical solutions. Excellent communication and people skills for conveying data concepts to diverse audiences. more »
projects, leverage cutting-edge tech, and collaborate with renowned technologists in a welcoming environment. Requirements: 3+ years' experience as a Data Engineer Knowledge of ETL and data pipelines Degree in CS/Engineering/Math or related field Strong Python and SQL skills Benefits: Up to $200,000 CAD salary more »
harmonising data, messages, etc. Desired experience: Building integration solutions Experience in designing application databases Experience in designing and building data pipelines using SQL, code, ETL tools Strong estimation and planning skills Mentoring team members and peer-review of their work If you're interested in this opportunity, please click Apply more »
decision-making and automation. The Role As a Data Engineer with us, you will: Design, develop and maintain scalable and efficient data pipelines and ETL processes using Spark, ensuring high-quality data processing and integration. Collaborate with cross-functional teams to translate complex data requirements into actionable technical solutions. Utilize … in Apache Spark and cloud-based technologies, especially Microsoft Azure and Databricks. Skilled in programming, particularly Python, and familiar with data integration tools and ETL frameworks. Knowledgeable in data modelling and data governance principles, with a keen eye for data quality. Versatile in database technologies, including SQL and NoSQL, and more »
London, England, United Kingdom Hybrid / WFH Options
Solirius Consulting
data models and schemas to support business requirements Develop and maintain data ingestion and processing systems using various tools and technologies, such as SQL, NoSQL, ETL, Luigi, Airflow, Argo, etc Implement data storage solutions using different types of databases, such as relational, non-relational, or cloud-based Working collaboratively with the … and relational databases (e.g. MS SQL/Azure SQL, PostgreSQL) You have framework experience with Flask, Tornado or Django, and with Docker Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc Data acquisition and development of more »
technology budget. Collaborating with various departments to determine ongoing modifications to the data warehouse. Experience: Experience with a data warehouse within a cloud environment and ETL process management. Experience in a similar role, demonstrating line management and technical capabilities. Excellent written and verbal communications. Please contact Joely at Synchro to explore more »
storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes. Our business is growing quickly and with that so … Key Responsibilities Design, construct, install, test, and maintain data pipelines. Ensure systems meet business requirements and industry practices for data integrity and quality. Manage ETL and ELT pipelines across many data sources (CSV/parquet files, API endpoints, etc) Design and build data models for the business end users. Write more »
part of a team of engineers building web applications for zero-downtime, low-latency infrastructure Building UIs for some of the most performant ETL pipelines on the planet Conducting R&D for functional programming within the firm Building out a DevOps environment from scratch in a Software Engineering Capacity more »
London, England, United Kingdom Hybrid / WFH Options
Pioneer Search
Azure functions as well as Azure Logic Apps. Good knowledge of cloud security and Cyber Security principles. Desirable Skills: Familiarity with Azure Data Factory, ETL processes, and data manipulation. Experience within the Financial Services sector or Specialist insurance. Understanding of ITIL-based service management concepts (Incident Management, Problem Management, Change more »
accelerated time to market without leaving traces of identity. Required Skills and Qualifications: Demonstrated expertise in architecting systems for real-time transaction processing alongside ETL applications, with a focus on discretion. A comprehensive understanding of data modelling, data warehousing principles, and the innovative Lakehouse architecture. Exceptional proficiency in ETL methodologies, preferably … utilising Azure Databricks or equivalent technologies (Spark, Spark SQL, Python, SQL), including deep insight into ETL/ELT design patterns. Proficient in Databricks, SQL, and Python, with a robust understanding of software development life cycles. Familiarity with columnar and/or time series data design patterns, as well as performance more »
management of data sources with 3rd party organizations. Working closely with Data Engineers and Data Architects to facilitate various types of integrations, such as ETL processes and API awareness. Researching and sourcing data for new product development, including national datasets and system integrations. Providing insight and adding value to existing more »
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
delivering complex initiatives across the Defence & Security sector. The Role: As a Data Analytics Consultant, you’ll design and build data solutions such as ETL components, data warehouses or data virtualisation implementations. Working closely with client stakeholders to design the source-to-target mappings for large-scale data migrations whilst … the ability to translate business requirements into functional/technical data designs/solutions Agile and/or DevOps for software development & IT operations ETL tools such as Informatica, SSIS, Talend or Pentaho Data governance and data management tools such as Informatica MDM, Informatica AXON, Informatica EDC, and Collibra MySQL more »
high quality solution options, based on your vast experience. Deep knowledge and experience of different tech-stacks. Development knowledge and skills (SQL, Oracle DB, ETL tool (Informatica), Data Warehouse model, Shell scripts, PostgreSQL, Java). Knowledge of technologies such as MS SQL Server, Oracle DB, C#, Java, ETL (Informatica), Tableau, PostgreSQL more »
Databricks Spark Delta Lake SQL Python PySpark ADLS Day To Day Responsibilities: Extensive experience in designing, developing, and managing end-to-end data pipelines, ETL (Extract, Transform, Load), and ELT (Extract, Load, Transform) solutions. Maintains a proactive approach to staying updated with emerging technologies and a strong desire to continuously more »
in your approach to designing and delivering high-quality solutions that meet or exceed client expectations. Responsibilities include: Architect Databricks Enterprise Data Platform solutions including ETL/ELT and Lakehouse architecture and design, and lead use case delivery across a variety of use cases (e.g. legacy migrations, enterprise analytics, data engineering more »
working with relational databases Experience in data visualization best practices and storytelling with data Understanding of business intelligence concepts and data analytics Knowledge of ETL processes and data integration Experience with data warehousing and data governance Strong problem-solving and analytical skills Excellent communication and collaboration abilities Attention to detail more »
in your approach to designing and delivering high-quality solutions that meet or exceed client expectations. Responsibilities include: Architect Snowflake Enterprise Data Platform solutions including ETL/ELT and Lakehouse architecture and design, and lead use case delivery across a variety of use cases (e.g. legacy migrations, enterprise analytics, data engineering more »