function, delivering end-to-end solutions for an array of customer data projects. The role: designing data pipelines, managing data warehouses, and implementing complex ETL processes. Work closely with data scientists, analysts, and stakeholders to optimise our client's infrastructure. Optimise data-driven products, personalisation, reporting and overall business success.
and ability to promote solutions to production via machine learning engineering. Hands-on knowledge of NoSQL and relational databases, alongside relevant data modelling techniques. ETL/ELT tooling: knowledge of DBT, Luigi or similar orchestration tooling. Experience of managing and developing a team in an agile environment. Preferred: software development of …
and Sell additional annual leave Funded Learning and development programmes The Successful Data Engineer will have: Experience in developing and designing data pipelines andETL processes Previous experience of data warehousing Utilise SAS for data reporting and analysis Skilled in SQL, Python, Git and Databricks In depth experience of reporting more »
London, Tottenham Court Road, United Kingdom Hybrid / WFH Options
Jumar Solutions
Data Experience: Competence in handling financial data, coupled with a solid understanding of accounting principles. Technical Skills: In-depth knowledge of dimensional modeling and ETL processes. Agile Framework: Experience working within an Agile Scrum environment. Advanced Excel Proficiency: High-level skills in Excel for managing data exports. Desirable Qualifications: MS …
Snowflake data architecture, design, and deployment. Strong proficiency in SQL and experience with scripting languages such as Python or Java. Experience with data modelling, ETL processes, and data warehousing techniques. Familiarity with cloud computing services, ideally Azure. Excellent problem-solving skills and the ability to work collaboratively in a team …
design, implement, and utilize various database structures with a focus on cloud-based data services such as Azure, Databricks, and Snowflake ❄️ 🔹 Experience in building ETL/ELT data pipelines and applying DevOps (CI/CD) concepts to test, schedule, and deploy to a production environment 🔄 If you're interested in …
be done, not even built yet. You'll be coming in to work closely with data scientists and the wider business to build all ETL solutions behind round-the-clock trading decisions. This is a great chance to get into working for a buy-side financial firm. You don't …
the business. Skills & Experience: if this sounds like an opportunity you would like to apply to, please review the necessary competencies below: knowledge of ETL, analytics & data warehousing; experience with cloud, AWS, Snowflake, BigQuery; experience in building a data-driven culture, in the form of self-serve analytics. Previous hands …
Strong stakeholder management & communication skills. Experience of working with structured and unstructured datasets. Able to design a range of architectural solutions: data models and ETL, data lake and data warehouse end-to-end architecture. Experience within media, publishing, research, or a similar consumer-focused industry is highly desirable, but not …
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
the Defence & Security sector. Required Skills/Experience: British Passport Holder with active DV (Developed Vetting Clearance). End-to-end development with data pipelines, ETL processes, and workflow orchestration, using core concepts that apply across tech stacks. Working with diverse data sources and types: batch, streaming, real-time, and unstructured.
a strategic vision for data management information reporting within our organisation. Responsibilities: develop, support, and maintain our business reporting platform, including DataBlend for integrations and ETL, along with gathering requirements for business reporting across a range of systems. Create and maintain reports and dashboards in PowerBI to support business decisions and …
in R, Python, and SQL for data manipulation and analysis. Experience with Enterprise Data Management (EDM) tools and concepts. Strong understanding of data modeling, ETL processes, and data quality management. Familiarity with financial services or asset management industry preferred. Excellent problem-solving skills and attention to detail. Ability to work …
experience. Within Azure, should be familiar with performance tuning, T-SQL, maintenance, and replication. Experience with multiple database platforms including Sybase ASE. Experience with ETL development within Azure, as well as knowledge of Informatica. Should be comfortable with Linux and shell scripting. Some experience with Terraform, PowerShell and GitLab. Bachelor …
with plans to also add GCP and on-prem. They are adding extensive usage of distributed compute on Spark, starting with their more complex ETL and advanced analytics functions, e.g. time series processing. They soon plan to integrate other approaches, including native distributed PyTorch/TensorFlow, Spark-based distributor libraries …
Provide estimates, work independently and meet deadlines. Manage the releases and the related builds in each environment. Perform development, testing, and support of data ETL/ELT programs (pipelines), using such tools as AWS Glue, Azure Data Factory, Informatica or similar ETL/ELT platforms, as well as knowledge of … learn the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Airflow. Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of the timing of data loads, data transformation, and optimization of ETL load performance. Provide production …
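The extract–transform–load pattern these adverts keep asking for can be reduced to three stages. A minimal sketch in plain Python, where the field names, table name, and in-memory SQLite target are illustrative assumptions standing in for a real source, warehouse, and scheduler such as Airflow:

```python
# Minimal ETL sketch: extract rows, transform them, load into SQLite.
# All names (trade_id, symbol, qty, table "trades") are hypothetical.
import sqlite3

def extract():
    # Stand-in for reading from an upstream source (file, API, queue).
    return [
        {"trade_id": "T1", "symbol": " aapl ", "qty": "100"},
        {"trade_id": "T2", "symbol": "MSFT", "qty": "250"},
    ]

def transform(rows):
    # Standardise formats and types before loading the physical layer.
    return [
        (r["trade_id"], r["symbol"].strip().upper(), int(r["qty"]))
        for r in rows
    ]

def load(rows, conn):
    # Idempotent DDL plus a bulk insert; a real job would also handle
    # load timing, retries, and incremental keys.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trades (trade_id TEXT, symbol TEXT, qty INTEGER)"
    )
    conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT symbol, qty FROM trades ORDER BY symbol").fetchall())
# → [('AAPL', 100), ('MSFT', 250)]
```

In an orchestrated setup each function would become a separate task so the scheduler can retry the load without re-running the extract.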
London, England, United Kingdom Hybrid / WFH Options
Pioneer Search
in delivering complex solutions for enterprise-level clients. As well as maintaining robust and scalable applications in Scala, you will implement ETL pipelines to process, transform, and standardise data from various sources, as well as optimise the performance of Spark applications. Work closely with data scientists, software …
specialist training or experience in an appropriate field. Knowledge of data, its use and application within the NHS or other complex organisation. Demonstrable experience in ETL processes and developing data shares. Stakeholder management and negotiation leading to securing digital and data assets for counter-fraud purposes. Recent and ongoing continuous professional …
client site. Preferred skills: Experience with Azure and Databricks. Proficiency in SQL and/or similar data technologies. Familiarity with data pipeline tools and ETL processes. Knowledge of cloud platforms and data architecture. Experience working in the Retail Banking sector or financial services would be highly beneficial. This is a …
a 50/50 split between BI reporting/analysis and data ingestion, and would suit someone with a couple of years' experience across Python, ETL, AWS and any BI tool (PowerBI, Tableau, Qlik, QuickSight, or GoodData - it's more about the transferable skills!). This is an ideal role for …
financial services. Strong leadership skills with a track record of successfully managing and developing high-performing teams. In-depth knowledge of data engineering concepts, ETL processes, and data warehouse architectures. Expertise in working with big data technologies and cloud platforms (preferably AWS or Azure). Familiarity with asset management industry …
Support the development team in resolving issues by using data gathered from various sources. Designing, developing, and maintaining scalable and reliable data pipelines and ETL processes. Building and managing data monitoring tools to ensure data quality and system performance. Proactively shaping your role and driving real change for the company …
information technology, or a related field. Proven experience with Insurance Broking Systems data migration (ideally Acturis). Proficiency in SQL and SSIS. Experience with ETL tools. Strong analytical and critical thinking skills, with a focus on practical solutions. Excellent communication and people skills for conveying data concepts to diverse audiences.
Azure technologies. Financial Services domain knowledge. Do you tick these 4 boxes? Must-haves: Python coding experience (pure Python); data modelling, data warehousing and ETL frameworks; SQL. Our client has 2,500 lines of Python code for you to maintain and optimize. You'll need hands-on coding for this …
up with your own solution. You will be building dashboards in Python as well as being in charge of transformation when it comes to ETL pipelines. Experience-wise, you'll need to be a SQL Server expert with strong architecture skills, and also strong skills with Python and DBT. The …