in Python to work within a fast-paced Data team. You will work very closely with Quant Research, Trading, Quant Development and traditional ETL/ELT Data Engineering. In this latency-sensitive trading environment, your work is intrinsic to the firm's PnL. Your core focus will be on more »
a call on 0191 3387551. Keywords: Azure Data Factory, Azure Databricks, Databricks Lakehouse, MS Power BI, Power BI, Spark, Delta Lake, T-SQL, DevOps, ETL, Data Modelling, DAX, Data Warehousing, London more »
and manipulation · Analytical capability/skills: · Issues & risks identification & mitigation · Root cause analysis · Qualitative & quantitative analysis · End-to-end development of Power BI reports (ETL, data modelling, expression writing and report writing) · Experience with data mastery · Good practices in SQL development · Good understanding of relational databases · Advanced Excel (Power Query more »
knowledge in distributed systems, cloud architecture, and data pipelines. Proficiency in Python programming (knowledge of Scala or Rust is a plus). Familiarity with ETL principles in contemporary data applications (Dagster, Airflow, Prefect). Familiarity with AWS services such as Glue, Redshift, Athena, and S3. Proficiency with Terraform, Kubernetes, and more »
your problem-solving skills and creativity. Google Cloud Professional Cloud Architect or Professional Cloud Developer certification. Very desirable to have hands-on experience with ETL tools, Hadoop-based technologies (e.g., Spark), and batch/streaming data pipelines (e.g., Beam, Flink, etc.) Proven expertise in designing and constructing data lakes and more »
Knutsford, England, United Kingdom Hybrid / WFH Options
Capgemini
technologies like Python, Scala, Spark, PySpark or Ab Initio, Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. more »
Northamptonshire, England, United Kingdom Hybrid / WFH Options
Capgemini
technologies like Python, Scala, Spark, PySpark or Ab Initio, Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. more »
Ability to think strategically and assess options across short, medium, and long-term timeframes. Experience with data technologies including MS SQL, data warehousing, and ETL processes. Familiarity with microservices architecture, APIs, and integration patterns. Knowledge of DevOps practices and CI/CD pipelines. Understanding of security standards and best practices more »
contribute to the technical aspects in the upkeep of the data estate. Focusing on hands-on development, enhancement, and maintenance of data models and ETL processes within the data estate. You will emphasise the importance of data quality and governance in your project work, contributing to the organisational data model more »
Central London, London, United Kingdom Hybrid / WFH Options
Recruitment Revolution
analytic role, with a focus on analysis, reporting and visualisation in Looker Studio + Highly proficient in GCP/GBQ, SQL, data modelling and ETL processes + Experience with Supermetrics, GitHub, DBT, Google BigQuery and Shopify + Strong communication and presentation skills to effectively convey insights to both technical more »
and Machine Learning engineers, and it is responsible for supporting data scientists in deploying, maintaining and monitoring an increasing number of Python-based microservices, ETL pipelines, SaaS models, databases and vector stores. The MLOps Lead would need to act as an interface between data scientists, the data & analytics team and more »
Burton-On-Trent, Staffordshire, Burton upon Trent, United Kingdom
Michael Page
company data acquisition strategy allowing for near-real-time process reporting and full support of contractual reporting to customers. Key Responsibilities: Monitor and maintain ETL processes to ensure the continuation of an accurate data reporting platform for the business. Explore solutions for optimising the performance of the strategic data platform … Extensive SQL knowledge (Microsoft SQL Server 2005+) and experience working with relational databases, query authoring (T-SQL) Microsoft SQL Server Integration Services (SSIS) including ETL/ELT design and development experience. Microsoft DevOps source control software and development lifecycle software Developing and maintaining objects within Data Warehouses/Lakehouses. Experience more »
a strategic vision for data management information reporting within our organisation. Responsibilities Develop and support our business reporting platform, including DataBlend for integrations and ETL, along with gathering requirements for business reporting across a range of systems. Create and maintain reports and dashboards in Power BI to support business decisions and more »
customer-centric environment In depth expertise in data management, data architecture, and data governance and tools Technical familiarity with data modeling and database technologies (ETL processes, DB staging, data platforms) as well as data tools (Power BI, Databricks, Palantir) (Re)Insurance business understanding (specifically underwriting and claims processes) Excellent verbal more »
Mathematics, or related field 📊 Experience : Proven Data Engineer experience, preferably in finance 💻 Skills : Python, SQL, NoSQL, big data tools Cloud platforms (AWS, Azure, GCP) ETL processes and database design Join us and make a difference more »
engineers. Proficiency in Azure (including services like Azure Data Factory, Azure Databricks, etc.). Strong programming skills in Python . Familiarity with data modeling, ETL processes, and data warehousing. Excellent communication and leadership abilities. Additional Information: This role offers a £100k + performance-based bonus, and comprehensive benefits package. Candidates more »
key stakeholders and other business units to understand business needs and gather requirements. Required Experience Track record of designing and building data infrastructure and ETL pipelines Experience with Azure Platform including Data Factory, Synapse, and Data Lake DB development experience with SQL Software development experience in relevant languages like Python more »
strategy constraints. Essential Skills: The ideal candidates will have a proven Senior Data & Analytics Development background, with the following skills/experience: Knowledge of ETL/ELT, data warehousing/business intelligence methodologies and best practice, including dealing with big data, cloud technology and unstructured data and the relative required more »
a 50/50 split between BI reporting/analysis and data ingestion, and would suit someone with a couple of years' experience across Python, ETL, (AWS, Azure or GCP) and any BI tool (Power BI, Tableau, Qlik, QuickSight, or GoodData - it's more about the transferable skills!). This is an more »
Python and its data processing, analysis, and visualization libraries Experienced with SQL and Timeseries databases Skilled in AWS services: S3, EC2, RDS Knowledgeable in ETL tools like Airflow Proficient in Git, CI/CD, testing tools, and documentation best practices Adheres to quality engineering practices including TDD and BDD Nice more »
/SQL ADF, Databricks and Synapse knowledge required Strong client-facing experience as a data engineer/consultant. Experience designing and developing ELT/ETL processes Experience with Azure and/or SQL Server stack (SSIS, SSRS, SSAS) A willingness to develop Azure Platform/services skills Relevant Microsoft Certifications more »
high availability and disaster recovery scenarios Designing robust implementation, migration, and test plans Database administration Performance optimisation – Queries, indexing, database design, storage Data integration – ETL/ELT Data security Proven troubleshooting skills Scalability Maintenance and operations Licensing What you’ll need Minimum 2 years’ experience with SQL Server more »
Devise practical solutions to business challenges, drawing from industry experience. Spearhead engineering collaboration with actuarial subject matter experts. Guide team members in developing robust ETL processes. Uphold exemplary coding and technical standards. Showcase technical prowess in Python, SQL, and JSON. Implement and oversee a robust CI/CD process. Enhance more »
and Sell additional annual leave Funded Learning and development programmes The Successful Data Engineer will have: Experience in developing and designing data pipelines and ETL processes Previous experience of data warehousing Utilise SAS for data reporting and analysis Skilled in SQL, Python, Git and Databricks In depth experience of reporting more »