Spark, Databricks, or similar data processing tools. Strong technical proficiency in data modeling, SQL, NoSQL databases, and data warehousing. Hands-on experience with data pipeline development, ETL processes, and big data technologies (e.g., Hadoop, Spark, Kafka). Proficiency in cloud platforms such as AWS, Azure, or Google Cloud and cloud-based …
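To make the Spark requirement above concrete, here is a minimal PySpark extract-transform-load sketch. The input file, column names, and output path are illustrative assumptions, not details from the listing.

```python
# Minimal PySpark ETL sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read a raw CSV with a header row, inferring column types.
raw = spark.read.csv("orders.csv", header=True, inferSchema=True)

# Transform: drop rows missing key fields, then aggregate per region.
clean = raw.dropna(subset=["amount", "region"])
summary = clean.groupBy("region").agg(F.sum("amount").alias("total_amount"))

# Load: write the aggregate out as Parquet for downstream consumers.
summary.write.mode("overwrite").parquet("output/region_totals")
spark.stop()
```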
generation technology. Data Support: Assist the Head of Data and Technology as a Subject Matter Expert in Data. Act as a senior custodian of the WRBU Data Platform: administering ETL/ELT pipelines, completing peer/code reviews, ensuring the integrity of WRBU data, and holding accountability for data integrity and accuracy. Data Warehousing: Help manage aspects of the data warehousing landscape and data movements, including assisting with solution design for new and existing requirements. Collaborate with the business to understand needs. Design and develop ETL/ELT processes, data lake storage architecture, data warehouse, data marts, cubes, reports and dashboards, in line with the company data management framework and enterprise data strategy. MI Development and Maintenance: Support the development …
plus. Experience in data modeling and warehouse architecture design. Experience with data orchestration and workflow management tools such as Airflow. Solid understanding of data ingestion, transformation, and ELT/ETL practices. Expertise in data transformation frameworks such as DBT. Experience with Infrastructure as Code (IaC) tools, Terraform preferred. Experience with version control systems, GitHub preferred. Experience with continuous integration and delivery …
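As a concrete illustration of the orchestration tooling listed above, a minimal Airflow DAG (TaskFlow API; the `schedule` argument assumes Airflow 2.4+) might look like the sketch below. The task bodies and the daily schedule are placeholder assumptions.

```python
# Minimal Airflow DAG sketch; task logic and schedule are hypothetical.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def elt_pipeline():
    @task
    def extract() -> list:
        # Placeholder: a real task would query a source API or database.
        return [{"id": 1, "value": 10}, {"id": 2, "value": -3}]

    @task
    def transform(rows: list) -> list:
        # Apply a simple business rule: keep positive values only.
        return [r for r in rows if r["value"] > 0]

    @task
    def load(rows: list) -> None:
        # Placeholder: a real task would write to the warehouse.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))

elt_pipeline()
```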
Business Intelligence solutions using Alteryx for data preparation, SQL for data transformation and storage, Tableau for data visualization, and R/Python for advanced analytics. • Design, implement, and maintain ETL pipelines using Alteryx to extract data from diverse source systems (e.g., relational databases, APIs, flat files), transform it according to defined business rules and data quality standards, and load it … or Master's degree in Computer Science, Statistics, Mathematics, Engineering, or a related quantitative field. • Experience: 1+ years of hands-on experience in Business Intelligence and Digital Transformation projects. • ETL Expertise: Proficiency in Alteryx Designer, including the ability to design and implement complex ETL workflows, and optimize performance. • Data Visualization Proficiency: Proficiency in Tableau Desktop, including the ability to create …
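Alteryx workflows are built visually, but the extract-transform-load pattern the bullet describes can be sketched in Python with pandas. The database file, table, CSV, and business rule below are hypothetical stand-ins, not anything from the posting.

```python
# Pandas stand-in for a multi-source ETL workflow; sources are hypothetical.
import sqlite3
import pandas as pd

# Extract: one relational source and one flat file.
conn = sqlite3.connect("sales.db")
db_rows = pd.read_sql_query("SELECT order_id, amount FROM orders", conn)
file_rows = pd.read_csv("legacy_orders.csv")

# Transform: combine, then enforce simple data-quality rules.
combined = pd.concat([db_rows, file_rows], ignore_index=True)
combined = combined[combined["amount"] > 0].drop_duplicates("order_id")

# Load: write the cleaned result to a reporting table.
combined.to_sql("orders_clean", conn, if_exists="replace", index=False)
conn.close()
```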
Required Skills: Proficiency in BI tools (Power BI, Tableau, QlikView) and data visualization techniques. Strong SQL skills and experience with relational databases (e.g., SQL Server, Oracle, MySQL). Familiarity with ETL processes and data warehousing concepts. Experience in data modeling and designing effective reporting structures. Knowledge of programming languages such as Python or R for data manipulation and analysis. Strong analytical …
of actuaries, data scientists and developers. Our role in this mission is to pioneer advancements in the field of pensions and beyond, leveraging state-of-the-art technology to extract valuable and timely insights from data. This enables the consultant to better advise Trustees and Corporate clients on a wide range of actuarial-related areas. The Role As a Machine … model drifts, data-quality alerts, scheduled re-training pipelines. Data Management and Preprocessing. Collect, clean and preprocess large datasets to facilitate analysis and model training. Implement data pipelines and ETL processes to ensure data availability and quality. Software Development. Write clean, efficient and scalable code in Python. Utilize CI/CD practices for version control, testing and code review. Work …
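The drift monitoring mentioned in this listing can be as simple as comparing feature statistics between the training window and live data. Below is a minimal sketch; the mean-shift test and the 0.1 threshold are illustrative choices, not the firm's actual method.

```python
# Simple data-drift check a scheduled re-training pipeline might run.
import numpy as np

def mean_shift_drift(reference: np.ndarray, current: np.ndarray,
                     threshold: float = 0.1) -> bool:
    """Flag drift when the feature mean moves by more than `threshold`
    reference standard deviations."""
    ref_std = reference.std() or 1.0  # guard against zero variance
    shift = abs(current.mean() - reference.mean()) / ref_std
    return shift > threshold

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=10_000)  # training-time feature
current = rng.normal(0.3, 1.0, size=10_000)    # live feature window
print("drift detected:", mean_shift_drift(reference, current))
```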
City of London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
SQL, Spark SQL, and Python for data processing and automation. Knowledge of Microsoft Fabric and Azure Data Factory would be useful but not essential. Power BI. Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics. Experience working with large, complex datasets and APIs across formats. Familiarity with workflow automation tools (e.g., Power Automate) and/…
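For the API-ingestion side of this role, a minimal Python sketch is shown below; the endpoint URL, response shape, and target table are hypothetical placeholders.

```python
# Pull JSON from an API and land it in a SQL table; names are hypothetical.
import sqlite3
import requests

resp = requests.get("https://api.example.com/v1/metrics", timeout=30)
resp.raise_for_status()
records = resp.json()  # assumed: a list of {"name": ..., "value": ...} objects

conn = sqlite3.connect("analytics.db")
conn.execute("CREATE TABLE IF NOT EXISTS metrics (name TEXT, value REAL)")
conn.executemany(
    "INSERT INTO metrics (name, value) VALUES (:name, :value)", records)
conn.commit()
conn.close()
```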
Gaming, and TransferGo. Our products are recognised by industry leaders like Gartner's Magic Quadrant, Forrester Wave and Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, DBT, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you will be doing: Designing and developing scalable and efficient data pipelines, ETL processes, and data integration solutions to support data ingestion, processing, and storage needs. Ensuring data quality and reliability by implementing data validation, data cleansing, and data quality monitoring processes. Optimising database performance by tuning queries, implementing indexing strategies, and monitoring and analysing system performance metrics. Collaborating with stakeholders to define data strategies, implement data governance policies, and ensure data security and compliance. About you: Strong technical proficiency in data engineering technologies, such as Apache Airflow, ClickHouse, ETL tools, and SQL databases. Deep understanding of data modeling, ETL processes, data integration, and data warehousing concepts. Proficiency in programming languages commonly used in data engineering, such as Python, Java …
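The data-validation responsibility described above often comes down to rule-based row filtering, with rejected rows kept for monitoring. A minimal sketch, with assumed rule names and row shapes:

```python
# Rule-based row validation sketch; rules and row shapes are assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]

RULES = [
    Rule("id present", lambda r: r.get("id") is not None),
    Rule("amount non-negative", lambda r: r.get("amount", 0) >= 0),
]

def validate(rows):
    """Split rows into valid and rejected-with-reason for monitoring."""
    valid, rejected = [], []
    for row in rows:
        failed = next((rule.name for rule in RULES if not rule.check(row)), None)
        if failed:
            rejected.append((row, failed))
        else:
            valid.append(row)
    return valid, rejected

good, bad = validate([{"id": 1, "amount": 5}, {"id": None, "amount": -2}])
print(len(good), "valid;", bad)
```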
London, South East, England, United Kingdom Hybrid / WFH Options
Morgan Philips Specialist Recruitment
platform. Strong Data Warehouse experience: hands-on experience with designing, building, and managing DWs, ideally using Kimball methodology. Hands-on with Azure tooling: expertise in Azure Data Factory (ADF) for ETL processes, Azure SQL Server, Azure DevOps, Key Vaults, and Logic Apps. Experience in optimizing data loads to ensure reliability, scalability, and high performance, with a focus on reducing processing times … a solid understanding of advanced data technologies and practices. Hands-on with integration, such as integrating data from a range of sources, ensuring data accuracy and consistency throughout the ETL process. The ability to work closely with the wider business, stakeholders, and other teams to align data solutions with business objectives. Familiarity with CI/CD pipelines and agile workflows. …
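The load-optimization point above usually means incremental rather than full loads. The sketch below shows the watermark pattern an ADF pipeline commonly implements, using SQLite as a stand-in; the table and column names are assumptions.

```python
# Watermark-based incremental load sketch; schema names are hypothetical.
import sqlite3

conn = sqlite3.connect("warehouse.db")
conn.executescript("""
    CREATE TABLE IF NOT EXISTS staging_orders (id INTEGER, updated_at TEXT);
    CREATE TABLE IF NOT EXISTS dw_orders (id INTEGER, updated_at TEXT);
    CREATE TABLE IF NOT EXISTS watermarks (name TEXT PRIMARY KEY, value TEXT);
""")

# Read the last high-water mark; fall back to the epoch on the first run.
row = conn.execute("SELECT value FROM watermarks WHERE name = 'orders'").fetchone()
last_mark = row[0] if row else "1970-01-01T00:00:00"

# Move only rows changed since the last run, then advance the mark.
conn.execute(
    "INSERT INTO dw_orders SELECT id, updated_at FROM staging_orders "
    "WHERE updated_at > ?", (last_mark,))
conn.execute(
    "INSERT OR REPLACE INTO watermarks (name, value) VALUES ('orders', "
    "(SELECT COALESCE(MAX(updated_at), ?) FROM staging_orders))", (last_mark,))
conn.commit()
conn.close()
```

Because only rows newer than the stored watermark move on each run, processing time scales with change volume rather than full table size.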
Translating business requirements into technical solutions and producing specifications. Designing and implementing business intelligence and modern data analytics platform technical solutions. Data architecture design and implementation. Data modelling, ETL, data integration and data migration design and implementation. Master data management system and process design and implementation. Data quality system and process design and implementation. Major focus on data science … data modelling, data staging, and data extraction processes, including data warehouse and cloud infrastructure. Experience with multi-dimensional design, star schemas, facts and dimensions. Experience and demonstrated competencies in ETL development techniques. Experience in data warehouse performance optimization. Experience on projects across a variety of industry sectors an advantage. Comprehensive understanding of data management best practices, including demonstrated experience with …
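For readers unfamiliar with the star schemas mentioned above: a fact table holds measures keyed to surrounding dimension tables. A tiny self-contained sketch, with invented table names:

```python
# One-fact, two-dimension star schema sketch; names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity INTEGER,
        amount REAL
    );
    INSERT INTO dim_date VALUES (20240101, '2024-01-01');
    INSERT INTO dim_product VALUES (1, 'widget');
    INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97);
""")

# A typical star-join query: slice the fact table by its dimensions.
for row in conn.execute("""
    SELECT d.iso_date, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.iso_date, p.name
"""):
    print(row)
```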
QUALIFICATIONS - Experience in analyzing and interpreting data with Redshift, Oracle, NoSQL etc. - Experience with data visualization using Tableau, QuickSight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience in Statistical Analysis packages such as R, SAS and MATLAB - Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling PREFERRED QUALIFICATIONS - Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift - Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your …
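The SQL-pull-plus-Python-processing combination this listing asks for looks roughly like the sketch below, with an in-memory SQLite table standing in for Redshift or Oracle and invented column names.

```python
# Pull with SQL, then shape modeling features in pandas; schema is assumed.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (customer_id INTEGER, spend REAL, region TEXT);
    INSERT INTO sales VALUES (1, 120.0, 'EU'), (2, 80.0, 'US'), (3, 200.0, 'EU');
""")
df = pd.read_sql_query("SELECT customer_id, spend, region FROM sales", conn)
conn.close()

# Standardise spend and one-hot encode region for a downstream model.
df["spend_z"] = (df["spend"] - df["spend"].mean()) / df["spend"].std()
features = pd.get_dummies(df, columns=["region"])
print(features)
```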
years of SQL programming experience, including relational schema design and query optimization. 2+ years using Python or similar scripting languages for data science. Experience with data processes and building ETL pipelines. Experience with cloud data warehouses such as Snowflake, Azure Data Warehouse, Amazon Redshift, or Google BigQuery. Proficiency in creating visualizations using Power BI or Tableau. Experience designing ETL/…
following product/solution development lifecycles using frameworks/methodologies such as Agile, SAFe, DevOps and use of associated tooling (e.g., version control, task tracking). · Demonstrable experience writing ETL scripts and code to make sure the ETL processes perform optimally. · Experience in other programming languages for data manipulation (e.g., Python, Scala). · Extensive experience of data engineering and the …
workshops, and shape pre-sales proposals (40–50% of the role). Advise senior client stakeholders (CDOs, CIOs, Heads of Data) on data strategy, governance, and platform modernisation. Design robust ETL/ELT frameworks with Azure Data Factory, SSIS, Informatica, or IBM DataStage. Lead data modelling using ERwin, ER/Studio, or PowerDesigner. Implement data governance and quality frameworks using Unity …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Scala. Experience with AWS RDS, AWS Glue, AWS Kinesis, AWS S3, Redis and/or Azure SQL Database, Azure Data Lake Storage, Azure Data Factory. Knowledge of data modelling, ETL processes, and data warehousing principles. Preferred Skills: Strong problem-solving and analytical skills. Experience with cloud databases, cloud data services, messaging/data streaming & big data technologies. Familiarity with …
workshops, and shape pre-sales proposals (40–50% of the role). Advise senior client stakeholders (CDOs, CIOs, Heads of Data) on data strategy, governance, and platform modernisation. Design robust ETL/ELT frameworks with Azure Data Factory, SSIS, Informatica, or IBM DataStage. Implement data governance and quality frameworks using Unity Catalog, Profisee, Alation, or similar platforms. Provide leadership and mentoring …
Owner in Agile environments. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong understanding of data quality frameworks, data contracts, and lineage. Proficient in using analytics and BI tools …
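A data contract of the kind mentioned above can start as nothing more than a typed field list checked at ingestion. A minimal sketch, with a hypothetical contract:

```python
# Minimal data-contract check; the contract fields are invented examples.
CONTRACT = {"user_id": int, "email": str, "opted_in": bool}

def meets_contract(record: dict) -> bool:
    """True when every contracted field is present with the right type."""
    return all(
        field in record and isinstance(record[field], expected)
        for field, expected in CONTRACT.items()
    )

print(meets_contract({"user_id": 7, "email": "a@b.com", "opted_in": True}))
print(meets_contract({"user_id": "7", "email": "a@b.com"}))  # wrong type, missing field
```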
various sources and supports diverse analytical needs, while optimizing costs and meeting business requirements. Implementing Data Engineering Pipelines: Design and develop data pipelines for data extraction, transformation, and loading (ETL) processes, ensuring data quality and consistency. Enabling Data Intelligence and Analytics: Build and maintain data warehouses, data marts, and data lakes to support business intelligence and data analytics initiatives. Supporting …
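One common way to enforce the consistency requirement just described is a post-load reconciliation step; the sketch below compares source and target row counts, with invented table names.

```python
# Post-load reconciliation sketch: accept a load only if row counts match.
import sqlite3

def rowcount(conn, table):
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_events (id INTEGER);
    CREATE TABLE target_events (id INTEGER);
    INSERT INTO source_events VALUES (1), (2), (3);
    INSERT INTO target_events SELECT id FROM source_events;  -- the "load"
""")

src, tgt = rowcount(conn, "source_events"), rowcount(conn, "target_events")
if src != tgt:
    raise RuntimeError(f"load incomplete: {src} source rows, {tgt} loaded")
print(f"reconciled: {tgt} rows loaded")
```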