initiatives. Qualifications: Bachelor's degree or higher in Computer Science, Engineering, or a related field. Proven experience in designing, building, and optimizing data pipelines and ETL processes. Proficiency in programming languages such as Python, Java, or Scala. Strong SQL skills and experience working with relational and NoSQL databases. Familiarity with containerization …
Greater London, England, United Kingdom Hybrid / WFH Options
Xcede
a strong preference for Snowflake experience, but are open to those from a GCP/AWS/Azure background more generally. Develop high-performance ETL/ELT processes for batch and (ideally) real-time data integration. Ensure optimal extraction, transformation, loading (ETL) processes by implementing quality checks and balances. Collaborate …
per day - London/remote - 12 months Strong programming skills in GoLang, with experience building APIs and knowledge of Data warehousing/ETL, to design and develop the data pipelines for a Group data consolidation platform built on Google Cloud. Duties: Assess and analyse data ingestion, transformation & storage layers … programming. Able to build APIs to integrate different sources. Perform unit testing and validate the transformations. Deploy code using the CI/CD pipelines built. Essential Skills: Extensive ETL and Data warehousing experience Strong experience in GoLang programming Google Cloud Services and BigQuery …
London IR35 determination - Inside Duration - 1 year Strong programming skills in GoLang, with experience building APIs and knowledge of Data warehousing/ETL, to design and develop the data pipelines for a Group data consolidation platform built on Google Cloud. Assess and analyse data ingestion, transformation & storage layers (RAW …
strong ability to learn new skills quickly. Key responsibilities include designing and developing scalable data models and warehouses, creating and maintaining database objects, and ETL processes for loading data into Snowflake. Proven proficiency in SQL, database optimisation, and effective database security implementation is a must (working closely with Data Science … modelling skills. Knowledge of the Data Engineering and ML capabilities within either Azure or AWS. Familiarity with DataOps concepts and data mining. Experience with ETL processes and data integration. Experience with Python. Familiarity with cloud data warehousing concepts. Strong problem-solving and communication skills. Proficiency in scripting for automating deployment …
environment requires a more robust structure, so much of the role will focus on improving: - Data Governance - Data Warehousing - Developing and implementing automation processes - ETL activities - Advanced analytics (involving forecasting and predictive analytics) Clearly, these will not all be achieved by one single person, nor all at once, but it …
in R, Python, and SQL for data manipulation and analysis. Experience with Enterprise Data Management (EDM) tools and concepts. Strong understanding of data modeling, ETL processes, and data quality management. Familiarity with financial services or asset management industry preferred. Excellent problem-solving skills and attention to detail. Ability to work …
pipelines Extensive experience of data mapping and data transformation Experience in building optimised data pipelines Experience of ETL tools (Azure Data Factory highly desirable) Experience of commercial off-the-shelf database packages, e.g. Microsoft SQL Server Experience of Microsoft Power Platform Working knowledge …
with immediate effect. Data Engineer Responsibilities Design and implement scalable data architectures for large-scale web-scraping tools. Develop and optimize data pipelines for ETL from online platforms, including textual and unstructured data, and REST APIs. Test, validate, and secure acquired data to ensure accuracy and quality. Work closely with …
London, England, United Kingdom Hybrid / WFH Options
Ripple Labs Inc
implement the key payment data models for various analytics, ML/AI solutions, and data-centric product features, which includes building batch/stream ETL pipelines for payment golden datasets, creating a unified data monitoring and alerting system for operational excellence, and setting up the standard for payment data governance. Successful …
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
responsible for accessing, validating, and querying data from various repositories using available tools. Build and maintain data integration processes using SQL Services and other ETL/ELT processes and scripting tools, as well as ongoing requests and projects related to the data warehouse, MI, or fast-moving financial data. Designing … Architecting, building, testing, and maintaining the data platform. Develop and support a wide range of data transformations and migrations for the whole business. Construct custom ETL processes: design and implement data pipelines, data marts, and schemas, access versatile data sources, and apply data quality measures. Monitoring the complete process and applying … ML is a plus Experience with Azure SQL Database, Cosmos DB, NoSQL, MongoDB Experience with Agile, DevOps methodologies Awareness and knowledge of ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and Integration testing Skills and Abilities Knowledge of Python, SQL, SSIS, and …
Technologies such as Docker and orchestration tools like Kubernetes for containerized deployments. Workflow management tools such as Airflow for orchestrating complex data pipelines and ETL processes. Certifications in Azure cloud services and data engineering technologies, demonstrating expertise and proficiency in the Azure ecosystem. Rewards & Benefits: TCS is consistently voted a …
support organizational growth. Essential Requirements: Minimum 5 years of experience in a similar role. Proven track record in designing and building data infrastructure and ETL pipelines. Proficiency in Azure Platform, including Data Lake, Data Factory, Synapse, Logic Apps, and Function Apps. SQL Server, including Stored Procedures, T-SQL, or similar …
technology, Computer Science, or a related field, or proven continual relevant learning. Knowledge of AI and machine learning processes. Understanding of geospatial data and ETL (Desirable). Project management experience, Agile (Desirable). Interested? As well as a starting salary circa £60,000, the role offers a fantastic range of benefits, flexible …
SSAS). Ability to translate business requirements into technical solutions. Skills in data discovery tools (Power BI, QlikView, Tableau). Experience with data modeling, ETL development, and Kimball methodology. Awareness of Big Data and Data Science technologies. UK applicants only. Contact Faye at 0203 800 0792 or submit your CV.
Proficiency in designing and implementing data warehouse solutions using Snowflake, including performance tuning and optimization. Strong understanding of data modeling, data integration patterns, and ETL processes. Experience with data governance, data security, and data quality management. Excellent communication and collaboration skills, with the ability to work effectively in a cross …
your problem-solving skills and creativity. Google Cloud Professional Cloud Architect or Professional Cloud Developer certification. Highly desirable: hands-on experience with ETL tools, Hadoop-based technologies (e.g., Spark), and batch/streaming data pipelines (e.g., Beam, Flink). Proven expertise in designing and constructing data lakes and …
pipelines. As well as being an expert in the cloud platform, you’ll have a strong background in Data Ingestion and Integration, designing and implementing ETL pipelines on various technologies, Data Quality monitoring, and a rounded understanding of data operations. Aviva believes strongly in experimentation leading to industrialisation, and we are …
experience in data analytics. Strong understanding of data science and machine learning. Proficiency in data preparation, cleansing, and transformation. Knowledge of data integration and ETL processes. Skills in Excel, Python (3+ years), SQL (3+ years), MongoDB, API, JSON, and Power BI. Strong communication and project management skills. Nice to haves …
Isleworth, England, United Kingdom Hybrid / WFH Options
WNTD
of data governance concepts and practices. Expertise in data modeling, metadata management, and data lineage. Strong background in data warehouse design and SQL Server ETL processes. Proficiency in Power BI for reporting. Desirable Skills: Experience with Nexthink, SNOW, and JAMF software. Accreditations: Microsoft Certified Data Analyst Microsoft Certified Data Engineer …
financial services. Strong leadership skills with a track record of successfully managing and developing high-performing teams. In-depth knowledge of data engineering concepts, ETL processes, and data warehouse architectures. Expertise in working with big data technologies and cloud platforms (preferably AWS or Azure). Familiarity with asset management industry …
up with your own solution. You will be building dashboards in Python as well as being in charge of the transformation stage of ETL pipelines. Experience-wise, you’ll need to be a SQL Server expert with strong architecture skills, and also strong skills with Python and DBT. The …
Your Responsibilities in this Role will be: Take ownership of designing and developing scalable data models and warehouses Create and maintain database objects and ETL processes Optimize data ingestion and transformation processes Collaborate with stakeholders Ensure effective database optimization and security implementation Champion data quality and integrity Requirements: SQL Python …