and security standards. Establish and uphold best practice standards, design patterns, and documentation for data management and engineering. Design, build, and manage data pipelines, ETL processes, and data orchestration workflows for seamless data transfer across systems and platforms. Develop system designs for integrating and managing data effectively. Experience and Qualifications more »
a modern cloud-based data stack, further developing the estate. You'll work closely with data scientists deploying models, developing scalable data warehouses, designing ETL pipelines that load data into Snowflake, and solving data engineering problems. You'll play an impactful role in ensuring the business gets the most out of Snowflake. more »
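As an illustration of the kind of pipeline this role describes, here is a minimal sketch of loading staged files into Snowflake with the snowflake-connector-python library; the account, credentials, stage and table names are placeholders, not details from the listing.

```python
# Minimal sketch: load staged CSV files into a Snowflake table.
# Connection details, table and stage names below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder account identifier
    user="etl_user",
    password="********",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # COPY INTO pulls files already uploaded to an internal stage.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @orders_stage
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()
```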
good knowledge of their respective language and writing production-ready code that will be deployed on the cloud. Role: Development of end-to-end ETL Data Pipelines. Architect and build Market Data processing systems that will deal with Tick Data, Order Book data and Trade Data. Working collaboratively with the more »
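To illustrate the kind of market-data processing mentioned above, a minimal pandas sketch that resamples raw tick data into one-minute OHLCV bars; the input file and column names (timestamp, price, size) are assumptions, not details from the role.

```python
# Minimal sketch: aggregate raw tick data into 1-minute OHLCV bars.
# The input file and column names (timestamp, price, size) are assumptions.
import pandas as pd

ticks = pd.read_csv("ticks.csv", parse_dates=["timestamp"])
ticks = ticks.set_index("timestamp").sort_index()

bars = ticks["price"].resample("1min").ohlc()   # open/high/low/close per interval
bars["volume"] = ticks["size"].resample("1min").sum()
bars = bars.dropna(subset=["open"])             # drop intervals with no trades

bars.to_parquet("bars_1min.parquet")
```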
marketing business partners, teammates and leadership; implementing machine learning algorithms, working end-to-end on machine learning pipelines in production; data engineering work on ETL pipelines, crawling APIs and websites, and automating outputs (report generation, workflow automation, Google Sheet interaction); setting and meeting detailed timelines and expectations while executing projects more »
pipelines. As well as being an expert in cloud platforms, you'll have a strong background in Data Ingestion and Integration, designing and implementing ETL pipelines on various technologies, Data Quality monitoring and a rounded understanding of data operations. Aviva believes strongly in experimentation leading to industrialisation and we are more »
or data lakes Data-modelling techniques (Relational, 3NF and dimensional modelling, Kimball, DV 2.0 etc.) Strong experience in building robust and scalable ELT/ETL data pipelines Proficient coding in Python and Apache Spark, expert knowledge of SQL and good experience with shell-scripting languages Working knowledge of orchestration tools … e.g. Apache Airflow Experience of ETL/ELT tooling – for example Pentaho, AWS Glue, DBT, Airflow etc. Git and experience in building CI/CD pipelines DBA experience in an AWS cloud environment managing AWS Aurora and Amazon RDS (MySQL, Postgres, MSSQL) Desired: Experience of financial services or consumer finance IaaC more »
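As a sketch of the orchestration tooling listed above (Airflow scheduling an extract/load step followed by a dbt run), the following assumes Airflow 2.x; the DAG id, schedule and dbt project path are illustrative only.

```python
# Minimal Airflow 2.x DAG sketch: run an extract/load task, then `dbt run`.
# The dag_id, schedule and dbt project path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder for the actual extract/load logic (e.g. API -> S3 -> warehouse).
    print("extract and load complete")


with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run",
    )
    load >> transform
```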
efficient data ingestion, processing, storage, and analysis. The ideal candidate will have a strong background in data engineering, including expertise in building and optimizing ETL processes, data warehousing, and working with big data technologies. Main Responsibilities ● Collaborate with business stakeholders to gather and analyse business requirements, translating them into specific … applications. ● Lead end-to-end processes following the CRISP-DM process model to deliver high-quality datasets and their applications. ● Design and implement robust ETL/ELT pipelines from a variety of data sources, including RDBMS, texts, spreadsheets, and API endpoints, into the data warehouse. ● Utilise workflow management tools to … Azure DevOps and Data Factory is a plus. ● Experience with Docker and Kubernetes for creating portable and scalable environments is a plus. ● Experience with ETL tools like Pentaho Data Integration is a plus. Soft skills ● Analytical thinking with a keen attention to detail and a problem-solving mindset. ● Excellent communication more »
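A compact sketch of the sort of ETL/ELT pipeline described above (pulling from an RDBMS and an API endpoint, then loading into a warehouse staging table); the connection strings, API URL, field names and table names are assumptions for illustration.

```python
# Minimal ETL sketch: extract from an RDBMS and an API endpoint, transform, load.
# Connection strings, the API URL and field names are illustrative placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

source = create_engine("postgresql://user:pass@source-db:5432/app")
warehouse = create_engine("postgresql://user:pass@warehouse:5432/dwh")

# Extract: a relational table plus a JSON API (assumed to return date/rate pairs).
orders = pd.read_sql("SELECT id, customer_id, amount, created_at FROM orders", source)
rates = pd.DataFrame(requests.get("https://api.example.com/fx-rates", timeout=30).json())

# Transform: join daily FX rates onto orders and derive a normalised amount.
orders["order_date"] = pd.to_datetime(orders["created_at"]).dt.date
rates["date"] = pd.to_datetime(rates["date"]).dt.date
merged = orders.merge(rates, how="left", left_on="order_date", right_on="date")
merged["amount_gbp"] = merged["amount"] * merged["rate"]

# Load: append into a staging table in the warehouse.
merged.to_sql("stg_orders", warehouse, schema="staging", if_exists="append", index=False)
```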
Consultants and Solution Architects from the Data Modeling or Analytics software/services industries. Experience with Data Warehouse concepts & Reporting Technologies (e.g. Tableau, Power BI), ETL tools (e.g. Informatica, Unifi, SnapLogic) Experience working with APIs or an API testing framework such as Postman Experience with Business Requirements definition and management, structured more »
SKILLS Proficiency in SQL and data querying for validation and testing purposes. Hands-on experience with Snowflake, Airflow or DBT. Familiarity with data integration, ETL processes, and data governance frameworks. Solid understanding of data structures, relational databases and data modelling concepts. Experience with a CDP (Customer Data Platform) in a data more »
for an individual skilled in Python/Java and AWS to join the talented team ingesting data from internal and external sources, carrying out ETL processes into a data lakehouse. Role: > Work on the Data and Analytics Platform contributing to architectural design, integration and analytical systems > Data pipeline implementation Requirements: > 4+ more »
London, England, United Kingdom Hybrid / WFH Options
iO Associates - UK/EU
governance concepts, practices, and frameworks. Strong technical skills in data modelling, metadata management, and data lineage. Data warehouse design and implementation Experience with SQL Server ETL processes A good knowledge of Power BI reporting Desirable Skills Nexthink Software SNOW Software JAMF Software Accreditations Microsoft Data Analyst Microsoft Data Engineer more »
you will: Act as the go-to person for all things data, understanding client goals. Collect data from online and offline channels, and build ETL data pipelines and data models to report and advise on marketing effectiveness. Help formulate digital media strategies and refine channels through testing and development initiatives. more »
candidates with the following skills and experience: Expert with SQL and Azure Data Engineering Hands-on experience in designing and developing scripts for custom ETL processes and automation in Azure Data Factory, Azure Databricks, Azure Synapse, Python, PySpark etc. Experience being customer-facing on numerous data-focused projects with a more »
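A minimal PySpark sketch of the kind of custom ETL script referenced above, as might run in Azure Databricks or Synapse; the storage paths and column names are placeholders, not project specifics.

```python
# Minimal PySpark ETL sketch: read raw files, clean them, and write curated output.
# Storage paths and column names are placeholders, not project specifics.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("custom_etl").getOrCreate()

raw = spark.read.option("header", True).csv(
    "abfss://raw@account.dfs.core.windows.net/sales/"
)

curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)                 # drop invalid rows
       .withColumn("load_date", F.current_date())   # audit column
       .dropDuplicates(["order_id"])
)

curated.write.mode("overwrite").parquet(
    "abfss://curated@account.dfs.core.windows.net/sales/"
)
```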
be involved with project migration and business-critical upgrades. You must have experience with Azure Data Factory – Google Cloud – Airflow, along with SQL and ETL experience. This could be a great career move with personal development plans to grow your career. You will require a blend of the following: - Strong more »
is an exciting opportunity to contribute to the success of a forward-thinking property management company. Responsibilities: Design, develop, and maintain data pipelines and ETL processes using Azure Data Factory. Optimize data storage and retrieval processes to ensure scalability, reliability, and performance. Collaborate with cross-functional teams to understand data more »
qualities 5+ years of experience in data engineering, with a passion for sports. Pro-level skills in Python and Bash scripting. Experience dribbling through ETL processes with tools like Prefect or Airflow. Goal-scoring knowledge of containerization and orchestration tools like Docker and Kubernetes. Captain of the cloud with expertise more »
CI/CD pipelines within Azure DevOps; Working experience of database development methodologies, including hands-on experience with SQL; Working experience with the automation of ETL processes and tools such as SSIS, ADF, Azure Synapse; Highly scalable data processing platform design and implementation experience; Understanding of Infrastructure as Code frameworks e.g. more »
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
the Defence & Security sector. Required Skills/Experience: British Passport Holder with active DV (Developed Vetting Clearance) End-to-end development with data pipelines, ETL processes, and workflow orchestration - using core concepts that apply across tech stacks. Working with diverse data sources and types - batch, streaming, real-time, and unstructured. more »
London, England, United Kingdom Hybrid / WFH Options
Austin Fraser
it's a plus: Cutting-Edge Tech: Experience with containerisation, Kubernetes, and observability platforms. Workflow Wizardry: Familiarity with data orchestration tools like Airflow and ETL with Apache Beam. Data Visionary: Knowledge of DataVault (DV2) and data management concepts. Location: Our opportunities are available in London Victoria and Bracknell. Choose the more »
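For the Apache Beam style of ETL mentioned above, a minimal batch pipeline sketch; the input and output paths and the record layout are assumptions for illustration.

```python
# Minimal Apache Beam batch sketch: read lines, parse, filter, write results.
# File paths and record fields (user_id, type) are assumptions.
import json

import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("events.jsonl")
        | "Parse" >> beam.Map(json.loads)
        | "KeepValid" >> beam.Filter(lambda e: e.get("user_id") is not None)
        | "Project" >> beam.Map(
            lambda e: json.dumps({"user_id": e["user_id"], "event": e["type"]})
        )
        | "Write" >> beam.io.WriteToText("clean_events", file_name_suffix=".jsonl")
    )
```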
in Google Cloud Platform (GCP). Specialized expertise in Asset Management data. Proficient in SQL, with DBT proficiency a mandatory requirement. Proficient in ETL or ELT processes. If interested, please come back with an updated CV. more »
from multiple sources. Advanced skills in Excel as well as data visualization tools like Amazon QuickSight, Tableau, or similar BI tools. Experienced in ETL, data modeling and big data tools. Preferred qualifications Engineering experience. People management experience, managing business intelligence engineers. Experience in designing and delivering cross-functional custom more »
data science, statistical modelling and/or machine learning Knowledge around software concepts Experience building and deploying data pipelines, e.g. in the context of ETL and machine learning applications End-to-end project delivery Proven ability to handle multiple project commitments simultaneously Ability to define tasks clearly, estimating timings and more »
Proficiency in handling and processing large datasets, ensuring data quality and accessibility, including expertise in vectorized datasets. Experience with Data Integration Tools: Proficiency in ETL/ELT tools and practices. Competency in deploying and managing AI solutions on AWS cloud infrastructure. Experience with containerization technologies like Docker. Understanding and application more »
and tools, integrating it into a central data lake or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Data Quality and Reliability: Implement systems to monitor and ensure data quality, making it clean, consistent, and usable. Report Generation and Automation … business requirements. Key Skillsets: Proficiency in database technologies, programming languages (e.g. Python, SQL), cloud services (e.g. AWS, Azure, Google Cloud), and data-modelling/ETL tools (e.g. Redshift, BigTable, Tableau, AWS Glue). Experience with machine learning and strong engineering skills are highly desirable, enabling autonomy in architectural decisions and more »
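As a sketch of the data-quality monitoring responsibility described above, a small set of pandas checks that could run as a pipeline step before data is published; the dataset, column names and thresholds are illustrative assumptions.

```python
# Minimal data-quality sketch: basic checks before a dataset is published.
# The input file, column names and thresholds are illustrative assumptions.
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> list[str]:
    failures = []
    if df.empty:
        failures.append("dataset is empty")
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values found")
    null_rate = df["customer_id"].isna().mean()
    if null_rate > 0.01:
        failures.append(f"customer_id null rate too high: {null_rate:.2%}")
    if (df["amount"] < 0).any():
        failures.append("negative amounts found")
    return failures


df = pd.read_parquet("stg_orders.parquet")
issues = run_quality_checks(df)
if issues:
    raise ValueError("data quality checks failed: " + "; ".join(issues))
```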
to the Data Scientist. Focus on collecting and preparing data for use by Data Scientists and analysts. Solve challenging data integration problems, utilising optimal ETL patterns, frameworks and query techniques, sourcing from structured and unstructured data sources. Support the design, build, and launch of collections of sophisticated data models and visualisations that … with programming languages such as Python, R or Java and with data analysis libraries (e.g. Pandas, NumPy, scikit-learn). Understanding of database technologies (ETL) and SQL proficiency for data manipulation, data mining and querying. Knowledge of Big Data Tools (Spark or Hadoop a plus). Power BI, Dashboard design more »
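To illustrate the data-preparation work described above (getting a cleaned feature table in front of a Data Scientist with pandas and scikit-learn), a brief sketch; the source file, feature names and encoding choices are made up for the example.

```python
# Minimal sketch: prepare a cleaned feature table for downstream modelling.
# The source file, feature names and encoding choices are assumptions.
import pandas as pd
from sklearn.preprocessing import StandardScaler

raw = pd.read_csv("customers.csv")

# Basic cleaning: drop rows missing the target, fill numeric gaps with medians.
clean = raw.dropna(subset=["churned"]).copy()
numeric_cols = ["tenure_months", "monthly_spend"]
clean[numeric_cols] = clean[numeric_cols].fillna(clean[numeric_cols].median())

# Encode a categorical column and scale numeric features.
features = pd.get_dummies(clean[numeric_cols + ["plan_type"]], columns=["plan_type"])
features[numeric_cols] = StandardScaler().fit_transform(features[numeric_cols])

features.to_parquet("features.parquet")
```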