products, like server, portal, IIS, web store and services • Experience of setup and configuration of Oracle views and of using SQL Plus to query and extract data to meet business needs and functions (a query sketch follows below) • Experience of automation and continuous integration with tools such as Azure DevOps Pipelines and PowerShell or Python scripting … and satellite development teams, identifying training requirements, and leading on training of team members, mentoring, or coaching using relevant resources • Experience of carrying out ETL processes via SQL code and/or utilisation of off-the-shelf ETL tools such as Safe Software FME Desktop and Server • Experience of utilising more »
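The query-and-extract task named above can equally be scripted rather than run interactively in SQL Plus. The following is a minimal sketch only, assuming the python-oracledb driver; the DSN, credentials, view name and date filter are placeholders, not details from the listing.

```python
# Minimal sketch: query an Oracle view and extract the result set to CSV.
# The driver (python-oracledb), DSN, credentials and view name are assumptions.
import csv
import datetime
import oracledb

conn = oracledb.connect(user="report_user", password="***", dsn="dbhost/ORCLPDB1")
with conn.cursor() as cur:
    # Equivalent to a SELECT run interactively in SQL Plus, with a bind variable
    cur.execute(
        "SELECT order_id, customer_id, order_total FROM sales_orders_v "
        "WHERE order_date >= :cutoff",
        cutoff=datetime.date(2024, 1, 1),
    )
    with open("orders_extract.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # column names from cursor metadata
        writer.writerows(cur)  # stream rows straight into the CSV
conn.close()
```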
and Machine Learning engineers and it is responsible for supporting data scientists in deploying, maintaining and monitoring an increasing number of Python-based microservices, ETL pipelines, SaaS models, databases and vector stores. The MLOps Lead would need to act as an interface between data scientists, the data & analytics team and more »
Greater London, England, United Kingdom Hybrid / WFH Options
itecopeople
deadlines. Develop and uphold the best practice standards, design patterns and documentation for data management and data engineering. Design, build, and manage data pipelines, ETL processes, and data orchestration workflows that transfer data smoothly across different systems and platforms. Create system design for integrating and managing data. Ensure that you more »
Greater London, England, United Kingdom Hybrid / WFH Options
Source Technology
enhance decision-making processes and optimise the value derived from the data. Unlock the potential of metadata within the data, exploring innovative methods to extract valuable insights. Experience Required: Proficient data engineering experience Experience with Azure Databricks and building ETL pipelines Package: Up to £120,000 Competitive Bonus Market leading more »
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
delivering complex initiatives across the Defence & Security sector. The Role : As a Data Analytics Consultant, you’ll design and build data solutions such as ETL components, data warehouses or data virtualisation implementation. Working closely with client stakeholders to design the source-to-target mappings for large-scale data migrations whilst … the ability to translate business requirements into functional/technical data designs/solutions • Agile and/or DevOps for software development & IT operations • ETL tools such as Informatica, SSIS, Talend or Pentaho • Data governance and data management tools such as Informatica MDM, Informatica AXON, Informatica EDC, and Collibra • MySQL more »
Greater London, England, United Kingdom Hybrid / WFH Options
Xcede
a strong preference for Snowflake experience, but are open to those from a GCP/AWS/Azure background more generally. Develop high-performance ETL/ELT processes for batch and (ideally) real-time data integration. Ensure optimal extraction, transformation, loading (ETL) processes by implementing quality checks and balances. Collaborate more »
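The "quality checks and balances" point above lends itself to a short illustration. The sketch below is a hypothetical pandas validation step run between the transform and load stages; the column names and thresholds are invented for the example.

```python
# Illustrative only: row-level quality checks between the transform and load steps
# of an ETL/ELT job. Column names and thresholds are hypothetical.
import pandas as pd

def validate_batch(df: pd.DataFrame) -> pd.DataFrame:
    # Reject the whole batch if a mandatory key is missing
    if df["customer_id"].isna().any():
        raise ValueError("null customer_id values found; aborting load")

    # Drop exact duplicates rather than loading the same event twice
    df = df.drop_duplicates(subset=["customer_id", "event_ts"])

    # Quarantine out-of-range amounts for review instead of loading them silently
    bad = df[(df["amount"] < 0) | (df["amount"] > 1_000_000)]
    if not bad.empty:
        bad.to_csv("quarantine.csv", index=False)
        df = df.drop(bad.index)

    return df
```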
West London, London, United Kingdom Hybrid / WFH Options
Recruitment Revolution
analytic role, with a focus on analysis, reporting and visualisation in Looker Studio + Highly proficient in GCP/GBQ, SQL, data modelling and ETL processes + Experience with Supermetrics, GitHub, DBT, Google BigQuery and Shopify + Strong communication and presentation skills to effectively convey insights to both technical more »
capabilities and support our Finance business partners in their decision-making processes. Key responsibilities include: Design, build, and maintain efficient, reliable data pipelines using ETL and ELT processes. Ensure the seamless flow and availability of high-quality data across the organisation. Use Snowflake for data storage, processing, and analytics. Optimise more »
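A minimal sketch of the Snowflake storage-and-transform step described above, assuming the snowflake-connector-python package; the account, warehouse, stage and table names are placeholders rather than details from the advert.

```python
# Hedged sketch: load a staged file into Snowflake, then run a SQL transform
# into a reporting table. All identifiers below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",
    user="etl_service",
    password="***",
    warehouse="FINANCE_WH",
    database="FINANCE",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # Load step: ingest a staged CSV into the raw table
    cur.execute(
        "COPY INTO RAW.INVOICES FROM @finance_stage/invoices/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Transform step: SQL pushed down to Snowflake itself (ELT style)
    cur.execute("""
        CREATE OR REPLACE TABLE REPORTING.INVOICE_TOTALS AS
        SELECT customer_id, SUM(amount) AS total_amount
        FROM RAW.INVOICES
        GROUP BY customer_id
    """)
finally:
    conn.close()
```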
model to meet new business requirements • 2+ years managing an analytics engineering team using a scrum/agile methodology • Experience working with commercial data warehouses (Redshift), ETL tools (Dbt), data visualization (Python notebook, Thoughtspot, Looker, Tableau, Hex), and Data Dictionary tools (Atlan) • Demonstrated experience leading 2 or more multi-department analytics projects more »
of team, including ways of working, engineering principles, data governance and best practice. Become an SME on the design, development, and deployment of data ETL pipelines (using Azure Data Factory and other technologies) to access, combine and transform data from on-prem and cloud-based sources. Ensure that all data more »
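Azure Data Factory pipelines are normally authored in the ADF studio or as JSON, but they can also be triggered and monitored from Python. The following is a hedged sketch assuming the azure-identity and azure-mgmt-datafactory SDKs; the subscription, resource group, factory and pipeline names are all invented.

```python
# Hedged sketch: trigger an ADF pipeline run and poll its status from Python.
# All names below (subscription, resource group, factory, pipeline) are hypothetical.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

run = adf.pipelines.create_run(
    resource_group_name="rg-data-platform",
    factory_name="adf-data-platform",
    pipeline_name="pl_copy_onprem_to_lake",
    parameters={"load_date": "2024-06-01"},
)

# Poll until the run finishes, then report its final status
while True:
    status = adf.pipeline_runs.get("rg-data-platform", "adf-data-platform", run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)
print(f"Pipeline finished with status: {status}")
```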
and data transformation. Strong experience in managing and leading data engineering teams. Strong proficiency in Scala, Python, SQL, Snowflake and DBT. In-depth understanding of data modelling, ETL processes, and data warehousing concepts. Experience with cloud-based data platforms (e.g., AWS, Azure, GCP) and containerisation technologies (e.g., Docker, Kubernetes) is a plus. Excellent problem-solving more »
and Sell additional annual leave • Funded Learning and development programmes • The Successful Data Engineer will have: Experience in developing and designing data pipelines and ETL processes • Previous experience of data warehousing • Utilise SAS for data reporting and analysis • Skilled in SQL, Python, Git and Databricks • In depth experience of reporting more »
Strong stakeholder management & communication skills • Experience of working with structured and unstructured datasets • Able to design a range of architectural solutions • Data models and ETL • Data lake and data warehouse end-to-end architecture • Experience within media, publishing, research, or a similar consumer focused industry is highly desirable, but not more »
Provide estimates, work independently and meet deadlines • Manage the releases and the related builds in each environment • Perform development, testing, and support of data ETL/ELT programs (pipelines), using such tools as AWS Glue, Azure Data Factory, Informatica or similar ETL/ELT platforms, as well as knowledge of … learn the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Airflow (a DAG sketch follows below) • Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of timing of data loads, data transformation, and optimization of ETL load performance • Provide production more »
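The scheduling point above can be illustrated with a small Airflow DAG. This is a hedged sketch assuming a recent Airflow 2.x installation; the task bodies, schedule and DAG name are placeholders invented for the example.

```python
# Hedged sketch of an Airflow DAG wiring a nightly extract -> transform -> load
# sequence. Task bodies, schedule and names are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    pass  # e.g. pull rows from the source system

def transform():
    pass  # e.g. apply the logical-to-physical model mapping

def load():
    pass  # e.g. bulk-insert into the warehouse physical layer

with DAG(
    dag_id="nightly_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # nightly at 02:00, after upstream feeds land
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```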
London, England, United Kingdom Hybrid / WFH Options
Pioneer Search
in delivering complex solutions for enterprise-level clients. As well as maintaining robust and scalable applications in Scala, you will implement ETL pipelines to process, transform, and standardise data from various sources, and optimise the performance of Spark applications. Work closely with data scientists, software more »
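The advert describes Scala/Spark work; purely as an illustration, and in the same Python used elsewhere on this page, the sketch below shows the equivalent PySpark pattern of standardising two source feeds to one schema before writing out. The paths and column names are hypothetical.

```python
# Illustrative PySpark stand-in for the Scala/Spark pipeline described above:
# standardise two raw feeds to one schema, de-duplicate, and write curated output.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("standardise_sources").getOrCreate()

orders_csv = spark.read.option("header", True).csv("s3://raw/orders_csv/")
orders_json = spark.read.json("s3://raw/orders_json/")

# Standardise both feeds to a common schema before the union, then de-duplicate
std = (
    orders_csv.select(
        F.col("id").alias("order_id"),
        F.to_date("created").alias("order_date"),
    )
    .unionByName(
        orders_json.select(
            F.col("orderId").alias("order_id"),
            F.to_date("createdAt").alias("order_date"),
        )
    )
    .dropDuplicates(["order_id"])
)

# Repartition before the write to keep output file sizes reasonable (a common Spark tuning step)
std.repartition(200).write.mode("overwrite").parquet("s3://curated/orders/")
```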
Science, Engineering, or a related field. Strong Python development. Experience with Pandas is desirable. 3+ years of experience in data engineering. Experienced in building ETL data pipelines. Relational database experience w/ PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with Object-Oriented Programming more »
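A minimal sketch of the pandas-plus-PostgreSQL combination the advert points at, assuming SQLAlchemy with the psycopg2 driver; the connection string, source file and table names are invented for illustration.

```python
# Minimal pandas-based ETL step loading into PostgreSQL via SQLAlchemy.
# Connection string, source file and table names are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://etl:***@localhost:5432/analytics")

# Extract: read the raw feed
df = pd.read_csv("daily_events.csv", parse_dates=["event_ts"])

# Transform: normalise column names and derive a partition key
df.columns = [c.strip().lower() for c in df.columns]
df["event_date"] = df["event_ts"].dt.date

# Load: append into the target table
df.to_sql("events", engine, schema="staging", if_exists="append", index=False)
```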
to take their next step in their career and wants to take on more responsibility. Technical Skillset: Experience as a data engineer developing and maintaining ETL/ELT processes • Good experience in Data Modeling within a cloud-based data platform • Strong experience with SQL Server • Azure data engineering stack, including Azure Synapse and Azure more »
Science or related fields. Proven experience with Insurance Broking Systems data migration (ideally Acturis). Proficiency in SQL and data manipulation languages. Experience with ETL tools. Strong analytical and critical thinking skills, with a focus on practical solutions. Excellent communication and people skills for conveying data concepts to diverse audiences. more »
and optimize automation processes, effective monitoring, and infrastructure-as-code using Terraform. Collaborate closely with our engineering teams to support the orchestration of our ETL pipelines using Airflow and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for more »
storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes. Our business is growing quickly and with that so … Key Responsibilities Design, construct, install, test, and maintain data pipelines. Ensure systems meet business requirements and industry practices for data integrity and quality. Manage ETL and ELT pipelines across many data sources (CSV/parquet files, API endpoints, etc) Design and build data models for the business end users. Write more »
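A hedged sketch of the BigQuery load-then-transform pattern the listing describes, assuming the google-cloud-bigquery client library; the project, bucket and table names are placeholders.

```python
# Hedged sketch: load a parquet file from Cloud Storage into a staging table,
# then run a SQL transform into a reporting table. Names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

# Load step: parquet file from Cloud Storage into the staging table
load_job = client.load_table_from_uri(
    "gs://raw-bucket/sales/2024-06-01.parquet",
    "my-analytics-project.staging.sales",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition="WRITE_APPEND",
    ),
)
load_job.result()  # wait for the load to finish

# Transform step: SQL pushed down to BigQuery itself (ELT style)
client.query("""
    CREATE OR REPLACE TABLE reporting.daily_sales AS
    SELECT store_id, DATE(sold_at) AS sale_date, SUM(amount) AS revenue
    FROM staging.sales
    GROUP BY store_id, sale_date
""").result()
```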
and performant schemas. Experience with various database technologies like relational databases, NoSQL, and cloud-based data storage solutions. Understanding of data warehousing concepts and ETL (Extract, Transform, Load) processes. Familiarity with tools like Tableau for data visualization. Good to have: Good Knowledge in Erwin Good Knowledge in Magic Draw Rewards more »
and reporting on IWS scheduling objects. Analysis and solution design experience. An example from previous work would be an advantage. Knowledge and experience of ETL concepts. (Specific tools not an issue.) Good programming skills in Javascript & Python. Other languages would be an advantage. SQL/Xquery experience, specific DB not more »
the organization. · Developing and implementing data models and algorithms to support data science and machine learning initiatives. TECHNICAL REQUIREMENTS · Proven track record leading complex ETL and Data Infrastructure projects, as well as designing and building data intensive applications and services. · Experience with data processing and distributed computing frameworks such as more »
Azure and Snowflake, enhancing data capabilities for analytics and science. What you need: 3+ years of hands-on experience as a Data Engineer, building ETL pipelines and managing data lifecycle. Strong Python coding experience. 2+ years of commercial experience developing in Snowflake. Good understanding of cloud principles (ideally Azure but more »