London, England, United Kingdom Hybrid / WFH Options
Anson McCade
delivering complex initiatives across the Defence & Security sector. The Role: As a Data Analytics Consultant, you’ll design and build data solutions such as ETL components, data warehouses or data virtualisation implementations. Working closely with client stakeholders to design the source-to-target mappings for large-scale data migrations whilst … the ability to translate business requirements into functional/technical data designs/solutions • Agile and/or DevOps for software development & IT operations • ETL tools such as Informatica, SSIS, Talend or Pentaho • Data governance and data management tools such as Informatica MDM, Informatica AXON, Informatica EDC, and Collibra • MySQL …
Greater London, England, United Kingdom Hybrid / WFH Options
itecopeople
deadlines. Develop and uphold best practice standards, design patterns and documentation for data management and data engineering. Design, build, and manage data pipelines, ETL processes, and data orchestration workflows that transfer data smoothly across different systems and platforms. Create system designs for integrating and managing data. Ensure that you …
Greater London, England, United Kingdom Hybrid / WFH Options
Source Technology
enhance decision-making processes and optimise the value derived from the data. Unlock the potential of metadata within the data, exploring innovative methods to extract valuable insights. Experience Required: Proficient data engineering experience Experience with Azure Data Factory and building ETL pipelines Package: Up to £120,000 Competitive Bonus Market …
databases, tables, views, stored procedures, user defined functions, SSIS packages and other database objects using SQL Server Experience in Data Extraction, Transformation and Loading (ETL) using MS SQL Server tools and utilities, creating SSIS master and child packages and package configurations. Experienced with tabular reports, matrix reports, parameterized reports using …
Agile and equivalent methodologies; working as part of a development team whose responsibility is to deliver complex data and BI products. Detailed knowledge of ETL, data warehousing, data modelling and experience dealing with big data is expected. Along with good knowledge of data schema structures, design and how they can be … with several bespoke data sources written by other parties and other working areas of the project. Key Knowledge/Skills Detailed working knowledge of ETL/ELT, data warehousing/business intelligence methodologies and best practice including dealing with big data, cloud technology and unstructured data and the relative required … Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Detailed knowledge of developing in Databricks and experience in coding with PySpark. Spark SQL ETL coding standards: ensuring that code is standardised, self-documenting and can be reliably tested Knowledge of best practice data encryption techniques and standards Knowledge of …
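The "standardised, self-documenting, reliably tested" ETL coding standard this listing describes can be illustrated with a minimal sketch. Plain Python stands in for Spark SQL/PySpark here so the function can be unit-tested without a Spark session; the function and field names are hypothetical, not from any specific client codebase:

```python
def clean_orders(rows):
    """Standardise raw order records: trim text fields, coerce amounts
    to float, and drop rows missing a primary key.

    Written as a pure function over plain dicts so it can be tested in
    isolation; the same shape applies to a PySpark DataFrame transform.
    """
    cleaned = []
    for row in rows:
        if not row.get("order_id"):
            continue  # reject rows without a primary key
        cleaned.append({
            "order_id": row["order_id"].strip(),
            "customer": row.get("customer", "").strip().upper(),
            "amount": float(row.get("amount", 0) or 0),
        })
    return cleaned

raw = [
    {"order_id": " A1 ", "customer": " acme ", "amount": "19.99"},
    {"order_id": "", "customer": "ghost", "amount": "5"},  # dropped: no key
]
print(clean_orders(raw))
```

Keeping transforms pure and free of session state is what makes them "reliably tested": each can be exercised with a handful of in-memory rows rather than a full pipeline run.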
bluewaveSELECT have been retained by a global organisation to engage with the right Senior SAP BODS Consultant. Key Responsibilities: Architectural Design Technical Specification ETL Configuration and Installation Key Requirements: BODS experience in Senior positions Data Migration experience Scheduling and management console experience Senior SAP BODS Consultant - 6 months …
City of London, London, United Kingdom Hybrid / WFH Options
Sanderson Recruitment
company and to take your career to the next level: Key skills/Experience: Strong visualisation experience using Power BI Comfortable setting up complex ETL processes Strong experience around SSIS and Azure Data Factory Azure Synapse, Azure Data Factory & Data Lakes DevOps and Deployments knowledge and exposure Solid communication skills …
capabilities and support our Finance business partners in their decision-making processes. Key responsibilities include: Design, build, and maintain efficient, reliable data pipelines using ETL and ELT processes. Ensure the seamless flow and availability of high-quality data across the organisation. Use Snowflake for data storage, processing, and analytics. Optimise …
of team, including ways of working, engineering principles, data governance and best practice. Become an SME on the design, development, and deployment of data ETL pipelines (using Azure Data Factory and other technologies) to access, combine and transform data from on-prem and cloud-based sources. Ensure that all data …
and Machine Learning engineers and it is responsible for supporting data scientists in deploying, maintaining and monitoring an increasing number of Python-based microservices, ETL pipelines, SaaS models, databases and vector stores. The MLOps Lead would need to act as an interface between data scientists, the data & analytics team and …
and Sell additional annual leave Funded Learning and development programmes The Successful Data Engineer will have: Experience in developing and designing data pipelines and ETL processes Previous experience of data warehousing Utilise SAS for data reporting and analysis Skilled in SQL, Python, Git and Databricks In depth experience of reporting …
Strong stakeholder management & communication skills Experience of working with structured and unstructured datasets Able to design a range of architectural solutions Data models and ETL Data lake and data warehouse end to end architecture Experience within media, publishing, research, or a similar consumer focused industry is highly desirable, but not …
Provide estimates, work independently and meet deadlines Manage the releases and the related builds in each environment Perform development, testing, and support of data ETL/ELT programs (pipelines), using such tools as AWS Glue, Azure Data Factory, Informatica or similar ETL/ELT platforms, as well as knowledge of … learn the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Airflow Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of timing of data loads, data transformation, and optimization of ETL load performance. Provide production …
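"Translating logical data models and loading the physical layer" amounts to applying a source-to-target mapping to each incoming record. A minimal sketch of that idea (the mapping entries, column names and transforms here are hypothetical illustrations, not from any specific tool):

```python
# Source-to-target mapping: target column -> (source column, transform).
# In practice this is derived from the logical data model / mapping spec.
MAPPING = {
    "customer_id": ("cust_no", str),
    "full_name":   ("name", str.title),
    "balance_gbp": ("balance", float),
}

def apply_mapping(source_row, mapping=MAPPING):
    """Build one physical-layer (target) row from a source row."""
    return {target: transform(source_row[src])
            for target, (src, transform) in mapping.items()}

row = {"cust_no": 1042, "name": "ada lovelace", "balance": "250.75"}
print(apply_mapping(row))
```

Keeping the mapping as data (rather than hard-coded field assignments) is what lets tools like Informatica or Glue generate and validate the load step from the logical model.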
Science, Engineering, or a related field. Strong Python development. Experience with Pandas is desirable. 3+ years of experience in data engineering. Experienced in building ETL data pipelines. Relational database experience w/PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with Object-Oriented Programming …
Science or related fields. Proven experience with Insurance Broking Systems data migration (ideally Acturis). Proficiency in SQL and data manipulation languages. Experience with ETL tools. Strong analytical and critical thinking skills, with a focus on practical solutions. Excellent communication and people skills for conveying data concepts to diverse audiences.
and optimize automation processes, effective monitoring, and infrastructure-as-code using Terraform. Collaborate closely with our engineering teams to support the orchestration of our ETL pipelines using Airflow and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for …
storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes. Our business is growing quickly and with that so … Key Responsibilities Design, construct, install, test, and maintain data pipelines. Ensure systems meet business requirements and industry practices for data integrity and quality. Manage ETL and ELT pipelines across many data sources (CSV/parquet files, API endpoints, etc) Design and build data models for the business end users. Write …
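The extract/transform/load flow this listing describes (a CSV source landing in a warehouse table) can be shown end to end with standard-library stand-ins — sqlite3 in place of BigQuery, an in-memory string in place of a CSV file. Table and column names are hypothetical:

```python
import csv
import io
import sqlite3

raw_csv = "id,event,value\n1,click,3\n2,view,\n3,click,7\n"

# Extract: parse the CSV source into dict rows.
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: coerce types; default missing values to 0.
events = [(int(r["id"]), r["event"], int(r["value"] or 0)) for r in rows]

# Load: insert into a warehouse-style table (sqlite3 standing in for BigQuery).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (id INTEGER, event TEXT, value INTEGER)")
db.executemany("INSERT INTO events VALUES (?, ?, ?)", events)

total = db.execute(
    "SELECT SUM(value) FROM events WHERE event = 'click'"
).fetchone()[0]
print(total)
```

In an Airflow deployment each of the three stages would typically become its own task, so a failed load can be retried without re-extracting the source.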
and performant schemas. Experience with various database technologies like relational databases, NoSQL, and cloud-based data storage solutions. Understanding of data warehousing concepts and ETL (Extract, Transform, Load) processes. Familiarity with tools like Tableau for data visualization. Good to have: Good knowledge of Erwin; good knowledge of Magic Draw. Rewards …
and reporting on IWS scheduling objects. Analysis and solution design experience. An example from previous work would be an advantage. Knowledge and experience of ETL concepts. (Specific tools not an issue.) Good programming skills in JavaScript & Python. Other languages would be an advantage. SQL/XQuery experience, specific DB not …
to experience level and will find good fits for the best people Strong experience with Python or Rust Experience with Airflow Exposure to building ETL pipelines is a huge plus A desire to learn Rust Solid SQL knowledge Fantastic education Experience working in mission critical environments where speed, reliability and …
North West London, London, United Kingdom Hybrid / WFH Options
Viqu Limited
Bundles (DABs) for streamlining the development of complex data and analytics for the Databricks platform. (IaC) Strong understanding of data warehousing concepts and ETL processes. Excellent problem-solving skills and attention to detail. Ability to work independently in a remote setup with minimal supervision. Role details: Job role: Senior …
Experience with Real Time data analysis and financial systems (preferred). Knowledge of database design principles, performance optimization, and data modelling. Familiarity with data integration, ETL processes, and data warehousing. Excellent problem-solving skills and the ability to work effectively in a fast-paced environment. Strong communication and teamwork skills. A …
Azure and Snowflake, enhancing data capabilities for analytics and science. What you need: 3+ years of hands-on experience as a Data Engineer, building ETL pipelines and managing data lifecycle. Strong Python coding experience. 2+ years of commercial experience developing in Snowflake. Good understanding of cloud principles (ideally Azure but …
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
the Defence & Security sector. Required Skills/Experience: British Passport Holder with active DV (Developed Vetting Clearance) End-to-end development with data pipelines, ETL processes, and workflow orchestration - using core concepts that apply across tech stacks. Working with diverse data sources and types - batch, streaming, real-time, and unstructured.
Greater London, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
experience is paramount. Build and deploy APIs for interconnected services Design APIs for clients and partners (authentication/versioning/documentation) Design and develop ETL pipelines using Python and AWS services Help prioritise our backlog with a balance between short-term deliverables and longer-term investment in our technology Effectively …