build and maintain batch or real-time data pipelines in production, re-engineering manual flows and optimising code to ensure processes perform optimally. • Develop ETL (extract, transform, load) processes to help extract and manipulate data from multiple sources. • Write ETL/ELT scripts and code to ensure the ETL/… optimally (T-SQL stored procedures, Azure ADF/Synapse Pipelines, Azure Databricks/Synapse Notebooks) • Automate data workflows such as data ingestion, aggregation, and ETL/ELT processing. • Implement data flows to connect operational systems, data for analytics, and business intelligence (BI) systems • Recognise opportunities to reuse existing data flows …
Working knowledge of database management, big data solutions, or data analytics in a cloud environment. • Strong understanding of data integration techniques, data pipelines, and ETL/ELT processes within Microsoft Fabric's ecosystem. EXPERIENCE • Proven experience in Business Intelligence and data analytics, particularly in a Microsoft Fabric environment. • Hands-on …
to deliver impactful data products for our retail chain. Key responsibilities: ETL Pipeline Development: Develop, optimize, and maintain ETL pipelines to efficiently extract, transform, and load data from various sources, ensuring high data quality. Monitor and troubleshoot production data pipelines, ensuring their performance and reliability. Mentor junior engineers and lead … such as AWS/Azure and DBT with Snowflake to build and maintain scalable data solutions. Your Profile Key skills/knowledge/experience: ETL/ELT & Data Pipelines: Solid understanding of ETL/ELT processes, along with hands-on experience building and maintaining data pipelines using DBT, Snowflake, Python …
Data Factory, Azure Synapse Analytics, Azure Data Lake, Azure Databricks, and Azure SQL. Proficiency in SQL, Python, or Scala for data processing. Experience with ETL/ELT design patterns and building scalable, distributed data architectures. Deep understanding of data modelling, data warehousing, and data integration best practices. Knowledge of Azure …
and maintaining robust data infrastructure and pipelines, ensuring efficient data processing and availability for analysis and decision-making. Responsibilities: Design, develop, and maintain scalable ETL pipelines using cloud technologies, ensuring efficient data ingestion, processing, and storage. Implement data quality checks, monitoring systems, and data governance practices, troubleshooting and resolving data …
Engineer Immediate Responsibilities Managing and delivering end-to-end BI reporting projects from initial brief/requirement gathering to data pipeline development (APIs/ETL/ELT/SQL) to visualisation. Undertaking ad hoc BI tasks based on business requirements and priorities. As part of the BI team you will …
data stores. Hands-on experience with SQL and/or PL/SQL. Strong understanding of data architecture concepts (data warehouse, data mart, ODS, ETL, reporting, analytics). Familiarity with data governance and security best practices. Ability to work independently and as part of a team. Experience with data visualization …
ERP, with experience of working from D365 being beneficial for our project work. Experience of data lineage testing to ensure data quality throughout the ETL process, checking work as it is developed. Strong collaborative mindset with a willingness to work across various teams. Passion for continuous learning and …
flows (Desirable). Will be an effective communicator and able to work with stakeholders cross-functionally. Knowledge of cloud computing platforms, in particular Azure Databricks, ETL and ELT processes, as well as experience with big data technologies and frameworks, a plus.
driven insights and solutions. - SQL & Data Queries: Write and optimize SQL queries to retrieve, manipulate, and analyze data stored in databases. - Data Transformation: Use ETL (Extract, Transform, Load) processes to clean, transform, and integrate data from various sources. - Collaboration: Collaborate with cross-functional teams including business, finance, and IT to … other BI tools) and data visualization best practices. - Strong knowledge of SQL and database management. - Experience with data cleaning and data wrangling. - Familiarity with ETL processes and data pipelines. - Ability to work independently and manage multiple projects simultaneously. - Strong problem-solving and analytical thinking skills. - Excellent verbal and written communication …
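The extract-transform-load responsibilities described above can be sketched in a few lines of plain Python. The source data, table name, and cleaning rules here are illustrative assumptions, not taken from any employer's actual systems:

```python
import csv
import io
import sqlite3

# Hypothetical source: a CSV export of raw sales records.
RAW_CSV = """order_id,amount,region
1001,250.00,North
1002,,South
1003,99.50,north
"""

def extract(csv_text):
    """Extract: parse rows from the source CSV."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: drop rows with missing amounts, normalise region names."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # basic data-quality check: skip incomplete records
        cleaned.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "region": row["region"].strip().title(),
        })
    return cleaned

def load(rows, conn):
    """Load: write cleaned rows into a target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany(
        "INSERT INTO sales (order_id, amount, region) VALUES (:order_id, :amount, :region)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())  # → (2, 349.5)
```

In production the extract step would read from operational systems or APIs rather than an inline string, and the load target would be a warehouse, but the three-stage shape is the same.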
Chelmsford, England, United Kingdom Hybrid / WFH Options
Senitor Associates
strong data modelling experience (star schema/Kimball), SQL and Python experience Have a proven track record in data modelling, building out data infrastructure and ETL pipelines Strong data visualisation experience in Power BI using DAX Be experienced using Azure data platform tools including Azure Data Lake, Azure Data Factory … semantic data models using Azure Synapse Analytics/Fabric, Spark notebooks. Ensure data model accuracy, scalability, and performance. Use PySpark within Azure notebooks to extract, transform, and load (ETL/ELT) data from raw formats (e.g. Delta, Parquet, CSV) stored in ADLS Gen2. Implement data transformation pipelines and workflows in …
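The star-schema (Kimball) modelling this listing asks for can be illustrated with a minimal sketch: denormalised source rows are split into a dimension table with surrogate keys and a fact table that references it. The table, column, and product names are hypothetical:

```python
# Minimal Kimball-style star-schema split. Source rows, surrogate-key
# scheme, and all names are illustrative assumptions for this sketch.
source_rows = [
    {"date": "2024-01-05", "product": "Widget", "qty": 3, "price": 9.99},
    {"date": "2024-01-05", "product": "Gadget", "qty": 1, "price": 24.50},
    {"date": "2024-01-06", "product": "Widget", "qty": 2, "price": 9.99},
]

def build_star(rows):
    dim_product = {}  # natural key (product name) -> surrogate key
    fact_sales = []
    for row in rows:
        key = dim_product.setdefault(row["product"], len(dim_product) + 1)
        fact_sales.append({
            "date": row["date"],
            "product_key": key,  # facts reference the dimension, not the name
            "qty": row["qty"],
            "revenue": round(row["qty"] * row["price"], 2),
        })
    return dim_product, fact_sales

dim, fact = build_star(source_rows)
print(dim)      # → {'Widget': 1, 'Gadget': 2}
print(fact[2])  # → {'date': '2024-01-06', 'product_key': 1, 'qty': 2, 'revenue': 19.98}
```

Keeping descriptive attributes in the dimension and only keys plus measures in the fact table is what lets BI queries aggregate the facts cheaply and join to the dimension for labels.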
including defining scope, managing timelines, and ensuring successful delivery within budget. ETL Pipeline Development: Develop, optimize, and maintain ETL pipelines to efficiently extract, transform, and load data from various sources, ensuring high data quality. Cloud Infrastructure and Data Integration: Lead the implementation of data solutions on cloud platforms such as … program level, using influence and adaptive communication strategies. Your Profile Key skills/knowledge/experience: Advanced Data Engineering Skills: Proficiency in designing and managing ETL processes using DBT, Python, Terraform, and Airflow. Expertise in Cloud Platforms: In-depth knowledge of Snowflake and Azure, with experience in leveraging these platforms …
Services: Data Factory, Data Lake, Synapse Analytics, Databricks, etc. Strong proficiency in building and managing Power BI dashboards and reports. Expertise in SQL and ETL processes. Solid understanding of data warehousing, data modeling, and data integration. Problem-solving mindset with a passion for optimizing data processes. Excellent communication skills …
Sunderland, Tyne and Wear, Tyne & Wear, United Kingdom Hybrid / WFH Options
Nigel Frank International
development and maintenance of their Azure Data Warehouse, and lead on the ongoing migration of newly acquired data sources. This will involve designing effective ETL processes, and implementing data integration solutions using tools including Azure Data Factory, Azure Databricks, Azure Data Lake, and Synapse Analytics. This role would be well …
Durham, County Durham, United Kingdom Hybrid / WFH Options
Nigel Frank International
development and maintenance of their Azure Data Warehouse, and lead on the ongoing migration of newly acquired data sources. This will involve designing effective ETL processes, and implementing data integration solutions using tools including Azure Data Factory, Azure Databricks, Azure Data Lake, and Synapse Analytics. This role would be well …
state-of-the-art data platform to support the data needs of our rapidly growing company. Design and implement scalable and efficient data pipelines, ETL processes, and data integration solutions to collect, process, and store large volumes of data within AWS. Implement data transformation logic to cleanse, validate, and enrich …
encryptions, etc.). In-depth knowledge and experience with Azure data storage (SQL Server, Data Lake, Synapse, etc.) & access tools, APIs, cloud connectivity, and ETL processes. Knowledge and some experience of MS Office/MS Office 365 suite, SharePoint Online, Power Apps, GitHub, MS Teams, etc. In-depth knowledge & experience …
ultimately deliver data solutions including integrations, data transformations, warehousing and business intelligence reporting. The design and implementation of data ETL pipelines to extract, transform, and load data from client and third-party data sources. Implementing and enforcing data governance policies and processes to ensure data is of the highest quality. … PostgreSQL, MongoDB, Cassandra or similar) Excellent SQL skills. Skills in developing data visualizations (e.g. Power BI) Strong programming skills, ideally in Python. Understanding of ETL pipeline development for data source integrations and data transformations. Understanding of data modelling to describe the data landscape, entities and relationships. Strong communication and teamwork …
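The data-governance enforcement described above might, in its simplest form, look like the following sketch; the rule names, record fields, and thresholds are invented for illustration:

```python
# Illustrative data-quality rules of the kind a governance policy might
# enforce against incoming client data. Field names are assumptions.
RULES = {
    "customer_id present": lambda r: bool(r.get("customer_id")),
    "amount non-negative": lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0,
}

def run_checks(records, rules):
    """Return a report mapping each rule name to indices of failing records."""
    report = {name: [] for name in rules}
    for i, record in enumerate(records):
        for name, check in rules.items():
            if not check(record):
                report[name].append(i)
    return report

records = [
    {"customer_id": "C1", "amount": 120.0},
    {"customer_id": "", "amount": 40.0},
    {"customer_id": "C3", "amount": -5.0},
]
print(run_checks(records, RULES))
# → {'customer_id present': [1], 'amount non-negative': [2]}
```

A real pipeline would typically run such checks on ingestion and either quarantine failing records or fail the load, rather than just reporting; tools like dbt tests express the same idea declaratively.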
compliance with healthcare data regulations and security standards (GDPR, HIPAA). Technical Skills Required Strong proficiency in SQL, Python, and data modelling. Experience with ETL tools and cloud-based platforms (AWS, Google Cloud, or Azure). Familiarity with healthcare data formats (HL7, FHIR) and large-scale databases (e.g., PostgreSQL, NoSQL …
Data Engineer - SAS/SQL/ETL * Unfortunately, sponsorship is not available with this position * Role – Data Engineer (SAS) Salary – Up to £65k DOE Location – Manchester or London (Hybrid) A well-established and award-winning data consultancy is looking to add a Data Engineer skilled in SAS to join their …
SR3, New Silksworth, Sunderland, Tyne & Wear, United Kingdom Hybrid / WFH Options
Avanti Recruitment
in Azure services: SQL, Data Factory, Databricks, Synapse Analytics Strong SQL skills, including query optimization and database design Proven track record in data warehousing, ETL processes, and data modelling Azure certifications in Database Administration or Data Engineering Nice to have: Experience with Python/PySpark and MySQL on Azure Additional …
offering to work with people in a global team spanning Oceania, Asia, Europe, and the Americas. Projects You Will Work On: Data Warehousing • ETL (Extract, Transform, Load) Processes • Data Modelling and Database Management • Data Pipeline Development • Data Quality Assurance • Big Data Technologies (e.g., Hadoop, Spark) • Data Visualization Roles & Responsibilities … Collaborate with experienced data engineering professionals and global team members. Participate in designing and implementing data warehousing solutions. Develop and maintain ETL processes to ensure efficient data flow. Contribute to data modelling efforts for optimized database structures. Assist in building and maintaining data pipelines for real-time and batch processing.