sets to meet business and technical requirements. Process Improvement: Identify and implement process enhancements, automate manual tasks, and optimize data delivery. Data Integration: Build ETL infrastructure to ensure smooth data extraction, transformation, and loading. Collaboration: Work alongside stakeholders, including data scientists and analysts, to meet data infrastructure needs. Data Quality
Microsoft Fabric, including Lakehouse (Delta format), OneLake, Pipelines & Dataflows Gen2, Notebooks (PySpark), Power BI & Semantic Models. Possess a solid understanding of data integration patterns, ETL/ELT, and modern data architectures. Be familiar with CI/CD practices in a data engineering context. Have excellent SQL and Spark (PySpark) skills.
environments, and support data initiatives and future projects. Qualifications: Proficiency in Databricks, Python/PySpark, and SQL/SparkSQL. Experience with Big Data/ETL processes, preferably Spark and Databricks. Expertise in Azure cloud platform. Knowledge of version control systems, preferably Git. Experience with CI/CD pipelines. Knowledge of
statistical methods (e.g. t-test, Chi-squared) - Experience with scripting language (e.g., Python, Java, or R) - Experience building and maintaining basic data artifacts (e.g., ETL, data models, queries) PREFERRED QUALIFICATIONS - Master's degree, or Advanced technical degree - Knowledge of data modeling and data pipeline design - Experience with AWS solutions such
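The statistical methods named above (t-test, Chi-squared) can be sketched in plain Python without external libraries; the sample data below is made up purely for illustration:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples (unequal variances)."""
    va, vb = variance(a), variance(b)  # sample variances (n-1 denominator)
    return (mean(a) - mean(b)) / math.sqrt(va / len(a) + vb / len(b))

def chi_squared(observed, expected):
    """Pearson's chi-squared statistic: sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Illustrative (invented) data: two metric samples and a 2-category count table.
t_stat = welch_t([2.1, 2.5, 2.3, 2.7], [1.8, 2.0, 1.9, 2.2])
x2 = chi_squared([48, 52], [50, 50])  # (4/50) + (4/50) = 0.16
```

In practice a library such as SciPy would also supply the p-values; this sketch only shows the statistics themselves.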
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Smart DCC
organization to address their data needs, craft agile solutions, and continuously improve the data environment. What will you be doing? Design and implement efficient ETL processes for data extraction, transformation, and loading. Build real-time data processing pipelines using platforms like Apache Kafka or cloud-native tools. Optimize batch processing
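The real-time pipeline responsibility above can be sketched without a live Kafka cluster: the generator below stands in for a Kafka consumer loop, and every name (event fields, transformation) is illustrative rather than taken from any real system:

```python
import json

def consume(messages):
    """Stand-in for a Kafka consumer: yields raw messages as they 'arrive'."""
    yield from messages

def transform(record):
    """Per-event transformation: parse the payload and derive a field."""
    event = json.loads(record)
    event["amount_gbp"] = round(event["amount_pence"] / 100, 2)
    return event

def run_pipeline(source, sink):
    """Process events one at a time, appending each result to the sink."""
    for raw in consume(source):
        sink.append(transform(raw))

raw_events = ['{"id": 1, "amount_pence": 1250}', '{"id": 2, "amount_pence": 399}']
out = []
run_pipeline(raw_events, out)
```

With a real broker, `consume` would poll a Kafka topic and the sink would be a downstream store; the event-at-a-time structure is the same.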
Engineer: Advanced SQL expertise is essential for querying, transforming, and managing data within databases to support business insights. Proven experience in developing and optimising ETL/ELT pipelines, particularly with tools like DBT, ensures efficient data transformation and modelling. A strong understanding of data modelling techniques, including star and snowflake
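A star schema, as referenced above, separates a central fact table from dimension tables. A minimal, hypothetical illustration (table and column names invented for the example):

```python
# Flat source rows, invented for illustration.
sales = [
    {"date": "2024-01-02", "product": "Widget", "category": "Tools", "qty": 3},
    {"date": "2024-01-02", "product": "Gadget", "category": "Tools", "qty": 1},
]

# Dimension table: one row per distinct product, keyed by a surrogate id.
dim_product = {}
for row in sales:
    key = row["product"]
    if key not in dim_product:
        dim_product[key] = {"product_id": len(dim_product) + 1,
                            "product": key, "category": row["category"]}

# Fact table: measures plus foreign keys into the dimension.
fact_sales = [{"date": row["date"],
               "product_id": dim_product[row["product"]]["product_id"],
               "qty": row["qty"]} for row in sales]
```

A snowflake schema would go one step further and normalise `category` out of `dim_product` into its own dimension.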
Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure. Utilizing Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations. Use Azure Data Factory and Databricks to assemble large, complex data sets. Implementing data validation and cleansing procedures will ensure the … 5+ years of experience, analytical problem-solving, and collaboration with cross-functional teams. Azure DevOps. Apache Spark, Python. Strong SQL proficiency. Data modeling understanding. ETL processes, Azure Data Factory. Azure Databricks knowledge. Familiarity with data warehousing. Big data technologies. Data governance principles are a plus. Overview: Infosys is a global
encryptions, etc.). In-depth knowledge and experience with Azure data storage (SQL Server, Data lake, Synapse, etc.) & access tools, APIs, cloud connectivity, and ETL processes. Knowledge and some experience of MS Office/MS Office 365 suite, SharePoint Online, Power Apps, GitHub, MS Teams, etc. In-depth knowledge & experience
Eastleigh, Hampshire, United Kingdom Hybrid / WFH Options
OKA Direct Ltd
interactive Power BI dashboards, reports, and data visualizations that meet the needs of stakeholders across various departments. • Data Management & SQL Querying: Utilize SQL to extract, transform, and load (ETL) data from various sources into Power BI. Write complex SQL queries to analyze large datasets and ensure data integrity and accuracy.
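The SQL-driven ETL described above can be sketched with SQLite standing in for the source database; the table, columns, and figures below are hypothetical:

```python
import sqlite3

# Extract: connect to the source (an in-memory database stands in here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "North", 120.0), (2, "South", 80.0), (3, "North", 50.0)])

# Transform in SQL: aggregate per region for the reporting layer.
rows = conn.execute(
    "SELECT region, SUM(total) AS revenue FROM orders "
    "GROUP BY region ORDER BY region").fetchall()

# Load: in a real pipeline this result set would feed Power BI.
report = [{"region": r, "revenue": v} for r, v in rows]
```

The same GROUP BY/aggregate pattern applies unchanged against SQL Server or any other source Power BI connects to.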
your expertise could strengthen our growing data practice. As a data engineer/scientist, you will: Design, implement, and maintain scalable data pipelines and ETL processes Develop and maintain data warehouses and data lakes Implement data quality monitoring and validation systems Create and maintain data documentation and cataloguing systems Optimize
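Data quality monitoring, listed above, often starts with simple rule-based checks; a minimal sketch, with the rules and record fields invented for illustration:

```python
def validate(record, rules):
    """Return the names of rules the record fails; an empty list means clean."""
    return [name for name, check in rules.items() if not check(record)]

# Illustrative rules for a hypothetical customer record.
rules = {
    "has_id": lambda r: bool(r.get("id")),
    "valid_email": lambda r: "@" in r.get("email", ""),
    "non_negative_balance": lambda r: r.get("balance", 0) >= 0,
}

good = {"id": 7, "email": "a@b.com", "balance": 10.0}
bad = {"id": None, "email": "oops", "balance": -5}
failures = validate(bad, rules)  # every rule fails for this record
```

In a production system the failure lists would be aggregated into metrics and alerts rather than inspected by hand.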
a full cloud migration. 🔧 Key Responsibilities Manage and enhance cloud-based data infrastructure (Azure) across multiple international markets Own the development and performance of ETL pipelines, ensuring integrity and consistency Lead the migration of legacy environments into Azure Cloud Maintain and synchronise Dev, Test, and Production environments using DevOps principles
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Ignite Digital Search Limited
analytics models and web applications for internal stakeholders. Drive segmentation modelling projects, improving customer targeting and personalization strategies. Develop and refine data pipelines and ETL processes, enabling efficient data integration into Azure Synapse & Fabric. Play a key role in cloud migration projects, supporting the organization's transition to Azure-based
Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
HipHopTune Media
reporting. Requirements Proven experience with PowerBI and data visualisation. Ideally 5 years plus. Strong knowledge of data warehousing principles. Strong SQL expertise, understanding of ETL, DAX, Power Query. Ability to maintain and enhance semantic models. Knowledge of dimensional data modeling (Kimball), data warehousing concepts, OLAP. Excellent analytical and problem-solving
using Azure D&A stack, Databricks, and Azure Open AI. Proficiency in coding (Python, PL/SQL, Shell Script), relational and non-relational databases, ETL tools like Informatica, and scalable data platforms. Experience with Azure Data and Analytics stack; familiarity with AWS and GCP data solutions. Knowledge of deploying AI
Mars Wrigley Confectionery UK (SLO, WAL, ISB & PAD)
CCPA), and data security best practices. Proven experience in enterprise-level architecture design and implementation. Hands-on knowledge of database systems (SQL/NoSQL), ETL/ELT processes, and data modeling techniques. Exceptional leadership, communication, and stakeholder management skills. Ability to work in fast-paced, agile environments and balance long
plans. What does Leidos need from me? Proven experience in front-end development with a focus on building data-driven applications. Experience working with ETL platforms. Experience with Python and its numerical, data and machine learning libraries. Experience of working in an agile software development environment. Experience estimating task effort
data-driven decision-making and contribute to the success of our data-driven initiatives. Key Responsibilities: Data Integration: Develop and maintain data pipelines to extract, transform, and load (ETL) data from various sources into AWS data stores for both batch and streaming data ingestion. AWS Expertise: Utilize your expertise in … and deliver high-quality data solutions. Automation: Implement automation processes and best practices to streamline data workflows and reduce manual interventions. Must have: AWS, ETL, EMR, Glue, Spark/Scala, Java, Python. Good to have: Cloudera - Spark, Hive, Impala, HDFS, Informatica PowerCenter, Informatica DQ/DG, Snowflake, Erwin. Qualifications: Bachelor … working with AWS services. Proficiency in AWS services like S3, Glue, Redshift, Lambda, and EMR. Knowledge of Cloudera-based Hadoop is a plus. Strong ETL development skills and experience with data integration tools. Knowledge of data modeling, data warehousing, and data transformation techniques. Familiarity with data quality and data governance
charts and pivot tables), - Knowledge of data visualization tools such as QuickSight, Tableau, Power BI or other BI packages - Fluency in SQL and ETL PREFERRED QUALIFICATIONS - Knowledge of data modeling and data pipeline design - Master's degree in Business, Engineering, Statistics, Computer Science, Data Science, Mathematics or related field - Experience