managing and mentoring data scientists or AI engineers is desirable. Expert proficiency in Python programming. Strong experience with data engineering principles and practices, including data extraction, transformation and loading (ETL). Solid SQL/BigQuery (BQ) skills for data querying and manipulation. Extensive experience with Google Cloud Platform (GCP) AI/ML services is highly desirable (e.g. Vertex AI, Cloud Functions, …)
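The extract-transform-load pattern this listing asks for can be sketched in a few lines of Python. This is a minimal illustration, not any employer's actual pipeline; the `sales` table and the record fields are hypothetical.

```python
import sqlite3

def extract():
    # Hypothetical source: in practice this would read from an API,
    # a file drop, or a warehouse export.
    return [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "3.25"}]

def transform(rows):
    # Normalise types: cast the textual amount to a float.
    return [(r["id"], float(r["amount"])) for r in rows]

def load(rows, conn):
    # Load into a relational target (SQLite stands in for BigQuery here).
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 13.75
```

Keeping the three stages as separate functions makes each one independently testable, which is the property interviewers usually probe for.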
departments to understand their data needs and provide actionable insights and analysis that drive business decisions. You will have a good understanding of data quality, integrity, modelling, data warehousing, ETL processes and change management, and the successful candidate must be proficient in Power Query, DAX and other Power BI tools. Equally, you must have a sound understanding of SQL language … data sources such as Azure MSSQL, MySQL, Azure Cosmos, Azure Data Lake, Azure OneLake within an enterprise environment. Experience utilising Azure Data Factory and its capabilities for orchestration of ETL and ELT processes. Desirable to have experience of other analytical tools such as Azure Databricks, Azure Synapse Analytics. Experience and knowledge of data governance and compliance including the Data Protection …
Scunthorpe, Crosby, North Lincolnshire, Lincolnshire, United Kingdom
Ongo Recruitment
Atherstone, Warwickshire, West Midlands, United Kingdom Hybrid / WFH Options
Aldi Stores
industry-level best practices. Reporting to the Platform and Engineering Manager, the candidate will be required to design and manage data warehousing solutions, including the development of data models, ETL processes and data integration pipelines for efficient data consolidation, storage and retrieval, providing technical guidance and upskilling for the team, and conducting monitoring and optimisation activities. If you're a hardworking … winning employer, apply to join #TeamAldi today! Your New Role: Project Management of demands and initiatives Lead the design and implementation of data warehousing Design and develop data models, ETL processes and data integration pipelines Complete Data Engineering end-to-end ownership of demand delivery Provide technical guidance for team members Provide 2nd or 3rd level technical support About You …
Tiverton, Devon, South West, United Kingdom Hybrid / WFH Options
Your Tech Future
and KPIs, ensuring consistency and accuracy across reporting. Optimising data models and queries for performance and scalability. Partnering with the Data Engineer on data modelling (Power BI tabular, SSAS) and ETL requirements. Working with business analysts and stakeholders to translate requirements into impactful reporting solutions. Supporting user enablement: training, documentation, and promoting best practice to enable self-service analytics. Helping shape … querying and transforming data, including creating reports in Power BI Reporting Services. Excellent communication skills, able to work with both technical and non-technical stakeholders. In addition, knowledge of ETL concepts, Python skills, exposure to CI/CD pipelines and knowledge of Microsoft Fabric and/or Dataverse would be advantageous. This is your chance to join a business where …
Develop clean, responsive front-end interfaces using frameworks like Vue.js or React, to present complex datasets and user workflows. •Collaborate with data scientists and engineers to integrate ML models, ETL pipelines, and cloud-based data storage solutions. •Optimise system performance and reliability for high-volume data operations across distributed environments. •Implement secure, compliant and scalable data access layers, ensuring compliance … . •Experience working with containerised applications (e.g. Docker, Swarm or Kubernetes) in a Linux-based environment. •Solid understanding of RESTful API design, microservices architectures, and asynchronous workflows. •Familiarity with ETL processes, data warehousing and distributed systems. Not necessary for you to apply, but would be great if you also have: •Experience with data visualisation tools such as Apache Superset, Jupyter or …
It is personal to all of us.” – Julie Sweet, Accenture CEO Job Qualifications Key responsibilities Deploy machine learning models to production and implement measures to monitor their performance Implement ETL pipelines and orchestrate data flows using batch and streaming technologies based on software engineering best practice Define, document and iterate data mappings based on concepts and principles of data modelling … of different skills which include some of the below. Strong proficiency in Python Extensive experience with cloud platforms (AWS, GCP, or Azure) Experience with: Data warehousing and lake architectures ETL/ELT pipeline development SQL and NoSQL databases Distributed computing frameworks (Spark, Kinesis, etc.) Software development best practices including CI/CD, TDD and version control. Containerisation tools like Docker …
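The batch-versus-streaming distinction mentioned above can be sketched with Python generators: each stage consumes records lazily, so nothing is materialised until the sink pulls it through. A minimal sketch under hypothetical field names, not a production framework:

```python
def source(records):
    # Stand-in for a streaming source such as a Kinesis or Kafka consumer.
    yield from records

def clean(stream):
    # Drop records with missing prices and normalise the rest to 2 dp.
    for rec in stream:
        if rec.get("price") is not None:
            yield {**rec, "price": round(float(rec["price"]), 2)}

def sink(stream):
    # Stand-in for a warehouse or lake writer; here we just collect.
    return list(stream)

raw = [{"sym": "ABC", "price": "101.239"}, {"sym": "XYZ", "price": None}]
out = sink(clean(source(raw)))
print(out)  # [{'sym': 'ABC', 'price': 101.24}]
```

The same stage functions work for batch processing by feeding them a finite list, which is one reason generator-based pipelines are a common interview talking point.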
reports and dashboards for operational and executive insights. Experience interpreting requirements and advising best approach to achieve desired outcomes. Experience using SQL, DAX and Power Query to extract, transform and load data. Advanced Excel skills to support data analysis and manipulation. Experience using a range of project management, portfolio management and digital collaboration tools such as Microsoft Project Online, the …
learning, and business analytics. Skills and Experience Practical experience in coding languages (e.g., Python, R, Scala) with a preference for Python. Strong proficiency in database technologies such as SQL, ETL, NoSQL, DW. Accenture is a global professional services company offering expertise in digital, cloud, and security solutions across various industries worldwide. …
to specialise in one of the fastest-growing technology markets and collaborate with some of the most experienced leaders in the field. Responsibilities: Develop and maintain data pipelines and ETL processes Analyse client challenges and design tailored solutions Collaborate on implementing AI and machine learning models Serve as a point of contact, providing consultative guidance and support Requirements: Proven programming …
London (City of London), South East England, United Kingdom
TechYard
the heart of a fast-paced trading environment, blending operational expertise with technical precision to ensure seamless trading execution and reliable market data delivery. Key Responsibilities Oversee and validate ETL pipelines to ensure accurate and timely pricing data using Python and SQL Bring trading systems online and provide Tier 1 and Tier 2 operational support across trading sessions Streamline, automate …
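Validating pricing data with Python and SQL, as this role describes, often comes down to a quality-gate query run after each load. A hedged sketch with a hypothetical `prices` table (SQLite stands in for the real database):

```python
import sqlite3

# Hypothetical pricing table; real feeds would land here via the ETL pipeline.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (sym TEXT, px REAL)")
conn.executemany(
    "INSERT INTO prices VALUES (?, ?)",
    [("ABC", 101.2), ("XYZ", -1.0), ("DEF", None)],
)

# Flag rows that are missing or non-positive before trading sessions open.
bad = conn.execute(
    "SELECT sym FROM prices WHERE px IS NULL OR px <= 0 ORDER BY sym"
).fetchall()
bad_syms = [s for (s,) in bad]
print(bad_syms)  # ['DEF', 'XYZ']
```

In a Tier 1/Tier 2 support context, a non-empty result from a check like this would typically page the operations team rather than silently proceed.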
London (City of London), South East England, United Kingdom
Bonhill Partners
integration design and development. Strong working knowledge of IT strategy and architecture, Agile methodologies, and related tooling and processes. In-depth understanding of SOA, REST principles, API management, MFT, ETL, and messaging concepts. Broad technical experience, including security, data management, cloud solutions, and preferably some exposure to J2EE, XML, XSLT, or web services. Proven ability to generate new business development …
takes pride in engineering fundamentals, thrives in a small team, and wants to see the commercial impact of their work. Key Responsibilities Develop, maintain, and enhance data pipelines and ETL processes using Python and SQL. Manage and integrate API connections and FTP data feeds into internal systems. Build and support dashboards and reports to provide visibility across trading and operations. … Maths, Physics). 2–4 years’ experience in a technical, data, or engineering-focused role. Strong skills in Python, SQL, and Excel/VBA. Experience building or maintaining ETL/data pipelines, particularly around APIs or FTP processes. Working knowledge of Microsoft Azure and Git. Excellent analytical, communication, and problem-solving skills. A proactive, curious mindset and a …
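Integrating FTP data feeds, as this role describes, usually means parsing a delivered file and upserting it so a re-delivered feed overwrites rather than duplicates. A minimal sketch with a hypothetical CSV payload and `positions` table (SQLite's `ON CONFLICT` upsert stands in for the production database):

```python
import csv
import io
import sqlite3

# Hypothetical feed payload, as it might arrive after an FTP download.
feed = "id,qty\n1,5\n2,7\n1,9\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions (id INTEGER PRIMARY KEY, qty INTEGER)")

for row in csv.DictReader(io.StringIO(feed)):
    # Upsert: a repeated id replaces the earlier row instead of duplicating it.
    conn.execute(
        "INSERT INTO positions VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET qty = excluded.qty",
        (int(row["id"]), int(row["qty"])),
    )

rows = conn.execute("SELECT id, qty FROM positions ORDER BY id").fetchall()
print(rows)  # [(1, 9), (2, 7)]
```

Idempotent loads like this are what make reruns safe when an FTP transfer is retried, which matters in the trading-operations context the listing describes.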