years of Excel or Tableau (data manipulation, macros, charts and pivot tables) experience - Experience defining requirements and using data and metrics to draw business insights - Experience with SQL or ETL - Knowledge of data visualization tools such as QuickSight, Tableau, Power BI or other BI packages - 1+ years of tax, finance or a related analytical field experience PREFERRED QUALIFICATIONS - Experience … in Amazon Redshift and other AWS technologies - Experience creating complex SQL queries joining multiple datasets, and with ETL/DW concepts - Experience in Scala and PySpark Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or …
world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world. YOUR ROLE Assist in designing and developing ETL processes to extract, transform, and load data from various sources into data warehouses or data marts. Very good Informatica development, setup, and IDMC cloud migration skills. Strong in writing SQL, joining tables, and comparing … the table data. Collaborate with team members to understand data requirements and translate them into technical specifications. Support the maintenance and enhancement of existing ETL processes to ensure data accuracy and reliability. Conduct data quality checks and troubleshoot issues related to ETL processes. Participate in code reviews and provide feedback to improve ETL processes and performance. Document ETL processes and … clear and comprehensive records. Learn and apply best practices in ETL development and data integration. Knowledge of scripting languages (Python, Shell scripting) is advantageous. Very good knowledge of data warehouse and ETL concepts. YOUR PROFILE Bachelor's degree in Computer Science, Information Technology, or a related field. Basic understanding of ETL concepts and processes. Familiarity with SQL and database concepts. Knowledge …
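The table-comparison work described above (joining source and target tables after a load and checking the data matches) can be sketched in plain Python. This is a minimal illustration, not Informatica's own mechanism; the table rows and column names (`id`, `amt`) are invented for the example, and it mimics a MINUS/EXCEPT-style SQL check.

```python
def diff_tables(source_rows, target_rows, key):
    """Compare two tables (lists of dicts) after an ETL load.

    Returns keys missing from the target and keys whose rows differ,
    mirroring the source-vs-target SQL comparison described in the role.
    """
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    missing = sorted(src.keys() - tgt.keys())
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return missing, mismatched

# Illustrative rows; a real check would query the warehouse tables.
source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 25}]
missing, mismatched = diff_tables(source, target, "id")
```

In SQL the same check is usually an `EXCEPT`/`MINUS` query plus a join on the key columns; the Python version just makes the logic explicit.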
/5 Year planning exercises for the group, Manage sprint planning for basic/advanced analysis requests across stakeholders. • Interfacing with multiple technology and partner teams to extract, transform, and load data from a wide variety of data sources and ability to use a programming and/or scripting language to process data for analysis and modeling • Evolve organization-wide … 3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience - Experience with data visualization using Tableau, QuickSight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience writing complex SQL queries - Experience in Statistical Analysis packages such as R, SAS and Matlab - Experience using SQL to pull data from a database or data warehouse … and scripting experience (Python) to process data for modeling PREFERRED QUALIFICATIONS - Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift - Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need …
DSP Performance. - Manage and execute entire projects from start to finish including stakeholder management, data gathering and manipulation, modeling, problem solving, and communication of insights and recommendations. - Extract, transform, and load data from many data sources using SQL, scripting and other ETL tools. - Design, build, and maintain automated reporting, dashboards, and ongoing analysis to enable data-driven decisions across our … the "Request Informational" button on the job page. BASIC QUALIFICATIONS - 3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience - Experience with data modeling, warehousing and building ETL pipelines - Experience with data visualization using Tableau, QuickSight, or similar tools - Experience writing complex SQL queries - Experience using SQL to pull data from a database or data warehouse and … modeling - Experience in Statistical Analysis packages such as R, SAS and Matlab PREFERRED QUALIFICATIONS - Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift - Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets - Experience developing and presenting recommendations of new metrics allowing better understanding of the performance of the business …
define metrics to measure and monitor programs, and most importantly work with different stakeholders to drive improvements over time. You will also work closely with internal business teams to extract or mine information from our existing systems to create new analysis, and expose data from our group to wider teams in intuitive ways. As a BIE embedded in the product … platform (software) teams. Successful candidates must thrive in fast-paced environments which encourage collaborative and creative problem solving, be able to measure and estimate risks, constructively critique peer research, extract and manipulate data across various data marts, and align research focus with Amazon's strategic needs. A day in the life • Perform complex data analysis (correlations, regressions, simulations, optimization, etc. … years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience - Experience with data visualization using Tableau, QuickSight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience in Statistical Analysis packages such as R, SAS and Matlab - Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process …
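The "complex data analysis (correlations, regressions, …)" mentioned above starts from primitives like the Pearson correlation coefficient. A minimal pure-Python sketch (the input series are invented for illustration; analysis tooling such as R, SAS, or pandas would normally be used):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length numeric series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linear series correlate at exactly 1.0.
r = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])
```

Values near +1 or -1 indicate a strong linear relationship; values near 0 indicate none, which is typically the first screen before fitting a regression.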
Job Summary We are seeking a skilled and motivated Data Scientist to join our client's team. In this role, you will leverage your advanced analytical skills and programming expertise to extract insights from complex datasets, develop predictive models, and support decision-making for our diverse range of customers. As a mid-level contributor, you will work on a variety of data … Conduct A/B testing and experimental analysis to validate hypotheses. Data Management & Engineering: - Collaborate with data engineering teams to ensure data quality, accessibility, and efficiency. - Design and develop ETL pipelines and workflows for data preprocessing. - Develop automated tests to validate the processes and models you create. Collaboration & Communication: - Collaborate with stakeholders to define project goals, requirements, and deliverables. - Actively …
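The A/B testing mentioned above usually reduces to a two-proportion significance check on conversion counts. A hedged pure-Python sketch (the counts are invented; in practice a statistics library would supply the p-value as well):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic for an A/B conversion test.

    Positive z means variant B converted better than variant A;
    |z| > ~1.96 is significant at the 5% level (two-sided).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative experiment: 12% vs 15% conversion on 1,000 users each.
z = two_proportion_z(120, 1000, 150, 1000)
```

Here the lift of three percentage points sits right at the edge of 5% significance, which is exactly the kind of borderline call the role's "experimental analysis to validate hypotheses" covers.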
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
and trends. Requirements: -3+ years of experience as a data engineer. -Strong proficiency in AWS data services such as S3, Glue, Lambda, and Redshift. -Experience with data modelling, ETL processes, and data warehousing concepts. -Proficiency in SQL and Python. Benefits: -Competitive salary, benefits package and discretionary bonus. -Opportunity to work on cutting-edge technology. -Career growth and development opportunities.
Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
and trends. Requirements: -3+ years of experience as a data engineer. -Strong proficiency in AWS data services such as S3, Glue, Lambda, and Redshift. -Experience with data modelling, ETL processes, and data warehousing concepts. -Proficiency in SQL and Python. Benefits: -Competitive salary, benefits package and discretionary bonus. -Opportunity to work on cutting-edge technology. -Career growth and development opportunities.
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
and trends. Requirements: -3+ years of experience as a data engineer. -Strong proficiency in AWS data services such as S3, Glue, Lambda, and Redshift. -Experience with data modelling, ETL processes, and data warehousing concepts. -Proficiency in SQL and Python. Benefits: -Competitive salary, benefits package and discretionary bonus. -Opportunity to work on cutting-edge technology. -Career growth and development opportunities.
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Tenth Revolution Group
and trends. Requirements: -3+ years of experience as a data engineer. -Strong proficiency in AWS data services such as S3, Glue, Lambda, and Redshift. -Experience with data modelling, ETL processes, and data warehousing concepts. -Proficiency in SQL and Python. Benefits: -Competitive salary, benefits package and discretionary bonus. -Opportunity to work on cutting-edge technology. -Career growth and development opportunities.
as we continue to scale our business. This is an opportunity to join a fast-moving team building out data infrastructure from scratch and establish best practice methodology for ETL and data consumption at a global company that is quickly growing. Key Responsibilities: Oversee a team of 2 ETL Developers to ensure all source systems in EcoOnline's landscape are … discretion when handling sensitive information Excellent written and spoken English Strong ability to manage multiple tasks and priorities at once Strong experience with Talend, SAP DataServices, Informatica, or a similar ETL tool. Bachelor's degree in Business, Engineering, Computer Science or other related analytical or technical disciplines, or at least four (4) years related experience. 5-7 years of experience. Nice …
priorities. Manage & implement the building of resilient Data Models and robust design of data marts to support reporting and analytical requirements. Manage the creation & development of end-to-end ETL routines using Pentaho, SQL, EDQ and PL/SQL. Provide technical expertise to the team for Data Warehouse design, development, and implementation. Manage the embedding of best practice and continual improvement for … jobs into production. Identify areas of innovation in future technology changes. Manage the loading of data files & implement strategies that ensure business requirements are met. Manage the fixing of ETL load failures as a matter of priority to make sure that the DWH is up and running as soon as possible. Manage & implement strategies for the monitoring of all ETL … Data Warehouse capability. Manage & review the EDQ data quality outputs and inform the Data Quality team accordingly to resolve any issues. Manage the planning of regular maintenance of DWH structure & ETL routines to prevent redundancy & keep the DWH running efficiently. Ensure the daily monitoring of the Landing Database to ensure a regular feed of external data from suppliers. Engage with the Data Architecture team …
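Fixing ETL load failures "as a matter of priority" typically combines automated retries with alerting. A minimal sketch of the retry half, in plain Python rather than Pentaho (the flaky job, function names, and delays are all invented for illustration):

```python
import time

def run_load_with_retry(load_fn, max_attempts=3, base_delay=0.01):
    """Run an ETL load job, retrying with exponential backoff on failure.

    Returns (succeeded, attempts_used). A production monitor would also
    page the on-call team and write to the warehouse audit tables.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            load_fn()
            return True, attempt
        except Exception:
            if attempt == max_attempts:
                return False, attempt
            time.sleep(base_delay * 2 ** (attempt - 1))  # 0.01s, 0.02s, ...

# Illustrative flaky load: fails twice with a transient error, then succeeds.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient load failure")

ok, attempts = run_load_with_retry(flaky_load)
```

The design choice worth noting: retries only mask transient failures (locks, network blips); persistent failures should fail fast and alert so the DWH outage is visible.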
and rapid onboarding of new events or features. Data Egestion: Develop and manage data pipelines for exporting curated datasets from Redshift to platforms like Salesforce and Gainsight using reverse ETL tools (e.g., Hightouch). Data Ingestion: Own end-to-end responsibility for ingesting key productivity data from platforms such as GitHub and JIRA into the data warehouse, enabling accurate internal … and Jinja for effective data modeling and transformation. Query performance optimization: Skilled at optimizing datasets for fast, efficient querying in enterprise-scale data warehouses such as Amazon Redshift. Reverse ETL and integrations: Practical knowledge of reverse ETL tools like Hightouch for delivering curated datasets to systems including Salesforce and Gainsight. Event Data Management: Adept at working with product event data …
Data Engineering Develop and maintain real-time data pipelines for processing large-scale data Ensure data quality and integrity in all stages of the data lifecycle Develop and maintain ETL processes for data ingestion and processing Algorithm Development, Model Training and Optimisation Design, develop, and implement advanced machine learning algorithms for fraud prevention and user personalisation Train and fine-tune … Data Mining & Analysis Apply data mining techniques such as clustering, classification, regression, and anomaly detection to discover patterns and trends in large datasets. Analyse and preprocess large datasets to extract meaningful insights and features for model training Code Review and Documentation Conduct code reviews to ensure high-quality, scalable, and maintainable code Create comprehensive documentation for developed algorithms and models … machine learning frameworks such as TensorFlow, PyTorch, and scikit-learn. Data Engineering Skills: Proficiency in developing and maintaining real-time data pipelines for processing large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such …
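The anomaly detection mentioned above, in its simplest form, flags points whose z-score exceeds a threshold. A pure-Python sketch under that assumption (the readings are invented; fraud systems would use far richer features and models):

```python
import math

def zscore_anomalies(values, threshold=3.0):
    """Return indices of values more than `threshold` standard deviations
    from the mean, the classic first-pass anomaly detector."""
    n = len(values)
    mean = sum(values) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    if std == 0:
        return []  # constant series: nothing can be anomalous
    return [i for i, v in enumerate(values) if abs(v - mean) / std > threshold]

# Thirty normal readings plus one obvious outlier at the end.
readings = [10.0] * 30 + [500.0]
flagged = zscore_anomalies(readings)
```

The limitation worth knowing: a large outlier inflates the standard deviation itself, so robust variants (median absolute deviation) or model-based detectors are used when outliers are frequent.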
data and validate by profiling in a data environment Understand data structures and data model (dimensional & relational) concepts like Star schema or Fact & Dimension tables, to design and develop ETL patterns/mechanisms to ingest, analyse, validate, normalise and cleanse data Liaise with data/business SMEs to understand/confirm data requirements and obtain signoffs Implement data quality procedures …
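The star-schema concept mentioned above splits records into fact rows that reference dimension rows by surrogate key. A minimal sketch in plain Python (the `product`/`amount` columns are invented; real pipelines would do this in SQL or an ETL tool):

```python
def build_star(raw_orders):
    """Split raw order records into a product dimension and a fact table.

    Dimension rows get surrogate keys; fact rows carry measures plus a
    foreign key into the dimension, the basic star-schema pattern.
    """
    dim_product, facts = {}, []
    for rec in raw_orders:
        name = rec["product"]
        if name not in dim_product:
            dim_product[name] = len(dim_product) + 1  # assign surrogate key
        facts.append({"product_key": dim_product[name], "amount": rec["amount"]})
    return dim_product, facts

# Illustrative raw feed: repeated products collapse into one dimension row.
raw = [
    {"product": "widget", "amount": 5},
    {"product": "gadget", "amount": 7},
    {"product": "widget", "amount": 3},
]
dim, facts = build_star(raw)
```

Keeping descriptive attributes in the dimension and only keys plus measures in the fact table is what makes star-schema queries cheap to join and aggregate.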
requirements. Required education None Preferred education Bachelor's Degree Required technical and professional expertise Design, construct, install, test, and maintain highly scalable data management systems. Develop data processes and ETL pipelines to support analytics and reporting. Ensure data quality, reliability, and security across all systems. Collaborate with data scientists, analysts, and IT professionals to improve data reliability, efficiency, and quality. … professional experience Java (Programming Language): Proficient in the Java programming language with a strong understanding of its ecosystems. General Data Engineering Skills: Demonstrated experience in data engineering, including data modelling, ETL processes, and data warehousing principles. SQL (Structured Query Language for relational databases): Proficiency in writing complex queries and managing relational databases. Highly Desirable Skills: Python (Programming Language): Experience in Python …
with respect to your focus area - Creates and maintains comprehensive business documentation including user stories, acceptance criteria, and process flows that help the BIE understand the context for developing ETL processes and visualization solutions. - Performs user acceptance testing and business validation of delivered dashboards and reports, ensuring that BIE-created solutions meet actual operational needs and can be effectively utilized … mindset and ability to see the big picture and influence others - Detail-oriented and must have an aptitude for solving unstructured problems. The role will require the ability to extract data from various sources and to design/construct/execute complex analyses that ultimately yield data/reports that help solve the business problem - Good oral, written …
ensure understanding and adherence to architectural and modeling standards. Stakeholder Engagement: Partner with the Data Management team to drive the group's data strategy. Collaborate with business units to extract greater value from data assets. Engage with key stakeholders to identify technical opportunities for enhancing data product delivery. Provide consultative advice to business leaders and organizational stakeholders with actionable recommendations … EA Sparx). Strong knowledge of SQL and NoSQL databases (e.g., MongoDB, Cassandra). Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud). Experience with data integration and ETL tools (e.g., Talend, Informatica). Excellent analytical and technical skills. Excellent planning and organizational skills. Knowledge of all components of holistic enterprise architecture. What we offer: Colt is a growing …
Easter Howgate, Midlothian, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
and maintaining data pipelines, data warehouses, and leveraging data services. Proficient in DataOps methodologies and tools, including experience with CI/CD pipelines, containerisation, and workflow orchestration. Familiar with ETL/ELT frameworks, and experienced with big data processing tools (e.g. Spark, Airflow, Hive, etc.) Knowledge of programming languages (e.g. Java, Python, SQL) Hands-on experience with SQL/NoSQL …
Lambda), Snowflake, Databricks, and Reltio to design and implement scalable data solutions. Key Responsibilities: Lead data architecture design using AWS, Snowflake, Databricks, and Reltio. Manage data migration, data modeling, ETL processes, and data integration. Work with reporting tools such as Power BI or Tableau. Implement and maintain data governance frameworks. Ensure compliance with GDPR and other data privacy regulations. Maintain … plus. Required Skills & Experience: 15-20 years of experience in data architecture. Expertise in AWS, Snowflake, Databricks, and Reltio. Experience with data governance, data privacy regulations, and ETL tools. Familiarity with reporting tools like Power BI and Tableau. Experience with motor fleet insurance is a plus. Additional Information: Remote work with occasional travel to Belgium.
working closely with senior leadership. The team has already laid the foundations for a modern data platform using Azure and Databricks and is now focused on building out scalable ETL processes, integrating AI tools, and delivering bespoke analytics solutions across the organisation. THE ROLE As a Data Engineer, you'll play a pivotal role in designing and implementing robust data … This role combines hands-on development with collaborative architecture design, and offers the opportunity to contribute to AI readiness within a fast-paced business. KEY RESPONSIBILITIES Develop and maintain ETL pipelines, including manual and semi-manual data loads Connect and integrate diverse data sources across cloud platforms Collaborate with analytics and design teams to create bespoke, scalable data solutions Support … ready for future AI use cases REQUIRED SKILLS & EXPERIENCE Strong experience with Azure and Databricks environments Advanced Python skills for data engineering (pandas, PySpark) Proficiency in designing and maintaining ETL pipelines Experience with Terraform for infrastructure automation Track record of working on cloud migration projects, especially Azure to Databricks Comfortable working onsite in London 2 days/week and engaging …
moving/transforming data across layers (Bronze, Silver, Gold) using ADF, Python, and PySpark. Must have hands-on experience with PySpark, Python, AWS, and data modelling. Must have experience in ETL processes. Must have hands-on experience in Databricks development. Good to have experience in developing and maintaining data integrity and accuracy, data governance, and data security policies and procedures. Must …
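Moving data across Bronze, Silver, and Gold layers, as described above, means promoting raw records into progressively cleaner tables. A minimal plain-Python sketch of the Bronze-to-Silver step (the `id`/`amount` columns are invented; a real job would use PySpark or ADF, not Python dicts):

```python
def bronze_to_silver(bronze_rows):
    """Promote raw Bronze records to a cleaned Silver layer.

    Drops rows missing the business key, casts amounts to float, and
    dedupes on id keeping the last record seen - a minimal sketch of
    the cleanse-and-conform step in a medallion-style pipeline.
    """
    silver = {}
    for row in bronze_rows:
        if row.get("id") is None:
            continue  # quarantine rows without a business key
        silver[row["id"]] = {"id": row["id"], "amount": float(row["amount"])}
    return list(silver.values())

# Illustrative Bronze feed: a bad row and a late-arriving correction.
bronze = [
    {"id": 1, "amount": "10.5"},
    {"id": None, "amount": "3"},
    {"id": 1, "amount": "11.0"},  # later version of id 1 wins
    {"id": 2, "amount": "7"},
]
silver = bronze_to_silver(bronze)
```

The Gold layer would then aggregate Silver into business-facing marts; keeping each hop small and idempotent is what makes the layered pattern easy to re-run after failures.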
Your Mission: Design and build data integration solutions following technical standards. Develop application interfaces for data and analytics products. Work on big data projects, leveraging real-time technologies. Support ETL development and data transformations based on business requirements. Ensure data work aligns with governance and security policies. Collaborate with the wider data and analytics team. Maintain and troubleshoot production data … pipelines. Document data development using technical documents and DevOps tools. Skills & Experience: Solid experience with the Azure data integration stack. Proficiency with SQL, ETL, Python, and APIs. Strong knowledge of relational databases (MySQL, SQL Server, PostgreSQL). Experience with NoSQL databases (e.g., Cosmos DB). Familiar with data integration methods (real-time, periodic, batch). Strong understanding of application interfaces (REST …
SSIS) packages and new cloud-native solutions within a microservice and containerised architecture. Key accountabilities include: Developing near real-time integration services using cloud technologies. Building and optimising SSIS ETL packages. Selecting appropriate integration approaches (ETL/ELT) aligned to business outcomes. Collaborating across product and analytics teams to integrate with platforms such as Azure Data Lake and Synapse. Managing … through innovation, prototyping, and cross-functional collaboration. We are looking for a technically skilled and proactive professional with: Strong experience in Microsoft SQL, Azure Data Factory, Synapse, and related ETL/ELT tools. Proven capability in managing Azure-based integration services and data solutions. A solid understanding of microservices, data modelling, and DevOps principles. Experience in handling large, complex datasets …