Coventry, England, United Kingdom Hybrid / WFH Options
Berkeley Square IT
ETL Data Engineer - Talend - 100% remote - Outside IR35. Job Description: My client is seeking an experienced data engineer with expertise in using Talend for ETL processes and data quality. This role is 100% remote and sits outside IR35. Must-have technologies: experience with an ETL toolset (Talend, Pentaho, SAS DI, Informatica, etc.); Snowflake; experience with a database (Oracle, RDS …
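As an illustration of the Talend-to-Snowflake loading work this role describes, here is a minimal, hypothetical Python sketch using the snowflake-connector-python library; the account, credentials, stage, and table names are all invented placeholders, not details from the listing.

```python
import snowflake.connector

# Hypothetical connection -- every parameter here is a placeholder
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="secret",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()
# Bulk-load a staged file into a target table (stage and table names invented)
cur.execute(
    "COPY INTO staging.orders FROM @etl_stage/orders.csv "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)
cur.close()
conn.close()
```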
Technology, Data Management, or a related field.
2. Experience
a. Proven experience in data architecture, migrations, or conversions.
b. Hands-on experience with data integration tools and platforms (e.g., Talend, Informatica, MuleSoft).
c. Experience with Infor CloudSuite applications (M3, LN, Lawson, FSM, GHR).
3. Technical Skills
a. Strong knowledge of data modeling and relational databases (e.g., SQL, Oracle, MySQL …
experience. Awareness of industry standards, regulations, and developments. Ideally, you’ll also have: Experience with Relational Databases and Data Warehousing concepts. Experience with enterprise ETL tools such as Informatica, Talend, DataStage, or Alteryx. Experience with Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross-platform experience. Financial services sector experience. Line management, team building, mentoring, and staff development experience. You must be …
proactive awareness of industry standards, regulations, and developments. Ideally, you'll also have: Experience of Relational Databases and Data Warehousing concepts. Experience of enterprise ETL tools such as Informatica, Talend, DataStage or Alteryx. Project experience using any of the following technologies: Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross- and multi-platform experience. Team building and leading. You must be …
services (e.g., Amazon RDS, Azure SQL, or Google Cloud SQL). Solid understanding of data modeling techniques and normalization standards. Experience with data integration tools (e.g., SSIS, Informatica, or Talend) and scripting (e.g., Python, Shell). Understanding of database security best practices and regulatory requirements (e.g., …). Experience with NoSQL or big data technologies (e.g., MongoDB, Cassandra, Hadoop). Qualifications: Familiarity …
conceptual, logical, physical), metadata management, and master data management (MDM). Deep understanding of data integration, transformation, and ingestion techniques using modern tools (e.g., Azure Data Factory, Boomi, Informatica, Talend, dbt, Apache NiFi). Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud data platforms (e.g., Snowflake …
complex and disparate data sets and communicate clearly with stakeholders. Hands-on experience with cloud platforms such as AWS, Azure, or GCP. Familiarity with traditional ETL tools (e.g., Informatica, Talend, Pentaho, DataStage) and data warehousing concepts. Strong understanding of data security, compliance, and governance best practices. Experience leading or influencing cross-functional teams in a product or platform environment. Strong …
modeling (conceptual, logical, physical), metadata management, and master data management (MDM). Deep understanding of data integration, transformation, and ingestion techniques using modern tools (e.g., Azure Data Factory, Boomi, Informatica, Talend, dbt, Apache NiFi). Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud data platforms (e.g., Snowflake, Azure …
BigQuery, Snowflake, or Azure Synapse Analytics, including data modelling and ETL processes. ETL Processes: Proficient in designing and implementing ETL (Extract, Transform, Load) processes using tools like Apache NiFi, Talend, or custom scripts. Familiarity with ELT (Extract, Load, Transform) processes is a plus. Big Data Technologies: Familiarity with big data frameworks such as Apache Hadoop and Apache Spark, including experience …
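To make the ETL-versus-ELT distinction above concrete, here is a minimal, hypothetical Python sketch of a custom extract-transform-load script; the source file, table, and column names are invented for illustration, and a real ELT flow would instead push the raw data into the warehouse and transform it there.

```python
import csv
import sqlite3

def extract(path):
    # Extract: stream raw rows from a source file (hypothetical path)
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Transform: clean and filter before loading (the "T" happens here
    # in ETL; in ELT this logic would run inside the warehouse instead)
    for row in rows:
        email = row["email"].strip().lower()
        if email:
            yield (row["id"], email)

def load(records, conn):
    # Load: write the transformed records into the target table
    conn.execute("CREATE TABLE IF NOT EXISTS customers (id TEXT, email TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)", records)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    load(transform(extract("customers.csv")), conn)
```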
Cassandra.
Big Data Ecosystems: Hadoop, Spark, Hive, and HBase.
Data Integration & ETL: Data Pipelining Tools: Apache NiFi, Apache Kafka, and Apache Flink. ETL Tools: AWS Glue, Azure Data Factory, Talend, and Apache Airflow.
AI & Machine Learning: Frameworks: TensorFlow, PyTorch, Scikit-learn, Keras, and MXNet. AI Services: AWS SageMaker, Azure Machine Learning, Google AI Platform.
DevOps & Infrastructure as Code: Containerization: Docker …
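To give a concrete flavour of the pipelining tools listed above, here is a minimal, hypothetical sketch using the kafka-python client; the broker address, topic name, and payload are placeholders.

```python
from kafka import KafkaConsumer, KafkaProducer

# Producer: push one record onto a topic (broker and topic are placeholders)
producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("orders", b'{"order_id": 42, "status": "shipped"}')
producer.flush()

# Consumer: read records back off the same topic, stopping after 5s idle
consumer = KafkaConsumer("orders", bootstrap_servers="localhost:9092",
                         auto_offset_reset="earliest",
                         consumer_timeout_ms=5000)
for message in consumer:
    print(message.value)  # raw bytes; a real pipeline would deserialise here
```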
product teams. We seek passionate testers with expertise in leading teams and in designing, developing, and maintaining data quality solutions. The ideal candidate should have strong expertise in ETL framework testing (preferably Talend or DataStage), BI report testing (preferably Power BI, Cognos), cloud technologies (preferably Azure, Databricks), SQL/PLSQL coding, and Unix/Python scripting. Key Responsibilities: Lead and mentor a team
… validation using advanced SQL, stored procedures, and database design principles. Utilize Unix/Python scripting for data validation and process automation. Contribute to developing ETL solutions using tools like Talend or DataStage, or building stored procedures, scheduling workflows, and integrating data pipelines. Additionally, develop BI reports using tools like Power BI or Cognos. Implement best practices in data governance, compliance
… a related field. 5+ years of experience leading test teams, mentoring engineers, and driving test strategies. 5+ years of experience in data testing using ETL tools like Talend, DataStage, or equivalent. 5+ years of experience in automated and data quality testing using tools like TOSCA, ICEDQ, or equivalent frameworks. 5+ years of advanced SQL/PL SQL experience (Snowflake a …
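As a small sketch of the SQL-based data validation work described above, here is a hypothetical Python reconciliation check comparing row counts between a source and a target; the database files and table name are invented, and a real test suite would add checksums, null checks, and schema comparisons.

```python
import sqlite3

def reconcile(source_db, target_db, table):
    """Compare row counts between source and target -- a basic
    post-load data quality check (paths and table are hypothetical)."""
    src = sqlite3.connect(source_db)
    tgt = sqlite3.connect(target_db)
    src_count = src.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt_count = tgt.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    assert src_count == tgt_count, (
        f"{table}: source has {src_count} rows, target has {tgt_count}"
    )
    print(f"{table}: {src_count} rows reconciled")

if __name__ == "__main__":
    reconcile("source.db", "warehouse.db", "customers")
```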
Factory), Azure DB, Azure Synapse, Azure Data Lake and Azure Monitor, providing added flexibility for diverse migration and integration projects. Prior experience with tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with data warehousing and data modelling (star schema or snowflake schema). Skilled in security …
Sand Technologies is a fast-growing enterprise AI company that solves real-world problems for large blue-chip companies and governments worldwide. We're pioneers of meaningful AI: our solutions go far beyond chatbots. We are using data and AI …
Greater London, England, United Kingdom Hybrid / WFH Options
InterEx Group
tools and environments: cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc. Past experience with data engineering and data quality tools (Informatica, Talend, etc.). Previous involvement in working in a multilingual and multicultural environment. Proactive, tech-passionate and highly motivated. Desirable requirements: experience with data analysis and visualization solutions: MicroStrategy, Qlik, Power BI, Tableau …
Bristol, England, United Kingdom Hybrid / WFH Options
Snap Analytics
schemas and snowflake schemas, ensuring performance and ease of use for the client. You’ll manage the delivery of ETL/ELT processes using tools like Matillion, Informatica, or Talend, adhering to data engineering best practices and incorporating practices such as metadata-driven ETL approaches and near-real-time data platform architectures. Innovation & Continuous Improvement: You’ll stay updated on
… data warehousing for enterprise organisations. Proven experience designing data architectures on platforms like AWS, Azure, or GCP. Technical Skills: Extensive experience with ETL/ELT tools (e.g. Matillion, Informatica, Talend) and cloud data platforms (e.g., Snowflake, Databricks, BigQuery). Expertise in data modelling techniques (e.g., star schema, snowflake schema) and optimising models for analytics and reporting. Familiarity with version control …
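As a sketch of the dimensional modelling this listing mentions, here is a hypothetical star schema DDL, expressed as SQL run from Python to stay consistent with the other sketches in this section; every table and column name is invented for illustration.

```python
import sqlite3

# Hypothetical star schema: one fact table joined to two dimension tables
DDL = """
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240131
    full_date TEXT, year INTEGER, month INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    sku TEXT, name TEXT, category TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
"""

conn = sqlite3.connect("analytics.db")
conn.executescript(DDL)
conn.commit()
```

The fact table holds the measures (quantity, revenue) at the finest grain, while descriptive attributes live in the dimensions; a snowflake schema would further normalise the dimensions into sub-tables.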
Bristol, England, United Kingdom Hybrid / WFH Options
Capgemini
Excel, queues, and topics. Proficiency in SQL queries, functions, and procedures across Big Data platforms, Oracle, SQL Server, ERP solutions, and cloud providers. Useful experience with tools like Informatica, Talend, DataStage, or similar. Your Security Clearance: Obtaining Security Check (SC) clearance is required, which involves residency in the UK for the last 5 years and other criteria. The process includes …
Bristol, England, United Kingdom Hybrid / WFH Options
myGwork - LGBTQ+ Business Community
The Job You're Considering: The Data Management practice within the Insights and Data business unit of Capgemini is a global practice involved in a broad range of business- and IT-focused topics, from Information Strategy, Governance, Master Data Management …
Job Description: We are looking for a skilled ETL Developer with hands-on experience in Talend, Python, and Spark to join our data engineering team. The ideal candidate will be responsible for designing, building, and maintaining ETL pipelines that support data extraction, transformation, and loading from various sources into target systems. Key Responsibilities: · Design, build, and maintain ETL workflows using the Talend ETL toolset. · Develop ETL solutions for extracting and transforming data from various sources such as Cloudera, PostgreSQL, and SQL Server. · Create and manage database schemas, tables, and constraints based on business requirements. · Collaborate with cross-functional teams to understand source systems and ensure accurate data mapping and transformation. · Write transformation logic using ETL tools or scripting languages like SQL
… standards. · Contribute to data quality improvement initiatives and proactively resolve data inconsistencies. · Participate in troubleshooting and performance tuning of ETL jobs and workflows. Required Skills & Qualifications: · Proven experience with Talend, Python, and Apache Spark. · Strong understanding of relational databases and Big Data ecosystems (Hive, Impala, HDFS). · Solid experience in data warehousing and data modelling techniques. · Familiarity with data quality …
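For a concrete flavour of the Spark side of this role, here is a minimal, hypothetical PySpark sketch of an extract-and-transform job; the file paths and column names are invented, not taken from the listing.

```python
from pyspark.sql import SparkSession, functions as F

# A minimal extract -> transform -> load job (paths and columns hypothetical)
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw orders from a landing zone
raw = spark.read.csv("/landing/orders.csv", header=True, inferSchema=True)

# Transform: normalise emails, drop rows without an order id
clean = (
    raw.withColumn("email", F.lower(F.trim(F.col("email"))))
       .filter(F.col("order_id").isNotNull())
)

# Load: write to a curated zone as Parquet, partitioned by order date
clean.write.mode("overwrite").partitionBy("order_date").parquet("/curated/orders")

spark.stop()
```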
techniques and data pipeline design patterns and behaviours. Experience with pipeline management and orchestration tools such as Airflow. Experience with low/no-code pipeline development tools such as Talend or SnapLogic. Experience developing data pipelines using cloud services (AWS preferred) like Lambda, S3, Redshift, Glue, Athena, Secrets Manager or equivalent services. Experience of working with APIs for data extraction …
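To illustrate the orchestration work mentioned above, here is a minimal, hypothetical Airflow DAG (assuming Airflow 2.4+ for the `schedule` argument); the DAG id, schedule, and task bodies are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from the source API")  # placeholder task body

def load():
    print("write rows to the warehouse")  # placeholder task body

# A two-task daily pipeline; ids and schedule are illustrative only
with DAG(dag_id="example_extract_load", start_date=datetime(2024, 1, 1),
         schedule="@daily", catchup=False) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # run extract before load
```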
Core Data Engineer at Optima Partners. The Role: As a Core Data Engineer, you will play a pivotal role in designing, building …
Basildon, England, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
an excellent communicator, possess strong analytical skills with an ability to understand business requirements and identify the most appropriate BI solution. Skills and experience are essential in the following: Talend Studio 8 ETL development, including TMS or TAC cloud management. Proficient SQL skills using the MS SQL Server management tool. Proficient in Power BI, with experience in tools: MS SQL Server
… BI stack, including Power Query, Power BI and DAX. To design, develop and deploy Talend ETL datasets, transformations and data loading to MS SQL Server. Understand and convert business requirements into technical specifications. Additional optional skills: SAP BO/SAP BI (Web Intelligence, IDT, Design Studio, SAP Lumira), SAP Data Services, SAP BW, Oracle, SQL Server management. You will possess the following …
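As a small sketch of scripted loading into MS SQL Server of the kind this role's Talend jobs would perform, here is a hypothetical Python snippet using pyodbc; the driver string, server, database, credentials, and audit table are all invented placeholders.

```python
import pyodbc

# Hypothetical connection; server, database and credentials are placeholders
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=reporting-db;DATABASE=staging;UID=etl_user;PWD=secret"
)
cursor = conn.cursor()
# Record a load-audit row after an ETL run (table name is invented)
cursor.execute(
    "INSERT INTO dbo.load_audit (job_name, loaded_at) VALUES (?, GETDATE())",
    "talend_daily_load",
)
conn.commit()
conn.close()
```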
communicator with experience writing technical documentation. Comfortable working with Agile, TOGAF, or similar frameworks. Desirable: Experience with Python and data libraries (Pandas, Scikit-learn). Knowledge of ETL tools (Airflow, Talend, NiFi). Familiarity with analytics platforms (SAS, Posit). Prior work in high-performance or large-scale data environments. Why Join? This is more than a job; it's an opportunity to …
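For a flavour of the Pandas/Scikit-learn combination listed above, here is a tiny, hypothetical sketch; the dataset and column names are fabricated purely for illustration.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical dataset: predict monthly spend from tenure and visit count
df = pd.DataFrame({
    "tenure_months": [3, 12, 24, 36, 48],
    "visits": [2, 5, 9, 12, 15],
    "monthly_spend": [20.0, 45.0, 80.0, 110.0, 140.0],
})

model = LinearRegression()
model.fit(df[["tenure_months", "visits"]], df["monthly_spend"])

# Predict for an unseen customer (values invented)
print(model.predict(pd.DataFrame({"tenure_months": [18], "visits": [7]})))
```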
Preferred Qualifications: · Azure certifications such as Microsoft Certified: Azure Data Engineer Associate or Microsoft Certified: Azure Solutions Architect Expert. · Experience with other data integration tools like SSIS, Informatica, or Talend (optional but beneficial). · Knowledge of big data technologies such as Databricks. Seniority level: Mid-Senior level. Employment type: Full-time. Job function: Engineering …