demonstrate working on large engagements * Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR) * Java, Scala, Python, Spark, SQL * Experience of developing enterprise-grade ETL/ELT data pipelines. * Deep understanding of data manipulation/wrangling techniques * Demonstrable knowledge of applying Data Engineering best practices (coding practices to DS, unit more »
PowerBI. Collaborate with stakeholders to understand reporting requirements and translate them into effective BI solutions. Data Integration and Modeling: Utilize strong SQL skills to extract, transform, and load (ETL) data from various sources into PowerBI and write complex measures in DAX. Design and implement data models that align with business more »
open-source data engineering and scientific Python toolset. Our tech stack includes Airbyte for data ingestion, Prefect for pipeline orchestration, AWS Glue for managed ETL, along with Pandas and PySpark for pipeline logic implementation. We utilize Delta Lake and PostgreSQL for data storage, emphasizing the importance of data integrity and more »
of proven professional experience in Business Intelligence, Analytics or any other relevant technical field, preferably using Power BI Knowledge of data modelling, OLAP and ETL frameworks Strong relational database and SQL experience (Microsoft SQL Server, T-SQL) Experience with data visualization tools (Tableau, Power BI or Qlik) Experience of working more »
with programming languages such as Python or R is a plus. Knowledge of statistical analysis techniques and methodologies. Familiarity with data warehousing concepts and ETL processes. more »
and scalability. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or related field. Extensive experience in designing and implementing data pipelines, ETL processes, and data warehousing solutions. Proficiency in cloud platforms such as Azure, with hands-on experience in data lakes, Databricks, and Synapse Analytics. Strong programming more »
with programming languages such as Python or R is a plus. Knowledge of statistical analysis techniques and methodologies. Familiarity with data warehousing concepts and ETL processes. You will be working directly with our Senior BI Analyst and the wider team including, web and DBA developers, system accountants, project managers and more »
data pipelines for data ingestion, processing, and transformation in Azure. Utilise Azure Data Factory, Azure Databricks and SAP Business Objects to create and maintain ETL operations. Deliver dashboard reporting and data sets which are interactive and user-friendly via PowerBI. Identifying and integrating external data into the business data model more »
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Anson McCade
consulting firm. Proficiency in Azure data services such as Azure SQL Database, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, etc. Strong understanding of ETL processes, data modeling, and data warehousing principles. Experience with programming languages such as SQL and Python. Familiarity with data visualization tools such as Power BI more »
the different data architecture patterns: Data Fabric, Data Mesh, Data Warehouse, Data Marts, data modeling, ontologies & knowledge graphs, MicroServices • You have experience in implementing ETL data flows and data pipelines, and know one or more of the following tools: Informatica PowerCenter, SAS Data Integration Studio, Microsoft SSIS, Ab Initio, etc. more »
various teams to understand data requirements and implement solutions. > Optimizing data workflows and processes to enhance data quality, reliability, and performance. > Developing and managing ETL processes for data ingestion, processing, and transformation. > Implementing data governance practices to ensure data integrity, security, and compliance. > Monitoring and troubleshooting data infrastructure to address more »
London, England, United Kingdom Hybrid / WFH Options
Ripple Labs Inc
implement the key payment data models for various analytics, ML/AI solutions, and data-centric product features, which includes building batch/stream ETL pipelines for payment golden datasets, creating unified data monitoring and alerting system for operation excellence, and setting up the standard for payment data governance. Successful more »
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
responsible for accessing, validating, and querying data from various repositories using available tools. Build and maintain data integration processes using SQL Services and other ETL/ELT processes and scripting tools as well as ongoing requests and projects related to the data warehouse, MI, or fast-moving financial data. Designing … Architecting, building, testing, and maintaining data platform. Develop and support a wide range of data transformations and migrations for the whole business. Construct custom ETL processes: Design and implement data pipelines, data marts and schemas, access versatile data sources and apply data quality measures. Monitoring the complete process and applying … ML is a plus Experience with Azure SQL Database, Cosmos DB, NoSQL, MongoDB Experience with Agile, DevOps methodologies Awareness and knowledge of ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and Integration testing Skills and Abilities Knowledge of Python, SQL, SSIS, and more »
high-performance data systems that are foundational to driving business growth and success. Key responsibilities: Implementing scalable data architectures and systems. Developing and maintaining ETL (Extract, Transform, Load) pipelines. Managing data storage, backup, and recovery mechanisms. Writing complex SQL queries to extract data for analysis. Developing and implementing data security more »
requirements into technical requirements & development experience using data discovery tools such as MS Power BI, QlikView, Tableau Ability to model and transform data, build ETL solutions and present data in a useful business context Experience developing data warehouses & data marts using the Kimball methodology Experience or awareness of Big Data more »
Technologies such as Docker and orchestration tools like Kubernetes for containerized deployments. Workflow management tools such as Airflow for orchestrating complex data pipelines and ETL processes. Certifications in Azure cloud services and data engineering technologies, demonstrating expertise and proficiency in the Azure ecosystem. Rewards & Benefits: TCS is consistently voted a more »
platform in AWS. Utilise Agile methodologies to manage the development lifecycle. Design and implement data pipelines, ensuring data quality and integrity. Requirements: Proficiency in ETL tools such as SQL, SSIS, AWS, Lambda, and Python. Strong understanding of data warehouse concepts. Excellent problem-solving and communication skills. Benefits: Competitive salary and more »
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Clearwater People Solutions
Databricks, Power BI Strong in delivering solutions at scale, on time and within budget. Strong background in data warehousing, data architecture, data modelling, integration, ETL/ELT processes Please apply as directed more »
/Developer to lead the transition of databases from SQL Server 2012 to SQL Server 2019. If you are passionate about data warehouse design, ETL methods, and Agile projects, we have the perfect opportunity for you to showcase your skills and contribute to our ongoing success. Our client are a more »
Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
Third Nexus Group Limited
client sites. We're seeking a Data Engineer with a focus on Azure, showcasing proficiency in: · Leveraging Azure cloud technologies for tasks such as ETL pipeline development, data warehousing, data lake creation, and data movement. · Utilizing Azure data and analytics services, including but not limited to Azure Data Factory, Azure more »
pipelines, security and networking Expertise with data warehousing, data lakes and data lake houses Experience with master data management software and technologies Experience with ETL technologies including SQL Server, SSIS, Azure Data Factory Working knowledge of agile development, CI/CD, test and data automation Working knowledge or experience of more »
data from various sources into our data warehouse and data lakes. Uphold data integrity, quality, and consistency throughout the entire process. Create and enhance ETL/ELT workflows capable of handling substantial data volumes. Collaborate with data analysts, data scientists, and other stakeholders to comprehend their data needs. Develop and more »
support organizational growth. Essential Requirements: Minimum 5 years of experience in a similar role. Proven track record in designing and building data infrastructure and ETL pipelines. Proficiency in Azure Platform, including Data Lake, Data Factory, Synapse, Logic Apps, and Function Apps. SQL Server, including Stored Procedures, T-SQL, or similar more »
Sheffield, England, United Kingdom Hybrid / WFH Options
Undisclosed
concepts, including infrastructure as code, serverless computing, and containerization. Expertise in designing scalable and resilient architectures for data-intensive applications. Familiarity with data modeling, ETL processes, and data integration techniques. Proficiency in programming languages commonly used in cloud environments (e.g., Python, Java). Strong knowledge of security principles and best more »