Greater London, England, United Kingdom Hybrid / WFH Options
Morgan McKinley
productivity across teams and unlocking the full potential of data within the organisation. Responsibilities Maintain and optimise available data warehouse infrastructure. Design, implement and document ETL procedures for the intake of new data from relevant sources, employing industry standards and best practices, and ensure data is verified and quality checked. Collaborate more »
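The responsibilities above describe a typical extract-transform-load flow with a verification step before loading. As a minimal, hedged sketch of that pattern (the `extract`, `transform`, and `load_rows` names and the `sales` table are illustrative, not from any specific listing):

```python
# Minimal ETL sketch with a basic quality check, using only the standard
# library. All names here are illustrative assumptions.
import sqlite3

def extract():
    # Stand-in for pulling rows from a source system (API, file, database).
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.0"}]

def transform(rows):
    # Cast types and drop rows that fail a basic quality check.
    out = []
    for r in rows:
        try:
            out.append({"id": int(r["id"]), "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine rejected rows
    return out

def load_rows(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:id, :amount)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load_rows(conn, transform(extract()))
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 13.5
```

Production tools named in these listings (Azure Data Factory, Databricks, dbt) orchestrate the same extract/transform/load stages at scale.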
data modelling, schema design, and database optimization techniques Tech Skills: Familiarity with cloud-based data platforms and technologies. Good knowledge of data analysis, modelling, ETL processes, data warehousing, and core data infrastructure services. Exposure to technologies like Microsoft Synapse, Azure Data Factory, Databricks Notebooks, Python/PySpark, and similar tools. more »
encryptions, etc.). In-depth knowledge and experience with Azure data storage (SQL Server, Data Lake, Synapse, etc.) & access tools, APIs, cloud connectivity, and ETL processes. Knowledge and some experience of MS Office/MS Office 365 suite, SharePoint Online, Power Apps, GitHub, MS Teams, etc. In-depth knowledge & experience more »
Proficiency in Azure services (SQL Database, Data Lake Storage, Databricks, CosmosDB). Understanding of data modelling and governance principles. Experience with Python, Databricks, and ETL tools. Familiarity with Spark, Kafka, and Azure service integration. Skills in Power BI or Azure Data Explorer for visualisation. Knowledge of data security and compliance. more »
legacy data to our new cloud-based data platform. Design and implement master and transactional data models to ensure data integrity and consistency. Develop ETL pipelines and cloud data storage solutions that meet business requirements and non-functional needs. Collaborate closely with data analysts, visualization engineers, and project managers to more »
with relational databases and NoSQL databases Significant experience and in-depth knowledge of creating data pipelines and associated design principles, standards, data modelling concepts, ETL procedures, and all steps of the data production process Experience with unit and integration testing, and data quality frameworks such as Deequ, Great Expectations or Delta more »
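The listing above names data quality frameworks (Deequ, Great Expectations). As a plain-Python sketch of the idea those tools implement, declaring expectations over a dataset and collecting failures; this is not the API of either library, and the column names are made up:

```python
# Hand-rolled "expectations" over a row set, illustrating what data
# quality frameworks automate. Dataset and checks are illustrative.
rows = [
    {"user_id": 1, "email": "a@example.com"},
    {"user_id": 2, "email": None},
    {"user_id": 2, "email": "c@example.com"},
]

def expect_not_null(rows, column):
    # Return the rows that violate the not-null expectation.
    return [r for r in rows if r[column] is None]

def expect_unique(rows, column):
    # Return the rows whose value in `column` was already seen.
    seen, dupes = set(), []
    for r in rows:
        if r[column] in seen:
            dupes.append(r)
        seen.add(r[column])
    return dupes

failures = {
    "email not null": expect_not_null(rows, "email"),
    "user_id unique": expect_unique(rows, "user_id"),
}
print({k: len(v) for k, v in failures.items()})
# {'email not null': 1, 'user_id unique': 1}
```

Real frameworks add declarative suites, profiling, and reporting on top of exactly this kind of check.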
Livingston NJ, Livingston, Essex County, New Jersey
NexusJobs
London, England, United Kingdom Hybrid / WFH Options
Amplifi Capital
working in virtual teams and being in constant collaboration Ability to work unsupervised and take ownership for key services Nice to Have: Experience with ETL architectures and tools, including integration with APIs and coding Python data pipelines Team leadership skills for managing tasks and conducting stand-ups Knowledge of reference more »
Manchester Area, United Kingdom Hybrid / WFH Options
Airtime Rewards
Requirements Bachelor’s degree in Computer Science, a relevant technical field, or equivalent experience. Experience designing cloud data warehouse solutions, data modelling, and building ETL/ELT processes, preferably on GCP or equivalent platforms (AWS, Azure, Snowflake). Proficiency with SQL, Python, Docker, and Terraform (or similar IaC tools). more »
implementation of appropriate observability (monitoring and alerting) across our Data platforms. We’re looking for a Lead Data Engineer who has: Experience working with ETL/ELT tools, including Azure Data Factory and dbt, developing a wide variety of integration solutions incorporating APIs, files, databases, etc. Experience with database platforms more »
to structure data and make databases work efficiently. Tech Skills: Familiarity with cloud-based data platforms and technologies. Good knowledge of data analysis, modelling, ETL processes, data warehousing, and core data infrastructure services. Exposure to technologies like Microsoft Synapse, Azure Data Factory, Databricks Notebooks, Python/PySpark, and similar tools. more »
and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, DBT, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and tools. Understanding of DataOps, data mining, and data visualisation/BI tools. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. more »
encryptions, etc.). In-depth knowledge and experience with Azure data storage (SQL Server, Data Lake, Synapse, etc.) & access tools, APIs, cloud connectivity, and ETL processes. In-depth knowledge & experience using Visual Studio, with one of the programming languages C#/Java/JavaScript/Python, PowerShell, and Postman, SOAPUI more »
Abilities And Knowledge: Highly numerate with previous experience in delivering BI reporting solutions. Expert T-SQL skills are essential. Strong understanding of data warehousing, ETL and database concepts. Strong experience of Microsoft BI technologies. Experience in delivering complex reporting solutions. Technologies Essential: Microsoft stack: SQL Server, SSIS, SSRS, Power BI more »
demonstrate working on large engagements * Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR) * Java, Scala, Python, Spark, SQL * Experience of developing enterprise grade ETL/ELT data pipelines. * Deep understanding of data manipulation/wrangling techniques * Demonstrable knowledge of applying Data Engineering best practices (coding practices to DS, unit more »
Abilities And Knowledge: Highly numerate with previous experience in delivering BI reporting solutions. Expert T-SQL skills are essential. Strong understanding of data warehousing, ETL and database concepts. Strong experience of Microsoft BI technologies. Experience in delivering complex reporting solutions. Technologies Essential: stack Server BI Studio A framework based upon more »
Brighton, England, United Kingdom Hybrid / WFH Options
Legal & General
impact on internal stakeholders and customers is understood. Qualifications What we’re looking for Proficient across many domains of data engineering, including ELT/ETL, metadata management, data integration, data management in transit and at rest, and data streaming. Experience of Kimball modelling techniques to design databases & data warehousing solutions more »
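Kimball modelling, mentioned above and again in later listings, means organising a warehouse as fact tables joined to dimension tables (a star schema). A minimal, hedged sketch of that layout, with illustrative table and column names:

```python
# Star-schema sketch: one fact table keyed to dimension tables, queried
# with a typical join-and-aggregate. All names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales VALUES (20240101, 1, 10.0), (20240101, 1, 5.0);
""")

# A typical BI query: join the fact to a dimension and aggregate.
row = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name
""").fetchone()
print(row)  # ('Widget', 15.0)
```

The design choice is the one Kimball advocates: numeric measures live in the fact table, descriptive attributes in conformed dimensions, so BI tools can slice by any dimension with a simple join.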
PowerBI. Collaborate with stakeholders to understand reporting requirements and translate them into effective BI solutions. Data Integration and Modeling: Utilize strong SQL skills to extract, transform, and load (ETL) data from various sources into PowerBI, and write complex measures in DAX. Design and implement data models that align with business more »
of proven professional experience in Business Intelligence, Analytics or any other relevant technical field, preferably using Power BI Knowledge of data modelling, OLAP and ETL frameworks Strong relational database and SQL experience (Microsoft SQL Server, T-SQL) Experience with data visualization tools (Tableau, Power BI or Qlik) Experience of working more »
and scalability. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or related field. Extensive experience in designing and implementing data pipelines, ETL processes, and data warehousing solutions. Proficiency in cloud platforms such as Azure, with hands-on experience in data lakes, Databricks, and Synapse Analytics. Strong programming more »
data pipelines for data ingestion, processing, and transformation in Azure. Utilise Azure Data Factory, Azure Databricks and SAP Business Objects to create and maintain ETL operations. Deliver dashboard reporting and data sets which are interactive and user-friendly via PowerBI. Identifying and integrating external data into the business data model more »
new technology • able to quickly learn new skills & technologies: sharing any insights gained with colleagues • technical testing experience testing data pipelines or batch and ETL type processes • experience of working in Agile/SCRUM/Kanban and TDD environments and teams • willing to listen to ideas of others and more »
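The listing above asks for experience testing data pipelines in TDD environments. As a minimal sketch of a unit test for a single transform step, using only the standard library (the `normalise` function is a hypothetical pipeline step, not from any listing):

```python
# Unit-testing a pipeline transform step with the standard library's
# unittest module. The normalise() function is a hypothetical example.
import unittest

def normalise(record):
    # Trim whitespace and lower-case an email field.
    return {"email": record["email"].strip().lower()}

class TestNormalise(unittest.TestCase):
    def test_trims_and_lowercases(self):
        self.assertEqual(normalise({"email": "  A@B.COM "}),
                         {"email": "a@b.com"})

# Run the suite programmatically rather than exiting the interpreter.
unittest.main(argv=["ignored"], exit=False)
```

Integration tests for batch/ETL processes follow the same shape, but assert on the loaded output of a whole pipeline run against a small fixture dataset.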
high-performance data systems that are foundational to driving business growth and success. Key responsibilities: Implementing scalable data architectures and systems. Developing and maintaining ETL (Extract, Transform, Load) pipelines. Managing data storage, backup, and recovery mechanisms. Writing complex SQL queries to extract data for analysis. Developing and implementing data security more »
working with relational databases Experience in data visualization best practices and storytelling with data Understanding of business intelligence concepts and data analytics Knowledge of ETL processes and data integration Experience with data warehousing and data governance Strong problem-solving and analytical skills Excellent communication and collaboration abilities Attention to detail more »
requirements into technical requirements & development experience using data discovery tools such as MS Power BI, QlikView, Tableau Ability to model and transform data, build ETL solutions and present data in a useful business context Experience developing data warehouses & data marts using the Kimball methodology Experience or awareness of Big Data more »