reporting on critical project data across the construction lifecycle. You will work closely with data analysts and IT to leverage the project data lake, ETL processes, and Power BI to deliver insightful reports and dashboards to stakeholders. Key Responsibilities: Implement and maintain reporting processes, including monthly reports Collaborate with PMO more »
the infrastructure that supports their cutting-edge trading strategies. Key Responsibilities: Design and manage scalable data pipelines. Integrate data from various sources and develop ETL processes. Optimize database systems for high performance. Work with data scientists and analysts to meet data needs. Automate data processing tasks to enhance efficiency. Implement more »
management of an Azure cloud-based data & analytics platform. Comprehensive experience in the complete data pipeline development process, encompassing data warehousing, data analytics, and ETL processes. Profound understanding of best practices in data governance, privacy, and quality. Exceptional skills in communication, leadership, and team management. Capability to handle multiple tasks more »
Base Azure or AWS or GCP Python/PySpark Proficiency in SQL and/or similar data technologies Familiarity with data pipeline tools and ETL processes Knowledge of cloud platforms and data architecture Additional Skills: Excellent communication and stakeholder management skills Ability to translate complex data into actionable insights. Strong more »
Hackney, Greater London, Shoreditch, United Kingdom
Talent Smart
We are seeking a highly skilled and motivated Data Engineer. The ideal candidate will have extensive experience in data engineering, particularly with Snowflake, ETL processes, and Power BI. The Data Engineer will be responsible for designing, developing, and maintaining our data infrastructure, ensuring seamless data integration and delivery of high … quality insights. Key Responsibilities: Data Pipeline Development: Design, build, and maintain efficient and reliable ETL processes to move data from various sources into our Snowflake data warehouse. Data Modelling: Develop and maintain data models and schemas to support business needs and ensure data integrity and quality. Data Integration: Integrate data … experience in data engineering or a related role. Proven experience with Snowflake data warehouse, including data loading, transformations, and performance tuning. Strong expertise in ETL tools and processes (e.g., Talend, Informatica, Apache NiFi, etc.). Experience with data visualization tools, particularly Power BI. Excellent problem-solving and analytical skills. Strong more »
capabilities and support our Finance business partners in their decision-making processes. Key responsibilities include: Design, build, and maintain efficient, reliable data pipelines using ETL and ELT processes. Ensure the seamless flow and availability of high-quality data across the organisation Use Snowflake for data storage, processing, and analytics. Optimise more »
of team, including ways of working, engineering principles, data governance and best practice. Become an SME on the design, development, and deployment of data ETL pipelines (using Azure Data Factory and other technologies) to access, combine and transform data from on-prem and cloud-based sources. Ensure that all data more »
to work with a team of experts who are passionate about harnessing data to create impactful solutions. Key Responsibilities: • Design, develop, and maintain robust ETL (Extract, Transform, Load) processes to ensure efficient data flow and integration across various systems. • Utilize Python for data manipulation, transformation, and automation tasks. • Collaborate with … a Data Engineer or in a similar role. • Strong proficiency in Python and experience with relevant libraries (e.g., pandas, numpy). • Extensive experience with ETL tools and processes. • Familiarity with data warehousing concepts and technologies (e.g., Redshift, BigQuery, Snowflake). • Proficient in SQL and experience with relational databases (e.g., PostgreSQL more »
and communicate how our data inputs and outputs affect different groups and identify areas Janes can provide additional value. Requirements Experience architecting and developing ETL pipeline solutions for the ingestion, transformation, and serving of data, as well as solutions for the orchestration of pipeline components (e.g. AWS Step Functions, Apache more »
guide and motivate junior team members Deep understanding of the Microsoft technology stack, specifically PowerBI, Databricks & Azure Cloud Proficiency in SQL, strong understanding of ETL and ELT processes Strong communication skills, ability to translate complex messages into concise, easy-to-understand messages Strong storytelling skills, ability to influence decision more »
Reporting and ETL Developer, Dynamics 365, D365, D365 F&O, D365 Consultant, AX 2012, Dynamics AX, Microsoft Power BI, SSRS, ETL, Reporting developer, SQL, Azure Synapse, Data Lake, Lake House, Delta Lake, Dataverse, DAX, BI reports, Paginated Report Builder, South London, Hybrid, £60-£75K Our end-user client requires … a Reporting and ETL Developer to join them on a permanent basis to work on their D365 FO implementation. As a Technical Resource, you will play a pivotal role in supporting analytics and reporting efforts within Dynamics 365. It will be a Hybrid role with a requirement to be onsite … Analytics and Reporting: Collaborating with cross-functional teams to identify reporting requirements, design dashboards, and generate insights that drive business decisions. SQL Experience for ETL: Designing and executing Extract, Transform, Load (ETL) processes using SQL to facilitate data integration and reporting. Azure Resource Procurement: Procuring resources in Azure Portal to more »
quality Business Intelligence products and solutions Good understanding of the Microsoft technology stack, specifically PowerBI, Databricks & Azure Cloud Proficiency in SQL, good understanding of ETL and ELT processes Strong communication skills, ability to translate complex messages into concise, easy-to-understand messages Strong storytelling skills, ability to create a more »
and Machine Learning engineers and it is responsible for supporting data scientists in deploying, maintaining and monitoring an increasing number of Python-based microservices, ETL pipelines, SaaS models, databases and vector stores. The MLOps Lead would need to act as an interface between data scientists, the data & analytics team and more »
be a SPOC for all technical discussions across industry groups. • Excellent design experience, with entrepreneurship skills to own and lead solutions for clients • Excellent ETL skills, Data Modeling Skills • Excellent communication skills • Ability to define the monitoring, alerting, deployment strategies for various services. • Experience providing solutions for resiliency, failover more »
knowledge in distributed systems, cloud architecture, and data pipelines. Proficiency in Python programming (knowledge of Scala or Rust is a plus). Familiarity with ETL principles in contemporary data applications (Dagster, Airflow, Prefect). Familiarity with AWS services such as Glue, Redshift, Athena, and S3. Proficiency with Terraform, Kubernetes, and more »
Python and its data processing, analysis, and visualization libraries Experienced with SQL and time-series databases Skilled in AWS services: S3, EC2, RDS Knowledgeable in ETL tools like Airflow Proficient in Git, CI/CD, testing tools, and documentation best practices Adheres to quality engineering practices including TDD and BDD Nice more »
Devise practical solutions to business challenges, drawing from industry experience. Spearhead engineering collaboration with actuarial subject matter experts. Guide team members in developing robust ETL processes. Uphold exemplary coding and technical standards. Showcase technical prowess in Python, SQL, and JSON. Implement and oversee a robust CI/CD process. Enhance more »
a genuine impact on a growing business. Requirements: Strong experience with Python Strong SQL Experience working with REST Microservices Strong experience building and managing ETL Pipelines Exposure to modern cloud technologies - (Azure, GCP, AWS etc) Strong formal education - ideally in Computer Science If this sounds of interest, then please do more »
and Sell additional annual leave Funded Learning and development programmes The Successful Data Engineer will have: Experience in developing and designing data pipelines and ETL processes Previous experience of data warehousing Utilise SAS for data reporting and analysis Skilled in SQL, Python, Git and Databricks In-depth experience of reporting more »
Strong stakeholder management & communication skills Experience of working with structured and unstructured datasets Able to design a range of architectural solutions Data models and ETL Data lake and data warehouse end-to-end architecture Experience within media, publishing, research, or a similar consumer focused industry is highly desirable, but not more »
with plans to also add GCP and on-prem. They are adding extensive usage of distributed compute on Spark, starting with their more complex ETL and advanced analytics functions, e.g. Time Series Processing. They soon plan to integrate other approaches, including native distributed PyTorch/TensorFlow, Spark-based distributor libraries more »
Provide estimates, work independently and meet deadlines Manage the releases and the related builds in each environment Perform development, testing, and support of data ETL/ELT programs (pipelines), using such tools as AWS Glue, Azure Data Factory, Informatica or similar ETL/ELT platforms, as well as knowledge of … learn the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Airflow Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of timing of data loads, data transformation, and optimization of ETL load performance. Provide production more »
financial services. Strong leadership skills with a track record of successfully managing and developing high-performing teams. In-depth knowledge of data engineering concepts, ETL processes, and data warehouse architectures. Expertise in working with big data technologies and cloud platforms (preferably AWS or Azure). Familiarity with asset management industry more »
Financial Services domain knowledge Do you tick these 4 boxes? Must haves: Python coding experience (Pure Python) - JSON format Data modelling, data warehousing and ETL frameworks SQL Our client has 2,500 lines of Python code for you to maintain and optimize. You'll need hands-on coding for this more »
Science, Engineering, or a related field. Strong Python development. Experience with Pandas is desirable. 3+ years of experience in data engineering. Experienced in building ETL data pipelines. Relational database experience w/ PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with Object-Oriented Programming more »