and data transformation. Strong experience in managing and leading data engineering teams. Strong proficiency in Scala, Python, SQL, Snowflake and DBT. In-depth understanding of data modelling, ETL processes, and data warehousing concepts. Experience with cloud-based data platforms (e.g., AWS, Azure, GCP) and containerisation technologies (e.g., Docker, Kubernetes) is a plus. Excellent problem-solving more »
to work with a team of experts who are passionate about harnessing data to create impactful solutions. Key Responsibilities: Design, develop, and maintain robust ETL (Extract, Transform, Load) processes to ensure efficient data flow and integration across various systems. Utilize Python for data manipulation, transformation, and automation tasks. Collaborate with … a Data Engineer or in a similar role. Strong proficiency in Python and experience with relevant libraries (e.g., pandas, numpy). Extensive experience with ETL tools and processes. Familiarity with data warehousing concepts and technologies (e.g., Redshift, BigQuery, Snowflake). Proficient in SQL and experience with relational databases (e.g., PostgreSQL more »
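The extract-transform-load responsibility described above can be sketched in plain Python. This is a minimal illustration, not code from the listing: the CSV data, table name, and column names are all hypothetical, and a production pipeline would typically use pandas or a dedicated ETL tool rather than the standard library alone.

```python
import csv
import io
import sqlite3

# Hypothetical raw input; in practice this would come from a file, API, or source system.
RAW_CSV = """order_id,amount,currency
1001,250.00,gbp
1002,99.50,GBP
"""

def extract(source: str) -> list[dict]:
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and normalise the currency code."""
    return [(int(r["order_id"]), float(r["amount"]), r["currency"].upper())
            for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 349.5)
```

Each stage is a separate function so it can be tested and automated independently, which is the usual shape such roles expect.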
and communicate how our data inputs and outputs affect different groups and identify areas where Janes can provide additional value. Requirements Experience architecting and developing ETL pipeline solutions for the ingestion, transformation, and serving of data, as well as solutions for the orchestration of pipeline components (e.g. AWS Step Functions, Apache more »
guide and motivate junior team members Deep understanding of the Microsoft technology stack, specifically PowerBI, Databricks & Azure Cloud Proficiency in SQL, strong understanding of ETL and ELT processes Strong communication skills, ability to translate complex messages into concise, easy-to-understand messages Strong storytelling skills, ability to influence decision more »
Job details: Cloud technologies (preferably GCP/Azure/AWS) Python programming ETL, ELT development in classical DWHs (Oracle, DB2, Teradata) PL/SQL, data modeling Knowledge of Oracle 12c, TMSP AAX² Development in Microsoft Power BI environment, MS-BI suite CI/CD with GitLab Active communication in English more »
Reporting and ETL Developer, Dynamics 365, D365, D365 F&O, D365 Consultant, AX 2012, Dynamics AX, Microsoft power BI, SSRS, ETL, Reporting developer, SQL, Azure synapse, Data Lake, Lake House, Delta Lake, Data verse, DAX, BI reports, Paginated Report Builder, South London, Hybrid, £60-£75K Our end user client requires … a Reporting and ETL Developer to join them on a permanent basis to work on their D365 FO implementation. As a Technical Resource, you will play a pivotal role in supporting analytics and reporting efforts within Dynamics 365. It will be a Hybrid role with a requirement to be onsite … Analytics and Reporting: Collaborating with cross-functional teams to identify reporting requirements, design dashboards, and generate insights that drive business decisions. SQL Experience for ETL: Designing and executing Extract, Transform, Load (ETL) processes using SQL to facilitate data integration and reporting. Azure Resource Procurement: Procuring resources in Azure Portal to more »
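The "SQL Experience for ETL" duty above can be sketched as a staging-to-reporting load expressed entirely in SQL. The tables, columns, and data here are illustrative assumptions, not from the listing, and SQLite stands in for whatever warehouse engine the role actually uses.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical staging table as rows might land from a source system,
# with untrimmed text amounts and inconsistent casing.
conn.executescript("""
CREATE TABLE staging_sales (sale_id INTEGER, amount TEXT, region TEXT);
INSERT INTO staging_sales VALUES
    (1, ' 120.50', 'south london'),
    (2, '80.00 ', 'south london'),
    (3, '45.25', 'kent');

CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, amount REAL, region TEXT);

-- The transform-and-load step in pure SQL:
-- trim whitespace, cast the amount to a number, normalise the region casing.
INSERT INTO fact_sales
SELECT sale_id, CAST(TRIM(amount) AS REAL), UPPER(region)
FROM staging_sales;
""")

rows = conn.execute(
    "SELECT region, SUM(amount) FROM fact_sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('KENT', 45.25), ('SOUTH LONDON', 200.5)]
```

Keeping the transformation inside an `INSERT INTO ... SELECT` is the common pattern when ETL is SQL-first: the cleaned reporting table is rebuilt from staging in one set-based statement.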
quality Business Intelligence products and solutions Good understanding of the Microsoft technology stack, specifically PowerBI, Databricks & Azure Cloud Proficiency in SQL, good understanding of ETL and ELT processes Strong communication skills, ability to translate complex messages into concise, easy-to-understand messages Strong storytelling skills, ability to create a more »
and Machine Learning engineers and it is responsible for supporting data scientists in deploying, maintaining and monitoring an increasing number of Python-based microservices, ETL pipelines, SaaS models, databases and vector stores. The MLOps Lead would need to act as an interface between data scientists, the data & analytics team and more »
be a SPOC for all technical discussions across industry groups. • Excellent design experience, with entrepreneurship skills to own and lead solutions for clients • Excellent ETL skills, Data Modeling Skills • Excellent communication skills • Ability to define the monitoring, alerting, deployment strategies for various services. • Experience providing solutions for resiliency, fail over more »
knowledge in distributed systems, cloud architecture, and data pipelines. Proficiency in Python programming (knowledge of Scala or Rust is a plus). Familiarity with ETL principles in contemporary data applications (Dagster, Airflow, Prefect). Familiarity with AWS services such as Glue, Redshift, Athena, and S3. Proficiency with Terraform, Kubernetes, and more »
Python and its data processing, analysis, and visualization libraries Experienced with SQL and time-series databases Skilled in AWS services: S3, EC2, RDS Knowledgeable in ETL tools like Airflow Proficient in Git, CI/CD, testing tools, and documentation best practices Adheres to quality engineering practices including TDD and BDD Nice more »
and Sell additional annual leave Funded Learning and development programmes The Successful Data Engineer will have: Experience in developing and designing data pipelines and ETL processes Previous experience of data warehousing Utilise SAS for data reporting and analysis Skilled in SQL, Python, Git and Databricks In depth experience of reporting more »
Strong stakeholder management & communication skills Experience of working with structured and unstructured datasets Able to design a range of architectural solutions Data models and ETL Data lake and data warehouse end-to-end architecture Experience within media, publishing, research, or a similar consumer focused industry is highly desirable, but not more »
Provide estimates, work independently and meet deadlines Manage the releases and the related builds in each environment Perform development, testing, and support of data ETL/ELT programs (pipelines), using such tools as AWS Glue, Azure Data Factory, Informatica or similar ETL/ELT platforms, as well as knowledge of … learn the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Airflow Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of timing of data loads, data transformation, and optimization of ETL load performance. Provide production more »
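The "timing of data loads" aspect above, deciding which tables must load before which, amounts to a topological sort over table dependencies. The sketch below uses hypothetical warehouse table names; real schedulers such as Airflow or Autosys express the same idea as a DAG of tasks.

```python
from graphlib import TopologicalSorter

# Hypothetical dependencies: each table maps to the tables it reads from,
# so dimension loads must finish before the facts that join to them.
dependencies = {
    "dim_customer": set(),
    "dim_product": set(),
    "fact_orders": {"dim_customer", "dim_product"},
    "agg_daily_sales": {"fact_orders"},
}

# static_order() yields every table after all of its predecessors,
# i.e. a valid load sequence for the physical layer.
load_order = list(TopologicalSorter(dependencies).static_order())
print(load_order)
```

The relative order of independent dimensions is not fixed; what matters, and what a scheduler guarantees, is that every table appears after everything it depends on.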
London, England, United Kingdom Hybrid / WFH Options
Pioneer Search
in delivering complex solutions for Enterprise level Clients. As well as maintaining robust and scalable applications in Scala, you will be able to implement ETL pipelines to process, transform, and standardize data from various sources as well as optimise the performance of Spark applications. Work closely with data scientists, software more »
financial services. Strong leadership skills with a track record of successfully managing and developing high-performing teams. In-depth knowledge of data engineering concepts, ETL processes, and data warehouse architectures. Expertise in working with big data technologies and cloud platforms (preferably AWS or Azure). Familiarity with asset management industry more »
Azure technologies Financial Services domain knowledge Do you tick these 4 boxes? Must haves: Python coding experience (Pure Python) Data modelling, data warehousing and ETL frameworks SQL Our client has 2,500 lines of Python code for you to maintain and optimize. You'll need hands-on coding for this more »
Science, Engineering, or a related field. Strong Python development. Experience with Pandas is desirable. 3+ years of experience in data engineering. Experienced in building ETL data pipelines. Relational database experience w/PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with Object-Oriented Programming more »
processes. Desired : experience in Infrastructure as Code developments and tools e.g. Terraform Desired : experience with MLOps deployment and maintenance. Desired: Data Engineering technologies e.g. ETL , Spark , Dataflow , BigQuery Please note: even if you don't have exactly the background indicated, do contact us now if this type of job is more »
to take their next step in their career and wants to take on more responsibility. Technical Skillset: Experience as a data engineer developing and maintaining ETL/ELT processes. Good experience in Data Modeling within a cloud-based data platform. Strong experience with SQL Server. Azure data engineering stack, including Azure Synapse and Azure more »
Science or related fields. Proven experience with Insurance Broking Systems data migration (ideally Acturis). Proficiency in SQL and data manipulation languages. Experience with ETL tools. Strong analytical and critical thinking skills, with a focus on practical solutions. Excellent communication and people skills for conveying data concepts to diverse audiences. more »
+ benefits Purpose: Design, build, and maintain scalable data architectures, including pipelines and cloud-based data warehouses. Tech: Python (NumPy, Pandas), SQL, ETL, Cloud (AWS, Azure or GCP), Snowflake, Airflow, BigQuery, PowerBI/Tableau Industry: Fintech, Maritime trading Immersum are supporting the growth of a specialist consultancy who solely specialise … Pandas and NumPy. SQL: Advanced skills in complex querying and data manipulation. Data Modelling: Proven ability in designing efficient models for scalability and performance. ETL Processes: Deep expertise in developing and optimising ETL pipelines. Version Control: Experience with Git for code collaboration and change tracking. Data Pipeline Tools: Proficiency with more »