Mathematics, or related field 📊 Experience: Proven Data Engineer experience, preferably in finance 💻 Skills: Python, SQL, NoSQL, big data tools; Cloud platforms (AWS, Azure, GCP); ETL processes and database design. Join us and make a difference more »
Azure and Snowflake, enhancing data capabilities for analytics and science. What you need: 3+ years of hands-on experience as a Data Engineer, building ETL pipelines and managing data lifecycle. Strong Python coding experience. 2+ years of commercial experience developing in Snowflake. Good understanding of cloud principles (ideally Azure but more »
Manchester, North West, United Kingdom Hybrid / WFH Options
Adria Solutions
with excellent analytical and problem-solving abilities; Expertise using modern data tools (SQL, Python), Snowflake and AWS experience preferred; Good knowledge of Data Warehouses and ETL pipelines; Good understanding of Cloud-based environments (AWS and/or Azure); Good appreciation of the modern software development lifecycle, CI/CD. Benefits for more »
to combine the latest thinking with traditional military functions. *Must have Active SC Clearance* Role requirements Leveraging Azure cloud technologies for tasks such as ETL pipeline development, data warehousing, data lake creation, and data movement. Utilizing Azure data and analytics services, including but not limited to Azure Data Factory, Azure more »
Microsoft Windows and web services (IIS and Tomcat), CI/CD pipelines (Azure DevOps), and generating XML files Understanding of database design techniques, optimisation, ETL best practices, and dimensional modelling BSc/BA in Computer Science, Engineering, or related field This is a chance to join a brilliant, established business more »
with real-time data analysis and financial systems (preferred). Knowledge of database design principles, performance optimization, and data modeling. Familiarity with data integration, ETL processes, and data warehousing. Excellent problem-solving skills and the ability to work effectively in a fast-paced environment. Strong communication and teamwork skills. A more »
in Python to work within a fast-paced Data team. You will work very closely with Quant Research, Trading, Quant Development and traditional ETL/ELT Data Engineering. In this latency-sensitive trading environment, your work is intrinsic to the firm's PnL. Your core focus will be on more »
London, England, United Kingdom Hybrid / WFH Options
AWTG Ltd
development processes. Additional duties as needed. Preferred Skills: Experience with additional frameworks and libraries such as PyTorch or TensorFlow. Knowledge of data engineering and ETL processes. Experience with version control systems, particularly Git. Knowledge of telecommunications technologies and industry standards. Qualifications: Bachelor's or Master's degree in Computer Science more »
be a SPOC for all technical discussions across industry groups. • Excellent design experience, with entrepreneurship skills to own and lead solutions for clients • Excellent ETL skills, Data Modeling skills • Excellent communication skills • Ability to define monitoring, alerting, and deployment strategies for various services. • Experience providing solutions for resiliency, failover more »
proposing solutions for improvements Skills and Experience Excellent knowledge of SQL Knowledge of Python (other languages might be equivalent) Experience in creating and maintaining ETL pipelines Know your way around Unix-based operating systems Experience working with any major cloud provider (AWS, GCP, Azure) Fluency in English Experience using Apache more »
and manipulation · Analytical capability/Analytical skills: · Issues & risks identification & mitigation · Root cause analysis · Qualitative & quantitative analysis · End to end development of PowerBI Reports (ETL, data modelling, expression writing and report writing) · Experience with data mastery · Good practices in SQL development · Good understanding of relational databases · Advanced Excel (Power Query more »
knowledge in distributed systems, cloud architecture, and data pipelines. Proficiency in Python programming (knowledge of Scala or Rust is a plus). Familiarity with ETL principles in contemporary data applications (Dagster, Airflow, Prefect). Familiarity with AWS services such as Glue, Redshift, Athena, and S3. Proficiency with Terraform, Kubernetes, and more »
Northamptonshire, England, United Kingdom Hybrid / WFH Options
Capgemini
technologies like Python, Scala, Spark, PySpark or Ab Initio, Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. more »
Knutsford, England, United Kingdom Hybrid / WFH Options
Capgemini
technologies like Python, Scala, Spark, PySpark or Ab Initio, Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. more »
Ability to think strategically and assess options across short, medium, and long-term timeframes. Experience with data technologies including MS SQL, data warehousing, and ETL processes. Familiarity with microservices architecture, APIs, and integration patterns. Knowledge of DevOps practices and CI/CD pipelines. Understanding of security standards and best practices more »
analyst, some of the data tasks you will work on are: Participating in data discovery workshops to develop proprietary supply chain-specific datasets Writing ETL code in Python Updating spreadsheets in Excel Connecting different services like ChatGPT and Salesforce on Google Cloud Performing data analyses in SQL (BigQuery) Building dashboards more »
organization. Developing and implementing data models and algorithms to support data science and machine learning initiatives. Technical Requirements Proven track record leading complex ETL and Data Infrastructure projects, as well as designing and building data-intensive applications and services. Experience with data processing and distributed computing frameworks such as more »
a strategic vision for data management information reporting within our organisation. Responsibilities Develop, support, and maintain our business reporting platform, including DataBlend for integrations and ETL, along with gathering requirements for business reporting across a range of systems. Create and maintain reports and dashboards in PowerBI to support business decisions and more »
key stakeholders and other business units to understand business needs and gather requirements. Required Experience Track record of designing and building data infrastructure and ETL pipelines Experience with Azure Platform including Data Factory, Synapse, and Data Lake DB development experience with SQL Software development experience in relevant languages like Python more »
a 50/50 split between BI reporting/analysis and data ingestion, and would suit someone with a couple of years' experience across Python, ETL, cloud (AWS, Azure or GCP) and any BI tool (PowerBI, Tableau, Qlik, QuickSight, or GoodData - it's more about the transferable skills!). This is an more »
Python and its data processing, analysis, and visualization libraries Experienced with SQL and Timeseries databases Skilled in AWS services: S3, EC2, RDS Knowledgeable in ETL tools like Airflow Proficient in Git, CI/CD, testing tools, and documentation best practices Adheres to quality engineering practices including TDD and BDD Nice more »
high availability and disaster recovery scenarios Designing robust implementation, migration, and test plans Database administration Performance optimisation – Queries, indexing, database design, storage Data integration – ETL/ELT Data security Proven troubleshooting skills Scalability Maintenance and operations Licensing What you’ll need Minimum 2 years’ experience with SQL Server more »
Devise practical solutions to business challenges, drawing from industry experience. Spearhead engineering collaboration with actuarial subject matter experts. Guide team members in developing robust ETL processes. Uphold exemplary coding and technical standards. Showcase technical prowess in Python, SQL, and JSON. Implement and oversee a robust CI/CD process. Enhance more »
a genuine impact on a growing business. Requirements: Strong experience with Python Strong SQL Experience working with REST Microservices Strong experience building and managing ETL Pipelines Exposure to modern cloud technologies (Azure, GCP, AWS, etc.) Strong formal education - ideally in Computer Science If this sounds of interest, then please do more »
BI. Work closely with stakeholders to understand their needs, objectives, and requirements. Complete end-to-end development of Power BI dashboards which includes the ETL process required to gather and clean data from various data sources, ensuring data integrity and accuracy. Coordinate with other professionals to implement changes and new more »