… Python and SQL; strong experience with data engineering libraries (e.g., Pandas, PySpark, Dask). Deep knowledge of ETL/ELT frameworks and orchestration tools (e.g., Airflow, Azure Data Factory, Dagster). Proficient in cloud platforms (preferably Azure) and services such as Data Lake, Synapse, Event Hubs, and Functions. Authoring reports and dashboards with either open-source or commercial products …
… CI/CD skills to automate the development lifecycle: proficiency with GitHub Actions preferred. Advanced proficiency in Python for data engineering tasks. Experience with data orchestration tools such as Airflow, Dagster, or similar. Solid understanding of data governance, security principles, and privacy best practices, ideally in regulated or sensitive data environments. Experience with dbt for data modelling and quality testing.
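For readers unfamiliar with what "data orchestration" means in practice here, a minimal Dagster sketch of a scheduled pipeline; the asset name, job name, and cron cadence are invented for the example, not taken from any listing:

```python
# Illustrative sketch only: a tiny scheduled pipeline in Dagster.
# All names and the schedule below are hypothetical.
import dagster as dg

@dg.asset
def daily_extract() -> list[dict]:
    # Placeholder extract step; a real pipeline would pull from an API or database.
    return [{"id": 1, "value": 42}]

daily_job = dg.define_asset_job("daily_job", selection="daily_extract")

daily_schedule = dg.ScheduleDefinition(
    job=daily_job,
    cron_schedule="0 6 * * *",  # run every day at 06:00
)

defs = dg.Definitions(
    assets=[daily_extract],
    jobs=[daily_job],
    schedules=[daily_schedule],
)
```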
… the portfolio. Key Requirements: Strong proficiency in Python and PySpark. Successful track record in a Data Engineering role, including Database Management (ideally SQL), Data Orchestration (ideally Apache Airflow or Dagster), Containerisation (ideally Kubernetes and Docker), Data Pipelines (big data technologies and architectures). Experience in Financial Services (ideally Commodity Trading). Bachelor's degree in Information Systems, Computer Science, Engineering, or …
… together data from all kinds of sources, whether that's batch files or real-time streams. You'll have set up and worked with ETL and ELT tools like Dagster, AWS Glue, Azure Data Factory, Airflow, or dbt, and you can decide which tools are right for the job. You'll have an understanding of how Node.js and TypeScript …
… similar. Prior experience or interest in working with geospatial data. Technologies we use: Programming languages: SQL, Python, LookML (+ Go for other backend services). Development tools and frameworks: dbt, Dagster, Airbyte, dlt, data-diff, Elementary. Data lake and warehouse: GCS, BigQuery. Analytics: Looker, Looker Studio, and geospatial analytics tools. How we reward our team: Dynamic working environment with a …
… BI tools such as Lightdash, Looker, or Tableau. Comfortable with version control, testing frameworks, and CI/CD in a data context. Familiarity with Python and orchestration tools like Dagster or Airflow is highly desirable. Experience with e-commerce and/or subscription services is desirable. The Interview Process: Meet & Greet call with a Talent Partner; call with the Hiring …
… and rebuilding our data platform to ensure we stay ahead in a rapidly evolving landscape. You'll be hands-on, using the latest tools like Apache Arrow, Polars, and Dagster to craft a data architecture that aligns with our innovative investment strategies. In this position, you'll be deeply involved in building and optimizing data pipelines that are reliable …
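As a rough illustration of the Polars-plus-Dagster combination this listing names, a minimal sketch; the columns and the upstream/downstream split are assumptions made up for the example:

```python
# Illustrative sketch only: Polars dataframes flowing between Dagster assets.
# The data, column names, and asset names are hypothetical.
import dagster as dg
import polars as pl

@dg.asset
def raw_trades() -> pl.DataFrame:
    # Stand-in for a real source such as Parquet files or a message stream.
    return pl.DataFrame({"symbol": ["AAPL", "MSFT", "AAPL"], "qty": [10, -5, 3]})

@dg.asset
def net_positions(raw_trades: pl.DataFrame) -> pl.DataFrame:
    # Polars expresses the transformation as a fast columnar query.
    return raw_trades.group_by("symbol").agg(pl.col("qty").sum().alias("net_qty"))
```

Calling dg.materialize([raw_trades, net_positions]) in a test would run both assets in-process for a quick local check.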
… Defender suite, Microsoft Entra, Microsoft Purview, and Azure security services (e.g. Key Vault, NSGs, WAF). Experience with Kubernetes (Azure Kubernetes Service) and data platforms (e.g. Databricks, Snowflake, Dagster). Proven understanding of security risk management. Excellent understanding of common security controls, in particular cloud security controls. Understanding of threat modelling. Knowledge of ISO 27001 and other commonly …
At least 4 years of industry experience working with the above tools and technologies. Nice to have: Experience with other parts of our tech stack (ClickHouse, Postgres, Datadog, Dagster, Temporal, AWS, GitLab, and RunAI). Any domain knowledge within Life Sciences and Biotech/TechBio, specifically around high-throughput sequencing, ML models, and bioinformatics. What we can offer in …
In detail, the position encompasses duties and responsibilities as follows: an experienced Data Engineer is required for the Surveillance IT team to develop ingestion pipelines and frameworks across the application portfolio, supporting Trade Surveillance analysts with strategy and decision-making.
We are looking for brilliant engineers to join our team at Magentic. We're pushing the boundaries of AI with next-generation agentic systems that can manage entire workflows. We're focusing on a three-trillion-dollar market of supply …
… at the client's office, which is located in London. Key Responsibilities: Data Pipeline Development & Maintenance: Build, maintain, and optimize scalable ETL/ELT pipelines using tools such as Dagster or similar. Ensure high data availability, reliability, and consistency through rigorous data validation and monitoring practices. Collaborate with cross-functional teams to align data pipeline requirements with business objectives
… experience in query optimization and data modelling. Strong programming skills in Python (preferred), with a focus on building scalable data solutions. Experience with data pipeline orchestration tools such as Dagster or similar. Familiarity with cloud platforms (e.g. AWS) and their data services (e.g. S3, Redshift, Snowflake). Understanding of data warehousing concepts and experience with modern warehousing solutions. Experience …
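The "rigorous data validation and monitoring" duty above commonly maps to something like Dagster's asset checks. A minimal sketch, with asset and column names assumed purely for illustration:

```python
# Sketch of data validation attached to a pipeline step; the asset and
# column names here are hypothetical, not from the listing.
import dagster as dg
import polars as pl

@dg.asset
def cleaned_orders() -> pl.DataFrame:
    raw = pl.DataFrame({"order_id": [1, 2, None], "amount": [9.5, 12.0, 3.2]})
    return raw.filter(pl.col("order_id").is_not_null())

@dg.asset_check(asset=cleaned_orders)
def order_ids_not_null(cleaned_orders: pl.DataFrame) -> dg.AssetCheckResult:
    null_count = cleaned_orders["order_id"].null_count()
    # A failing check surfaces in Dagster's UI and alerting, which is the
    # kind of monitoring signal the role describes.
    return dg.AssetCheckResult(passed=null_count == 0, metadata={"null_ids": null_count})
```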
… to join our client's data engineering team and help deliver mission-critical data pipelines. Day to day: Work closely with our Tech Lead on data pipelines using Snowflake, Dagster, dbt, and Python. Translate requirements from stakeholders into clear, actionable work for engineers. Help ensure data is delivered quickly, reliably, and in an optimised way. Experience: Experience with data
… pipelines, orchestration, and transformation tools. Strong communicator who can work across technical and non-technical teams. Knowledge of Snowflake, Dagster, or dbt. …
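For the Snowflake/Dagster/dbt stack this listing names, a minimal sketch of how the pieces commonly connect via the dagster-dbt integration; the project directory and asset names are placeholders, not details from the role:

```python
# Sketch of orchestrating a dbt project (e.g. models built in Snowflake)
# from Dagster via the dagster-dbt integration. The project path is a
# placeholder; a real setup points at an actual dbt project.
from pathlib import Path

import dagster as dg
from dagster_dbt import DbtCliResource, dbt_assets

DBT_PROJECT_DIR = Path("analytics")  # hypothetical dbt project directory

@dbt_assets(manifest=DBT_PROJECT_DIR / "target" / "manifest.json")
def analytics_dbt_assets(context: dg.AssetExecutionContext, dbt: DbtCliResource):
    # `dbt build` runs and tests every model; Dagster streams the events
    # back as asset materialisations.
    yield from dbt.cli(["build"], context=context).stream()

defs = dg.Definitions(
    assets=[analytics_dbt_assets],
    resources={"dbt": DbtCliResource(project_dir=str(DBT_PROJECT_DIR))},
)
```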