Wilmslow, England, United Kingdom Hybrid / WFH Options
The Citation Group
if not? We’ll help you get there:
- Understanding of cloud computing security concepts
- Proficiency with IaC tools like Terraform or CloudFormation
- Experience with workflow orchestration tools (e.g., Airflow, Dagster)
- Good understanding of cloud providers: AWS, Microsoft Azure, Google Cloud
- Familiarity with dbt, Delta Lake, Databricks
- Experience working in Agile environments with tools like Jira and Git
About …
Python and SQL; strong experience with data engineering libraries (e.g., Pandas, PySpark, Dask).
- Deep knowledge of ETL/ELT frameworks and orchestration tools (e.g., Airflow, Azure Data Factory, Dagster).
- Proficient in cloud platforms (preferably Azure) and services such as Data Lake, Synapse, Event Hubs, and Functions.
- Authoring reports and dashboards with either open-source or commercial products …
and development of a new global data platform (PaaS).
- Ensure scalable storage solutions (data lakes, data warehouses) to handle structured and unstructured data.
- Implement ETL/ELT pipelines using Dagster, Airflow, or similar tools.
- Optimize performance and scalability for large data volumes.
- Govern data security, compliance, and access controls.
Development & DevOps:
- Strong programming and scripting skills in Python.
- Knowledge …
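The recurring requirement across these listings is implementing ETL/ELT pipelines with orchestrators such as Dagster or Airflow. A minimal sketch of the underlying extract-transform-load pattern in plain Python (no orchestrator dependency; the data, field names, and threshold are invented for illustration):

```python
# Minimal extract-transform-load (ETL) sketch in plain Python.
# In a real Dagster or Airflow deployment each step would be an
# asset/task; here they are ordinary functions so the pattern is clear.

def extract():
    # Extract: stand-in for reading from a source system (API, file, DB).
    return [{"id": 1, "amount": 120.0}, {"id": 2, "amount": 75.5}]

def transform(rows):
    # Transform: filter and reshape records (threshold is arbitrary).
    return [{"id": r["id"], "amount_gbp": r["amount"]}
            for r in rows if r["amount"] >= 100.0]

def load(rows, target):
    # Load: stand-in for writing to a warehouse table.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)     # → 1
print(warehouse)  # → [{'id': 1, 'amount_gbp': 120.0}]
```

In Dagster, each function would typically become an `@asset` (or a task in Airflow), with the orchestrator handling scheduling, retries, and dependency ordering between the steps.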
together data from all kinds of sources, whether that's batch files or real-time streams. You'll have set up and worked with ETL and ELT tools like Dagster, AWS Glue, Azure Data Factory, Airflow or dbt, and you can decide which tools are right for the job. You'll have an understanding of how Node.js and TypeScript …
and rebuilding our data platform to ensure we stay ahead in a rapidly evolving landscape. You'll be hands-on, using the latest tools like Apache Arrow, Polars, and Dagster to craft a data architecture that aligns with our innovative investment strategies. In this position, you'll be deeply involved in building and optimizing data pipelines that are reliable …
At least 4 years of industry experience working with the above tools and technologies.
Nice to have:
- Experience with other parts of our tech stack (ClickHouse, Postgres, Datadog, Dagster, Temporal, AWS, GitLab and RunAI)
- Domain knowledge within Life Sciences and Biotech/TechBio, specifically around high-throughput sequencing, ML models, and bioinformatics
What we can offer in …
clients
- Collaborating with cross-functional teams to deploy and operate solutions in production
- Supporting real-time and near-real-time data analytics initiatives
- Leveraging orchestration tools such as Airflow, Dagster, Azure Data Factory or Fivetran
Required qualifications to be successful in this role:
- Solid experience designing and delivering Snowflake-based data warehouse solutions
- Strong background performing architectural assessments and … Java or Scala
- Hands-on experience using dbt for pipeline development and transformation
- Familiarity with cloud platforms such as AWS, Azure or GCP
- Knowledge of orchestration tooling (e.g., Airflow, Dagster, Azure Data Factory, Fivetran)
Desirable:
- Experience deploying AI/ML models in production environments
- Familiarity with AWS data services (e.g., S3, Glue, Kinesis, Athena)
- Exposure to real-time data …
In detail, the position encompasses duties and responsibilities as follows: An experienced Data Engineer is required for the Surveillance IT team to develop ingestion pipelines and frameworks across the application portfolio, supporting Trade Surveillance analysts with strategy and decision-making. …
Data Engineer at Magentic - Visa Sponsorship Available Are you an experienced Data Engineer ready to take your career to the next level? Magentic is seeking a talented and driven professional to join our dynamic team in London. In this exciting …
We are looking for brilliant engineers to join our team at Magentic. We're pushing the boundaries of AI with next-generation agentic systems that can manage entire workflows. We're focusing on a three-trillion-dollar market of supply …
at the client's office, which is located in London.
Key Responsibilities:
Data Pipeline Development & Maintenance
- Build, maintain, and optimize scalable ETL/ELT pipelines using tools such as Dagster or similar.
- Ensure high data availability, reliability, and consistency through rigorous data validation and monitoring practices.
- Collaborate with cross-functional teams to align data pipeline requirements with business objectives
… experience in query optimization and data modelling.
- Strong programming skills in Python (preferred), with a focus on building scalable data solutions.
- Experience with data pipeline orchestration tools such as Dagster or similar.
- Familiarity with cloud platforms (e.g., AWS) and their data services (e.g., S3, Redshift, Snowflake).
- Understanding of data warehousing concepts and experience with modern warehousing solutions.
- Experience …
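Several responsibilities above emphasise data validation and monitoring within pipelines. A minimal sketch of row-level validation in plain Python (the required fields and rules are invented for illustration, not taken from any listing):

```python
# Minimal row-level validation sketch: reject records that break
# simple schema rules before they are loaded downstream.

REQUIRED_FIELDS = {"id", "amount"}  # hypothetical schema for this example

def validate(rows):
    """Split rows into (valid, rejected) based on basic rules."""
    valid, rejected = [], []
    for row in rows:
        if not REQUIRED_FIELDS <= row.keys():
            rejected.append(row)   # missing a required field
        elif not isinstance(row["amount"], (int, float)) or row["amount"] < 0:
            rejected.append(row)   # non-numeric or negative amount
        else:
            valid.append(row)
    return valid, rejected

valid, rejected = validate([
    {"id": 1, "amount": 10.0},
    {"id": 2},                     # missing amount -> rejected
    {"id": 3, "amount": -5.0},     # negative amount -> rejected
])
print(len(valid), len(rejected))   # → 1 2
```

In production, rejected rows would typically be routed to a quarantine table and surfaced through monitoring, rather than silently dropped.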