…sources to ensure optimal value extraction. Required Skills: Proficiency in Python. Solid experience with Linux, SQL, relational databases, and version control systems. Familiarity with Airflow, Kafka, and GCP (AWS experience is also acceptable). If this sounds like you, please apply directly or reach out to Daniel O'Connell…
…SQL expertise; designing and maintaining ETL/data pipelines; ideally, proficiency in multiple cloud infrastructures, databases and data warehousing solutions, AWS and GCP being …; Airflow. Bonus points for: experience working in a small, fast-growing start-up, comfortable navigating unstructured/fuzzy environments; experience with RudderStack, Expo and/…
…modelling, and extracting value from large, disconnected datasets; familiarity with data warehouse design and modelling; familiarity with cloud data services (preferably Azure); experience with Airflow. Please note: unfortunately, this role does not offer visa sponsorship.
…So they offer very flexible working arrangements through both WFH options and flexi working hours. Experience required: expert in Snowflake; strong dbt experience; strong Airflow experience; expert knowledge and understanding of data warehousing; strong AWS experience…
London, South East England, United Kingdom – Hybrid / WFH options
Tenth Revolution Group
…Python, dbt, and data modelling, as well as experience with, or a strong interest in, blockchain technology. Twinstake utilises a modern data stack including Airbyte, Airflow, Snowflake, dbt and QuickSight. What you will contribute: Data Modelling: building scalable data models to transform complex datasets into actionable insights, using advanced SQL…
…cloud provider; frontend development with React or another modern web framework; DevOps; Kubernetes; infrastructure engineering with Terraform, Pulumi or similar; data workload orchestration with Airflow or similar; containerisation with Docker; experience with SQL, as well as relational database design and administration. Experience in other tools not listed is also…
…management processes, and data profiling. Experience in developing APIs and working with WebSockets. Knowledge of React, Django, FastAPI, or equivalent technologies. Previous experience with Airflow, Linux shell commands and setting up websites on IIS. What we would like from you: a Bachelor's or Master's degree in a relevant field.
…building, maintaining and continuously enhancing the automations needed for scalability and efficiency in running the network infrastructure. Experience with infrastructure automation and orchestration frameworks, e.g. Ansible, Airflow, Terraform, Chef, Salt. Proven experience with object-oriented programming languages, preferably Python. A bachelor's or master's degree in Computer Science, Engineering…
…methods and forecasting. Expertise in data querying languages (e.g. SQL) and scripting languages (e.g. Python, R). Experience with data architecture technologies such as Airflow, Databricks, and dbt. Preferred qualifications: experience in technology, financial services and/or a high-growth environment; experience with Excel and finance systems (e.g. …
…/or financial instruments including equities, futures, and options; strong Unix skills, along with programming skills in Python and experience with scheduling tools like Airflow; a good understanding of networking and system concepts; experience with SQL and the use of databases is a plus; the ability to manage multiple tasks…
…Data Expertise – experience with AWS, Azure, GCP, ETL pipelines, and scalable data architectures. DevOps & CI/CD – strong knowledge of GitOps, Kubernetes, Terraform, Jenkins, Airflow, and related tools. Strategic Thinking – ability to drive long-term engineering strategy while delivering incremental value. Technical Debt Management – experience identifying and remediating inefficient…
…operational businesses; experience using LLMs or AI tools to structure and extract meaning from unstructured data; experience automating workflows and deploying model pipelines (e.g. Airflow, dbt, MLflow, or similar); exposure to business planning, pricing, or commercial decision-making; familiarity with geospatial data; experience in fast-scaling startups or operational…
…commerce, retail, or the travel industry; experience designing and analysing large-scale A/B test experiments; mastery of workflow orchestration technologies such as Airflow, Dagster or Prefect; expert knowledge of technologies such as Google Cloud Platform (particularly Vertex AI), Docker and Kubernetes, and Infrastructure as Code; experience establishing data…
…necessary infrastructure, resources, and interfaces to enable data loading and LLM workflows. Use Python/Java and large-scale data workflow orchestration platforms (e.g. Airflow) to construct software artifacts for ETL, interfacing with diverse data formats and storage technologies, and incorporate them into robust data workflows and dynamic systems.
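Several of these listings describe the same pattern: plain Python functions wired into a scheduled ETL pipeline by an orchestrator such as Airflow. As a hedged illustration only, a minimal Airflow 2.x TaskFlow sketch of such an artifact might look like the following; the DAG name, schedule, and inline sample data are hypothetical, not taken from any listing.

import json

import pendulum
from airflow.decorators import dag, task

@dag(
    schedule="@daily",  # hypothetical cadence
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
)
def daily_etl():  # hypothetical DAG name
    @task
    def extract() -> list[dict]:
        # A real pipeline would read from an API, a Kafka topic, or object storage.
        raw = '[{"id": 1, "amount": "12.50"}, {"id": 2, "amount": "3.10"}]'
        return json.loads(raw)

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Normalise types so the load step can rely on a fixed schema.
        return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Stand-in for a warehouse write (Snowflake, BigQuery, ...).
        for r in rows:
            print(f"would insert {r}")

    load(transform(extract()))

daily_etl()

Dropping a file like this into Airflow's DAGs folder is enough for the scheduler to pick it up, and the same extract/transform/load shape extends to the Kafka, S3 and Snowflake sources the other listings mention.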
…React frontends. You'll utilise tools such as Terraform for infrastructure-as-code (IaC), AWS (Lambda, EC2, EKS, Step Functions, VPC, etc.) for ETL, Airflow pipelines, and Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for…
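For the serverless side of that stack, the shape of a single Lambda-based ETL step can be sketched too. Again, this is an assumption-laden illustration rather than any listing's actual code: the event shape, bucket layout, and "staged/" prefix are invented here; boto3 is AWS's standard Python SDK.

import csv
import io

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Hypothetical event shape: {"bucket": "...", "key": "..."}, supplied by
    # an S3 trigger or a Step Functions state.
    bucket, key = event["bucket"], event["key"]

    # Extract: pull the raw CSV object from S3.
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

    # Transform: keep only rows with a positive amount.
    rows = [r for r in csv.DictReader(io.StringIO(body)) if float(r["amount"]) > 0]
    if not rows:
        return {"rows_kept": 0}

    # Load: write the cleaned file under a staging prefix for the warehouse to ingest.
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    s3.put_object(Bucket=bucket, Key=f"staged/{key}", Body=out.getvalue())
    return {"rows_kept": len(rows)}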
…skills. Knowledge of data analysis would be beneficial (e.g. SQL, Tableau, Looker), as well as knowledge of data tooling (e.g. BigQuery, Alation, dbt, Fivetran, Airflow). We're operating in a hybrid working model with a mix of office-based and remote-based work. The data team come into our…
…ideally for a systematic trading desk); strong knowledge of SQL and relational databases; in-depth knowledge of data streaming technologies like Kafka, S3 and Airflow; a degree or higher in Computer Science or a similar field; willingness to do support and occasional on-call work as and when required…
…to-Have: experience with marketing data or customer-level models (e.g. uplift, attribution, causal inference, campaign optimization); familiarity with MLOps tools (e.g. MLflow, FastAPI, Airflow); exposure to A/B testing and experimentation frameworks. WHY THIS ROLE IS DIFFERENT: This isn't a narrow data science role – you won…