meetings. What You Need to Succeed Strong skills in Python and SQL. Demonstrable hands-on experience in the AWS cloud. Data ingestion of both batch and streaming data, and data transformations (Airflow, Glue, Lambda, Snowflake Data Loader, Fivetran, Spark, Hive, etc.). Apply agile thinking to your work, delivering in iterations that incrementally build on what went before. Excellent problem-solving … translate concepts into easily understood diagrams and visuals for technical and non-technical people alike. AWS cloud products (Lambda functions, Redshift, S3, Amazon MQ, Kinesis, EMR, RDS (Postgres)). Apache Airflow for orchestration. DBT for data transformations. Machine learning for product insights and recommendations. Experience with microservices using technologies like Docker for local development. Apply engineering best practices More ❯
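The listing above pairs Lambda functions with batch ingestion. As a minimal sketch (not this employer's actual code), an S3-triggered Lambda entry point might look like the following; the bucket and key names and the event handling beyond the standard S3 notification shape are hypothetical.

```python
import json

def handler(event, context):
    """Minimal AWS Lambda entry point for an S3-triggered batch ingestion.

    The event follows the standard S3 put-event notification shape; the
    "processing" here is a placeholder for real ingestion logic.
    """
    records = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # In a real function you would fetch the object via boto3 here;
        # this sketch only records what would be processed.
        records.append({"bucket": bucket, "key": key, "status": "queued"})
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}

# Local invocation with a sample S3 event (no AWS account needed):
sample_event = {
    "Records": [{"s3": {"bucket": {"name": "raw-data"},
                        "object": {"key": "2024/01/orders.csv"}}}]
}
result = handler(sample_event, None)
```

Invoking the handler directly with a dict, as above, is a common way to exercise Lambda logic locally before deploying.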
fintech, crypto, or trading industries; familiarity with FIX is a plus. Experience in object-oriented development with strong software engineering foundations. Experience with data-engineering cloud technologies such as Apache Airflow, K8s, ClickHouse, Snowflake, Redis, cache technologies, and Kafka. Proven experience with relational and non-relational databases; proficient in SQL and query optimization. Experience designing infrastructure at scale More ❯
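"Query optimization" in listings like this usually means being able to read a query plan and know when an index will help. A self-contained illustration, using SQLite as a stand-in for the relational databases named above (the table and data are invented):

```python
import sqlite3

# Demo of index-driven query optimization: compare the query plan for the
# same lookup before and after adding an index on the filtered column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, px REAL)")
conn.executemany(
    "INSERT INTO trades (symbol, px) VALUES (?, ?)",
    [("BTCUSD", 42000.0), ("ETHUSD", 2500.0), ("BTCUSD", 42100.0)],
)

def plan(sql: str) -> str:
    """Return SQLite's query plan as a single string (the detail column)."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT px FROM trades WHERE symbol = 'BTCUSD'"
before = plan(query)  # full table scan: there is no index on symbol yet
conn.execute("CREATE INDEX idx_trades_symbol ON trades (symbol)")
after = plan(query)   # now satisfied via the index
```

The same before/after technique applies in ClickHouse or Snowflake via their own `EXPLAIN` variants, though the plan output format differs.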
collaboratively Proficiency in multiple programming languages Technologies: Scala, Java, Python, Spark, Linux, shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience with process scheduling platforms like Apache Airflow Open to working with proprietary GS technologies such as Slang/SECDB Understanding of compute resources and performance metrics Knowledge of distributed computing, including parallel and cloud More ❯
or MS degree in Computer Science or equivalent Experience in developing Finance or HR related applications Working experience with Tableau Working experience with Terraform Experience in creating workflows for Apache Airflow and Jenkins Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our More ❯
Strong knowledge of algorithms, design patterns, OOP, threading, multiprocessing, etc. Experience with SQL, NoSQL, or tick databases Experience working in a Unix environment and git Familiarity with Kafka, Docker, Airflow, Luigi Strong communication skills in verbal and written English. Domain knowledge in futures & swaps is a plus Highly competitive compensation and bonus structure Meritocratic environment with ample opportunity for More ❯
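The threading requirement above typically comes down to protecting shared state. A minimal sketch of the standard lock-guarded counter pattern (the class and workload are illustrative, not tied to any listing):

```python
import threading
from concurrent.futures import ThreadPoolExecutor

class Counter:
    """Thread-safe counter: the lock makes the read-modify-write atomic."""

    def __init__(self):
        self._value = 0
        self._lock = threading.Lock()

    def increment(self, n: int = 1) -> None:
        with self._lock:  # without this, concurrent += n can lose updates
            self._value += n

    @property
    def value(self) -> int:
        return self._value

counter = Counter()
with ThreadPoolExecutor(max_workers=8) as pool:  # waits for all tasks on exit
    for _ in range(1000):
        pool.submit(counter.increment)
```

For CPU-bound work the same shape moves to `multiprocessing`, where state is shared via queues or `multiprocessing.Value` rather than a lock around a plain attribute.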
MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language such as Java or Scala. Willingness to mentor more junior members of the team. Strong analytical and problem More ❯
following engineering disciplines: Cloud Engineering; Data Engineering (not building pipelines but designing and building the framework); DevOps; MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith More ❯
experience Experience in data quality engineering, software testing, or data engineering. Strong proficiency in SQL and Python with the ability to validate large datasets. Experience with Snowflake, Databricks, Spark, Airflow, or similar tools. Proven ability to build and scale automated data testing frameworks. Experience leading strategic initiatives and/or managing engineering team members. Strong problem-solving and debugging More ❯
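The "automated data testing frameworks" asked for above usually start as composable check functions that report failures rather than raise. A toy sketch of that shape (column names, rules, and sample rows are all hypothetical):

```python
from typing import Callable

Row = dict
Check = Callable[[list[Row]], list[str]]  # a check returns failure messages

def not_null(column: str) -> Check:
    """Fail for every row where the column is missing or None."""
    def check(rows: list[Row]) -> list[str]:
        return [f"{column} is null in row {i}"
                for i, r in enumerate(rows) if r.get(column) is None]
    return check

def unique(column: str) -> Check:
    """Fail for every row whose column value was already seen."""
    def check(rows: list[Row]) -> list[str]:
        seen: set = set()
        failures = []
        for i, r in enumerate(rows):
            if r[column] in seen:
                failures.append(f"duplicate {column}={r[column]} in row {i}")
            seen.add(r[column])
        return failures
    return check

def run_suite(rows: list[Row], checks: list[Check]) -> list[str]:
    """Aggregate failures from every check; empty list means a clean dataset."""
    return [msg for check in checks for msg in check(rows)]

rows = [{"id": 1, "email": "a@x.com"}, {"id": 1, "email": None}]
failures = run_suite(rows, [not_null("email"), unique("id")])
```

Tools such as Great Expectations or dbt tests industrialize exactly this idea: declarative expectations evaluated against datasets, with failures surfaced rather than silently dropped.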
Cloud Data Warehouse - Snowflake AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, DBT, Fivetran, Airflow CI/CD - Github Actions/Jenkins Nice to Have Experience Understanding of various data architecture paradigms (e.g., Data Lakehouse, Data Warehouse, Data Mesh) and their applicability to different More ❯
Cloud Data Warehouse - Snowflake AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, DBT, Fivetran, Airflow CI/CD - Github Actions/Jenkins Business Intelligence - Looker Skills & Attributes We'd Like To See: Extensive experience in data engineering, including designing and maintaining robust data pipelines. More ❯
analysis, and supporting complex client agreements. This is a hands-on engineering role working closely with stakeholders and system owners. You'll be expected to code daily (Python), manage Airflow pipelines (MWAA), build ETL processes from scratch, and improve existing workflows for better performance and scalability. Key responsibilities Design and build robust ETL pipelines using Python and AWS services … Own and maintain Airflow workflows Ensure high data quality through rigorous testing and validation Analyse and understand complex data sets before pipeline design Collaborate with stakeholders to translate business requirements into data solutions Monitor and improve pipeline performance and reliability Maintain documentation of systems, workflows, and configs Tech environment Python, SQL/PLSQL (MS SQL + Oracle), PySpark Apache Airflow (MWAA), AWS Glue, Athena AWS services (CDK, S3, data lake architectures) Git, JIRA You should apply if you have: Strong Python and SQL skills Proven experience designing data pipelines in cloud environments Hands-on experience with Airflow (ideally MWAA) Background working with large, complex datasets Experience in finance or similar high-volume, regulated industries (preferred but More ❯
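Building an ETL process "from scratch", as this role describes, usually means writing extract, transform, and load steps as separate callables. A sketch under stated assumptions: the source data, table, and validation rule are invented, SQLite stands in for the warehouse, and in an MWAA deployment each function would become the `python_callable` of an Airflow task rather than being called directly.

```python
import sqlite3

def extract() -> list[tuple]:
    # Placeholder source; in production this might read from S3 via Glue/Athena.
    return [("2024-01-01", "GBP", "100.50"), ("2024-01-02", "GBP", "bad-value")]

def transform(raw: list[tuple]) -> list[tuple]:
    """Validate and type-cast rows; drop rows that fail to parse."""
    clean = []
    for day, ccy, amount in raw:
        try:
            clean.append((day, ccy, float(amount)))  # validate before load
        except ValueError:
            continue  # in practice, route bad rows to a quarantine table
    return clean

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    conn.execute("CREATE TABLE IF NOT EXISTS payments (day TEXT, ccy TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract()), conn)
```

Keeping each step a plain function also makes the "rigorous testing and validation" responsibility tractable: every stage can be unit-tested without a running scheduler.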
Troubleshooting: Oversee pipeline performance, address issues promptly, and maintain comprehensive data documentation. What You'll Bring Technical Expertise: Proficiency in Python and SQL; experience with data processing frameworks such as Airflow, Spark, or TensorFlow. Data Engineering Fundamentals: Strong understanding of data architecture, data modelling, and scalable data solutions. Backend Development: Willingness to develop proficiency in backend technologies (e.g., Python with … Django) to support data pipeline integrations. Cloud Platforms: Familiarity with AWS or Azure, including services like Apache Airflow, Terraform, or SageMaker. Data Quality Management: Experience with data versioning and quality assurance practices. Automation and CI/CD: Knowledge of build and deployment automation processes. Experience within MLOps A 1st class Data degree from one of the UK's top More ❯
TransferGo. Our products are recognised by industry leaders like Gartner's Magic Quadrant, Forrester Wave and Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, DBT, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you … methodologies. Collaborating with stakeholders to define data strategies, implement data governance policies, and ensure data security and compliance. About you: Strong technical proficiency in data engineering technologies, such as Apache Airflow, ClickHouse, ETL tools, and SQL databases. Deep understanding of data modeling, ETL processes, data integration, and data warehousing concepts. Proficiency in programming languages commonly used in data More ❯
Maintenance - Implement robust logging, alerting, and performance monitoring for integrations. Continuous Improvement - Champion enhancements to integration architectures and best practices. Skills & Experience Required Experience with workflow orchestration tools (e.g., Apache Airflow). Proven track record in backend development (e.g., Node.js, Python, Java). Strong knowledge of API design, integration methods (REST, Webhooks, GraphQL), and authentication protocols (OAuth2, JWT More ❯
of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan) Strong communication, stakeholder management, and documentation skills Preferred (but not essential): AWS or Snowflake certifications Knowledge of Apache Airflow, DBT, GitHub Actions Experience with Iceberg tables and data product thinking Why Apply? Work on high-impact, high-scale client projects Join a technically elite team with More ❯
both object-oriented programming (OOP) and functional programming (FP) best practices. Familiarity with Amazon Web Services (AWS), Terraform, and infrastructure as code (IaC) best practices. Familiarity with Databricks and Apache Airflow products. Required Education: Bachelor's degree in Computer Science, Information Systems, Software, Electrical or Electronics Engineering, or comparable field of study, and/or equivalent work experience. More ❯
Gloucester, Gloucestershire, United Kingdom Hybrid / WFH Options
Navtech, Inc
of Science Degree in software engineering or a related field Proficiency in English, spoken and written Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM More ❯
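The Change Data Capture (CDC) familiarity mentioned above can be illustrated in its simplest form: diffing two keyed snapshots to emit insert/update/delete events. This is only the concept; production CDC tools (e.g., Debezium) read the database's transaction log instead of comparing snapshots, and the data here is invented.

```python
def diff_snapshots(before: dict, after: dict) -> list[tuple]:
    """Emit (operation, key, row) change events between two keyed snapshots."""
    events = []
    for key, row in after.items():
        if key not in before:
            events.append(("insert", key, row))
        elif before[key] != row:
            events.append(("update", key, row))
    for key in before:
        if key not in after:
            events.append(("delete", key, before[key]))
    return events

before = {1: {"name": "Alice"}, 2: {"name": "Bob"}}
after = {1: {"name": "Alice"}, 2: {"name": "Robert"}, 3: {"name": "Cara"}}
events = diff_snapshots(before, after)
```

Downstream consumers (a warehouse loader, a cache invalidator) can then replay these events in order, which is the same contract log-based CDC provides at much larger scale.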
London, Victoria, United Kingdom Hybrid / WFH Options
Boston Hale
team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources including CMS, analytics, ad tech, and social platforms. Lead engineering efforts to automate workflows using tools like Airflow, dbt, and Spark. Build robust data models to support dashboards, A/B testing, and revenue analytics. Collaborate with cross-functional teams to deliver actionable insights and support strategic More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
/BI role Advanced SQL skills with hands-on experience using dbt for data modeling Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow Experience with version control tools (e.g., Git) Ability to design, build, and document scalable, reliable data models Comfortable gathering business requirements and translating them into data architecture Strong problem More ❯
retrieval and pipeline development Experience with IaC tools such as Terraform or Ansible for deployment and infrastructure management Hands-on experience with: ETL/ELT orchestration and pipeline tools (Airflow, Airbyte, DBT, etc.); Data warehousing tools and platforms (Snowflake, Iceberg, etc.); SQL databases, particularly MySQL Desired Experience: Experience with cloud-based services, particularly AWS Proven ability to manage stakeholders More ❯
tuning. Experience with designing and programming relational databases such as MySQL, RedShift, Oracle SQL Server, or Postgres. Experience with AWS based system architecture covering S3, EKS, EC2, Batch, or Airflow. Experience with caching and messaging technologies such as Redis, Hazelcast, MQ, or Kafka. Experience with programming within a CI/CD pipeline such as Git, Jenkins etc. Strong problem More ❯
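The caching experience asked for above usually means the cache-aside pattern with TTL expiry. A self-contained sketch: an in-process dict stands in for Redis or Hazelcast so the example runs anywhere, and the backend lookup is a hypothetical placeholder.

```python
import time

class TTLCache:
    """Tiny cache with per-entry time-to-live, evicted lazily on read."""

    def __init__(self, ttl_seconds: float):
        self._ttl = ttl_seconds
        self._store: dict = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:  # expired: drop and report a miss
            del self._store[key]
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self._ttl)

calls = 0
def slow_lookup(key):  # stands in for a database query or remote call
    global calls
    calls += 1
    return key.upper()

cache = TTLCache(ttl_seconds=60)

def cached_lookup(key):
    value = cache.get(key)
    if value is None:          # cache miss: hit the backend, then populate
        value = slow_lookup(key)
        cache.set(key, value)
    return value

first, second = cached_lookup("abc"), cached_lookup("abc")
```

Swapping the dict for a Redis client changes only `get`/`set`; the cache-aside control flow (check, miss, fetch, populate) stays the same, which is why the pattern transfers across cache technologies.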
layers, preferably through formalized SQL models (e.g., dbt). Ability to work in a fast-paced environment and adapt solutions to changing business needs. Experience with ETL technologies like Airflow and Airbyte. Production experience with streaming systems like Kafka. Exposure to infrastructure, DevOps, and Infrastructure as Code (IaC). Deep knowledge of distributed systems, storage, transactions, and query processing.