Grand Prairie, Texas, United States Hybrid / WFH Options
Jobot
such as Snowflake and Fabric, ensuring their effective use in large-scale data solutions. o Manage and optimize data pipelines using tools such as Apache Airflow, Prefect, DBT, and SSIS, ensuring that all stages of the pipeline (ETL) are efficient, scalable, and reliable. o Ensure robust testing, monitoring … best practices in data architecture. o Experience with API development, including building and managing API integrations. o Proficiency with orchestration tools like Prefect or Airflow for workflow management. o Strong focus on testing and validation, ensuring that all data systems meet reliability and performance standards. Experience & Qualifications: 5+ years … data platforms and cloud environments. Strong background in relational databases, dimensional data modeling, and cloud-native solutions. Familiarity with data engineering tools such as Apache Airflow, Prefect, and cloud storage platforms. Excellent problem-solving skills, with the ability to navigate complex technical challenges. Interested in hearing more?
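The listing above asks for pipelines whose extract, transform, and load stages are each testable and reliable. As a rough illustration of that separation (a pure-Python sketch only — in practice each function would be an Airflow task or Prefect flow step, and all names and sample data here are invented for illustration):

```python
# Minimal ETL sketch: each pipeline stage is a separate, testable function.
# A list stands in for the target warehouse; records and field names are
# illustrative, not from any real system.

def extract():
    """Pretend source extract: returns raw records as dicts."""
    return [
        {"id": 1, "amount": "10.50"},
        {"id": 2, "amount": "7.25"},
        {"id": 3, "amount": None},  # bad record, should be filtered out
    ]

def transform(rows):
    """Cast amounts to float, dropping records that fail validation."""
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue  # a real pipeline would also emit a monitoring metric here
        clean.append({"id": row["id"], "amount": float(row["amount"])})
    return clean

def load(rows, target):
    """Append validated rows to the target store; return the count loaded."""
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 of the 3 records pass validation
```

Keeping the stages as plain functions like this is what makes them easy to wire into an orchestrator later and to unit-test in isolation.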
Chelmsford, East Anglia, United Kingdom Hybrid / WFH Options
EMBL-EBI
services, including microservice deployment Developing and maintaining data pipelines to process and manage large-scale structural biology data Working with workflow orchestration tools like Apache Airflow and Nextflow To liaise with the EMBL-EBI Technical Services Cluster To support the development and deployment of other scientific software The … Proficiency in containerisation Proficiency in relational databases (Oracle, PostgreSQL) Experience in developing and maintaining data pipelines Hands-on experience with workflow orchestration tools (e.g., Apache Airflow, Nextflow) Strong interpersonal and communication skills Proficiency in oral and written English You may also have a PhD in computer science, IT or … a related field, or in bioinformatics with demonstrated IT expertise Experience in using Kubernetes Experience with web servers (Apache/Nginx) Hands-on experience with CI/CD (GitLab CI/GitHub Actions) Familiarity with networking Familiarity with Java Knowledge of, or affinity with, structural biology and bioinformatics
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
TalentHawk
early and providing strategic guidance. Support the ongoing development of integration strategies involving Managed File Transfer solutions (e.g., GoAnywhere) and data orchestration platforms (e.g., Apache Airflow). Provide hands-on support and detailed guidance on particularly complex integration designs where necessary. Maintain current knowledge of industry trends, technology … and migrations. Familiarity with the IBM Maximo asset management platform. Knowledge and experience with Managed File Transfer solutions (e.g., GoAnywhere). Understanding and experience with the Apache Airflow orchestration platform. Strong grasp of integration best practices, security considerations, and data flow management. Ability to work collaboratively across distributed teams and …
Washington, Washington DC, United States Hybrid / WFH Options
SMX
SMX is seeking a talented Data Engineer (Python) with expertise in ETL (Extract, Transform, Load) processes and Apache Airflow. The candidate will be responsible for designing and implementing robust and efficient data pipelines, ensuring high data quality, and contributing to the continuous improvement of our data management practices. This … is a remote position supporting a Washington, DC-based team. Essential Duties & Responsibilities: Design, develop, and maintain ETL processes using Python and Apache Airflow. Collaborate with data analysts and other stakeholders to understand and meet their data requirements. Develop and implement data validation processes to ensure high data quality. … Skills & Experience: Proficiency in Python: Strong understanding of the Python programming language. Experience with Python libraries and frameworks like Pandas, NumPy, and Django. Expertise in Apache Airflow: Experience in designing, building, and maintaining data pipelines using Apache Airflow. Knowledge of Airflow's architecture, including DAGs and Operators.
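The "DAGs and Operators" the listing mentions boil down to one idea: a pipeline is a directed acyclic graph of tasks, and the scheduler runs each task only after its upstream dependencies finish. A minimal pure-Python sketch of that ordering idea (this illustrates the concept with the standard library's topological sort, not the Airflow API itself):

```python
# Sketch of the DAG concept behind Airflow's scheduler: task names mapped to
# their upstream dependencies, executed in topological order. Task names are
# illustrative; this is not Airflow code.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

# static_order() yields the tasks in an order that respects every dependency
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'validate', 'load']
```

In Airflow the same shape is expressed with Operators and dependency arrows (`extract >> transform >> validate >> load`), and the scheduler derives the execution order from the graph just as above.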
London (City of London), South East England, United Kingdom Hybrid / WFH Options
CommuniTech Recruitment Group
Science, Math, or Financial Engineering degree Strong knowledge in other programming language(s) – e.g., JavaScript, TypeScript, Kotlin Strong knowledge of data orchestration technologies – e.g., Apache Airflow, Dagster, AWS Step Functions Understanding of ETL/ELT workflows, data modeling, and performance optimization for both batch and real-time processing.
diagram of proposed tables to enable discussion. Good communicator, comfortable presenting ideas and outputs to technical and non-technical users. Worked on Apache Airflow before to create DAGs. Ability to work within Agile, considering minimum viable products, story pointing, and sprints. More information: Enjoy fantastic perks …
Intelligence, Statistical & Data Analysis, Computational Algorithms, Data Engineering, etc. Experience working with a variety of complex, large datasets. Experience building automated pipelines (e.g., Jenkins, Airflow, etc.). Experience building or understanding end-to-end, distributed, and high-performance software infrastructures. Proven ability to work collaboratively as part of a …
across the company. Role requirements 4+ years of experience You have an understanding of developing ETL pipelines using Python frameworks such as Luigi or Airflow; You have experience with the development of Python-based REST APIs/services and their integration with databases (e.g. Postgres); You are familiar with …
Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (Python, Data Modelling, ETL/ELT, Apache Airflow, DBT, AWS) Enterprise-scale tech firm Up to £70,000 plus benefits - FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Duel
Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You have no issue working with either structured or semi-structured data. You are comfortable working with data engineering and …
maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery) A pragmatic problem solver who can balance …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
deliverables. Strong background in data lake and data warehouse design, including data modeling and partitioning strategies Advanced proficiency in ETL tools (e.g., Talend, Informatica, Apache Airflow) and orchestration frameworks Extensive experience with cloud data ecosystems (AWS, Azure, GCP) and containerization (e.g., Docker, Kubernetes) In-depth knowledge of CI … organizational goals One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and …
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Peaple Talent
and familiar with setting up CI/CD workflows using platforms like Azure DevOps or similar tools. Hands-on experience with orchestration tools like Apache Airflow for managing complex data workflows. Practical familiarity with low-code or no-code platforms such as Talend and SnapLogic for streamlined pipeline …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
in Python Comfortable implementing data architectures in analytical data warehouses such as Snowflake, Redshift or BigQuery Hands-on experience with data orchestrators such as Airflow Knowledge of Agile development methodologies Awareness of cloud technology, particularly AWS Knowledge of automated delivery processes Hands-on experience of best engineering practices (handling …
for data processing, analysis and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark, and Glue for developing end-to-end data orchestration, leveraging AWS services to ingest, transform and process large volumes …
environments (e.g., Snowflake). Proficiency in SQL and at least one scripting language (e.g., Python, Bash). Familiarity with data pipeline orchestration tools (e.g., Airflow, dbt). Strong understanding of data quality principles and best practices. Excellent communication and collaboration skills. Working with AWS, Twilio Segment and Google Analytics.
San Antonio, Texas, United States Hybrid / WFH Options
IAMUS
GitLab. Data Formats: Familiarity with JSON, XML, SQL, and compressed file formats. Configuration Files: Experience using YAML files for data model and schema configuration. Apache NiFi: Significant experience with NiFi administration and building/troubleshooting data flows. AWS S3: bucket administration. IDE: VSCode, IntelliJ/PyCharm, or other suitable Technical … security operations. Familiarity with Agile environments. Good communication skills. Developed documentation and training in areas of expertise. Amazon S3, SQS/SNS admin experience Apache Airflow workloads via UI or CLI a plus Experience with Mage AI a plus Kubernetes, Docker