London (City of London), South East England, United Kingdom
HCLTech
data cataloging and metadata management using tools like AWS Glue Data Catalog. Demonstrated self-sufficiency in exploring new tools, troubleshooting issues, and continuously improving processes. Hands-on experience with Apache Airflow for orchestrating complex data workflows and ensuring reliable execution. Understanding of cloud security and governance practices including IAM, KMS, and data access policies. Experience with monitoring and …
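Several of these roles ask for Airflow-style orchestration, i.e. running each task only after its upstream dependencies have completed. The core idea can be sketched with the Python standard library alone (the task names below are illustrative, not taken from any listing):

```python
from graphlib import TopologicalSorter

# Hypothetical task functions standing in for extract/transform/load steps.
results = []

def extract():
    results.append("extract")

def transform():
    results.append("transform")

def load():
    results.append("load")

# The DAG: transform depends on extract, load depends on transform.
dag = {"transform": {"extract"}, "load": {"transform"}}
tasks = {"extract": extract, "transform": transform, "load": load}

# static_order() yields tasks so that dependencies always run first.
for name in TopologicalSorter(dag).static_order():
    tasks[name]()

print(results)  # ['extract', 'transform', 'load']
```

Airflow itself adds scheduling, retries, and monitoring on top of this dependency-ordered execution model, but the DAG abstraction is the same.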
data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity …
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and dbt is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience …
Shawnee Mission, Kansas, United States Hybrid / WFH Options
ECCO Select
mainly remote) Duration: Direct Hire Benefits: Medical/Dental/Vision/401k/PTO/Holidays Job Description: • Design, build, and maintain scalable data pipelines using tools like Apache NiFi, Airflow, or equivalent orchestration systems. • Work with structured and semi-structured data using SQL and NoSQL systems (e.g., PostgreSQL, MongoDB, Elasticsearch, Neo4j). • Develop services and integrations … data pipeline or ETL contexts; Python is a plus. • Proficiency with SQL and NoSQL databases, including query optimization and large dataset processing. • Familiarity with data integration tools such as Apache NiFi, Airflow, or comparable platforms. • Knowledge of RESTful API interactions, JSON parsing, and schema transformations. • Exposure to cloud environments (especially AWS: S3, EC2, Lambda) and distributed systems. • Comfortable …
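The requirement above pairs relational SQL with semi-structured (JSON) data, a combination most SQL engines now support natively. A minimal sketch using SQLite's JSON functions, assuming the JSON1 extension is available (it is compiled in by default in modern SQLite builds; the table and field names are hypothetical):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")

# Store semi-structured records as JSON text in a relational table.
conn.execute("INSERT INTO events (payload) VALUES (?)",
             (json.dumps({"user": "alice", "action": "login"}),))
conn.execute("INSERT INTO events (payload) VALUES (?)",
             (json.dumps({"user": "bob", "action": "logout"}),))

# json_extract pulls fields out of the JSON column at query time,
# so ordinary SQL filtering works on semi-structured data.
rows = conn.execute(
    "SELECT json_extract(payload, '$.user') FROM events "
    "WHERE json_extract(payload, '$.action') = 'login'"
).fetchall()
print(rows)  # [('alice',)]
```

PostgreSQL (`jsonb` with `->>`), BigQuery (`JSON_VALUE`), and Snowflake (`VARIANT` with `:` paths) follow the same pattern with their own syntax.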
problem solving skills We'd love to see: Knowledge of Database Systems and trade-offs in distributed systems Familiarity with API Designs Familiarity with Orchestration Frameworks such as Apache Airflow, Argo Workflows, Conductor etc. Experience working with and designing systems utilizing AWS Bloomberg is an equal opportunity employer and we value diversity at our company. We do …
Hampton, Virginia, United States Hybrid / WFH Options
Iron EagleX, Inc
APIs using FastAPI. Experience using Trino for query optimization and distributed SQL processing. Proficiency in working with Parquet files for efficient data storage and retrieval. Hands-on experience with Airflow for orchestrating complex data workflows. Familiarity with the Databricks Medallion model. Familiarity with OpenMetadata for metadata management and data governance. Experience working with streaming and batch data processing frameworks is …
following engineering disciplines: Cloud Engineering Data Engineering (not building pipelines but designing and building the framework) DevOps MLOps/LLMOps Often work with the following technologies: Azure, AWS, GCP Airflow, dbt, Databricks, Snowflake, etc. GitHub, Azure DevOps and related developer tooling and CI/CD platforms, Terraform or other Infra-as-Code MLflow, AzureML or similar for MLOps; LangSmith …
computing, with experience using frameworks like Databricks Strong experience with Azure or other cloud platforms (AWS) Strong experience building data pipelines or ETL tools and orchestration frameworks such as Airflow Strong experience and appreciation of CI/CD implementation Good experience with implementing automated testing Good experience with Infrastructure as Code with Terraform Strong understanding of Agile delivery methodologies …
experience in data engineering or a related field, with a focus on building scalable data systems and platforms. Expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure) Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures Proficiency in SQL and at least …
Cloud Data Warehouse - Snowflake AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, dbt, Fivetran, Airflow CI/CD - GitHub Actions/Jenkins Nice to Have Experience Understanding of various data architecture paradigms (e.g., Data Lakehouse, Data Warehouse, Data Mesh) and their applicability to different …
and Analytics to ensure alignment and impact. Champion a culture of learning, innovation, and continuous improvement within the team. Tech Stack: Python | SQL | Snowflake | AWS (S3, EC2, Terraform, Docker) | Airflow | dbt | Apache Spark | Apache Iceberg | Postgres Requirements: Proven experience in a hands-on data engineering leadership role. Strong background in modern data engineering (pipelines, modelling, transformations, governance … . Experience leading or mentoring small teams, with a desire to develop people as much as technology. Solid AWS cloud experience and exposure to modern tooling such as dbt, Airflow, and Snowflake. Strong communication skills with the ability to work cross-functionally and balance technical and business priorities. Curiosity around AI and how it can be used to boost …
Tools: Proficiency with Maven and GitLab. • Data Formats: Familiarity with JSON, XML, SQL, and compressed file formats. • Configuration Files: Experience using YAML files for data model and schema configuration. • Apache NiFi: Significant experience with NiFi administration and building/troubleshooting data flows. • AWS S3: bucket administration. • IDE: VSCode, IntelliJ/PyCharm, or other suitable Technical Expertise: • ETL creation and … experience in cyber/network security operations. • Familiarity with Agile environments. • Good communication skills. • Developed documentation and training in areas of expertise. • Amazon S3, SQS/SNS admin experience • Apache Airflow workloads via UI or CLI a plus • Experience with Mage AI a plus • Kubernetes, Docker Benefits $120-$220,000 salary per year, depending on experience. 11 Federal …
Falls Church, Virginia, United States Hybrid / WFH Options
Rackner
in Kubernetes (AWS EKS, Rancher) with CI/CD Apply DevSecOps + security-first practices from design to delivery Tech You'll Touch AWS Python FastAPI Node.js React Terraform Apache Airflow Trino Spark Hadoop Kubernetes You Have Active Secret Clearance 3+ years in Agile, cloud-based data engineering Experience with API design, ORM + SQL, AWS data services …
McLean, Virginia, United States Hybrid / WFH Options
ECCO Select
teams. Awareness of ethical considerations and responsible AI practices. Excellent problem-solving skills, attention to detail, and ability to thrive in a fast-paced, collaborative environment. Familiarity with NiFi, Airflow, or other data pipeline orchestration tools (not required, nice to have). Exposure to Kubernetes, IAM/security models, or Spark/Kafka environments (not required, nice to have) …
data modelling techniques (star schema, data vault, dimensional modelling). Proficiency in Excel-based data workflows for various Agile Retail projects. Hands-on experience with data pipeline orchestration tools (Airflow, dbt, Prefect, or similar). Benefits: Unlimited holiday Annual Wellbeing Allowance Flexible work culture Monthly socials and events Complimentary snack bar Employer pension contribution If you're a data …
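The star schema mentioned above is the most common dimensional-modelling pattern: a central fact table of measures keyed to surrounding dimension tables. A self-contained sketch in SQLite (the table and column names are illustrative, not from any listing):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# A minimal star schema: one fact table referencing two dimension tables.
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    amount REAL
);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO dim_date VALUES (10, '2024-01'), (11, '2024-02');
INSERT INTO fact_sales VALUES (1, 10, 5.0), (1, 11, 7.5), (2, 10, 3.0);
""")

# Analytical queries join the fact table out to dimensions and aggregate.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 3.0), ('widget', 12.5)]
```

Data vault modelling, also named in the listing, trades this query-friendly shape for auditability by splitting entities into hubs, links, and satellites; the star schema is typically rebuilt downstream for analytics.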
London, South East, England, United Kingdom Hybrid / WFH Options
Office Angels
data modelling techniques (star schema, data vault, dimensional modelling). Proficiency in Excel-based data workflows for various Agile Retail projects. Hands-on experience with data pipeline orchestration tools (Airflow, dbt, Prefect, or similar). Benefits: Unlimited holiday Annual Wellbeing Allowance Flexible work culture Monthly socials and events Complimentary snack bar Employer pension contribution If you're a data …
Analysis. SQL & Python: schema design, transformations, query optimisation, automation, testing. Track record of building ETL/ELT pipelines into modern warehouses (BigQuery, Snowflake, Redshift). Familiar with tools like Dagster, Airflow, Prefect, dbt, Dataform, SQLMesh. Cloud experience (we're on GCP) + containerisation (Docker, Kubernetes). Strong sense of ownership over data standards, security, and roadmap. A collaborator at heart …
software development lifecycle, from conception to deployment. Capable of conceptualizing and implementing software architectures spanning multiple technologies and platforms. Technology stack: Python Flask Java Spring JavaScript BigQuery Redis ElasticSearch Airflow Google Cloud Platform Kubernetes Docker Voted "Best Places to Work," our culture is driven by self-starters, team players, and visionaries. Headquartered in Los Angeles, California, the company operates …