Arlington, Virginia, United States Hybrid / WFH Options
Full Visibility LLC
AWS S3, Azure Blob, MinIO, or similar). Proficiency in data parsing and transformation, handling structured and unstructured data. Hands-on experience with ETL tools and data workflow orchestration (e.g., Apache Airflow, Luigi, Prefect). Strong programming skills in Python, SQL, or Scala. Experience with open-source data processing tools (e.g., Kafka, Spark, Flink, Hadoop). Familiarity with database technologies (PostgreSQL …
and reliability across our platform. Working format: full-time, remote. Schedule: Monday to Friday (the working day is 8+1 hours). Responsibilities: Design, develop, and maintain data pipelines using Apache Airflow. Create and support data storage systems (Data Lakes/Data Warehouses) based on AWS (S3, Redshift, Glue, Athena, etc.). Integrate data from various sources, including … pipeline into a standalone, scalable service. What we expect from you: 4+ years of experience as a Data Engineer, including 1+ year at a Senior level. Deep knowledge of Airflow: DAGs, custom operators, and monitoring. Strong command of PostgreSQL databases; familiarity with the AWS stack (S3, Glue or Redshift, Lambda, CloudWatch) is a significant plus. Excellent SQL skills and …
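For illustration, a minimal sketch of the kind of Airflow pipeline this listing describes — an extract task feeding a Redshift load, with a failure callback for monitoring. It assumes Airflow 2.4+; the DAG id, staging steps, and alerting hook are all hypothetical.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3(**context):
    ...  # placeholder: pull rows from a source system and stage them in S3


def load_to_redshift(**context):
    ...  # placeholder: COPY the staged files into Redshift


def alert_on_failure(context):
    # Hook for monitoring: runs whenever a task in this DAG fails.
    print(f"Task failed: {context['task_instance'].task_id}")


default_args = {
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": alert_on_failure,
}

with DAG(
    dag_id="daily_sales_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_redshift", python_callable=load_to_redshift)
    extract >> load  # the extract must succeed before the load runs
```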
• Programming Languages: Python, Java, or Go.
• Data Engineering Tools: Apache Kafka, Airflow (for orchestration), Spark (if needed for larger datasets).
• OpenSearch/Elasticsearch: Indexing, querying, and optimizing.
• Visualization Tools: Kibana, Grafana (for more advanced visualizations), React.js.
• Cloud: AWS (Elasticsearch Service) or Azure (if using cloud infrastructure).
Desired:
• 15+ years of experience working in data engineering or …
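To illustrate the OpenSearch indexing and querying skills listed above, a minimal sketch using the opensearch-py client — the host, index name, and documents are hypothetical:

```python
from opensearchpy import OpenSearch

# Connect to a local cluster (swap in real hosts/auth in practice).
client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

# Index a document; refresh=True makes it immediately searchable.
client.index(
    index="events",
    id="1",
    body={"user": "alice", "action": "login"},
    refresh=True,
)

# Query it back with a simple match query.
response = client.search(
    index="events",
    body={"query": {"match": {"user": "alice"}}},
)
for hit in response["hits"]["hits"]:
    print(hit["_source"])
```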
SQL and at least one programming language (e.g., Python). Familiarity with relational databases and data warehousing concepts. Understanding of ETL concepts and tools. Exposure to workflow orchestration tools like Apache Airflow, NiFi, and Kafka. Strong analytical and problem-solving skills. Excellent communication and teamwork abilities. Eagerness to learn and grow in a fast-paced environment. Experience in Jupyter …
with MLOps practices and model deployment pipelines. Proficient in cloud AI services (AWS SageMaker/Bedrock). Deep understanding of distributed systems and microservices architecture. Expert in data pipeline platforms (Apache Kafka, Airflow, Spark). Proficient in both SQL (PostgreSQL, MySQL) and NoSQL (Elasticsearch, MongoDB) databases. Strong containerization and orchestration skills (Docker, Kubernetes). Experience with infrastructure as code (Terraform, CloudFormation …
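As a concrete example of the Kafka pipeline work such roles expect, a minimal produce/consume round trip with the kafka-python package — the broker address and "orders" topic are hypothetical:

```python
import json

from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("orders", {"order_id": 42, "amount": 9.99})
producer.flush()  # block until the message is actually delivered

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",   # read from the start of the topic
    consumer_timeout_ms=10_000,     # stop iterating if idle for 10s
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # {'order_id': 42, 'amount': 9.99}
```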
Brighton, Sussex, United Kingdom Hybrid / WFH Options
Burns Sheehan
Lead Data Engineer | £75,000-£85,000 | AWS, Python, SQL, Airflow | Brighton, hybrid working | Analyse customer behaviour using AI & ML. We are partnered with a private equity backed company who provide an AI-powered, guided selling platform that helps businesses improve online sales and customer experience. They are looking for a Lead Data Engineer to lead a small team … experience in a Senior Data Engineering role. Comfortable owning and delivering technical projects end-to-end. Strong in Python, SQL, and cloud platforms (AWS or comparable). Experience with Airflow, Snowflake, Docker (or similar). Familiarity with coaching and mentoring more junior engineers, leading 1:1s and check-ins. Wider tech stack: AWS, Python, Airflow, Fivetran, Snowflake … Enhanced parental leave and pay. If you are interested in finding out more, please apply or contact me directly! Lead Data Engineer | £75,000-£85,000 | AWS, Python, SQL, Airflow | Brighton, hybrid working | Analyse customer behaviour using AI & ML. Burns Sheehan Ltd will consider applications based only on skills and ability and will not discriminate on any grounds.
system performance and functionality.
Requirements:
- Active Top Secret/SCI Eligibility Clearance.
- Minimum of 8 years of experience in data engineering or related work.
- Proficiency in Java, AWS, Python, Apache Spark, Linux, Git, Maven, and Docker.
- Experience maintaining an Apache Hadoop Ecosystem using tools like HBase, MapReduce, and Spark.
- Knowledge of ETL processes utilizing Linux shell scripting, Perl … Python, and Apache Airflow.
- Experience with AWS services such as CloudWatch, CloudTrail, ELB, EMR, KMS, SQS, SNS, and Systems Manager.
- Experience in supporting, maintaining, and migrating JavaFX applications to modern cloud-native solutions.
- Strong decision-making skills and domain knowledge.
- Bachelor's Degree in a related field OR an additional 4 years of relevant experience in lieu of a …
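In the spirit of the Spark and ETL experience this posting asks for, a minimal PySpark extract-transform-load sketch — Spark 3.x is assumed and the S3 paths are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-example").getOrCreate()

# Extract: read raw CSV data from object storage.
raw = spark.read.csv(
    "s3://example-bucket/raw/events.csv", header=True, inferSchema=True
)

# Transform: drop rows with no event type, then count events per user.
daily = (
    raw.filter(F.col("event_type").isNotNull())
    .groupBy("user_id")
    .agg(F.count("*").alias("event_count"))
)

# Load: write the result back out as Parquet for downstream queries.
daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_counts/")
```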
pipelines to power next-gen data products in the commodities industry. Ensure data quality using the latest analytics and monitoring tools. Design and build robust pipelines with tools like Airflow and DBT. Create scalable infrastructure on Azure using technologies like Terraform. Write clean, high-quality, reusable code aligned with best practices. Drive innovation by bringing your own ideas-your … in a fast-paced startup or agile environment. Strong background in schema design and dimensional data modeling. Able to communicate data architecture clearly with internal stakeholders. Experience with Azure, Airflow, DBT, Kubernetes, GitHub. Bonus points for: open-source contributions, an active GitHub profile, and curiosity for the latest in tech. A natural problem-solver who loves making things work.
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
within ambitious software businesses. Specifically, you can expect to be involved in the following: Designing and developing full-stack data pipelines and platforms using modern tools such as dbt, Airflow, and cloud infrastructure. Cleansing, enriching and modelling data to generate commercial insights and power C-level dashboards. Delivering scalable solutions that support internal use cases and extend directly to … sales) and building tools that serve business needs. Background in startups or scale-ups with high adaptability and a hands-on approach. Experience with modern data tools (e.g. dbt, Airflow, CI/CD) and at least one cloud platform (AWS, GCP, Azure). Strong communication skills and a track record of credibility in high-pressure or client-facing settings. BENEFITS …
Employment Type: Full-Time
Salary: £100,000 - £110,000 per annum, Inc benefits
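For the dbt work this listing highlights, a minimal sketch of driving dbt from Python rather than the shell — it assumes dbt-core 1.5+ and an already-configured project and profile; the "staging" selector is hypothetical:

```python
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()

# Equivalent to running `dbt run --select staging` on the command line.
result: dbtRunnerResult = runner.invoke(["run", "--select", "staging"])

if result.success:
    print("dbt run completed")
else:
    print(f"dbt run failed: {result.exception}")
```

This pattern is handy inside an orchestrator task (e.g. an Airflow PythonOperator), since failures come back as an ordinary Python result object rather than a shell exit code.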
build batch data pipelines, insightful dashboards, and data products that serve new gaming customers and internal teams. What You'll Do: Design and build scalable, maintainable data pipelines using Airflow and Snowflake. Create data models and workflows that power dashboards, reports, and analytical products. Collaborate with analysts, product teams, and game developers to deliver valuable insights. Ensure data quality … documentation, and reliability across our data products. Help shape the future of how data supports player engagement and product development. What We're Looking For: Hands-on experience with Airflow, Snowflake, and cloud data platforms (AWS/GCP). Strong SQL and Python skills for building and managing data workflows. A knack for turning raw data into usable, impactful insights …
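As an illustration of the Snowflake-plus-Python workflow skills mentioned above, a minimal query sketch with the snowflake-connector-python package — all credentials, warehouse, and table names are hypothetical:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # hypothetical
    user="etl_user",          # hypothetical
    password="***",
    warehouse="ANALYTICS_WH",
    database="GAMES",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Count events per player from a hypothetical events table.
    cur.execute("SELECT player_id, COUNT(*) FROM events GROUP BY player_id")
    for player_id, n_events in cur.fetchall():
        print(player_id, n_events)
finally:
    conn.close()
```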
Data Storage & Databases:
• SQL & NoSQL Databases: Experience with databases like PostgreSQL, MySQL, MongoDB, and Cassandra.
• Big Data Ecosystems: Hadoop, Spark, Hive, and HBase.
Data Integration & ETL:
• Data Pipelining Tools: Apache NiFi, Apache Kafka, and Apache Flink.
• ETL Tools: AWS Glue, Azure Data Factory, Talend, and Apache Airflow.
AI & Machine Learning:
• Frameworks: TensorFlow, PyTorch, Scikit-learn, Keras …
Spark). Experienced with Elasticsearch and Cloud Search. Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform. Experience with data pipeline orchestration tools (e.g., Airflow, Luigi) and workflow automation tools (e.g., Jenkins, GitLab CI/CD). Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes) is a plus. Data pipeline management: Proven experience …
platforms such as Azure or AWS, leveraging tools like Databricks, Data Factory, and Synapse Analytics. Ensuring best practices in DevOps, version control, and data governance. Managing orchestration workflows using Airflow or similar tools. Supporting AI teams by preparing high-quality, structured data for machine learning applications. Communicating technical concepts to non-technical stakeholders and making data solutions accessible to … and Python. Experience in Azure with tools like Databricks, Data Factory, and Synapse Analytics. Knowledge of data modeling techniques. Familiarity with DevOps and version control best practices. Experience with Airflow or other orchestration tools is a plus. Expertise in Machine Learning, especially NLP, is a plus. Certifications such as Azure DP-203 are a plus. Soft Skills: Ability to …
Strong SQL and Python skills for building and optimising data pipelines. Experience working with cloud platforms (e.g., AWS, GCP, or Azure). Familiarity with modern data stack tools (e.g., dbt, Airflow, Snowflake, Redshift, or BigQuery). Understanding of data modelling and warehousing principles. Experience working with large datasets and distributed systems. What's in it for you? Up to £70k, hybrid …
experience with Cloud infrastructure (ideally AWS), DevOps technologies such as Docker or Terraform and CI/CD processes and tools. Have previously worked with MLOps tools like MLFlow and Airflow, or on common problems such as model and API monitoring, data drift and validation, autoscaling, access permissions. Have previously worked with monitoring tools such as New Relic or Grafana … and associated ML/DS libraries (scikit-learn, numpy, pandas, LightGBM, LangChain/LangGraph, TensorFlow, etc.). PySpark. AWS cloud infrastructure: EMR, ECS, ECR, Athena, etc. MLOps: Terraform, Docker, Spacelift, Airflow, MLFlow. Monitoring: New Relic. CI/CD: Jenkins, GitHub Actions. More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work from abroad policy, 2-for…
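For the MLFlow experience called out above, a minimal experiment-tracking sketch — it assumes the default local ./mlruns store and scikit-learn; the run name, parameters, and metric are hypothetical:

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, random_state=0)

with mlflow.start_run(run_name="rf-baseline"):
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params, random_state=0).fit(X, y)

    # Record what was tried, how it scored, and the fitted artifact.
    mlflow.log_params(params)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")
```

Runs logged this way appear in the MLflow UI (`mlflow ui`) for side-by-side comparison, which is the usual starting point for the model-monitoring concerns the listing mentions.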
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
engineers. Collaborate across engineering, data science, and product teams to deliver business impact. Skills & Experience: Expert in SQL, dbt, and cloud data warehouses (e.g., BigQuery, Redshift). Strong experience with Airflow, Python, and multi-cloud environments (AWS/GCP). Proven background in designing and scaling analytics solutions in agile environments. Proven experience as an Analytics Engineer. Nice to Have: Experience …
oversight across the data platform, including data pipelines, orchestration and modelling. Lead the team in building and maintaining robust data pipelines, data models, and infrastructure using tools such as Airflow, AWS Redshift, DBT and Looker, ensuring the team follows agile methodologies to improve delivery cadence and responsiveness. Contribute to hands-on coding, particularly in areas requiring architectural input, prototyping, or … mentoring skills and ability to foster team growth and development. Strong understanding of the data engineering lifecycle, from ingestion to consumption. Hands-on experience with our data stack (Redshift, Airflow, Python, DBT, MongoDB, AWS, Looker, Docker). Understanding of data modelling, transformation, and orchestration best practices. Experience delivering both internal analytics platforms and external data-facing products. Knowledge of modern …
trusted, reliable and available. The technology underpinning these capabilities includes industry-leading data and analytics products such as Snowflake, Tableau, DBT, Talend, Collibra, Kafka/Confluent, Astronomer/Airflow, and Kubernetes. This forms part of a longer-term strategic direction to implement Data Mesh, and with it establish shared platforms that enable a connected collection of … and driving a culture of iterative improvement. Modern data stack - hands-on deployment and governance of enterprise technologies at scale (e.g. Snowflake, Tableau, DBT, Fivetran, Airflow, AWS, GitHub, Terraform, etc.) for self-service workloads. Thought leadership and influencing - deep interest in the data platforms landscape to build well-articulated proposals that are supported by strong …
and meeting deadlines. Proficiency in SQL (BigQuery), Python, Git/GitHub, and preferably Looker (Tableau or PowerBI are acceptable as well). Above-average knowledge of DBT, Docker, GCP, and Airflow. Experience in the cryptocurrency industry, fintech sector, or platform-type businesses is preferred but not required. Personal Attributes: Analytical mindset with a passion for data-driven decision-making. Strong … ambitious with a results-oriented attitude and continuous improvement mindset. Technologies you will work with:
• Python
• SQL (BigQuery)
• GCP
• EPPO for experimentation
• DBT, Docker, Cloud Run/Kubernetes, and Airflow for data orchestration and data pipelines
• Looker for data visualization
• Git and GitHub for code collaboration
• Ability to leverage AI tools such as Cursor and LLMs in the day-to…
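To ground the BigQuery work above, a minimal query sketch with the google-cloud-bigquery client — it assumes default application credentials, and the project/dataset/table names are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client()  # picks up the project from the environment

query = """
    SELECT user_id, COUNT(*) AS n_trades
    FROM `my_project.analytics.trades`  -- hypothetical table
    GROUP BY user_id
    ORDER BY n_trades DESC
    LIMIT 10
"""

# client.query() starts the job; .result() blocks until rows are ready.
for row in client.query(query).result():
    print(row.user_id, row.n_trades)
```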
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
of engineers, and work closely with cross-functional teams to deliver high-impact data solutions. Key Responsibilities: Architect and maintain robust data pipelines using AWS services (Glue, Lambda, S3, Airflow). Lead the migration and optimisation of data workflows into Snowflake. Collaborate with analysts, data scientists, and product teams to deliver clean, reliable data. Define and enforce best practices in …
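As a sketch of the AWS-side orchestration this role describes, a minimal boto3 snippet that starts a Glue job and polls it to completion — the job name and region are hypothetical:

```python
import time

import boto3

glue = boto3.client("glue", region_name="eu-west-2")

# Kick off the job and remember its run id.
run = glue.start_job_run(JobName="load_snowflake_staging")  # hypothetical job
run_id = run["JobRunId"]

# Poll until the run reaches a terminal state.
while True:
    status = glue.get_job_run(JobName="load_snowflake_staging", RunId=run_id)
    state = status["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print(f"Glue run finished: {state}")
        break
    time.sleep(30)
```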