of data warehousing concepts and SQL (Snowflake experience a plus) Experience with or willingness to learn CI/CD tooling (e.g. GitHub Actions), containerization (Docker), and workflow orchestration tools (Airflow/AstroCloud) Strong debugging and troubleshooting skills for data pipelines and ML systems Experience writing tests (unit, integration) and implementing monitoring/alerting for production systems Strong data skills …
communication skills. Bonus Points For: Experience leading projects in SCIF environments. Expertise in Cyber Analytics, PCAP, or network monitoring. Familiarity with Spark, Dask, Snowpark, Kafka, or task schedulers like Airflow and Celery.
/CD tools like Jenkins and GitHub. Understanding of how to build and run containerized applications (Docker, Helm). Familiarity with, or a working understanding of, big data search tools (Airflow, Pyspark, Trino, OpenSearch, Elastic, etc.).
with cloud services (Azure ML, AWS SageMaker, GCP Vertex AI) Proficient in Python and common ML libraries (TensorFlow, PyTorch, Scikit-learn) Familiarity with data engineering workflows and tools (Spark, Airflow, Databricks, etc.) Experience with GenAI, LLMs, or NLP is a strong plus Solid understanding of model governance, compliance, and responsible AI principles Excellent communication skills and stakeholder management abilities
SRE, and product teams. We'd Love to See Experience with semantic technologies: ontologies, RDF, or graph databases (e.g., Neo4j, RDF4J). Familiarity with ETL or EIS platforms like Apache Camel or Airflow. Knowledge of financial market data, especially around latency, availability, and correctness. Experience building or contributing to observability platforms or knowledge graph tooling.
including PCAP, CVEs, and network monitoring. Experience integrating with technologies such as Spark, Dask, Snowpark, or Kafka. Background in web application stacks (e.g., Flask, Django) or task schedulers (e.g., Airflow, Celery, Prefect). Compensation & Benefits: Competitive salary, equity, and performance-based bonus. Full benefits package including medical, dental, and vision. Unlimited paid time off. If you want to know …
experience working effectively with cross-functional teams across multiple time zones with remote stakeholders BS degree in Computer Science or related engineering field Nice to Have Experience with Airflow, Celery, AWS and/or Azure, Postgres Experience with API platform development Experience with Go
pipelines. 1-2 years of hands-on experience with Azure services such as Data Factory, Databricks, Synapse (DWH), Azure Functions, and other data analytics tools, including streaming. Experience with Airflow and Kubernetes. Programming skills in Python (PySpark) and scripting languages like Bash. Knowledge of Git, CI/CD operations, and Docker. Basic PowerBI knowledge is a plus. Experience deploying …
with an ability to work on multiple projects simultaneously Strong interpersonal and communication skills Quick, self-learning capabilities and creativity in problem-solving Preferred: Familiarity with Python Familiarity with Airflow, ETL tools, Snowflake and MSSQL Hands-on with VCS (Git)
development of a new global data platform (PaaS) Ensure scalable storage solutions (data lakes, data warehouses) to handle structured and unstructured data. Implement ETL/ELT pipelines using Dagster, Airflow, or similar tools. Optimize performance and scalability for large data volumes. Govern data security, compliance, and access controls. Development & DevOps: Strong programming and scripting skills in Python. Knowledge of …
Proficiency in data modelling, data management processes, and data profiling. Experience in developing APIs and working with WebSockets. Knowledge of React, Django, FastAPI, or equivalent technologies. Previous experience with AirFlow, Linux shell commands, and setting up websites on IIS. What we would like from you: Bachelor’s or Master’s degree in a relevant field. Creative and solutions-oriented, with …
/CD processes GitHub experience including actions and CI/CD Data Modeling experience, including extensive experience designing dimensional models based on business use cases and reporting needs Airflow experience (Task scheduler and orchestrator) Python experience (Programming Language) Soft Skills: Interpersonal skills to engage and communicate effectively with customers and audiences of different backgrounds within the organization
role. A strong understanding of data quality concepts, methodologies, and best practices. Proficiency in SQL and data querying for data validation and testing purposes. Hands-on experience with Snowflake, Airflow or Matillion would be ideal Familiarity with data integration, ETL processes, and data governance frameworks. Solid understanding of data structures, relational databases, and data modelling concepts. Excellent analytical and …
similarly complex operational businesses Experience using LLMs or AI tools to structure and extract meaning from unstructured data Experience automating workflows and deploying model pipelines (e.g. MLFlow, GCP Vertex, Airflow, dbt or similar) Exposure to business planning, pricing, or commercial decision-making Familiarity with geospatial data Experience in fast-scaling startups or operational teams We're flexible on experience
sensitive and global datasets. Hands-on experience integrating advanced AI/ML capabilities into operational and analytical data platforms. Extensive knowledge of modern data orchestration and workflow technologies (e.g., Airflow, Kubeflow), and infrastructure automation frameworks (Terraform, CloudFormation). Demonstrated leadership in managing technical product roadmaps, agile delivery practices, and stakeholder management in complex environments. Boston Consulting Group is an …
.NET framework backend services and React frontends. You'll utilise tools such as Terraform for infrastructure-as-code (IaC), AWS (Lambda, EC2, EKS, Step Functions, VPC, etc.) for ETL, Airflow pipelines, Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for: End-to-end data migration …
software development and code quality. Vendor Collaboration: Work closely with third-party vendors to integrate their solutions, ensuring they meet our high standards for production environments. Workflow Automation: Utilize Airflow to automate and optimize workflows, ensuring efficient and reliable operations. Required 5-7 years of experience in software development with a focus on production-grade code. Proficiency in Java