Glue). Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes). Familiarity with data orchestration tools (e.g., Prefect, Apache Airflow). Familiarity with CI/CD pipelines and DevOps practices. Familiarity with infrastructure-as-code tools (e.g., Terraform, AWS CDK). Employee Benefits: At Intelmatix, our benefits package …
writing clean, testable, production-grade ETL code at scale. Modern Data Pipelines: Experience with batch and streaming frameworks (e.g., Apache Spark, Flink, Kafka Streams, Beam), including orchestration via Airflow, Prefect, or Dagster. Data Modeling & Schema Management: Demonstrated expertise in designing, evolving, and documenting schemas (OLAP/OLTP, dimensional, star/snowflake, CDC), data contracts, and data cataloguing. API & Integration Fluency …
analysts and stakeholders, ensuring technical solutions meet business needs. Experience with data ingestion tools such as Fivetran. Advantageous: Exposure to deploying applications with Kubernetes. Experience with data orchestrator tools (Airflow, Prefect, etc.). Experience with data observability tools (Monte Carlo, Great Expectations, etc.). Experience with data catalog tools (Amundsen, OpenMetadata, etc.). Interview Process: Call with the talent team; take-home task; tech interview …
of scalable training pipelines for large datasets. Working experience leading complex, cross-functional projects and influencing technical direction across multiple teams. Familiarity with modern workflow orchestration tools such as Prefect, Kubeflow, Argo, etc. Software engineering fundamentals, including data structures, design patterns, version control (Git), CI/CD, testing, and monitoring. Exceptional problem-solving skills, with a proven ability to navigate …
Company Description With a history that dates back over 80 years, Starcom is a global communications planning and media leader. We are an agency still grounded in our founding principle that people are at the centre of all we do.
modelling (both machine-learning and econometric approaches). Familiarity with cloud platforms (AWS) and containerisation technologies (Docker). Familiarity with cloud-based ETL/ELT data pipelines and orchestrators (Airflow, Dagster, Prefect). Experience building backtests and deploying production ML. Confident communicator with the ability to work across tech and commercial teams.
on Business, Marketing, Analytics, or Computer Science. Experience with AWS. Experience with JavaScript. Basic understanding of digital interaction technologies such as live chat and virtual agents/chatbots. Experience with Prefect. Experience building user interfaces. What You Should Know About This Team: Our Proof of Concept team is known for its collaborative spirit, a strong desire to learn, and, most importantly …
and pipeline development. Extensive experience with BI/visualisation tools. Experience working with cloud data warehouses. Added bonus: Proficiency in dbt. Skills in Python/R. Experience with Fivetran, Prefect, Snowflake, and Periscope. Familiarity with writing ETL pipelines using SQL and Python, and orchestration tools like Airflow or Prefect. Background in experimentation. Experience in fast-paced, venture-backed startup environments.
have experience with RAG (Retrieval-Augmented Generation) systems, vector databases, and embedding models for knowledge extraction. You can architect complex workflows. You have experience with workflow orchestration tools (Airflow, Prefect, Temporal) or have built custom pipeline systems for multi-step autonomous processes. You bridge science and engineering. You are comfortable with scientific computing libraries (NumPy, SciPy, pandas) and understand scientific …
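The orchestration tools named throughout these listings (Airflow, Prefect, Dagster, Temporal) share one core idea: a pipeline is a dependency graph of tasks executed in topological order, with scheduling, retries, and observability layered on top. A minimal, library-free Python sketch of that idea, using only the standard library (the task names and functions here are illustrative, not any specific tool's API):

```python
from graphlib import TopologicalSorter

# Illustrative ETL steps; a real orchestrator wraps each task with
# retries, logging, scheduling, and persisted run state.
def extract():
    return [1, 2, 3]

def transform(data):
    return [x * 2 for x in data]

def load(data):
    return sum(data)

def run_pipeline():
    # Dependency graph: each task maps to the set of tasks it depends on.
    graph = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
    tasks = {"extract": extract, "transform": transform, "load": load}
    results = {}
    # static_order() yields tasks only after all their dependencies.
    for name in TopologicalSorter(graph).static_order():
        deps = [results[d] for d in sorted(graph[name])]
        results[name] = tasks[name](*deps)
    return results

print(run_pipeline())  # extract -> transform -> load
```

The point of reaching for Prefect or Airflow instead of a hand-rolled runner like this is everything around the graph: cron/event scheduling, retry policies, parallel execution, and a UI over historical runs.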