privacy, and security, ensuring our AI systems are developed and used responsibly and ethically. Tooling the Future: Get hands-on with cutting-edge technologies like Hugging Face, PyTorch, TensorFlow, Apache Spark, Apache Airflow, and other modern data and ML frameworks. Collaborate and Lead: Partner closely with ML Engineers, Data Scientists, and Researchers to understand their data needs … their data, compute, and storage services. Programming Prowess: Strong programming skills in Python and SQL are essential. Big Data Ecosystem Expertise: Hands-on experience with big data technologies like Apache Spark, Kafka, and data orchestration tools such as Apache Airflow or Prefect. ML Data Acumen: Solid understanding of data requirements for machine learning models, including feature engineering …
analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and DAGs. Write efficient and maintainable Python scripts for data processing, automation, and integration with various data sources and APIs. Ensure data … and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionalities. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms (e.g., AWS, Azure, GCP) and relevant data services. Understanding …
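For readers unfamiliar with the "custom Python operators and DAGs" this listing mentions, here is a minimal Airflow 2.x sketch; the connection ID, table name, and schedule are illustrative assumptions rather than details from the posting.

    # Hedged sketch only: a custom operator plus a daily DAG.
    # The Snowflake connection, table, and threshold are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.models.baseoperator import BaseOperator


    class SnowflakeRowCountCheckOperator(BaseOperator):
        """Fail the task if a Snowflake table holds fewer rows than expected."""

        def __init__(self, table, min_rows, conn_id="snowflake_default", **kwargs):
            super().__init__(**kwargs)
            self.table = table
            self.min_rows = min_rows
            self.conn_id = conn_id

        def execute(self, context):
            # Imported inside execute so the DAG file still parses even if
            # the Snowflake provider package is missing at parse time.
            from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

            hook = SnowflakeHook(snowflake_conn_id=self.conn_id)
            count = hook.get_first(f"SELECT COUNT(*) FROM {self.table}")[0]
            if count < self.min_rows:
                raise ValueError(f"{self.table}: {count} rows < {self.min_rows}")


    with DAG(
        dag_id="daily_snowflake_quality_check",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        SnowflakeRowCountCheckOperator(
            task_id="check_orders_row_count",
            table="analytics.orders",
            min_rows=1,
        )

In a stack like the one described, dbt would own the SQL transformations and a DAG along these lines would simply orchestrate dbt runs and checks around them.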
and applying best practices in security and compliance, this role offers both technical depth and impact. Key Responsibilities Design & Optimise Pipelines - Build and refine ETL/ELT workflows using Apache Airflow for orchestration. Data Ingestion - Create reliable ingestion processes from APIs and internal systems, leveraging tools such as Kafka, Spark, or AWS-native services. Cloud Data Platforms - Develop … DAGs and configurations. Security & Compliance - Apply encryption, access control (IAM), and GDPR-aligned data practices. Technical Skills & Experience Proficient in Python and SQL for data processing. Solid experience with Apache Airflow - writing and configuring DAGs. Strong AWS skills (S3, Redshift, etc.). Big data experience with Apache Spark. Knowledge of data modelling, schema design, and partitioning. Understanding …
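As a rough illustration of the API-to-S3 ingestion step described here, the sketch below lands one batch of JSON in date-partitioned S3 keys; the endpoint, bucket, and key layout are assumptions, not details from the posting.

    # Hedged sketch: pull one batch from an HTTP API and land it in S3.
    import json
    from datetime import datetime, timezone

    import boto3
    import requests


    def ingest_to_s3(api_url, bucket):
        """Fetch a batch of records and write them to a date-partitioned key."""
        response = requests.get(api_url, timeout=30)
        response.raise_for_status()

        now = datetime.now(timezone.utc)
        # Hive-style dt= partitions keep downstream Spark/Redshift Spectrum
        # scans cheap and make backfills easy to reason about.
        key = f"raw/events/dt={now:%Y-%m-%d}/batch_{now:%H%M%S}.json"

        boto3.client("s3").put_object(
            Bucket=bucket,
            Key=key,
            Body=json.dumps(response.json()).encode("utf-8"),
        )
        return key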
and evaluation through continuous monitoring and scaling. Build & Optimise AI models in Python: fine-tune state-of-the-art architectures on our in-house GPU cluster. Orchestrate Workflows with Apache Airflow: schedule, monitor, and maintain complex data and model pipelines. Engineer Cloud Services on AWS (Lambda, ECS/EKS, S3, Redshift, etc.) and automate deployments using GitHub Actions … testing, and monitoring. Startup mindset: proactive, resourceful, ambitious, driven to innovate, eager to learn, and comfortable wearing multiple hats in a fast-moving environment. Desirable: hands-on experience with Apache Airflow, AWS services (especially Redshift, S3, ECS/EKS), and IaC tools like Pulumi. Why Permutable AI? Hybrid Flexibility: Spend 2+ days/week in our Vauxhall hub. …
Front End ability (Vue, React or Angular good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer, Fixed Income) … the team to be in the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech attitude towards technology. Hours are …
Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines and version control systems like Git. Knowledge of ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Knowledge of data governance and best practices in data management. Familiarity with cloud platforms and services such as AWS, Azure, or GCP for deploying and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or GCP.
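The "Apache Spark Streaming, Kafka or similar" requirement above usually means the Structured Streaming pattern sketched below; the broker address, topic, and paths are placeholders, and the job needs the spark-sql-kafka connector package on the classpath.

    # Hedged sketch: read a Kafka topic with Spark Structured Streaming
    # and persist it as Parquet. All names and paths are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("kafka_to_parquet").getOrCreate()

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")
        .option("subscribe", "events")
        .load()
        # Kafka hands over raw bytes; cast the value column to a string.
        .select(col("value").cast("string").alias("payload"))
    )

    (
        events.writeStream.format("parquet")
        .option("path", "/tmp/events")
        # The checkpoint is what lets the stream restart without data loss.
        .option("checkpointLocation", "/tmp/checkpoints/events")
        .start()
        .awaitTermination()
    )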
data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Strong understanding of data governance and best practices in data management. Experience with cloud platforms and services such as AWS, Azure, or GCP for deploying and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or …
or other testing methodologies Preferred: Familiarity with PostgreSQL and Snowflake Preferred: Familiarity with Web Frameworks such as Django, Flask or FastAPI Preferred: Familiarity with event streaming platforms such as Apache Kafka Preferred: Familiarity with data pipeline platforms such as Apache Airflow Preferred: Familiarity with Java Preferred: Experience in one or more relevant financial areas (market data, order …
… Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes). Familiarity with data orchestration tools (e.g., Prefect, Apache Airflow). Familiarity with CI/CD pipelines and DevOps practices. Familiarity with Infrastructure-as-code tools (e.g., Terraform, AWS CDK). Employee Benefits: At Intelmatix, our benefits …
delivering end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus.
business glossary, and data mapping framework using metadata management and data catalog tools. Automate data classification, lineage tracking, and policy enforcement through scripts, APIs, and orchestration tools (e.g., dbt, Airflow). Map & Visualize Data Flows: Design and maintain clear documentation and visualizations of data movement across systems, focusing on sensitive and business-critical data. Drive Cross-Functional Alignment: Collaborate … governance SME; support teams with tooling, guidance, and best practices. About You: Strong technical foundation in data governance architecture and tooling. You've worked with tools such as DataHub, Apache Airflow, AWS, dbt, Snowflake, BigQuery, or similar. Hands-on experience building and maintaining centralized data inventories, business glossaries, and data mapping frameworks. Proficient in automating data classification and lineage using scripting languages like Python, SQL, or Java, along with orchestration tools such as Airflow and dbt. 5+ years of experience in data governance, privacy, or data engineering roles, especially in settings that integrate governance tightly into data platform design. Familiarity with privacy-by-design, data minimization, and regulatory standards including GDPR, ISO 27001, SOC 2, and …
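To make "automating data classification" concrete, here is one minimal, hedged approach: tagging likely-PII columns by name. The patterns and tags are illustrative, and a real pipeline would push results into a catalog such as DataHub rather than printing them.

    # Hedged sketch of name-based PII classification; patterns are illustrative.
    import re

    PII_PATTERNS = {
        "email": re.compile(r"email", re.IGNORECASE),
        "phone": re.compile(r"phone|mobile", re.IGNORECASE),
        "person_name": re.compile(r"(first|last|full)_?name", re.IGNORECASE),
    }


    def classify_columns(columns):
        """Return a column -> tag mapping for columns that look like PII."""
        tags = {}
        for column in columns:
            for tag, pattern in PII_PATTERNS.items():
                if pattern.search(column):
                    tags[column] = tag
                    break  # first matching tag wins
        return tags


    print(classify_columns(["user_email", "order_id", "phone_number"]))
    # -> {'user_email': 'email', 'phone_number': 'phone'}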
systems, with a focus on data quality and reliability. Design and manage data storage solutions, including databases, warehouses, and lakes. Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads. Operations & Tooling Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and … ELT pipelines and data architectures. Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services. Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch. Strong understanding of data governance, security, and best practices for data quality. Effective communicator with the ability to work across technical and non-technical teams. … Additional Strengths Experience with orchestration tools like Apache Airflow. Knowledge of real-time data processing and event-driven architectures. Familiarity with observability tools and anomaly detection for production systems. Exposure to data visualization platforms such as Tableau or Looker. Relevant cloud or data engineering certifications. What we offer: A collaborative and transparent company culture founded on Integrity, Innovation and …
engineering best practices: Git, version control, CI/CD, testing Confident communicator with the ability to work effectively across technical and non-technical teams Bonus Points For: Experience with Airflow, Airbyte, or other orchestration tools Familiarity with ingestion tools like Fivetran Experience working with Spark or distributed computing systems Exposure to AWS and broader cloud infrastructure Knowledge of reverse …
meetings. What You Need to Succeed Strong skills in Python and SQL Demonstrable hands-on experience in AWS cloud Data ingestion, both batch and streaming, and data transformations (Airflow, Glue, Lambda, Snowflake Data Loader, Fivetran, Spark, Hive, etc.). Apply agile thinking to your work. Delivering in iterations that incrementally build on what went before. Excellent problem-solving … translate concepts into easily understood diagrams and visuals for both technical and non-technical people alike. AWS cloud products (Lambda functions, Redshift, S3, Amazon MQ, Kinesis, EMR, RDS (Postgres)). Apache Airflow for orchestration. DBT for data transformations. Machine Learning for product insights and recommendations. Experience with microservices using technologies like Docker for local development. Apply engineering best practices …
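For the streaming side of the ingestion stack listed above (Lambda plus Kinesis), a handler along these lines is typical; the stream wiring and target table are assumptions, not details from the posting.

    # Hedged sketch of a Lambda handler on a Kinesis trigger; the DynamoDB
    # table name is hypothetical. Kinesis delivers record data base64-encoded.
    import base64
    import json

    import boto3

    table = boto3.resource("dynamodb").Table("ingested_events")


    def handler(event, context):
        """Decode each Kinesis record and persist it; a raised error makes
        Lambda retry the whole batch, so writes should be idempotent."""
        with table.batch_writer() as batch:
            for record in event["Records"]:
                payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
                batch.put_item(Item=payload)
        return {"processed": len(event["Records"])}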
fintech, crypto, or trading industries; familiarity with FIX is a plus. Experience in object-oriented development with strong software engineering foundations. Experience with data-engineering cloud technologies such as Apache Airflow, K8S, Clickhouse, Snowflake, Redis, cache technologies, and Kafka. Proven experience with relational and non-relational databases; proficient in SQL and query optimization. Experience designing infrastructure at scale …
Strong knowledge of algorithms, design patterns, OOP, threading, multiprocessing, etc. Experience with SQL, NoSQL, or tick databases Experience working in a Unix environment and git Familiarity with Kafka, Docker, Airflow, Luigi Strong communication skills in verbal and written English. Domain knowledge in futures & swaps is a plus Highly competitive compensation and bonus structure Meritocratic environment with ample opportunity for …
MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language such as Java or Scala. Willingness to mentor more junior members of the team. Strong analytical and problem …
following engineering disciplines: Cloud Engineering Data Engineering (not building pipelines but designing and building the framework) DevOps MLOps/LLMOps Often work with the following technologies: Azure, AWS, GCP Airflow, dbt, Databricks, Snowflake, etc. GitHub, Azure DevOps and related developer tooling and CI/CD platforms, Terraform or other Infra-as-Code MLflow, AzureML or similar for MLOps; LangSmith …
experience. Experience in data quality engineering, software testing, or data engineering. Strong proficiency in SQL and Python with the ability to validate large datasets. Experience with Snowflake, Databricks, Spark, Airflow, or similar tools. Proven ability to build and scale automated data testing frameworks. Experience leading strategic initiatives and/or managing engineering team members. Strong problem-solving and debugging …
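"Automated data testing frameworks" in postings like this often start as a simple pytest suite over extracted samples; a minimal sketch with illustrative column names is shown below (in production the fixture would read from Snowflake or Databricks rather than building a frame inline).

    # Hedged sketch of automated data-quality tests with pytest and pandas.
    import pandas as pd
    import pytest


    @pytest.fixture
    def orders():
        # Stand-in for a warehouse query; columns are illustrative.
        return pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 5.5, 8.25]})


    def test_primary_key_is_unique(orders):
        assert orders["order_id"].is_unique


    def test_amounts_are_positive(orders):
        assert (orders["amount"] > 0).all()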
Cloud Data Warehouse - Snowflake AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, DBT, Fivetran, Airflow CI/CD - GitHub Actions/Jenkins Nice to Have Experience: Understanding of various data architecture paradigms (e.g., Data Lakehouse, Data Warehouse, Data Mesh) and their applicability to different …