Bath, England, United Kingdom Hybrid / WFH Options
Autodesk
such as AWS, Azure, or GCP · Docker · Documenting code, architectures, and experiments · Linux systems and bash terminals
Preferred Qualifications:
· Databases and/or data warehousing technologies, such as Apache Hive, Iceberg, etc.
· Data transformation via SQL and dbt.
· Orchestration platforms such as Apache Airflow, Argo Workflows, etc.
· Data catalogs and metadata management tools …
analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong …
oversight across the data platform, including data pipelines, orchestration and modelling. Lead the team in building and maintaining robust data pipelines, data models, and infrastructure using tools such as Airflow, AWS Redshift, dbt and Looker. Ensuring the team follows agile methodologies to improve delivery cadence and responsiveness. Contribute to hands-on coding, particularly in areas requiring architectural input, prototyping, or … mentoring skills and ability to foster team growth and development. Strong understanding of the data engineering lifecycle, from ingestion to consumption. Hands-on experience with our data stack (Redshift, Airflow, Python, dbt, MongoDB, AWS, Looker, Docker). Understanding of data modelling, transformation, and orchestration best practices. Experience delivering both internal analytics platforms and external data-facing products. Knowledge of modern …
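For illustration, a minimal sketch of the kind of pipeline such a stack implies, assuming an Airflow 2.x deployment in which a loader script lands raw data in Redshift and dbt then builds the models on top; the DAG name, loader script, and dbt paths are hypothetical, not taken from the posting:

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_warehouse_refresh",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                       # Airflow 2.4+ scheduling argument
    catchup=False,
) as dag:
    # Land raw extracts in Redshift (placeholder loader script)
    load_raw = BashOperator(
        task_id="load_raw_to_redshift",
        bash_command="python load_raw.py",
    )
    # Build dbt models on top of the raw tables
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    load_raw >> run_dbt

Looker dashboards would then read from the dbt-built marts; the orchestration/transformation split shown here is one common arrangement, not the team's actual setup.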
in Python and ML/engineering frameworks such as PyTorch, TensorFlow (including Keras), Hugging Face (Transformers, Datasets) and scikit-learn, etc. Experience with MLOps tools, including MLflow, workflow orchestrators (Airflow, Metaflow, Prefect or similar), and containerisation (Docker). Strong knowledge of cloud platforms like Azure, AWS or GCP for deploying and managing ML models. Familiarity with data engineering tools and …
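As a hedged sketch of what MLflow experiment tracking typically looks like in Python (the run name, parameters, and model are illustrative, not from the posting):

import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):        # hypothetical run name
    params = {"n_estimators": 100, "max_depth": 5}
    mlflow.log_params(params)                          # record hyperparameters
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_metric("accuracy", accuracy)            # record evaluation metric
    mlflow.sklearn.log_model(model, "model")           # version the trained model

The same run could be launched from an Airflow, Metaflow, or Prefect task and packaged in Docker, which is the combination of orchestration and containerisation the listing describes.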
by managing data integrations between Salesforce Health Cloud and Informatica, ensuring seamless data flow between systems. You’ll troubleshoot virtual machine access via SSH, facilitate migration of outputs into Airflow, and support AWS infrastructure while interfacing with Google Cloud-hosted systems. The position plays a key role in enabling actionable insights and operational stability across complex, cross-platform data … or Informatica Cloud. Solid understanding of data extraction, transformation, and loading (ETL) processes. Proficiency in SSH and managing/troubleshooting virtual machine environments. Familiarity with orchestration tools such as Apache Airflow. Experience with AWS services (e.g., S3, EC2, RDS). Understanding of cloud-based environments (Google Cloud Platform experience a plus). Ability to collaborate across infrastructure, data engineering, and … Manage and optimize data flows between Salesforce Health Cloud and Informatica. Perform infrastructure-level troubleshooting using SSH and virtual machine access. Ensure high-quality data migration and integration into Airflow-managed pipelines. Support AWS-based data operations and contribute to multi-cloud data management. Collaborate with infrastructure and application teams to maintain seamless operations. Analyze and validate data transformations …
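To make the AWS piece concrete, a minimal sketch of staging an extract in S3 for an Airflow-managed pipeline to pick up, assuming boto3 credentials are already configured; the bucket, key, and file names are hypothetical:

import boto3

s3 = boto3.client("s3")

bucket = "example-data-staging"                           # hypothetical bucket
key = "salesforce_health_cloud/informatica_output.csv"    # hypothetical object key

# Stage the Informatica output where the downstream Airflow DAG expects it
s3.upload_file(Filename="informatica_output.csv", Bucket=bucket, Key=key)

# Quick sanity check that the object landed before the pipeline runs
head = s3.head_object(Bucket=bucket, Key=key)
print(f"Uploaded {head['ContentLength']} bytes to s3://{bucket}/{key}")

An S3 key sensor or a scheduled DAG on the Airflow side would then detect the new object and continue the migration described above.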
data warehouses (Snowflake, BigQuery), lakes, and CDPs. Understanding of data pipelines, contracts, lineage, APIs, and data governance frameworks. Comfortable navigating GDPR, CCPA and other compliance requirements. Familiar with dbt, Airflow, and cloud platforms (AWS, GCP, Azure). Proven ability to manage backlogs, engage stakeholders and deliver in Agile environments. Analytical mindset with a strong commercial awareness. Why Apply? Excellent …
deploying and managing AI/ML models in financial systems. Proficiency in Python and familiarity with AI/ML tools and platforms such as Azure, AWS, GCP, Databricks, MLflow, Airflow and financial-specific platforms like Bloomberg Terminal, SAS, or MATLAB. Experience with structured and unstructured financial data, including time-series analysis, market data and transactional data. Ability to articulate …
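As a rough illustration of the time-series side of that requirement, a short pandas sketch computing daily returns and rolling volatility from market data; the file and column names are assumptions, not from the posting:

import pandas as pd

# Hypothetical market data file with a timestamp column and a closing price
prices = pd.read_csv("market_data.csv", parse_dates=["timestamp"], index_col="timestamp")

daily_close = prices["close"].resample("1D").last()   # end-of-day prices
returns = daily_close.pct_change().dropna()           # simple daily returns
rolling_vol = returns.rolling(window=21).std()        # ~1-month rolling volatility

print(returns.describe())
print(rolling_vol.tail())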