working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and DBT is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in …
Gateway, and Python to maintain and enhance integrations. … Phase 2: Legacy Pipeline Migration (Months 2-3): Analyze and understand existing R-based data pipelines created by data scientists. Migrate these pipelines into Airflow, dbt, and Terraform workflows. Modernize and scale legacy infrastructure running on AWS. Collaborate with engineering teams to ensure a smooth transition and system stability. … beneficial for interpreting existing scripts). Cloud & Infrastructure: AWS services including Lambda, API Gateway, S3, CloudWatch, and Kinesis Firehose; Terraform for infrastructure as code. Orchestration & Transformation: Apache Airflow and dbt. CRM & Marketing Tools: Braze (preferred); familiarity with other CRM/marketing automation tools such as Iterable or Salesforce Marketing Cloud is …
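Purely as an illustration of the migration pattern this posting describes, here is a minimal Airflow DAG sketch in Python: a Python task standing in for logic ported from a legacy R script, followed by a dbt run. The DAG id, schedule, and dbt project path are hypothetical and not taken from the listing.

```python
# Minimal sketch of an Airflow DAG replacing a legacy R pipeline:
# a Python task pulls source data, then dbt rebuilds the downstream models.
# DAG id, schedule, and the dbt project path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_source_data(**context):
    """Placeholder for extraction logic ported from the legacy R script."""
    # e.g. call the upstream API and land raw records in S3
    print("extracting for", context["ds"])


with DAG(
    dag_id="legacy_r_pipeline_migration",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_source_data",
        python_callable=extract_source_data,
    )

    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # hypothetical path
    )

    extract >> dbt_run
```

In practice the extraction step would be ported piece by piece from the R script, while Terraform provisions the AWS resources the DAG depends on.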
software, such as Scientific Data Management Systems (SDMS) or Laboratory Information Management Systems (LIMS). Hands-on experience with data pipeline orchestration tools (e.g., Apache Airflow) and data parsing. Familiarity with cloud service models, SaaS infrastructure, and related SDLC. Familiarity with containerization and container orchestration tools (e.g., Docker …
Liverpool, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
maintain Python code • Create Terraform scripts for infrastructure provisioning • Implement IAM policies and permissions for applications and GitHub • Change environment variables for applications and Airflow DAGs. If you are interested, please apply with your updated CV. We will arrange a call to discuss your application further.
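The posting provisions IAM with Terraform; purely to sketch the shape of the access policies such scripts would manage, here is a hedged boto3 example in Python. The bucket, policy, and role names are hypothetical.

```python
# Sketch only: the role uses Terraform for provisioning, but the IAM policy
# document itself looks the same either way. Names and ARNs are hypothetical.
import json

import boto3

POLICY_DOCUMENT = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::example-data-bucket/*",  # hypothetical bucket
        }
    ],
}

iam = boto3.client("iam")

# Create the managed policy and attach it to the application's execution role.
policy = iam.create_policy(
    PolicyName="app-data-access",  # hypothetical name
    PolicyDocument=json.dumps(POLICY_DOCUMENT),
)
iam.attach_role_policy(
    RoleName="app-execution-role",  # hypothetical role
    PolicyArn=policy["Policy"]["Arn"],
)
```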
engineers, and product owners to define use cases and deliver scalable solutions. Model Deployment & Monitoring: Deploy models using MLOps practices and tools (e.g., MLflow, Airflow, Docker, cloud platforms), ensuring performance, reliability, and governance compliance. Innovation & Research: Stay current on advancements in AI/ML and proactively bring forward new … experimentation. Data Access & Engineering Collaboration: Comfort working with cloud data warehouses (e.g., Snowflake, Databricks, Redshift, BigQuery) • Familiarity with data pipelines and orchestration tools like Airflow • Work closely with Data Engineers to ensure model-ready data and scalable pipelines. Nice to have: Prior experience working in financial services or within …
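As a hedged sketch of the MLOps tooling mentioned above, the snippet below logs a model and its evaluation metric to MLflow before deployment. The experiment name, dataset, and metric are illustrative assumptions, not details from the role.

```python
# Minimal sketch of tracking a model with MLflow ahead of deployment.
# Experiment name, features, and metric are illustrative only.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("credit-scoring-poc")  # hypothetical experiment

with mlflow.start_run():
    model = LogisticRegression(max_iter=500).fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

    mlflow.log_param("max_iter", 500)
    mlflow.log_metric("test_auc", auc)
    mlflow.sklearn.log_model(model, "model")  # registered/served downstream
```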
Gateway, and Kinesis. Integrating third-party APIs into the data platform and transforming data for CRM delivery. Migrating R-based data streams into modern Airflow-managed Python/DBT pipelines. Ensuring observability and reliability using CloudWatch and automated monitoring. Supporting both BAU and new feature development within the data … services including Lambda, API Gateway, S3, Kinesis, and CloudWatch. Strong programming ability in Python and data transformation skills using SQL and DBT. Experience with Airflow for orchestration and scheduling. Familiarity with third-party API integration and scalable data delivery methods. Excellent communication and the ability to work in a …
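To illustrate the integration pattern described above, here is a minimal Python sketch that pulls events from a third-party API and forwards them to a Kinesis Firehose delivery stream via boto3; CloudWatch would pick up the resulting Lambda and Firehose metrics for monitoring. The API URL and stream name are hypothetical.

```python
# Sketch of the integration pattern: pull events from a third-party API
# and forward them to a Kinesis Firehose delivery stream.
# The API URL and stream name are hypothetical.
import json

import boto3
import requests

firehose = boto3.client("firehose")

response = requests.get("https://api.example-crm.com/v1/events", timeout=30)
response.raise_for_status()

for event in response.json().get("events", []):
    firehose.put_record(
        DeliveryStreamName="crm-events-stream",  # hypothetical stream
        Record={"Data": (json.dumps(event) + "\n").encode("utf-8")},
    )
```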
Databricks. Must Have: Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam, or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/… MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF, etc. Excellent consulting experience and ability to design and build solutions, actively contributing to RfP responses. Ability to be a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions … years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF …
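As a minimal sketch of the Big Data processing referenced here, the PySpark snippet below reads raw events from object storage and aggregates them per customer per day. The paths and column names are assumptions for illustration only.

```python
# Minimal PySpark sketch of batch processing on a hyperscaler:
# read raw events from object storage and aggregate per customer per day.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-aggregation").getOrCreate()

events = spark.read.parquet("s3://example-bucket/raw/events/")  # hypothetical path

daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("customer_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

daily_counts.write.mode("overwrite").parquet(
    "s3://example-bucket/curated/daily_counts/"  # hypothetical path
)
```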
security. Drive modernisation by transitioning from legacy systems to a lean, scalable platform. Act as a lead expert for technologies such as AWS, DBT, Airflow, and Databricks. Establish best practices for data modelling, ingestion, storage, streaming, and APIs. Governance & Standards: Ensure all technical decisions are well-justified, documented, and … engineering, including data ingestion, transformation, and storage. Significant hands-on experience with AWS and its data services. Expert-level skills in SQL, Python, DBT, Airflow, and Redshift. Confidence in coding, scripting, configuring, versioning, debugging, testing, and deploying. Ability to guide and mentor others in technical best practices. A product …