such as TensorFlow, HuggingFace, scikit-learn, PyTorch, Pandas, NumPy, SciPy. Experience with AWS (EC2, S3, SageMaker) or Azure/GCP equivalents. Experience designing, developing, and deploying scalable infrastructure (e.g., Apache Airflow, Luigi). Object-oriented programming concepts and design. Ability to create well-documented, modular, and unit-tested code. Understanding of Agile development and tools like pip, git
HuggingFace, scikit-learn, PyTorch, Pandas, NumPy, SciPy Experience with AWS (principally EC2, S3, SageMaker) or Azure/GCP equivalents Some experience designing, developing, and deploying scalable infrastructure (e.g. Apache Airflow, Luigi, or other cluster management software) Object-Oriented concepts and design The ability to design and build unit-tested and well-documented modular code Understanding of Agile
as the ability to learn quickly and apply new skills Desirable Solid understanding of microservices development SQL and NoSQL databases working set Familiar with or able to quickly learn Apache NiFi, Apache Airflow, Apache Kafka, KeyCloak, Serverless Computing, GraphQL, APIs, APIM Good skills working with JSON, XML, YAML files Knowledge in Python, Java, awk, sed, Ansible
warehousing concepts and data modeling. Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions. Understanding of, or hands-on experience with, orchestration solutions such as Airflow. Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability
Cambridge, England, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
Position: Data Engineer Location: Cambridge/Luton, UK (Hybrid, 2-3 days onsite per week) Duration: Long-Term B2B Contract Job Description: The ideal candidate will have at least 5 years of experience working with Snowflake, DBT, Python
The ideal candidate will have at least 5 years of experience with strong expertise in Snowflake, DBT, Python, and AWS to deliver ETL/ELT pipelines. Proficiency in Snowflake data
data warehousing concepts and data modeling. Excellent problem-solving and communication skills focused on delivering high-quality solutions. Understanding of, or hands-on experience with, orchestration tools such as Apache Airflow. Deep knowledge of non-functional requirements such as availability, scalability, operability, and maintainability.
warehousing concepts and data modelling. • Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions. • Understanding of, or hands-on experience with, orchestration solutions such as Airflow • Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability Evaluation will be done according to the below: Area of Assessment Priority Data Architect Must Have
warehousing concepts and data modeling. Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions. Understanding of, or hands-on experience with, orchestration solutions such as Airflow. Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability. Coforge is an equal opportunities employer and welcomes applications from all sections of society and
Bedford, England, United Kingdom Hybrid / WFH Options
Circle Anglia Housing Association (Wherry)
with RDBMS (preferably Snowflake or SQL Server) and large-scale data processing through ETL pipelines. Familiarity with Git or MS Visual Studio TFS, data orchestration tools like ADF, Airbyte, Airflow, or Luigi, and a high competency in SQL are required. The role demands confident programming skills (C#, Python, or JavaScript) with a focus on data manipulation and integration, alongside
Required Skills: Proven experience managing Power BI deployments (including workspaces, datasets, and reports). Strong understanding of data pipeline deployment using tools like Azure Data Factory, AWS Glue, or Apache Airflow. Hands-on experience with CI/CD tools (Azure DevOps, GitHub Actions, Jenkins). Proficiency in scripting (PowerShell, Python, or Bash) for deployment automation. Experience with manual deployment
Cambridge, England, United Kingdom Hybrid / WFH Options
Bit Bio
Spark Streaming. Relational SQL and NoSQL databases, including Postgres and Cassandra. Experience designing and implementing knowledge graphs for data integration and analysis. Data pipeline and workflow management tools: Luigi, Airflow, etc. AWS cloud services: EC2, S3, Glue, Athena, API Gateway, Redshift. Experience with object-oriented and scripting languages: Python, R. Designing and building APIs (RESTful, etc.). Understanding of FAIR
Akkodis is a global leader in engineering, technology, and R&D, harnessing the power of connected data to drive digital transformation and innovation for a smarter, more sustainable future. As part of the Adecco Group, Akkodis combines the expertise of
Job Description Snowflake Architect with Azure (Permanent Role) Basildon, UK (Work from Client office 5 days a week) Responsibilities: Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS. Implement data warehousing solutions, ensuring efficient storage
platform, ensuring scalability, reliability, and security. Drive modernisation by transitioning from legacy systems to a lean, scalable platform. Act as a lead expert for technologies such as AWS, DBT, Airflow, and Databricks. Establish best practices for data modelling, ingestion, storage, streaming, and APIs. Governance & Standards Ensure all technical decisions are well-justified, documented, and aligned with business needs. Lead … in data engineering and cloud engineering, including data ingestion, transformation, and storage. Significant hands-on experience with AWS and its data services. Expert-level skills in SQL, Python, DBT, Airflow and Redshift. Confidence in coding, scripting, configuring, versioning, debugging, testing, and deploying. Ability to guide and mentor others in technical best practices. A product mindset, focusing on user needs
Norwich, England, United Kingdom Hybrid / WFH Options
Political, International and Development Studies Student Association
and/or Data Warehousing. An understanding of cloud technologies (hands-on experience not required). Development of data pipelines based on SQL with orchestration tools such as Airflow or Tivoli scheduler. Intermediate programming skills in Python. Awareness of CI/CD practices. We’re looking for confident communicators who can break down technical concepts in a clear
common life sciences data acquisition software, such as Scientific Data Management Systems (SDMS) or Laboratory Information Management Systems (LIMS). Hands-on experience with data pipeline orchestration tools (e.g., Apache Airflow) and data parsing. Familiarity with cloud service models, SaaS infrastructure, and related SDLC. Familiarity with containerization and container orchestration tools (e.g., Docker, Kubernetes). ZONTAL is an