to-end from scoping, designing, and coding through release and continuous monitoring in a production environment • ELT pipeline: Experience with ELT pipelines and orchestration systems such as Airflow • Database systems: Experience working with one or more NoSQL databases such as Druid, Elasticsearch and Neo4j • AWS: Experience deploying and managing applications more »
Join a leader in generative AI technologies, which has recently secured Series A funding to advance its work in digital avatars and human clones. Role: MLOps Engineer Location: London Salary: Up to £100,000 Responsibilities: Develop ML Pipelines: Build and more »
team. Qualifications You will have expertise within the following: Java and Python development knowledge (Essential) Previous experience with Spark or Hadoop (Essential) Trino or Airflow (Desirable) Architecture and capabilities: designing and implementing complex solutions with a focus on scalability and security. Excellent communication and collaboration skills. Additional Information Location more »
Maven, Jenkins, GitHub, etc. Experience with Amazon Web Services a strong plus - CloudFormation, EMR, S3, EC2, Athena, etc. Experience with scheduling services such as Airflow and Oozie. Experience with data ETL and data modeling. Experience with building large-scale systems with extensive knowledge of data warehousing solutions. Developing prototypes and more »
designing and building robust, scalable, distributed data systems and pipelines, using open source and public cloud technologies. Strong experience with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL … . Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments: e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices: e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, git/version control, containers etc. more »
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
the above project of redesigning the Creditsafe platform into the cloud space. You will be expected to work with technologies such as Python, Linux, Airflow, AWS DynamoDB, S3, Glue, Athena, Redshift, Lambda, API Gateway, Terraform, CI/CD. KEY DUTIES AND RESPONSIBILITIES You will actively contribute to the codebase … and participate in peer reviews. Design and build a metadata-driven, event-based distributed data processing platform using technologies such as Python, Airflow, Redshift, DynamoDB, AWS Glue, S3. As an experienced Engineer, you will play a critical role in the design, development, and deployment of our business-critical system. You more »
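As a rough illustration of the metadata-driven, event-based pattern this listing describes (all names and steps here are hypothetical, not the actual platform design), a registry of per-event-type processing steps can drive a simple dispatcher. In production this routing would live in Airflow DAGs and AWS services such as Glue and Lambda; plain Python is used only to show the idea:

```python
# Hypothetical sketch of a metadata-driven, event-based dispatcher.
# Event types, step names, and sinks are invented for illustration.

# Metadata registry: event type -> ordered list of processing steps.
PIPELINE_METADATA = {
    "company_update": ["validate", "normalise", "load_redshift"],
    "score_change": ["validate", "load_dynamodb"],
}

# Step handlers: each takes and returns a record dict.
STEP_HANDLERS = {
    "validate": lambda rec: {**rec, "valid": True},
    "normalise": lambda rec: {**rec, "name": rec["name"].strip().upper()},
    "load_redshift": lambda rec: {**rec, "sink": "redshift"},
    "load_dynamodb": lambda rec: {**rec, "sink": "dynamodb"},
}

def process_event(event_type: str, record: dict) -> dict:
    """Run a record through the steps the metadata defines for its type."""
    for step in PIPELINE_METADATA[event_type]:
        record = STEP_HANDLERS[step](record)
    return record
```

The point of the pattern is that adding a new event type or reordering steps is a metadata change, not a code change.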
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
in Python or PySpark, we encourage you to apply. Python App Developer Requirements: Proficiency with Python or PySpark. Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka. Experience with Big Data solutions or relational databases. Demonstrated knowledge of software applications and technical processes within a cloud more »
utilising the best-of-breed cloud services and technologies. So, what tools and technologies will you be using? AWS Python Databricks/Spark Trino Airflow Docker CloudFormation/Terraform SQL/NoSQL We provide you with the opportunity to think freely and work creatively and right now is a … Other skills we are looking for you to demonstrate include: Experience of data storage technologies: Delta Lake, Iceberg, Hudi. Sound knowledge and understanding of Apache Spark, Databricks or Hadoop. Ability to take business requirements and translate these into tech specifications. Knowledge of architecture best practices and patterns. Competence in more »
or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and dbt. Must-haves: a team player, happy to work with several teams; this is key as you will be reporting directly to the more »
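The extract-transform-load flow this role mentions can be sketched in plain Python (a hypothetical example: in practice the extract and load steps would read from real sources and write to a warehouse, with Airflow scheduling the steps and dbt owning the SQL transforms):

```python
# Minimal ETL sketch (illustrative only; sources, fields, and the
# in-memory "warehouse" are stand-ins for real systems).

def extract() -> list[dict]:
    # Stand-in for reading from an API, S3 object, or database.
    return [{"amount": "10.5"}, {"amount": "3.0"}]

def transform(rows: list[dict]) -> list[dict]:
    # Cast string amounts to floats and tag each row with a currency.
    return [{"amount": float(r["amount"]), "currency": "GBP"} for r in rows]

def load(rows: list[dict], warehouse: list) -> None:
    # Stand-in for a warehouse insert (e.g. a Redshift COPY).
    warehouse.extend(rows)

warehouse: list = []
load(transform(extract()), warehouse)
```

Keeping the three stages as separate functions mirrors how an orchestrator treats them as independent, retryable tasks.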
Data Engineer Experience working with AI frameworks and libraries (PyTorch) Confidence collaborating in complex, cross-functional teams Strong skills with: AWS, Python, Airflow, Snowflake Interested? Apply now or reach out to daisy@wearenumi.com for further details and a chat more »
Senior AWS Data Engineer - London (Hybrid) - Permanent - £65,000-£70,000 Media Agency, Snowflake, SQL, Python, dbt, Airflow, Marketing *You must be based in London, and have full permanent right to work in the UK to apply for this role* I'm currently working with a leading media agency … ASAP and I will be in touch. Senior AWS Data Engineer - London (Hybrid) - Permanent - £65,000-£70,000 Media Agency, Snowflake, SQL, Python, dbt, Airflow, Marketing more »
Python client for Google BigQuery Advanced SQL (GoogleSQL, MySQL) Google Cloud Services Advanced BigQuery Advanced Google Cloud Storage Google Dataform Google Cloud Functions Advanced Apache Airflow Basic Tableau: ability to create basic visualisations Ability to integrate multiple data sources and databases into one system Able to create database more »
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
early-stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the more »
Job Title: Data Engineer Job Type: Full Time, Permanent Working location: London, Hybrid Role Purpose At Travelex we are developing modern data services, which will be at the heart of our relationship with our customers. Our data architecture is becoming more »
within the EU Fusion programme and connections to international HPC communities, showcasing contributions made to the field. - Experience in workflow management systems such as Apache Airflow. - Familiarity with Research Data Management methodologies, modern database technologies including SQL, NoSQL and Graph Databases, and parallel file access technologies such as MPI more »
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Microlise
data practices Possess strong knowledge of data tools, data management tools, and various data and information technologies, e.g. DAMA DMBOK, Microsoft SQL Server, Couchbase, Apache Druid, Spark, Kafka, Airflow, etc. In-depth understanding of modern data principles, methodologies, and tools Excellent communication and collaboration skills, with the ability … native computing concepts and experience working with hybrid or private cloud platforms is a plus. Demonstrable technical experience working with a Microsoft, Red Hat, and Apache data and software engineering environment. A team-oriented individual with a passion for engineered excellence and the ability to lead and motivate a team more »
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of more »
AWS ecosystems like Lambdas, Step Functions and ECS services. Experience of Dremio is a nice-to-have. Experience with data stack technologies such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake more »
Strong understanding of AWS ecosystems like Lambdas, Step Functions and ECS services. Experience with data stack technologies such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake Formation more »
Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL This is a 6-month initial contract with a trusted client of ours. CVs are being presented on Friday more »
Hybrid (2 days a week) JD: Experience of working with a Streaming & Batch technology stack – Confluent Kafka, MongoDB, StreamSets, IBM CDC, Hive, Hadoop, API, Informatica, Airflow, and other similar technologies. SME-level skills and experience in designing/architecting test automation solutions; the ability to creatively problem-solve is critical for more »
week, have a valid visa as we are not able to sponsor. Technical Stack: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow. Skills: years of experience in Python scripting; experience developing applications in Python; exposure to Python-oriented algorithm libraries such as NumPy, pandas, Beautiful Soup more »