. Experience with SQL and query design on large, complex datasets. Experience with cloud and big-data tools and frameworks like Databricks/Spark, Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as dbt … Fivetran, etc. Understanding of Agile delivery best practices. Good knowledge of the relevant technologies, e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong problem-solving and analytical abilities. Ability to present solutions and limitations to non-IT business experts. ABOUT YOU Integrity, respect, intellectual curiosity and an …
the above project of redesigning the Creditsafe platform into the cloud space. You will be expected to work with technologies such as Python, Linux, Airflow, AWS DynamoDB, S3, Glue, Athena, Redshift, Lambda, API Gateway, Terraform, CI/CD. KEY DUTIES AND RESPONSIBILITIES You will actively contribute to the codebase … and participate in peer reviews. Design and build a metadata-driven, event-based distributed data processing platform using technologies such as Python, Airflow, Redshift, DynamoDB, AWS Glue, S3. As an experienced Engineer, you will play a critical role in the design, development, and deployment of our business-critical system. You …
Databricks • Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). • In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub/Sub/Kinesis/… MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF etc. • Excellent consulting experience and ability to design and build solutions and actively contribute to RFP responses. • Ability to be a SPOC for all technical discussions across industry groups. • Excellent design experience, with entrepreneurship skills to own and lead solutions …
leading marketplace platform is looking for a Principal Data Engineer to take a lead on several aspects of their data platform, built in Python, Airflow, Kafka, SageMaker, AWS and GCP. 💰 Salary: up to £130k + 15% bonus 📍 Location: can be based anywhere in the UK ✅ Must-have requirements: Mastery … of Python and SQL. Significant experience with orchestration tools, ideally Airflow. Strong data architecture experience. Cloud expertise in either GCP or AWS (ideally both). Knowledge of MPP systems such as Athena, BigQuery, EMR, Hive, Iceberg. Exposure to data streaming technologies such as Kafka or Kinesis …
processes, effective monitoring, and infrastructure-as-code using Terraform. Collaborate closely with our engineering teams to support the orchestration of our ETL pipelines using Airflow, and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for …
ensuring reliability, scalability, and efficiency. Design and implement ETL processes and data pipelines primarily within the AWS environment. Utilize Python coding skills, particularly in Airflow, to enhance and optimize our infrastructure. Collaborate with cross-functional teams to understand data requirements and implement solutions. Gain proficiency in AWS services such … Science, Engineering, or related field. 0-3 years of experience in data engineering or related roles. Strong proficiency in Python programming, with experience in Airflow preferred. Familiarity with AWS services such as Lambda functions, Glue, and S3 is a plus. Solid understanding of SQL for ETL processes, including experience …
Job Title: Software Engineer (Data/Airflow) Client: Elite FinTech Firm Salary: Up to £130k + Bonus Location: London (Hybrid) Sells: Cutting-edge tech, ownership of multiple greenfield projects, no red tape, a friendly/collaborative environment, beautiful offices, personal projects on Fridays! An Elite FinTech Firm is looking … They are fully open to experience level and will find good fits for the best people. Strong experience with Python or Rust. Experience with Airflow. Exposure to building ETL pipelines is a huge plus. A desire to learn Rust. Solid SQL knowledge. Fantastic education. Experience working in mission-critical …
Senior Data Engineer Mobysoft is one of the fastest growing SaaS providers in the UK and has been shortlisted in the "Top 50 fastest growing technology companies in the North" for four successive years. Mobysoft provides predictive analytical software that …
Machine Learning Engineer up to £75,000 London NEW Machine Learning Engineer Opportunity Available with a Leading Organisation! The Company: Are you an expert in Machine Learning? We are on the lookout for a Machine Learning Engineer to join an …
MACHINE LEARNING ENGINEER £75,000 HYBRID - London COMPANY: We are looking for a Machine Learning Engineer to join a leading Marketing Consultancy. They are looking to grow to be an even bigger player within the industry through Data scientists/…
implement the systems that require the highest data throughput in Java. We implement most of our long-running services and analytics in C#. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, ELK for logs, and Grafana, Prometheus & InfluxDB for metrics, Docker …
or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and dbt. Must-haves: A team player, happy to work with several teams; this is key, as you will be reporting directly to the …
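The pipeline development this listing describes can be sketched in miniature with Python's standard library alone. The CSV data, table name and filter threshold below are invented for illustration; a production pipeline at this kind of company would run on the ad's stack (AWS, Airflow, dbt) rather than in-memory SQLite:

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (here an in-memory string; in practice a file or S3 object).
raw = "order_id,amount\n1,10.50\n2,3.25\n3,7.00\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and filter out small orders (threshold is hypothetical).
transformed = [
    (int(r["order_id"]), float(r["amount"]))
    for r in rows
    if float(r["amount"]) >= 5.0
]

# Load: write the cleaned rows into a warehouse table (SQLite stands in here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", transformed)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 17.5
```

An orchestrator such as Airflow would schedule each of the three stages as a separate task with retries and monitoring, rather than running them in one script.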
Data Engineer Experience working with AI frameworks and libraries (PyTorch). Confidence collaborating in complex, cross-functional teams. Strong skills with: AWS, Python, Airflow, Snowflake. Interested? Apply now or reach out to daisy@wearenumi.com for further details and a chat …
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
early-stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, dbt, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the …
to-end from scoping, designing, coding and release to continuous monitoring in a production environment • ELT pipelines: Experience with ELT pipelines and orchestration systems such as Airflow • Database systems: Experience working with one or more non-SQL databases such as Druid, Elasticsearch and Neo4j • AWS: Experience deploying and managing applications …
utilising best-of-breed cloud services and technologies. So, what tools and technologies will you be using? AWS, Python, Databricks/Spark, Trino, Airflow, Docker, CloudFormation/Terraform, SQL/NoSQL. We provide you with the opportunity to think freely and work creatively, and right now is a … Other skills we are looking for you to demonstrate include: Experience of data storage technologies: Delta Lake, Iceberg, Hudi. Sound knowledge and understanding of Apache Spark, Databricks or Hadoop. Ability to take business requirements and translate these into tech specifications. Knowledge of architecture best practices and patterns. Competence in …
Senior AWS Data Engineer - London (Hybrid) - Permanent - £65,000-£70,000. Media Agency, Snowflake, SQL, Python, dbt, Airflow, Marketing. *You must be based in London and have full permanent right to work in the UK to apply for this role* I'm currently working with a leading media agency … ASAP and I will be in touch.
Job Title: Data Engineer Job Type: Full Time, Permanent Working location: London, Hybrid Role Purpose At Travelex we are developing modern data services, which will be at the heart of our relationship with our customers. Our data architecture is becoming …
Join a leader in generative AI technologies, who has recently secured Series A funding to advance its work in digital avatars and human clones. Role: MLOps Engineer. Location: London. Salary: Up to £100,000. Responsibilities: Develop ML Pipelines: Build and …
Knutsford, England, United Kingdom Hybrid / WFH Options
Capgemini
Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. Understanding of data modelling principles and best …
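The kind of transformation query this listing asks for can be sketched with Python's built-in sqlite3 module. The events table, column names and spend threshold are hypothetical stand-ins; in the role itself, similar SQL would run on Snowflake, Redshift or Starburst:

```python
import sqlite3

# Toy source data: per-user event rows (schema and values are invented).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id INTEGER, event_type TEXT, amount REAL);
INSERT INTO events VALUES
  (1, 'purchase', 20.0),
  (1, 'refund',   -5.0),
  (2, 'purchase', 12.5),
  (2, 'purchase',  7.5);
""")

# Transformation: net spend per user, keeping only users above a threshold.
query = """
SELECT user_id, SUM(amount) AS net_spend
FROM events
GROUP BY user_id
HAVING SUM(amount) > 10
ORDER BY user_id
"""
result = conn.execute(query).fetchall()
print(result)  # [(1, 15.0), (2, 20.0)]
```

Aggregation with GROUP BY/HAVING like this is the core of most analytical transformation queries, whatever the warehouse engine.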
Northamptonshire, England, United Kingdom Hybrid / WFH Options
Capgemini
Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. Understanding of data modelling principles and best …
Maven, Jenkins, GitHub, etc. Experience with Amazon Web Services a strong plus - CloudFormation, EMR, S3, EC2, Athena etc. Experience with scheduling services such as Airflow, Oozie. Experience with data ETL and data modeling. Experience building large-scale systems, with extensive knowledge of data warehousing solutions. Developing prototypes and …
Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - dbt (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL. This is a 6-month initial contract with a trusted client of ours. CVs are being presented on Friday …
have a valid visa, as we are not able to sponsor. Technical Stack: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow. Skills: … years of experience in Python scripting; experience in developing applications in Python; exposure to Python-oriented algorithm libraries such as NumPy …