utilising the best-of-breed Cloud services and technologies. So, what tools and technologies will you be using? AWS Python Databricks/Spark Trino Airflow Docker CloudFormation/Terraform SQL/NoSQL We provide you with the opportunity to think freely and work creatively and right now, is a … Other skills we are looking for you to demonstrate include: Experience of data storage technologies: Delta Lake, Iceberg, Hudi Sound knowledge and understanding of Apache Spark, Databricks or Hadoop Ability to take business requirements and translate these into tech specifications Knowledge of architecture best practices and patterns Competence in more »
Join a leader in generative AI technologies, who have recently secured Series A funding to advance our work in digital avatars and human clones. Role: MLOps Engineer Location: London Salary: Up to £100,000 Responsibilities: Develop ML Pipelines: Build and more »
Job Title: Data Engineer Job Type: Full Time, Permanent Working location: London, Hybrid Role Purpose At Travelex we are developing modern data services, which will be at the heart of our relationship with our customers. Our data architecture is becoming more »
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
in Python or PySpark, we encourage you to apply. Python App Developer Requirements: Proficiency with Python or PySpark. Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka. Experience with Big Data solutions or Relational DB. Demonstrated knowledge of software applications and technical processes within a cloud more »
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
early-stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the more »
Data Engineer Experience working with AI frameworks and libraries (PyTorch) Confidence collaborating in complex and cross-functional teams Strong skills with: AWS, Python, Airflow, Snowflake Interested? Apply now or reach out to daisy@wearenumi.com for further details and a chat more »
to-end from scoping, designing, coding, release and continuous monitoring in a production environment •ELT pipeline: Experience with ELT pipelines and orchestration systems such as Airflow •Database systems: Experience working with one or more non-SQL databases such as Druid, Elasticsearch and neo4j •AWS: Experience deploying and managing applications more »
Senior AWS Data Engineer - London (Hybrid) - Permanent - £65,000-£70,000 Media Agency, Snowflake, SQL, Python, dbt, Airflow, Marketing *You must be based in London, and have full permanent right to work in the UK to apply for this role* I'm currently working with a leading media agency … ASAP and I will be in touch. Senior AWS Data Engineer - London (Hybrid) - Permanent - £65,000-£70,000 Media Agency, Snowflake, SQL, Python, dbt, Airflow, Marketing more »
or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack - Python, AWS, Airflow and DBT Must haves: A team player, happy to work with several teams, this is key as you will be reporting directly to the more »
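The ETL pattern this role describes can be sketched in plain Python. This is a minimal illustration only: the `extract`, `transform` and `load` functions and the sample records are hypothetical stand-ins, and a real pipeline on this stack would implement each stage as an Airflow task writing to an AWS warehouse, with dbt handling downstream transformations.

```python
# Minimal illustrative ETL pipeline: extract raw records, transform them,
# and load the results into an in-memory "warehouse" (a plain list here).
# All names and data are hypothetical; a production version would use
# Airflow operators and real source/target connections instead.

def extract():
    # Stand-in for reading from an API, file drop, or source database.
    return [{"amount": "10.50"}, {"amount": "3.25"}]

def transform(rows):
    # Normalise types and add a derived field.
    return [{"amount": float(r["amount"]), "currency": "GBP"} for r in rows]

def load(rows, warehouse):
    # Stand-in for a bulk insert into the target warehouse.
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # number of rows loaded
```

Keeping each stage as a separate function mirrors how an orchestrator like Airflow schedules and retries each step independently.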
Preston, Lancashire, United Kingdom Hybrid / WFH Options
Uniting Ambition
and deep knowledge in core processing and orchestration products such as BigQuery, Dataflow, Data Fusion, Datastream, Cloud Functions, Dataproc and Airflow/Composer. You will have held a leading role in a Data Engineering function with responsibility for directing the efforts of other data more »
there is little work to do here. Experience in data-intensive applications is desirable here. Other technology in the stack includes Node, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. They have a hybrid-working setup that requires the team to be in the office more »
Python Data Engineer (Software Engineer Programmer Developer Data Engineer Python PySpark Spark Glue Athena Iceberg Airflow Dagster DBT Java Agile AWS GCP Buy Side Asset Manager Investment Management Finance Front Office Trading Financial Services Pandas Numpy Scipy Banking) required by our asset management client in London. You MUST have … Developer/Software Engineer/Programmer Excellent Python PySpark Excellent data engineering AWS, GCP or Azure Agile The following is DESIRABLE, not essential: Iceberg Airflow or Dagster Dremio or DBT Java Finance Role: Python Data Engineer (Software Engineer Programmer Developer Data Engineer Python PySpark Spark Glue Athena Iceberg Airflow … will bring Python and PySpark experience to contribute towards this initiative. They will also be looking at the use of tooling such as Iceberg, Airflow, Dagster, Dremio, DBT, Glue and Athena. These are not essential, only 'nice-to-have' technologies. This is also an excellent opportunity to enter into more »
knowledge on key technologies like BigQuery/Redshift/Synapse/Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF etc Strong proven knowledge of Kimball/Dimensional data modelling and/or Data Vault If you are interested in applying for more »