Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Adecco
… approaches
- Experience with data ingestion and ETL pipelines
- Curious, adaptable, and a natural problem solver

Bonus points for:
- Experience in financial services, insurance, or reinsurance
- Familiarity with Databricks, Git, PySpark, or SQL
- Exposure to cyber risk or large-scale modelling environments

Ready to apply for this exciting Data Scientist role? Send your CV to - I'd love to hear from you.
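The role above asks for hands-on data ingestion and ETL experience with PySpark; below is a minimal sketch of what an ingestion step might look like, assuming a hypothetical insurance-style source file and schema (the path and column names are illustrative, not from the listing).

```python
# Minimal PySpark ingestion sketch; source path and schema are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("ingest").getOrCreate()

# Enforcing a schema at the ingestion boundary catches malformed rows early.
schema = StructType([
    StructField("policy_id", StringType(), nullable=False),
    StructField("exposure", DoubleType(), nullable=True),
])

claims = spark.read.schema(schema).option("header", True).csv("/data/raw/claims/")
claims.createOrReplaceTempView("claims")  # downstream SQL can query this view
```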
Data Engineer | Bristol/Hybrid | £65,000 - £80,000 | AWS | Snowflake | Glue | Redshift | Athena | S3 | Lambda | PySpark | Python | SQL | Kafka | Amazon Web Services

Do you want to work on projects that actually help people? Or maybe you want to work on a modern AWS stack. I am currently supporting a brilliant company in Bristol who build software which genuinely … pipelines using AWS services, implementing data validation, quality checks, and lineage tracking across pipelines, and automating data workflows and integrating data from various sources.

Tech you will use and learn: Python, PySpark, AWS, Lambda, S3, DynamoDB, CI/CD, Kafka, and more.

This is a hybrid role in Bristol, and you also get a bonus and generous holiday entitlement, to name a couple … Would you be interested in finding out more? If so, apply to the role or send your CV to . Sponsorship isn't available.
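The responsibilities above centre on data validation and quality checks in PySpark pipelines on AWS; here is a minimal sketch of one such check, assuming a hypothetical S3 dataset (the bucket, columns, and failure rule are illustrative).

```python
# Minimal PySpark data-quality sketch; bucket and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("quality-check").getOrCreate()

df = spark.read.parquet("s3://example-bucket/orders/")

# Count nulls in required columns; fail the batch if any are found.
required = ["order_id", "amount"]
nulls = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in required]
).first()

if any(nulls[c] > 0 for c in required):
    raise ValueError(f"Nulls in required columns: {nulls.asDict()}")

df.write.mode("append").parquet("s3://example-bucket/validated/orders/")
```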
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
…term data strategy with a strong focus on data integrity and GDPR compliance.

To be successful in the role you will have:
- Hands-on coding experience with Python or PySpark
- Proven expertise in building data pipelines using Azure Data Factory or Fabric Pipelines
- Solid experience with Azure technologies like Lakehouse Architecture, Data Lake, Delta Lake, and Azure Synapse
- Strong …
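The listing above pairs Azure Data Factory or Fabric orchestration with Lakehouse storage; below is a minimal PySpark sketch of the Delta Lake write such a pipeline might trigger, assuming the delta-spark package is available (the lake paths are illustrative).

```python
# Minimal Delta Lake write sketch; assumes delta-spark is installed and uses
# hypothetical lake paths.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("lakehouse-load")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Land raw CSV into a Delta table so downstream layers get ACID guarantees.
raw = spark.read.option("header", True).csv("/lake/raw/customers.csv")
raw.write.format("delta").mode("overwrite").save("/lake/silver/customers")
```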
Work with the team to support ETL processes.

What you'll need to succeed:
- Seasoned knowledge of the Azure Databricks platform and associated functionalities
- Strong Python programming knowledge, ideally PySpark
- A logical and analytical approach to problem-solving
- Awareness of the modern data stack and associated methodologies

What you'll get in return:
A rewarding contract providing exposure to …
Employment Type: Contract
Rate: £500 - £650 per day
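For a sense of the Databricks ETL work this contract describes, here is a minimal PySpark sketch under assumed conditions: the bronze/silver table names and columns are hypothetical, not from the listing.

```python
# Minimal PySpark ETL sketch; table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl").getOrCreate()

orders = spark.table("bronze.orders")  # assumes a registered source table

# Standardise timestamps, then aggregate to a daily revenue summary.
daily = (
    orders.withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

daily.write.mode("overwrite").saveAsTable("silver.daily_revenue")
```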
…of ETL Processing/Data Warehousing testing, including Databricks and Data Factory.
- Hands-on experience with SQL or Azure SQL.
- Experience using automated testing on Python frameworks (pytest/PySpark).
- Experience with SpecFlow and other frameworks.

If you are interested in this Data Tester role, please apply with your most recent CV, or alternatively reach out to me: jordan . …
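Since the role above calls for automated testing of Python frameworks with pytest and PySpark, here is a minimal sketch of such a test; the transformation under test and its columns are hypothetical.

```python
# Minimal pytest + PySpark sketch; the function under test is hypothetical.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def add_vat(df, rate=0.2):
    """Hypothetical transformation: add a VAT-inclusive price column."""
    return df.withColumn("gross", F.col("net") * (1 + rate))


@pytest.fixture(scope="session")
def spark():
    # Local single-threaded session keeps the test suite self-contained.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_add_vat_adds_twenty_percent(spark):
    df = spark.createDataFrame([(100.0,), (50.0,)], ["net"])
    result = add_vat(df).collect()
    assert [round(r["gross"], 2) for r in result] == [120.0, 60.0]
```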