the project. There is a massive emphasis on PySpark and Databricks for this particular role. Technical Skills Required: Azure (ADF, Functions, Blob Storage, Data Lake Storage, Azure Databricks), Databricks, Spark, Delta Lake, SQL, Python, PySpark, ADLS. Day-to-Day Responsibilities: Extensive experience in designing, developing, and …
Airbyte for data ingestion, Prefect for pipeline orchestration, AWS Glue for managed ETL, along with Pandas and PySpark for pipeline logic implementation. We utilize Delta Lake and PostgreSQL for data storage, emphasizing the importance of data integrity and version control in our workflows. Day-to-day, you will …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
data platform. Design, develop, test, and deploy data pipelines and integrations using iPaaS technology. Collaborate on engineering platform ingestion, orchestration, data warehouse/data lake and API strategies for the data management ecosystem. Willingness and an enthusiastic attitude to work within existing processes/methodologies. Collaborate with the DevOps … data processing - Expertise in processing enterprise-scale volumes of data, ideally with proficiency in Snowflake. Data Stores - Technical excellence in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration - Enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
Proficiency in designing and implementing data models, including star schemas and snowflake schemas. Expertise in SQL. • Data Storage Technologies - Experience in SQL, NoSQL, Blob, Delta Lake, and/or other enterprise-scale data stores. • Data Processing Frameworks - Proficiency in designing and building data pipelines for data processing and …
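The star-schema modelling and SQL expertise named in this listing can be sketched with a minimal, self-contained example. The table and column names here are illustrative assumptions, not from the listing, and SQLite stands in for an enterprise data store:

```python
import sqlite3

# Build a tiny star schema in memory: one fact table referencing two
# dimension tables (names are hypothetical).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_sales (
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        date_id    INTEGER REFERENCES dim_date(date_id),
        amount     REAL
    );
    INSERT INTO dim_customer VALUES (1, 'North'), (2, 'South');
    INSERT INTO dim_date VALUES (10, 2023), (11, 2024);
    INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 11, 75.0);
""")

# The characteristic star-schema query: join the fact table to its
# dimensions and aggregate measures by dimension attributes.
rows = cur.execute("""
    SELECT c.region, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    JOIN dim_date d     ON d.date_id     = f.date_id
    GROUP BY c.region, d.year
    ORDER BY c.region, d.year
""").fetchall()
print(rows)
```

The same join-and-aggregate shape carries over directly to Delta Lake or Snowflake tables; only the connection layer changes.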
Bournemouth, Dorset, South West, United Kingdom Hybrid / WFH Options
LV= General Insurance
ETL processes to read, transform, and verify data using tools such as SQL and PySpark. Cloud technologies and architecture including Databricks, Kubernetes, Data/Delta Lake, Azure Machine Learning, Data Factory. Azure is preferable, but AWS and GCP experience is welcome. Familiar with all parts of the Machine …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
LV= General Insurance
ETL processes to read, transform, and verify data using tools such as SQL and PySpark. Cloud technologies and architecture including Databricks, Kubernetes, Data/Delta Lake, Azure Machine Learning, Data Factory. Azure is preferable, but AWS and GCP experience is welcome. Familiar with all parts of the Machine …
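The read/transform/verify pattern these listings describe can be sketched in a few lines. This is a hedged illustration using the stdlib `csv` module rather than PySpark so it runs anywhere; the field names and the verification rule are illustrative assumptions:

```python
import csv
import io

# Hypothetical raw extract (in practice this would come from a file,
# database, or lake table).
RAW = "policy_id,premium\nP1,120.50\nP2,99.00\nP3,-5.00\n"

def read_rows(text):
    """Read: parse the raw extract into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast the premium column from string to float."""
    return [{"policy_id": r["policy_id"], "premium": float(r["premium"])}
            for r in rows]

def verify(rows):
    """Verify: return rows that violate a simple quality rule
    (premiums must be non-negative)."""
    return [r for r in rows if r["premium"] < 0]

rows = transform(read_rows(RAW))
invalid = verify(rows)
print(len(rows), [r["policy_id"] for r in invalid])
```

In a PySpark pipeline the same three stages map onto `spark.read`, DataFrame transformations, and a data-quality check before the write.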
and constructing robust data pipelines using the best of the open-source data engineering and scientific Python toolset. Tech Stack: Airbyte, AWS Glue, Pandas, PySpark, Delta Lake, PostgreSQL. The team follows agile ways of working and you will engage with various stakeholders across the business. The role is hybrid …
Birmingham, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
team and wider organisation using the tech you think is required! Skills desired/what you will learn: Microsoft Azure, Azure SQL, Microsoft Fabric, Delta Lake, Databricks and Spark, Statistical Modelling, Azure ML Studio, Python and familiarity with libraries and frameworks for data analysis and machine learning (e.g. …
SE1, Tower of London, Greater London, United Kingdom Hybrid / WFH Options
Avanti Recruitment
working anywhere in the world for up to 30 days! Requirements: Azure Databricks, Data Strategy, Data Transformation, Data Cloud Migration, Data API Design, Delta Lake/Lakehouse. Strong communication skills, including the ability to translate technical concepts for a non-technical audience. Previous Finance/Fintech/Forex …