to the ideas and delivery of the strategy; Support data queries in SQL (T-SQL/ANSI SQL) and support data pipelines using PySpark/Python, Databricks and AWS (Athena, Glue, S3); Analyse data needs and coordinate new data requests and data change requests. Work with clients to … Pivot Charts). Experience of supporting Data Warehousing. Basic SQL experience and understanding of XML/JSON files. Basic knowledge/experience of Python, PySpark, R, Scala etc. Experience using SQL, Power BI, Tableau or similar tools. Preferred: Knowledge and experience of Financial Systems Support (Access Dimensions) or ServiceNow Support … and Administration. Strong knowledge of SQL, Power BI, Tableau etc. Strong knowledge of Python, PySpark, R, Scala etc. Experience of supporting IT Applications and/or Platforms. Experience of cloud data solutions (AWS, Google, Microsoft Azure), AWS preferred. Degree in Business Analytics or Technology, Computer Science, Maths or Statistics …
I have placed quite a few candidates with this organisation now; all have given glowing reviews about it. They are the leader in legal representation comparison and have rather interesting, unique data to work with. They are using a modern Azure …
West Midlands, Dudley, West Midlands (County), United Kingdom Hybrid / WFH Options
Concept Resourcing
standardise the Data Warehouse environment and solutions. Required Skills, Knowledge, and Experience: Experience designing, developing, and testing Azure Data Factory/Fabric pipelines, including PySpark Notebooks and Dataflow Gen2 workflows. Previous experience with Informatica PowerCenter is desirable. Significant experience in designing, writing, editing, debugging, and testing SQL code …
following technologies: Azure Synapse, Data Factory, Databricks, SQL DB, Data Lake, Key Vault; Azure DevOps and CI/CD pipelines; coding in SQL and PySpark/Python; DW/Data Vault concepts; Power BI. Experience with core Finance reporting (Projects, GL, AP, AR etc.) - highly desirable. Preferred experience: Knowledge …
applied machine learning, probability, statistics, and quantitative risk modelling. High proficiency in Python and SQL. Experience with big data technologies and tools, particularly Databricks and PySpark, is highly desirable. Experience in agile software development processes is a plus. Experience in insurance, cyber, or a related domain is ideal. Understanding of …
An industry-leading organisation is looking for an experienced Senior Data Engineer who is well-versed in Databricks, PySpark and SQL to join their growing Data Engineering team. This role can be largely remote, with some travel to their Central London Head Office, likely around twice per month. … a wide variety of technical work. Your primary focus will be working with Databricks: you will be building data pipelines in Databricks, coding in PySpark, and supporting internal applications as they move away from a legacy application into new bespoke applications which rely on Databricks. This is a … to provide thought leadership, and to innovate and experiment with new technologies to deliver benefits to the business. Requirements: Excellent skills in Databricks and PySpark. Excellent experience with SQL and T-SQL programming. Experience designing, developing and maintaining data warehouses and data lakes. Knowledge of Azure technologies including Data …
for data engineering, e.g. Azure Functions * Core skills in coding with SQL, Python and Spark * Proven experience using Databricks, e.g. Lakehouse, Delta Live Tables, PySpark etc. …
Senior Data Engineer. Remote working. Salary £65,000 - £70,000 plus benefits. Databricks, PySpark, SQL. We are looking for a talented Senior Data Engineer to join one of the UK's leading research and law-ranking companies at an exciting time of growth. Build new products, engineer new solutions …
Employment Type: Permanent
Salary: £60,000 - £70,000 per annum plus remote working and benefits
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
Instances, Azure DB, Data Lakes, Azure Synapse and Power BI, Azure Data Factory). Proficiency in at least one scripting language (e.g. SQL or PySpark/Python). Proficiency in designing and building APIs and API consumption. Familiarity with data visualisation tools such as Power BI. Proficiency in ETL/… Model, Data Platforms, Data Models, Data Architecture, Data Integration, DIAS, Microsoft Azure, Azure DB, Data Lakes, Azure Synapse, Power BI, Azure Data Factory, SQL, PySpark, Python, APIs, ETL, ELT, CI/CD, Data Pipelines, Data-as-a-Service. Please note that due to a high level of applications, we …