and data stores to support organizational requirements. Skills/Qualifications: 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy and PySpark. 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. 3+ years of proficiency in working with Snowflake or similar cloud data warehouses.
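As a rough illustration of the pipeline-development skills mentioned above (Python with Pandas, NumPy and PySpark), the following is a minimal sketch only; the Spark session setup is standard, but every path, column and conversion rate is a hypothetical placeholder, not taken from any of these postings.

```python
# Minimal sketch of a PySpark batch pipeline; all paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingest raw CSV data from a landing area (schema inference used for brevity).
orders = spark.read.csv("s3://raw-zone/orders/", header=True, inferSchema=True)

# Basic cleansing and transformation: drop incomplete rows, derive new columns.
cleaned = (
    orders
    .dropna(subset=["order_id", "amount"])
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount_gbp", F.col("amount") * F.lit(0.79))  # illustrative conversion only
)

# Aggregate into a warehouse-friendly daily summary.
daily = cleaned.groupBy("order_date").agg(
    F.count("order_id").alias("orders"),
    F.sum("amount_gbp").alias("revenue_gbp"),
)

# Write partitioned Parquet for downstream warehousing and modelling.
daily.write.mode("overwrite").partitionBy("order_date").parquet("s3://curated-zone/daily_orders/")
```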
across varied solutions. - Extensive experience using the Databricks platform for developing and deploying data solutions/data products (including ingestion, transformation and modelling), with high proficiency in Python, PySpark and SQL. - Leadership experience in other facets necessary for solution development, such as testing, the wider scope of quality assurance, CI/CD etc. - Experience in related areas of …
the ability to write ad-hoc and complex queries to perform data analysis. Experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy and PySpark. You will be able to develop solutions in a hybrid data environment (on-prem and cloud). Hands-on experience with developing data pipelines for structured, semi-structured and unstructured data.
tooling, and engineering standards. We'd love to talk to you if you have: Proven experience delivering scalable, cloud-based data platforms using tools such as Databricks, Spark or PySpark, and services from AWS, Azure, or GCP. Experience in line management and people development. You've supported engineers with regular 1:1s, development planning, and performance conversations, and are …
and managing project changes and interventions to achieve project outputs. Documenting all aspects of the project for future reference and audits. Technical Responsibilities: Developing SQL scripts (stored procedures) and PySpark notebooks. Creating and managing ingestion, ETL & ELT processes. Designing and configuring Synapse pipelines. Data modelling in various storage systems. Analysing existing data designs and suggesting improvements for performance, stability … Experience in Project Management within the Defence & Security sector. Strong technical skills in APIs, Java, Python, Web Development, SQL, and Azure. Proficiency in developing and managing SQL scripts and PySpark notebooks. Understanding of ETL & ELT processes and Synapse pipeline design and configuration. Experience in data modelling and improving existing data designs. Knowledge of real-time data processing. Capable of …
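For context on the notebook-based ETL/ELT responsibilities described above, the sketch below shows the general shape of such work, assuming a Spark pool (for example Synapse or Databricks) with Delta Lake available; every path, table and column name is a hypothetical placeholder.

```python
# Hedged sketch of a notebook-style ingestion step; assumes Delta Lake support on the Spark pool.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest_customers").getOrCreate()

# Ingest semi-structured JSON landed by an upstream pipeline (hypothetical storage path).
raw = spark.read.json("abfss://landing@datalake.dfs.core.windows.net/customers/")

# Flatten nested fields and standardise types for downstream modelling.
customers = (
    raw
    .select(
        F.col("id").cast("long").alias("customer_id"),
        F.col("profile.name").alias("customer_name"),
        F.to_timestamp("updated_at").alias("updated_at"),
    )
    .dropDuplicates(["customer_id"])
)

# Persist as a Delta table that downstream SQL scripts and stored procedures can query.
(customers.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("curated.customers"))
```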