Technologies: Scala, Java, Python, Spark, Linux, shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience with process scheduling platforms like Apache Airflow Willingness to work with proprietary technologies like Slang/SECDB Understanding of compute resources and performance metrics Knowledge of distributed computing frameworks like More ❯
Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience working with process scheduling platforms like Apache Airflow. Open to working in GS proprietary technology like Slang/SECDB. An understanding of compute resources and the ability to interpret performance metrics More ❯
Leicester, Leicestershire, United Kingdom Hybrid / WFH Options
Akkodis
take the platform to the very next level. You'll be able to hit the ground running in Databricks and ideally have knowledge of Apache Spark. Naturally you'll have sound SQL exposure and AWS from a Cloud perspective. PySpark would be hugely advantageous for this position too. The More ❯
Coventry, Warwickshire, United Kingdom Hybrid / WFH Options
Cadent Gas
data pipelines and models in SAP Datasphere or SAP BW/4HANA Advanced skills in SQL, data modelling, and data transformation Familiarity with Databricks, Apache Spark, PySpark, and Delta Lake Agile mindset with experience in DevOps and iterative delivery Excellent communication and stakeholder engagement abilities At Cadent, we're More ❯
Ansible. Application Development and Deployment: Develop and deploy scalable and secure applications using NodeJS, ReactJS, and RESTful APIs. Ensure seamless integration with Elasticsearch and Apache NiFi. CI/CD Pipelines: Design and implement CI/CD pipelines using Jenkins, Kubernetes, and OpenShift to automate testing, building, and deployment of More ❯
Nottingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures … and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have experience with streaming platforms like Apache Kafka and be able to develop and maintain ELT pipelines, and essentially bring a solid understanding of data warehousing concepts and best practice. Naturally you More ❯
Skills: Experience working within the public sector. Knowledge of cloud platforms (e.g., IBM Cloud, AWS, Azure). Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop). Understanding of data warehousing concepts and experience with tools like IBM Cognos or Tableau. Certifications: While not required, the following certifications … beneficial: ABOUT BUSINESS UNIT IBM Consulting is IBM More ❯