engineering or a related field, with a focus on building scalable data systems and platforms. Expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming …
Intelligence platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python, and Spark (Scala or Python). Experience working with relational SQL databases, either on-premises or in the cloud. Experience delivering multiple solutions using key techniques …
or a related field, with a focus on building scalable data systems and platforms. Strong expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, ETL/ELT pipelines, and …
your cloud/platform engineering capabilities. Experience working with Big Data. Experience of data storage technologies: Delta Lake, Iceberg, Hudi. Knowledge and understanding of Apache Spark, Databricks, or Hadoop. Ability to take business requirements and translate these into tech specifications. Competence in evaluating and selecting development tools and …
projects with leading organizations? 💡 What You’ll Do: ✅ Design and deliver data solutions using Fabric, Azure Data Factory, and Synapse ✅ Work with SQL, Python, Spark, Kafka, Snowflake, and more ✅ Apply best practices in Data Architecture, Governance, and Engineering ✅ Collaborate with clients to drive real business impact ✅ Be part of …
edinburgh, central scotland, United Kingdom Hybrid / WFH Options
Algo Capital Group
tools and alerting frameworks. SQL experience including queries/updates/table creation/basic database maintenance. Exposure to data technologies such as Kafka, Spark, or Delta Lake is useful but not mandatory. Bachelor's degree in Computer Science, Engineering, or related technical field. This role offers competitive compensation …
end ownership • Python or similar (Ruby or Node) or another functional language • JavaScript and associated frameworks, preferably Vue or similar • Cloud technologies • SQL (advantageous) • Spark (advantageous) • Docker/Kubernetes (advantageous) • MongoDB, SQL, Postgres & Snowflake (advantageous) • Developing online, cloud-based SaaS products • Leading and building scalable architectures and distributed systems …
systems such as profiling and debugging, and understanding of system performance and scalability. PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability …
and migrating a host of application suites to a Kubernetes infrastructure environment. Requirements: Kubernetes implementation, Kubernetes-specific architecture, hands-on migration work, Kubernetes security, Spark, S3 Engine, Terraform, Ansible, CI/CD, Hadoop, Linux/RHEL (on-prem background/container management), Grafana or Elasticsearch (for observability). Desirable …