Ability to write clean, scalable, maintainable code in Python with a good understanding of software engineering concepts and patterns. Proficiency in other languages like Scala, Java, C#, or C++ is an advantage. Proven record of building and maintaining data pipelines deployed in at least one of the big 3 cloud ML
Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one programming language (e.g. Python, Scala, Java, R). Experience deploying and maintaining cloud infrastructure (e.g. AWS, GCP, or Azure). Familiarity with data modeling and warehousing concepts, and dimensional modeling
modern data pipelines and ETL (extract, transform, load) processes using tools such as Apache Kafka and Apache NiFi. Proficient in programming languages like Java, Scala, or Python. Experience or expertise using, managing, and/or testing API Gateway tools and REST APIs. Experience in traditional database and data warehouse products
experience with big data technology, systems, and tools such as AWS, Hadoop, Hive, and Snowflake. Expertise with common software engineering languages such as Python, Scala, Java, and SQL, and a proven ability to learn new programming languages. Experience with workflow orchestration tools such as Airflow. Detailed problem-solving approach, coupled with
computing. What We’re Looking For 3+ years of experience in backend or full-stack software engineering. Strong programming skills in Python, Java, or Scala (bonus for multi-language proficiency). Hands-on experience with AWS, GCP, or Azure and distributed systems. Familiarity with big data frameworks like Spark, Kafka
and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle.
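The clean, enrich, and aggregate steps described above can be sketched in plain Python. This is a minimal illustration only: the record fields (user_id, amount, country) are hypothetical, and a real pipeline would express the same stages as PySpark or Scala DataFrame transformations.

```python
# Sketch of a clean -> enrich -> aggregate pipeline in plain Python.
# Field names are hypothetical stand-ins for a real dataset; on Spark the
# same steps would be filter/withColumn/groupBy transformations.
from collections import defaultdict

raw = [
    {"user_id": "u1", "amount": "10.5", "country": "gb"},
    {"user_id": "u2", "amount": None,   "country": "GB"},  # missing amount
    {"user_id": "u1", "amount": "4.5",  "country": "gb"},
]

# Clean: drop records with missing amounts and coerce types.
cleaned = [
    {**r, "amount": float(r["amount"])}
    for r in raw
    if r["amount"] is not None
]

# Enrich: normalise the country code.
enriched = [{**r, "country": r["country"].upper()} for r in cleaned]

# Aggregate: total amount per user.
totals = defaultdict(float)
for r in enriched:
    totals[r["user_id"]] += r["amount"]

print(dict(totals))  # → {'u1': 15.0}
```

The same shape scales to Spark because each stage is a pure transformation over records, with validation (the missing-amount filter) applied before aggregation.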
and principles. Solid understanding of data warehousing, data modeling, and data integration principles. Proficiency in at least one scripting/programming language (e.g. Python, Scala, Java). Experience with SQL and NoSQL databases. Familiarity with data quality and data governance best practices. Strong analytical and problem-solving skills. Excellent communication
AWS or Azure. Experience with the software development lifecycle and use of associated tools. Linux. Highly proficient programming skills in languages such as Java, Python, Scala, Go, and Rust. An understanding of formal software engineering principles including design, documentation, ticketing systems, version control, and Agile methodologies. Must possess the ability to understand
leadership role. Hands-on experience with data warehouse solutions such as Redshift, Snowflake, or BigQuery. Strong command of SQL and programming languages like Python, Scala, or Java. Familiarity with ETL/ELT tools (e.g., Airflow, Fivetran, dbt) and cloud data stacks (AWS/GCP/Azure). A deep understanding
Experience working with cloud platforms such as AWS, GCP, or Azure for real-time data ingestion and storage. Programming skills in Python, Java, Scala, or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache
level proficiency in SQL (e.g., MySQL, PostgreSQL, Redshift) and NoSQL databases (e.g., MongoDB, Cassandra). Proficiency in one or more programming languages (e.g., Python, Scala, Java) for data integration, automation, and problem-solving. In-depth experience with cloud data platforms (AWS, Azure, Google Cloud) and Microsoft services (e.g., Azure Storage
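The combination of SQL proficiency and a programming language described above can be illustrated with a self-contained sketch using Python's built-in sqlite3 module. The orders table and its columns are hypothetical, with sqlite3 standing in for MySQL/PostgreSQL/Redshift.

```python
# Hypothetical orders table illustrating SQL aggregation driven from Python;
# sqlite3 is an in-memory stand-in for a production relational database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("alice", 20.0), ("bob", 5.0), ("alice", 15.0)],
)

# Aggregate spend per customer, filtering groups with HAVING.
rows = conn.execute(
    "SELECT customer, SUM(total) FROM orders "
    "GROUP BY customer HAVING SUM(total) > 10 ORDER BY customer"
).fetchall()
print(rows)  # → [('alice', 35.0)]
conn.close()
```

Parameterised inserts (the `?` placeholders) rather than string formatting are the idiomatic way to pass values into SQL from application code.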
For: Bachelor's/Master's Degree in Computer Science, Information Technology, or equivalent. Proficiency in at least one mainstream programming language (Python, Java, Scala, Clojure, etc.). In-depth understanding of data structures and algorithmic designs. Knowledge of computer networking concepts (TCP/IP, HTTPS, DNS, etc.). Understanding
platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python, and Spark (Scala or Python). Experience working with relational SQL databases either on premises or in the cloud. Experience delivering multiple solutions using key techniques such as
profiling, data modeling, and data engineering using relational databases like Snowflake, Oracle, and SQL Server; ETL tools like Informatica IICS; scripting using Python, R, or Scala; workflow management tools like Autosys. Experience with stream processing systems like Kafka, Spark Streaming, etc. Experience in Java, JMS, SOAP, REST, JSON, and XML technologies, along
minimum 2 years in Ops or DevOps experience in trading environments, with expertise in large-scale distributed systems; recent work in Java, Go, or Scala is preferred. Hands-on experience with container orchestration (Kubernetes, Docker, etc.) and cloud infrastructure, especially AWS; familiarity with Infrastructure-as-Code tools like Terraform or
data flows, architecture, and operational procedures. Required Skills and Experience 3+ years of experience in a Data Engineering role. Strong programming skills in Python, Scala, or Java. Solid experience with ETL tools (e.g., Apache Airflow, dbt, Talend). Proficiency with SQL and relational/non-relational databases (e.g., PostgreSQL
Requirements Master's or PhD in Computer Science or related field 7+ years of data engineering experience Proficiency in multiple programming languages (Python, Rust, Scala, or Go) Strong SQL and NoSQL database experience Data modeling, ETL, and warehousing expertise Cloud platform experience (AWS/GCP) and infrastructure-as-code Excellent
data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL and at least one programming language such as Python, Scala, or Java. Demonstrated experience owning and delivering complex systems from architecture through implementation. Excellent communication skills with the ability to explain technical concepts to
big data file formats (Parquet, Avro). Skilled in data pipeline and workflow management tools (Apache Airflow, NiFi). Strong background in programming (Python, Scala, Java) for data pipeline and algorithm development. Skilled in data visualization (Tableau, Power BI) and BI reporting. Experience with cloud platforms (AWS) and services. Understanding
platform with either Java/J2EE or .NET tech stack and database technologies such as Oracle, MySQL, etc. Exposure to polyglot programming languages like Scala, Python, and Golang will be a plus. Ability to read/write code and expertise with various design patterns. Have used NoSQL databases such as
one of PostgreSQL, MSSQL, Google BigQuery, SparkSQL). Modelling & Statistical Analysis experience, ideally customer related. Coding skills in at least one of Python, R, Scala, C, Java or JS. Track record of using data manipulation and machine learning libraries in one or more programming languages. Keen interest in some of