Ability to write clean, scalable, maintainable code in Python with a good understanding of software engineering concepts and patterns. Proficiency in other languages such as Scala, Java, C#, or C++ is an advantage. Proven record of building and maintaining data pipelines deployed in at least one of the big 3 cloud ML
Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one programming language (e.g. Python, Scala, Java, R). Experience deploying and maintaining cloud infrastructure (e.g. AWS, GCP, or Azure). Familiarity with data modeling and warehousing concepts, and dimensional modeling
experience with big data technologies, systems, and tools such as AWS, Hadoop, Hive, and Snowflake Expertise with common Software Engineering languages such as Python, Scala, Java, SQL and a proven ability to learn new programming languages Experience with workflow orchestration tools such as Airflow Detailed problem-solving approach, coupled with
and drive A bias to action and a passion for delivering high-quality data solutions Expertise with common Data Engineering languages such as Python, Scala, Java, SQL and a proven ability to learn new programming languages Experience with workflow orchestration tools such as Airflow Deep understanding of end-to-end
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
not limited to banking, insurance, healthcare, media, retail, infrastructure and telco. The ideal candidate will have expertise in some of the following: Python, SQL, Scala, and Java for data engineering. Strong experience with big data tools (Apache Spark, Hadoop, Databricks, Dask) and cloud platforms (AWS, Azure, GCP). Proficient in
leadership role. Hands-on experience with data warehouse solutions such as Redshift, Snowflake, or BigQuery. Strong command of SQL and programming languages like Python, Scala, or Java. Familiarity with ETL/ELT tools (e.g., Airflow, Fivetran, dbt) and cloud data stacks (AWS/GCP/Azure). A deep understanding
Experience working with cloud platforms such as AWS, GCP, or Azure for real-time data ingestion and storage. Programming skills in Python, Java, Scala, or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache
level proficiency in SQL (e.g., MySQL, PostgreSQL, Redshift) and NoSQL databases (e.g., MongoDB, Cassandra). Proficiency in one or more programming languages (e.g., Python, Scala, Java) for data integration, automation, and problem-solving. In-depth experience with cloud data platforms (AWS, Azure, Google Cloud) and Microsoft services (e.g., Azure Storage
For: Bachelor's/Master's Degree in Computer Science, Information Technology, or equivalent. Proficiency in at least one mainstream programming language (Python, Java, Scala, Clojure, etc.). In-depth understanding of data structures and algorithm design. Knowledge of computer networking concepts (TCP/IP, HTTPS, DNS, etc.). Understanding
need to see from you Qualifications Proven experience as a Data Engineer with a strong background in data pipelines. Proficiency in Python, Java, or Scala, and big data technologies (e.g., Hadoop, Spark, Kafka). Experience with Databricks, Azure AI Services, and cloud platforms (AWS, Google Cloud, Azure). Solid understanding
platform with either Java/J2EE or .NET tech stack and database technologies such as Oracle, MySQL, etc. Exposure to polyglot programming languages like Scala, Python and Golang will be a plus. Ability to read/write code and expertise with various design patterns. Have used NoSQL databases such as
least one of PostgreSQL, MSSQL, Google BigQuery, SparkSQL) Modelling & Statistical Analysis experience, ideally customer related Coding skills in at least one of Python, R, Scala, C, Java or JS Track record of using data manipulation and machine learning libraries in one or more programming languages. Keen interest in some of
platforms with either Java/J2EE or .NET tech stack and database technologies such as Oracle, MySQL, etc. Exposure to polyglot programming languages like Scala, Python and Golang will be a plus. Ability to read/write code and expertise with various design patterns. Experience with NoSQL databases such as
asynchronous architecture Familiar with AWS, Unix/Linux, Git, SQL, and REST Bonus Points for Experience or interest in: Functional programming languages such as Scala, Haskell and Clojure Relational and NoSQL databases such as PostgreSQL and MongoDB DevOps tooling such as Terraform, Fargate and Kubernetes Frontend development such as Node.js and
practices (version control, data quality/testing, monitoring, etc.) Expert-level SQL. Proficiency with one or more general purpose programming languages (e.g. Python, Java, Scala, etc.) Knowledge of AWS products such as Redshift, Quicksight, and Lambda. Excellent verbal/written communication & data presentation skills, including ability to succinctly summarize key
Redshift, EMR, Kinesis and RDS. - Experience with Open Source Data Technologies (Hadoop, Hive, HBase, Pig, Spark, etc.) - Ability to write code in Python, Ruby, Scala, or another big data platform language - Knowledge of professional software engineering practices & best practices for the full software development lifecycle, including coding standards, code
deeply technical contributors Preferred Qualifications: Deep experience using at least one interpreted and one compiled common industry programming language: e.g., Python, C/C++, Scala, Java, including toolchains for documentation, testing, and operations/observability Hands-on experience with application performance tuning and optimization, including in parallel and distributed computing
real-time data ingestion and storage. Ability to optimise and refactor existing data pipelines for greater performance and reliability. Programming skills in Python, Java, Scala, or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache
Streaming, Flink). Experience with GCP (preferred), AWS, or Azure for real-time data ingestion and storage. Strong programming skills in Python, Java, or Scala. Proficiency in SQL, NoSQL, and time-series databases. Knowledge of orchestration tools (Apache Airflow, Kubernetes). If you are a passionate and experienced