Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one programming language (e.g. Python, Scala, Java, R). Experience deploying and maintaining cloud infrastructure (e.g. AWS, GCP, or Azure). Familiarity with data modeling and warehousing concepts, and dimensional modeling …
and drive. A bias to action and a passion for delivering high-quality data solutions. Expertise with common Data Engineering languages such as Python, Scala, Java, and SQL, and a proven ability to learn new programming languages. Experience with workflow orchestration tools such as Airflow. Deep understanding of end-to-end …
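Workflow orchestration with Airflow comes up in several of these listings. Purely as a hedged illustration (the DAG name, task names, and schedule below are hypothetical, and the API shown assumes Airflow 2.x), a minimal DAG wiring two Python tasks might look like this:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step; a real task would pull from a source system
    print("extracting")


def transform():
    # Placeholder transform step; a real task would clean and aggregate the data
    print("transforming")


with DAG(
    dag_id="example_daily_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",        # newer Airflow releases prefer schedule=
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task     # run extract before transform
```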
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
not limited to banking, insurance, healthcare, media, retail, infrastructure and telco. The ideal candidate will have expertise in some of the following: Python, SQL, Scala, and Java for data engineering. Strong experience with big data tools (Apache Spark, Hadoop, Databricks, Dask) and cloud platforms (AWS, Azure, GCP). Proficient in …
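Dask appears alongside Spark in the big data tooling above. A minimal sketch of lazy, partitioned processing with Dask, assuming a hypothetical bucket path and that the s3fs dependency is installed:

```python
import dask.dataframe as dd

# Lazily read a directory of CSVs as one partitioned dataframe (path is a placeholder)
events = dd.read_csv("s3://example-bucket/events/*.csv")

# Build the computation graph; nothing runs until .compute() is called
daily_totals = (
    events[events["status"] == "completed"]
    .groupby("event_date")["amount"]
    .sum()
)

# Trigger execution across partitions and bring the result back as pandas
print(daily_totals.compute().head())
```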
Experience working with cloud platforms such as AWS, GCP, or Azure for real-time data ingestion and storage. Programming skills in Python, Java, Scala, or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache …
need to see from you. Qualifications: Proven experience as a Data Engineer with a strong background in data pipelines. Proficiency in Python, Java, or Scala, and big data technologies (e.g., Hadoop, Spark, Kafka). Experience with Databricks, Azure AI Services, and cloud platforms (AWS, Google Cloud, Azure). Solid understanding …
profiling, data modeling and data engineering using relational databases like Snowflake, Oracle, SQL Server; ETL tools like Informatica IICS; scripting using Python, R, or Scala; workflow management tools like Autosys. Experience with stream processing systems like Kafka, Spark Streaming, etc. Experience in Java, JMS, SOAP, REST, JSON, and XML technologies, along …
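For the stream processing requirement, a minimal PySpark Structured Streaming sketch that reads from Kafka. The broker address and topic name are placeholders, and it assumes the spark-sql-kafka connector package is available on the cluster:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Subscribe to a Kafka topic as a streaming dataframe (broker/topic are hypothetical)
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "trades")
    .load()
)

# Kafka delivers values as bytes; cast to string before any downstream parsing
parsed = raw.select(F.col("value").cast("string").alias("payload"))

# Write to the console sink for illustration only; real jobs target a durable sink
query = (
    parsed.writeStream
    .format("console")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```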
one of PostgreSQL, MSSQL, Google BigQuery, SparkSQL). Modelling & Statistical Analysis experience, ideally customer related. Coding skills in at least one of Python, R, Scala, C, Java or JS. Track record of using data manipulation and machine learning libraries in one or more programming languages. Keen interest in some of …
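As a hedged illustration of the data manipulation and machine learning libraries point, a small pandas and scikit-learn sketch for a customer-related model; the file name, columns, and label are all hypothetical:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical customer dataset with a binary churn label
df = pd.read_csv("customers.csv")
features = df[["tenure_months", "monthly_spend", "support_tickets"]]
target = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42
)

# Fit a simple baseline classifier and report a ranking metric
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```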
least one of PostgreSQL, MSSQL, Google BigQuery, SparkSQL). Modelling & Statistical Analysis experience, ideally customer related. Coding skills in at least one of Python, R, Scala, C, Java or JS. Track record of using data manipulation and machine learning libraries in one or more programming languages. Keen interest in some of …
asynchronous architecture. Familiar with AWS, Unix/Linux, Git, SQL, and REST. Bonus points for experience or interest in: functional programming languages such as Scala, Haskell, and Clojure; relational and NoSQL databases such as PostgreSQL and MongoDB; DevOps tooling such as Terraform, Fargate, and Kubernetes; frontend development such as Node.js and …
Arlington, Virginia, United States Hybrid / WFH Options
Full Visibility LLC
unstructured data. Hands-on experience with ETL tools and data workflow orchestration (e.g., Apache Airflow, Luigi, Prefect). Strong programming skills in Python, SQL, or Scala. Experience with open-source data processing tools (e.g., Kafka, Spark, Flink, Hadoop). Familiarity with database technologies (PostgreSQL, MySQL, or NoSQL solutions). Ability to work in …
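A minimal extract-transform-load sketch in the spirit of the tools listed above, using pandas and SQLAlchemy; the file name, columns, target table, and connection string are placeholders rather than anything from the original posting:

```python
import pandas as pd
from sqlalchemy import create_engine

# Extract: read raw events from a file (path and schema are hypothetical)
events = pd.read_csv("raw_events.csv", parse_dates=["event_time"])

# Transform: de-duplicate and derive a date column for partitioning downstream
events = events.drop_duplicates(subset=["event_id"])
events["event_date"] = events["event_time"].dt.date

# Load: append into a PostgreSQL table (connection string is a placeholder)
engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/analytics")
events.to_sql("events_clean", engine, if_exists="append", index=False)
```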
processes. Strong SQL skills, with the ability to write optimized and scalable queries, or proficiency in at least one programming language (Python, Java, Scala, or .NET). CI/CD: Experience using CI/CD pipelines for development and deployment of data pipelines. Proficiency in Git-based workflows and …
real-time data ingestion and storage. Ability to optimise and refactor existing data pipelines for greater performance and reliability. Programming skills in Python, Java, Scala, or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache …
in Computer Science, Engineering, or related field. 1-3 years of professional software development experience. Experience with cloud-based systems, programming languages like Java, Scala, Go, C++, frontend technologies like React, SQL/NoSQL, Docker, Kubernetes, CI/CD tools, and Agile methodologies. Preferred Qualifications: Strong problem-solving, creativity, and …
Hanover, Maryland, United States Hybrid / WFH Options
Base-2 Solutions, LLC
from system level to individual software components. Experience with some or all of the following: programming languages such as Java, C#, C++, JavaScript, Python, Scala, Ruby, and Go; IDEs such as Eclipse, NetBeans, IntelliJ IDEA, PyCharm, Visual Studio, and VS Code; operating systems such as Windows, Unix, Linux, and macOS …
Staines, Middlesex, United Kingdom Hybrid / WFH Options
Industrial and Financial Systems
data ingestion tools like Airbyte, Fivetran, etc. for diverse data sources. Expert in large-scale data processing with Spark or Dask. Strong in Python, Scala, C# or Java, cloud SDKs and APIs. AI/ML expertise for pipeline efficiency, familiar with TensorFlow, PyTorch, AutoML, Python/R, and MLOps (MLflow …
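For the MLOps and MLflow mention, a short experiment-tracking sketch; the experiment name and model are illustrative, and it assumes a default local MLflow tracking setup and the MLflow 2.x API:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic data so the sketch is self-contained
X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("pipeline-efficiency-demo")  # experiment name is hypothetical

with mlflow.start_run():
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))

    # Record the run's parameters, metric, and trained model artifact
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("mse", mse)
    mlflow.sklearn.log_model(model, "model")
```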
leveraging AWS services and implement effective metrics and monitoring processes. Your Skills and Experience: Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR), Java, Scala, Python, and Spark. Experience of developing enterprise-grade ETL/ELT data pipelines and demonstrable knowledge of applying Data Engineering best practices (coding practices to DS …
City of London, England, United Kingdom Hybrid / WFH Options
CipherTek Recruitment
Expertise in Azure DevOps and GitHub Actions. Familiarity with Databricks CLI and Databricks Job Bundle. Strong programming skills in Python and SQL; familiarity with Scala is a plus. Solid understanding of AI/ML algorithms, model training, evaluation (including hyperparameter tuning), deployment, monitoring, and governance. Experience in handling large datasets …
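As an illustration of the hyperparameter tuning point, a short scikit-learn grid search sketch over a bundled dataset; the parameter grid values are arbitrary rather than recommendations:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

# Arbitrary grid purely for demonstration
param_grid = {
    "n_estimators": [100, 200],
    "learning_rate": [0.05, 0.1],
    "max_depth": [2, 3],
}

# Cross-validated search over the grid, scored by ROC AUC
search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid,
    cv=5,
    scoring="roc_auc",
    n_jobs=-1,
)
search.fit(X, y)

print(search.best_params_, search.best_score_)
```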
Proven experience in software engineering and development, and a strong understanding of computer systems and how they operate. Hands-on experience with Java, Spark, and Scala (or Java). Production-scale, hands-on experience writing data pipelines using Spark or another distributed real-time/batch processing framework. Strong skill set …
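The posting above emphasises Spark pipelines in Scala or Java; purely as an illustrative sketch (shown in PySpark, with placeholder paths and columns), a batch read-transform-write pipeline could look like this:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch-pipeline-sketch").getOrCreate()

# Read raw records (bucket path and schema are hypothetical)
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: drop invalid rows and aggregate per customer per day
daily = (
    orders.filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Write partitioned output for downstream consumers
(
    daily.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/daily_orders/")
)
```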
best practices for building efficient and scalable data pipelines. Requirements: • Minimum 4 years of experience as a Data Engineer. • Proven experience with Python or Scala. • Knowledge of Data Warehousing, Data Lake, and Lakehouse paradigms. • Experience with orchestration tools like Airflow or Oozie. • Ability to design and implement DevOps strategies for …
Reading, Oxfordshire, United Kingdom Hybrid / WFH Options
iO Associates
similar role, with a focus on Databricks. Strong expertise in data engineering, ETL processes, and data warehousing. Proficiency in programming languages such as Python, Scala, or SQL. Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies. Excellent leadership, communication, and problem-solving skills. Ability to work independently …
you're looking for! WHAT ARE WE LOOKING FOR? Minimum 3 years in a Software Engineering or Backend Engineering role (one of these languages: Python, Java, Scala, Rust - with a focus on Python today). Practice writing high-quality code (unit testing, integration testing, and more) plus code metrics. Fast discovery of what is in new …
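For the unit testing and code quality point, a tiny pytest sketch; the function under test is hypothetical and exists only so the example is self-contained:

```python
import pytest


def apply_discount(price: float, discount_pct: float) -> float:
    """Return the price after a percentage discount, never below zero."""
    if not 0 <= discount_pct <= 100:
        raise ValueError("discount_pct must be between 0 and 100")
    return max(price * (1 - discount_pct / 100), 0.0)


def test_apply_discount_basic():
    # 25% off 100.0 should be exactly 75.0
    assert apply_discount(100.0, 25) == 75.0


def test_apply_discount_rejects_invalid_percentage():
    # Percentages outside 0-100 are treated as errors
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```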