Swindon; South West London; Preston; High Wycombe; Milton Keynes; Oxford; Crawley; Bolton; Kingston upon Hull; Newcastle upon Tyne, United Kingdom Hybrid / WFH Options
Adepta Partners Limited
for? Experience in the design and deployment of production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, Scala, Spark and SQL. Experience performing tasks such as writing scripts, extracting data using APIs, writing SQL queries, etc. Ability to work closely with other engineering teams to …
migration of these data warehouses to modern cloud data platforms. Deep understanding and hands-on experience with big data technologies like Hadoop, HDFS, Hive, Spark and cloud data platform services. Proven track record of designing and implementing large-scale data architectures in complex environments. CI/CD and DevOps experience is …
processes in the engineering delivery cycle, including Agile and DevOps, Git, APIs, containers, microservices and data pipelines. Experience working with one or more of Spark, Kafka, or Snowflake. My client has very limited interview slots and is looking to fill this vacancy within the next two weeks. I …
automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark and Glue for end-to-end data orchestration, leveraging AWS services to ingest, transform and process large volumes of structured and unstructured data …
Staines, Middlesex, United Kingdom Hybrid / WFH Options
Industrial and Financial Systems
Argo, Dagster or similar. Skilled with data ingestion tools like Airbyte, Fivetran, etc. for diverse data sources. Expert in large-scale data processing with Spark or Dask. Strong in Python, Scala, C# or Java, cloud SDKs and APIs. AI/ML expertise for pipeline efficiency; familiar with TensorFlow, PyTorch …
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Noir
in a data-focused role, with a strong passion for working with data and delivering value to stakeholders. Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. Experience with Databricks and Microsoft Azure is highly desirable. Financial Services …
will: Design and deploy production data pipelines from ingestion to consumption within a big data architecture. Work with technologies such as Python, Java, Scala, Spark, and SQL to extract, clean, transform, and integrate data. Build scalable solutions using AWS services like EMR, Glue, Redshift, Kinesis, Lambda, and DynamoDB. Process …
London, South East England, United Kingdom Hybrid / WFH Options
Randstad Digital UK
MSSQL, PostgreSQL, MySQL, NoSQL. Cloud: AWS (preferred), with working knowledge of cloud-based data solutions. Nice to Have: Experience with graph databases, Hadoop/Spark, or enterprise data lake environments. What You'll Bring: Strong foundation in computer science principles (data structures, algorithms, etc.); experience building enterprise-grade pipelines …
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
the time to customer sites located in Hawaii. Subject to change based on customer needs. Preferred Requirements: Experience with big data technologies like Hadoop, Spark, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers (EKS), Diode, CI/CD, and Terraform is a plus. Work could possibly …
Hanover, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
at our Columbia, MD office. Flexibility is essential to accommodate any changes in the schedule. Preferred Requirements: Experience with big data technologies like Hadoop, Spark, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers (EKS), Diode, CI/CD, and Terraform is a plus. Work could possibly …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Peaple Talent
Azure or AWS. Strong experience designing and delivering data solutions in Databricks. Proficient with SQL and Python. Experience using big data technologies such as Apache Spark or PySpark. Great communication skills, effectively engaging with senior stakeholders. Nice to haves: Azure/AWS Data Engineering certifications; Databricks certifications. What …