open-source ETL and data pipeline orchestration tools such as Apache Airflow and NiFi. Experience with large-scale/big data technologies such as Hadoop, Spark, Hive, Impala, PrestoDB, and Kafka. Experience with containerisation using Docker and deployment on Kubernetes. Experience with …
Skills: Advanced proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Big Data Technologies: Extensive experience with big data technologies (e.g., Hadoop, Spark). Cloud Platforms: Deep understanding of cloud platforms (AWS, GCP, Azure) and their data services. DevOps Expertise: Strong understanding and practical experience with …
Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies such as Spark, Hadoop, or Kafka is a plus. Strong problem-solving skills and the ability to work in a collaborative team environment. Excellent verbal and written communication …
Azure, or GCP. Hands-on experience with AI/ML workflows or deploying machine learning models in production. Knowledge of big data technologies like Hadoop, Hive, or Spark. Familiarity with MLOps tools and practices, such as MLflow, Kubeflow, or DataRobot. Education: Bachelor's degree in Computer Science, Software Engineering …
and Scikit-Learn. • Agile development experience along with related technologies. • Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark). • Experience with data engineering processes, including ETL, data integration, and data pipeline development. • Knowledge of human-centered design principles and UI/ …
a team-oriented environment. Preferred Skills: Experience with programming languages such as Python or R for data analysis. Knowledge of big data technologies (e.g., Hadoop, Spark) and data warehousing concepts. Familiarity with cloud data platforms (e.g., Azure, AWS, Google Cloud) is a plus. Certification in BI tools, SQL, or …
data modeling. Experience with relational and NoSQL databases such as Oracle, Sybase, PostgreSQL, SQL Server, and MongoDB. Familiarity with big data platforms (e.g., Hadoop, Snowflake). Prior experience with ETL tools or as a SQL developer. Proficiency in Python for data engineering and Tableau for reporting and …
large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such as Apache Kafka or Flink. MLOps & Deployment: Experience deploying and maintaining large-scale …
Data Analytics - Specialty or AWS Certified Solutions Architect - Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering on AWS. Benefits: Competitive salary and performance-based bonus …
Mars Wrigley Confectionery UK (SLO, WAL, ISB & PAD)
roles, with 5+ years in leadership positions. Expertise in modern data platforms (e.g., Azure, AWS, Google Cloud) and big data technologies (e.g., Spark, Kafka, Hadoop). Strong knowledge of data governance frameworks, regulatory compliance (e.g., GDPR, CCPA), and data security best practices. Proven experience in enterprise-level architecture design …
Java. Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams. Deep knowledge of database technologies: distributed systems (e.g., Spark, Hadoop, EMR); RDBMS (e.g., SQL Server, Oracle, PostgreSQL, MySQL); NoSQL (e.g., MongoDB, Cassandra, DynamoDB, Neo4j). Solid understanding of software engineering best practices: code reviews, testing …
Computer Science, Information Technology, or equivalent experience, coupled with relevant professional certifications. Advanced SQL knowledge for database querying. Proficiency with big data tools (Hadoop, Spark) and familiarity with big data file formats (Parquet, Avro). Skilled in data pipeline and workflow management tools (Apache Airflow, NiFi). Strong …
on mission-critical data pipelines and ETL systems. 5+ years of hands-on experience with big data technology, systems, and tools such as AWS, Hadoop, Hive, and Snowflake. Expertise with common software engineering languages such as Python, Scala, Java, and SQL, and a proven ability to learn new programming languages …
East London, London, United Kingdom Hybrid / WFH Options
Asset Resourcing
with programming languages such as Python or Java. Understanding of data warehousing concepts and data modeling techniques. Experience working with big data technologies (e.g., Hadoop, Spark) is an advantage. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Responsibilities: Design, build and maintain efficient and scalable data …
with programming languages such as Python or Java. Understanding of data warehousing concepts and data modeling techniques. Experience working with big data technologies (e.g., Hadoop, Spark) is an advantage. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Benefits: Enhanced leave - 38 days inclusive of 8 UK …
data modeling, and ETL/ELT processes. Proficiency in programming languages such as Python, Java, or Scala. Experience with big data technologies such as Hadoop, Spark, and Kafka. Familiarity with cloud platforms like AWS, Azure, or Google Cloud. Excellent problem-solving skills and the ability to think strategically. Strong …
experience in their technologies. You have experience in database technologies, including writing complex queries against relational and non-relational data stores (e.g., Postgres, Hadoop, Elasticsearch, graph databases) and designing the database schemas to support those queries. You have a good understanding of coding best practices and design patterns …
Python, Golang, PowerShell, Ruby. 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. Experience with big data technologies such as Hadoop, Hive, Spark, and EMR. Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions. Our inclusive culture …
Azure (desirable). Advanced use of Airflow and Databricks. Data modeling experience. Data warehousing solutions experience. Database management experience. Understanding of big data technologies like Hadoop, Spark, and Kafka. Effective communication. Machine learning experience. Agile development experience. Experience with Docker. Knowledge in data lakes and data warehouses. Education and professional …
Grand Prairie, Texas, United States Hybrid / WFH Options
Jobot
including data manipulation (Pandas, NumPy) and workflow management (Dask, PySpark, FastAPI). • Solid knowledge of cloud platforms (Azure, AWS) and big data technologies (Hadoop, Spark). • Hands-on experience with Docker, Kubernetes, and containerized environments. • Strong understanding of dimensional modeling (Kimball), relational database design (3NF), and …
Chantilly, Virginia, United States Hybrid / WFH Options
Noblis
domain. Experience with monitoring, logging, and alerting tools (e.g., Prometheus, Grafana). Experience querying databases (SQL, Hive). Experience working with data platforms like Hadoop and Spark. Overview: Noblis and our wholly owned subsidiaries, Noblis ESI and Noblis MSD, tackle the nation's toughest problems and apply advanced solutions …
MongoDB, Cassandra). • In-depth knowledge of data warehousing concepts and tools (e.g., Redshift, Snowflake, Google BigQuery). • Experience with big data platforms (e.g., Hadoop, Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). • Expertise in ETL tools and processes (e.g., …