as Python, Java or Scala. - Solid understanding of data architecture, data modeling, and designing large-scale data pipelines. - Experience with big data technologies (e.g., Hadoop, Spark, Kafka) and data storage solutions (e.g., Redshift, Snowflake, S3). - Ability to manage and prioritize multiple projects and deliver them on time. - Strong …
open-source ETL, and data pipeline orchestration tools such as Apache Airflow and NiFi. Experience with large-scale/big data technologies, such as Hadoop, Spark, Hive, Impala, PrestoDB, Kafka. Experience with workflow orchestration tools like Apache Airflow. Experience with containerisation using Docker and deployment on Kubernetes. Experience with …
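Several listings here reduce the Airflow requirement to the same core skill: expressing a pipeline as a DAG of dependent tasks. A minimal sketch, assuming Airflow 2.x; the dag_id, schedule, and extract/load callables are hypothetical:

```python
# Minimal Airflow DAG sketch; dag_id, schedule, and task bodies are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull rows from a source system.
    return [{"id": 1}]


def load(ti):
    # Placeholder: read the extracted rows from XCom and write them downstream.
    rows = ti.xcom_pull(task_ids="extract")
    print(f"loaded {len(rows)} rows")


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # named schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```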
Skills: Advanced proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Big Data Technologies: Extensive experience with big data technologies (e.g., Hadoop, Spark). Cloud Platforms: Deep understanding of cloud platforms (AWS, GCP, Azure) and their data services. DevOps Expertise: Strong understanding and practical experience with …
Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies, such as Spark, Hadoop, or Kafka, is a plus. Strong problem-solving skills and the ability to work in a collaborative team environment. Excellent verbal and written communication …
data modeling. Experience with relational and NoSQL databases such as Oracle, Sybase, PostgreSQL, SQL Server, MongoDB. Familiarity with big data platforms (e.g., Hadoop, Snowflake). Prior experience with ETL tools or as a SQL developer. Proficiency in Python for data engineering and Tableau for reporting and …
AWS Certified Data Engineer, AWS Certified Data Analytics, or AWS Certified Solutions Architect. Experience with big data tools and technologies like Apache Spark, Hadoop, and Kafka. Knowledge of CI/CD pipelines and automation tools such as Jenkins or GitLab CI. About Adastra: For more than 25 years …
Data Analytics - Specialty or AWS Certified Solutions Architect - Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering on AWS. Benefits: Competitive salary and performance-based bonus …
and Scikit-Learn. • Agile development experience along with related technologies. • Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark). • Experience with data engineering processes, including ETL, data integration, and data pipeline development. • Knowledge of human-centered design principles and UI/…
roles, with 5+ years in leadership positions. Expertise in modern data platforms (e.g., Azure, AWS, Google Cloud) and big data technologies (e.g., Spark, Kafka, Hadoop). Strong knowledge of data governance frameworks, regulatory compliance (e.g., GDPR, CCPA), and data security best practices. Proven experience in enterprise-level architecture design …
Java. Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams. Deep knowledge of database technologies: distributed systems (e.g., Spark, Hadoop, EMR); RDBMS (e.g., SQL Server, Oracle, PostgreSQL, MySQL); NoSQL (e.g., MongoDB, Cassandra, DynamoDB, Neo4j). Solid understanding of software engineering best practices - code reviews, testing …
Frameworks: Extensive experience with AI frameworks and libraries, including TensorFlow, PyTorch, or similar. • Data Processing: Expertise in big data technologies such as Apache Spark and Hadoop, and experience with data pipeline tools like Apache Airflow. • Cloud Platforms: Strong experience with cloud services, particularly AWS, Azure, or Google Cloud Platform, including …
large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such as Apache Kafka or Flink. MLOps & Deployment: Experience deploying and maintaining large-scale …
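As an illustration of the Spark side of this kind of requirement, here is a small PySpark batch-ETL sketch; the input path, column names, and output location are all hypothetical:

```python
# Small PySpark batch-ETL sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw JSON events.
events = spark.read.json("s3a://example-bucket/raw/events/")

# Transform: drop malformed rows and count events per user per day.
daily_counts = (
    events.filter(F.col("user_id").isNotNull())
          .groupBy("user_id", F.to_date("ts").alias("day"))
          .count()
)

# Load: write Parquet partitioned by day for efficient downstream queries.
daily_counts.write.mode("overwrite").partitionBy("day").parquet(
    "s3a://example-bucket/curated/daily_counts/"
)

spark.stop()
```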
Computer Science, Information Technology, or equivalent experience, coupled with relevant professional certifications. Advanced SQL knowledge for database querying. Proficiency with big data tools (Hadoop, Spark) and familiarity with big data file formats (Parquet, Avro). Skilled in data pipeline and workflow management tools (Apache Airflow, NiFi). Strong …
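For the file-format point, the practical skill is reading and writing columnar formats with predicate/column pruning. A tiny sketch with pyarrow; the table contents are invented for illustration:

```python
# Tiny Parquet round-trip with pyarrow; the data is made up for illustration.
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({"user_id": [1, 2, 3], "score": [0.5, 0.9, 0.1]})

pq.write_table(table, "scores.parquet")  # columnar, compressed on disk

# Column pruning: read back only the columns a query needs.
subset = pq.read_table("scores.parquet", columns=["user_id"])
print(subset.to_pydict())  # {'user_id': [1, 2, 3]}
```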
Proven experience as a Data Engineer with a strong background in data pipelines. Proficiency in Python, Java, or Scala, and big data technologies (e.g., Hadoop, Spark, Kafka). Experience with Databricks, Azure AI Services, and cloud platforms (AWS, Google Cloud, Azure). Solid understanding of SQL and NoSQL databases. …
years' experience working on mission-critical data pipelines and ETL systems, hands-on experience with big data technology, systems and tools such as AWS, Hadoop, Hive, and Snowflake. Detailed problem-solving approach, coupled with a strong sense of ownership and drive. A bias to action and passion for …
MongoDB, Cassandra). • In-depth knowledge of data warehousing concepts and tools (e.g., Redshift, Snowflake, Google BigQuery). • Experience with big data platforms (e.g., Hadoop, Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). • Expertise in ETL tools and processes (e.g., …
Naperville, Illinois, United States Hybrid / WFH Options
esrhealthcare
Familiarity with the technology stack available in the industry for data management, data ingestion, capture, processing and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc. Familiarity with networking, Windows/Linux virtual machines, containers, storage, ELB, Auto Scaling is a plus. Experience …
East London, London, United Kingdom Hybrid / WFH Options
Asset Resourcing
with programming languages such as Python or Java. Understanding of data warehousing concepts and data modeling techniques. Experience working with big data technologies (e.g., Hadoop, Spark) is an advantage. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Responsibilities: Design, build and maintain efficient and scalable data … Benefits: Enhanced leave - 38 days inclusive of 8 UK …
data modeling, and ETL/ELT processes. Proficiency in programming languages such as Python, Java, or Scala. Experience with big data technologies such as Hadoop, Spark, and Kafka. Familiarity with cloud platforms like AWS, Azure, or Google Cloud. Excellent problem-solving skills and the ability to think strategically. Strong …
experience in their technologies. You have experience in database technologies, including writing complex queries against their (relational and non-relational) data stores (e.g. Postgres, Hadoop, Elasticsearch, graph databases), and designing the database schemas to support those queries. You have a good understanding of coding best practices and design patterns …
Azure (desirable). Advanced use of Airflow and Databricks. Data modeling experience. Data warehousing solutions experience. Database management experience. Understanding of big data technologies like Hadoop, Spark, and Kafka. Effective communication. Machine learning experience. Agile development experience. Experience with Docker. Knowledge of data lakes and data warehouses. Education and professional …
schemas for efficient querying. Implementing ETL/ELT pipelines to load and transform data in Snowflake. Big Data Processing Frameworks: Familiarity with Apache Spark, Hadoop, or other distributed data processing frameworks. Data Governance and Compliance: Understanding of data governance principles, security policies, and compliance standards (e.g., GDPR, HIPAA). …
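To make the Snowflake ELT point concrete, a hedged sketch using the snowflake-connector-python package; the credentials, stage, and table names are all hypothetical, and the transform step runs inside the warehouse (the "T" in ELT):

```python
# ELT sketch against Snowflake; credentials, stage, and tables are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()
try:
    # Load: copy staged CSV files into a raw table.
    cur.execute(
        "COPY INTO raw_events FROM @events_stage "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Transform in-warehouse: derive a queryable aggregate table.
    cur.execute(
        "CREATE OR REPLACE TABLE daily_counts AS "
        "SELECT user_id, DATE(event_ts) AS day, COUNT(*) AS n "
        "FROM raw_events GROUP BY user_id, DATE(event_ts)"
    )
finally:
    cur.close()
    conn.close()
```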
on mission-critical data pipelines and ETL systems. 5+ years of hands-on experience with big data technology, systems and tools such as AWS, Hadoop, Hive, and Snowflake. Expertise with common software engineering languages such as Python, Scala, Java, SQL and a proven ability to learn new programming languages …