a team-oriented environment. Preferred Skills: Experience with programming languages such as Python or R for data analysis. Knowledge of big data technologies (e.g., Hadoop, Spark) and data warehousing concepts. Familiarity with cloud data platforms (e.g., Azure, AWS, Google Cloud) is a plus. Certification in BI tools, SQL, or More ❯
large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such as Apache Kafka or Flink. MLOps & Deployment: Experience deploying and maintaining large-scale More ❯
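The listing above asks for hands-on ETL experience in Python and SQL. As a minimal sketch of an extract-transform-load pipeline (using only the standard library, with SQLite standing in for a warehouse; the field names and data are hypothetical, not taken from any listing):

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV records (here an in-memory string; in practice
# a file, object store, or stream feeding Spark/Kafka at scale).
RAW = "user_id,amount\n1,10.50\n2,3.25\n1,7.00\n"

def extract(raw: str):
    return list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and drop malformed rows instead of failing the batch.
def transform(rows):
    out = []
    for row in rows:
        try:
            out.append((int(row["user_id"]), float(row["amount"])))
        except (KeyError, ValueError):
            continue
    return out

# Load: write into a relational store.
def load(conn, records):
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user_id INT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW)))
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 20.75
```

At production scale the same extract/transform/load shape is what frameworks like Spark or Kafka Connect parallelise; the structure, not the library, is the transferable skill.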
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Qodea
Platform (GCP). Proficiency in SQL and experience with relational databases such as MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in at least one More ❯
East London, London, United Kingdom Hybrid / WFH Options
Asset Resourcing
with programming languages such as Python or Java. Understanding of data warehousing concepts and data modeling techniques. Experience working with big data technologies (e.g., Hadoop, Spark) is an advantage. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Responsibilities: Design, build and maintain efficient and scalable data More ❯
with programming languages such as Python or Java. Understanding of data warehousing concepts and data modeling techniques. Experience working with big data technologies (e.g., Hadoop, Spark) is an advantage. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Benefits Enhanced leave - 38 days inclusive of 8 UK More ❯
experience in their technologies You have experience in database technologies including writing complex queries against their (relational and non-relational) data stores (e.g. Postgres, Hadoop, Elasticsearch, Graph databases), and designing the database schemas to support those queries You have a good understanding of coding best practices and design patterns More ❯
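The requirement above — writing complex queries against relational stores and designing schemas to support them — can be sketched as follows (SQLite stands in for Postgres; table and column names are illustrative assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Schema designed around the query: the index on orders.customer_id
# supports the join, and NOT NULL constraints keep the aggregate honest.
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT NOT NULL);
CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(id),
    total REAL NOT NULL
);
CREATE INDEX idx_orders_customer ON orders(customer_id);
""")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "EMEA"), (2, "EMEA"), (3, "APAC")])
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0), (4, 3, 20.0)])

# A non-trivial query: join, group, filter on the aggregate, order by it.
rows = conn.execute("""
    SELECT c.region, SUM(o.total) AS revenue
    FROM orders o JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    HAVING SUM(o.total) > 50
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('EMEA', 225.0)]
```

The same join/aggregate/HAVING pattern carries over directly to Postgres; non-relational stores named in these listings (Elasticsearch, graph databases) require rethinking the schema around their own query models.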
Chantilly, Virginia, United States Hybrid / WFH Options
Noblis
domain Experience with monitoring, logging, and alerting tools (e.g., Prometheus, Grafana). Experience querying databases (SQL, Hive). Experience working with data platforms like Hadoop and Spark. Overview Noblis and our wholly owned subsidiaries, Noblis ESI, and Noblis MSD tackle the nation's toughest problems and apply advanced solutions More ❯
implementing cloud-based data solutions using AWS services such as EC2, S3, EKS, Lambda, API Gateway, Glue and big data tools like Spark, EMR, Hadoop, etc. Hands-on experience in data profiling, data modeling and data engineering using relational databases like Snowflake, Oracle, SQL Server; ETL tools like Informatica More ❯
Chantilly, Virginia, United States Hybrid / WFH Options
Aerospace Corporation
and guiding teams toward software development best practices Experience in SQL, NoSQL, Cypher and other big data querying languages Experience with big data frameworks (Hadoop, Spark, Flink, etc.) Experience with ML lifecycle management tools (MLflow, Kubeflow, etc.) Familiarity with data pipelining and streaming technologies (Apache Kafka, Apache NiFi, etc. More ❯
Chantilly, Virginia, United States Hybrid / WFH Options
The DarkStar Group
learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, pySpark, Hadoop, Kafka, ElasticSearch, Solr, Kibana, neo4J, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA and More ❯
Herndon, Virginia, United States Hybrid / WFH Options
The DarkStar Group
learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, pySpark, Hadoop, Kafka, ElasticSearch, Solr, Kibana, neo4J, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA and More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Robert Half
ETL/ELT processes. Experience with data integration tools (e.g., Apache Kafka, Talend, Informatica) and APIs. Familiarity with big data technologies (e.g., Hadoop, Spark) and real-time streaming. Expertise in cloud security, data governance, and compliance (e.g., GDPR, HIPAA). Strong SQL skills and proficiency in at More ❯
infrastructure and its impact on data architecture. Data Technology Skills: A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes, SQL, NoSQL databases is a More ❯
with unstructured datasets. Engineering best practices and standards. Experience with data warehouse software (e.g. Snowflake, Google BigQuery, Amazon Redshift). Experience with data tools: Hadoop, Spark, Kafka, etc. Code versioning (GitHub integration and automation). Experience with scripting languages such as Python or R. Working knowledge of message queuing More ❯
Telford, Shropshire, West Midlands, United Kingdom Hybrid / WFH Options
JLA Resourcing Ltd
the ability to convey complex concepts to non-technical stakeholders. Desirable (but not essential): Experience with SSIS, AWS or Azure Data Factory. Familiarity with Hadoop, Jenkins, or DevOps practices including CI/CD. Cloud certifications (Azure or AWS). Knowledge of additional programming languages or ETL tools. This is More ❯
bristol, south west england, United Kingdom Hybrid / WFH Options
LHH
on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine learning algorithms and data More ❯
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
of the time to customer sites located in Hawaii. Subject to change based on customer needs. Preferred Requirements Experience with big data technologies like Hadoop, Spark, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. Work could More ❯
The role We are looking for a Data Engineer to join the Data Science & Engineering team in London. Working at WGSN Together, we create tomorrow A career with WGSN is fast-paced, exciting and full of opportunities to grow and More ❯
will be deployed You have experience in database technologies including writing complex queries against their (relational and non-relational) data stores (e.g. Postgres, Apache Hadoop, Elasticsearch, Graph databases), and designing the database schemas to support those queries You have a good understanding of coding best practices & design patterns and More ❯