or related industries. Certification in relevant areas (e.g., AWS Certified Data Analytics, Google Data Analytics Professional Certificate). Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure). Experience with data visualization design principles and storytelling techniques. Knowledge of agile methodologies and project management.
collaborate with technical teams to build scalable solutions. Expertise in programming languages such as Python, R, SQL, and familiarity with big data technologies like Hadoop, Spark, and cloud platforms (e.g., AWS, GCP, Azure). Exceptional problem-solving and analytical thinking. Strong leadership, team-building, and mentoring skills. Excellent communication …
science, mathematics, or a related quantitative field - Experience with scripting languages (e.g., Python, Java, R) and big data technologies/languages (e.g., Spark, Hive, Hadoop, PyTorch, PySpark) PREFERRED QUALIFICATIONS - Master's degree or advanced technical degree - Knowledge of data modeling and data pipeline design - Experience with statistical analysis, co…
Keycloak. Experience with web frameworks such as FastAPI, Spring Boot, and Express. Demonstrated hands-on experience working with Hadoop, Apache Spark, and their related ecosystems. A candidate must be a US citizen and hold an active/current TS/SCI with Polygraph.
and virtual environments. Experience with network traffic inspection tools (e.g., Suricata, Arkime, Zeek, etc.). Knowledge of big data technologies (e.g., Elasticsearch, Apache Hadoop, Spark, Kafka, etc.). Relevant Certifications: Certifications in Cloud Engineering (e.g., Amazon Web Services (AWS) Solutions Architect - Associate; Microsoft Certified: Azure Fundamentals; Google Associate …
with unstructured datasets. Engineering best practices and standards. Experience with data warehouse software (e.g., Snowflake, Google BigQuery, Amazon Redshift). Experience with data tools: Hadoop, Spark, Kafka, etc. Code versioning (GitHub integration and automation). Experience with scripting languages such as Python or R. Working knowledge of message queuing …
Tableau, Power BI). Knowledge of statistical analysis and machine learning techniques. Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark). Excellent analytical and problem-solving skills. Strong communication and collaboration skills.
contributions to the delivery process, manage tasks, and update teams on progress. Skills & Experience: Proven experience as a Data Engineer with expertise in Databricks, Hadoop/Spark. Strong programming skills in Python, Scala, or SQL, with knowledge of CI/CD platforms. Proficiency with distributed computing frameworks and cloud …
AWS. Databases: MSSQL, PostgreSQL, MySQL, NoSQL. Cloud: AWS (preferred), with working knowledge of cloud-based data solutions. Nice to Have: Experience with graph databases, Hadoop/Spark, or enterprise data lake environments. What You’ll Bring: Strong foundation in computer science principles (data structures, algorithms, etc.). Experience building enterprise …
e.g. TensorFlow, MXNet, scikit-learn). Knowledge of software engineering practices (coding practices applied to DS, unit testing, version control, code review). Experience with Hadoop (especially the Cloudera and Hortonworks distributions), other NoSQL (especially Neo4j and Elastic), and streaming technologies (especially Spark Streaming). Deep understanding of data manipulation …
Java, C++) and experience with DevOps practices (CI/CD). Familiarity with containerization (Docker, Kubernetes), RESTful APIs, microservices architecture, and big data technologies (Hadoop, Spark, Flink). Knowledge of NoSQL databases (MongoDB, Cassandra, DynamoDB), message queueing systems (Kafka, RabbitMQ), and version control systems (Git). Preferred Skills: Experience …
would be great if you also had: Experience of configuring and using ETL platforms such as SSIS, AWS or Azure Data Factory. Experience of Hadoop and Jenkins. Azure Certified. AWS Certified. Familiarity with Java. Knowledge of DevOps practices, including CI/CD pipelines. What we do for you: At …
least two or more (2+) of the following: Structured Query Language (SQL); Statistical Programming; Machine Learning; Data Visualization; Big Data Tools (NiFi, Elasticsearch, Kafka, Hadoop, Spark); Data Centers; Data Mining or Data Analytics; Communications networks; Internet protocols and Red Hat operating systems. Strong ETL and API knowledge for both the …
with distributed systems as they pertain to data storage and computing. Experience with Redshift, Oracle, NoSQL, etc. Experience with big data technologies such as Hadoop, Hive, Spark, EMR. Bachelor's degree. PREFERRED QUALIFICATIONS: Experience working on and delivering end-to-end projects independently. Experience providing technical leadership and mentoring …
Java. 2+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud). 3+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL). 2+ years of experience working on real-time data and streaming applications. 2+ years of experience with NoSQL …
Java. 4+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud). 4+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL). 4+ years of experience working on real-time data and streaming applications. 4+ years of experience with NoSQL …
application development including Python, SQL, Scala, or Java. 4+ years of experience with AWS. 4+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL). 4+ years of experience working on real-time data and streaming applications. 4+ years of experience with NoSQL …