Analytics. Strong SQL and Python skills. Experience with data modeling, ETL processes, and data warehousing. Knowledge of big data technologies such as Spark and Hadoop is a plus. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Experience in the healthcare sector is a plus.
manage multiple tasks and projects simultaneously. Preferred Qualifications AWS Certified Solutions Architect or other relevant AWS certifications. Experience with big data technologies such as Hadoop, Spark, or similar. Knowledge of data governance and data quality best practices. Familiarity with machine learning and AI concepts and tools.
ideas • Ability to set the direction and deliver on a vision with forward planning to achieve results • Technical knowledge of big data platforms (e.g., Hadoop and Hive) as well as knowledge of ML, data science and advanced modelling techniques, technologies, and programming languages • Possess a high degree of self
• Testing performance with JMeter or similar tools
• Web services technology such as REST, JSON or Thrift
• Testing web applications with Selenium WebDriver
• Big data technology such as Hadoop, MongoDB, Kafka or SQL
• Network principles and protocols such as HTTP, TLS and TCP
• Continuous integration systems such as Jenkins or Bamboo
• Continuous delivery concepts
Red Hat Decision Central Key Responsibilities: Manage operational procedures. Transform and process data using Apache Spark. Administer AWS RDS with MySQL. Work with the Hadoop platform. Create reports using Tableau. Utilize Red Hat Decision Central. About Capgemini: Capgemini is a global leader in partnering with companies to transform and
Key Skills 3+ years of Python experience Highly statistical and analytical Exposure to Google Cloud Platform (BigQuery, GCS, Datalab, Dataproc, Cloud ML) (desirable) Spark & Hadoop experience Strong communication skills Good problem-solving skills Qualifications Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering … and classification techniques, and algorithms Fluency in a programming language (Python, C, C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau) This is a permanent position, and offers flexibility with hybrid working, 2-3 days per week in the office, depending on workload
/ELT tools. Experience with NoSQL type environments, Data Lakes, Lake-Houses (Cassandra, MongoDB or Neptune). Experience with distributed storage and processing engines such as Apache Hadoop and Apache Spark. Experience with message brokering/stream processing services such as Apache Kafka, Confluent, Azure Stream Analytics. Experience in Test Driven Development (TDD) and
computer science, Information Technology, or a related field. Experience with containerisation and orchestration technologies (e.g., Docker, Kubernetes). Knowledge of big data technologies and frameworks (e.g., Hadoop, Spark). Familiarity with other cloud platforms (e.g. AWS, Google Cloud) and PaaS providers (e.g. Snowflake). Knowledge of the Inner or Open Source paradigm and way
industry experience Experience in distributed system design Experience with Pure/Alloy Working knowledge of open-source tools such as AWS Lambda and Prometheus. Spark, Hadoop or Snowflake knowledge would be a plus. Additional Information Location: This role can be delivered in a hybrid nature from one of these offices: Dublin
experience Demonstrate in-depth knowledge of large-scale data platforms (Databricks, Snowflake) and cloud-native tools (Azure Synapse, Redshift) Experience of analytics technologies (Spark, Hadoop, Kafka) Have familiarity with Data Lakehouse architecture, SQL Server, DataOps, and data lineage concepts
South East London, England, United Kingdom Hybrid / WFH Options
Counter Terrorism Police
following: .NET (VB, C#, ASP.NET, .NET Core), MVC Framework, Python, JavaScript (React, Bootstrap frameworks), database design, SQL/SQL Server, NoSQL technologies e.g., MongoDB, Hadoop, etc. If you’re the right person for the role, you’ll bring experience of working on a range of applications across the development
Experience with RDS like MySQL or PostgreSQL or others, and experience with NoSQL like Redis or MongoDB or others; * Experience with Big Data technologies like the Hadoop ecosystem is a plus. * Excellent writing, proofreading and editing skills. Able to create documentation that can express cloud architectures using text and
EC1N, Farringdon Without, Greater London, United Kingdom
Damia Group Ltd
your key responsibilities will be to: Manage operational procedures. Transform and process data using Apache Spark. Administer AWS RDS with MySQL. Work with the Hadoop platform. Create reports using Tableau. Utilize Red Hat Decision Central. Skills/Experience Required of the SC Cleared DevSecOps Engineer: Strong operational procedures knowledge.
Employment Type: Permanent
Salary: £50000 - £65000/annum 15% cash flex and 10% bonus
As an IT Specialist, you'll need broad expertise across various areas of the technology/software domain. Proficiency in AWS or Big Data, Hadoop or other SQL databases, Lucene, Spark, web app development (JavaScript, Node.js), Docker, Jenkins, Git, Python, or Ruby would be highly beneficial. Key Responsibilities: Meet
and analytical abilities. Preferred Skills: Experience with cloud databases (e.g., AWS, Azure, Google Cloud Platform). Knowledge of big data tools and frameworks (e.g., Hadoop, Spark). Certification in database management (e.g., Microsoft Certified: Azure Data Engineer Associate). What We Offer: Competitive salary and comprehensive benefits package. Opportunity