Crawley, West Sussex, South East England, United Kingdom
General Motors
simulation workflows professional motorsports organization. Experience using simulation tools to optimize vehicle performance. Experience with machine learning libraries. Experience with big data tools (e.g. Hadoop, Spark, SQL, and NoSQL databases). About GM: Our vision is a world with Zero Crashes, Zero Emissions and Zero Congestion and we …
Newcastle-upon-Tyne, Tyne and Wear, North East England, United Kingdom
General Motors
simulation workflows professional motorsports organization. Experience using simulation tools to optimize vehicle performance. Experience with machine learning libraries. Experience with big data tools (e.g. Hadoop, Spark, SQL, and NoSQL databases). About GM: Our vision is a world with Zero Crashes, Zero Emissions and Zero Congestion and we …
Puppet, SaltStack), Terraform, CloudFormation; Programming Languages and Frameworks: Node.js, React/Material-UI (plus Angular), Python, JavaScript; Big Data Processing and Analysis: e.g., Apache Hadoop (CDH), Apache Spark; Operating Systems: Red Hat Enterprise Linux, CentOS, Debian, or Ubuntu. …
systems such as S3, ADLS, or HDFS. Experience with AWS, Azure, and Google Cloud Platform and a background in large scale data processing systems (e.g., Hadoop, Spark) is a plus. Ability to scope and plan solutions for big problems and mentor others on the same. Interested and motivated to …
etc.) and RDBMS. Eligible for SC clearance (public sector experience is a strong plus). Nice-to-Have: Experience with big data tools (e.g. Hadoop, Spark, MapReduce). Exposure to microservices for data delivery and streaming architectures. AWS certifications (e.g. Solutions Architect, Big Data). Knowledge of data visualization …
Experience with programming, ideally Python, and the ability to quickly pick up handling large data volumes with modern data processing tools, e.g. by using Hadoop/Spark/SQL. Experience with or ability to quickly learn open-source software including machine learning packages, such as Pandas, and Sci-kit …
large datasets, data wrangling, and data preprocessing. Ability to work independently and lead projects from inception to deployment. Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure). Preferred Skills: MSc or PhD in Computer Science, Artificial Intelligence, or related field. ADDITIONAL NOTES …
Experience with programming, ideally Python, and the ability to quickly pick up handling large data volumes with modern data processing tools, e.g. by using Hadoop/Spark/SQL. Experience with or ability to quickly learn open-source software including machine learning packages, such as Pandas, scikit-learn, along …
large scale project management experience; 5+ years of continuous integration and continuous delivery (CI/CD) experience; 5+ years of database (e.g. SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience; 5+ years of consulting, design and implementation of serverless distributed solutions experience; 3+ years of cloud based solution (AWS or …
for horizontal solutions (e.g. migrations, MLOps, observability). Ability to identify and debug bottlenecks and significantly optimize slow-running data pipelines. Experience migrating applications from legacy Hadoop or Data Warehouses to the cloud. Databricks Data Engineering certification, or willing to obtain it. Other nice-to-have skills: You bring an uncompromising Growth …
Location: Worcester. Duration: 6-month initial contract. Rate: (Outside IR35). Security: Active DV clearance required. Role details: We are looking for 3 x Data Engineers to join our defence & security client on a contract basis. You will help design, develop …
design, and implementation of customers' modern data platforms. The ideal candidate will have extensive experience migrating traditional data warehouse technologies, including Teradata, Oracle, BW, and Hadoop, to modern cloud data platforms like Databricks, Snowflake, Redshift, BigQuery, or Microsoft Fabric. You will be responsible for leading data platform migrations and the … SAP BW and migration of these data warehouses to modern cloud data platforms. Deep understanding and hands-on experience with big data technologies like Hadoop, HDFS, Hive, Spark and cloud data platform services. Proven track record of designing and implementing large-scale data architectures in complex environments. CI/CD/… practices. Excellent problem-solving, analytical, and critical-thinking skills. Strong leadership, communication, and collaboration abilities. Preferred Qualifications: Experience in data warehouse (SAP BW, Teradata, Hadoop, Oracle, etc.) migration to cloud data platforms. Familiarity with data visualization and BI tools (e.g., Tableau, Power BI). Experience with cloud-based data …
large scale project management experience; 5+ years of continuous integration and continuous delivery (CI/CD) experience; 5+ years of database (e.g. SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience; 5+ years of software development with object oriented language experience; 3+ years of cloud based solution (AWS or equivalent), system …
Experience with modeling tools such as R, scikit-learn, Spark MLLib, MxNet, Tensorflow, numpy, scipy etc. Experience with large scale distributed systems such as Hadoop, Spark etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting …
tools, and validation strategies to support new features. Help design/build distributed real-time systems and features. Use big data technologies (e.g. Spark, Hadoop, HBase, Cassandra) to build large scale machine learning pipelines. Develop new systems on top of real-time streaming technologies (e.g. Kafka, Flink). Minimum Requirements …
learning, data mining and statistics. Traffic quality systems process billions of ad-impressions and clicks per day by leveraging leading open source technologies like Hadoop, Spark, Redis and Amazon's cloud services like EC2, S3, EMR, DynamoDB and Redshift. The candidate should have reasonable programming and design skills to …
invent, a track record of thought leadership and contributions that have advanced the field. - 2+ years experience with large scale distributed systems such as Hadoop, Spark etc. - Excellent written and spoken communication skills. Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran …
global level. Prior experience working in data and/or technology: familiarity with tools such as JavaScript, Python & Python Libraries, Structured Query Language (SQL), Hadoop, Snowflake, Qlik, Tableau, Microsoft Excel, Artificial intelligence, Collibra, Manta. Consultative experience. Track record of solving data integration challenges and building data solutions across complex …
Experience with modeling tools such as R, scikit-learn, Spark MLLib, MxNet, Tensorflow, numpy, scipy etc. - Experience with large scale distributed systems such as Hadoop, Spark etc. - Master's degree in math/statistics/engineering or other equivalent quantitative discipline, or PhD …
Experience with modeling tools such as R, scikit-learn, Spark MLLib, MxNet, Tensorflow, numpy, scipy etc. Experience with large scale distributed systems such as Hadoop, Spark etc. PhD. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make …
Experience with modeling tools such as R, scikit-learn, Spark MLLib, MxNet, Tensorflow, numpy, scipy etc. Experience with large scale distributed systems such as Hadoop, Spark etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace …
tools. • Messaging & Streaming: Exposure to RabbitMQ or similar message/streaming broker technologies. • Advanced Technologies: Interest or experience in big data technologies (e.g., HBase, Hadoop), machine learning frameworks (e.g., Spark), and orbit dynamics. Why Join Us? • Innovative Environment: Be part of projects at the cutting edge of space systems …
Software Design or Development, Content Distribution/CDN, Scripting/Automation, Database Architecture, Cloud Architecture, Cloud Migrations, IP Networking, IT Security, Big Data/Hadoop/Spark, Operations Management, Service Oriented Architecture etc. Experience in a 24x7 operational services or support environment. Experience with AWS Cloud services and/…
We are a leading multi-strategy systematic hedge fund based in London, leveraging advanced technology and data to drive our trading strategies. Our team includes top quantitative researchers, data scientists, and engineers, all collaborating to develop innovative solutions. We are …