teams. Ideal candidates will have 5+ years of data science experience, proficiency in Python and streaming data (e.g., Kafka), and familiarity with data lake environments. Onsite Required (Ft. Belvoir & Reston, VA). Security Clearance: TS/SCI required (must be active in DISS). Minimum Requirements: • 5+ years of experience in applied data … end-to-end ML workflows (data prep to deployment and evaluation). • Ability to rapidly learn infrastructure concepts, particularly data pipelines and data lake interactions. • Skilled in developing ML models for document classification, extraction, summarization, and search. • Proven ability to manage high-throughput, production-level ML systems. • Bachelor's degree. Key Responsibilities: • Lead … and improve models for performance and accuracy. • Support data science operations tied to millions of streaming documents weekly. Skills and Proficiencies: • Strong understanding of data lake architectures and streaming data environments. • Familiarity with distributed storage systems (e.g., Cloudera) and compute frameworks (e.g., Spark, Hadoop). • Experience with MLOps processes and tools for model …
Job Number: 260 Job Category: GovTech Job Title: DATA ENGINEER (260) - MARYLAND - URGENT Job Type: Full-time Clearance Level: Top Secret/SCI - Full Scope Polygraph Work Arrangement: On-site Job Location: Annapolis Junction MD Salary: 129K-155K Background Collaborate closely with a team of developers to fulfill data integration requirements Write and maintain code using … an ETL platform to ensure data is transformed into suitable formats as defined by IC ITE initiatives Interface with external teams and systems, employing various protocols including HTTP and SFTP to collect data efficiently Enhance the ETL platform by adding features aimed at shortening timelines for future data integration efforts Develop and maintain software … ensuring seamless integration into a fully functional system Collaboration with external teams will be necessary to validate data ingest processes Responsible for providing comprehensive documentation covering system architecture, development, and any enhancements made throughout the process Requirements Bachelor's degree with 9+ years of professional experience Master's degree with 7+ years of professional experience PhD with 5+ …
and Intelligence challenges through IT solutions. We emphasize teamwork and focus on achieving goals to complete deliverables efficiently, on-time, and under budget. We are currently seeking a Data Engineer/Data Architect to join our team to support our Government Partner. Primary Job Duties and Required Work Experience: Design, build, and maintain data pipelines that efficiently ingest, process, and transform data from various sources for analysis Collaborate with data scientists and analysts to understand data requirements and ensure data quality and accessibility. Develop and implement data governance policies and procedures to ensure data integrity and security. Stay current on … processing and analysis. Data Visualization: Proficiency in creating interactive dashboards and visual representations of data, potentially using tools like Tableau. Experience with data lake technologies and best practices Job Requirements • Active TS/SCI + Full Scope Polygraph U.S. Government security clearance is required. • Education and Years of …
Software/Data Engineer Who We Are: Veson Nautical is a well-established and rapidly growing software company working to provide end-to-end logistical, operational, and … analytical solutions to propel the efficiency and effectiveness of Maritime Commerce. The Opportunity: In this role, you'll contribute to the development and maintenance of our Data Lake (Snowflake) product, supporting best practices across teams. As a Software/Data Engineer, you'll help design and implement software components, working with others to balance new … is preferred. A Closer Look at the Role This role will provide opportunities to get your hands on various tasks/technologies such as: Develop and support scalable data storage solutions using Snowflake under the guidance of senior engineers. Write and optimize SQL queries within the Snowflake environment, identifying and resolving performance issues with support when needed. Contribute …
spread information about new treatments to key people in the life science community. You can read more about Veeva Link on our product pages at . As a data engineer, you focus on our data pipelines and take responsibility for a major part of the Link data processing platform. We value end-to-end … ownership, which puts you into the sweet spot of finding, designing, and implementing improvements to the product's data pipelines and adjusting them to changing demands of the market. You take responsibility for features and innovation using SOLID and clean software principles, take part in the architectural enhancement process and care for the quality of the outcome. Monitoring … engineering domain, you will focus on different aspects. We decide together which domains fit best for you. What You'll Do Work on Veeva Link's next-gen Data Platform Improve our current environment with features, refactoring, and innovation Work with JVM-based languages or Python on Spark-based data pipelines Operate ML models in close …
of this role is to develop, maintain and, where necessary, provide support to end users for Infor Sun Systems and associated products (Bank Reconciliation, Q&A, Data Lake, interfaces, etc.). You will work within the Financial Systems team and your stakeholders will be the wider finance function. You will have design authority and be encouraged to … Development and maintenance of existing processes requiring SQL or VBA expertise. Be the team representative responsible for liaising with Group Technology on technical issues (e.g. Interface/Data Lake/SQL/Server/Database issues) Continually review and make recommendations for improvements to tools, systems, and processes. Co-ordinate the annual audit user reviews for Sun systems Support … the Finance Systems Accounting team where necessary. SKILLS/EXPERIENCE REQUIRED: Understanding of data structures and experience of extracting and utilising data to drive key strategic decisions, including the production of insightful analysis. Financial controls and an appreciation for financial/management accounting Strong communication and stakeholder management skills. Systems development and change control. Sun Super …
posted by Jobgether on behalf of Monte Carlo. We are currently looking for a Senior Solutions Architect, EMEA in the United Kingdom. Join a fast-growing environment where data reliability and AI innovation meet real-world impact. As a Senior Solutions Architect, you'll play a pivotal role in helping enterprise clients succeed with large-scale data … Success team, helping to define customer project scope, outcomes, and strategic roadmaps. Lead end-to-end implementation of data reliability platforms, especially across complex data lake and cloud environments. Develop and share best practices for deploying scalable and effective data solutions. Deliver hands-on technical training and enablement to customer engineering and data … opportunities and improve adoption. Manage multiple client engagements with enterprise-level stakeholders and timelines. 3-5 years of experience in a customer-facing technical role, ideally within the data or analytics ecosystem. Strong foundation in cloud data platforms, data lakes, ETL pipelines, BI tools, SQL, APIs; networking knowledge is a bonus. Skilled in solution …
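The data-reliability work described above largely comes down to automated checks over pipeline metadata. A minimal sketch in plain Python; the table history, today's count, and the 50% tolerance are illustrative assumptions, not Monte Carlo's actual product logic:

```python
from statistics import mean

def volume_anomaly(daily_counts, today, tolerance=0.5):
    """Return True if today's row count deviates from the trailing-mean
    baseline by more than `tolerance` (a fraction, e.g. 0.5 = 50%)."""
    baseline = mean(daily_counts)
    return abs(today - baseline) / baseline > tolerance

# Hypothetical row counts for one table over the last four loads
history = [10_000, 10_400, 9_800, 10_100]
print(volume_anomaly(history, 10_200))  # → False: a normal day
print(volume_anomaly(history, 3_000))   # → True: likely broken ingestion upstream
```

Real platforms track many such signals (freshness, schema changes, null rates) per table, but the shape of the check is the same: compare an observed metric against a learned baseline.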
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Applicable Limited
Req ID: 320159 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Talend ETL Expert Developer to join our team remotely in the United States. We are seeking an experienced and … highly skilled Talend ETL Expert Developer for a consulting position. The ideal candidate will have at least 7 years of experience in designing and implementing data integration solutions, with expertise in Talend and related technologies. This role involves working on complex ETL processes, API integrations, and data transformation tasks, as well as contributing to broader data … Relevant certifications in Talend or related technologies. Hands-on experience with Talend 7.3.1 is highly preferred. Experience with cloud platforms, especially AWS S3, and knowledge of data lake architectures. Familiarity with big data tools such as Apache Spark and file formats like Parquet, ORC, and Avro. Experience with NoSQL databases and their integration with Talend.
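The ETL flow the posting describes — extract from a source, transform into a suitable format, land the result — can be sketched without any Talend components. Everything here (the field names, the JSON-lines landing format) is a hypothetical stand-in for what a real Talend job would configure visually:

```python
import csv
import io
import json

def extract(csv_text):
    """Extract: parse raw CSV rows into dicts (stand-in for a Talend input component)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalise names and types, reject rows missing a primary key."""
    out = []
    for row in rows:
        if not row.get("id"):
            continue  # route incomplete rows to a reject flow in a real job
        out.append({
            "id": int(row["id"]),
            "name": row["name"].strip().title(),
            "amount": round(float(row["amount"]), 2),
        })
    return out

def load(records):
    """Load: serialise to JSON lines, a common landing format for a data lake."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)

raw = "id,name,amount\n1, alice ,10.5\n,bob,3\n2,carol,7.25\n"
print(load(transform(extract(raw))))
```

In practice the load step would write Parquet or ORC to S3 rather than JSON lines, but the extract/transform/load decomposition is the same.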
Octopus Energy, a leading force in the energy sector, is seeking a Data Engineer to join their Energy Markets team. This role sits at the intersection of energy trading, forecasting, and technological innovation, supporting the company's mission towards Net Zero transition. The position offers a unique opportunity to work with complex data engineering challenges while … contributing to sustainable energy solutions. As a Data Engineer, you'll be responsible for designing and maintaining critical data pipelines that support core trading, forecasting, risk, and PPA processes across all Octopus international regions. The role leverages a modern tech stack including SQL, Python, Airflow, Kubernetes, and various other cutting-edge technologies. You'll work with … risk and PPA processes Develop automations and alerts for pipeline monitoring Set up and maintain processes for capturing, preparing and loading data into the data lake Design and build operational dashboards Work with international teams to ensure best practices and code standardization Take ownership of data platform improvements Share knowledge and upskill team …
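The pipeline-monitoring duties above (automations and alerts) often reduce to freshness checks of this shape; the table names and the 6-hour threshold below are invented for illustration, not an actual Octopus SLA:

```python
from datetime import datetime, timedelta, timezone

def freshness_alerts(tables, max_age=timedelta(hours=6), now=None):
    """Return alert messages for tables whose last successful load is too old.

    `tables` maps table name -> last-load timestamp. A real deployment would
    run this on a schedule (e.g. as an Airflow task) and page on any output.
    """
    now = now or datetime.now(timezone.utc)
    alerts = []
    for name, last_load in tables.items():
        age = now - last_load
        if age > max_age:
            alerts.append(f"{name}: last load {age.total_seconds() / 3600:.1f}h ago")
    return alerts

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
tables = {
    "trades": datetime(2024, 1, 1, 11, 0, tzinfo=timezone.utc),    # fresh
    "forecasts": datetime(2024, 1, 1, 1, 0, tzinfo=timezone.utc),  # stale
}
print(freshness_alerts(tables, now=now))  # → ['forecasts: last load 11.0h ago']
```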
an SQL Developer working with T-SQL, SSIS, and SSAS (SQL 2014 and newer). Experience with Azure data tools (Data Factory, Data Lake) is a plus. Strong problem-solving skills within complex SQL environments (On-premises & Azure). Excellent communication and interpersonal skills, able to engage at all levels of the business.
Requirements Technical Expertise: Possessing in-depth knowledge of specific technologies, software, and products. Key strengths in the following areas are required: Cloud technologies - Azure Cloud, AWS. Data Lake & Data Warehouse: SparkDB, Azure Databricks Key accountabilities Consulting and Advisory: Analysing client needs, identifying technical challenges, and recommending solutions. Solution Design and Implementation: Developing and implementing technical …
as specialist practices Publicis Media Exchange (PMX), Performics, Publicis Sport & Entertainment, Publicis Media Content and NextTECHnow. Together they combine deep expertise in media investment, strategy, insights and analytics, data and technology, commerce, performance marketing and content. Publicis Media is part of Publicis Groupe and is present in more than 100 countries with over 23,500 employees worldwide. Our … engineering talent, and contribute to the growth of our business Responsibilities Guide a team of engineers in developing applications that empower our clients to optimize marketing campaigns through data-driven insights and automated actions, with a specific focus on leveraging LLMs and AI Own the technical roadmap for your team, aligning it with the overall product strategy and … asynchronous queues (e.g., Kafka, RabbitMQ) and asynchronous APIs Deep understanding of cloud infrastructure (AWS, GCP) and experience deploying and managing applications at scale. Strong understanding of data lake architectures, including experience with data ingestion, storage, processing, and retrieval of large volumes of structured and unstructured data Familiarity with containerization technologies like Docker and …
areas of the codebase. Support scaling the platform’s operational capability by leveraging the existing product pipeline and identifying opportunities for innovation. Collaborate with cross-functional teams including data engineering, web development, and DevOps to deliver integrated solutions. Experience & Qualifications: Significant experience of professional software development in a commercial environment. Proven experience in leading and managing software development teams. … Proficiency in Python programming with strong foundations in object-oriented design and development. Experience working with large datasets, including APIs & data lakes. Extensive experience in data analysis and data engineering. Demonstrated ability to share knowledge and mentor junior and mid-level developers effectively. Experience with back-end development supporting web applications. Familiarity with embedding …
Engineer for Backend services you will be responsible for a range of backend systems and software applications, including the Payment Gateway, Settlement, Transaction Fraud Monitoring, Data Lake Service, and the Core Banking System. You will oversee their deployment, development, enhancements and production operations. You should have extensive experience and skills in Software Engineering, DevOps, project management, incident …
sustainable, more inclusive world. YOUR ROLE Data Architect to lead the modernization of enterprise data platforms by designing and implementing a unified data lake on Google Cloud Platform (GCP). You will be responsible for driving architectural roadmaps, designing ELT/ETL pipelines, and delivering scalable, reusable frameworks for data integration across … impact data solutions across diverse data landscapes. YOUR PROFILE To modernize the enterprise data platforms, architect a solution for a unified data lake on the Google Cloud Platform integrating various data sources for real-time and batch data feeds Lead the design and architectural roadmaps including all Data Platform capabilities such as data ingestion, data access, data lake, and data warehouse Provide technical leadership to project team(s) to perform design-to-deployment activities, provide guidance, participate in reviews, and prevent and resolve technical issues Perform proofs-of-concept to determine feasibility and product evaluation of various integration tools …
Reston, VA/Fort Belvoir As a Sr. Data Scientist on our team, you'll lead the charge in transforming raw data into actionable insights and cutting-edge machine learning solutions that directly impact a mission-critical defense IT system. Your expertise in advanced analytics and AI will help architect the intelligence layer of the platform … you at the forefront of innovation, enabling the implementation of LLMs and streaming pipelines that handle millions of records weekly. You'll also mentor and guide a growing data science team, shaping future technical leaders while driving forward a robust, intelligence-driven infrastructure that advances the mission. You Have: • 5+ years of experience in applied data … data exploration, data cleaning, data analysis, data visualization, or data mining • Experience with production-level systems, data lake environments, and streaming data, including Kafka • Experience implementing end-to-end ML workflows from data prep to deployment and evaluation • Ability to quickly learn infrastructure …
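An end-to-end ML workflow of the kind the requirements list — data prep, training, evaluation — can be shown in miniature. The one-feature nearest-centroid "model" below is a deliberately tiny stand-in, not the posting's actual LLM/streaming stack, and all the data is invented:

```python
from statistics import mean

def prep(raw):
    """Data prep: drop records with missing features (toy stand-in for real cleaning)."""
    return [(x, y) for x, y in raw if x is not None]

def train(data):
    """'Train' a one-feature nearest-centroid classifier: store each class mean."""
    by_class = {}
    for x, y in data:
        by_class.setdefault(y, []).append(x)
    return {label: mean(xs) for label, xs in by_class.items()}

def predict(model, x):
    """Assign the class whose centroid is nearest to x."""
    return min(model, key=lambda label: abs(model[label] - x))

def evaluate(model, data):
    """Evaluation: accuracy on held-out examples."""
    correct = sum(predict(model, x) == y for x, y in data)
    return correct / len(data)

raw = [(1.0, "low"), (1.2, "low"), (None, "low"), (8.0, "high"), (9.0, "high")]
model = train(prep(raw))                                  # centroids: low=1.1, high=8.5
holdout = [(1.1, "low"), (8.5, "high"), (4.0, "high")]
print(evaluate(model, holdout))                           # 2 of 3 correct
```

Deployment would add a serving interface and monitoring around exactly these stages; the prep → train → predict → evaluate decomposition carries over unchanged to real pipelines.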
Interim Data Workstream Lead – 3-Month Contract (Outside IR35) Location: UK-based/Remote Rate: Up to £500 per day | Outside IR35 Duration: 3 months initially Start: ASAP We’re working with a public sector organisation undergoing a major transformation programme and are looking for a contractor to lead the data workstream. This role will suit … migration and transformation—legacy analysis, mapping, cleansing and load Implement and oversee data solutions using Azure-based services (e.g. Synapse, Data Factory, Data Lake, Purview) Ensure robust data governance, ownership, and security controls are in place Identify and manage risks and issues related to data quality, access and integration … Proven experience leading data streams within transformation or change programmes Strong knowledge of Azure data services (e.g. Synapse, Data Factory, Data Lake, Purview) Hands-on experience with data migration, transformation, and governance best practices Confident working across business and technical teams with strong stakeholder engagement skills Experience working in …
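The migration steps listed above — legacy analysis, mapping, cleansing, load — can be sketched as a single record-level transform. The legacy column names and mapping table below are hypothetical, not the organisation's real schema:

```python
# Hypothetical legacy -> target field mapping produced by the analysis phase.
FIELD_MAP = {"CUST_NM": "customer_name", "CUST_EML": "email", "CRT_DT": "created_at"}

def migrate_record(legacy):
    """Map legacy columns to target names, cleansing values as we go."""
    target = {}
    for old_key, new_key in FIELD_MAP.items():
        value = legacy.get(old_key, "")
        value = value.strip()                # cleanse stray whitespace
        if new_key == "email":
            value = value.lower()            # normalise casing for matching
        target[new_key] = value or None      # empty strings become explicit NULLs
    return target

print(migrate_record({"CUST_NM": " Ada Lovelace ", "CUST_EML": "ADA@EXAMPLE.COM", "CRT_DT": ""}))
# → {'customer_name': 'Ada Lovelace', 'email': 'ada@example.com', 'created_at': None}
```

In an Azure Data Factory pipeline the same mapping would live in a data flow or mapping sheet; expressing it as code first makes the rules reviewable and testable before the load runs.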
About Moonbug Entertainment Thank you for considering the Data Lead role with Moonbug Entertainment, an award-winning global entertainment company inspiring kids everywhere to laugh, learn and grow. The company is behind some of the biggest kids' entertainment brands in the world including CoComelon and Blippi. Moonbug believes every child should have access to our entertaining and enriching … Benefits including Free Private Healthcare, Enhanced Maternity and Paternity leave, matched 5% pension scheme, free yoga/fitness wellbeing classes, free weekly lunch and Friday … The Role The Data Lead is responsible for driving Moonbug's data engineering, platform, and governance strategy - ensuring our data infrastructure, pipelines, and practices are robust, scalable, and aligned … the CIO, works alongside the Tech Lead, and closely partners with the ML Lead, DataOps/Governance, and the BI & Analytics function to deliver a unified, high-quality data ecosystem. Key Responsibilities Lead data engineering & platform initiatives across onshore & offshore teams (Data Engineers, DB Admin, API Integration). Enable ML & AI teams by ensuring …
Vortexa is a fast-growing international technology business founded to solve the immense information gap that exists in the energy industry. By using massive amounts of new satellite data and pioneering work in artificial intelligence, Vortexa creates an unprecedented view on the global seaborne energy flows in real-time, bringing transparency and efficiency to the energy markets and … society as a whole. The Role: Processing thousands of rich data points per second from many and vastly different external sources, moving terabytes of data while processing it in real-time, running complex prediction and forecasting AI models, and coupling their output into a hybrid human-machine data refinement process, all presented through a … data daily using AWS, Kubernetes, and Airflow. With solid software engineering fundamentals, fluent in Java and Python (Rust is a plus). Knowledgeable about data lake systems like Athena, and big data storage formats such as Parquet, HDF5, ORC, focusing on data ingestion. Driven by working in an intellectually engaging environment
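Ingesting thousands of points per second into columnar formats like Parquet usually means buffering events and flushing in batches rather than writing row by row. A minimal sketch of that pattern; the list sink stands in for real object storage, and the batch size is illustrative:

```python
class BatchIngestor:
    """Buffer incoming events and flush them in fixed-size batches — the common
    shape of high-rate stream landing, since columnar files amortise their
    overhead over many rows. The sink here is a plain list; a real pipeline
    would write each batch as a Parquet object to S3 or similar."""

    def __init__(self, batch_size, sink):
        self.batch_size = batch_size
        self.sink = sink
        self.buffer = []

    def ingest(self, event):
        self.buffer.append(event)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.sink.append(list(self.buffer))  # hand off a copy, then reset
            self.buffer.clear()

batches = []
ing = BatchIngestor(batch_size=3, sink=batches)
for i in range(7):
    ing.ingest({"seq": i})
ing.flush()  # drain the final partial batch
print([len(b) for b in batches])  # → [3, 3, 1]
```

Production versions also flush on a timer so a quiet stream never leaves events stranded in the buffer; the size-triggered path above is the core of either variant.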
Inside IR35 Duration: 12 months + extension Location: 2 days onsite (London), 3 days remote We are looking for a skilled Database Architect to design and build robust data systems, including databases, data lakes, and enterprise integrations across platforms such as Elastic, Oracle, and ServiceNow. This role will be central to driving data architecture … system design, and migration efforts for critical business systems. Key Responsibilities: Design and build scalable data architectures, including databases and data lakes. Architect and implement integrations between Elastic, Oracle, ServiceNow, and other enterprise platforms. Lead data migration strategies and execution across legacy and modern systems. Develop data models, schemas, and pipelines to … components. Requirements: Proven experience as a data or enterprise architect with hands-on expertise in Elastic, Oracle, and ServiceNow. Strong knowledge of database design, data lake architecture, and system integration. Experience leading end-to-end system and data migrations. Familiarity with modern data stack tools and cloud platforms is a plus.
to a post-go-live BAU model. The successful candidate will play a pivotal role in supporting, maintaining, and developing Infor SunSystems and associated finance systems, driving automation, data integrity, and … operational efficiency across the finance function. Key Responsibilities Provide front-line support, maintenance, and development for Infor SunSystems and associated modules (e.g. Bank Reconciliation, Q&A, Data Lake, interfaces). Act as the SunSystems Super User, delivering training and support to both finance and wider business users. Manage Sun static data, user access, business unit … governance policies, assisting with audits as required. Serve as key liaison with the Group Technology team on technical issues (e.g. SQL/Server/Database/Data Lake/interface). Deliver Business-as-Usual tasks including data loads, reconciliations, and database maintenance. Identify opportunities for system automation and process improvements; support transformation and change …
and interpersonal skills, with the ability to effectively collaborate with cross-functional teams Strong problem-solving and decision-making abilities Knowledge of agile project management methodologies Experience with data analysis and reporting Ability to adapt to changing priorities and work well under pressure Project management certification (e.g., PMP) is a plus Bachelor's degree in a relevant field … Real-Time Monitoring: Implement AI algorithms to detect and flag unusual transactions in real-time. Predictive Analytics: Use machine learning models to predict potential fraud based on historical data and behavioural patterns. 3. Loan Processing Automation Credit Scoring: AI can evaluate creditworthiness more accurately by analyzing a wider range of data points. Document Verification: Automate the … main emphasis of this position is to harness the data from a variety of data tables at the bank and collate a data lake from which to extract a variety of AI reports to strengthen the bank's customer strategy. By strategically implementing AI in these areas, a Digital Banking Operations Manager can greatly …
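The real-time monitoring idea above can be illustrated with the simplest possible detector, a z-score rule over an account's transaction history. Real fraud systems use far richer models and features; the 3-sigma threshold and the amounts below are purely illustrative:

```python
from statistics import mean, stdev

def flag_unusual(history, amount, threshold=3.0):
    """Flag a transaction whose amount lies more than `threshold` standard
    deviations from the account's historical mean — a deliberately simple
    stand-in for the ML models a production fraud system would run."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu  # no variation in history: any change is unusual
    return abs(amount - mu) / sigma > threshold

history = [20.0, 25.0, 22.0, 30.0, 24.0, 26.0]  # invented past transactions
print(flag_unusual(history, 27.0))   # → False: within the account's normal range
print(flag_unusual(history, 500.0))  # → True: flag for review
```

The same check generalises per-merchant, per-time-of-day, or per-geography; each extra dimension tightens the baseline the transaction is compared against.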
powering our marketing technology and performance efforts. As a Junior MarTech Analyst, you will be embedded within the acquisition and performance marketing teams, working closely with Marketing Managers, Data Engineers, and MarTech Operations leads. You'll collaborate with cross-functional partners and wider data teams to connect, segment, and optimise data flows that drive our … advertising platforms. This role offers a great opportunity to develop your technical skills, particularly in SQL and data engineering, while contributing directly to marketing growth and innovation. You'll have the autonomy to support and improve our marketing data infrastructure, helping the team make data-driven decisions that fuel Trainline's success. As a … our data platforms and advertising channels. You'll collaborate with stakeholders to: Connect and configure customer, route, and pricing data from our data lake into marketing platforms via reverse ETL (rETL) and feed management platforms like Census and Channable Segment and manipulate data to ensure it is in the correct format for ad …