Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage, and Azure SQL Database. Experience working with large datasets and complex data pipelines. Experience with data architecture design and data pipeline optimization. Proven expertise with Databricks, including hands-on implementation experience and certifications. … Experience with SQL and NoSQL databases. Experience with data quality and data governance processes. Experience with version control systems (e.g., Git). Experience with Agile development methodologies. Excellent communication, interpersonal, and problem-solving skills. Experience with streaming data technologies (e.g., Kafka, Azure Event Hubs). Experience with data visualisation tools (e.g., Tableau, Power BI). Experience with DevOps tools and …
Washington, Washington DC, United States Hybrid / WFH Options
Zolon Tech, Inc
data-driven initiatives for a federal program. This role will focus on designing and building modern reporting solutions, automating business processes, and integrating data across platforms such as SharePoint, SQL, and Excel. The ideal candidate brings both technical expertise and a practical understanding of business analytics and workflow automation in a federal environment. Key Responsibilities: Design, develop, and maintain Power … dashboards, and data models using DAX Build and support low-code/no-code applications using Microsoft Power Apps and automated workflows using Power Automate Integrate data from SharePoint, SQL Server, Excel, and other data sources into unified reporting solutions Collaborate with stakeholders to gather requirements and translate them into actionable data visualizations and tools Create reusable templates and workflows … including DAX, data modeling, and advanced visuals) Strong experience with Microsoft Power Platform, especially Power Apps and Power Automate Proven ability to integrate data from multiple systems, including SharePoint, SQL, and Excel Solid understanding of data governance, security roles, and user access management in Microsoft environments Familiarity with Agile methodologies and working within federal or highly regulated environments Strong problem …
to provide ongoing analytical support. Basic Qualifications Bachelor's degree or higher in a quantitative/technical field (e.g., Computer Science, Statistics, Engineering). 2+ years of experience writing SQL queries. Experience with building and maintaining data artifacts (ETL, data models, queries). Experience with AWS services including S3, Redshift, EMR, Kinesis, and RDS. Experience delivering end-to-end projects … related to data storage and computing. Experience with data visualization tools (e.g., Tableau, QuickSight, Power BI) and statistical methods. 2+ years analyzing data with Redshift, Oracle, NoSQL, etc. Proficiency in SQL and scripting (Python) for data processing. Preferred Qualifications Inquisitive mindset with problem-solving skills and passion for big data. Experience building multi-dimensional data models. Experience with distributed data systems … for large datasets. Proficiency in regression, classification, and cluster analysis. Knowledge of data visualization/reporting software (e.g., Tableau). Advanced SQL, data mining skills, and analytical tools (R/Python/SAS). We are committed to an inclusive culture. If you need workplace accommodations during the application process, please visit this link.
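The "2+ years writing SQL queries" and "data artifacts (ETL, data models, queries)" requirements above typically cover analytical patterns such as "latest record per key". As an illustrative sketch only (the table and column names are invented, and Python's built-in sqlite3 stands in for Redshift or Oracle), the window-function version looks like this:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer TEXT, placed_at TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", [
    (1, "alice", "2024-01-01", 30.0),
    (2, "alice", "2024-02-01", 50.0),
    (3, "bob",   "2024-01-15", 20.0),
])

# Latest order per customer: number each customer's orders newest-first,
# then keep only the first row of each partition.
rows = conn.execute("""
SELECT customer, order_id, total
FROM (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY customer ORDER BY placed_at DESC) AS rn
    FROM orders
)
WHERE rn = 1
ORDER BY customer
""").fetchall()
print(rows)  # [('alice', 2, 50.0), ('bob', 3, 20.0)]
```

The same `ROW_NUMBER() OVER (PARTITION BY …)` pattern carries over to Redshift and Oracle with minimal changes.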
in designing and building scalable data pipelines. Should have excellent knowledge in data warehouse/data lake technology and business intelligence concepts Should have good knowledge in Relational, NoSQL, and Big Data databases and should be able to write complex SQL queries Should have strong implementation experience in all the below technology areas (breadth) and deep technical expertise in some … of the below technologies Data integration – ETL tools like IBM DataStage, Talend and Informatica. Ingestion mechanisms like Flume & Kafka. Data modelling – Dimensional & transactional modelling using RDBMS, NoSQL and Big Data technologies. Data visualization – Tools like Tableau Big data – Hadoop ecosystem, distributions like Cloudera/Hortonworks, Pig and HIVE Data processing frameworks – Spark & Spark Streaming Hands-on experience with … multiple databases like PostgreSQL, Snowflake, Oracle, MS SQL Server, NoSQL (HBase/Cassandra, MongoDB) Experience in the cloud data ecosystem - AWS, Azure or GCP in the data engineering space with at least a few complex & high-volume data projects as an architect is mandatory Contributing to developing and maintaining cloud governance frameworks, policies, and procedures Proficient in integrating cloud services with …
data solutions using Microsoft technologies, and deliver training to empower end users. PRINCIPAL MISSIONS Design and implement data pipelines using Azure Data Factory or Microsoft Fabric. Develop and maintain SQL-based transformations and data models (e.g., star schema, snowflake schema) in SQL Server, Fabric Data Warehouse/Lakehouse. Build and optimize Power BI dashboards and reports to support business decision-making. Collaborate … across all solutions. Provide training and support to end users on BI tools and data literacy. REQUIREMENTS Previous experience as a BI Consultant, Data Analyst, or Analytics Engineer. Strong SQL scripting and data modeling skills. Proficiency in Power BI (data modeling, DAX, report design). Experience with Azure Data Factory and/or Microsoft Fabric for pipeline development (or Python …
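The star-schema modelling named in the listing above can be sketched in miniature. This is an illustrative example only: the table and column names are invented, and Python's built-in sqlite3 stands in for SQL Server or a Fabric warehouse; the shape (fact table with foreign keys into dimension tables, aggregated by dimension attributes) is the point:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes, one row per product.
cur.execute("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
)""")

# Fact table: measures plus foreign keys into the dimensions.
cur.execute("""
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
)""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(10, 1, 2, 20.0), (11, 1, 1, 10.0), (12, 2, 3, 45.0)])

# A typical BI query: aggregate the fact table, grouped by a dimension attribute.
rows = cur.execute("""
SELECT p.product_name, SUM(f.amount) AS revenue
FROM fact_sales f
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY p.product_name
ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('Gadget', 45.0), ('Widget', 30.0)]
```

A snowflake schema differs only in that the dimension tables are themselves normalized (e.g., `category` split into its own table referenced from `dim_product`).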
diverse, and the post holder should be prepared to use the right tool for the job. The type of skills relevant to the role are (but not limited to) SQL, ETL, SQL Server 2022 (including SSIS), PowerShell, C#, Azure, DevOps, Git, Power BI, Power Automate, Automated Testing, Kanban/Scrum. Digital change within the NHS is gathering pace, and a … employees. We welcome applications from people in all under-represented groups. Job description Job responsibilities Design, develop and test solutions using a range of technologies and platforms, but primarily SQL and SQL Server Implement data solutions that handle concerns such as ETL, data quality, duplication, different formats and data structures, performance, and scalability Make effective use of development practices such …
Wymondham, Norfolk, England, United Kingdom Hybrid / WFH Options
DMR Personnel Ltd
web layout design, as well as debugging and bug fixes. You'll be working across the full Microsoft stack: C# .NET Core, ASP.NET MVC, Entity Framework and SQL, as well as using a range of web technologies including JavaScript and HTML/CSS. Key Responsibilities: Full Stack Development: Develop, test, and maintain both front-end and back-end … best practices. Conduct code reviews, provide feedback, and mentor junior developers when required. Design and implement RESTful APIs and services using ASP.NET Core. Ensure database performance and scalability with SQL Server and/or other database technologies. Implement responsive and user-friendly front-end interfaces using modern JavaScript and styling/CSS frameworks Ensure compatibility across different browsers and devices. … Strong experience with .NET technologies, particularly ASP.NET Core and C#. Proficiency with front-end technologies like HTML5, CSS3, JavaScript, and modern JS frameworks. Experience with relational databases such as SQL Server. Familiarity with cloud platforms (e.g., Azure, AWS) and DevOps practices. Experience with version control systems (e.g., Git). Ability to analyse and resolve complex technical issues. Strong debugging and …
Melville, New York, United States Hybrid / WFH Options
Canon U.S.A., Inc
degree in Computer Science or a related discipline required, plus 5 years of related experience At least 5 years of professional programming experience Extensive experience programming in C#, SQL, JS, XML, and HTML, the most important being C# and SQL Demonstrated success in optimizing C# algorithms or SQL transactions Experience with the technology stack of ASP.NET, ASP.NET Core, MS SQL Server, IIS …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
of experience in a data engineering or similar technical role Hands-on experience with key Microsoft Azure services: Azure Data Factory Azure Synapse Analytics Azure Data Lake Storage Azure SQL Database Solid understanding of data modeling, ETL/ELT, and warehousing concepts Proficiency in SQL and one or more programming languages (e.g., Python, Scala) Exposure to Microsoft Fabric, or a …
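The ETL/ELT concepts this listing asks for reduce to three steps: extract from a source, transform (clean, filter, cast), load into a target store. A minimal sketch, with invented data and Python's built-in csv and sqlite3 modules standing in for Data Factory and Azure SQL:

```python
import csv
import io
import sqlite3

# Extract: source data, here a CSV held in memory (a file or API in practice).
raw = io.StringIO("order_id,amount_gbp\n1, 10.50\n2,\n3, 7.25\n")
records = list(csv.DictReader(raw))

# Transform: drop rows with missing amounts and cast text to typed values.
cleaned = [
    (int(r["order_id"]), float(r["amount_gbp"]))
    for r in records
    if r["amount_gbp"].strip()
]

# Load: write the cleaned rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount_gbp REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)

total = conn.execute("SELECT COUNT(*), SUM(amount_gbp) FROM orders").fetchone()
print(total)  # (2, 17.75) — one malformed row was dropped in the transform step
```

In ELT the order of the last two steps flips: raw data is loaded first and the transformation runs as SQL inside the warehouse.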
techniques, including software version control. Knowledge of various Workday cloud platform technologies like EIB, Core Connectors, Document Transformation, Workday Studio and Workday Extend. Thorough knowledge of Oracle database architecture, SQL, PL/SQL, and Linux/Unix shell scripting. Demonstrated ability to work with customers to define requirements, create technical designs, and build solutions that meet or exceed user expectations. Demonstrated proficiency … scripting languages such as bash. Understanding of business principles as related to the various functional areas in Higher Education environments. Familiarity with database technologies such as Oracle, PL/SQL, SQL, and MySQL. Thorough understanding of the Applications Development Lifecycle including Agile methodologies and CI/CD utilizing build tools such as Jenkins, Application Administration, and Applications Support disciplines. Thorough understanding …
to sprint planning and reviews Document data engineering processes, architectures, and best practices Requirements: 5+ years of experience in data engineering, software development, or related fields. Strong experience with SQL and relational databases, particularly PostgreSQL Expertise with SQL scripting, database indexing, and optimization techniques Proficient in working with views, triggers, and stored procedures Experience in developing, creating, and modifying data …
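Two of the three database objects this listing names (views and triggers) can be sketched with Python's built-in sqlite3; SQLite has no stored procedures, so this illustrative example (all names invented) covers only the first two. In PostgreSQL the trigger body would instead live in a PL/pgSQL function:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL NOT NULL);
CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);

-- A view: a named, read-only query exposing a derived slice of the data.
CREATE VIEW overdrawn AS
    SELECT id, balance FROM accounts WHERE balance < 0;

-- A trigger: records every balance change automatically, no caller code needed.
CREATE TRIGGER log_balance_change
AFTER UPDATE OF balance ON accounts
BEGIN
    INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
END;
""")

conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
conn.execute("UPDATE accounts SET balance = -25.0 WHERE id = 1")

overdrawn = conn.execute("SELECT * FROM overdrawn").fetchall()
audit = conn.execute("SELECT * FROM audit_log").fetchall()
print(overdrawn)  # [(1, -25.0)]
print(audit)      # [(1, 100.0, -25.0)]
```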
efficiency, and quality. Evaluate and prototype new tools or technologies, recommending enhancements to the existing data stack. Person Specification Qualifications Essential Knowledge and proficiency in working with database systems (SQL/PostgreSQL) and writing performant SQL queries. Proficiency in optimising database queries, pipelines, and storage for speed, scalability, and cost-efficiency. Skilled in using Git. Experience Essential At least …
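"Writing performant SQL queries" usually starts with reading the query plan and indexing the filtered columns. A hedged sketch (invented schema; Python's built-in sqlite3 stands in for PostgreSQL, where the equivalent tool is `EXPLAIN ANALYZE`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO events (user_id, payload) VALUES (?, ?)",
                 [(i % 100, "x") for i in range(1000)])

query = "SELECT COUNT(*) FROM events WHERE user_id = ?"

# Without an index the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan_before[0][-1])  # e.g. "SCAN events"

# An index on the filtered column lets the planner seek instead of scan.
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan_after[0][-1])  # e.g. "SEARCH events USING COVERING INDEX idx_events_user ..."
```

The trade-off: each index speeds up matching reads but adds write and storage cost, so indexes are chosen from observed query plans rather than added speculatively.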
Alternatively, relevant experience in the data engineering field Databricks, including Unity Catalog Terraform, defining, deploying, and managing cloud infrastructure as code Proficiency in programming languages such as Python, Spark, SQL Strong experience with SQL databases Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF) Experience with cloud platforms (Azure preferred) and related data services Excellent problem-solving …
model evaluation and claims analyses Investigate catastrophe model exposure and losses to assess key drivers of model differences Analyze catastrophe model output, claims data, and other data sources using SQL, R, Python, QGIS, Tableau, etc. Create professional reports and visualizations to clearly communicate findings to internal and external stakeholders Assist clients in the understanding of catastrophe risks, sensitivity studies and … using catastrophe modeling software and interpreting modeled loss output is preferred Experience working with large peril-related datasets from NOAA, FEMA, USGS, and more is preferred Programming experience using SQL, R, or Python is preferred Data visualization and mapping experience using Tableau, Power BI, or QGIS is preferred Education Related B.S. degree in meteorology, climatology, or engineering, with a strong …
organise and produce work within deadlines. Skills • Good project and people management skills. • Excellent data development skills. • Excellent data manipulation and analysis skills using a variety of tools including SQL, Python, AWS services and the MSBI stack. • Ability to prioritise and be flexible to change those priorities at short notice. • Commercial acumen. • Able to demonstrate a practical approach to problem … design (Kimball, lakehouse, medallion and data vault) is a definite preference, as is knowledge of other data tools and programming languages such as Python & Spark, and strong SQL experience. • Experience in building data lakes and CI/CD data pipelines • Candidates are expected to understand and demonstrate experience across the delivery lifecycle and understand both …
deployment; data flow management; implementing data lifecycle policies; troubleshooting data access issues; and developing data models Familiar with data modeling. Familiar with relational databases, such as MySQL, that utilize SQL queries. Familiar with using Java for data processing, manipulation or querying (SQL or NoSQL) Familiar with ETL/Data Integration using Spring, NiFi, Kafka, and Elasticsearch. Familiar with development in …
Effective communication and teamwork skills. Preferred Skills: Experience with cloud platforms (e.g., AWS, Azure, Google Cloud). Knowledge of DevOps practices and CI/CD pipelines. Database experience with SQL and/or NoSQL systems. All Levels of Experience Available Salary range dependent upon experience.
Bachelor's degree in Computer Science, Information Technology, or related field. • Proven experience in developing solutions with Power BI, Power Apps, and Power Automate. • Strong understanding of data modeling, SQL, and data warehousing principles. • Familiarity with Microsoft 365 and Azure services. • Certifications in Microsoft Power Platform or related areas. • Experience with AI Builder and other AI integration within the Power …
trends. Recommend and implement enhancements to improve user experience and operational efficiency. Requirements: Mandatory: Hands-on experience with Palantir Foundry in a production environment. Strong programming skills in Python, SQL, and JavaScript. Experience with data pipelines, APIs, and web scraping. Familiarity with cloud platforms (Azure, AWS, or GCP). Proficiency in data visualisation tools (e.g., Power BI, Tableau, or …
API development and data onboarding with energy data providers Hands-on experience with AWS services like S3, Lambda, and EC2 Manage data storage using relational databases such as MS SQL Translate business needs into technical designs and improve existing solutions Ensure operational robustness, high availability, security, and monitoring Excellent communication skills in English We value passion, willingness to learn, resilience …
Maidenhead, Berkshire, United Kingdom Hybrid / WFH Options
dynaTrace software GmbH
Computer Science, Computer Engineering, Information Technology, Information Systems, or a related technical discipline. Experience with application technologies (J2EE, .NET, Citrix, Microservices). Experience with database technologies (Oracle, DB2, MS SQL). Good understanding of distributed applications. Good understanding of web and enterprise applications. Ability to address complex application environments to provide customers with clear guidance on implementation strategy and potential …
Developer with Angular 12+ and .NET C# expertise strong knowledge in software architecture, design patterns (SOLID, DDD, DRY), and RESTful API development proficient in HTML, CSS, JavaScript, TypeScript, and SQL/NoSQL databases experienced with cloud platforms (AWS/Azure), Docker, Kubernetes, and CI/CD tools (Jenkins, GitLab) domain knowledge in refinish paint software, including spectrophotometer integration, color matching …
integrated into delivery cycles. What we are looking for: Essential: Proven experience (2+ years) as a Data Analyst, preferably within a retail or consumer-focused environment. Strong proficiency in SQL for querying and manipulating large datasets. Expertise in data visualization tools such as Tableau, Power BI, Looker, or similar, with a portfolio of impactful dashboards. Advanced Microsoft Excel skills (pivot …