Lead Data Scientist/Data Analyst
Kalman & Company is seeking a highly motivated, performance-driven individual with strong data analysis experience to work closely with our Department of Defense (DoD) client in the Northern VA area. Successful candidates will have …
across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark, to name a few. We work across a variety of data architectures such as data mesh, lakehouse, data vault and data warehouse. Our data engineers create pipelines that … scalable, robust code using Python or similar programming languages. A background in software engineering is a plus. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka … such as AWS, Azure, or GCP for deploying and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying). Apache Spark (for distributed data processing). Apache Spark Streaming, Kafka or similar (for real-time data streaming). Experience using data tools in at least one …
problems and building solutions
5) Be the domain expert and have knowledge of data availability from various sources.
6) Execute solutions with scalable development practices in scripting; write and optimize SQL queries; deliver reporting, data extraction and data visualization.
7) Proactively and independently work with stakeholders to construct use cases and associated standardized outputs for your work.
8) Actively manage the timeline …
of Excel or Tableau (data manipulation, macros, charts and pivot tables)
- Knowledge of data visualization tools such as QuickSight, Tableau, Power BI or other BI packages
- Fluency in SQL and ETL
PREFERRED QUALIFICATIONS
- Knowledge of data modeling and data pipeline design
- Master's degree in Business, Engineering, Statistics, Computer Science, Data Science, Mathematics or related field
- Experience with at least …
team members and the wider business community. About the Candidate The ideal candidate will possess the following: Expert in the Microsoft Azure stack. Experience using data tools such as SQL, Excel, or Python. Expertise in delivering data models within data visualisation tools, ideally with Power BI exposure. Strong communication and organisation skills. Strong analytical thinking and problem-solving skills. Ability …
Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage, and Azure SQL Database. Experience working with large datasets and complex data pipelines. Experience with data architecture design and data pipeline optimization. Proven expertise with Databricks, including hands-on implementation experience and certifications. … Experience with SQL and NoSQL databases. Experience with data quality and data governance processes. Experience with version control systems (e.g., Git). Experience with Agile development methodologies. Excellent communication, interpersonal, and problem-solving skills. Experience with streaming data technologies (e.g., Kafka, Azure Event Hubs). Experience with data visualisation tools (e.g., Tableau, Power BI). Experience with DevOps tools and …
Washington, Washington DC, United States Hybrid / WFH Options
Zolon Tech, Inc
data-driven initiatives for a federal program. This role will focus on designing and building modern reporting solutions, automating business processes, and integrating data across platforms such as SharePoint, SQL, and Excel. The ideal candidate brings both technical expertise and a practical understanding of business analytics and workflow automation in a federal environment. Key Responsibilities: Design, develop, and maintain Power … dashboards, and data models using DAX Build and support low-code/no-code applications using Microsoft Power Apps and automated workflows using Power Automate Integrate data from SharePoint, SQL Server, Excel, and other data sources into unified reporting solutions Collaborate with stakeholders to gather requirements and translate them into actionable data visualizations and tools Create reusable templates and workflows … including DAX, data modeling, and advanced visuals) Strong experience with Microsoft Power Platform, especially Power Apps and Power Automate Proven ability to integrate data from multiple systems, including SharePoint, SQL, and Excel Solid understanding of data governance, security roles, and user access management in Microsoft environments Familiarity with Agile methodologies and working within federal or highly regulated environments Strong problem …
to provide ongoing analytical support. Basic Qualifications Bachelor's degree or higher in a quantitative/technical field (e.g., Computer Science, Statistics, Engineering). 2+ years of experience writing SQL queries. Experience with building and maintaining data artifacts (ETL, data models, queries). Experience with AWS services including S3, Redshift, EMR, Kinesis, and RDS. Experience delivering end-to-end projects … related to data storage and computing. Experience with data visualization tools (e.g., Tableau, QuickSight, Power BI) and statistical methods. 2+ years analyzing data with Redshift, Oracle, NoSQL, etc. Proficiency in SQL and scripting (Python) for data processing. Preferred Qualifications Inquisitive mindset with problem-solving skills and passion for big data. Experience building multi-dimensional data models. Experience with distributed data systems … for large datasets. Proficiency in regression, classification, and cluster analysis. Knowledge of data visualization/reporting software (e.g., Tableau). Advanced SQL, data mining skills, and analytical tools (R/Python/SAS). We are committed to an inclusive culture. If you need workplace accommodations during the application process, please visit this link.
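The "data artifacts (ETL, data models, queries)" requirement in this listing refers to the extract-transform-load pattern. A minimal, self-contained sketch of that pattern (all names and data are illustrative, not taken from any listing; in-memory SQLite stands in for the Redshift/RDS stores named above):

```python
import sqlite3

# Extract: raw rows as they might arrive from an upstream source.
raw = [
    {"user": " Alice ", "spend": "120.50"},
    {"user": "bob",     "spend": "80"},
]

# Transform: normalise names and cast string amounts to numbers.
clean = [(r["user"].strip().title(), float(r["spend"])) for r in raw]

# Load: write into a queryable reporting table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spend (user TEXT, amount REAL)")
conn.executemany("INSERT INTO spend VALUES (?, ?)", clean)

# Downstream query over the loaded data.
total = conn.execute("SELECT SUM(amount) FROM spend").fetchone()[0]
print(total)  # 200.5
```

In a production pipeline the same three steps would typically be orchestrated by a scheduler and target a warehouse rather than an in-memory database, but the shape of the work is the same.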
in designing and building scalable data pipelines. Should have excellent knowledge of data warehouse/data lake technology and business intelligence concepts. Should have good knowledge of relational, NoSQL and big data databases and should be able to write complex SQL queries. Should have strong implementation experience in all of the below technology areas (breadth) and deep technical expertise in some … of the below technologies:
Data integration – ETL tools like IBM DataStage, Talend and Informatica; ingestion mechanisms like Flume & Kafka.
Data modelling – dimensional & transactional modelling using RDBMS, NoSQL and big data technologies.
Data visualization – tools like Tableau.
Big data – Hadoop ecosystem; distributions like Cloudera/Hortonworks; Pig and HIVE.
Data processing frameworks – Spark & Spark Streaming.
Hands-on experience with … multiple databases like PostgreSQL, Snowflake, Oracle, MS SQL Server, NoSQL (HBase/Cassandra, MongoDB). Experience in the cloud data ecosystem (AWS, Azure or GCP) in the data engineering space, with at least a few complex & high-volume data projects as an architect, is mandatory. Contributing to developing and maintaining cloud governance frameworks, policies, and procedures. Proficient in integrating cloud services with …
data solutions using Microsoft technologies, and deliver training to empower end users. PRINCIPAL MISSIONS Design and implement data pipelines using Azure Data Factory or Microsoft Fabric. Develop and maintain SQL-based transformations and data models (e.g., star schema, snowflake) in SQL Server, Fabric Data Warehouse/Lakehouse. Build and optimize Power BI dashboards and reports to support business decision-making. Collaborate … across all solutions. Provide training and support to end users on BI tools and data literacy. REQUIREMENTS Previous experience as a BI Consultant, Data Analyst, or Analytics Engineer. Strong SQL scripting and data modeling skills. Proficiency in Power BI (data modeling, DAX, report design). Experience with Azure Data Factory and/or Microsoft Fabric for pipeline development (or Python …
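Several of these listings ask for dimensional models such as star schemas. As a minimal illustration (table and column names are hypothetical, and in-memory SQLite stands in for SQL Server or a Fabric warehouse): a star schema joins one central fact table to surrounding dimension tables, and analytical queries aggregate the facts sliced by dimension attributes.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical star schema: fact_sales at the centre, two dimensions around it.
cur.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales  (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    );
""")
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, 2024, 1), (20240201, 2024, 2)])
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Licence", "Software")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(20240101, 1, 100.0), (20240101, 2, 250.0), (20240201, 1, 50.0)])

# Typical BI query: aggregate the fact table, grouped by a dimension attribute.
rows = cur.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [('Hardware', 150.0), ('Software', 250.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalised (e.g., `category` split into its own table).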
diverse, and the post holder should be prepared to use the right tool for the job. The skills relevant to the role include (but are not limited to) SQL, ETL, SQL Server 2022 (including SSIS), PowerShell, C#, Azure, DevOps, Git, Power BI, Power Automate, Automated Testing, Kanban/Scrum. Digital change within the NHS is gathering pace, and a … employees. We welcome applications from people in all under-represented groups. Job description Job responsibilities Design, develop and test solutions using a range of technologies and platforms, but primarily SQL and SQL Server Implement data solutions that handle concerns such as ETL, data quality, duplication, different formats and data structures, performance, and scalability Make effective use of development practices such …
Wymondham, Norfolk, England, United Kingdom Hybrid / WFH Options
DMR Personnel Ltd
web layout design; as well as debugging and bug fixes. You'll be working across the full Microsoft stack (C# .NET Core, ASP.NET MVC, Entity Framework and SQL) as well as using a range of web technologies including JavaScript and HTML/CSS. Key Responsibilities: Full Stack Development: Develop, test, and maintain both front-end and back-end … best practices. Conduct code reviews, provide feedback, and mentor junior developers when required. Design and implement RESTful APIs and services using ASP.NET Core. Ensure database performance and scalability with SQL Server and/or other database technologies. Implement responsive and user-friendly front-end interfaces using modern JavaScript and styling/CSS frameworks Ensure compatibility across different browsers and devices. … Strong experience with .NET technologies, particularly ASP.NET Core and C#. Proficiency with front-end technologies like HTML5, CSS3, JavaScript, and modern JS frameworks. Experience with relational databases such as SQL Server. Familiarity with cloud platforms (e.g., Azure, AWS) and DevOps practices. Experience with version control systems (e.g., Git). Ability to analyse and resolve complex technical issues. Strong debugging and …
through data modelling and the ETL process, to data visualisation and generating insight - you'll get to experience it all! Main duties will include working with relational databases with MS SQL Server and ETL tools such as SSIS. You will also perform analysis of data loads and implement data warehouse design theory and data modelling techniques. Producing reports and data visualisation … database design and BI systems tools Understand basic concepts of BI architecture such as ETL, data warehousing and data modelling Ability to manipulate, model and analyse data using Excel, SQL, QlikView, Qlik Sense Experience developing reports using data visualisation tools such as QlikView, Qlik Sense, Tableau or Power BI Experience in processes related to data collection, modelling and processing data … to existing NIHR Academy technology An understanding of data protection and information governance including GDPR and the importance of reporting standards and report definitions Desirable Experience creating ad hoc SQL queries using T-SQL Experience of project management Advanced development and application of Google platforms, Microsoft Office and databases Contributed to peer reviewed publications Experience of Microsoft Visual Studio (SSIS …
Melville, New York, United States Hybrid / WFH Options
Canon U.S.A., Inc
degree in Computer Science or a related discipline required, plus 5 years of related experience At least 5 years of professional programming experience Extensive experience programming in C#, SQL, JS, XML and HTML, the most important being C# and SQL Real successes in optimizing C# algorithms or SQL transactions Experience with a technology stack of ASP.NET, ASP.NET Core, MS SQL Server, IIS …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
of experience in a data engineering or similar technical role Hands-on experience with key Microsoft Azure services: Azure Data Factory Azure Synapse Analytics Azure Data Lake Storage Azure SQL Database Solid understanding of data modeling, ETL/ELT, and warehousing concepts Proficiency in SQL and one or more programming languages (e.g., Python, Scala) Exposure to Microsoft Fabric, or a …
techniques, including software version control. Knowledge of various Workday cloud platform technologies like EIB, Core Connectors, Document Transformation, Workday Studio and Workday Extend. Thorough knowledge of Oracle database architecture, SQL, PL/SQL, and Linux/Unix shell scripting. Demonstrated ability to work with customers to define requirements, create technical designs, and build solutions that meet or exceed user expectations. Demonstrated proficiency … scripting languages such as bash. Understanding of business principles as related to the various functional areas in Higher Education environments. Familiarity with database technologies such as Oracle, PL/SQL, SQL, and MySQL. Thorough understanding of the Applications Development Lifecycle including Agile Methodologies and CI/CD utilizing build tools such as Jenkins, Application Administration, and Applications Support disciplines. Thorough understanding …
to sprint planning and reviews Document data engineering processes, architectures, and best practices Requirements: 5+ years of experience in data engineering, software development, or related fields. Strong experience with SQL and relational databases, particularly PostgreSQL Expertise with SQL scripting, database indexing, and optimization techniques Proficient in working with views, triggers, and stored procedures Experience in developing, creating, and modifying data …
efficiency, and quality. Evaluate and prototype new tools or technologies, recommending enhancements to the existing data stack. Person Specification Qualifications Essential Knowledge and proficiency in working with database systems (SQL/PostgreSQL) and writing performant SQL queries. Proficiency in optimising database queries, pipelines, and storage for speed, scalability, and cost-efficiency. Skilled in using Git. Experience Essential At least …
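The "performant SQL queries" and indexing requirements in the two listings above come down to making the query planner search an index instead of scanning a table. A small sketch of the idea (SQLite used purely as a stand-in for PostgreSQL; table and index names are hypothetical), using `EXPLAIN QUERY PLAN` to make the difference visible:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO events (user_id, payload) VALUES (?, ?)",
                 [(i % 1000, "x") for i in range(10_000)])

query = "SELECT COUNT(*) FROM events WHERE user_id = ?"

# Without an index on user_id, the planner must scan every row.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# An index on the filtered column lets the planner search instead of scan.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print(plan_before[0][-1])  # e.g. "SCAN events"
print(plan_after[0][-1])   # e.g. "SEARCH events USING ... idx_events_user (user_id=?)"
```

PostgreSQL exposes the same information through `EXPLAIN`/`EXPLAIN ANALYZE`; the workflow of reading the plan, adding or adjusting an index, and re-checking the plan is the same.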
Alternatively, relevant experience in the data engineering field Databricks, including Unity Catalog Terraform, defining, deploying, and managing cloud infrastructure as code Proficiency in programming languages such as Python, Spark, SQL Strong experience with SQL databases Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF) Experience with cloud platforms (Azure preferred) and related data services Excellent problem-solving …
model evaluation and claims analyses Investigate catastrophe model exposure and losses to assess key drivers of model differences Analyze catastrophe model output, claims data, and other data sources using SQL, R, Python, QGIS, Tableau, etc. Create professional reports and visualizations to clearly communicate findings to internal and external stakeholders Assist clients in the understanding of catastrophe risks, sensitivity studies and … using catastrophe modeling software and interpreting modeled loss output is preferred Experience working with large peril-related datasets from NOAA, FEMA, USGS, and more is preferred Programming experience using SQL, R, or Python is preferred Data visualization and mapping experience using Tableau, Power BI, or QGIS is preferred Education Related B.S. degree in meteorology, climatology, or engineering, with a strong …
organise and produce work within deadlines. Skills • Good project and people management skills. • Excellent data development skills. • Excellent data manipulation and analysis skills using a variety of tools including SQL, Python, AWS services and the MSBI stack. • Ability to prioritise and be flexible to change those priorities at short notice. • Commercial acumen. • Able to demonstrate a practical approach to problem … design (Kimball and lakehouse, medallion and data vault) is a definite preference, as is knowledge of other data tools and programming languages such as Python & Spark, and strong SQL experience. • Experience in building data lakes and CI/CD data pipelines • A candidate is expected to understand and be able to demonstrate experience across the delivery lifecycle, and understand both …
deployment; data flow management; implementing data lifecycle policies; troubleshooting data access issues; and developing data models Familiar with data modeling. Familiar with relational databases, such as MySQL, that utilize SQL queries. Familiar with using Java for data processing, manipulation or querying (SQL or NoSQL) Familiar with ETL/Data Integration using Spring, NiFi, Kafka, and Elasticsearch. Familiar with development in …
Effective communication and teamwork skills. Preferred Skills: Experience with cloud platforms (e.g., AWS, Azure, Google Cloud). Knowledge of DevOps practices and CI/CD pipelines. Database experience with SQL and/or NoSQL systems. All levels of experience available. Salary range dependent upon experience.