Senior DBA
Date: 20 Mar 2026
Location: MU
Company: International SOS
About the role
We are seeking a Senior Database Administrator (DBA) to play a pivotal role within our development team, driving feature enhancements and robust scalability for our flagship product. This role demands a deep understanding of how data from various sources integrates and flows through our systems, enabling a comprehensive, clear view of travel-related information. Your expertise will be crucial not only in managing the underlying database technologies but also in maintaining and optimising the relationships between diverse data sets to support business growth and performance improvements.
The successful candidate will demonstrate strong analytical skills and a proactive approach to problem-solving, particularly in diagnosing issues arising from complex data interactions. This position offers the chance to work in a dynamic environment where technology and data converge to power insightful business decisions and enhance customer experience at an international level.
Join us in a role that combines technical leadership with strategic collaboration, where your contributions will directly impact the scalability and reliability of our data infrastructure and help shape the future of travel safety and security solutions.
Key responsibilities
- Collaborate effectively with cross-functional teams including business stakeholders, data analysts, and data scientists to develop software solutions that provide critical insights into customer and network behaviour, influencing strategic decisions.
- Lead the design, implementation, and maintenance of batch and real-time data pipelines that support business intelligence, analytics, and machine learning initiatives across the organisation.
- Manage and optimise database environments, including relational and NoSQL systems, ensuring data integrity, security, and accessibility.
- Build and maintain data storage and access solutions such as data lakes, APIs, and databases, facilitating the smooth flow and accessibility of data throughout the enterprise.
- Develop robust data extraction and integration modules using established and innovative methods to handle batch and incremental data flows from a variety of sources.
- Utilise automated deployment processes, adhering to best practices in software release and version control, ensuring seamless transitions from development to production environments.
- Monitor the health and performance of the data ecosystem by configuring and maintaining monitoring tools, setting alerts on critical failure points, and providing actionable feedback to data owners and business units on data quality and reliability.
- Design, implement, and maintain automated testing frameworks including unit, integration, and functional tests to ensure data accuracy and system robustness.
- Continuously evaluate and improve data workflows, focusing on performance optimisation, scaling, and security compliance to support evolving business needs.
Job profile
Essential Skills and Competencies:
- Expertise in AWS services including S3-based data lakes, Athena, Redshift, EMR, and RDS, alongside MySQL, MS SQL Server, and PostgreSQL database systems.
- Proficiency in Python for data engineering tasks, utilising libraries such as Pandas and PySpark to process and manipulate data efficiently.
- Experience with object-oriented and systems programming languages including Java, C#, and Rust, enabling versatile development capabilities.
- Strong command of source control practices using Git for version management and collaborative development.
- Advanced knowledge of relational (MS SQL, PostgreSQL, MySQL) and NoSQL databases (DynamoDB, Cassandra) with practical experience in handling complex data types such as JSON and XML.
- Understanding of workflow orchestration concepts to streamline and automate data pipelines and processing workflows.
- Familiarity with diverse operating environments including Windows, Linux, and containerised deployments, to ensure adaptability and continuous integration/delivery.
- Experience working with big data technologies such as Hadoop, Spark, and Kafka, supporting scalable and high-performance data solutions.
- Proven understanding of microservices architecture and integration technologies including APIs, ESB, Kafka, MQ, ETL/ELT processes, Change Data Capture (CDC), B2Bi, and streaming data platforms.
- Strong commitment to security and data privacy principles, ensuring adherence to industry best practices and regulatory requirements.
Core Competencies:
- Advanced relational SQL skills, particularly with MS SQL Server, combined with hands-on experience managing NoSQL databases.
- Proven ability to design, deploy, and maintain large-scale, performance-optimised data systems that support complex analytics and machine learning workloads.
- Demonstrated capacity to work in collaborative, multi-disciplinary teams, effectively communicating technical concepts to non-technical stakeholders.
- Strong organisational and project management skills with a focus on delivering high-quality results on time.
- A proactive learner with the ability to adapt to emerging technologies and evolving business needs.