We are seeking a skilled Data Engineer with hands-on experience in FHIR (Fast Healthcare Interoperability Resources) to join our growing data team. This role focuses on designing, building, and maintaining healthcare data pipelines and operational data stores in cloud environments.
- This is an initial 12-month contract
- Work on exciting projects within the MedTech/healthcare domain
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes based on the FHIR standard
- Implement and optimize Operational Data Stores (ODS) for healthcare data using FHIR on cloud platforms (preferably Google Cloud Platform)
- Integrate structured and unstructured healthcare data from various sources into a unified data model
- Develop solutions for streaming data ingestion, transformation, and real-time analytics
- Collaborate with data architects, analysts, and stakeholders to ensure data accuracy, security, and performance
- Contribute to the definition and implementation of data governance and quality standards
Required Skills & Experience:
- Strong experience working with FHIR and healthcare data interoperability standards
- Hands-on experience with cloud platforms (GCP preferred) for data engineering and storage
- Proficiency in Python, SQL, and data pipeline frameworks (e.g., Apache Beam, Dataflow, or similar)
- Experience with data streaming technologies (Kafka, Pub/Sub, etc.)
- Familiarity with data warehousing, data modeling, and API-based data integration
- Ability to work independently in a remote setup with occasional onsite collaboration
Nice to Have:
- Experience with BigQuery, FHIR servers, or healthcare APIs
- Understanding of data privacy regulations (e.g., HIPAA, PDPA)
Argyll Scott Consulting Pte Ltd
Argyll Scott Asia is acting as an Employment Business in relation to this vacancy.