Leading international financial services group in Hong Kong that invests heavily in data. At the end of 2018, they had more than 34,000 employees, over 82,000 agents, and thousands of distribution partners, serving almost 28 million customers.
They operate globally across Asia, Canada and the USA, and value innovation, agile project delivery and an open culture. Their data engineering teams are currently scaling up projects across Asia and have just expanded to 9 countries.
This position has been created to support the current Director of Data Architecture / Integration on new projects. The purpose of the role is to provide technical leadership to a team of 20 people, on-site and offshore, while building internal processes and structure and remaining hands-on technically.
The team is mostly composed of junior and mid-level data engineers. Project exposure spans all of Asia and can sometimes be global. Your daily points of contact will be members of your team, as well as architects from different countries and vendors.
From time to time, you will coordinate projects with other data teams focused on governance and analytics.
- Create standardised data engineering frameworks for data integration
- Design ELT/ETL programs; plan and handle platform upgrade activities for the Asia Data Office
- Perform POCs with new big data technologies and deliver valuable results in line with the team's strategy
- Participate in Agile sprints and ceremonies; support rapid iteration and development
- Develop, maintain, and test data pipelines, application frameworks, and infrastructure for data generation; work closely with information architects and data scientists
- Implement data orchestration pipelines, data sourcing, cleansing, augmentation and quality control processes
- Translate business needs into data architecture solutions
Career evolution: Progressively, you will take over the management of different pools of people and become more involved with business stakeholders.
- Master's degree in Computer Science, Statistics, Informatics, Information Systems, Mathematics or an equivalent quantitative field preferred
- Strong project background in infrastructure, data engineering and big data
- Ideally between 8 and 16 years of experience
Technical stack
- Big data engineering skills: Apache NiFi, Hive, HBase, HDFS
- Platform technologies: HDP, HDF or any big data platform equivalent
- Databases and data platforms: Oracle, MySQL, Postgres, Hadoop, Spark
- Data streaming tools: Kafka, Spark
- Agile and DevOps principles; modern software architectures and API-driven development
- Big data processing frameworks: HDFS, MapReduce, storage formats (Avro, Parquet), stream processing
- Data processing tools: SQL, Spark, Python
Other Requirements
- Fluent in English; Cantonese is helpful. A second language such as Japanese or Vietnamese is a plus
- Occasional business trips within Asia
- Attractive base salary
- 20 days of annual leave
- Medical coverage
Argyll Scott Asia is acting as an Employment Agency in relation to this vacancy.