Job no: 527932
Work type: Full time
Location: Quarry Bay 鰂魚涌
Categories: Information Technology
Principal Duties and Responsibilities:
- Sustain the Modern Data Architecture: Set the strategic rollout, selection and delivery of proven and emerging tooling to ensure that data pipelines are scalable, repeatable and secure, serving multiple users within the organization. The incumbent is responsible for implementing highly complex, multi-faceted big data initiatives associated with the following functional areas:
- Translate complex functional and technical enhancement requirements into detailed designs and high-performing capabilities;
- Support the implemented data patterns and services - batch, real-time and complex event handling - leveraging open technologies;
- Ensure timely delivery and incident resolution to meet SLAs by investigating issues and designing and implementing resolutions;
- Effectively manage operational risk and change, with a continuous focus on process improvement.
- Governance and Controls: Responsible for ensuring that all data management processes conform to standard operating procedures and that the modern data architecture aligns with enterprise technology standards and policies; contribute to and provide guidance on data quality, metadata management, data stewardship, security and access controls.
- Innovation and Continuous Improvement: Evaluate and conduct PoCs of new, value-driven data technologies and capabilities; work closely with product innovation teams; remain current with industry advancements by working closely with external vendors and staying connected to Apache Foundation projects.
Required Knowledge and Skills:
- Demonstrated 6-8 years of professional experience in big data/data management, together with a university degree in Engineering, Computer Science or an equivalent program;
- Extensive expertise in data technologies and in the use of data to support software development, advanced analytics and reporting, with particular focus on Cloud (Azure) and Hadoop-based technologies, Linux, and programming or scripting languages such as Java, Scala, C++, PHP, Ruby, Python, R and SAS; expert knowledge of NoSQL and RDBMS databases such as HAWQ/HDB, MongoDB, Cassandra or HBase; working experience with modern data streaming using Kafka, Apache Spark and Flink, and with data ingestion frameworks such as NiFi, Hive, Pig, etc.;
- Experience with security/data protection solutions such as Kerberos, Active Directory, HDFS access control, OAuth2, OpenID and LDAP;
- Experience with network-layer security design involving VPNs, firewalls and load balancers;
- Experience and capability in translating non-technical user requests into complex technical specifications and solutions that meet those requirements;
- Excellent organizational and time-management skills, with the ability to multi-task; able to work with minimal or no supervision; has the initiative to organize the various functions necessary to accomplish department activities or goals, and is a strong team player.
Less-qualified candidates will also be considered for junior data engineer and data analyst roles.
The Firm (Dairy Farm / Banner) is responsible for ensuring that all personal information collected from each candidate presented to Dairy Farm is used for recruitment purposes only, and that the data collection process is in accordance with all applicable laws and compliant with the Code of Practice on Human Resource Management.
Job posting date: China Standard Time
Applications close: China Standard Time