Our client is looking for two Singapore Permanent Residents or Citizens with solid experience as Big Data Engineers to join their team in Singapore, either as full-time employees or on long-term fixed-term contracts, working on site at one of their financial services clients.
Architect and process complex datasets with Spark, Beam, etc.
Read/write data with Kafka, HDFS, Amazon S3, Postgres etc.
Produce fault-tolerant, intelligent data processing pipelines.
Skills & Qualifications:
You have a great understanding of Python language and standard library
You understand how to use a data processing framework, such as Apache Spark or Beam
You know how to read/write data to message and storage platforms, such as Apache Kafka, HDFS or Amazon S3.
You are interested in creating data processing pipelines for machine learning, using Apache Airflow, Dask etc.
You have very high development standards, especially for code quality, code reviews, unit testing, continuous integration and deployment.
You are an adaptable, resourceful, well-organised team player with a strong work ethic.
You have strong written and verbal communication skills.
You are educated to degree level or above.
Based on Client site in Singapore
Pay Rate: Max SGD 95,000 per annum (circa US$70,000 per annum)
Industry: Financial Services Sector
Visa Type: Permanent Resident / Citizen ONLY
Open Roles: 2
Job Category: Big Data Engineer (Financial Services)