Responsibilities:
- Improve payment performance and drive business growth of the company's core businesses;
- Develop data pipelines and participate in defining standards for data development;
- Build a data quality system and establish data monitoring/verification processes;
- Deliver data-related product requirements: independently understand requirements, design solutions, and implement them.
Qualifications:
- Bachelor's degree or above in Computer Science, Statistics, Mathematics or other related majors;
- At least 2 years of relevant experience;
- Proficient in at least one programming language such as Python, Java, Scala, or Go, with a solid engineering background and an interest in data;
- Prior experience writing and debugging data pipelines using a distributed data framework (Hadoop/Spark/Flink/Storm, etc.);
- Familiar with OLAP engines (Hive/ES/ClickHouse/Druid/Kylin/Doris, etc.);
- Familiar with data warehouse architecture, data modelling methods, and data governance; enthusiastic about data mining, with good business understanding and abstraction capabilities;
- Proficient with databases, with strong SQL/ETL development skills;
- Experience in real-time data warehouse development is a plus.