Our client is a successful fintech start-up based in London. The client provides a software platform and an open API for the banking industry. The open API allows fintech enterprises to access banking services, create and connect digital wallets, receive, send, and convert funds, launch new cards, and administer loans. The client's range of services and geographical presence are growing fast.
We are looking for a data engineering specialist who will join the client's data and analytics team and work on the implementation of data warehouses/lakes, data pipelines, and data analytics models.
The client uses the AWS data and analytics stack along with Databricks/Spark (PySpark), Fivetran, Looker, and Snowflake.
Required Skills and Experience:
- Experience with data analytics, data engineering, and big-data technologies
- Spark experience is highly desirable; PySpark is preferred over Scala
- Experience building data warehouses/lakes and data pipelines using cloud platform tools as well as vendor ETL/ELT technologies
- Knowledge of data and analytics fundamentals, data modelling, data quality, and data governance principles
- Spoken English
Professional Development:
- Experienced colleagues who are ready to share knowledge;
- The opportunity to switch projects and technology stacks and to try out different roles;
- More than 150 workplaces for advanced training;
- English study and practice: courses and communication with colleagues and clients from different countries;
- Support for speakers who give presentations at conferences and technology-community meetups;
- The ability to focus on your work: minimal bureaucracy and micromanagement, and convenient corporate services;
- A friendly atmosphere and concern for the comfort of specialists;
- A flexible schedule and the option to work remotely;
- The ability to work from any of our development centers.