Lead Data Engineer

October 13


Technologies:
ETL, RDBMS, NoSQL, Kafka, SQL, MySQL/MariaDB, Oracle, Presto, BigQuery, Redshift, Oracle Data Integrator, Talend, SSIS, Pentaho, Tableau, GCP, Java, Linux/Unix, Git, Jenkins
English: Upper-intermediate

Purpose of the job:

Lohika provides premium software engineering services to leading technology companies. Our customers typically range from startups to high-growth, VC-backed companies, which drives a culture of acceleration and innovation. We are convinced that team extension is the engagement model that works best.
Our customer is one of the biggest fashion accessories retailers, focused on design, production, and manufacturing, as well as distribution and retail. The company is multi-branded, fast-expanding, and has a remarkable global presence. Its main goal is to bring customers closer to the products they need. We are currently looking for a team leader for a small team that will migrate operational data from regional DWHs into a single global representation.

MAIN TASKS AND RESPONSIBILITIES:

  • Creating new ETL processes and maintaining existing ones
  • Transforming raw data (RDBMS, NoSQL) and raw events (Kafka, MQ) into data that business users can consume (see the illustrative sketch after this list)
  • Performing end-to-end data flow checks
  • Optimizing existing data processes, improving the performance and monitoring of existing solutions
  • Designing DB/DWH schemas according to data processing needs
  • Creating new views and improving reporting dashboards
  • Participating in daily meetings, technical discussions, and regular planning sessions
  • Working in close contact with the customer's team and providing technical solutions that add value to the product
  • Troubleshooting problems and maintaining integration points (changes in Kafka topics, etc.)
  • Managing the team and providing day-to-day team and technical leadership
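
For illustration only, a minimal sketch of the kind of transformation described above: turning raw order events landed from Kafka into a reporting view that business users can query directly. The schema, table, and column names (reporting.daily_sales, staging.order_events, etc.) are assumptions made for the example, not part of the role description.

    -- Hypothetical example: aggregate raw order events into a daily sales view.
    -- All object and column names are assumed for illustration.
    CREATE OR REPLACE VIEW reporting.daily_sales AS
    SELECT
        region_code,                           -- identifies the regional DWH source
        CAST(event_time AS DATE) AS sale_date,
        COUNT(*)          AS orders,
        SUM(order_amount) AS revenue
    FROM staging.order_events                  -- raw events landed from Kafka
    WHERE event_type = 'ORDER_COMPLETED'
    GROUP BY region_code, CAST(event_time AS DATE);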

EDUCATION, SKILLS AND EXPERIENCE:

MUST HAVE:

  • Experience in data warehousing and multi-dimensional database design and development using formal methodologies
  • Strong knowledge of SQL (and a passion for it)
  • Strong knowledge of at least one RDBMS (MS SQL, MySQL/MariaDB, Oracle)
  • Experience with SQL optimization and performance tuning
  • Experience with Snowflake (or analogues: Presto, Databricks, BigQuery, Redshift)
  • Experience building ETL processes using ETL tools (Oracle Data Integrator, Talend, SSIS, Pentaho)
  • Experience with event processing (Kafka, IBM MQ)
  • Good team player
  • Upper-intermediate English level
  • Ability to focus on producing results

WOULD BE A PLUS:

  • Experience with BI/reporting tools (Domo, Tableau, Looker, etc.)
  • Hands-on experience with one of the major cloud services (Azure, GCP)
  • Java development experience
  • Basic Linux/Unix skills (user level)
  • Understanding of CI/CD pipelines (Git/Jenkins)