    11 March 2024

    Data Engineer

    Location: Remote
    ID: 3894

    Our client is a fast-growing startup. Their product is a premier restaurant technology platform that helps businesses grow through the company's Commission-Free Delivery & Pickup structure and proprietary delivery optimization technology. They serve the $105 billion US local restaurant market. Headquartered in both Miami and Tel Aviv, the company is now building a team in NYC. The client wants restaurants to fulfill their highest potential, which means giving local establishments everything they need to connect directly with their customers. The team pools together decades of restaurant tech experience, along with seasoned tech, sales, marketing, product, and operations executives who produce an industry-changing delivery system for successful local restaurants and chains.

    This is a fully remote role.

    Responsibilities:

    • Work with the BI team to support digital transformation.
    • Improve the DWH so that it is optimized, well structured, and documented.
    • Transform stage/raw layer processing code, define input schemas, and add data quality tests.
    • Resolve data dependencies with Airflow v2; define DAG structure and DAG templates.
    • Develop tech reports in Databricks on data consistency and pipeline performance.
    • Help business stakeholders build dashboards and analyze the data.
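The dependency-resolution work described above can be sketched without Airflow itself: the example below uses Python's standard-library `graphlib` to compute a valid execution order for a small DAG of hypothetical tasks (`extract`, `stage`, `quality_check`, `load`, `report` are illustrative names, not from the posting).

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG wires tasks together.
dag = {
    "extract": set(),            # no upstream tasks
    "stage": {"extract"},        # raw -> stage layer
    "quality_check": {"stage"},  # data quality tests on the stage layer
    "load": {"quality_check"},   # load into the DWH
    "report": {"load"},          # tech report on pipeline results
}

# static_order() yields the tasks in an order that respects every dependency.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'stage', 'quality_check', 'load', 'report']
```

In an actual Airflow 2 DAG the same dependencies would be expressed with the TaskFlow API or the `>>` operator; the sketch only shows the ordering logic that dependency resolution relies on.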

    Requirements:

    • At least 3 years of experience building data pipelines (infrastructure and quality pipelines) and alerting pipelines.
    • Experience with data profiling and data quality methods based on Python libraries (pandas, statsmodels, scikit-learn).
    • Experience with ETL tools and data pipelines (e.g. Azure Databricks, Azure Data Factory, Airflow).
    • Proficiency in SQL and Python.
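As a minimal illustration of the pandas-based data quality methods mentioned above (the table, column names, and checks are hypothetical, not from the posting), a simple quality report might look like:

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Compute basic data quality metrics for a hypothetical orders table."""
    return {
        "row_count": len(df),
        "null_order_id": int(df["order_id"].isna().sum()),
        "duplicate_order_id": int(df["order_id"].duplicated().sum()),
        "negative_amount": int((df["amount"] < 0).sum()),
    }

# Toy data containing one null id, one duplicated id, and one negative amount.
df = pd.DataFrame({
    "order_id": [1, 2, 2, None],
    "amount": [10.0, -5.0, 7.5, 3.0],
})
report = run_quality_checks(df)
print(report)
```

In practice such checks would run as a pipeline step (e.g. an Airflow task) and feed an alerting rule when any metric crosses a threshold.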

    Nice to have:

    • Experience with data quality and data prediction (e.g. the Prophet library).
    • Experience with Databricks and Airflow 2.
    • Experience with cloud infrastructure (Azure, AWS, GCP) and DevOps practices.
    • English: B2+

    What the company offers:

    • A fast-growing product company with a modern BI architecture and the latest technologies.
    • The opportunity to build and learn BI tools and processes from scratch.
    • The opportunity to apply best-in-class techniques and implement them for the company's global R&D teams.
    • Work directly with business stakeholders, directors, product managers, and the CEO.
    • Endless potential for growth as the company moves forward.
    • Strong and competitive compensation package.
    • Flexible work environment.
    • Up to 10 paid sick days.
    • Up to 15 paid personal/vacation days.

    Send your CV





    Apply with LinkedIn

