The company

Adore Me is the lingerie underdog, disrupting a huge traditional industry. We started the journey just 5 years ago and we’re one of the fastest-growing companies in the US.

The tech factor is strong in our DNA. We have more engineers than marketers, fashion designers or business folks and we’re proud of it.

The tech

The end goal is to craft the perfect bra and ship it to millions of women. So we’re talking scalable e-commerce, coordinating our factories and the robots in our distribution center, and internal tools that catalyse everything.

We’re one of those scaling technology startups. Think many small teams, dev-ops, millions of users, hundreds of git repos, crazy experimenting.

It’s not all perfect. We’re always evolving and change is everywhere. We’re moving from Magento to micro-services, from MySQL databases to a cloud DWH, and from local environments to a serverless Machine Learning platform, and there’s always some technical debt to hunt & kill. We manage a lot of uncertainty and always try to learn from our mistakes. We’re getting there.

The job

You will work in an organised full-stack scrum team, alongside AI/data people, frontend folks (Angular/Vue apps), test engineers, product & UX people, and scrum masters.

As a data analyst you’ll be playing all day long with ELT procedures; API, message-broker, and MySQL data sources; data mapping and data modeling; mart-layer business calculation processes; and reports. You’ll build BI features and tools. We’re all a bit multidisciplinary, so you shouldn’t be afraid to set up a VM or Docker container, investigate some hanging workers, or fix a failing unit test.

More concretely, data at Adore Me can mean: enhancing the DWH ecosystem, integrating various machine learning projects on top of it, and working with the 30+ distinct sources that expose data in at least 6 different technologies. JSON and nested data are the de-facto format. You’ll migrate batch processes to streaming mode using cloud-native technologies like Dataflow, Airflow, Pub/Sub, and Datastore, and build custom BI tools that expand beyond the “database” borders.
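To give a flavour of the day-to-day: a minimal sketch of flattening a nested JSON record (say, an event pulled off a message broker) before loading it into a flat mart-layer table. The record shape and field names here are purely illustrative, not our actual schema.

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested dict into a single-level dict,
    joining nested keys with `sep` -- a common step before loading
    JSON events into a flat warehouse table."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

# A hypothetical order event, as it might arrive over a broker
raw = json.loads('{"order_id": 42, "customer": {"id": 7, "city": "NYC"}, "total": 59.9}')
flat = flatten(raw)
print(flat)
# {'order_id': 42, 'customer_id': 7, 'customer_city': 'NYC', 'total': 59.9}
```

In practice a columnar DWH like BigQuery can also query nested/repeated fields directly, so whether you flatten upstream or model the nesting in the warehouse is exactly the kind of trade-off you’d weigh in this role.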

Keywords

DWH, BI, data, SQL, ELT, ETL, data modeling, batch, data streaming, data pipelines, Cloud Run, Cloud Functions, BigQuery, Data Science, Machine Learning, ML, Python, R, Looker, App Engine, Dataflow, Airflow, message broker, container, MySQL, Dash, agile, scrum, kanban, continuous improvement, pair programming, unit testing, continuous integration.

You

You’re super experienced with SQL, DWH, and BI. You’re very comfortable with at least a couple of platforms and languages (GCP, Python, R). You know your data stuff, so a little modelling challenge doesn’t scare you, and you can survive on a Linux box, in an R console, or in a Jupyter notebook.

You’re passionate about technical excellence. You know data quality processes, how to interact with APIs and message brokers, and how to build ELT flows, and you value a standard coding style, code reviews, unit tests, documentation, refactoring, and good naming. You have opinions about classical and modern data models, business and technical KPIs, the Apache stack, and design patterns.

You’re a builder and you want to impact the world around you. So you need to get stuff done. Problem solve. Move fast.

You’re optimistic, have a strong sense of self-worth, speak your mind, and have real empathy for your peers. You’re more of a team player than a lone wolf.

The perks

Competitive pay. Private health services. Trips to our Manhattan office. Crazy continuous learning. Playful mood. Foosball. Punching bag. Ping pong. Specialty coffee. Did we mention continuous learning? Hipster mansion near Cismigiu. Lots of couches. Whiteboards everywhere. We write on the walls too. We host meetups and we go to conferences. Friday lunch party, catered breakfast, craft-beer-stocked fridges. Lounge chairs in the garden. Experienced engineers doing no-bullshit work. MacBooks and fancy dev tools.