Join one of the fastest growing SaaS companies in the world
At Trax, our people are at the core of our business, as we create technology and services that are changing the world of retail. Get to know us.
Trax is a rapidly growing company with start-up values
We started out with the daring mission of solving long-standing problems for our CPG and retail customers. Today, we are proud to be market makers, and our journey is streaked with perseverance, teamwork and a whole lot of fun.
Unleash your potential by joining a powerhouse of brilliant minds from diverse backgrounds, in an environment that is learning-friendly - and often filled with food!
About The Position
Trax is looking for a Backend/Data Engineer to join its new solution for retailers. This person will extend our existing data engineering team and work with our Data Scientists and Data Analysts on building data pipelines and data engines and improving our data platforms. He or she will act as a key enabler of ML/statistical models, AI modules and advanced analysis over our production environment.
The ideal candidate will be an independent, highly motivated software engineer with a strong affinity for state-of-the-art data technologies. He or she is expected to be a self-learner who continuously seeks to push the envelope, following new and emerging trends in technology while working in a complex setting.
Responsibilities
- Design, build and deploy complex data pipelines, moving and manipulating Trax data.
- Shorten time to value of new modules by taking ownership of their productization process. Integrate and deploy these within the Trax environment.
- Optimize engine performance and develop new capabilities.
- Identify new and emerging technologies that may drive efficiency and introduce these to the relevant architecture, research and development teams.
Requirements
- 2+ years of professional experience in software development.
- Proficiency in one or more modern programming languages such as Python, Java or Ruby.
- Fundamental knowledge of data structures, algorithms and OOP.
- Experience working with databases (SQL and NoSQL).
- Experience with data analysis methods and concepts.
- Experience working in a cloud environment (ideally AWS/GCP).
- Experience with big data and data analytics tools: NumPy, pandas, Hadoop, Spark, Kafka, etc.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- BA/B.Sc. (or higher) degree in Computer Science from a known university, or equivalent experience from a leading army technological unit.