We’re building a platform to challenge the status quo of trading. If you enjoy tackling interesting engineering challenges and pushing beyond the limits of what’s currently possible, you’ll feel right at home with us.
What’s our culture?
- Transparency: Collaborate openly and transparently, putting kindness and respect into every conversation
- Innovation: Constantly look for innovative ways to solve problems and bring additional value to our teams
- Scale: Always drive, design, execute and measure with future scale in mind
- Independence: A fully remote team spanning the Americas and Europe
What will you be collaborating on?
- The design and implementation of high-performance data processing pipelines
- The deployment of public-facing and internal web services and applications
- Working with the Data Science team on the implementation of innovative trading strategies
- Identifying innovative opportunities across the platform and in the industry
We are looking for someone with:
- At least 3 years of experience developing with Python
- At least 3 years of experience with PostgreSQL or other relational database management systems. Experience with other kinds of data storage is also welcome.
- Experience building data pipelines and ETL systems
- Familiarity with data processing libraries such as Pandas
- Strong focus on quality and best practices
What else would be nice to have?
- Familiarity with stream processing tools such as Apache Flink, Kafka and/or Kinesis, and with Linux containers (Docker) and service orchestration tools
- Some experience with the Rust programming language
- Basic literacy in data science and machine learning techniques
Current Remote Locations & Hubs: