Senior Data Engineer

at RabbitHole

Posted 227 days ago
About the Job

Remote
At RabbitHole, we are on a mission to increase global economic opportunity by making crypto more accessible and meritocratic. We believe that by making it easy for protocols to distribute their tokens to users, we can help anyone in the world earn income through provable contribution rather than their background or credentials.

We have created an efficient way for protocols to distribute tokens by segmenting users based on on-chain & off-chain data. By completing on-chain tasks in various protocols, users build their on-chain resume and earn token ownership.

We recognized that protocols are constantly looking for new ways to find and engage users but don't have an easy way to do so. Tokens of these protocols often fall into the hands of speculators who sit on centralized exchanges rather than users of these networks. On the other hand, newcomers to crypto have a difficult time knowing where to start on their crypto journey and whom to trust.

Our goal is to make it more efficient for protocols to distribute their tokens to network participants. By doing this, we both drive more participation in protocols and make it easier for users, rather than speculators, to increase their ownership. By increasing the number of token holders who actually use the network, we make the underlying protocol more sustainable and put more money in the pockets of users.

We're looking for product-obsessed individuals with early-stage startup experience who want to work with a dynamic, fast-moving team and build the roadmap for RabbitHole to become the best way for protocols to distribute their tokens and engage their users. If this is you, we are super excited to meet you and learn more.

How you'd help onboard the next wave of crypto users

    • Analyzing both off-chain and on-chain data to provide insights that guide protocols and the internal team in making decisions
    • Designing and implementing an engine to match prospective token holders with protocols, taking ownership of the process from start to finish
    • Promoting engineering practices and initiatives that prioritize code quality while still allowing for a high team velocity

Who you are

    • A data enthusiast who leverages data-driven approaches to problem-solving
    • Committed to writing clean, scalable, and maintainable code for team collaboration and efficiency
    • Demonstrated ability to make significant technical decisions and take responsibility for their outcomes
    • Experienced in deploying applications to production and passionate about user feedback and engagement
    • Able to thrive in a dynamic team environment that balances rapid iteration with feature delivery


Requirements

    • 4+ years of experience in a data-related role, working with a team on production-level projects
    • Bachelor's degree in computer science, data science, statistics, or a related field, or equivalent experience
    • Excellent communication and collaboration skills, with the ability to effectively communicate complex technical concepts to non-technical stakeholders
    • Proven track record of deploying robust data pipelines
    • Strong proficiency in Python, R, or Scala
    • Expert proficiency in SQL
    • Deep knowledge and professional experience in operating large-scale data infrastructure, including data warehousing, data lakes, and distributed computing systems
    • Exposure to ML model development, with experience in data preprocessing, feature engineering, model training and validation, and model deployment
    • Passion for learning, with the ability to keep up with new technologies and industry trends, and introduce new ideas and best practices to the team


What you'll do

    • Design, deploy, and maintain data pipelines
    • Create scalable data infrastructure to support growing user base
    • Extract insights from large datasets with complex queries
    • Scale data infrastructure to handle large volumes of data and traffic
    • Implement security measures to protect user data
    • Stay up-to-date with new tools and technologies and propose improvements to data architecture

Nice to haves

    • Early-stage web3 startup experience
    • Experience building with or using on-chain data platforms such as Dune Analytics or Flipside