We're seeking a Data Engineer to join our growing analytics team. The new hire will be responsible for extending and improving our data and data pipeline architecture, as well as streamlining data collection and flow for cross-functional teams.
The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will provide data-related support to our software developers, database architects, data analysts, and data scientists, and will ensure that the best data delivery architecture is used across all projects.
They must be self-directed and capable of supporting the data demands of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.
Responsibilities and Duties:
- Develop cloud data and analytics solutions in the blockchain area as part of a team.
- Design and maintain data architecture spanning storage (cloud data warehouse, S3 data lake), orchestration (Airflow), processing (Spark, Flink), streaming services (AWS Kinesis and Kafka), BI tools, graph databases, and a real-time, large-scale event aggregation store.
- Work on cloud data warehouses, data as a service, business intelligence, and machine learning solutions.
- Wrangle data in a diverse environment.
- Deliver cutting-edge data and analytics solutions.
- Build modern data warehouse applications using the GCP stack (Google Kubernetes Engine (GKE), BigQuery, and GCP Databricks).
- Mentor a team of high-performing engineers in delivering complex data solutions and identifying data-driven development opportunities.
- Identify strategic and operational KPIs for the team and drive the team to deliver its committed targets.
- Collaborate with global stakeholders.
Requirements and Qualifications:
- Proficiency in SQL, along with programming skills in Scala or Python.
- 5+ years of relevant data warehousing, data engineering, or data architecture experience.
- Experience with the GCP stack (BigQuery, GCP Databricks) is required.
- Good understanding of the GCP Databricks platform and ability to design data analytics solutions to meet performance and scaling requirements.
- Demonstrated analytical and problem-solving abilities, particularly in the context of large-scale data.
- Solid understanding of data warehousing concepts and modern data warehouse/Lambda architectures.
- Good understanding of the Machine Learning and Artificial Intelligence (AI) solution space.
- Familiarity with Git for source code control.
- Strong communication and interpersonal skills across all levels of management.
- You are a detail-oriented person with excellent communication skills and a strong sense of teamwork.
- Experience working in an international team environment is a plus.
bitsCrunch focuses on Analytics, Finance Technology, Blockchain / Cryptocurrency, Web3, and NFT. The company has offices in Bengaluru, Munich, and Chennai, with a small team of 11-50 employees. To date, bitsCrunch has raised $4.35M in funding; its latest round closed in February 2022.
You can view their website at https://www.bitscrunch.com/ or find them on Twitter, Facebook, and LinkedIn.