We are seeking a talented Data Engineer with strong expertise in data architecture and cloud technologies. In this role, you will design scalable, reliable, high-performance data processing systems and take them from initial design through to production, supporting our global live-streaming platform. You will optimize ETL workflows, troubleshoot data integration issues, and ensure top-tier data quality for millions of users worldwide.
Key Responsibilities:
- Design and implement data systems to support a high-performance platform.
- Develop and maintain ETL workflows for various data sources.
- Optimize query performance in large-scale environments.
- Collaborate with cross-functional teams to solve data-related issues.
- Stay current with the latest technologies and best practices.
Requirements:
- 5+ years of experience as a Data Engineer, or as a Software Engineer working on data systems.
- Extensive experience with Python, including proficiency in SQL and common data processing libraries.
- 3+ years working with cloud providers (GCP/AWS).
- Strong experience with data modeling, ETL processes, and data warehousing.
- Familiarity with big data technologies, streaming platforms such as Kafka, and GitOps practices.
Nice to have:
- Experience with BigQuery.
- CI/CD data pipeline development with Terraform.
- Knowledge of functional programming.