Experience:
10+ Years (Data Engineering / Data Modeling)
Rate:
Open
Role Overview:
We are hiring a Core Data Engineer with strong experience in Big Data (100TB+), Google Cloud Platform, and data pipeline architecture.
Key Responsibilities:
• Build and maintain scalable data pipelines
• Design and optimize conceptual & logical data models
• Process large-scale structured and unstructured datasets
• Implement data strategies and architecture
• Improve system performance and troubleshoot issues
• Work closely with Data Scientists and Analytics teams
• Develop ML-ready datasets and pipelines
• Configure and manage Pub/Sub systems
Required Skills:
• Expertise in:
  • Python
  • SQL
  • ERD / Data Modeling
• Strong experience in Google Cloud Platform:
  • BigQuery
  • Cloud Storage (GCS)
  • Cloud Functions
  • Cloud Composer
  • Pub/Sub
• Hands-on experience with dbt
• Experience handling 100TB+ data
• Strong knowledge of:
  • Data mining
  • Segmentation
  • Data pipeline architecture
Nice to Have:
• Experience with data visualization tools
• Experience supporting ML workflows
Must Have:
• 10+ years in data modeling
• 3+ years in Big Data