Jobs via Dice
No H1B :: Google Cloud Platform Big Data Engineer (Lumi Platform) :: Phoenix, AZ
Salary: -
Job type: Full-time
Location: Phoenix, Arizona, US
Remote: No
Posted: 1 week ago
Job description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Bright Sol, is seeking candidates for the role below. Apply via Dice today!
Google Cloud Platform Big Data Engineer (Lumi Platform)
Location: Phoenix, AZ (Onsite/Hybrid)
Visa: Open / Visa-Free Candidates Preferred
Experience: 8+ Years
Job Description
We are seeking a highly skilled Google Cloud Platform Big Data Engineer to work on the Lumi platform, leveraging a suite of Google Cloud Platform services to design, build, and optimize scalable data pipelines and analytics solutions.
The ideal candidate will have strong hands-on experience in Google Cloud Platform big data technologies, data engineering, and workflow orchestration.
Key Responsibilities
- Design and develop scalable data pipelines using Google Cloud Platform services
- Work extensively with BigQuery for data warehousing and analytics
- Build and manage data workflows using Cloud Composer (Airflow DAGs)
- Develop batch and streaming pipelines using Dataflow
- Handle large-scale data processing using Dataproc (Spark/Hadoop)
- Manage and optimize data storage using Cloud Storage
- Collaborate with cross-functional teams to deliver high-quality data solutions
- Ensure data quality, performance, and reliability across pipelines
- Troubleshoot and optimize existing data workflows
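The orchestration responsibilities above center on Cloud Composer, Google Cloud's managed Apache Airflow service, where pipeline steps form a directed acyclic graph (DAG) whose dependencies determine run order. As an illustrative sketch only (plain standard-library Python, not the Airflow API, with hypothetical task names), the ordering idea looks like this:

```python
# Illustrative sketch: pipeline steps as a DAG, topologically sorted
# to get a valid execution order -- the same scheduling principle
# Cloud Composer / Airflow applies to DAG tasks.
from graphlib import TopologicalSorter

# Hypothetical batch-load steps; each key lists its prerequisites.
deps = {
    "validate_raw": {"extract_from_gcs"},
    "transform": {"validate_raw"},
    "load_to_bigquery": {"transform"},
}

order = list(TopologicalSorter(deps).static_order())
print(order)  # extraction runs first, the BigQuery load runs last
```

In an actual Composer DAG, each step would be an Airflow operator (for example, a GCS-to-BigQuery transfer), with dependencies declared between tasks rather than in a plain dict.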
Required Skills
Google Cloud Platform, BigQuery, Cloud Storage, Dataflow, Dataproc, Cloud Composer, Airflow, DAGs, Python, SQL, Spark, Hadoop, ETL, Data Engineering
Preferred Skills
- Experience with Lumi platform or similar enterprise data platforms
- Knowledge of streaming frameworks and real-time processing
- Familiarity with CI/CD pipelines in Google Cloud Platform
- Strong problem-solving and debugging skills