Google Cloud Engineer

Providence Staffing LLC

San Antonio, TX
Full Time
Paid
  • Responsibilities

    Providence Staffing’s client is seeking a skilled Google Cloud Engineer with experience in Cloud Dataflow, Cloud Bigtable, or Cloud AI Platform, as well as proficiency with DevOps tools like Spinnaker, Cloud Build, or Cloud Source Repositories. In this role, you will design, implement, and maintain scalable cloud solutions while ensuring seamless integration with CI/CD pipelines to optimize development and deployment processes.

     

    Key Responsibilities:

    - Cloud Solution Design: Architect and implement GCP solutions utilizing Cloud Dataflow, Cloud Bigtable, and Cloud AI Platform to support data-intensive and AI-driven applications.

    - Data Pipeline Development: Build and maintain scalable, fault-tolerant data pipelines for real-time and batch processing using Cloud Dataflow.

    - Database Management: Optimize and manage Cloud Bigtable for high-throughput, low-latency data access and storage.

    - AI & ML Integration: Deploy and manage machine learning models using Cloud AI Platform, ensuring efficient model training, evaluation, and deployment.

    - DevOps Integration: Set up and manage CI/CD pipelines using DevOps tools like Spinnaker, Cloud Build, or Cloud Source Repositories.

    - Automation: Automate deployment workflows and infrastructure provisioning for cloud-based solutions.

    - Performance Optimization: Monitor and enhance the performance, scalability, and reliability of cloud applications and workflows.

    - Security & Compliance: Ensure solutions follow GCP best practices for security, identity management, and compliance with industry standards.

    - Collaboration: Partner with data engineers, software developers, and DevOps teams to deliver comprehensive cloud solutions.

    - Documentation: Create and maintain detailed documentation for system architecture, processes, CI/CD workflows, and troubleshooting guides.

    Qualifications and Education:

    - Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).

    - Proven expertise in GCP services, including Cloud Dataflow, Cloud Bigtable, and Cloud AI Platform.

    - Hands-on experience with DevOps tools such as Spinnaker, Cloud Build, or Cloud Source Repositories.

    - Strong background in cloud architecture and distributed systems.

    - Proficiency in programming languages such as Python, Java, or Go.

    - Certifications: GCP certifications such as Professional Data Engineer, Professional Cloud Architect, or Professional Cloud DevOps Engineer are highly desirable.

    - Experience in SQL and NoSQL database design and optimization.

    - Familiarity with big data tools and frameworks like Apache Beam or TensorFlow.

    - Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes).

    - Strong problem-solving and debugging skills.

    - Effective communication and collaboration abilities in a fast-paced environment.

    Compensation:

    The salary range for this position is $105,000-$145,000 per year, based on experience.