Classification: Contract
Contract Length: 6 Months
Position Summary
The Consulting-Level Big Data Engineer serves as a primary development resource for designing, coding, testing, implementing, documenting, and maintaining NextGen solutions for GCP Cloud enterprise data initiatives. The role requires working closely with data teams, frequently in a matrixed environment as part of a broader project team. Because the technology and its practices evolve rapidly, the position requires staying well informed of technological advancements and putting new innovations into effective practice. In addition, this position requires a candidate who can analyze business requirements and design, build, test, and implement solutions with minimal supervision. The candidate will have a track record of contributing to successful projects in a fast-paced, mixed-team environment.
Responsibilities
- Work with data engineers, data architects, and SIEM team stakeholders to understand product requirements, then design, build, and monitor streaming platforms and capabilities that meet today's requirements and can scale gracefully.
- Implement automated workflows that lower manual/operational costs, define and uphold SLAs for timely delivery of data, and move the company closer to democratizing streaming data.
- Enable a self-service SIEM data architecture supporting query exploration, dashboards, data catalog, and rich data discovery.
- Promote a collaborative team environment that prioritizes effective communication, team member growth, and success of the team over success of the individual.
- Design and create real-time data services using Confluent Kafka and/or Streamsets that accelerate the time from idea to insight.
- Adhere to and support platform engineering best practices, processes, and standards.
- Produce high-quality, modular, reusable code that incorporates best practices.
- Promote and support security best practices that align with industry standards and with regulatory and legal requirements. Mentor team members on complex data projects and on following the Agile process.
- Help lead implementation of unit and integration tests and promote and conduct performance testing where appropriate.
- Be a leader in the HCA data community. Evangelize data and platform engineering best practices and standards, participate or present at community events, and encourage the continual growth and development of others.
- Develop a strong understanding of the relevant product area, codebase, and systems.
- Demonstrate proficiency in data analysis, programming, and software engineering
- Work closely with the Lead Architect and Product Owner to define, design and build new features and improve existing products
- Produce high quality code with good test coverage, using modern abstractions and frameworks
- Work independently and complete tasks on schedule by exercising strong judgment and problem-solving skills.
- Collaborate closely with team members to successfully execute development initiatives using Agile practices and principles.
- Participate in the deployment, change management, configuration, administration, and maintenance of deployment processes and systems.
- Prioritize workload effectively to meet deadlines and work objectives.
- Work effectively in an environment with rapidly changing business requirements and priorities.
- Work collaboratively with Data Scientists and business and IT leaders throughout the company to understand their needs and use cases.
- Work closely with management, architects and other teams to develop and implement the projects.
- Actively participate in technical group discussions and adopt new technologies that improve development and operations.
Requirements
- This role will provide application development for specific business environments.
- Responsible for building and supporting a GCP based ecosystem designed for enterprise-wide analysis of structured, semi-structured, and unstructured data.
- Bring new data sources into GCP, transform and load them into databases, and support regular requests to move data between clusters.