Jr. AWS Data Engineer

AGENCYKPI, INC.


Austin, TX
Full Time
Paid
  Responsibilities

    Role: Junior AWS Data Engineer

    Department: Technology

    Reporting to: Director of Engineering

    About AgencyKPI:

    AgencyKPI is an Insurtech company that provides a business intelligence platform for insurance networks, independent agencies, and insurers. The people at AgencyKPI are an unparalleled combination of experienced insurance industry leaders and exceptional data scientists, software developers, and business strategists, and we are looking for the right people with the right skills to join us.

    During a time when most Insurtech companies claim that disruption is the path to the future, AgencyKPI is developing software platforms that support Harmony, Understanding, and Balance between all partners and vendors in the insurance industry. Why this approach? Because we have a fundamental belief that insurance agencies, networks, carriers, and wholesalers desire to deepen their relationships through mutual understanding and the harmonizing and balancing of their collective efforts.

    Culture:

    We are a remote-first company with employees located in various regions throughout the United States. Our core values of Integrity, Innovation, and Delivery guide us individually and as a team to work together in a cohesive manner to define and drive our organizational success. We are seeking talented individuals who share these values and want to make a direct impact on the insurance industry.

    What is the role:

    The Junior AWS Data Engineer, reporting to the Director of Engineering, will work with other data engineers and database architects to build, monitor, and maintain ELT/ETL pipelines and data architectures within the AWS cloud environment. The ideal candidate will have strong skills in PySpark/Python and a solid understanding of SQL and relational data models. They should understand data lake concepts and columnar file formats such as Parquet. The Junior AWS Data Engineer will collaborate closely with BI developers, ML engineers, and database architects. Flexibility and adaptability in a rapidly changing environment are key attributes for success in this position.

    What you’ll do on a daily basis:

    • Develop, monitor, and maintain ELT/ETL pipelines within the AWS cloud environment and Databricks.
    • Work with AWS Glue, Databricks, S3, EMR, Step Functions, and CloudFormation.
    • Create and troubleshoot PySpark scripts and SQL queries.
    • Work with source control and follow development lifecycle best practices.
    • Apply data engineering best practices in terms of quality, security, scalability, and maintainability of all pipelines and architecture.
    • Develop means for automating data- and analytics-related systems and processes, as appropriate, to support machine learning and BI activities.

    What background should you have?

    Required

    • 3+ years of experience developing ELT/ETL pipelines within the AWS cloud environment.
    • Experience with the following AWS services: Glue, S3, Lambda, Step Functions, Database Migration Service, and IAM.
    • Experience with AWS-hosted Databricks, including Databricks clusters, notebooks, and integration with AWS services.
    • Strong SQL, Python, and PySpark skills; understanding of the Spark distributed computing framework.
    • Experience extracting data from multiple data sources: SQL/RDBMS, flat files, JSON, APIs, external data lakes, etc.
    • Understanding of relational data models and RDBMSs.
    • Experience with data warehousing and data lake concepts using AWS services at scale.
    • Bachelor's degree in a relevant field (Computer Science, Engineering, Mathematics, etc.) or equivalent industry experience.
    • Flexible, adaptable, and able to work in a rapidly changing environment.

    What Will Make You Stand Out

    • Knowledge of columnar data formats such as Parquet and data lake frameworks such as Delta Lake.
    • Understanding of AWS CloudFormation and Infrastructure as Code concepts.
    • Advanced working knowledge of Databricks and Unity Catalog to support BI reporting solutions.
    • Ability to effectively articulate technical challenges and solutions.
    • Ability to work with ambiguous/undefined problems; ability to think abstractly.

    What do we offer?

    • Competitive compensation package
    • Generous health, dental, and vision insurance package that covers the employee, spouse, and dependents
    • Instantly vesting 401(k) with company match
    • The opportunity to work with an incredibly talented team in a rapidly scaling company with unmatched industry buy-in