AWS Cloud Developer (AWS Lambda, Glue, Python, RDS)
Benefits:
401(k)
Bonus based on performance
Competitive salary
Dental insurance
Health insurance
Paid time off
Vision insurance
Job Summary:
We are seeking a skilled Cloud Developer with expertise in AWS Lambda, AWS Glue, Python, and Amazon RDS to design, develop, and optimize cloud-based applications and ETL pipelines. The ideal candidate has hands-on experience with serverless architectures, data processing workflows, and relational databases. A relevant AWS certification (e.g., AWS Certified Developer – Associate or AWS Certified Data Analytics – Specialty) is preferred.
Key Responsibilities:
Develop and maintain serverless applications using AWS Lambda.
Design and implement ETL pipelines with AWS Glue for data transformation.
Design, query, and optimize databases on Amazon RDS (MySQL, PostgreSQL, SQL Server, or Aurora).
Write scalable Python scripts for data processing and automation.
Integrate AWS services such as S3, DynamoDB, API Gateway, Step Functions, SNS, and SQS.
Optimize SQL queries and database performance, including indexing, partitioning, and caching.
Ensure security best practices using IAM roles, encryption, and access control.
Automate deployments with AWS CDK, CloudFormation, or Terraform.
Monitor and troubleshoot cloud applications using Amazon CloudWatch, AWS X-Ray, and logging solutions.
Required Skills & Qualifications:
Bachelor’s degree in Computer Science, IT, or a related field.
3+ years of experience in cloud development with AWS Lambda, Glue, Python, and RDS.
Strong proficiency in SQL and experience working with relational databases (MySQL, PostgreSQL, or SQL Server).
Experience with event-driven architectures and real-time data processing.
Knowledge of API development, RESTful services, and JSON processing.
Familiarity with CI/CD tools like GitHub Actions, CodePipeline, or Jenkins.
AWS Certification (e.g., AWS Certified Developer – Associate or AWS Certified Data Analytics – Specialty) is a plus.
Strong problem-solving and debugging skills with the ability to work in a fast-paced environment.
Preferred Qualifications:
Hands-on experience with PySpark for big data processing.
Knowledge of data lake architectures using S3, Glue Data Catalog, and Athena.
Understanding of containerization using Docker and AWS Fargate.
Experience in database migrations, backup strategies, and performance tuning in Amazon RDS.
This is a remote position.