AWS Data Engineer (Standard) with skills Data Engineering, Python, Apache Hive, AWS-Apps, DevOps Engineering for location Any Infogain Base Location (Noida, Gurugram, Bangalore, Mumbai, Pune)
ROLES & RESPONSIBILITIES
Job Summary
We are seeking a skilled AWS DevOps Engineer with strong expertise in Kubernetes (EKS), Terraform, and AWS cloud services to design, build, and maintain CI/CD pipelines and infrastructure automation for our platform. This role plays a key part in enabling secure, scalable, and repeatable cloud-native deployments within the AWS ecosystem.
Key Responsibilities
- Design and implement CI/CD pipelines using AWS CodePipeline, GitHub Actions, or GitLab CI.
- Provision and manage AWS infrastructure using Terraform (IaC) or AWS CDK across multiple environments.
- Configure and manage Amazon Elastic Kubernetes Service (EKS) clusters for hosting containerized applications.
- Implement Helm charts or Kustomize for managing Kubernetes manifests.
- Integrate security and quality gates into DevOps pipelines (e.g., Snyk, SonarQube, Prisma Cloud).
- Manage secrets securely using AWS Secrets Manager or Parameter Store and their pipeline integrations.
- Monitor and troubleshoot infrastructure, deployments, and container workloads using CloudWatch and X-Ray.
- Collaborate with cloud architects, developers, and security teams to ensure reliable deployments.
- Automate routine operational tasks using AWS Lambda or Python to enhance platform resilience.
- Support blue/green and canary deployments using AWS App Mesh or Route 53 routing policies.
- Document and share DevOps best practices with engineering teams.
Required Skills & Experience
- 3+ years of hands-on experience in DevOps, infrastructure automation, or cloud engineering.
- Strong working knowledge of core AWS services (EC2, S3, RDS, VPC, IAM).
- Expertise in Terraform (HCL), including building modular, reusable configurations for the AWS provider.
- Hands-on experience managing Kubernetes clusters, specifically EKS.
- Familiarity with containerization using Docker, Helm, and Amazon ECR.
- Experience integrating monitoring tools such as AWS CloudWatch, Prometheus, and Grafana.
- Deep understanding of AWS networking (VPC peering, Transit Gateway, Security Groups, ALB/NLB).
- Proficiency in scripting languages such as Bash or Python (Boto3).
Good to Have / Preferred
- AWS Certifications: AWS Certified SysOps Administrator, AWS Certified DevOps Engineer – Professional, or CKA/CKAD.
- Familiarity with GitOps tools such as ArgoCD or Flux.
- Experience with policy as code (e.g., AWS Config, CloudFormation Guard, OPA).
- Knowledge of DevSecOps principles and automated compliance.
- Exposure to multi-cloud or hybrid cloud environments (e.g., AWS Outposts).
Education
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
Soft Skills
- Strong problem-solving and troubleshooting skills in distributed systems.
- Clear communication and documentation abilities for technical stakeholders.
- Ability to work in Agile and cross-functional teams.
- Self-driven, with a high degree of attention to detail and security.
EXPERIENCE
- 4.5-6 Years
SKILLS
- Primary Skill: Data Engineering
- Sub Skill(s): Data Engineering
- Additional Skill(s): Python, Apache Hive, AWS-Apps, DevOps Engineering
ABOUT THE COMPANY
Infogain is a human-centered digital platform and software engineering company based in Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP).
Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.