About the Role
Key Responsibilities
Architect and implement scalable, secure, and cost-effective data platforms on AWS.
Lead the development of complex ETL/ELT pipelines and data workflows.
Define, standardize, and enforce data engineering best practices and coding standards.
Collaborate closely with data scientists, analysts, and business stakeholders to deliver data-driven solutions.
Conduct code reviews to ensure high-quality, maintainable deliverables.
Optimize data workflows for performance, scalability, and cost-efficiency.
Ensure strong data quality, governance, and security across all data environments.
Work with cross-functional teams to gather, analyze, and refine data requirements.
Monitor, maintain, and troubleshoot data pipelines and associated infrastructure.
Technical Requirements
Bachelor’s degree in Computer Science, Engineering, or a related field.
5–8 years of experience in data engineering, including at least 3 years of hands-on expertise with AWS Big Data platforms.
Proficiency in Python, SQL, Spark, and core AWS data services.
Strong understanding of data architecture, data modeling, and data warehousing concepts.
Hands-on experience with key AWS services: Glue, S3, Redshift, Lambda, CloudWatch, and IAM.
Experience with DevOps practices and tools, including CI/CD pipelines, monitoring, and infrastructure-as-code (IaC) using Terraform or CloudFormation.
Proficiency with Git for version control and collaborative development.
About the Company
Cigres Technologies Private Limited is a technology consulting and services company that helps clients solve significant digital problems and drive radical digital transformation using a range of technologies, on premises or in the cloud. The company was founded to leverage cutting-edge technology and deliver innovative solutions to clients across various industries.