Job Title: Senior AWS Data Engineer

Location: Bangalore, India

Experience: Minimum 5 years

Job Overview:

As an AWS Data Engineer with Snowflake expertise, you will design, implement, and optimize data pipelines and solutions within our AWS environment, using Snowflake as our core data warehousing platform. You will collaborate with cross-functional teams to support data initiatives, ensuring the efficient collection, storage, and transformation of large datasets from various sources.

Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines and architectures in AWS, leveraging services such as S3, Lambda, Glue, and Redshift.
  • Implement, optimize, and manage Snowflake data warehouse environments, ensuring performance, scalability, and reliability.
  • Integrate data from multiple sources (structured and unstructured) into Snowflake, including real-time and batch data processing.
  • Collaborate with data scientists, analysts, and business stakeholders to gather requirements, develop data models, and create efficient data solutions.
  • Optimize data storage, query performance, and data retrieval processes to ensure minimal latency and maximum efficiency.
  • Develop and maintain ETL processes using AWS services and Snowflake’s advanced features like Snowpipe, data sharing, and Time Travel.
  • Monitor and troubleshoot data pipelines, ensuring data quality, consistency, and security.
  • Implement best practices for data governance, security, and compliance within the AWS and Snowflake environments.
  • Collaborate with DevOps teams to automate deployment and scaling of data pipelines and infrastructure using CI/CD practices.

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).
  • 5+ years of experience as a Data Engineer, with a strong focus on AWS and Snowflake.
  • Proficiency in AWS services (S3, Lambda, Glue, Redshift, EC2, IAM, CloudFormation).
  • Expertise in Snowflake Data Warehouse, including schema design, query optimization, and data security.
  • Strong experience with ETL/ELT processes and tools (AWS Glue, Apache Airflow, etc.).
  • Proficiency in SQL and in Python or another programming language for data manipulation and pipeline development.
  • Experience with data modeling, data warehousing, and big data technologies.
  • Familiarity with CI/CD pipelines and DevOps practices for data engineering.
  • Knowledge of data governance, privacy, and security best practices.
  • Strong problem-solving skills, attention to detail, and the ability to work in a collaborative team environment.

Preferred Qualifications:

  • AWS Certified Solutions Architect or AWS Certified Data Analytics - Specialty certification.
  • Experience with other cloud platforms (Azure, GCP) and big data tools like Hadoop, Spark, or Kafka.
  • Familiarity with business intelligence tools like Power BI or Tableau.

Join our team and let's build something amazing together!

Submit your application through the form, and our HR team will connect with you soon!