We are seeking an experienced Software/Data Engineer with expertise in Kafka, AWS MSK, Snowflake, Debezium (or a similar tool), and data modeling to join our team. The ideal candidate will have at least 5 years of experience in data integration, data warehousing, and ETL development.
This is what you'll do:
- Design, develop, and implement ETL pipelines using Kafka, AWS MSK, and Snowflake.
- Build and maintain data integration processes using Kafka Connect, Kafka Streams, AWS MSK, and Debezium (or a similar change-data-capture tool).
- Collaborate with cross-functional teams to understand and integrate data from various sources into our Snowflake data warehouse.
- Design and maintain data models that support business requirements in Snowflake.
- Optimize and improve the performance of existing ETL processes.
- Monitor and troubleshoot data integration issues as they arise.
- Follow best practices for data security and compliance in AWS.
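To give a concrete sense of the Kafka Connect and Debezium work described above, the sketch below shows a minimal Debezium MySQL source connector configuration of the kind this role would build and maintain. All names, hostnames, and credentials are hypothetical placeholders, and the exact properties vary by Debezium version and source database.

```json
{
  "name": "orders-mysql-cdc",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "tasks.max": "1",
    "database.hostname": "mysql.internal.example.com",
    "database.port": "3306",
    "database.user": "cdc_user",
    "database.password": "REPLACE_ME",
    "database.server.id": "184054",
    "topic.prefix": "orders-db",
    "table.include.list": "inventory.orders",
    "schema.history.internal.kafka.bootstrap.servers": "msk-broker.example.com:9092",
    "schema.history.internal.kafka.topic": "schema-history.orders-db"
  }
}
```

A configuration like this would typically be submitted to the Kafka Connect REST API on a Connect cluster backed by AWS MSK, after which change events from the listed tables stream into Kafka topics for downstream loading into Snowflake.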
This is what you'll need:
- 5+ years of experience in ETL development, data warehousing, and data integration.
- Strong experience with Kafka, AWS MSK, Snowflake, and Debezium (or a similar tool).
- Proficiency in SQL and at least one programming language such as Java, Python, or Scala.
- Extensive experience in data modeling, particularly in a data warehousing context.
- Strong understanding of data integration and ETL best practices.
- Familiarity with data security and compliance requirements in AWS.
- Strong problem-solving and analytical skills.
- Ability to work in a fast-paced and dynamic environment.
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field is highly preferred.