Datalake DevOps

Kuala Lumpur, Malaysia · INSCALE · Full time

**Responsibilities**:
As a Data Lake DevOps Engineer, you will play a crucial role in monitoring and maintaining our data pipelines and AWS services, ensuring seamless data flow within our data lake ecosystem. You will diagnose, troubleshoot, and resolve complex data pipeline issues while supporting stakeholders and improving the overall efficiency of our data operations, with AWS and Snowflake as the key platforms.
You will collaborate closely with cross-functional teams to enable efficient data storage, processing, and analysis for our organization's data-driven initiatives.

Your primary tasks will be to:

- Monitor data lake pipeline quality checks and AWS services such as Step Functions, Glue, and Lambda; Elasticsearch exposure is also good to have (see the example sketch after this list)
- Diagnose, troubleshoot, and resolve data pipeline issues by identifying their root causes
- Provide support and maintenance to data lake stakeholders
- Standardize the deployment process
- Plan for disaster recovery
- Ensure security and legal compliance
- Maintain the annual patch schedule
- Contribute to operations planning
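
To give a flavour of the day-to-day monitoring work, here is a minimal, hypothetical sketch of checking a Step Functions based pipeline for failed executions with boto3. The state machine ARN and function name are placeholders for illustration only, not part of our actual tooling.

```python
# Hypothetical sketch: list recent FAILED executions of a Step Functions
# state machine so they can be triaged. The ARN below is a placeholder.
import boto3

sfn = boto3.client("stepfunctions")

STATE_MACHINE_ARN = "arn:aws:states:eu-west-1:123456789012:stateMachine:datalake-pipeline"


def failed_executions(state_machine_arn: str, max_results: int = 25) -> list:
    """Return the most recent FAILED executions for the given state machine."""
    response = sfn.list_executions(
        stateMachineArn=state_machine_arn,
        statusFilter="FAILED",
        maxResults=max_results,
    )
    return response["executions"]


if __name__ == "__main__":
    for execution in failed_executions(STATE_MACHINE_ARN):
        print(execution["name"], execution["startDate"], execution["status"])
```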

**Qualifications**:

- Must have:
  - At least 5 years of proven experience as a DevOps Engineer or in a similar role, with a focus on data lake environments
  - Proficiency in Python, JavaScript, and SQL
  - Experience working in an Agile Scrum team
- Good to have:
  - Experience with AWS services (e.g., Step Functions, Glue, Lambda) and familiarity with Snowflake
  - Cloud certification
  - CI/CD experience

**Travelling**:
Travel to Denmark is required for induction; subsequent travel will depend on project requirements but is likely.

**Remarks**: