Specialist - Data Engineer



Job Description
The Responsibilities of the Role:
  • More than 3 years' hands-on experience in building and maintaining data pipelines using tools such as Azure Data Factory and Databricks.
  • Proven experience in ETL development, with a strong focus on scalability, performance, and data integrity.
  • Proficient in programming with Python, R, or similar languages commonly used in data engineering and analytics.
  • Experienced in applying DevOps methodologies for data workflows, including CI/CD and automated deployment.
  • Solid background in cloud-based data platforms, with preference for Microsoft Azure technologies.
  • Holds valid and relevant professional certifications in cloud and data solutions, such as:
    • Azure Data Engineer Associate
    • Fabric Analytics Engineer Associate
    • Databricks Solution Architect Champion
  • Contributes to the development of robust data solutions that support analytics, reporting, and machine learning.
  • Collaborates effectively with cross-functional teams to deliver high-quality, secure, and scalable data infrastructure.
Skill Requirements:
  • Microsoft Fabric.
  • Databricks.
  • Excellent command of spoken and written English; Mandarin is an added advantage.
  • Good multitasking ability.
  • Excellent computer literacy: MS Office (including MS Outlook) and internet research skills.
  • Excellent presentation and communication skills.
  • Fast typing skills (minimum 35 words per minute).
  • Good problem-solving skills.
  • Preferably with contact center background.
  • Good knowledge of Laptop / Desktop / Tablet products, peripheral devices.
  • Network basics (LAN & basic Wi-Fi knowledge).
  • Hardware Basics (Motherboard / Hard disk / Memory / Display / BIOS etc.).
  • Speak with good pace, articulate and have clear pronunciation.
  • More than 3 years of experience building data pipelines (e.g. Azure Data Factory/Databricks).
  • Good experience with ETL.
  • Experience with Python, R, or other similar languages.
  • Experience with DevOps methodologies.
  • Experience with cloud-based data platforms; Azure cloud technologies preferred.
  • Degree, or an equivalent professional qualification, in any discipline.
  • Possess valid professional certification(s) in cloud infrastructure and/or data solutions.
The Package:
  • Attractive Salary (RM6,250 – RM10,800).
  • Mandarin is an added advantage.
  • 14 days of medical leave.
  • Medical and hospitalization coverage.
Experience Required:
  • Associate.
Location:
  • Bangsar South, Kuala Lumpur.
Employment Type:
  • Full Time.

About Us
Scicom (MSC) Bhd is a leading Malaysian technology and Business Process Outsourcing (BPO) company, bringing the power of the world's emerging technologies to companies all over the world.
Contact Us
  • hr.recruitment@scicom.com.my
  • 25th Floor, Menara TA One, 22, Jalan P. Ramlee, 50250 Kuala Lumpur, Malaysia
  • +(60) 3 21621088
  • Monday-Friday 9.00am-5.30pm
Copyright © 2025 SCICOM (MSC) BERHAD.