Data Engineer
Medha is building a unified intelligence ecosystem for healthcare. As part of our team, you’ll shape products that drive decisions for clinicians, operations teams, business leaders, and frontline users.
Our team is new and growing, so you’ll play a foundational role in establishing our culture, processes, and standards from the ground up.
Every product you work on directly supports patient outcomes, operational efficiency, and healthcare quality across some of the largest health systems.
You’ll have opportunities to meet users across hospitals, understand real-world behaviours, and translate them into actionable product experiences.
You’ll work closely with product, engineering, and domain teams to deeply understand Medha’s core value proposition and solve problems with clarity and intent.
We are looking for enthusiastic and motivated freshers to join our Data Engineering team. As a Data Engineer, you will work on building and maintaining data pipelines, supporting analytics use cases, and contributing to modern data platform development. You will collaborate with cross-functional teams to deliver reliable and scalable data solutions.
Key Responsibilities:
- Assist in building and maintaining data pipelines for data ingestion, transformation, and loading (ETL/ELT).
- Support in developing and managing datasets for reporting and dashboarding needs.
- Work with structured and unstructured data from multiple sources.
- Write and optimize SQL queries for data extraction and analysis.
- Help in data cleaning, transformation, and basic data modeling tasks.
- Collaborate with product, analytics, and business teams to understand data requirements.
- Assist in monitoring data pipelines and resolving basic issues.
- Contribute to improving data quality and reliability.
- Document processes, workflows, and data definitions.
Required Skills (Technical):
- Basic knowledge of SQL (must-have).
- Understanding of any programming language such as Python / Java / Scala (Python preferred).
- Familiarity with Excel for data analysis.
- Basic understanding of data structures and algorithms.
- Basic understanding of ETL concepts and data pipelines.
- Exposure to cloud platforms (AWS / Azure / GCP) is a plus.
- Understanding of big data concepts (Spark, Hadoop).
Required Skills (Non-Technical):
- Strong analytical and problem-solving skills.
- Willingness to learn and adapt to new technologies.
- Ability to work in a fast-paced and dynamic environment.
- Good communication skills (written and verbal).
- Ability to work both independently and in a team.