About the Role
As a Data Scientist with 5+ years of experience, you will design, develop and deploy end-to-end AI/ML solutions — from classical ML models to LLM-powered agentic systems. You will work across the full ML lifecycle: data exploration, feature engineering, model training, evaluation, fine-tuning and production deployment. You will collaborate closely with engineering, product and domain teams to translate business problems into scalable, reliable AI solutions.
Key Responsibilities
Design and develop machine learning models (classification, regression, clustering, recommendation) using Python and industry-standard ML libraries (scikit-learn, XGBoost, LightGBM, TensorFlow/PyTorch).
Build and optimize Retrieval-Augmented Generation (RAG) and Agentic RAG pipelines for knowledge-intensive applications using vector databases (FAISS, Azure AI Search, ChromaDB) and embedding models.
Architect and implement multi-agent orchestration systems using frameworks such as LangChain, LangGraph, Semantic Kernel, AutoGen or CrewAI.
Design and integrate Model Context Protocol (MCP) based tool-use patterns to enable LLM agents to interact with external APIs, databases and enterprise systems.
Fine-tune foundation models (LLMs, SLMs) using techniques such as LoRA, QLoRA, PEFT and RLHF for domain-specific tasks.
Perform prompt engineering, chain-of-thought reasoning and evaluation of LLM outputs for accuracy, safety and reliability.
Conduct exploratory data analysis (EDA), feature engineering and data pipeline development to support model training and inference.
Deploy and serve models using Azure ML, MLflow, FastAPI or similar frameworks with containerized (Docker/Kubernetes) production environments.
Monitor model performance in production, implement drift detection and establish retraining strategies.
Collaborate with cross-functional teams to define problem statements, success metrics and deliver AI solutions aligned with business objectives.
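To illustrate the retrieval step of the RAG work described above, here is a minimal, self-contained sketch of similarity-based retrieval. The toy documents and hand-made embedding vectors are assumptions for illustration; in a real pipeline the embeddings would come from an embedding model and live in a vector store such as FAISS or Azure AI Search.

```python
import math

# Toy corpus with hand-made embedding vectors (illustrative only).
DOCS = {
    "returns policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "warranty terms": [0.8, 0.2, 0.1],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, k=2):
    """Return the k documents most similar to the query embedding."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

# A query embedded close to "returns policy" and "warranty terms";
# the retrieved texts would be passed to the LLM as grounding context.
context = retrieve([0.85, 0.15, 0.05])
```

The same ranking logic is what a vector database performs at scale, typically with approximate nearest-neighbour indexes rather than a full sort.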
Required Skills
5+ years of hands-on experience in data science, machine learning or applied AI roles.
Strong proficiency in Python and ML/DL libraries (scikit-learn, pandas, NumPy, TensorFlow, PyTorch, Hugging Face Transformers).
Solid understanding of classical ML algorithms (linear/logistic regression, decision trees, ensemble methods, SVMs, clustering, dimensionality reduction).
Proven experience building RAG pipelines and working with vector stores, embeddings and retrieval strategies.
Hands-on experience with agentic AI patterns, multi-agent orchestration and tool-use frameworks (LangChain, LangGraph, Semantic Kernel, AutoGen, CrewAI).
Familiarity with Model Context Protocol (MCP) and its application in connecting LLM agents to external tools and data sources.
Experience fine-tuning large language models using LoRA, QLoRA, PEFT or similar parameter-efficient methods.
Working knowledge of Azure AI/ML services (Azure OpenAI, Azure ML, Cognitive Services, Azure AI Search).
Experience with experiment tracking and model registry tools (MLflow, Weights & Biases).
Strong analytical and problem-solving skills with the ability to communicate technical findings to non-technical stakeholders.
Nice to Have
Experience with MLOps/AIOps practices: CI/CD for ML, automated training pipelines, model versioning, A/B testing, model monitoring and AIOps-driven incident detection.
Hands-on experience deploying models at scale using Docker, Kubernetes (AKS) and serverless inference endpoints.
Exposure to graph neural networks, time-series forecasting or computer vision.
Experience with Spark/PySpark or Databricks for large-scale data processing.
Domain experience in automotive, manufacturing or engineering systems.
Familiarity with responsible AI practices, bias detection and model explainability (SHAP, LIME).
Tooling & Engineering Expectations
Use Git for version control with standard branching and pull request workflows.
Maintain reproducible experiments using MLflow, DVC or equivalent experiment tracking tools.
Participate in code reviews and follow agreed coding standards and documentation practices.
Work within existing CI/CD pipelines (e.g., Azure DevOps/GitHub Actions) for model training, testing and deployment automation.
Document model architectures, training procedures, evaluation metrics and deployment configurations.
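As an illustration of the CI/CD expectation above, a minimal GitHub Actions workflow might run unit tests and a short training smoke check on each pull request. The file name, job name, and `train.py` script below are hypothetical, not part of any existing pipeline:

```yaml
# .github/workflows/ml-ci.yml -- illustrative sketch; names are hypothetical
name: ml-ci
on: [pull_request]
jobs:
  test-and-train:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: pytest tests/            # unit tests for data and model code
      - run: python train.py --smoke  # short training run as a sanity check
```

An equivalent pipeline can be expressed in Azure DevOps YAML with the same stages.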
Salary: Rs. 14,00,000 - Rs. 15,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Hands-on experience with SAP FICO.
Good understanding of finance-related business processes and period-end closing activities.
Experience with sub-modules such as AP, AR, GL, Asset Accounting, Tax and Bank Accounting.
Technical aspects such as IDocs, user exits, BADIs and workflow.
Integration with SD and MM.
Basic knowledge of Controlling: CO-PA, PCA, CCA.
Good communication skills.
Interpersonal skills.
Analyze, design, configure, test and implement SAP solutions to meet business requirements.
Salary: Rs. 14,00,000 - Rs. 16,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Demonstrate technical expertise in end-to-end configuration of sales and supply chain processes in SAP.
SAP configuration of the SD module.
Sales order processes, bill of lading, shipping, distribution requirements planning, variant configuration.
SAP pricing, release procedures, condition records, condition types.
Design, customize, configure and test the SD module.
Intercompany billing, intercompany STO, third-party sales, output procedures.
Salary: Rs. 14,00,000 - Rs. 18,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Maintain SOPs, process documentation, and user guides for PTP processes.
Ensure adherence to internal controls, audit requirements, and standard procedures.
Identify opportunities to improve PTP efficiency and process compliance.
Required Skills & Experience
Core Functional Skills
3–4 years of hands‑on experience in Oracle EBS PTP processes.
Strong working knowledge of:
Purchase Orders
Accounts Payable (AP) Processing
Invoice Matching (2‑way / 3‑way)
Experience supporting PTP operations in ERP environments.
ERP Knowledge
Oracle E‑Business Suite (EBS) functional expertise in Purchasing and AP modules.
Understanding of the end‑to‑end Procure‑to‑Pay lifecycle.
Soft Skills
Strong analytical and problem‑solving skills.
Good communication skills to interact with business and finance teams.
Ability to manage multiple issues in a support environment.
Good to Have
Experience in ERP AMS or shared services models.
Exposure to vendor reconciliations and audit support.
Familiarity with ITSM / ticketing tools.
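The 2-way / 3-way invoice matching mentioned above compares invoice lines against the purchase order and, for a 3-way match, the goods receipt as well. A minimal sketch of that rule in plain Python, outside any Oracle EBS API (the field names and tolerance parameters are illustrative assumptions):

```python
def three_way_match(po, receipt, invoice, qty_tol=0, price_tol=0.0):
    """Check an invoice line against the PO and goods receipt.

    A 2-way match compares invoice vs PO only; a 3-way match also
    requires the receipted quantity to cover the invoiced quantity.
    Field names and tolerances here are illustrative, not EBS schema.
    """
    price_ok = abs(invoice["unit_price"] - po["unit_price"]) <= price_tol
    qty_vs_po = invoice["qty"] <= po["qty"] + qty_tol
    qty_vs_receipt = invoice["qty"] <= receipt["qty"] + qty_tol
    return price_ok and qty_vs_po and qty_vs_receipt

po = {"qty": 100, "unit_price": 9.50}
receipt = {"qty": 100}
good_invoice = {"qty": 100, "unit_price": 9.50}
over_billed = {"qty": 100, "unit_price": 9.95}  # price exceeds PO
```

In practice, invoices failing these checks are placed on hold for AP review rather than rejected outright.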
Salary: Rs. 9,00,000 - Rs. 11,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
6+ years of experience with SAP BW and BW on HANA (mandatory).
Experience with BW ABAP (mandatory).
Experience and good knowledge in all BW key areas, covering architecture, modelling, extraction, ETL and reporting (mandatory).
Experience and good knowledge of the LSA++ architecture.
Experience and good knowledge of BW on HANA hybrid modelling concepts, including native HANA models (Calculation Views / CDS Views) (mandatory).
ABAP skills from a BW perspective (routines, classes, AMDP procedures, ABAP CDS, etc.): hands-on experience required.
End-to-end experience in requirements gathering, functional analysis, HLD, LLD, build, testing and production deployment.
Extensive experience with analysis, design, development, customization and BW analytics.
Proficiency in analyzing and translating business requirements into technical requirements and architecture.
Extensive experience with complex SAP BW environments and architectures.
Delivering complex projects using the Agile Scrum methodology.
Preferred functional knowledge: Production Planning, Finance and month-end closing (MEC) related tasks.
Should be flexible to support the team during the MEC period.
Preferred: knowledge of DevOps test automation.
Good problem-solving skills.
Salary: Rs. 14,00,000 - Rs. 18,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Databricks
Role Description
Platform & Infrastructure
Oversee Databricks platform configuration, resource management, workspace structuring and cluster optimization.
Monitor and troubleshoot performance issues across clusters, jobs, notebooks and pipelines.
Implement governance, security, compliance and data access control using Role-Based Access Control (RBAC) and Unity Catalog.
Pipeline Development & Architecture
Design and implement end-to-end data pipelines using PySpark, SQL and Delta Lake within a medallion architecture, using Data Factory and Databricks.
Build real-time and batch DLT pipelines using Databricks Delta Live Tables, with a focus on reliability and scalability.
Optimize the Lakehouse architecture for performance, cost-efficiency and data integrity.
Automate data ingestion, transformation and validation, including support for streaming (Auto Loader) and scheduled workflows.
Perform data transformations, cleansing and validation using data quality rules for consistent and accurate datasets.
Manage and monitor job orchestration, ensuring efficient pipeline runs and reliability.
CI/CD & DevOps
Design and maintain CI/CD pipelines for Databricks artifacts (notebooks, jobs, libraries) using tools such as Azure DevOps, GitHub Actions, Terraform or Jenkins.
Support trunk-based development, deployment workflows and infrastructure-as-code practices.
Manage version control and automated testing using Git and related DevOps practices.
Collaboration & Delivery
Collaborate with product owners, business stakeholders and data teams to gather requirements and translate them into technical solutions.
Drive the adoption of best practices in coding, versioning, testing, deployment, monitoring and security.
Provide thought leadership on best practices in data engineering, architecture and cloud computing.
Performance Optimization
Deliver optimized Spark jobs and SQL queries for large-scale data processing.
Implement partitioning, caching and indexing strategies to improve the performance and scalability of big data workloads.
Conduct POCs for capacity planning and recommend appropriate infrastructure optimizations for cost-effectiveness.
Documentation & Knowledge Sharing
Create and review detailed documentation for data workflows, SOPs, architectural reviews, etc.
Mentor junior team members and promote a culture of learning and innovation.
Promote a culture of optimization and cost saving, and enable research-driven development.
Required Qualifications
Technical Expertise
5 years in data engineering, with a strong focus on the Databricks and Azure ecosystems.
Deep hands-on experience with Data Factory, Databricks Lakehouse architecture, Delta Lake, PySpark and Spark job optimization.
Proficiency in Python, SQL and optionally Scala for building scalable ETL/ELT pipelines.
Strong SQL skills are essential, with hands-on experience in SQL Server or other RDBMS platforms.
Strong experience in designing and optimizing DLT pipelines, managing asset
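The data-quality rules mentioned above (cleansing and validation between medallion layers) can be sketched in plain Python. In Databricks this would typically be expressed as Delta Live Tables expectations or PySpark filters; the rule names and records below are illustrative assumptions:

```python
# Illustrative bronze -> silver validation step; rule names are made up.
RULES = {
    "non_null_id": lambda r: r.get("id") is not None,
    "positive_amount": lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] > 0,
}

def validate(rows):
    """Split raw (bronze) rows into clean (silver) rows and quarantined rows."""
    clean, quarantined = [], []
    for row in rows:
        failed = [name for name, check in RULES.items() if not check(row)]
        (quarantined if failed else clean).append((row, failed))
    return [r for r, _ in clean], quarantined

bronze = [
    {"id": 1, "amount": 42.0},
    {"id": None, "amount": 10.0},   # fails non_null_id
    {"id": 2, "amount": -5},        # fails positive_amount
]
silver, bad = validate(bronze)
```

Quarantining failed rows with the names of the rules they violated, rather than silently dropping them, is what makes the pipeline auditable.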
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance