We found 193 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Job Description: Develop and manage Informatica applications. Essential Skills: 6+ years of hands-on experience in Informatica or Talend development. Desirable Skills: Insurance domain knowledge.

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Developer

Job Description

1. SAP native HANA development experience, with a good understanding of database and data-warehousing concepts and analytics.
2. Design, build, and test SAP HANA models, effectively translating business requirements into scalable solutions.
3. Experience working with HANA Studio, Web IDE, and other SAP tools, with an understanding of schemas and packaging in HANA.
4. Experience in HANA Smart Data Integration.
5. Strong SAP HANA dimensional modelling experience; should be well versed in developing different HANA data models with graphical views.
6. Experienced in SQLScript, stored procedures, table functions, hierarchies, and designing tables.
7. Must have strong experience with data modelling and calculation views.
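As an illustrative aside (not part of the posting): the dimensional modelling in point 5 typically means joining a fact table to dimension tables and aggregating, which is what a HANA calculation view with a star join does. The sketch below shows that idea in plain SQL using Python's built-in sqlite3 as a stand-in; HANA's SQLScript and graphical modelling differ, and every table and column name here is invented.

```python
import sqlite3

# Hypothetical star schema: one fact table joined to one dimension table,
# aggregated the way a calculation view with a star join would aggregate.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'Motor'), (2, 'Home');
INSERT INTO fact_sales  VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# Aggregate fact rows by the dimension attribute.
rows = conn.execute("""
SELECT d.category, SUM(f.amount) AS total_amount
FROM fact_sales f
JOIN dim_product d ON d.product_id = f.product_id
GROUP BY d.category
ORDER BY d.category;
""").fetchall()
print(rows)  # [('Home', 75.0), ('Motor', 150.0)]
```

In HANA the same shape would usually be built as a graphical calculation view or a SQLScript table function rather than inline SQL.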

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SAP HANA Modeling

Job Description

Job Description: We need associates experienced in Guidewire ClaimCenter. Developer profiles only, with experience in Guidewire ClaimCenter, BillingCenter, and PolicyCenter.

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Guidewire ClaimCenter

Job Description

Work Location: Pune, MH / Chennai, TN. Skills Required: Digital: Python; Digital: Amazon Web Services (AWS) Cloud Computing; Digital: PySpark. Experience Range: 8 to 10 years.

Role Description:
  • 8 years of relevant experience in Python, PySpark, Apache Airflow, and AWS services (EKS, EMR, HashiCorp Vault, Glue, Docker, Kubernetes).
  • Good hands-on knowledge of SQL and the data-warehousing life cycle is an absolute requirement, along with experience in creating data pipelines and orchestrating them using Apache Airflow.
  • Significant experience with data migrations and with development of operational data stores, enterprise data warehouses, data lakes, and data marts.
  • Good to have: experience with cloud ETL and ELT in one of the tools such as DBT, Glue, or EMR.
  • Excellent communication skills to liaise with business and IT stakeholders.
  • Expertise in planning project execution and effort estimation.
  • Exposure to Agile ways of working.

Essential Skills: AWS Data Engineer.
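As an illustrative aside (not part of the posting): the Airflow orchestration mentioned above boils down to running tasks in dependency order. The sketch below mimics that with Python's standard-library graphlib rather than Airflow itself; the task names and dependency map are invented for illustration.

```python
from graphlib import TopologicalSorter

# Toy stand-in for an Airflow DAG: each task is a plain function, and the
# dependency map mirrors what `extract >> transform >> load` would declare.
results = []

def extract():
    results.append("extract")    # e.g. pull source data

def transform():
    results.append("transform")  # e.g. clean and reshape with PySpark

def load():
    results.append("load")       # e.g. write to the warehouse

tasks = {"extract": extract, "transform": transform, "load": load}
deps = {"transform": {"extract"}, "load": {"transform"}}  # task -> upstream tasks

# Run every task after all of its upstream dependencies.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(results)  # ['extract', 'transform', 'load']
```

In a real Airflow DAG the same ordering would be declared with operators and the `>>` dependency syntax, and the scheduler, not a local loop, would execute the tasks.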

Responsibilities

  • Salary : Rs. 90,000 - Rs. 1,65,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Developer

Job Description

Job Description:
  • Design and configure Pega case types, flows, integrations, and UI.
  • Implement CDH strategies, including next-best-action, predictive analytics, and personalized engagement.
  • Develop customer-service workflows, including call-centre support, digital servicing, and automation.
  • Collaborate with business analysts, architects, and testers to deliver end-to-end Pega solutions.
  • Ensure adherence to guardrails, best practices, and compliance standards.
  • Provide technical guidance to junior developers and assist in code reviews.

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Pega Developer

Job Description

1. Proficient in maintenance and configuration of Guidewire products, with extensive experience in PolicyCenter, BillingCenter, and ContactManager.
2. Good working experience in Java 7 or a higher version.
3. Strong knowledge of and hands-on experience in PL/SQL.
4. Extensive experience in understanding and writing Gosu scripts.
5. Hands-on experience with configuration tools, for example GitLab and Jenkins.
6. Provide technical support for incidents raised by end users related to the PolicyCenter, BillingCenter, ContactManager, and Radar Live applications.
7. Experience in configuration of the monitoring tools Splunk and Dynatrace.

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Guidewire BillingCenter / Guidewire PolicyCenter - Domain

Job Description

Agile Way of Working; Workday Integration Cloud and Cloud Connect. Job Description: Workday Extend, Workday Studio (Expert), EIB, Core Connector, reporting, XSLT, document transformation, etc. Essential Skills: Workday Extend, Workday Studio (Expert), EIB, Core Connector, reporting, XSLT, document transformation, etc. Desirable Skills: Workday Extend, Workday Studio (Expert), EIB, Core Connector, reporting, XSLT, document transformation, etc.

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Agile Way of Working Workday Integration Cloud and Cloud Connect
