We found 153 jobs matching your search

Job Description

Job Title: Developer
Work Location: BANGALORE / all metropolitan cities
Skill Required: Digital : Google Data Engineering
Experience Range in Required Skills: 5 - 10 yrs
Job Description: Senior GCP Data Engineer
- Good hands-on knowledge of GCP and PySpark.
- Should have worked on data migration projects from on-premises to cloud.
- Should have knowledge of Cloud Storage, BigQuery, and clusters.
- Sound programming knowledge of PySpark and SQL for processing large volumes of semi-structured and unstructured data.
- Ability to design data pipelines end to end.
- Knowledge of Avro and Parquet formats.
- Knowledge of working on the Hadoop big data platform and ecosystem.
- Strong debugging and troubleshooting capabilities.
- Experience guiding the technical team to attain delivery milestones.
Essential Skills: Senior GCP Data Engineer
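
The posting above asks for end-to-end pipeline design on GCP with PySpark, Cloud Storage, and BigQuery. Purely as an illustrative sketch of that kind of task (not part of the job posting), the snippet below reads Parquet files from a Cloud Storage bucket and appends them to a BigQuery table via the spark-bigquery connector; the bucket, dataset, and table names are placeholders.

    from pyspark.sql import SparkSession

    # Minimal sketch: GCS Parquet -> BigQuery. Assumes the cluster (e.g. Dataproc)
    # already has the GCS and spark-bigquery connectors on the classpath.
    spark = SparkSession.builder.appName("gcs-to-bigquery-example").getOrCreate()

    # Placeholder locations -- replace with real bucket/dataset names.
    source_path = "gs://example-bucket/raw/events/*.parquet"
    target_table = "example_dataset.events"

    df = spark.read.parquet(source_path)

    # Light transformation step: drop exact duplicate rows before loading.
    df = df.dropDuplicates()

    (df.write
       .format("bigquery")
       .option("table", target_table)
       .option("temporaryGcsBucket", "example-bucket-tmp")  # staging bucket used by the connector
       .mode("append")
       .save())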

Responsibilities

  • Salary : Rs. 90,000 - Rs. 1,40,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Developer

Job Description

Work Location: PUNE, MH / CHENNAI, TN
Skill Required: Digital : Python, Digital : Amazon Web Service (AWS) Cloud Computing, Digital : PySpark
Experience Range: 8 to 10 Years
Role Description:
- 8 years of relevant experience in Python, PySpark, Apache Airflow, and AWS services (EKS, EMR, HashiCorp Vault, Glue, Docker, Kubernetes).
- Good hands-on knowledge of SQL and the data warehousing life cycle is an absolute requirement. Experience in creating data pipelines and orchestrating them using Apache Airflow.
- Significant experience with data migrations and development of operational data stores, enterprise data warehouses, data lakes, and data marts.
- Good to have: experience with cloud ETL and ELT in a tool such as DBT, Glue, or EMR.
- Excellent communication skills to liaise with business and IT stakeholders.
- Expertise in planning and executing a project and estimating effort.
- Exposure to Agile ways of working.
Essential Skills: AWS Data Engineer
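
Since this role centres on building and orchestrating pipelines with Apache Airflow, here is a minimal, hypothetical Airflow DAG sketch (not part of the posting) with a single daily extract/transform task; the DAG name and task body are placeholders, and the schedule= argument assumes Airflow 2.4+.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_and_transform():
        # Placeholder task body: in a real pipeline this might submit a PySpark
        # job to EMR or trigger a Glue job, then validate the output.
        print("running extract/transform step")


    with DAG(
        dag_id="example_daily_pipeline",   # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="extract_and_transform",
            python_callable=extract_and_transform,
        )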

Responsibilities

  • Salary : Rs. 90,000 - Rs. 1,65,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Developer

Job Description

Work Location: PUNE, MH / CHENNAI, TN
Skill Required: Digital : Python, Digital : Amazon Web Service (AWS) Cloud Computing, Digital : PySpark
Experience Range: 8 to 10 Years
Role Description:
- 8 years of relevant experience in Python, PySpark, Apache Airflow, and AWS services (EKS, EMR, HashiCorp Vault, Glue, Docker, Kubernetes).
- Good hands-on knowledge of SQL and the data warehousing life cycle is an absolute requirement. Experience in creating data pipelines and orchestrating them using Apache Airflow.
- Significant experience with data migrations and development of operational data stores, enterprise data warehouses, data lakes, and data marts.
- Good to have: experience with cloud ETL and ELT in a tool such as DBT, Glue, or EMR.
- Excellent communication skills to liaise with business and IT stakeholders.
- Expertise in planning and executing a project and estimating effort.
- Exposure to Agile ways of working.
Essential Skills: AWS Data Engineer

Responsibilities

  • Salary : Rs. 90,000 - Rs. 1,65,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Developer

Job Description

Work Location: PUNE, MH / CHENNAI, TN
Skill Required: Digital : Python, Digital : Amazon Web Service (AWS) Cloud Computing, Digital : PySpark
Experience Range: 8 to 10 Years
Role Description:
- 8 years of relevant experience in Python, PySpark, Apache Airflow, and AWS services (EKS, EMR, HashiCorp Vault, Glue, Docker, Kubernetes).
- Good hands-on knowledge of SQL and the data warehousing life cycle is an absolute requirement. Experience in creating data pipelines and orchestrating them using Apache Airflow.
- Significant experience with data migrations and development of operational data stores, enterprise data warehouses, data lakes, and data marts.
- Good to have: experience with cloud ETL and ELT in a tool such as DBT, Glue, or EMR.
- Excellent communication skills to liaise with business and IT stakeholders.
- Expertise in planning and executing a project and estimating effort.
- Exposure to Agile ways of working.
Essential Skills: AWS Data Engineer

Responsibilities

  • Salary : Rs. 90,000 - Rs. 1,65,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Developer

Job Description

As an Infra Tech Support Practitioner, you will provide ongoing technical support and maintenance for production and development systems and software products. Your typical day will involve troubleshooting issues, implementing technology solutions, and ensuring the smooth operation of services across various platforms. You will engage with both remote and onsite teams to address hardware and software challenges, contributing to the overall efficiency and reliability of the systems in place. Your role will require a proactive approach to problem-solving and a commitment to maintaining high standards of service delivery.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their technical skills.
- Monitor system performance and implement improvements to optimize efficiency.
Professional & Technical Skills:
- Must-Have Skills: proficiency in DevOps.
- Strong understanding of continuous integration and continuous deployment practices.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with containerization technologies like Docker and Kubernetes.
- Knowledge of scripting languages such as Python, Bash, or PowerShell.
Additional Information:
- The candidate should have a minimum of 5 years of experience in DevOps.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
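
To make the scripting and performance-monitoring requirements concrete, here is a small, hypothetical Python sketch (not part of the posting) of the kind of health check such a role might automate; it relies on the third-party psutil package, and the threshold values are arbitrary illustrations.

    import psutil  # third-party package: pip install psutil

    # Illustrative thresholds -- arbitrary values, tune for the real environment.
    CPU_ALERT_PERCENT = 85.0
    MEMORY_ALERT_PERCENT = 90.0


    def check_system_health():
        """Return a list of human-readable alerts for resources over threshold."""
        alerts = []
        cpu = psutil.cpu_percent(interval=1)   # sample CPU usage over one second
        mem = psutil.virtual_memory().percent  # current RAM usage in percent
        if cpu > CPU_ALERT_PERCENT:
            alerts.append("High CPU usage: {:.1f}%".format(cpu))
        if mem > MEMORY_ALERT_PERCENT:
            alerts.append("High memory usage: {:.1f}%".format(mem))
        return alerts


    if __name__ == "__main__":
        for line in check_system_health() or ["All monitored resources within limits."]:
            print(line)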

Responsibilities

  • Salary : Rs. 0 - Rs. 1,80,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Infra Tech Support Practitioner

Job Description

IT IS_AMS_Mainframe_IMS Administration
Job Description: This is a remote position with work hours aligned to EST shifts. Proficiency in REXX and CLIST programming is required, along with familiarity with incident management tools (ServiceNow, BMC Remedy).

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : IT IS_AMS_Mainframe_IMS Administration

Job Description

As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user expectations and business goals.
Roles & Responsibilities:
- Databricks: strong SQL and Databricks experience with AWS; DevOps CI/CD and Python are optional but preferred.
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Participate in code reviews to ensure adherence to best practices and coding standards.
Professional & Technical Skills:
- Must-Have Skills: proficiency in the Databricks Unified Data Analytics Platform.
- Good-to-Have Skills: experience with DevOps, Python (Programming Language), and Oracle Procedural Language Extensions to SQL (PL/SQL).
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and analytics.
- Familiarity with agile development methodologies.
Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.
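
As a rough, hypothetical illustration of the SQL-plus-Databricks work described above (not part of the posting), the sketch below runs a Spark SQL aggregation and writes the result as a Delta table; the database, table, and column names are placeholders, and it assumes a Databricks or otherwise Delta Lake-enabled Spark session.

    from pyspark.sql import SparkSession

    # On Databricks a SparkSession named `spark` already exists; getOrCreate()
    # simply reuses it when run there.
    spark = SparkSession.builder.appName("orders-summary-example").getOrCreate()

    # Placeholder source table -- assumed to exist in the workspace catalog.
    orders = spark.read.table("example_db.orders")
    orders.createOrReplaceTempView("orders")

    # Simple SQL aggregation: daily order counts and revenue.
    daily_summary = spark.sql("""
        SELECT order_date,
               COUNT(*)    AS order_count,
               SUM(amount) AS total_revenue
        FROM orders
        GROUP BY order_date
    """)

    # Write the result as a Delta table (the default table format on Databricks).
    (daily_summary.write
        .format("delta")
        .mode("overwrite")
        .saveAsTable("example_db.daily_order_summary"))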

Responsibilities

  • Salary : Rs. 0 - Rs. 1,80,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Application Developer

Job Description

1. Which SAP service is responsible for managing locks in a distributed system? ENQUEUE
2. What is the default port number for SAP Dispatcher communication? 3200
3. Which SAP profile parameter defines the maximum number of dialog work processes? rdisp/wp_no_dia
4. Which transaction code monitors SAP work process and memory allocation details? ST02
5. Which transaction code do you use for workload analysis in an SAP system? ST03/ST03N
6. Which tool do you use to upgrade or update an SAP system? SUM
7. Where are the default user credentials for SAP application to HANA database connectivity maintained? HDBUSERSTORE
8. Which tool is used to update or upgrade a HANA database? HDBLCM
9. Which services are hosted in an ASCS instance? Message server and enqueue server
10. Which service distributes the work processes in an application instance? Dispatcher

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SAP BASIS

Job Description

Customer Engagement Manager / Application Service Manager (ASM)
Location: SAP Pune
Experience Required: 5+ Years
Job Summary: SAP Labs Pune is looking for an Application Service Manager to take ownership of customer relationships, drive retention and renewals, and identify opportunities for growth. In this role, you will manage end-to-end transitions, collaborate with global stakeholders, and ensure smooth handovers from implementation to support. Acting as the primary point of contact for customers, you will lead regular interactions, resolve escalations, champion continuous improvement initiatives, and position SAP services to boost customer satisfaction and long-term account value.
Key Responsibilities:
• Lead end-to-end project transitions from build to run phases, ensuring seamless handover to the RUN team.
• Front-end transition activities, from initial setup to operational readiness, across multiple global accounts.
• Collaborate with internal SAP teams (Infrastructure, Delivery, Customer Teams) and global stakeholders to ensure successful service delivery.
• Act as the primary customer contact, managing engagement, service escalations, and ongoing relationship health.
• Drive renewals, reduce churn, and identify opportunities for upselling and cross-selling SAP services.
• Identify risks early and work with stakeholders to implement effective mitigation plans.
• Analyze customer processes and challenges to recommend continuous improvement (CI) and innovation opportunities.
• Conduct customer workshops to position SAP services and support long-term customer success.
Personal Attributes:
• Strong leadership and management skills.
• Strong interpersonal and communication skills.
• Customer-oriented with a solution-driven approach.
• Analytical, detail-oriented, and skilled in project management.
• Results-driven with a proactive mindset.
• Expertise in Microsoft Excel and PowerPoint.
Education Requirements:
- A master's degree from a recognized university is mandatory.
- Bachelor's degree in Engineering, Computer Science, Information Systems, or a similar domain.

Responsibilities

  • Salary : Rs. 13,74,100 - Rs. 13,74,100
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SAP BASIS