Job Title: Consultant
Work Location: Hyderabad / Pune / Bangalore / Thane
Skills Required: SAP ERP Human Capital Management (HCM)
Experience Range in Required Skills: 5+ Relevant years
Job Description: Develop and provide functional and technical expertise for HR solutions focusing on areas such as Personnel Administration, Time Management (Negative and Positive), and Payroll. Must have Time Management (Negative and Positive) expertise and a good understanding of HCM functionalities in order to prepare high-quality functional specifications as per the requirements. Good experience in SAP Time Management (Negative and Positive), with strong experience in customizing Personnel Calculation Rules (PCRs) and Schemas. Extensive experience in Time Management / Attendance modules, with at least 1-2 full life cycle implementations. Knowledge of the employee life cycle and the payroll process is an advantage. Liaise directly with business areas to diagnose problems with existing SAP HCM programs or independently initiate designs of new SAP applications, to define business needs and potential solutions based on information technology.
Key Responsibilities:
- Collaborate with team members and key stakeholders to gather requirements, identify process improvement opportunities, and develop effective solutions aligned with organizational goals.
- Must have experience in SAP Time Management (Negative and Positive) and Payroll, with strong experience in customizing Personnel Calculation Rules (PCRs) and Schemas.
- Assist in the development and implementation of Time and Payroll functionalities.
- Work with other team members to ensure project deadlines are met.
- Contribute to process improvements, system enhancements, and general problem solving to improve customer experience and team effectiveness.
- Participate in team meetings to discuss project updates and progress.
Salary : Rs. 70,000.0 - Rs. 1,10,000.0
Industry :IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
1. Concur Expense consultant with experience in configuration and integration with credit card systems.
2. Someone who can also document procedures and train end users and accounting staff on the GL integration process.
3. Improve end-user experience and enable better reporting for travel spend analysis.
4. Responsible for setting up and helping maintain the client Concur environment throughout the lifecycle (setup and configuration in Test and Production).
5. Monitor the daily job between SAP Concur and SAP Finance and correct errors (IDoc posting).
6. Support the employee reimbursement process via Payroll.
7. Support incidents, user access requests, and reporting.
8. Support client testing cycles; research complex customer problems, issues, and circumstances and provide recommendations, alternatives, and risk assessments.
9. Minimum 5+ years' relevant SAP Concur Expense implementation experience; experience with other Concur platforms such as Invoice, Travel, and Request is a plus.
10. Hands-on functional configuration and design experience.
11. Prior work experience in Accounting and/or Accounts Payable; knowledge of business processes around account settlement, clearing accounts, etc.
12. For Concur, the resource needs to be certified to get access to the Concur application.
Salary : As per industry standard.
Industry :IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in shaping the direction of application projects and ensuring that they meet the needs of the organization and its clients.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Microsoft Azure Databricks.
- Good-to-Have Skills: Experience with cloud computing platforms.
- Strong understanding of application development methodologies.
- Experience in managing cross-functional teams.
- Familiarity with Agile and DevOps practices.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Databricks.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Salary : Rs. 0.0 - Rs. 18,00,000.0
Industry :IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Description: * CMDB configuration and customization: Design, develop, and maintain the CMDB, including configuring Configuration Item (CI) classes, attributes, and relationships according to the Common Service Data Model (CSDM) framework.
* Discovery and service mapping: Implement, manage, and configure ServiceNow Discovery to automate IT asset identification and track CIs. Configure and maintain Service Mapping to build and visualize relationships between CIs.
* Identification and reconciliation: Configure and use the Identification and Reconciliation Engine (IRE) to manage data imports from multiple sources and ensure data integrity within the CMDB.
* Integrations: Develop and maintain integrations between the ServiceNow CMDB and other IT systems (e.g., SCCM, monitoring tools, cloud services) using REST/SOAP APIs and IntegrationHub to ensure comprehensive and accurate CI data (a minimal query sketch appears after this list).
* Scripting and development: Write and maintain server-side (JavaScript, Glide API) and client-side (JavaScript) scripts, as well as Flow Designer actions, to enhance CMDB functionality and automate processes.
Collaboration and support
* Troubleshooting: Diagnose, isolate, and resolve complex CMDB-related issues, such as data discrepancies, integration failures, and discovery errors.
* Technical guidance: Provide technical support and guidance to other IT teams, process owners, and stakeholders regarding CMDB best practices and capabilities.
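As context for the REST-based CMDB integration work referenced in the Integrations bullet above, here is a minimal, non-authoritative sketch that reads server CIs through the standard ServiceNow Table API from Python. The instance URL, credentials, and field selection are placeholders for illustration only, not details of any actual environment.

```python
import requests

# Placeholder instance and service-account credentials (hypothetical values).
INSTANCE = "https://example-instance.service-now.com"
AUTH = ("integration_user", "integration_password")

def fetch_server_cis(limit=10):
    """Query the cmdb_ci_server table and return a few fields per CI."""
    resp = requests.get(
        f"{INSTANCE}/api/now/table/cmdb_ci_server",
        auth=AUTH,
        headers={"Accept": "application/json"},
        params={
            "sysparm_fields": "name,sys_id,operational_status,ip_address",
            "sysparm_limit": limit,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]

if __name__ == "__main__":
    for ci in fetch_server_cis():
        print(ci["name"], ci.get("ip_address", ""))
```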
Salary : As per industry standard.
Industry :IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Experience Range in Required Skills: 6-8
// Overall 5+ years' experience can be considered
Shift: 2 pm - 11 pm, with 4 hours' WFH flexibility
Location: PAN India
// Data Science profile required, not Data Engineer
Primary: strong in AI/ML - 3+ yrs
Secondary: PySpark - 2+ yrs
// Continuous interviews are happening for this one.
Interview Feedback / Questions / Pointers:
> Should have hands-on experience with real-time data as well as solid academic projects.
> Data Science / ML concepts, and should be able to explain them well.
> Programming languages: Python and PySpark
> Working knowledge of predictive / ML-based models
> Working experience with cloud platforms
---- Connect with me for a sample resume
Job Description: Must have: Candidate must have expertise/experience with the below tasks:
- Candidate must have experience with Linux, Git, CI/CD, release management, production deployment, and support.
- Strong knowledge of Apache Spark is a MUST
- Strong knowledge of PySpark is a MUST
- Strong knowledge of SQL is a MUST
- Good knowledge of Data Science workloads
- Good knowledge of Kubernetes/Docker
- Good knowledge of Python is a MUST
- Good knowledge of the Java language
- Good to have: conceptual knowledge of Apache Airflow, Apache Atlas, Apache Ranger, Postgres, Trino, Superset
Essential Skills (additional, for reference): Python, PySpark, SQL, Airflow, Trino, Hive, Snowflake, Agile Scrum, Linux, OpenShift, Kubernetes, Superset.
- 5+ years of experience in data engineering, ELT development, and data modeling.
- Proficiency in using Apache Airflow and Spark for data transformation, data integration, and data management.
- Experience implementing workflow orchestration using tools like Apache Airflow, SSIS, or similar platforms (see the DAG sketch after this list).
- Demonstrated experience in developing custom connectors for data ingestion from various sources.
- Strong understanding of SQL and database concepts, with the ability to write efficient queries and optimize performance.
- Experience implementing DataOps principles and practices, including data CI/CD pipelines.
- Excellent problem-solving and troubleshooting skills, with strong attention to detail.
- Effective communication and collaboration abilities, with a proven track record of working in cross-functional teams.
- Familiarity with data visualization tools such as Apache Superset and dashboard development.
- Understanding of distributed systems and working with large-scale datasets.
- Familiarity with data governance frameworks and practices.
- Knowledge of data streaming and real-time data processing technologies (e.g., Apache Kafka).
- Strong understanding of software development principles and practices, including version control (e.g., Git) and code review processes.
- Experience with Agile development methodologies and working in cross-functional Agile teams.
- Ability to adapt quickly to changing priorities and work effectively in a fast-paced environment.
- Excellent analytical and problem-solving skills, with keen attention to detail.
- Strong written and verbal communication skills, with the ability to effectively communicate complex technical concepts to both technical and non-technical stakeholders.
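To illustrate the Airflow-based workflow orchestration listed in the essential skills, here is a minimal sketch of a daily DAG. The DAG id, task names, and callables are hypothetical placeholders rather than anything from this role's actual pipelines; the Airflow 2.x API is assumed.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Illustrative placeholder callables; a real pipeline would read from and load
# into actual systems (e.g., Hive, Trino, or Snowflake as listed in the skills).
def extract(**context):
    print("extracting source data")

def load(**context):
    print("loading transformed data")

with DAG(
    dag_id="example_daily_elt",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older 2.x uses schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```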
Comments for Suppliers:
Salary : Rs. 70,000.0 - Rs. 1,30,000.0
Industry :IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
DevOps Architect Role JD
Skills/JD/Role Requirements
Minimum 8+ years Relevant Experience
The position is for a DevOps Architect. We are looking for an offshore candidate, well versed in AWS services, Kubernetes, and DevOps strategy, with infrastructure and automation expertise, to work on cutting-edge tools and applications, streamline the SDLC, support the CI/CD team and create pipelines, and work with the SRE team to address architecture and development issues.
This is a design role.
In addition to technical expertise, the right candidate will have excellent communication skills and good problem-solving ability, work well with the implementation teams, and be self-driven. The primary output will be the design work and its acceptance by the architecture team as well as the implementation leads. There will also occasionally be small proof-of-concept projects that the candidate must implement prior to finalization of the design.
Salary : Rs. 10,00,000.0 - Rs. 25,00,000.0
Industry :IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Job Title: Engineer
Work Location: Chennai / Hyderabad / Bangalore / Kolkata // PAN India
Skill Required: Digital: Apache Spark, Digital: Python for Data Science, Digital: PySpark, MySQL
Role required: Data Engineer with Data Science
Experience Range in Required Skills: 6-8
// Overall 5+ years' experience can be considered
Shift: 2 pm - 11 pm, with 4 hours' WFH flexibility
Location: PAN India
// Data Science profile required, not Data Engineer
Primary: strong in AI/ML - 3+ yrs
Secondary: PySpark - 2+ yrs
// Continuous interviews are happening for this one.
Interview Feedback / Questions / Pointers:
> Should have hands-on experience with real-time data as well as solid academic projects.
> Data Science / ML concepts, and should be able to explain them well.
> Programming languages: Python and PySpark
> Working knowledge of predictive / ML-based models
> Working experience with cloud platforms
---- Connect with me for a sample resume
Job Description: Must have: Candidate must have expertise/experience with the below tasks:
- Candidate must have experience with Linux, Git, CI/CD, release management, production deployment, and support.
- Strong knowledge of Apache Spark is a MUST
- Strong knowledge of PySpark is a MUST
- Strong knowledge of SQL is a MUST
- Good knowledge of Data Science workloads
- Good knowledge of Kubernetes/Docker
- Good knowledge of Python is a MUST
- Good knowledge of the Java language
- Good to have: conceptual knowledge of Apache Airflow, Apache Atlas, Apache Ranger, Postgres, Trino, Superset
Essential Skills: Python, PySpark, SQL, Airflow, Trino, Hive, Snowflake, Agile Scrum, Linux, OpenShift, Kubernetes, Superset.
- 5+ years of experience in data engineering, ELT development, and data modeling.
- Proficiency in using Apache Airflow and Spark for data transformation, data integration, and data management (see the PySpark sketch after this list).
- Experience implementing workflow orchestration using tools like Apache Airflow, SSIS, or similar platforms.
- Demonstrated experience in developing custom connectors for data ingestion from various sources.
- Strong understanding of SQL and database concepts, with the ability to write efficient queries and optimize performance.
- Experience implementing DataOps principles and practices, including data CI/CD pipelines.
- Excellent problem-solving and troubleshooting skills, with strong attention to detail.
- Effective communication and collaboration abilities, with a proven track record of working in cross-functional teams.
- Familiarity with data visualization tools such as Apache Superset and dashboard development.
- Understanding of distributed systems and working with large-scale datasets.
- Familiarity with data governance frameworks and practices.
- Knowledge of data streaming and real-time data processing technologies (e.g., Apache Kafka).
- Strong understanding of software development principles and practices, including version control (e.g., Git) and code review processes.
- Experience with Agile development methodologies and working in cross-functional Agile teams.
- Ability to adapt quickly to changing priorities and work effectively in a fast-paced environment.
- Excellent analytical and problem-solving skills, with keen attention to detail.
- Strong written and verbal communication skills, with the ability to effectively communicate complex technical concepts to both technical and non-technical stakeholders.
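As a small illustration of the Spark/PySpark transformation work listed above, the following minimal sketch aggregates a hypothetical orders dataset. The input path, column names, and output location are placeholders, not details of the actual project.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal PySpark transformation sketch; paths and columns are hypothetical.
spark = SparkSession.builder.appName("example_transform").getOrCreate()

orders = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/data/raw/orders.csv")  # placeholder input path
)

# Derive a date column and compute simple daily aggregates.
daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_amount"),
    )
)

daily_totals.write.mode("overwrite").parquet("/data/curated/daily_totals")  # placeholder output

spark.stop()
```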
Comments for Suppliers:
Salary : As per industry standard.
Industry :IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Experience Range in Required Skills: 5+ Relevant years
Job Description:
• Expertise in configuration & implementation w.r.t. the Sales & Distribution (SD) module
• Expertise in Pricing functionality
• Must be aware of the integration of SD with MM & FI
• Should have good experience with the third-party process, STO process, IDocs, and Workflow
• Expertise in providing solutions to fulfill requirements and map business processes in the system (customization)
Salary : As per industry standard.
Industry :IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance