Job Description:
Knowledge of Kubernetes: As the environment is containerized and managed by Kubernetes, the candidate should have a deep understanding of Kubernetes concepts, such as pods, services, and deployments. They should also know how to troubleshoot issues within a Kubernetes environment (a minimal example is sketched after this list).
Problem-Solving Skills: The candidate should have strong problem-solving skills, including the ability to identify, analyze, and resolve complex technical issues.
Monitoring Tools: Familiarity with monitoring tools like Prometheus and Grafana to monitor the performance of a Kubernetes environment.
Performance Tuning: The candidate should have experience in performance tuning and optimization of SAS programs. This includes understanding how to identify and resolve performance issues, such as slow-running programs or memory leaks.
Knowledge of Cloud Platforms: Understanding of cloud platforms, and Azure in particular.
Communication Skills: The candidate should have strong communication skills, as they will need to communicate with various stakeholders, including developers, system administrators, and end users.
Understanding of Large Userbase Management: The candidate should have experience managing a large userbase, including understanding how to optimize performance for many users.
Containerization and Docker Skills: The candidate should have a good understanding of containerization concepts and Docker. They should be able to create, manage, and troubleshoot Docker containers.
SAS Programming Skills: The candidate should have a strong understanding of SAS programming, including the ability to write, debug, and optimize SAS code.
Experience with Domino Data Lab: The candidate should have experience with the Domino product from Domino Data Lab. This includes understanding how to use the platform to manage and run SAS programs, as well as how to troubleshoot any issues that may arise.
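As a rough illustration of the day-to-day Kubernetes troubleshooting described above, the sketch below lists pods in a namespace that are not healthy. It uses the official kubernetes Python client; the namespace name is a hypothetical placeholder, not something specified in the posting.

```python
# Minimal sketch: flag unhealthy pods in a namespace (namespace name is hypothetical).
from kubernetes import client, config

def report_unhealthy_pods(namespace: str = "sas-compute") -> None:
    config.load_kube_config()  # or config.load_incluster_config() when running inside a pod
    v1 = client.CoreV1Api()
    for pod in v1.list_namespaced_pod(namespace).items:
        phase = pod.status.phase
        restarts = sum(cs.restart_count for cs in (pod.status.container_statuses or []))
        # Anything not Running/Succeeded, or restarting repeatedly, is worth a closer look.
        if phase not in ("Running", "Succeeded") or restarts > 3:
            print(f"{pod.metadata.name}: phase={phase}, restarts={restarts}")

if __name__ == "__main__":
    report_unhealthy_pods()
```

From there, the flagged pods would typically be inspected further with kubectl describe pod and kubectl logs.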
Essential skills:
• 12+ years relevant experience in DevOps
• 8+ years experience in Azure DevOps and GitHub Actions pipelines
• Should be able to guide and lead the team from a technical perspective
• Understanding of data tools (ADB, ADF, Snowflake, Denodo)
Good to have skills:
• Python (3+ years)
• Infra knowledge (Azure Cloud) (2+ years)
• GenAI (basic)
1. Azure Cloud
2. CI/CD
3. AKS
4. Azure DevOps
Azure DevOps, Kubernetes, Docker, StackStorm, Azure Cloud, ArgoCD, Python
• Strong CI/CD experience with Azure Pipelines
• Experience with Azure Cloud
• Strong expertise in Kubernetes and Docker
• Must have relevant experience (4+ years)
Desirable skills:
1. MongoDB
2. Linux
3. GitHub
4. Agile Scrum
5. Domino
6. SAS
Responsibilities
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: DevOps ~ Digital: DevOps Continuous Integration and Continuous Delivery (CI/CD) ~ Digital: Docker ~ Digital: Kubernetes
5+ years of experience in New Relic.
Hands-on expertise with New Relic, encompassing its full suite of capabilities. The candidate will be actively involved in implementation, troubleshooting, and continuous improvement. Experience with other observability tools and AWS services is highly desirable and will be a significant asset.
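Purely as an illustration of the hands-on instrumentation work implied here, the sketch below records a custom event from a background job using the New Relic Python agent. The config file name, task name, and event attributes are hypothetical placeholders; a valid newrelic.ini with a license key is assumed.

```python
# Minimal sketch, assuming a valid newrelic.ini with a license key (file name is hypothetical).
import time
import newrelic.agent

newrelic.agent.initialize("newrelic.ini")                    # load agent configuration
application = newrelic.agent.register_application(timeout=10.0)

@newrelic.agent.background_task(name="nightly-report", group="Task")
def nightly_report():
    started = time.time()
    # ... the actual batch work would run here ...
    newrelic.agent.record_custom_event(
        "NightlyReportRun",
        {"duration_s": time.time() - started, "status": "ok"},
        application=application,
    )

if __name__ == "__main__":
    nightly_report()
    newrelic.agent.shutdown_agent(timeout=10.0)              # flush data before the process exits
```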
Responsibilities
Salary: Rs. 10,00,000 - Rs. 25,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Description:
Experience with the TOSCA automation tool. Automation skills required beyond the recording features; knowledge of advanced coding methods expected. Mandatory to have good knowledge of properties such as reusable blocks, recovery, loops, checkboxes, repetition, Excel operations, APIs, etc. Should be very good at analyzing and solving all kinds of test cases as per client expectations. Strong understanding and implementation of TOSCA features such as Requirements, Test Case Design, Modules, Test Cases, Execution Lists, Web Services, and the Classic/TBox framework. SAP QA background with strong analytical skills and the ability to decompose requirements into testable cases covering positive, negative, and a variety of data scenarios. Experience in testing backend services and integrations with upstream/downstream systems. Sound SQL skills, including complex query writing and the ability to use a variety of SQL tools.
Responsibilities
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: SAP Analytics Cloud ~ TOSCA Automation Tester
Job Description:
Managing data with Workbench, Salesforce Inspector, and the Data Import Wizard (a programmatic equivalent is sketched after this list).
Creating validation rules.
Following standards, specifications, and policies supported within the Salesforce ecosystem.
Integrate multiple technologies with Salesforce Marketing Cloud.
Architect, design, and develop advanced customizations utilizing Salesforce Marketing Cloud.
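As a hedged illustration of the data-management and API-integration work listed above, the sketch below queries and updates records through the Salesforce REST API using the simple_salesforce library. The credentials, object, and field values are hypothetical placeholders.

```python
# Minimal sketch: query and update Salesforce records via the REST API
# (credentials and field values below are hypothetical placeholders).
from simple_salesforce import Salesforce

sf = Salesforce(
    username="admin@example.com",
    password="********",
    security_token="********",
)

# Pull accounts touched today, similar to what one might inspect in Workbench.
result = sf.query("SELECT Id, Name, Industry FROM Account WHERE LastModifiedDate = TODAY")
for record in result["records"]:
    print(record["Id"], record["Name"])
    # Example cleanup: default a missing Industry value.
    if not record["Industry"]:
        sf.Account.update(record["Id"], {"Industry": "Other"})
```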
Essential Skills:
Sales Cloud and Service Cloud experience
Lightning migration experience
Design and architect Salesforce solutions
Prepare technical design and review SFDC code
Certified Application Architect & System Architect
Experience in Enterprise Architecture and Integration (API)
Experienced in multi-cloud applications
Experience in platform development (Force.com)
Mentoring and grooming other team members as required
Responsibilities
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Salesforce Service Cloud ~ Digital: Salesforce Sales Cloud ~ Digital: Salesforce Field Service Lightning
SAP SuccessFactors - Learning Management Systems (SF-LMS) ~ Cornerstone OnDemand Integration & Migration ~ Edcast Learning eXperience (LXP) ~ MySQL
Job Description:
Experience with LXP and SMB Essentials modules (Critical)
Hands-on exposure to Cornerstone Web Services (APIs), especially for auto-provisioning integrations (Critical)
Recent and consistent hands-on experience with Cornerstone (Critical)
Proficiency with SQL (Good to have)
Relevant product certifications (Good to have)
Essential Skills:
Experience with LXP and SMB Essentials modules (Critical)
Hands-on exposure to Cornerstone Web Services (APIs), especially for auto-provisioning integrations (Critical)
Recent and consistent hands-on experience with Cornerstone (Critical)
Proficiency with SQL (Good to have)
Relevant product certifications (Good to have)
Responsibilities
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: SAP SuccessFactors - Learning Management Systems (SF-LMS) ~ Cornerstone OnDemand Integration & Migration ~ Edcast Learning eXperience (LXP) ~ MySQL
Job Description:
Responsible for administration and support of the Microsoft 365 environment (Exchange Online, MS Teams, Active Directory). Manage user accounts, mailboxes, licenses, and groups through the M365 Admin Center and PowerShell. Manage the Proofpoint email security system, including implementation and customization of policies and rules. Maintain and update documentation related to configuration, procedures, and change management. Automate administrative tasks using PowerShell scripts. Stay updated with the Microsoft 365 roadmap and recommend relevant improvements.
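The posting itself asks for PowerShell automation; purely as a language-neutral illustration of automating an M365 admin task programmatically, the sketch below lists users via Microsoft Graph from Python, using MSAL client-credentials authentication. The tenant ID, client ID, and secret are hypothetical placeholders, and an Azure AD app registration with the Graph User.Read.All application permission is assumed.

```python
# Minimal sketch: list Microsoft 365 users via Microsoft Graph
# (tenant ID, client ID, and secret below are hypothetical placeholders).
import msal
import requests

TENANT_ID = "00000000-0000-0000-0000-000000000000"
CLIENT_ID = "11111111-1111-1111-1111-111111111111"
CLIENT_SECRET = "********"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
if "access_token" not in token:
    raise RuntimeError(token.get("error_description", "token acquisition failed"))

resp = requests.get(
    "https://graph.microsoft.com/v1.0/users?$select=displayName,userPrincipalName",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    timeout=30,
)
resp.raise_for_status()
for user in resp.json()["value"]:
    print(user["displayName"], user["userPrincipalName"])
```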
Essential Skills:
O365, Exchange
Desirable Skills:
Responsibilities
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Databricks: experienced in designing scalable data architectures that can be implemented for modern data platforms within our existing data landscape (Databricks, Unity Catalog), and experienced in AWS and Python.
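As a rough PySpark sketch of working against Unity Catalog in the kind of Databricks environment described here, the snippet below reads a table via the three-level catalog.schema.table namespace and writes an aggregate back as a Delta table. The catalog, schema, table, and column names are hypothetical placeholders.

```python
# Minimal sketch for a Databricks notebook/job: read a Unity Catalog table and
# write an aggregate back as a managed Delta table (names are hypothetical).
from pyspark.sql import SparkSession, functions as F

# In a Databricks notebook the session is already provided as `spark`.
spark = SparkSession.builder.getOrCreate()

orders = spark.table("main.sales.orders")          # catalog.schema.table

daily_revenue = (
    orders
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"))
)

(
    daily_revenue.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("main.sales.daily_revenue")
)
```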
Responsibilities
Salary: Rs. 10,00,000 - Rs. 25,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance