We found 1214 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Job Description:
  • Very strong hands-on experience with Microsoft Azure services such as Azure Data Factory, Databricks (e.g., processing streaming data using Spark clusters; a minimal illustrative sketch follows this listing), Blob containers, ESB, Event Grid, Azure SQL Server, Cosmos DB, Azure Functions, and analytics (e.g., Power BI) is a mandatory requirement.
  • Deep Microsoft Azure product knowledge and understanding of product features; exposure to other cloud solutions (Google, AWS) is a plus.
  • Proven track record in designing, developing and implementing data models at the conceptual, logical and physical levels.
  • Experience with accountability for end-to-end (E2E) data architecture and lifecycle management, including non-functional requirements and operations.
  • Understands industry-recognised data modelling patterns and standards.
  • Understands overall IT system design, in particular networking, authorization and authentication protocols, data security, and disaster recovery.
  • Knowledge of the latest technologies for processing large amounts of data (Apache Spark; Databricks is a plus).
  • Experience with MDM, Metadata Management, Data Quality and Data Lineage tools.
  • Expertise in cloud security and identity best practices.
  • A deep understanding of ETL and ELT tools and techniques.
  • Strong experience architecting "fit-for-purpose" and "fit-for-use" complex solutions on the Microsoft Azure cloud stack (IaaS, PaaS and SaaS).
  • Excellent understanding of architecture patterns and best practices, with proven innovative and leading-edge thinking to build cloud-ready systems (e.g., domain-driven architecture, multi-tenancy, building for resilience, scalability, performance, federation, microservices).
  • Hands-on work experience on application platforms such as .NET, Python or Java preferred.
  • Understanding of big-data platform concepts and NoSQL technologies, and the ability to map requirements onto them.
  • Experience modelling traditional structured data as well as information design for Data Lake and NoSQL database engines.
  • Fluent with modelling, development and project execution processes around star, snowflake, 3NF and Data Vault approaches.
  • Good experience working with data using SQL, Python and Scala on Microsoft-based resources such as Azure Synapse and Data Lake.
  • Advanced working SQL knowledge and experience with relational databases and query authoring, plus working familiarity with a variety of databases.

Essential Skills: As per the job description requirements above.

Desirable Skills:
  • Excellent interpersonal, organizational, written communication, oral communication and listening skills.
  • Should produce work estimates and provide input to managers on resource and risk planning.
  • Ability to coordinate with SMEs and stakeholders, manage timelines and escalations, and provide on-time status updates.
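As a minimal, hedged sketch of the streaming workload described above: the following PySpark Structured Streaming snippet reads JSON events from an Azure storage container and appends them to a Delta table on a Databricks-style cluster. The storage account, container, paths and event schema are hypothetical placeholders, not details from this posting.

```python
# Minimal PySpark Structured Streaming sketch: read JSON events from an Azure
# storage path and append them to a Delta table. All names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("azure-event-stream-sketch").getOrCreate()

# Streaming file sources require an explicit schema (no inference by default).
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Hypothetical ADLS Gen2 container path; storage credentials are assumed to be
# configured on the cluster (e.g., via a service principal or access key).
source_path = "abfss://events@examplestorageacct.dfs.core.windows.net/raw/"

events = spark.readStream.schema(event_schema).json(source_path)

# Append to a Delta table, with a checkpoint so the stream can recover on restart.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events")
    .outputMode("append")
    .start("/mnt/delta/events")
)
query.awaitTermination()  # block until the stream is stopped
```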

Responsibilities

  • Salary : As per industry standard.
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :Databricks, Azure Data Factory

Job Description

Job Description: TIBCO BW CE/6.x Developer
  1. 5+ years of hands-on development experience in application integration using the TIBCO product stack.
  2. At least 2 years of working experience on TIBCO BW6 / BWCE.
  3. Strong technical experience in TIBCO BW 6 / BWCE (Container Edition) and EMS.
  4. Experience working on Jenkins CI/CD pipelines and supporting containerized runtime environments (such as Maven, Docker, Kubernetes). (Optional)
  5. Good understanding of XSD, XML, XSLT, XPath, JSON, and web services (SOAP and REST); see the illustrative sketch after this list.
  6. Good to have experience/knowledge of SQL.
  7. Knowledge of TIBCO Mashery / Grafana is a plus.
  8. Should have knowledge of Jenkins pipeline deployments.
  9. Should have experience with at least one monitoring tool.
  10. Should have knowledge of TIBCO BW administration.
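TIBCO BW itself is configured visually rather than scripted in Python, so the snippet below is only a generic, hedged illustration of the XML skills the posting lists (XPath selection and a small XSLT transform), written with the lxml library; the sample document and stylesheet are hypothetical.

```python
# Generic XPath/XSLT illustration with lxml (not TIBCO-specific).
from lxml import etree

order_xml = '<order id="42"><item sku="A1" qty="2"/><item sku="B7" qty="1"/></order>'
doc = etree.fromstring(order_xml)

# XPath: select the SKUs of all items with quantity greater than 1.
skus = doc.xpath("//item[@qty > 1]/@sku")
print(skus)  # ['A1']

# XSLT: flatten the order into a plain-text summary.
xslt_doc = etree.fromstring("""\
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/order">
    <xsl:for-each select="item">
      <xsl:value-of select="@sku"/>:<xsl:value-of select="@qty"/><xsl:text>&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>""")
transform = etree.XSLT(xslt_doc)
print(str(transform(doc)))  # "A1:2" and "B7:1", one per line
```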

Responsibilities

  • Salary : As per industry standard.
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :TIBCO BW

Job Description

Job Description: Pega Data Scientist Essential Skills: Pega Data Scientist

Responsibilities

  • Salary : As per industry standard.
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :Pega, Digital: Pega Decision Management

Job Description

Job Description: AWS DevOps, Terraform

Responsibilities

  • Salary : As per industry standard.
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role : DevOps

Job Description

Job Description: ClickFSE Developer Essential Skills: ClickFSE Developer Desirable Skills: ClickFSE Developer

Responsibilities

  • Salary : As per industry standard.
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :EIS : ClickSchedule

Job Description

React Next

Responsibilities

  • Salary : Rs. 1,00,000.0 - Rs. 20,00,000.0
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :React Next

Job Description

SAP SD

Responsibilities

  • Salary : As per industry standard.
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :SAP SD

Job Description

What You Will Do
  • Understand the business functionalities and technical landscapes of the applications in scope under the HRT Capability.
  • Collaborate with software engineers, functional SMEs, product owners, scrum masters and BSAs to understand requirements/user stories and testing needs.
  • Build and maintain a test automation framework using Tosca and create robust, maintainable test cases covering various testing scenarios, including functional and regression (a generic illustrative sketch follows this listing).
  • Participate in test planning, design, maintenance and execution of automation test scripts.
  • Implement quality engineering strategies, best practices and guidelines, ensuring scalability, reusability and maintainability.
  • Collaborate with application team members for successful execution of end-to-end functional flows.
  • Understand the complexity involved and derive the testing effort.
  • Prepare detailed test reports and maintain test documentation.
  • Identify, replicate and report defects, and verify defect fixes.
  • Communicate QA status to product teams.
  • Proactive, strong-minded, contribution-motivated, adaptable to change, with strong verbal and written communication skills.
Who You Are (Basic Qualifications)
  • 4-6 years' experience in designing test automation frameworks, scripting and maintenance using TOSCA across the SDLC.
  • Good knowledge of Test Data Management, Test Data Service and TQL implementations using TOSCA.
  • Basic knowledge of TOSCA administrative activities such as distributed execution maintenance and debugging.
  • Strong knowledge of QA methodologies, tools and processes.
  • Good knowledge of Agile methodology.
  • Excellent at executing integration, regression, smoke and sanity automation tests.
  • Strong technical knowledge of UI (web and Windows) applications.
  • Strong technical knowledge of API testing.
  • Experience with test management using ADO (Azure DevOps).
  • Strong attention to detail, ability to work under pressure, critical-thinking skills, and the ability to operate with minimal supervision.
  • Contribution motivated.
  • Good verbal and written communication skills.
  • Good mentorship and knowledge-sharing skills.
What Will Put You Ahead
  • Good understanding of the Infor, Avature, Dayforce, Kronos (UKG) and/or ServiceNow platforms.
  • Experience with test management using Tricentis qTest.
  • Experience and working knowledge of automation testing using TOSCA.
  • Basic understanding of integrating TOSCA with various test management tools.
  • Hands-on experience with Azure DevOps.
  • Hands-on experience with DevOps practices such as CI/CD pipeline setup and Tosca administrative set-ups.
  • Strong analytical, problem-solving and troubleshooting skills.
  • Good knowledge of current market trends.
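Tricentis Tosca is a commercial, model-based tool and is not authored in Python, so the sketch below is only a generic, hedged illustration of the small, maintainable functional and regression checks this role describes, written with pytest instead; the rule under test and the expected values are hypothetical.

```python
# Generic functional/regression-style checks with pytest (illustrative only; not Tosca).
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Stand-in for an application business rule under test (hypothetical)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

@pytest.mark.parametrize(
    "price, percent, expected",
    [
        (100.0, 0, 100.0),   # smoke: no discount applied
        (100.0, 15, 85.0),   # functional: typical case
        (80.0, 25, 60.0),    # regression: previously verified behaviour
    ],
)
def test_apply_discount(price, percent, expected):
    assert apply_discount(price, percent) == expected

def test_apply_discount_rejects_invalid_percent():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```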

Responsibilities

  • Salary : Rs. 8,00,000.0 - Rs. 15,00,000.0
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :QA Automation Engineer

Job Description

Primary Skill: SAP FI Asset Accounting, with minimal leasing implementation/support experience
Experience: 8 to 12 years
Project: Support project
Job Description: SAP FI Asset Accounting expert with 8-12 years of experience
Technical skill set: SAP FI Asset Accounting expert
  • Should have a minimum of 7 years of experience in FI Asset Accounting; minimal implementation/support experience in SAP FI-Leasing is mandatory.
  • Should be able to design and configure SAP Asset Accounting/Leasing solutions.
  • Should have experience with integration to FI, CO, AA, PP and other modules, and with implementation of SAP best practices.
  • Should be well versed in demonstrated configuration skills in key areas such as AR, AP, General Ledger, Asset Accounting, cost center accounting and profit center accounting.
  • Should have experience writing functional specifications independently and should have worked on custom objects built from scratch through deployment.
  • Should have experience in at least 2 end-to-end implementations.
  • Should have good experience in areas such as Concur, T&E and industry-specific solutions.
  • Should have been in a customer-facing role.
  • Should be able to analyze client business processes, gather requirements, and maintain the data mapping document for interfaces and conversions.
  • Should have good experience with interfaces (an illustrative RFC sketch follows this listing).
  • Good experience writing functional specifications; experience with SAP S/4HANA Enterprise Contract Management (ECM) is good to have.
Must Have Skills:
  • Deep expertise in SAP FI-AA application functionality, design and implementation.
  • Minimal implementation/support expertise in SAP FI-Leasing application functionality, design and implementation.
  • Implementation and support experience in SAP FICO, including GL, AR, AP, AA, CCA and PCA.
  • Must be skilled in user support, troubleshooting and error resolution.
  • Must have good experience in RICEF.
  • Must have worked on interfaces using IDocs, ALE, Proxies, EDI and RFC.
  • Must have strong business understanding and be able to suggest SAP solutions for various business scenarios.
  • Ability to help resolve complex issues and independently manage critical/complex situations.
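As one small, hedged illustration of the RFC-based interface work mentioned above, the sketch below uses the SAP-provided pyrfc Python connector to call the standard connectivity-test function module STFC_CONNECTION. The host, system number, client and credentials are placeholders, and pyrfc additionally requires the SAP NetWeaver RFC SDK to be installed.

```python
# Minimal RFC call via pyrfc (requires the SAP NW RFC SDK).
# Connection parameters below are placeholders, not values from this posting.
from pyrfc import Connection

conn = Connection(
    ashost="sap-app-host.example.com",  # application server host (placeholder)
    sysnr="00",                         # system number
    client="100",                       # SAP client
    user="RFC_USER",                    # service user (placeholder)
    passwd="********",
)

# STFC_CONNECTION is SAP's standard connectivity-test function module:
# it echoes the text sent to it and returns basic system information.
result = conn.call("STFC_CONNECTION", REQUTEXT="ping from interface check")
print(result["ECHOTXT"])   # the echoed request text
print(result["RESPTEXT"])  # system/response details

conn.close()
```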

Responsibilities

  • Salary : Rs. 0.0 - Rs. 35.0
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :SAP FI AA Consultant - Kripa