We found 48 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Job Description:
  • Experience: 5-7 years
  • Communication: Good (the profile will be working in a customer-managed Scrum team)
  • Technical Experience:
      o Salesforce Industries CPQ (Vlocity CPQ)
      o Order Management
      o FlexCard
      o OmniStudio
      o Salesforce Core

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Salesforce

Job Description

Back-end Developer (Event Streaming - Kafka/Spark). Primary Skills: J2EE, Spring Boot, Distributed Database, BigQuery, Real-Time Streaming (Spark). Good to have: some knowledge of .NET (C).

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Digital : Kafka ~ Advanced Java Concepts ~ Digital : Spring Boot

Job Description

Job Description: Azure Data Engineer
What is strictly required (also listed as Essential Skills):
  • Minimum 5-6 years of hands-on experience building ETL pipelines with Azure Data Factory, Azure Synapse, and PySpark
  • Notice period of 3-4 weeks
  • Average to excellent communication skills, with experience managing client delivery independently
Desirable Skills: Azure Data Engineer

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Azure Data Engineer

Job Description

Job Description: Skilled and proactive Automation Developer with expertise in Python, Selenium, and BDD frameworks such as Behave or Radish. Responsible for designing, developing, and maintaining automated test suites in a continuous testing environment. Experience with Jira is essential.

Key responsibilities:
  • Design and implement robust automation suites using Python, Selenium, and Behave/Radish
  • Develop and maintain BDD-style test scenarios and step definitions
  • Collaborate with QA, development, and product teams to define acceptance criteria and test coverage
  • Integrate automated tests into CI/CD pipelines to support continuous testing
  • Maintain traceability of test cases and defects using Jira

Required skills & qualifications:
  • Strong programming skills in Python
  • Hands-on experience with Selenium WebDriver for UI automation
  • Proficiency in BDD frameworks such as Behave or Radish
  • Solid understanding of BDD methodology and Gherkin syntax
  • Experience with Jira for issue tracking and test management
  • Familiarity with CI/CD tools (e.g., Jenkins, GitLab CI, Azure DevOps)
  • Knowledge of version control systems such as Git
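As an illustration of the Gherkin syntax this listing asks for, a minimal Behave-style feature file might look like the sketch below; the feature name and steps are hypothetical, not taken from the listing:

```gherkin
Feature: User login
  Scenario: Successful login with valid credentials
    Given the login page is open
    When the user enters a valid username and password
    Then the dashboard is displayed
```

In Behave, each Given/When/Then line is then matched to a Python step definition, which is where the Selenium WebDriver calls would live.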

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Digital: Python Selenium

Job Description

ForgeRock Developer / Lead

Responsibilities

ForgeRock Developer / Lead
  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: ForgeRock Developer / Lead

Job Description

Job Description:
  • Experience with the TOSCA automation tool; automation skills beyond the recording features, with knowledge of advanced coding methods expected.
  • Good knowledge of TOSCA properties such as reusable blocks, recovery, loops, checkboxes, repetition, Excel operations, APIs, etc.
  • Very good at analyzing and solving all kinds of test cases to client expectations.
  • Strong understanding and implementation of TOSCA features such as Requirements, Test Case Design, Modules, Test Cases, Execution Lists, Web Services, and the Classic TBox framework.
  • SAP QA background with strong analytical skills; able to decompose requirements into testable cases covering positive, negative, and varied data scenarios.
  • Experience testing backend services and integrations with upstream/downstream systems.
  • Sound SQL skills, including complex query writing and the ability to use a variety of SQL tools to
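To give a concrete sense of the "complex query writing" this listing expects, here is a minimal sketch using Python's built-in sqlite3 module; the schema, table name, and threshold are hypothetical, invented only for illustration:

```python
import sqlite3

# Hypothetical orders table -- the kind of backend data a tester
# might validate against expected results.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'acme', 120.0), (2, 'acme', 80.0), (3, 'globex', 50.0);
""")

# A CTE plus aggregation: total spend per customer,
# filtered to customers above a threshold.
rows = conn.execute("""
    WITH totals AS (
        SELECT customer, SUM(amount) AS total
        FROM orders
        GROUP BY customer
    )
    SELECT customer, total FROM totals WHERE total > 100 ORDER BY customer
""").fetchall()

print(rows)  # [('acme', 200.0)]
```

The same pattern (set up known data, run the query under test, compare against expected rows) carries over to whichever SQL tooling the client actually uses.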

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP Analytics Cloud ~ TOSCA Automation Tester

Job Description

Job Description (also listed as Essential Skills): Financial Crime experience, SQL, data modeling, system analysis, Databricks engineering and architecture, database administration, project and resource planning and management. Desirable Skills: PySpark

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Oracle, SQL, Databricks

Job Description

Job Description: Expertise in AWS Databricks (mandatory).
  • 5-6 years of total experience in data engineering or big data development.
  • 2-3 years of hands-on experience with Databricks and Apache Spark.
  • Proficient in AWS cloud services (S3, Glue, Lambda, EMR, Redshift, CloudWatch, IAM).
  • Strong programming skills in PySpark and Python, and optionally Scala.
  • Solid understanding of data lakes, lakehouses, and Delta Lake concepts.
  • Experience in SQL development and performance tuning.
  • Familiarity with Airflow, dbt, or similar orchestration tools is a plus.
  • Experience with CI/CD tools such as Jenkins, GitHub Actions, or CodePipeline.
  • Knowledge of data security, governance, and compliance frameworks.

Roles & Responsibilities:
  • Develop and maintain scalable data pipelines using Apache Spark on Databricks.
  • Build end-to-end ETL/ELT pipelines on AWS using services such as S3, Glue, Lambda, EMR, and Step Functions.
  • Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
  • Design and implement data models, schemas, and lakehouse architecture in Databricks.
  • Optimize and tune Spark jobs for performance and cost-efficiency.
  • Integrate data from multiple structured and unstructured data sources.
  • Monitor and manage data workflows, ensuring data quality, consistency, and security.
  • Follow best practices in CI/CD, code versioning (Git), and DevOps for data applications.
  • Write clean, reusable, well-documented code using Python / PySpark / Scala.

Experience Required: 5-6+ years
Education: 4-year bachelor's degree
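The pipelines described above follow a common extract-transform-load shape. A minimal sketch of that shape in plain Python is below; the real stack in this listing would be PySpark DataFrames on Databricks, and the event data and field names here are hypothetical:

```python
import csv
import io
import json

# Extract: raw CSV as it might land in a "bronze" landing zone (inlined here).
raw_csv = "event,user,amount\npurchase,u1,10\npurchase,u2,5\nrefund,u1,10\n"
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: filter and aggregate -- the step a Spark job would express
# as df.filter(...).groupBy(...).agg(...).
totals = {}
for r in rows:
    if r["event"] == "purchase":
        totals[r["user"]] = totals.get(r["user"], 0) + int(r["amount"])

# Load: emit the curated ("silver") result, e.g. as JSON.
out = json.dumps(totals, sort_keys=True)
print(out)  # {"u1": 10, "u2": 5}
```

On Databricks, the load step would typically write a Delta Lake table rather than JSON, which is what makes the lakehouse pattern mentioned in the listing possible.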

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Python ~ Digital : Databricks ~ Digital : PySpark